Why doesn't this Crawly app write parsed items or logs to disk?

I’ve implemented my first web scraper with Crawly, following the tutorial published in the official docs.

I’m at the point where I start a single crawler module from iex, and it seems to be crawling successfully, but it’s not writing anything to disk.
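
This is roughly how I start it (with `iex -S mix` from the project root; the spider module name here is hypothetical, the real one is in the repo linked below):

```elixir
iex> Crawly.Engine.start_spider(Unicrawl.Spider)
```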

I’ve checked that it’s crawling the website and extracting some items using IO.inspect.
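
The IO.inspect check lives in the spider’s parse_item/1 callback. Here is a simplified sketch of what I mean (the module name and CSS selectors are hypothetical; the real code is in the repo):

```elixir
defmodule Unicrawl.Spider do
  use Crawly.Spider

  @impl Crawly.Spider
  def base_url(), do: "https://example.com"

  @impl Crawly.Spider
  def init(), do: [start_urls: ["https://example.com/items"]]

  @impl Crawly.Spider
  def parse_item(response) do
    {:ok, document} = Floki.parse_document(response.body)

    items =
      document
      |> Floki.find(".item-title")
      |> Enum.map(&%{title: Floki.text(&1)})
      # printing the items is how I confirmed extraction works
      |> IO.inspect(label: "extracted items")

    %Crawly.ParsedItem{items: items, requests: []}
  end
end
```
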
I’ve limited the total number of items to make sure a run completes gracefully.
The Crawly.Pipelines.WriteToFile module is in the pipeline and is configured to write to the user-writable folder /tmp/crawly, but at the end of the run that folder is as empty as ever.
I’ve also enabled logging pointing at the same folder, but no log files are written either.
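
The relevant configuration looks roughly like this (a simplified sketch; the exact values are in the repo, and the log options assume a recent Crawly version that supports logging to file):

```elixir
# config/config.exs (simplified sketch)
import Config

config :crawly,
  # stop the spider after a fixed number of items so a run finishes cleanly
  closespider_itemcount: 100,
  # spider log settings (option names assume a recent Crawly version)
  log_to_file: true,
  log_dir: "/tmp/crawly",
  pipelines: [
    # encode each scraped item as a JSON line ...
    Crawly.Pipelines.JSONEncoder,
    # ... then append it to /tmp/crawly/<SpiderName>.jl
    {Crawly.Pipelines.WriteToFile, folder: "/tmp/crawly", extension: "jl"}
  ]
```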

I’ve never debugged an Elixir/Erlang app and I still find the idea a bit daunting.

Does anyone have a clue why it’s not writing to disk?

https://git.sr.ht/~protoboolean/unicrawl

I haven’t used Crawly so I have no direct experience, but I do notice your configuration file is called config.ex when it should be config.exs, which may be the issue: Mix only evaluates config/config.exs (plus any files it imports), so a config.ex is silently ignored and your :crawly settings, including the WriteToFile pipeline, never take effect.
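
After renaming the file, one quick way to confirm the settings are actually loaded is to inspect the application environment from iex (the output shown assumes a pipeline configuration like the sketch above):

```elixir
iex> Application.get_env(:crawly, :pipelines)
[
  Crawly.Pipelines.JSONEncoder,
  {Crawly.Pipelines.WriteToFile, folder: "/tmp/crawly", extension: "jl"}
]
```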
