How do I properly use GenServer to poll and parse several servers/websites roughly once a minute? No response needs to be sent to a client, because the data goes straight to a database.
I know how to do it for a single website, but for several of them… should I spawn several processes via “spawn” in handle_info, one per website? Or is there a more idiomatic way to achieve the same goal? I want to keep it simple.
I would start a GenServer for each website.
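A minimal sketch of what one such per-website GenServer could look like, using Process.send_after/3 to poll itself once a minute. The module name, the @interval, and the inline “fetch and store” comment are illustrative, not from the original posts:

```elixir
defmodule Poller do
  use GenServer

  # Illustrative interval: one minute per poll.
  @interval :timer.minutes(1)

  def start_link(url), do: GenServer.start_link(__MODULE__, url)

  @impl true
  def init(url) do
    schedule_poll()
    {:ok, url}
  end

  @impl true
  def handle_info(:poll, url) do
    # Fetch and parse the site here, then write the results to the
    # database. That part is your own logic and is omitted.
    schedule_poll()
    {:noreply, url}
  end

  defp schedule_poll, do: Process.send_after(self(), :poll, @interval)
end
```

You would start one of these (ideally under a supervisor) per website you need to poll.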
Take a look at quantum. It makes it very easy to run multiple schedules at the same time, so you can focus on the behaviour and delegate the scheduling and multiprocessing to the library.
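For reference, quantum is configured with cron-style jobs in your app config. A sketch, assuming an app named :my_app and a hypothetical MyApp.Poller.poll_all/0 function that does the actual work:

```elixir
# lib/my_app/scheduler.ex (module name is illustrative):
defmodule MyApp.Scheduler do
  use Quantum, otp_app: :my_app
end

# config/config.exs — run MyApp.Poller.poll_all/0 every minute:
config :my_app, MyApp.Scheduler,
  jobs: [
    {"* * * * *", {MyApp.Poller, :poll_all, []}}
  ]
```

You then add MyApp.Scheduler to your application's supervision tree.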
Why not just use Task.start/1?
Enum.each(data, fn item ->
  Task.start(fn ->
    # Process the request for this item.
  end)
end)
You might also find some answers by searching for “scrape” on this forum.
UPDATE: Sorry, I had to edit the post so it wouldn’t come across as offensive; I am not a native English speaker.
How is Task better than spawn for my case?
You probably want your coordinating GenServer and the worker processes it spawns to be part of the same supervision tree. That way if the coordinator is restarted, any workers will be terminated too.
Task.Supervisor makes it pretty easy.
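A sketch of how that could look, assuming hypothetical names. In a real app the Task.Supervisor would be a child in your application's supervision tree next to the coordinating GenServer; here it is started directly so the snippet is self-contained:

```elixir
# Normally declared in your application's children list as
# {Task.Supervisor, name: MyApp.TaskSupervisor}.
{:ok, sup} = Task.Supervisor.start_link(name: MyApp.TaskSupervisor)

urls = ["https://a.example", "https://b.example"]

# The coordinator fires one supervised, fire-and-forget task per URL.
Enum.each(urls, fn url ->
  Task.Supervisor.start_child(MyApp.TaskSupervisor, fn ->
    # Placeholder for your fetch/parse/store logic.
    IO.puts("polling #{url}")
  end)
end)
```

If the supervision tree (and with it the coordinator) goes down, the workers are terminated along with their Task.Supervisor.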