I'd like some thoughts on how to run a cleanup job in a workflow. Basically I have a workflow that:

- Creates a certain resource on AWS.
- Runs 3 operations in parallel.
- Cleans up the AWS resource.

My question is if there's a built-in way in Oban Pro Workflows to run a job even if one (or more) of its dependencies fail.
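For reference, here's a minimal sketch of that shape of workflow using Oban.Pro.Workflow; the worker module names are placeholders. With plain deps, the :cleanup job won't run once an upstream job fails, which is exactly the problem:

```elixir
alias Oban.Pro.Workflow

# Hypothetical workers; shape: create -> 3 parallel ops -> cleanup
workflow =
  Workflow.new()
  |> Workflow.add(:create_resources, MyApp.CreateResourcesWorker.new(%{}))
  |> Workflow.add(:op_a, MyApp.OpAWorker.new(%{}), deps: [:create_resources])
  |> Workflow.add(:op_b, MyApp.OpBWorker.new(%{}), deps: [:create_resources])
  |> Workflow.add(:op_c, MyApp.OpCWorker.new(%{}), deps: [:create_resources])
  |> Workflow.add(:cleanup, MyApp.CleanupWorker.new(%{}), deps: [:op_a, :op_b, :op_c])
```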
Yes! You can wrap the Workflow in a Batch with from_workflow. Here is an example showing how to set a batch callback worker that will be called if dependencies fail (e.g., any are discarded):
Here’s the batch:
```elixir
defmodule MyApp.CleanupWorker do
  use Oban.Pro.Worker

  @behaviour Oban.Pro.Batch

  @impl Oban.Pro.Worker
  def process(_job), do: :ok

  @impl Oban.Pro.Batch
  def batch_discarded(job) do
    # Handle your cleanup here
    IO.inspect(job.args, label: "Discarded")
  end
end
```
Here's how you'd add it to your workflow:

```elixir
alias Oban.Pro.Batch

workflow
|> Batch.from_workflow(callback_worker: MyApp.CleanupWorker)
|> Oban.insert_all()
```
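If I understand from_workflow/2 correctly, it tags the workflow's jobs as one batch, so the callback worker fires based on the combined state of all the workflow jobs rather than on any single dependency.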
I think that's exactly what I was looking for. One quick follow-up question: my CleanupWorker should run whether the jobs all succeed or fail, and I'm also grabbing some data from the initial worker that spins up the resources. Is it possible to do something like:
```elixir
def process(job) do
  resources = Workflow.get_recorded(job, :create_resources)
  # Clean up resources
end

def batch_discarded(job) do
  process(job)
end
```
Great follow-up. In that case, you want to use batch_exhausted. You could do that, but the process/1 function won't ever get called for the callback job. So, put your cleanup code in the batch_exhausted/1 callback itself, no misdirection.
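Putting the two answers together, here's a rough sketch; it assumes batch_exhausted/1 fires once every job in the batch has either completed or been discarded, and that the recorded value is reachable from the callback job via Workflow.get_recorded/2 as in the snippet above:

```elixir
defmodule MyApp.CleanupWorker do
  use Oban.Pro.Worker

  @behaviour Oban.Pro.Batch

  alias Oban.Pro.Workflow

  @impl Oban.Pro.Worker
  def process(_job), do: :ok

  # Called once all batch jobs have finished, successfully or not,
  # so cleanup runs on success and failure alike.
  @impl Oban.Pro.Batch
  def batch_exhausted(job) do
    # :create_resources is the (hypothetical) workflow step that
    # recorded the resource identifiers when it spun them up.
    resources = Workflow.get_recorded(job, :create_resources)

    # Clean up `resources` here
    :ok
  end
end
```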
Coming back to this. Is there a way to use the workflow heartbeat to do something custom in my system? I basically need to send a heartbeat to an upstream system while the Oban workflow is running.