Broadway Processors - is it possible to embed a Genstage pipeline?

From reading the Broadway docs, it seems like it only supports a single processor step (which can run concurrently).

Wondering how to get the benefits of Broadway, but with multiple processing steps/stages, similar to generic GenStage pipelines.

So, the short answer is: no. Broadway is designed to do “end-of-the-line” data processing. You can process each message concurrently (in the processor), then potentially batch messages, and process those batches concurrently. That’s about it. If you want more complex stages after that, you’ll need to implement your own GenStage pipeline.
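The "processor, then optional batcher" shape described above can be sketched like this (module name and option values are hypothetical; `Broadway.DummyProducer` is the test producer that ships with Broadway):

```elixir
defmodule MyPipeline do
  use Broadway

  def start_link(_opts) do
    Broadway.start_link(__MODULE__,
      name: __MODULE__,
      producer: [
        # Placeholder producer; in production this would be SQS, RabbitMQ, etc.
        module: {Broadway.DummyProducer, []},
        concurrency: 1
      ],
      processors: [
        # Per-message processing, run concurrently across 10 processes
        default: [concurrency: 10]
      ],
      batchers: [
        # Optional batching step: groups of up to 100, processed concurrently
        default: [batch_size: 100, batch_timeout: 2_000, concurrency: 2]
      ]
    )
  end

  @impl true
  def handle_message(_processor, message, _context) do
    # Step 1: process each message individually
    Broadway.Message.update_data(message, &String.upcase/1)
  end

  @impl true
  def handle_batch(:default, messages, _batch_info, _context) do
    # Step 2: process the whole batch — this is the "end of the line";
    # anything fancier would hand off to a separate GenStage pipeline
    messages
  end
end
```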

In general, maybe I can ask what would be the use case for multiple processing steps?

Makes sense. Here is my ‘fantasy architecture’ I’ve been designing for fun :slight_smile:

Broadway ingests data to environment queues. From there, it's all GenStage pipelines for data handling. Is it possible/recommended for Broadway to ingest from a Phoenix POST API? That would require a custom producer with the transform, but I don't see why not.

Yes, you can totally build a producer that produces off of incoming HTTP requests.
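A minimal sketch of what such a producer could look like, assuming a hypothetical `RequestProducer` module that a Phoenix controller pushes into via `RequestProducer.push/1`. It buffers incoming events in a queue and dispatches them only when consumer demand arrives. To plug it into Broadway, you would point the `:module` producer option at it and use the `:transformer` option to wrap the raw data into `Broadway.Message` structs:

```elixir
defmodule RequestProducer do
  use GenStage

  def start_link(opts \\ []) do
    GenStage.start_link(__MODULE__, :ok, Keyword.put_new(opts, :name, __MODULE__))
  end

  # Called from a Phoenix controller action with the decoded request body
  def push(event), do: GenStage.cast(__MODULE__, {:push, event})

  @impl true
  def init(:ok) do
    # State: a queue of buffered events plus outstanding consumer demand
    {:producer, {:queue.new(), 0}}
  end

  @impl true
  def handle_cast({:push, event}, {queue, demand}) do
    dispatch(:queue.in(event, queue), demand)
  end

  @impl true
  def handle_demand(incoming, {queue, demand}) do
    dispatch(queue, demand + incoming)
  end

  # Emit as many buffered events as demand allows; keep the rest queued
  defp dispatch(queue, 0), do: {:noreply, [], {queue, 0}}

  defp dispatch(queue, demand) do
    case :queue.out(queue) do
      {{:value, event}, rest} ->
        {:noreply, events, state} = dispatch(rest, demand - 1)
        {:noreply, [event | events], state}

      {:empty, queue} ->
        {:noreply, [], {queue, demand}}
    end
  end
end
```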


Nice! Considering the existing Phoenix integration and dashboarding, it would be nice to have an official producer for a POST API. Even if it's only used for quick testing, it would still be cool.

So far I'm thinking Broadway is a great fit for data ingest / rate limiting / environment mapping. I haven't researched any other Phoenix solutions yet.

Hmm, in my scenario Broadway is sitting behind a Phoenix API server. After a bit more reading, I think Broadway's rate limiting would be global for all events. The PlugAttack library seems more granular for Phoenix.
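For the per-client granularity mentioned above, a sketch based on PlugAttack's documented `rule`/`throttle` API might look like the following (the module name, storage name, and limits are all hypothetical):

```elixir
defmodule MyApp.Throttle do
  use PlugAttack

  # Throttle per client IP at the Plug level, before requests ever reach
  # the Broadway producer — unlike Broadway's rate_limiting option, which
  # applies to the pipeline as a whole.
  rule "throttle by ip", conn do
    throttle(conn.remote_ip,
      period: 60_000,  # 1-minute window
      limit: 100,      # at most 100 requests per window, per IP
      storage: {PlugAttack.Storage.Ets, MyApp.Throttle.Storage}
    )
  end
end
```

To wire it up, the plug goes into the Phoenix endpoint or router (`plug MyApp.Throttle`), and the ETS storage is started in the supervision tree, e.g. `{PlugAttack.Storage.Ets, name: MyApp.Throttle.Storage, clean_period: 60_000}`.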