Do you want to know if this is possible at all?
Do you want to hire someone to do this for you?
Do you need some help implementing this?
Do you need help planning this in more detail?
This is a multi-tenant importer, where multiple tenants can start/auto-start imports from multiple data providers' APIs.
The data providers are Amazon & Amazon FBA, Bigcommerce, Ebay, Ecomdash, Magento, ShippingEasy, ShipStation, ShipWorks, Shopify, and Teapplix.
Currently, data is imported using Rails delayed jobs. The system can at any time hit maximum resource utilisation, due to massive imports running simultaneously on 10 workers.
So we thought of using Golang to reduce the pressure, and we were actually getting a performance boost with a Golang Lambda process, but migrating to Golang is time consuming.
So now we are hoping Elixir can come to the rescue.
We need the import to be fault tolerant, recoverable, and more efficient than Rails.
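For the fault-tolerance part, roughly what we have in mind is to run each tenant's import as an isolated, supervised task. A minimal sketch (DataProcessor.ImportTaskSup, DataProcessor.Importer.run/1 and the file path are placeholder names):

# In the application's supervision tree:
children = [
  {Task.Supervisor, name: DataProcessor.ImportTaskSup}
]

Supervisor.start_link(children, strategy: :one_for_one)

# Each import runs as a supervised task, so one crashing import
# cannot take down the others:
Task.Supervisor.start_child(DataProcessor.ImportTaskSup, fn ->
  DataProcessor.Importer.run("/path/to/import.csv")
end)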
The structure we currently have in mind is roughly:

data_processor (umbrella)
|---- apps
      |---- api
      |---- parser (one per API: Shopify, Amazon, etc.)
      |---- xml_generator: builds an XML file for each INDIVIDUAL record, so 100 records produce 100 XML files in concurrent tasks (see the sketch below)
      |---- uploader: uploads each individual XML to S3 and calls the DB inserter API
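For the xml_generator app, this is roughly the idea: one XML file per record, built in bounded concurrent tasks (build_xml/1, the record fields and the output directory are placeholders):

defmodule DataProcessor.XmlGenerator do
  # Build one XML file per record, with bounded concurrency.
  def generate_all(records, out_dir) do
    records
    |> Task.async_stream(&write_record_xml(&1, out_dir),
      max_concurrency: System.schedulers_online() * 2,
      timeout: 10_000
    )
    |> Enum.map(fn {:ok, path} -> path end)
  end

  defp write_record_xml(record, out_dir) do
    path = Path.join(out_dir, "#{record.id}.xml")
    File.write!(path, build_xml(record))
    path
  end

  # Placeholder: a real implementation would use an XML library
  # (XmlBuilder, :xmerl, etc.) instead of string interpolation.
  defp build_xml(record) do
    "<record><id>#{record.id}</id><sku>#{record.sku}</sku></record>"
  end
end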
The parser will receive a request from any API (CSV, Shopify, etc.) and pass the data to the specific worker.
It should then parse the data and call the XML generator for individual records, in chunks of maybe 100 records.
Then push to S3 and call the DB API.
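For the upload step, a rough sketch assuming the ex_aws_s3 and req dependencies (the bucket name and the DB inserter endpoint are placeholders):

defmodule DataProcessor.Uploader do
  # Placeholder bucket and endpoint:
  @bucket "import-xml-staging"
  @db_api "https://db-inserter.example.com/api/records"

  # Upload one generated XML file to S3, then notify the DB inserter API.
  def upload_and_notify(xml_path) do
    key = Path.basename(xml_path)

    @bucket
    |> ExAws.S3.put_object(key, File.read!(xml_path))
    |> ExAws.request!()

    Req.post!(@db_api, json: %{s3_key: key})
    :ok
  end
end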
defmodule Importer do
  def do_stuff(filepath) do
    filepath
    |> parse_data()
    |> Enum.chunk_every(100)
    |> Enum.map(&process_batch/1)
  end

  defp process_batch(batch) do
    batch
    |> Enum.map(&Task.async(__MODULE__, :process_xml, [&1]))
    |> Enum.map(&Task.await(&1, 10_000))
  end

  def process_xml(record) do
    # build the XML for a single record - do what you want here
  end

  defp parse_data(filepath) do
    # turn the file/API payload into a list of records
  end
end
This should process the records concurrently, in chunks of 100. Of course, it's just pseudo code…
Also, I am not sure you need an umbrella with that many contexts. Maybe one module with 3 functions would be enough.