frumos
July 22, 2020, 6:32am
Hello Elixir community!
I have an issue where S3 throttles my requests and my app eventually fails.
This is the function that does the uploading and is being throttled by S3:
def upload_file(bucket, object_key, file_path) do
  Logger.info("Upload file: #{file_path} to: #{object_key}")

  file_path
  |> ExAws.S3.Upload.stream_file()
  |> ExAws.S3.upload(bucket, object_key,
    # 240 sec
    timeout: 240_000
  )
  |> ExAws.request!()

  object_key
end
This is the S3 response:
** (EXIT from #PID<0.1257.0>) shell process exited with reason: an exception was raised:
** (ExAws.Error) ExAws Request Error!
{:error, {:http_error, 503, "<Error><Code>SlowDown</Code><Message>Please reduce your request rate.</Message><RequestId>A982186D97E156DC</RequestId><HostId>gZCe7g1Hat41iziw8NYDK8K1Wz9WoHEPpq1DrNv9bVAqDB4Mh+beKYXjE8BMqPPgjeCOnSZ9unQ=</HostId></Error>"}}
(ex_aws 2.1.3) lib/ex_aws.ex:66: ExAws.request!/2
(kona 0.1.0) lib/remo3/aws_s3.ex:38: Remo3.Aws.S3.upload_file/3
(kona 0.1.0) lib/remo3/account.ex:103: Remo3.Importer.Account.finalize_account_file/3
(elixir 1.10.3) lib/task/supervised.ex:90: Task.Supervised.invoke_mfa/2
(elixir 1.10.3) lib/task/supervised.ex:35: Task.Supervised.reply/5
(stdlib 3.11) proc_lib.erl:249: :proc_lib.init_p_do_apply/3
This is the ExAws config:
config :ex_aws, :retries,
  max_attempts: 30,
  base_backoff_in_ms: 30,
  max_backoff_in_ms: 60_000
Could you please suggest how I can resolve the issue by continuing to retry until success? What is missing in my implementation?
Thank you.
PS: My S3 partition schema is in good shape and has been reviewed by the S3 team.
NobbZ
July 22, 2020, 7:12am
You need to rate limit your calls to upload_file/3.
You seem to be calling it beyond your allowed rate.
There are several ways to do so, and the right one depends on how you are using the function.
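One simple way, sketched below under the assumption that all uploads can be funneled through a single process (the module name RateLimitedUploader and the 200 ms interval are illustrative, not from this thread):

```elixir
defmodule RateLimitedUploader do
  use GenServer

  # Minimum gap between uploads, in milliseconds (tune to your quota).
  @interval_ms 200

  def start_link(opts \\ []) do
    GenServer.start_link(__MODULE__, :ok, opts)
  end

  # All callers funnel through this single process, so uploads
  # are naturally serialized and spaced out.
  def upload(pid, bucket, key, path) do
    GenServer.call(pid, {:upload, bucket, key, path}, :infinity)
  end

  @impl true
  def init(:ok), do: {:ok, nil}

  @impl true
  def handle_call({:upload, bucket, key, path}, _from, state) do
    result = do_upload(bucket, key, path)
    # Enforce a minimum spacing before serving the next caller.
    Process.sleep(@interval_ms)
    {:reply, result, state}
  end

  defp do_upload(bucket, key, path) do
    path
    |> ExAws.S3.Upload.stream_file()
    |> ExAws.S3.upload(bucket, key, timeout: 240_000)
    |> ExAws.request()
  end
end
```

The trade-off is that this serializes all uploads; if you need concurrency, a token-bucket limiter (see below in the thread) is a better fit.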
frumos
July 22, 2020, 7:20am
Sorry, I just need to persist with retrying … my rate is far below the capabilities of S3 (this has already been discussed internally, according to this: https://docs.aws.amazon.com/AmazonS3/latest/dev/optimizing-performance.html ).
The reason S3 throttles is that its internal mechanism detects the traffic as an anomalous burst. So my goal is to get the retry logic in shape. Thank you.
NobbZ
July 22, 2020, 7:35am
If it’s detected as an abnormal burst, you need to limit the rate with which you try to upload.
Just retrying more often will not solve the issue; it will only make things worse, since it adds even more requests.
frumos
July 22, 2020, 7:42am
Sorry, I just need to retry with exponential backoff, and to my understanding that strategy comes out of the box: https://hexdocs.pm/ex_aws/ExAws.html. But I am not sure it even works; at least I see no sign of it in the log.
kip
July 22, 2020, 8:41am
@NobbZ is suggesting that a backoff strategy alone is not going to produce optimal throughput and may in fact contribute to your rate-limit messages, because it will likely make the “burst” behaviour more prevalent as detected by AWS.
Using a rate limiter will maximise request throughput, create a much more even workload for AWS, and therefore reduce the chance that you get rate limited (according to whatever AWS decides your rate should be).
Using a rate limiter with Elixir (and Phoenix if required) is quite easy. This article might be a good place to start.
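As one concrete example, the ex_rated library provides a token-bucket limiter. A sketch (the "s3_uploads" bucket name, the 100-requests-per-second limit, and the 50 ms sleep are placeholders you would tune, not values from this thread):

```elixir
# Assumes {:ex_rated, "~> 2.0"} in mix.exs deps.
defmodule ThrottledUpload do
  @scale_ms 1_000
  @limit 100

  def upload_file(bucket, object_key, file_path) do
    # check_rate/3 returns {:ok, count} while under the limit for the
    # current window, {:error, limit} once the window is exhausted.
    case ExRated.check_rate("s3_uploads", @scale_ms, @limit) do
      {:ok, _count} ->
        file_path
        |> ExAws.S3.Upload.stream_file()
        |> ExAws.S3.upload(bucket, object_key, timeout: 240_000)
        |> ExAws.request!()

        object_key

      {:error, _limit} ->
        # Over the local limit: wait briefly, then try again.
        Process.sleep(50)
        upload_file(bucket, object_key, file_path)
    end
  end
end
```

Because the limit is enforced locally before the request is made, AWS never sees the burst in the first place, which is the point @NobbZ and @kip are making.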
Try using the non-bang variant of the request function, ExAws.request/2, and then match on success / rate limit / failure.
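A retry loop built on the non-bang variant might look like this (a sketch: the function name, the cap of 10 attempts, and the backoff constants are illustrative assumptions):

```elixir
defmodule Retrying do
  @max_attempts 10

  def upload_with_retry(bucket, object_key, file_path, attempt \\ 1)

  def upload_with_retry(_bucket, _object_key, _file_path, attempt)
      when attempt > @max_attempts do
    {:error, :max_attempts_exceeded}
  end

  def upload_with_retry(bucket, object_key, file_path, attempt) do
    result =
      file_path
      |> ExAws.S3.Upload.stream_file()
      |> ExAws.S3.upload(bucket, object_key, timeout: 240_000)
      |> ExAws.request()

    case result do
      {:ok, _response} ->
        {:ok, object_key}

      # S3 signals throttling with 503 SlowDown, as in the error above.
      {:error, {:http_error, 503, _body}} ->
        backoff_ms = min(30_000, 100 * trunc(:math.pow(2, attempt)))
        # Add jitter so concurrent uploads do not retry in lockstep.
        Process.sleep(backoff_ms + :rand.uniform(100))
        upload_with_retry(bucket, object_key, file_path, attempt + 1)

      {:error, _other} = error ->
        error
    end
  end
end
```

Unlike ExAws.request!/1, this never raises on throttling, so the Task supervisor in the stack trace above would not see the process crash.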
You could try spreading out or randomising the prefixes in your object keys.
AWS applies rate limits at the prefix level as well.
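One way to spread keys across prefixes is to derive a short, deterministic hash prefix from the key itself (a sketch; the module name and the 4-character prefix length are illustrative choices):

```elixir
defmodule RandomPrefix do
  # Prepend the first 4 hex characters of the key's MD5 hash, so writes
  # fan out across up to 65,536 distinct S3 key prefixes while remaining
  # reproducible (the same key always maps to the same prefix).
  def randomized_key(object_key) do
    prefix =
      :crypto.hash(:md5, object_key)
      |> Base.encode16(case: :lower)
      |> binary_part(0, 4)

    "#{prefix}/#{object_key}"
  end
end
```

Keeping the prefix deterministic (hash of the key rather than a random value) means you can recompute the full key later when reading the object back.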