amazon s3 - S3 rate limit against website endpoint


I'm hitting an S3 bucket via the website endpoint across various paths/keys. I'm getting OK (200) responses while hitting it at 1,000 requests per second over the course of 5 minutes. I'm using a popular tool, https://github.com/tsenart/vegeta, so I have confidence in these stats.
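For reference, a rough sketch of what such a test could look like with vegeta's Go library; the bucket website endpoint and object path below are placeholders, not my actual setup:

```go
package main

import (
	"fmt"
	"time"

	vegeta "github.com/tsenart/vegeta/v12/lib"
)

func main() {
	// Constant 1,000 requests per second for 5 minutes against one key
	// on a (placeholder) S3 website endpoint.
	rate := vegeta.Rate{Freq: 1000, Per: time.Second}
	duration := 5 * time.Minute
	targeter := vegeta.NewStaticTargeter(vegeta.Target{
		Method: "GET",
		URL:    "http://example-bucket.s3-website-us-east-1.amazonaws.com/some/key.html",
	})
	attacker := vegeta.NewAttacker()

	var metrics vegeta.Metrics
	for res := range attacker.Attack(targeter, rate, duration, "s3-website") {
		metrics.Add(res)
	}
	metrics.Close()

	fmt.Printf("success: %.2f%%  status codes: %v\n", metrics.Success*100, metrics.StatusCodes)
}
```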

This is surprising considering the documentation says anything above 800 requests per second is problematic.

Is using the website endpoint a different API call in terms of throttling? Is 800 a real rate limit or a crude threshold?

It's a soft limit, and not really a limit from the bucket-level perspective. Read the documentation carefully: it warns that a rapid increase in request rate beyond 800 requests per second may result in temporary rate limits on your request rate.

S3 increases the available capacity for your keyspace by partition splitting, and this takes time to happen... buckets do scale up to the workload.

If you are requesting the same object(s) repeatedly, you are also not imposing as much load on the available resources as you would be if you were hitting 800 unique objects per second, which is, reading between the lines, the threshold under discussion. Over time, the keys you keep hitting stay warm in the bucket index, and recent hits are more accessible than cold spots in the index.
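To illustrate the distinction, here is a rough sketch of a vegeta run (same library the question used) that spreads requests across many distinct keys instead of re-reading one hot object; the endpoint and key layout are made up:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"

	vegeta "github.com/tsenart/vegeta/v12/lib"
)

func main() {
	// Hypothetical bucket website endpoint and key layout.
	const base = "http://example-bucket.s3-website-us-east-1.amazonaws.com"

	// Pick a different key for every request so the load touches many
	// distinct entries in the bucket index rather than one hot, likely
	// cached object, closer to the kind of traffic the 800 figure describes.
	var targeter vegeta.Targeter = func(t *vegeta.Target) error {
		t.Method = "GET"
		t.URL = fmt.Sprintf("%s/objects/%06d", base, rand.Intn(1000000))
		return nil
	}

	attacker := vegeta.NewAttacker()
	rate := vegeta.Rate{Freq: 1000, Per: time.Second}

	var metrics vegeta.Metrics
	for res := range attacker.Attack(targeter, rate, time.Minute, "unique-keys") {
		metrics.Add(res)
	}
	metrics.Close()
	fmt.Printf("success: %.2f%%  status codes: %v\n", metrics.Success*100, metrics.StatusCodes)
}
```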

The problem the documentation highlights is that of object keys that are lexically sequential: S3 is unable to split the partitions meaningfully, because you are always creating new objects on one side of the split or the other, working against the scaling algorithm of S3.
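A common workaround under that (older) guidance was to stop using purely sequential key names and prepend a short hash-derived prefix so keys spread evenly across partition splits. A minimal sketch, assuming made-up log-style key names:

```go
package main

import (
	"crypto/md5"
	"fmt"
)

// prefixedKey spreads otherwise lexically sequential keys (timestamps,
// incrementing IDs, etc.) across the keyspace by prepending a few hex
// characters derived from a hash of the key itself.
func prefixedKey(key string) string {
	sum := md5.Sum([]byte(key))
	return fmt.Sprintf("%x/%s", sum[:2], key)
}

func main() {
	keys := []string{
		"logs/2017-01-01/0001.gz",
		"logs/2017-01-01/0002.gz",
		"logs/2017-01-01/0003.gz",
	}
	for _, k := range keys {
		fmt.Println(prefixedKey(k))
	}
}
```

With a spread of leading characters, a partition split lands on a boundary that actually divides the traffic, instead of every new object landing on the same side of the split.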

