The scenario I am working on involves writing/posting large data sets to an API. At times there may be 100,000+ entries to post.
To audit and validate the post, is there an option to throttle the request into smaller POST requests so the backend API service is not overrun?
For example, segment the data into POSTs of 500 lines each, log each response, and continue sending POST requests only if there are no errors?
The AWS API Gateway provides throttling options that may meet these requirements. Does autoflow support any similar throttling options?
You are describing two different kinds of throttling. The AWS feature you referred to is server-side throttling, which we have a plan to add. For throttling your POST requests on the client side, you can split your data into chunks and use any of the iteration actions to iterate over those chunks, sending one POST per chunk. We may be able to provide an iteration action that makes this easier for you to achieve. Will get back to you on that.
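As a rough illustration of the chunk-and-iterate approach described above, here is a minimal client-side sketch using only Python's standard library. The URL, the 500-entry batch size, and the function names are placeholders, not autoflow APIs; logging each batch's status and stopping at the first error mirrors the audit requirement in the question.

```python
import json
from urllib import request, error

def chunked(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def post_in_batches(url, entries, batch_size=500):
    """POST `entries` to `url` in batches of `batch_size`,
    recording each response status and stopping at the first failure."""
    statuses = []
    for n, batch in enumerate(chunked(entries, batch_size), start=1):
        req = request.Request(
            url,
            data=json.dumps(batch).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with request.urlopen(req) as resp:
                statuses.append(resp.status)  # audit trail: one status per batch
        except error.HTTPError as exc:
            # stop sending further batches once the API reports an error
            raise RuntimeError(f"batch {n} failed with HTTP {exc.code}") from exc
    return statuses
```

With 100,000 entries and a batch size of 500 this produces 200 sequential POSTs, so the backend only ever sees one in-flight batch at a time; a delay or retry could be added inside the loop if the API needs further pacing.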