Hi Stack Overflowers,
The following stack overflow is in my brain, not in my program:
I'm working on a large-scale project and now have to push millions of data records into its database. To do this, I run a script that reads a search term line by line from a .csv file and starts the process: each search term is sent to 6 official APIs, which return data in anywhere from 0.2 seconds up to a maximum of 10 seconds. As each response comes back, the script writes it to the database.
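To make the setup concrete, here is a minimal sketch of that workflow in Python (assuming Python, since no language is stated): a thread pool fans each search term out to all six APIs at once and handles each response as soon as it arrives. `query_api`, the API names, and the inline CSV data are all hypothetical placeholders.

```python
import concurrent.futures
import csv
import io

# Hypothetical stand-in for one of the six official APIs;
# a real implementation would make an HTTP request here.
def query_api(api_name, term):
    return {"api": api_name, "term": term, "data": f"result for {term}"}

# Placeholder names for the six APIs.
APIS = ["api1", "api2", "api3", "api4", "api5", "api6"]

def process_term(term, executor):
    # Fan the search term out to all six APIs in parallel and
    # yield each response as soon as it arrives (0.2-10 s each).
    futures = [executor.submit(query_api, api, term) for api in APIS]
    for future in concurrent.futures.as_completed(futures):
        yield future.result()  # write this to the database instead

# Simulated .csv input; in production this would be an open file.
csv_data = io.StringIO("first term\nsecond term\n")

with concurrent.futures.ThreadPoolExecutor(max_workers=6) as executor:
    for row in csv.reader(csv_data):
        for response in process_term(row[0], executor):
            print(response["api"], response["data"])
```

Because the six requests per term run concurrently, each term costs roughly the time of the slowest API rather than the sum of all six.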
I hope you understand what I mean. So my question is:
At the moment I trigger it with a cron job every minute. Is it possible to do it without any delay between the API requests and still get reliable, consistent results?
Or would one process end up being cut off before it has finished, because the next one wants to start, and so on?
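One common way to keep cron runs from trampling each other is a lock file: a new run that finds a previous run still working simply exits and waits for the next minute. A minimal sketch (the lock path is a hypothetical example, and `fcntl.flock` assumes a POSIX system, which cron implies):

```python
import fcntl
import sys

LOCK_PATH = "/tmp/import_job.lock"  # hypothetical path

def acquire_lock(path):
    # Try to take an exclusive, non-blocking lock on the file.
    lock_file = open(path, "w")
    try:
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        lock_file.close()
        return None  # another run still holds the lock
    return lock_file  # keep this handle open for the job's lifetime

lock = acquire_lock(LOCK_PATH)
if lock is None:
    sys.exit(0)  # previous run still busy; cron retries next minute
# ... do the CSV / API / database work here ...
```

The lock is released automatically when the process exits, so a crashed run never blocks future runs.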
Thank you in advance for your thoughts on this.
Best regards, AceLine
PS: I don't have enough experience to understand what happens with threads at this speed...