I'm working on a task that involves sending 6 sets of 8 requests per user, for about 2000 users. They're all GET requests, used to send commands.
To speed up the sending, I've constructed 4 cURL multi-handles, each holding 8 requests, firing them off one after another before moving on to the next user. The slight problem is that it eats 99% of my CPU while using only about 5 KB/s of my bandwidth. There are no leaks or anything, but sending all 96,000 requests lags big time, taking a good 3 hours on my dual-core AMD Phenom.
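For reference, my sending loop looks roughly like this (a simplified sketch; `$urls`, the function name, and the URL list are placeholders, not my actual code):

```php
<?php
// Hypothetical batch sender: one multi-handle driving 8 GET requests.
function send_batch(array $urls)
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive the transfers. curl_multi_select() blocks until a socket
    // has activity, which keeps this loop from spinning the CPU;
    // without it, the do/while busy-waits at ~100% CPU.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status === CURLM_OK);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}
```

If the real loop calls curl_multi_exec() in a tight do/while without curl_multi_select() in between, that alone would explain the CPU usage.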
Are there any methods I could possibly use to speed this up? Using file_get_contents() instead of cURL ends up being 50% slower, but cURL uses only 5 KB/s and maxes out my CPU.