dsifjgogw48491752 2010-11-24 23:06
Viewed 81 times

Reduce cURL CPU usage

I'm working on a task that involves sending 6 sets of 8 requests each per user, for a total of about 2000 users. They are all GET requests, used to send commands.

To speed up the sending, I've constructed 4 curl multi-handles, each holding 8 requests, firing them off one after the other and then continuing on to the next user. The slight problem is that it eats 99% of my CPU while using only about 5 KB/s of my bandwidth. There are no leaks or anything, but sending all 96,000 requests lags badly, taking a good 3 hours on my dual-core AMD Phenom.

Is there any way I can speed this up? Using file_get_contents() instead of cURL ends up being 50% slower, but cURL uses only about 5 KB/s and maxes out my CPU.
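The question doesn't show the loop that drives the multi-handles, but a tight curl_multi_exec() loop without curl_multi_select() is a common cause of a pegged core: the script spins instead of sleeping until a socket is ready. Below is only a sketch of one 8-request batch using the select-based pattern; the URLs, timeouts, and options are placeholders.

    <?php
    // Sketch only: one batch of 8 "command" URLs on a single multi-handle,
    // driven with curl_multi_select() so the loop sleeps until a socket is
    // ready instead of spinning the CPU.
    $urls = array(/* 8 command URLs for one user */);

    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    $running = null;
    do {
        // Drain the work libcurl can do right now.
        while (curl_multi_exec($mh, $running) === CURLM_CALL_MULTI_PERFORM) {
            // keep calling until no more immediate work is pending
        }
        // Then block (up to 1 s) waiting for socket activity instead of
        // busy-looping; this is what keeps CPU usage low.
        if ($running > 0 && curl_multi_select($mh, 1.0) === -1) {
            usleep(100000); // select failed or is unsupported: brief back-off
        }
    } while ($running > 0);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

If the existing code already waits on curl_multi_select(), the bottleneck is more likely the per-request overhead (DNS, connect, server processing) than the loop itself.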


1 answer

  • douzai6337 2010-11-25 01:22

    Have you tried using fopen() for your requests instead of cURL? Keep in mind the load could also be on whichever server you are sending the requests to, and a request won't return until that web server finishes handling it. Do you need the response data to present to the user? If not, can you run the requests in the background? The real question is why you are sending so many requests in the first place; it would be far better to consolidate them into fewer requests. There are a lot of variables in this setup that can affect the speed.
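    If the responses really aren't needed, one way to act on the background idea above is a plain stream socket that writes the GET request and skips the body. This is only a sketch under that assumption; send_command(), example.com, and the path are made-up examples.

        <?php
        // Sketch only, assuming the response body is never used: open a socket,
        // write the GET request, read just the status line, and move on.
        function send_command($host, $path, $port = 80)
        {
            $fp = @fsockopen($host, $port, $errno, $errstr, 5); // 5 s connect timeout
            if ($fp === false) {
                error_log("command to $host$path failed: $errstr ($errno)");
                return;
            }
            $request = "GET $path HTTP/1.1\r\n"
                     . "Host: $host\r\n"
                     . "Connection: Close\r\n\r\n";
            fwrite($fp, $request);
            // Reading the status line gives the server a chance to finish the
            // request before the connection is dropped, while still skipping
            // the (unneeded) response body.
            fgets($fp);
            fclose($fp);
        }

        send_command('example.com', '/command?user=123&action=ping');

    Consolidating commands server-side (one request carrying many commands) would still help far more than any client-side change, since it removes most of the 96,000 round trips entirely.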


