duanbi9202 2010-06-23 14:49
101 views
Accepted

PHP: capacity question about firing many cURL requests in parallel

I have a php script that comes down to the following:

while ($arr = mysql_fetch_array($result))
{
   $url = $arr['url'];
   // Escape the URL before interpolating it into a shell command,
   // otherwise a crafted URL allows shell injection.
   exec("curl " . escapeshellarg($url) . " > /dev/null 2>&1 &");
}

$url will represent a remote script.

My question is: what can I expect if I try to cycle through 2,000 URLs this way?

Will opening that many cURL connections cripple my server? Could I fire all 2,000 in less than one minute?

What I am trying to do is spare my users from having to set up cron jobs, by opening the connections and running their remote scripts for them.

Can you guys advise? I'm out of my league today.

Hudson


2 answers

  • dongqindan4406 2010-06-23 14:52

    Take a look at curl_multi_init. It does not spawn separate processes, so it should be much easier on your server.

    I would advise you to fire only 3-15 requests at a time, depending on the load the server can handle and the complexity of the scripts. Firing 2,000 at once will probably make you run out of file descriptors or hit some other limit.
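    The batching the answer describes can be sketched with curl_multi roughly as follows. This is a minimal illustration, not the answerer's code: the `fetchAll` function name, the `$urls` array (standing in for the rows pulled from MySQL in the question), and the batch size of 10 are all assumptions for the example.

```php
<?php
// Sketch: fetch URLs concurrently with curl_multi, limiting concurrency
// to $batchSize transfers at a time instead of 2,000 shell processes.
function fetchAll(array $urls, int $batchSize = 10): void
{
    $mh = curl_multi_init();

    foreach (array_chunk($urls, $batchSize) as $batch) {
        $handles = [];
        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
            curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't let one slow script hang the batch
            curl_multi_add_handle($mh, $ch);
            $handles[] = $ch;
        }

        // Drive the batch until every transfer in it finishes.
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh); // wait for socket activity instead of busy-looping
            }
        } while ($active && $status === CURLM_OK);

        foreach ($handles as $ch) {
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
    }

    curl_multi_close($mh);
}
```

    Because each batch runs to completion before the next starts, at most $batchSize sockets and file descriptors are open at any moment, which is what keeps the 2,000-URL run from exhausting server limits.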

    This answer was selected as the best answer by the asker.
