duanlei7101 2011-02-08 20:17
Viewed 34 times
Answer accepted

PHP: Faster cURL execution

I have an application that uses cURL to grab the contents of several websites. I'd like to optimize this somehow. Would it be possible to implement a singleton design pattern and feed cURL the URLs I need content for at certain intervals, so that I only instantiate it once?

Right now, I set up and destroy connections for each call. Sample code would be highly appreciated.
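
To illustrate, something along these lines is what I have in mind: one handle created up front and reused for every request. The URLs and the fetch() helper below are placeholders, not my real code, just a rough sketch:

    <?php
    // Create the handle once and keep it around instead of
    // curl_init()/curl_close() on every call.
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

    // Placeholder helper: point the shared handle at a new URL and fetch it.
    function fetch($ch, $url) {
        curl_setopt($ch, CURLOPT_URL, $url);
        return curl_exec($ch);
    }

    $pages = array();
    foreach (array('http://example.com/a', 'http://example.com/b') as $url) {
        $pages[$url] = fetch($ch, $url);
    }

    curl_close($ch);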

Thanks.


3 Answers

  • dongshao5573 2011-02-08 20:26

    This sounds like unnecessary micro-optimization to me. You'll save a fraction of a microsecond for a process that has to go across the internet to grab a hunk of data from a resource that's already out of your control. If you're simply trying to get the process to run faster, maybe try running multiple downloads in parallel.
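
    For example, a rough sketch of running the downloads in parallel with curl_multi. The URLs are placeholders; adjust options to your case:

        <?php
        // Placeholder list of URLs to fetch concurrently.
        $urls = array('http://example.com/a', 'http://example.com/b', 'http://example.com/c');

        $mh = curl_multi_init();
        $handles = array();
        foreach ($urls as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }

        // Drive every transfer until all of them finish.
        do {
            $status = curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh); // wait for activity instead of busy-looping
            }
        } while ($running && $status == CURLM_OK);

        $results = array();
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);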

    Edit: And/or make sure your curl supports compressed content.
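
    For the compression part, one option on the handle is usually enough; passing an empty string lets libcurl advertise whatever encodings it was built with and decode the response transparently (sketch, placeholder URL):

        <?php
        $ch = curl_init('http://example.com/large-page');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        // Empty string = send Accept-Encoding with all encodings this build of
        // libcurl supports (e.g. gzip, deflate) and decompress automatically.
        curl_setopt($ch, CURLOPT_ENCODING, '');
        $body = curl_exec($ch);
        curl_close($ch);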

    This answer was accepted by the asker.
