dongzhuo1880 2013-01-29 18:17
Viewed 88 times

Long-running PHP script that loads a lot of data

I have a PHP script that will be run on a production web server only once. The script loads a large number of db records and then iterates over them, json_decoding a field and checking it. Some fields will be amended and saved back to the database.

The script will consume a lot of memory and will potentially run for an hour or more.

What should I keep in mind with this type of script to make sure it doesn't bring the site down, and what other potential hazards do scripts like this carry?

The basic flow of the script is as follows (a rough sketch is given after the list):

1. load all records from db table
2. iterate over each row
  2.1 json_decode specific field
  2.2 if a matching field is found then
      2.2.1 Change the value 
      2.2.2 Save back to the database
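
A minimal sketch of that loop, assuming PDO/MySQL; the table and column names (my_table, payload, some_field) and the matching rule are placeholders for the real schema:

    <?php
    // Placeholder names throughout -- adjust to the real schema.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $update = $pdo->prepare('UPDATE my_table SET payload = :payload WHERE id = :id');

    // Fetch rows one at a time rather than loading the whole table into an array.
    foreach ($pdo->query('SELECT id, payload FROM my_table') as $row) {
        $data = json_decode($row['payload'], true);      // 2.1 decode the field
        if ($data === null) {
            continue;                                    // skip rows with invalid JSON
        }
        if (isset($data['some_field'])) {                // 2.2 matching field found
            $data['some_field'] = 'new value';           // 2.2.1 change the value
            $update->execute([                           // 2.2.2 save it back
                ':payload' => json_encode($data),
                ':id'      => $row['id'],
            ]);
        }
    }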

3 Answers

  • dongzhuzhou4504 2013-01-29 18:22

    If you are dealing with a large amount of data and some heavy processing, and your main concern is to keep the server available for other tasks (such as serving the website), you might want to consider splitting the work into smaller chunks and having a cron job process one chunk periodically (see the sketch below).
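
    A minimal sketch of that chunked approach, assuming an auto-increment id column on the same placeholder table (my_table / payload) and a small state file that remembers where the previous run stopped:

        <?php
        // Process one cron-sized chunk per run; all names are placeholders.
        $batchSize = 1000;
        $stateFile = __DIR__ . '/last_id.txt';
        $lastId    = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $stmt = $pdo->prepare(
            'SELECT id, payload FROM my_table WHERE id > :last ORDER BY id LIMIT ' . $batchSize
        );
        $stmt->execute([':last' => $lastId]);

        foreach ($stmt as $row) {
            // ... same json_decode / check / update logic as in the question ...
            $lastId = $row['id'];
        }

        file_put_contents($stateFile, (string) $lastId);   // the next cron run continues from here

    A crontab entry such as */5 * * * * php /path/to/process_chunk.php (the path is illustrative) would then work through the table one chunk at a time.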

    Other than that, you should definitely take a look at the set_time_limit() function. With it you can ensure that your script will not time out, by passing a value of zero:

    set_time_limit(0);
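
    Note that when run from the command line, PHP's max_execution_time already defaults to 0, so this mainly matters if the script is triggered through the web server. Since the question also mentions memory, the related memory_limit setting can be raised the same way; the exact value below is only an illustration:

        set_time_limit(0);                    // no execution time limit
        ini_set('memory_limit', '1024M');     // or '-1' to remove the memory cap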
    