dsjpqpdm620596 2012-12-19 01:19
Views: 18
Accepted

PHP: Dealing with thousands of entries, script dies after a certain amount

I'm calling an MLS service that responds with 4000+ records ... and I need to process each and every one of them, as well as insert all of the meta data per listing.

I'm able to get to about 135 (* 150 meta records) and then the script apparently stops responding, or at least stops processing the rest of the data.

I've added the following to my .htaccess file:

php_value memory_limit 128M

But this doesn't seem to help. Do I need to process the data in chunks, or is there another way to make sure the script actually finishes?
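If chunking is the way to go, a minimal sketch might look like the following, where getListings(), saveListing() and saveMeta() are hypothetical stand-ins for the real MLS client and database layer:

    <?php
    // Chunked-processing sketch. getListings(), saveListing() and saveMeta()
    // are hypothetical placeholders for the real MLS client and DB layer.
    $listings = getListings();               // 4000+ records from the MLS feed

    foreach (array_chunk($listings, 100) as $chunk) {
        foreach ($chunk as $listing) {
            saveListing($listing);                    // one row per listing
            foreach ($listing['meta'] as $meta) {     // ~150 meta records each
                saveMeta($listing['id'], $meta);
            }
        }
        set_time_limit(30);      // restart the execution-time counter per chunk
        gc_collect_cycles();     // collect any cyclic garbage between chunks
    }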


2 answers

  • douzangdang2225 2012-12-19 01:22

    It isn't the memory; it's most likely the script execution time.

    Try adding this to your .htaccess, then restart Apache:

    php_value max_execution_time 259200
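    If you can't rely on .htaccess overrides on your host, a rough equivalent is to raise the limits at runtime at the top of the import script:

        <?php
        // Runtime equivalents of the .htaccess directives above.
        ini_set('memory_limit', '128M');
        set_time_limit(0);   // 0 = no execution-time limit for this request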
    
    This answer was accepted by the asker.