drl47263 2011-02-14 09:59

PHP file_get_contents stops executing on a huge HTML page

I'm using the file_get_contents function to fetch and parse the HTML of a huge page (about 32,000 lines). It works for small and normal-sized pages, but with this one it just stops.

It doesn't give any error. It doesn't reach the next line in my PHP script; it just stops. I tried increasing the timeout and using cURL instead, but it still doesn't work.

It works on my local XAMPP installation, but not when I upload it to my hosting. Does anyone know which setting on my PHP host might be misconfigured? I suspect it's some kind of buffer issue.


1 answer

  • dqf2015 2011-02-14 10:02

It is possible that this is caused by the memory_limit directive: file_get_contents reads the entire response into memory, so a very large page can exceed the limit and kill the script without visible output.

The PHP manual has additional information regarding the directive.

To determine the cause, prepend code such as the following to your script (leave out the log statements if you prefer to display errors in the browser instead):

    ini_set('display_errors', 1);                               // show errors in the page output
    ini_set('log_errors', 1);                                   // also write errors to a log file
    ini_set('error_log', dirname(__FILE__) . '/error_log.txt'); // log file next to this script
    error_reporting(E_ALL);                                     // report every error level
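
    If the log then shows a memory exhaustion error, a quick check of the configured limit against the script's peak usage can confirm the diagnosis. Below is a minimal sketch under that assumption; the URL and the 128M value are arbitrary examples, and raising the limit may be overridden or forbidden on shared hosting:

    <?php
    // What the host currently allows.
    echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

    // As a stopgap, you can try raising the limit before the fetch
    // (hosts may silently ignore this):
    // ini_set('memory_limit', '128M');

    $html = file_get_contents('http://example.com/huge-page.html'); // hypothetical URL

    // How much memory the script actually needed at its peak.
    echo 'peak usage: ' . memory_get_peak_usage(true) . " bytes\n";

    If peak usage is close to or above the limit, the large page is the culprit, which would also explain why it works on local XAMPP (which often ships with a higher memory_limit) but not on the host.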
    

