I have a script that grabs content from third-party sites. When a URL is not found, the site responds with a 302 redirect to a custom "not found" page instead of returning a 404. The script also caches the content returned by curl_exec(), but I don't want to cache those error pages. Is there a way to log the redirects when CURLOPT_FOLLOWLOCATION is enabled? How can I solve this? I know I could run the response through a DOM parser and discard the page if the error message is found, but I'd like to know if there are other ways to accomplish this.
3 Answers
- dpoxk64080 · 2013-05-17 12:50
I ended up disabling CURLOPT_FOLLOWLOCATION, so I just catch the 302 status code and, if it's present, skip caching the page. I thought there would be a way to intercept the status codes before curl follows the redirect.
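A minimal sketch of the approach described above: turn off automatic redirect following, inspect the HTTP status code yourself, and only cache successful responses. The function name `fetchAndMaybeCache()` and the `saveToCache()` helper are hypothetical; CURLINFO_REDIRECT_URL requires PHP 5.3.7+.

```php
<?php
// Disable automatic redirects so the 302 can be caught before caching.
function fetchAndMaybeCache($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // handle redirects ourselves

    $body     = curl_exec($ch);
    $code     = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $location = curl_getinfo($ch, CURLINFO_REDIRECT_URL); // redirect target, if any
    curl_close($ch);

    if ($code >= 300 && $code < 400) {
        // The site redirected (e.g. 302 to its custom "not found" page):
        // log it and skip the cache.
        error_log("Redirect ($code) for $url -> $location");
        return false;
    }

    if ($code === 200) {
        saveToCache($url, $body); // hypothetical cache helper
        return $body;
    }

    return false; // other errors: don't cache
}
```

If you do need to follow legitimate redirects, you can instead check CURLINFO_REDIRECT_COUNT after curl_exec() with CURLOPT_FOLLOWLOCATION left on, and treat any redirected response as uncacheable.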
Accepted by the asker as the best answer.