dseax40600 2018-04-10 14:54
Viewed 85 times

Laravel and Goutte scraper [closed]

I am going to scrape 4 websites using Laravel and Goutte.

There are 900 URLs in total, and I have no idea how to send them to the crawler.

(I have already written the crawler code and have no questions about that part.)

But how should I send the URLs? Should I use a queue, a cron job, or something else?

Do you know of any package, tool, or approach for this? I need to send all 900 URLs 5 times a day.
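
To make the question concrete, here is a rough sketch of what I imagine the queue-plus-scheduler route would look like. The command name scrape:dispatch, the ScrapeUrl job, and the urls.csv path are placeholders, not code I have written:

    <?php
    // --- app/Console/Kernel.php (excerpt) ---
    // Run the dispatch command 5 times a day. The scheduler itself only needs
    // the single standard cron entry:  * * * * * php artisan schedule:run
    protected function schedule(Schedule $schedule)
    {
        $schedule->command('scrape:dispatch')
                 ->cron('0 0,5,10,15,20 * * *'); // 5 runs per day
    }

    // --- app/Console/Commands/ScrapeDispatch.php (sketch) ---
    namespace App\Console\Commands;

    use App\Jobs\ScrapeUrl;
    use Illuminate\Console\Command;

    class ScrapeDispatch extends Command
    {
        protected $signature = 'scrape:dispatch';

        public function handle()
        {
            // 900 URLs become 900 small queued jobs; a queue worker processes them.
            foreach (file(storage_path('app/urls.csv'), FILE_IGNORE_NEW_LINES) as $url) {
                ScrapeUrl::dispatch(trim($url));
            }
        }
    }

    // --- app/Jobs/ScrapeUrl.php (sketch) ---
    namespace App\Jobs;

    use Goutte\Client;
    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;

    class ScrapeUrl implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable;

        public $url;

        public function __construct($url)
        {
            $this->url = $url;
        }

        public function handle()
        {
            $crawler = (new Client())->request('GET', $this->url);
            // ... the existing scraping logic goes here ...
        }
    }

A plain cron job that loops over all 900 URLs in one process would presumably also work, but I assume the queue version lets each URL fail or retry on its own and keeps one slow site from blocking the rest. Is that the right way to think about it?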


1 answer

  • dongrong3171 2018-04-10 15:03

    If you have already written the crawl code for the websites, you can separate the links and store them in a CSV file. Then write another script that reads an exact number of those URLs from the CSV file and sends them back to the crawler. This is very easy in Ruby with its built-in CSV library.
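
    The same batching idea, kept in PHP to match the question's Laravel stack rather than Ruby, could look roughly like this; the file path, offset, and batch size below are made-up examples:

    <?php
    // Hypothetical sketch: read an exact number of URLs from a CSV file,
    // starting at a given offset, and hand them to the crawler.
    function readUrlBatch(string $path, int $offset, int $count): array
    {
        $urls = [];
        $row  = 0;

        if (($handle = fopen($path, 'r')) === false) {
            return $urls;                   // file missing or unreadable
        }

        while (($data = fgetcsv($handle)) !== false) {
            if ($row >= $offset && count($urls) < $count) {
                $urls[] = $data[0];         // assume the URL is in the first column
            }
            $row++;
        }
        fclose($handle);

        return $urls;
    }

    // Example: take 100 URLs starting at row 300 and feed them to the crawler.
    foreach (readUrlBatch('storage/app/urls.csv', 300, 100) as $url) {
        // call the existing Goutte crawler, or dispatch a queued job, per URL
    }

    Storing the offset between runs (for example in a small file or a database column) would let consecutive cron runs walk through the whole list in batches.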

