duanjiaonie6097 2013-05-10 21:48
53 views
Accepted

How can I use wget to count the number of pages on a website?

We have prospective clients asking us to re-design their websites. Often when I ask, "How many pages does your website have?" the answer I get too often is, "I don't know." It would be simple to count if there were 10 pages on the website, but these are larger websites with perhaps hundreds of pages.

Is there a way to count all the pages on a website without doing it manually? I know wget can be used to download pages from a website, but I don't want to download all of their pages; besides, that would just give me a collection of files, not a page count.

Is wget the solution to this? If so, how could it be used to count the pages of a website? If not wget, is there another solution that would work? Remember, I don't have internal access to their website to do the count; it has to be done from the web. Or does counting the internal links equate to counting pages?


1 answer

  • doujiabing1228 2013-05-10 21:55

    You can easily count web pages if the website is static or small.

    But if the website is very large, like Stack Overflow, then you can use Google's index.

    Just go to Google and search: site:stackoverflow.com

    It returns the number of pages: About 17,000,000 results

    You can put site: before any website's domain and Google will show the total number of pages it has indexed for that site.
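    For the wget route the question asks about, a rough count is possible without keeping any files: wget's --spider mode crawls the site but downloads nothing, and each fetched URL appears in the log on a line beginning with "--" followed by a timestamp and the URL. A minimal sketch, assuming example.com stands in for the client's domain and the site permits crawling:

    ```shell
    # Spider the site recursively (-r, unlimited depth -l inf) without saving
    # files, then count the unique URLs wget attempted.
    # example.com is a placeholder for the client's actual domain.
    wget --spider -r -l inf --no-parent http://example.com 2>&1 \
      | grep '^--' \
      | awk '{print $3}' \
      | sort -u \
      | wc -l
    ```

    Note this counts every requested URL, including images and stylesheets unless you restrict file types (e.g. with --reject), so treat it as an upper bound; it also misses pages that nothing links to. Google's site: count, likewise, is only an estimate of indexed pages.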

    This answer was accepted by the asker.