duanjiaonie6097 2013-05-11 05:48
53 views
Accepted

How can I use wget to count the number of pages on a website?

Prospective clients often ask us to re-design their websites. When I ask, "How many pages does your website have?", the answer I too often get is "I don't know". It would be simple to count if there were only 10 pages, but these are larger websites with perhaps hundreds of pages.

Is there a way to count all the pages on a website without doing it manually? I know wget can be used to download pages from a website, but I don't want to download all their pages; besides, that would just give me a collection of files, not a page count.

Is wget the solution to this? If so, how could it be used to count the pages of a website? If not wget, is there another solution that would work? Remember, I don't have internal access to their websites to do the count; it has to be done from the web. Or does counting the internal links equate to counting the pages?
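
For what it's worth, here is roughly what I had in mind with wget: its spider mode follows links without saving anything to disk, and the crawl log can then be counted. This is only a sketch (example.com stands in for the client's domain, and the exact log format differs between wget versions), so the number it produces is an approximation:

    # Crawl the whole site in spider mode: links are followed but nothing is saved to disk.
    # example.com is a placeholder for the client's domain.
    wget --spider --recursive --level=inf --no-verbose --output-file=crawl.log http://example.com/

    # Count the distinct URLs that appear in the crawl log.
    # The log format varies slightly between wget versions, so treat the result as approximate.
    grep -oE 'URL: ?[^ ]+' crawl.log | sort -u | wc -l

Depending on how the site links to them, the count may also include assets such as images or PDFs; filtering the log for .html URLs, or restricting the crawl with wget's --accept/--reject options, would narrow it down.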


1 answer

  • doujiabing1228 2013-05-11 05:55

    You can easily count the pages by hand if the website is static or small.

    But if the website is very large, like Stack Overflow, you can use Google's index instead.

    Just go to Google and search: site:stackoverflow.com

    It returns the number of pages: About 17,000,000 results

    You can put site: before any website's domain and Google will show the total number of pages it has indexed for that site. Keep in mind that this figure is an estimate and only covers pages Google has crawled.

    This answer was accepted as the best answer by the asker.
