2020-12-09 15:10

Crawler usability

Description: I was testing out the new crawler. Unfortunately, it did not work at all (the log reports "Couldn't resolve host name" for http://foo.local/..... but that's another story).

More interesting is the fact that although an error occurred, the UI does not report any problem at all. Everything is green and says "0 URL(s) erfolgreich indexiert. 0 fehlgeschlagen." ("0 URL(s) successfully indexed. 0 failed.").

If a user has a big installation with many pages and multiple roots, this could be a real problem.

As long as they do not open the debug log (which, IMHO, should only be available in debug mode), they will never know that something might not be working at all. I've seen there's an option to search explicitly for broken links, but since you already have the information that something went wrong, why not pass it on to the user?
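To illustrate the suggestion: the crawler already knows, per URL, whether indexing succeeded, so the summary shown to the user could be built from those results directly instead of always reporting zero failures. A minimal sketch (not Contao's actual API; the `CrawlResult` type and `summarize` function are hypothetical):

```python
# Hypothetical sketch: aggregate per-URL crawl results so that failures
# reach the user instead of a green "0 failed" message.
from dataclasses import dataclass

@dataclass
class CrawlResult:
    url: str
    ok: bool
    error: str = ""

def summarize(results: list) -> str:
    failed = [r for r in results if not r.ok]
    ok_count = len(results) - len(failed)
    lines = ["%d URL(s) indexed successfully. %d failed." % (ok_count, len(failed))]
    # Surface the concrete errors the crawler already wrote to its log.
    for r in failed:
        lines.append("  %s: %s" % (r.url, r.error))
    return "\n".join(lines)
```

With one unresolvable host, this would report "1 failed" together with the logged error message, rather than an all-green summary.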


On further testing, I noticed that crawling for broken links behaves quite strangely. My demo installation only has a few pages (13 + News), but according to the numbers and logs, Contao crawls more than 800 (!) external pages in this case. And again, the messages presented to the user are not reasonable, since the numbers do not match up at all.
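What I would expect instead (a sketch of assumed behavior, not Contao's implementation): only pages on the site's own host are crawled recursively, while external links are checked once and never followed, so 13 internal pages cannot balloon into 800+ crawled URLs. The `plan_crawl` helper and its `links` map below are hypothetical:

```python
# Hypothetical sketch: recurse only into internal pages; queue external
# links for a single status check without crawling the external sites.
from urllib.parse import urlparse

def plan_crawl(site_host, links):
    """links maps a page URL to the list of URLs found on that page."""
    to_index, to_check, seen = set(), set(), set()
    queue = [u for u in links if urlparse(u).netloc == site_host]
    while queue:
        page = queue.pop()
        if page in seen:
            continue
        seen.add(page)
        to_index.add(page)
        for link in links.get(page, []):
            if urlparse(link).netloc == site_host:
                queue.append(link)       # internal: follow and index
            else:
                to_check.add(link)       # external: check once, never recurse
    return to_index, to_check
```

Under this scheme, the reported numbers would match the site's size: indexed pages are exactly the internal ones, and external URLs appear only in the link-check count.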

[Screenshot: crawler-2020-02-12 132144]

