weixin_33728268 2013-12-05 23:27

AJAX requests and robots.txt

A website has a URL http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME. When I click the URL, it executes the ajax function.

When I put the URL in the address bar, it gives a redirect error because the URL doesn't actually take you anywhere, but it definitely still executes the ajax function.

When I use the bash command firefox -new-window http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME, it opens an empty page except for the line "Bad user...". After some digging I found that the robots.txt file has "Disallow: /wp-admin/". I am assuming this is why it isn't working from the command line. I have used wget -e robots=off URL before, but there isn't anything to download here, so it doesn't seem to apply.
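
For reference, these are the commands in question (URL quoted to be safe, since ? is a glob character in bash); the wget line is only a sketch of how I'd dump the raw response, not something I've verified behaves any differently:

# what I actually run; it opens a page that only says "Bad user..."
firefox -new-window "http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME"

# the same fetch with wget, writing whatever the server returns to stdout
wget -O - "http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME"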

What type of URL is this? (I believe it's a dynamic or query-string URL, but I'm not sure.)

I want to get the same result from the command line as when I plug the URL into the address bar. Ideas?


1 answer

  • weixin_33696822 2013-12-05 23:30

    It's nothing special; the server just returns that HTML no matter what you send. HTTP servers don't have to serve files from disk; the response can be generated by code, which could be written in C++, Java, Python, or Node.js (probably not Node.js in this case).
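
    If the goal is to get the same result from the shell as from the address bar, the thing to reproduce is the request the browser sends, not robots.txt: robots.txt is purely advisory, and curl never reads it (wget only consults it for recursive downloads), so the "Disallow: /wp-admin/" line is not what's blocking you. Below is a rough sketch with curl; the User-Agent string and cookie name/value are placeholders, not taken from your site, so copy the real values from your browser's network inspector if the server checks them:

        # -v prints the request/response headers, handy for spotting redirects
        # -A sends a browser-like User-Agent (placeholder value; some servers treat non-browser agents differently)
        # -b sends cookies (placeholder; only needed if the action requires a logged-in session)
        curl -v \
             -A "Mozilla/5.0 (X11; Linux x86_64; rv:25.0) Gecko/20100101 Firefox/25.0" \
             -b "wordpress_logged_in_XXXX=PASTE_COOKIE_VALUE_HERE" \
             "http://example.com/wp-admin/admin-ajax.php?action=FUNCTIOn_NAME"

    WordPress registers AJAX handlers separately for logged-in (wp_ajax_*) and anonymous (wp_ajax_nopriv_*) requests, so whether the cookie matters depends on how the handler for FUNCTIOn_NAME was registered.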
