
YQL query getting rejected because of robots.txt?

I have a JavaScript plugin I have written to give to clients so they can embed my site's content into their site. It uses YQL to get around the cross-domain restrictions on JavaScript calls. The problem is that it only works some of the time; the rest of the time the call fails with a robots.txt denial. The call itself never changes, so I am having a hard time understanding why this happens roughly half the time. I control both the JavaScript call and the server-side code, so my question is: what do I have to do to make these calls work 100% of the time? I contacted support for my hosting and they said my web server does not have a robots.txt file on it, so it should not be denying the call. Has anyone had this issue before, and how did you solve it?
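
For context, a minimal sketch of the kind of YQL/JSONP call being described, assuming the public YQL endpoint and its html table; the content URL and callback name are placeholders rather than the actual plugin code:

    // Minimal sketch, assuming the public YQL endpoint and its html table.
    // The content URL and callback name are placeholders, not the real plugin.
    (function () {
      var contentUrl = 'http://example.com/widget-content.html'; // placeholder

      // select * from html fetches the remote page through YQL, which acts
      // as a proxy so the browser's same-origin policy is not an issue.
      var query = 'select * from html where url="' + contentUrl + '"';
      var yqlUrl = 'https://query.yahooapis.com/v1/public/yql' +
                   '?q=' + encodeURIComponent(query) +
                   '&format=json&diagnostics=true' +
                   '&callback=handleYqlResponse';

      // JSONP callback: YQL invokes this with the query result.
      window.handleYqlResponse = function (data) {
        if (data && data.query && data.query.results) {
          // data.query.results holds the parsed markup of the remote page;
          // the real plugin would walk it and inject it into the client page.
          console.log('Fetched content:', data.query.results);
        } else {
          // When YQL refuses to fetch the page (e.g. a robots.txt denial),
          // results is null; the diagnostics section usually shows why.
          console.log('YQL call failed:',
                      data && data.query && data.query.diagnostics);
        }
      };

      // Fire the JSONP request by injecting a script tag.
      var script = document.createElement('script');
      script.src = yqlUrl;
      document.getElementsByTagName('head')[0].appendChild(script);
    }());

Requesting diagnostics=true makes it easier to see whether YQL itself is reporting the robots.txt denial on a given call.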

Any help would be appreciated

Thanks,

Bryan

2 Replies
  • I am also having this issue. My YQL queries using my own open data tables were working fine, then suddenly I started getting redirected to a robots.txt denial for a file that does not exist. What is going on? It is making my app very unreliable, and it also appears to be blocking previously cached content from setting response.maxAge. Any help greatly appreciated!
  • Ashley, I addressed your issue in your other post. Thanks -Paul, YQL Team
