Infuriating, intermittent robots.txt error is back!
I've written a custom function in a Google Docs Spreadsheet that takes a URL from a cell and outputs the title tag of that page. It works by calling Yahoo's YQL Public API and querying the page's HTML with an XPath expression for the title tag.
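For context, the kind of query I'm building looks roughly like the sketch below. The function and variable names are just illustrative, not my actual code; in the real Apps Script custom function the built URL gets passed to `UrlFetchApp.fetch()`, which I've left out here since this only shows how the YQL REST request for a page's title is assembled.

```javascript
// Illustrative sketch: build a YQL public-API request that pulls the
// <title> node from a page's HTML via the "html" table and an XPath.
function buildYqlTitleQuery(pageUrl) {
  // YQL statement selecting the //title node of the target page
  var yql = 'select * from html where url="' + pageUrl +
            '" and xpath="//title"';
  // Public YQL endpoint; the statement goes in the q= parameter
  return 'https://query.yahooapis.com/v1/public/yql' +
         '?q=' + encodeURIComponent(yql) +
         '&format=xml';
}

// Example: in the spreadsheet function this string would then be
// fetched with UrlFetchApp.fetch(q) and the XML response parsed.
var q = buildYqlTitleQuery('http://example.com/');
```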
This has worked really well in the past. However, I recently started receiving errors, so I went to the YQL Console and tried the call I am making in the spreadsheet with diagnostics enabled.
It turns out that YQL is reporting that the URL I'm querying is restricted by robots.txt. That can't be right: I can confirm the site has no robots.txt file at all. Stranger still, if I rerun the same query in the console an hour later, it works.
I've searched around, and it seems this error has dogged YQL for years. Has anyone else experienced this robots.txt error recently?