Now it's working again, but there is another problem. My timeout is reasonably high, yet every third or fourth request returns no results. Tested with curl against a local server using the same timeout range: every single request succeeds with the same timeout and query set. I know YQL is simple to use, but its reliability is poor.
Same here: queries work fine on some occasions, while the same queries return empty results at other times. It's not an IP-based limit, as I tried multiple IPs. I'm querying against Google Maps. The YQL console works fine.
My guess is that Google Maps rate limits are being reached, and Google stops answering YQL requests.
More than that, the link to the list of trusted YQL servers (http://developer.yahoo.com/yql/proxy.txt) doesn't return any information, so there is no way a content provider can securely filter on the X-Forwarded-For header.
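For reference, if that proxy list ever came back, the filtering would look roughly like this. A minimal sketch, assuming the list of proxy IPs is available; `TRUSTED_YQL_IPS` below is a hypothetical placeholder (documentation-range addresses), since the real proxy.txt is unavailable:

```python
# Hypothetical placeholder -- the real list at
# developer.yahoo.com/yql/proxy.txt does not resolve.
TRUSTED_YQL_IPS = {"203.0.113.10", "203.0.113.11"}

def client_ip_from_xff(xff_header):
    """Return the original client IP from an X-Forwarded-For header.

    The header is 'client, proxy1, proxy2': the first entry is the
    original client, later entries are the proxies that forwarded it.
    """
    parts = [p.strip() for p in xff_header.split(",") if p.strip()]
    return parts[0] if parts else None

def is_from_trusted_proxy(remote_addr, trusted_ips=TRUSTED_YQL_IPS):
    """The only trustworthy check is the actual TCP peer address:
    the X-Forwarded-For header itself can be spoofed by anyone,
    so it should only be believed when remote_addr is a known proxy."""
    return remote_addr in trusted_ips
```

Note that the X-Forwarded-For value alone proves nothing; that's exactly why a verifiable list of proxy source IPs would be needed.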
So my guess is that the YQL proxy server's address gets blocked after a while, while the YQL console uses a different IP address and hence is not blocked.
Is there a way to check if this is indeed true?
Also, assuming there are multiple YQL proxy servers, how is the proxy server selected for each query? Is there a way to pick a specific YQL server, auto-rotate between them, or otherwise make sure the YQL service doesn't just stop working?
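Short of being able to select the server, the only workaround I know is client-side: treat an empty result as a soft failure and retry after a delay, in the hope that a different proxy answers next time. A minimal sketch; the `fetch` callable, attempt count, and delay are my own assumptions, not anything YQL provides:

```python
import time

def query_with_retry(fetch, attempts=3, delay=2.0):
    """Call fetch() until it returns a non-empty result.

    fetch: a zero-argument callable that performs one YQL request and
           returns the parsed results, or None/empty on failure.
    Returns the first non-empty result, or None after all attempts.
    """
    for i in range(attempts):
        result = fetch()
        if result:  # None, {} and [] all count as failure
            return result
        if i < attempts - 1:
            time.sleep(delay)  # give a (possibly) different proxy a chance
    return None
```

This doesn't fix the underlying blocking, but it turns "no results every third or fourth request" into an occasional extra delay.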
I ran a small set of test queries for about 30 days, 24/7, with large time gaps between queries. I keep a log of every query, its results, and whether it failed. For the first 5 days or so, 90-95% of queries returned results; after that there was a rapid drop to, I'd guess, a 10-20% success rate. But if I type the same query into the YQL console, it works 100% of the time.
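For anyone keeping a similar log, the drop-off is easy to spot by grouping success rates per day. A small sketch, assuming each log entry has been reduced to a (day, succeeded) pair; the entry format is my assumption, not the poster's actual log layout:

```python
from collections import defaultdict

def daily_success_rates(entries):
    """entries: iterable of (day, succeeded) pairs,
    e.g. ('2011-03-01', True).

    Returns {day: success_rate}, so a decline like the one described
    (90-95% early on, 10-20% later) shows up as falling values.
    """
    totals = defaultdict(lambda: [0, 0])  # day -> [successes, attempts]
    for day, ok in entries:
        totals[day][1] += 1
        if ok:
            totals[day][0] += 1
    return {day: s / n for day, (s, n) in totals.items()}
```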
I returned to iMacros, Greasemonkey, curl and other methods, which need a little more work, but with them I get full control and ~99% success. Retrieving "information of interest" at periodic intervals is what I need; I can't rely on chance.
With YQL I have no feedback on whether the query was blocked, whether the site didn't respond or doesn't exist, etc. I guess if YQL were a paid service, the quality would be higher :).