I took a look at the robots.txt. In the current version (http://seoyourblog.com/robots.txt), the Yahoo Pipes 2.0 User-agent directive sits on the same line as the preceding Disallow rule, so we are not recognizing Yahoo Pipes 2.0 as a valid user-agent:
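For illustration (the paths below are placeholders, not your actual rules), the parser sees something like the first form, when each directive needs to start on its own line as in the second:

```
# Malformed: the User-agent directive is appended to the previous Disallow line
Disallow: /private/ User-agent: Yahoo Pipes 2.0

# Well-formed: each directive on its own line, groups separated by a blank line
Disallow: /private/

User-agent: Yahoo Pipes 2.0
```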
It also looks like the Allow rule under your star (*) User-agent is conflicting with your Pipes Disallows: the "/" path is allowed for the star User-agent but disallowed for the Pipes agents, which is an ambiguous configuration. Try removing the Allow "/" under the star User-agent.
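A minimal sketch of the suggested change (again with placeholder rules, assuming the Pipes group disallows "/"):

```
# Before: ambiguous — "/" is allowed for * but disallowed for the Pipes agent
User-agent: *
Allow: /

User-agent: Yahoo Pipes 2.0
Disallow: /

# After: drop the Allow under *; anything not disallowed is crawlable by default
User-agent: *

User-agent: Yahoo Pipes 2.0
Disallow: /
```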
We cache the robots.txt for 1 hour, but if you pass in a query parameter of debug=true, we will bypass the cache and load the robots.txt on each request. That should help with cache issues during development. This section in the developers guide about debugging also applies to the robots.txt:
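For example, appending debug=true to the request you already make to the service (the base URL below is hypothetical, only the debug=true parameter comes from the note above) forces a fresh robots.txt fetch each time:

```
http://pipes.example.com/run?_id=YOUR_PIPE_ID&debug=true
```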