So I read an article earlier that caught my attention.
In summary, it says that any URL a Google+ user or searcher deems worth a +1 click is captured by Google. For public pages you want to appear in search results that's a good thing, but for URLs that have been disallowed in a robots.txt file it could potentially be a bad thing.
The robots.txt file is used to tell Google's crawlers which pages to skip, so when the Google robots visit your website to index pages they ignore everything the webmaster lists in robots.txt. However, since Google launched the +1 service the robots will include links to pages that have been "+1'd" regardless of whether they're blocked in robots.txt. They won't index the content itself, but they will index the URL, so the URL can still appear in search results.
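For anyone who hasn't set one up, a robots.txt file simply lives at the site root and lists the paths crawlers are asked to skip. A minimal sketch (the paths here are hypothetical examples, not from the article):

```
# Hypothetical robots.txt at https://example.com/robots.txt
# Asks all crawlers to skip the listed paths
User-agent: *
Disallow: /private/
Disallow: /drafts/
```

Note that this is a request to skip *crawling* those paths; as described above, it doesn't stop Google from recording that the URLs exist.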
While this isn't particularly new (Google uses every method it can to find your links), the robots.txt file usually keeps you out of the search engine results pages (the SERPs). As the original post describes, this could potentially be a bad thing, as +1'd URLs could end up in the public SERPs. The chance is small, but the possibility exists that the flaw could be exploited, and it means SEO folks will have to do more than rely on robots.txt to keep pages out of Google. Although many would argue that has always been the case! 😀
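One extra tool worth mentioning here: if the goal is to keep a URL out of the SERPs entirely, a robots meta tag asks search engines not to list the page at all, which robots.txt on its own does not guarantee. A minimal sketch (hypothetical page markup, not from the article):

```html
<!-- In the <head> of a page you want kept out of search results.
     Unlike robots.txt, which only blocks crawling of the content,
     "noindex" asks engines not to list the URL at all. -->
<meta name="robots" content="noindex">
```

The catch is that the crawler has to be able to fetch the page to see this tag, so the page must not also be blocked in robots.txt.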
On the other hand, the good news is that for public pages Google appears to be trying to help users identify interesting content themselves. So SEO has a new angle: pages that are +1'd are likely to improve their rankings in organic search results.