Experimental digging gets Google deeper into sites
An interesting post here from Google lets us know that your friendly Googlebot is now getting deeper into some sites by filling in forms and generating possible ‘real’ URLs. If these result in a crawlable web page, and that page is deemed to be of value, it is indexed.
According to Google, this should have no detrimental effect on a site; if anything, it simply gets more of your content into the index.
If you do not want your forms crawled, the Googlebot will still adhere to all robots.txt, nofollow, and noindex directives.
And Google also points out that ‘this experiment follows good Internet citizenry practices’.
Opening up the so-called ‘invisible web’ means far more pages and information are now potentially accessible to the search engines (well, to Google). This should be a good thing, and it is not worth being too paranoid. But if you do not want your forms crawled, it may be time to brush up on your robots.txt knowledge!
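As a quick refresher, here is a minimal robots.txt sketch that keeps crawlers away from form-driven pages. The directory names are hypothetical, just for illustration:

```
# Block all crawlers from a (hypothetical) directory that serves search forms
User-agent: *
Disallow: /search-forms/

# Note: a crawler obeys only the most specific group that matches it,
# so a dedicated Googlebot group must repeat any rules it should follow
User-agent: Googlebot
Disallow: /search-forms/
```

Alternatively, a page-level meta tag such as `<meta name="robots" content="noindex, nofollow">` in the page head will keep an individual results page out of the index even if it gets crawled.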