I've been seeing reports in various places over the last few days that Google is going to start indexing 'the invisible web'--web pages hidden behind HTML forms that Google's crawler normally can't reach. They intend to do this by submitting typical queries to forms that appear to be search forms, and by crawling through menu options.
This is a great idea, and the problem of how to decide what counts as a 'typical' query seems like an interesting one. Google's approach seems to be to draw on text already present on the site that contains the form, which is probably as good a method as any. I do wonder, though, whether it would prove worthwhile to have someone manually (or semi-manually, anyway) generate queries for sites known to have a lot of information behind forms, like government websites.
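To make that a bit more concrete, here's a rough sketch of what this kind of form probing might look like: pull candidate query terms from the page's own text, fill in the first text input, and pick a value for each drop-down menu. The URL and the heuristics here (most-frequent words as queries, first text input as the search box) are my own illustrative guesses, not Google's actual method.

```python
# A rough sketch of automated form probing. The heuristics are guesses:
# nothing here reflects how Google actually does it.
import re
from collections import Counter
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup


def candidate_queries(page_text, n=10):
    """Guess 'typical' queries: the most frequent longer words on the page."""
    words = re.findall(r"[a-z]{5,}", page_text.lower())
    return [word for word, _ in Counter(words).most_common(n)]


def probe_form(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    form = soup.find("form")
    if form is None:
        return

    # Treat the first named free-text input as the search box.
    text_input = next(
        (inp for inp in form.find_all("input")
         if inp.get("type", "text") in ("text", "search") and inp.get("name")),
        None,
    )
    if text_input is None:
        return

    # This assumes a GET form; a real crawler would also check
    # form.get("method") and handle POST forms separately.
    action = urljoin(page_url, form.get("action", ""))
    for query in candidate_queries(soup.get_text()):
        params = {text_input["name"]: query}
        # Fill each drop-down menu with its first option, so the form
        # is submitted with a complete, plausible set of values.
        for select in form.find_all("select"):
            option = select.find("option")
            if select.get("name") and option is not None:
                params[select["name"]] = option.get("value", option.get_text())
        result = requests.get(action, params=params, timeout=10)
        print(query, "->", result.status_code, len(result.text), "bytes")


probe_form("https://example.gov/records")  # hypothetical form page
```

Even something this naive shows why the 'typical query' problem is interesting: the quality of what gets indexed depends entirely on how well the guessed terms match what the database behind the form actually holds.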
I hope that this idea proves useful, and I look forward to better and more comprehensive search results in the future.