I just had a good idea. Why not put a few modules of code into Google (or your favorite engine) that parse the spidered pages and check them for DTD and CSS validity? People could then restrict search results to pages that truly validate as XHTML 1.0 Strict. If you combined this with normal search strings, and then made a point of only visiting or referring your own visitors to sites with valid markup, it could be a powerful way of pushing the web to clean up its act.
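The validity check itself is the easy part. Here's a minimal sketch in Python, assuming the lxml library and a local copy of the XHTML 1.0 Strict DTD (the `xhtml1-strict.dtd` path and the indexing hook at the bottom are just illustrations, not anyone's actual pipeline):

```python
from lxml import etree

def is_valid_strict_xhtml(page_bytes: bytes, dtd_path: str) -> bool:
    """Return True if the page is well-formed XML and validates
    against the XHTML 1.0 Strict DTD stored locally at dtd_path."""
    try:
        root = etree.fromstring(page_bytes)
    except etree.XMLSyntaxError:
        return False  # not even well-formed, so it can't validate
    dtd = etree.DTD(dtd_path)  # assumed local copy of xhtml1-strict.dtd
    return dtd.validate(root)

# Hypothetical hook: at index time the engine tags each page, e.g.
#   index.tag(url, valid=is_valid_strict_xhtml(body, "xhtml1-strict.dtd"))
# and the query frontend exposes that tag as a search filter.
```

Running the validator at index time rather than query time is the natural design choice here: you pay the parsing cost once per crawl instead of once per search.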
The idea goes further. Suppose you could take something like A-Prompt, Bobby, or Crunchy's Page Screamer and hook it into a search engine. The user could then enter search strings but only get back pages that meet certain minimum accessibility guidelines. This is better than self-applied badges because the engine verifies the page author's claims instead of taking them on faith.
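A crude version of such a filter is easy to imagine. Here's one checkpoint as a sketch, missing alt text on images, which is the sort of thing Bobby flags (WCAG 1.0, checkpoint 1.1); the ranking hook in the comment is hypothetical:

```python
from lxml import html

def missing_alt_text(page_bytes: bytes) -> int:
    """Count <img> elements lacking an alt attribute, one of the
    basic checkpoints accessibility tools like Bobby test for."""
    doc = html.fromstring(page_bytes)
    return sum(1 for img in doc.iter("img") if img.get("alt") is None)

# Hypothetical hook: a search backend could drop any hit where
# missing_alt_text(body) > 0, or rank results by how many
# checkpoints they pass instead of filtering outright.
```

A real tool checks dozens of checkpoints, and some (like whether alt text is actually meaningful) need human judgment, but the machine-checkable subset is enough to power a useful search filter.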
I’ll have to tell my friend to submit this idea to the kind folks over at the WAI.