a cog in the crawler

now that google is helping me to surf faster (works as advertised, by the way), i have effectively become a cog in a huge distributed crawling machine. obviously, this is only the first step (alexa-style traffic analysis is naturally already happening). if you control the proxy that people use, annotation and tagging at internet scale suddenly become feasible. a ‘tag this’ button in the google toolbar, anyone? this will lead to a repeat of the third voice lawsuits, but these features are too useful to be derailed by such problems for long. years ago at kpmg, i experimented with the office server extensions annotation system, and i am eager to see it return in a cross-platform way. [update] people have been pointing out the possibilities for adsense (targeted ads based on your surfing history), personalized search, and co-browsing.
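to make the ‘cog in the crawler’ point concrete, here is a minimal sketch of why the proxy operator ends up with a crawl-quality log of everything its users visit. this is a toy python forward proxy made up purely for illustration, nothing to do with google’s actual accelerator; the class name, port and log format are all hypothetical.

    # toy http-only forward proxy: logs every url it relays, then fetches it
    # on the user's behalf. the logged lines are exactly the raw material a
    # traffic-analysis / annotation pipeline would need.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    class LoggingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # a browser configured to use a proxy sends the absolute url,
            # so the operator sees the full address of every page visited
            print(f"{time.time():.0f} {self.client_address[0]} GET {self.path}")
            try:
                with urlopen(self.path) as upstream:   # fetch for the user
                    body = upstream.read()
                self.send_response(200)
                self.end_headers()
                self.wfile.write(body)
            except Exception:
                self.send_error(502)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), LoggingProxy).serve_forever()

pointing a browser’s http proxy setting at 127.0.0.1:8080 would route plain http traffic through it; the log alone is already enough for alexa-style traffic analysis, and tagging or annotation would just mean attaching user-supplied metadata to the same urls.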

One thought on “a cog in the crawler”

  1. I have mixed feelings about prefetch (unless it is specified by the page). And as for pushing all my pages through a Google proxy server, no thanks. I’m not a privacy nut, but I’m still reluctant to volunteer to put a tap on my line. Also, think about the uproar if Microsoft did this. Google has built a lot of goodwill over the past few years, and they seem to be cashing in on it.
