Google just announced a beta service: Web Accelerator. The magic behind it is that the Google Grid is now a proxy itself.
It is even a smart proxy that can send diffs rather than full pages, and can gzip the content.
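Just to illustrate the gzip half of that (this is a toy sketch with made-up page content, not anything to do with Google's actual proxy), compressing a typical repetitive HTML response shrinks it dramatically:

```python
import gzip

# A repetitive HTML body, standing in for a typical page response.
page = b"<div class='story'><a href='/article'>headline</a></div>\n" * 200

compressed = gzip.compress(page)

# gzip exploits the repetition, so the bytes on the wire are far fewer.
print(f"raw: {len(page)} bytes, gzipped: {len(compressed)} bytes")
```

HTML compresses especially well because markup repeats itself constantly, which is why a compressing proxy helps even on pages it has never seen before.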
The most interesting part of this whole thing is how Google can use the data. I am not talking about the privacy folks jumping up and down warning you not to use it; rather, how it can be used for GOOD.
One problem with Google crawling around the web is that the GoogleBot is a bot. It isn’t human.
Now, if enough people use this service, Google will be able to work out which links actually matter. If a popular page such as Slashdot put up hidden links, Google wouldn't be fooled, as NO humans would be following those links.
This could be huge, and could add to Trust Rank by grokking the human element. Interesting stuff. I am a little sceptical about how much the tool will actually speed things up, but I will give it a shot :)
I wonder if Google and Firefox could be tweaked to work even better together… and I also wonder how it works out how much time the accelerator has saved you :)