bad google!

Googlebot decided to index the entire linuxtroubleshooting.com site. A lot. 800 megs' worth of spidering.

Bad googlebot!

Robots.txt to block the whole site for now. Apparently the preferred solution is to do some URL rewriting and only block specific pages and scripts and whatnot. But I'm not much for correctly writing htaccess files in that awesome RandomApacheConfSyntaxWePulledOutOfSomeOrifice.
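For the record, the blanket block is just a robots.txt at the site root. Something like this (the `Disallow: /` rule is the whole trick):

```
# robots.txt — tell all well-behaved crawlers to stay out entirely
User-agent: *
Disallow: /
```

And the fancier approach people recommend would be a few targeted rules in an htaccess file instead. A rough sketch, assuming hypothetical `/cgi-bin/` and `/stats/` paths just for illustration:

```
# .htaccess — block crawlers from specific paths only (needs mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^(cgi-bin|stats)/ - [F,L]
```

The robots.txt version at least has the virtue of being syntax a human can write on the first try.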

I’ll take a look at it tomorrow.