I changed robots.txt to allow spiders/crawlers/robots to crawl my blog.
I had disabled crawling when I first set up this blog, since this is a paid service with a bandwidth quota and I didn’t want to overspend or get my service suspended mid-month. After two or three months, GoDaddy shows that I’m using less than 200 MB of bandwidth per month against a 1,500 GB quota, so I figure I can let search engines index my posts.
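For reference, a minimal robots.txt that opens the site to all crawlers looks like this (I’m assuming the old version used a blanket `Disallow: /`, which is the usual way to block everything):

```
# Allow every crawler to index the whole site
User-agent: *
Disallow:
```

An empty `Disallow:` value means nothing is blocked; to forbid all crawling, the value would instead be `/`.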
I don’t know how much useful information I have here. On another blog I no longer update (actually it has been shut down already) I had some useful notes on setting up a Linux box, since I use blogging as my notepad. As I no longer run my own Linux servers (not 100% true, but none of my Linux boxes are publicly accessible), I won’t be doing much sysadmin stuff here.
Let’s see if I can make my blog an Ada + Windows resource center; it’s nice to help others.