I'm planning to build my own search engine. My idea is to write a crawler/bot that indexes the web, but I don't know where to host it so it can run 24/7 and add newly discovered pages to the database automatically.
I'd prefer to write it in C#.
I'd also like guidance on which type of database would work best for the crawler's URL queue.
Finally, how deep should I follow links from any given page, i.e. what crawl depth makes sense?
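To make the depth question concrete, here is a rough sketch of the kind of depth-limited crawl loop I have in mind. Everything here is a placeholder: the seed URL, the `MaxDepth` value, and the regex-based link extraction (a real crawler would use a proper HTML parser and respect robots.txt):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

class Crawler
{
    const int MaxDepth = 3; // hypothetical cutoff: stop following links past this depth
    static readonly HttpClient Http = new HttpClient();

    static async Task Main()
    {
        // BFS frontier: each entry carries the URL plus how many hops from the seed it is
        var queue = new Queue<(string Url, int Depth)>();
        var seen = new HashSet<string>(); // avoid re-crawling the same URL

        queue.Enqueue(("https://example.com", 0)); // placeholder seed

        while (queue.Count > 0)
        {
            var (url, depth) = queue.Dequeue();
            if (depth > MaxDepth || !seen.Add(url))
                continue;

            string html;
            try { html = await Http.GetStringAsync(url); }
            catch { continue; } // skip unreachable or non-HTML pages

            // naive absolute-link extraction; stand-in for a real HTML parser
            foreach (Match m in Regex.Matches(html, "href=\"(https?://[^\"]+)\""))
                queue.Enqueue((m.Groups[1].Value, depth + 1));
        }
    }
}
```

The in-memory `Queue<T>` here is exactly the part I'd want to replace with a persistent database so the frontier survives restarts.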