Microsoft's Struggling Search Engine Bing
“For Microsoft, any limits imposed on Google might help it improve the fortunes of its struggling search engine, Bing.”
While Nick and Eric’s article is more about Google than Bing, the statement does make us think about Bing as a search engine…
Poor Indexing, Poor Search Results
For the average web searcher, Bing, along with Microsoft's sister search engines Yahoo and MSN Search, provides a poor experience.
Using Microsoft's search engines is not a reliable way of finding the content one searches for on the web. MS can hardly expect its engine to become the first or even second choice of internet users until the results returned reflect the content people actually want to find.
Right now Bing does not provide a good user experience, and until it does the search engine is doomed to fail, no matter how hard MS tries to attack Google with lawyers.
Bing a Webmaster’s Nightmare
For those of us tasked with managing websites, Bing is a bit of a nightmare. The crawl rate alone is enough to create problems for small sites on low-end shared hosting. While Bing's crawl rate can be limited with a simple Crawl-delay rule in robots.txt, that is one of the few robots rules these search engines seem to obey!
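For reference, slowing Bing down looks something like this (Crawl-delay is a non-standard robots.txt extension that Bing documents and honours; the 10-second value here is just an illustration, not a recommendation):

```
# Ask Bing's crawler to wait 10 seconds between requests
User-agent: bingbot
Crawl-delay: 10
```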
Talking of Bing as a rule-breaking spider, what do we mean? These bots seem to decide for themselves which robots disallow rules they will obey or ignore. For instance, file-extension disallow rules for images (.jpg, .png, .bmp and so on) are ignored, and Bing happily goes about crawling images regardless of the rules.
Another set of rules they seem to choose to obey or not are path disallow rules. For example, say you don't want search engines to index any content with "/abc/" in the path ( domain/abc/post-url ). Bing and the rest of this crummy bunch follow and index these URLs anyway (or maybe I should rather say store them in their database regardless…)
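To be concrete, the kinds of rules being ignored look like this (the wildcard `*` and end-of-URL `$` patterns are extensions supported by the major engines, not part of the original robots.txt standard; "/abc/" is the placeholder path from the example above):

```
User-agent: *
# Block image files by extension
Disallow: /*.jpg$
Disallow: /*.png$
Disallow: /*.bmp$
# Block everything under the /abc/ path
Disallow: /abc/
```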
The same applies to HTML robots meta tags: nofollow and noindex are treated as so much garbage…
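For anyone unfamiliar with them, these are the page-level directives in question, placed in the document's head, that a well-behaved crawler should respect:

```
<!-- Ask robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```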
A real-life example is a client's site that at one time used a third-party embed to provide a property listing service. Because the (badly conceived) third-party system removed listings once they were no longer available, with no proper 404 handling in place, indexing those pages created thousands of broken links. More than six months after the system was removed from the site, Bing/MSN Search continues to request these links, producing several hundred entries in the 404 logs daily. Google has not looked for a single one!
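One way to tell crawlers the content is permanently gone is to return 410 Gone for the dead paths. A minimal Apache sketch, assuming the site runs Apache and that the removed listings shared a common path prefix (the "/listings/" prefix here is purely hypothetical, the client's actual URLs are not shown):

```
# Hypothetical example: answer 410 Gone for any URL under the
# removed listing path, so crawlers know the pages are gone for good
RedirectMatch 410 ^/listings/
```

Whether Bing would actually drop the URLs any faster on seeing 410 rather than 404 is, on the evidence above, an open question.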
Then we have the poor indexing. Bing may hit your site from 20 different IPs at the same time and try to crawl hundreds of pages a second, but you'll be very lucky if a tenth of the content on your site ever becomes available to searchers…
Users Want to Find What They Want
At the end of the day, internet users will only use a search engine that gives them links to whatever they searched for. If a search engine isn't indexing content, or is only partially indexing a website, how can it possibly return the right content to searchers?
For Microsoft to gain any credibility for Bing, it needs to look at the way its search engines work. While Google remains the only reliable search tool, searchers will not be keen to rely on any other engine to search the web.