
MSN Toolbar

June 15, 2005


So I decided to give MSN Desktop Search/Toolbar a whirl yesterday.

After my rather disappointing experience with Google Desktop Search (which I uninstalled within hours of installation), I had fairly low expectations. But so far I am completely amazed. Google had better get some of those skunkworks projects into prime time, because the signs are clear — Microsoft is going at this search thing with such gusto that it will not take very long for it to catch up and even pass Googliath.

The attention to detail in the MSN Desktop Search is unlike any I have seen in Microsoft apps before. From a usability standpoint the U.I. is nearly invisible — the hallmark of an intuitive and well-designed U.I. I especially like the preview feature on search results where you can see the search word in application context (example: within an Excel spreadsheet).

I’ll try the desktop search for a few more days and see how it goes. For web searching, I am still sticking with Google, but I’ll keep an eye on MSN. But one thing I am keeping for sure is the IE tabbed browsing.

Firefox has always provided a superior browsing experience in many ways, and tabbed browsing was one huge advantage it had over IE. Gone! One thing I never quite figured out in Firefox is how to save the state of the browser when multiple tabs are open so I can restore it the same way. (Of course, if the feature exists and I can’t find it, it’s not much of a feature anyway.) The MSN toolbar does have this feature: it lets me save one set of tabs as “My Tabs” which I can restore at the click of a button. Neat. I also like having the option to open each link I click in a new tab. This is great for search engine results. I had fully expected this to be an all-or-nothing feature, but was pleasantly surprised to find that it is tab-specific. So on tabs where I have search results displayed, I can have this feature on, and turn it off elsewhere.

Now for a little rant.

With all the hype, money, and technology being invested in search by Google, Yahoo, and Microsoft, why is it so difficult to find things in context, and why are search interfaces still so lame? I don’t claim to have a magic solution, but I am certain the keyword approach with control suffix/prefix characters is a crappy interface for search. And don’t get me started on Boolean logic. As a developer, I see its value, but for the ordinary individual it is completely unnatural. Much has been said about Google’s precision. I conduct maybe 50–70 searches in any given day. It is quite rare that the relevant item I am searching for is on the first page of results (and I have the page size set at 30). So either I am an idiot who can’t figure out what keywords to use, or the search technology is still a pile. I suspect it’s the latter, but of course, I can’t be objective.

There is a simple solution. Google is currently beta testing its Sitemaps concept. (A good idea which will soon result in bad data. More on this some other time.) A better idea would be for Google to publish the specs for a site search webservice. Say what? Read on…

When you’re looking at search results, Google makes relevancy decisions based on its super cool algorithms. One thing I notice quite often is that the results contain page after page of hits from the same site. Now, how often do you click on one of those results and get exactly the content you are looking for? The answer is “it depends.” If it’s a content-centric site, then chances are good. But if the result is from a forum post or some dynamically generated page, you have to click around a bit to find the information you need. Often, you end up using the site’s own search (if it has that capability), and then the results are more specific and also have abstracts that are in context. It would be so much simpler if Google took a smarter approach.

Let’s say that site www.somesite.com accounted for more than, say, 20% of the first 50 results. If Google had published a site search webservice specification, the owners of somesite.com could implement it on their site. When Google detects that one site is responsible for a high share of relevant search results (determined by its own algorithm), it could make a quick call to the site’s webservice with the same keywords and return that result set (maybe the top 10–20 results) in a collapsible pane integrated into the general Google result set. There are search engine aggregators that do this for a finite number of engines. Why not do it for the entire Internet, live and on-demand?
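To make the idea concrete, here is a minimal sketch in Python of the aggregation step. Everything specific here is my own assumption for illustration: the 20% share threshold, the example URLs, and the `site_search` callable standing in for whatever webservice spec Google might publish.

```python
from collections import Counter
from urllib.parse import urlparse

# Assumption: one site "dominating" means owning more than 20% of the results,
# matching the threshold suggested above.
SHARE_THRESHOLD = 0.2

def dominant_domain(result_urls, threshold=SHARE_THRESHOLD):
    """Return the domain that owns more than `threshold` of the results, else None."""
    domains = [urlparse(u).netloc for u in result_urls]
    domain, count = Counter(domains).most_common(1)[0]
    return domain if count / len(result_urls) > threshold else None

def merge_results(result_urls, keyword, site_search):
    """If one site dominates the general results, call that site's search
    webservice (here an injected callable standing in for the hypothetical
    spec) and group its top hits into a collapsible pane ahead of the rest."""
    domain = dominant_domain(result_urls)
    if domain is None:
        return {"general": result_urls}
    site_hits = site_search(domain, keyword)[:10]  # cap at top 10 hits
    general = [u for u in result_urls if urlparse(u).netloc != domain]
    return {"site_pane": {"domain": domain, "hits": site_hits},
            "general": general}
```

In use, the engine would pass its ranked URLs plus the user’s keywords; the returned `site_pane` is what would render as the collapsible group in the result page.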

Now, this approach does not fix the keyword interface itself, which I said above is flawed. But it will help make results more precise, because no matter how well tuned the Google algorithm is, a site’s own search will usually provide better results for that site’s content than Google’s inferred results.


Founder NftyDreams; founder Decentology; co-founder DNN Software; educator; Open Source proponent; Microsoft MVP; tech geek; creative thinker; husband; dad. Personal blog: http://www.kalyani.com. Twitter: @techbubble