2 February 2009

BrowseRank vs PageRank

Microsoft has made a bold move and decided to go head to head with Google: with the inception of Microsoft's "BrowseRank", Google's PageRank algorithm now has a competitor. PageRank assesses a specific page's importance by how many other Web pages link to it and by the importance of those linking pages. So the question is: has Microsoft come up with a better, more precise algorithm?
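To make the PageRank idea concrete, here is a minimal sketch of the classic power-iteration calculation on a made-up link graph. The damping factor of 0.85, the toy pages and the iteration count are illustrative assumptions, not Google's actual parameters or data.

# Minimal PageRank power-iteration sketch on a toy link graph.
# The damping factor, pages and iteration count are illustrative assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        # Every page gets a small baseline share of importance.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # A page passes its importance on to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "adobe.com":    [],
    "blog.example": ["adobe.com", "news.example"],
    "news.example": ["adobe.com"],
}
print(pagerank(toy_web))

Even in this tiny example you can see the point of the criticism below: adobe.com scores well purely because other pages link to it, regardless of how users actually behave.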

"The user browsing graph from Microsoft has been said
to more precisely represent the web surfer's random walk process, and thus is
more useful for calculating page importance.

The more visits of the page made by users and the longer
time periods spent by the users on the page, the more likely the page is
important … We can leverage hundreds of millions of users' implicit voting on
page importance.

"Some websites like adobe.com are ranked very high by
PageRank … because Adobe.com has millions of inlinks for Acrobat Reader and
Flash Player downloads.

However, web users do not really visit such websites very
frequently and they should not be regarded as more important than the websites
on which users spend much more time (like myspace.com and facebook.com)"

In a nutshell, Microsoft's idea is that the more visits a page receives from users, and the longer those users spend on the page, the more likely the page is to be important.
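As a rough illustration of that intuition (and only the intuition: the published BrowseRank work models browsing as a continuous-time Markov process built from real user session logs), the sketch below scores pages by the total time users spend on them, using entirely made-up visit data.

# Illustrative sketch of the BrowseRank intuition only: pages that users
# visit more often and stay on longer score higher. The visit log is made up;
# the real algorithm computes the stationary distribution of a continuous-time
# Markov process over a user browsing graph.

visit_log = [
    # (page, dwell time in seconds) -- hypothetical session records
    ("adobe.com", 5), ("adobe.com", 8),
    ("facebook.com", 600), ("facebook.com", 420), ("facebook.com", 300),
    ("myspace.com", 540), ("myspace.com", 480),
]

def browse_score(log):
    time_on_page = {}
    for page, dwell in log:
        # Total dwell time captures both visit frequency and time per visit.
        time_on_page[page] = time_on_page.get(page, 0.0) + dwell
    total = sum(time_on_page.values())
    return {page: t / total for page, t in time_on_page.items()}

print(browse_score(visit_log))

On this toy data adobe.com's millions of inlinks count for nothing: users barely spend any time there, so it ends up with a tiny score, which is exactly the argument Microsoft is making.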

Team Tamar

  • Pravith Dhanraj

    I commend Microsoft’s attempts at breaking Google’s search engine dominance by bringing forth systems such as BrowseRank. However, it will not turn the tide on traffic to websites, as Google still has over 60% of the search engine market.
    Since the Florida update, Google has been constantly amending their algorithm, and I believe that they have already taken other factors such as average user time on site into consideration in recent PageRank sweeps. This has been evident in the higher PageRank noted by many webmasters with predominantly video content on their websites. I speak from experience.
    The bottom line is that there are no guarantees. User time on site, keyword-rich content or incoming links – search engines are supposedly updating their technology to index better sites, but no single SEO technique will suffice. SEOs need a bag of tricks (ethical SEO practices) to maintain positions in SERPs and not get wiped out the next time a search engine changes its algorithm.
    Thanks for a great post, Singatha.

  • http://profile.typekey.com/kristian_flint/ Kristian Flint

    So, not content with stealing all the nice GUI features from OS X and Ubuntu (and other *nix OSes, yes), Microsoft are now trying to make their search engine more popular by emulating their rivals…
    Where has M$’s innovation gone?!