The changing landscape of (blog) search

Steve Rubel writes about the changing landscape of blog search. Google killed it, he claims, and it seems plausible.

For one, there is good reason why the appeal of search engines like Technorati has faded:

The improvements are nice, but I have to admit that I don’t use Technorati nearly as much as I used to. Link authority was a good metric a year ago, but it’s not nearly as worthwhile today when you consider all of the centers of influence one may wish to search and track. Link authority doesn’t tell me who’s an influencer on Facebook or which video artists are rising on YouTube. It was great in 2005, ok in 2006 and really has faded from relevance in 2007. […] While we still use vertical search engines today to dig through news, blogs, video, etc., their days are numbered. The lines are blurrier. Google News, for example, has lots of blogs. More importantly, the big web search engines are becoming sophisticated enough to make an educated guess as to what information you’re seeking. It won’t care if it comes from the live or static web. It will serve up relevance and soon time-stamped sorting.
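For context: Technorati’s “link authority” was, roughly, the number of distinct blogs that linked to you within a recent window. Here is a minimal sketch of such a score – the six-month window and the data layout are my own assumptions, since Technorati never published its exact algorithm:

```python
from datetime import date, timedelta

# Hypothetical inbound-link records: (linking_site, target_blog, date of link).
# The six-month window and the "count unique linking sites" rule are
# assumptions modelled loosely on how Technorati described its score.
links = [
    ("alice.example.com", "target.example.org", date(2007, 5, 1)),
    ("bob.example.com",   "target.example.org", date(2007, 6, 12)),
    ("alice.example.com", "target.example.org", date(2007, 6, 20)),  # duplicate source
    ("carol.example.com", "target.example.org", date(2006, 1, 3)),   # too old, outside window
]

def link_authority(target, links, today=date(2007, 7, 1), window_days=183):
    """Count unique sites that linked to `target` within the window."""
    cutoff = today - timedelta(days=window_days)
    sources = {src for src, tgt, when in links if tgt == target and when >= cutoff}
    return len(sources)

print(link_authority("target.example.org", links))  # -> 2
```

Which makes Rubel’s point rather tangible: a score like this only ever sees links between blogs, not influence on Facebook or YouTube.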

Is there anything that will put an end to Google’s dominance? Probably not. But it was never within their own fields that big monolithic companies were beaten. IBM still offers some of the best servers. Microsoft is still a quasi-monopoly in PC operating systems.

Whoever “beats” Google will have to find a totally new field of activity.

By the way, I love to take a sneak preview of what Google is toying with.

More and better eyeballs than in regular media

Marketers’ websites attract more eyeballs than other media, according to AdAge:

Believe it or not, those boring corporate websites are pulling in more eyeballs — and more influencers — than the flashy prime time TV shows, print magazines and general interest sites on which marketers advertise.

In concrete figures:

Yet the websites of P&G and Unilever now reach nearly 6 million and 3 million unique visitors, respectively, in the U.S. each month, according to ComScore Media Metrix.

But it’s not only about more eyeballs: these eyeballs are also quite interesting for marketers, as they are usually influencers, or at least people who actively engage with the brand in considerable numbers.

Their engagement with corporate and brand sites is well above the norm for the general population. “Visitors to [corporate and brand] websites have a much higher propensity to recommend products,” said Pete Blackshaw, chief marketing officer of Nielsen BuzzMetrics, whose research shows more than 40% of people who give a brand e-mail feedback are likely to recommend it to others.

Much of it is driven by “regular” online advertising:

Much of the traffic to the big package-goods marketers’ sites appears to be coming the way originally envisioned in the online advertising model: as a response to online display advertising. Search-heavy Google accounts for a relatively small amount of traffic to the P&G and Unilever sites compared with display-ad-heavy Yahoo.

That doesn’t surprise me at all, since most of P&G’s and Unilever’s products are FMCG (fast-moving consumer goods), for which I assume users usually don’t search much. Or if people do search for these kinds of things, then the FMCG companies haven’t found the right way to advertise effectively against those keywords.

Downsides of Participation Inequality

Jakob Nielsen has some interesting views about the downsides of the 1% rule that I blogged about. In his article “Participation Inequality: Lurkers vs. Contributors in Internet Communities”, he lists these “Downsides of Participation Inequality”:

The problem is that the overall system is not representative of Web users. On any given user-participation site, you almost always hear from the same 1% of users, who almost certainly differ from the 90% you never hear from. This can cause trouble for several reasons:

  • Customer feedback. If your company looks to Web postings for customer feedback on its products and services, you’re getting an unrepresentative sample.
  • Reviews. Similarly, if you’re a consumer trying to find out which restaurant to patronize or what books to buy, online reviews represent only a tiny minority of the people who have experiences with those products and services.
  • Politics. If a party nominates a candidate supported by the “netroots,” it will almost certainly lose because such candidates’ positions will be too extreme to appeal to mainstream voters. Postings on political blogs come from less than 0.1% of voters, most of whom are hardcore leftists (for Democrats) or rightists (for Republicans).
  • Search. Search engine results pages (SERP) are mainly sorted based on how many other sites link to each destination. When 0.1% of users do most of the linking, we risk having search relevance get ever more out of whack with what’s useful for the remaining 99.9% of users. Search engines need to rely more on behavioral data gathered across samples that better represent users, which is why they are building Internet access services.
  • Signal-to-noise ratio. Discussion groups drown in flames and low-quality postings, making it hard to identify the gems. Many users stop reading comments because they don’t have time to wade through the swamp of postings from people with little to say.
In addition, he also lists some points on “How to Overcome Participation Inequality”. But the main point still is: you can’t overcome participation inequality. You can only optimise the way content is produced and sorted, trying to make it more suitable and/or relevant for the average user.
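To see why the bias matters, here is a minimal simulation – the population sizes and opinion distributions are invented for illustration and are not from Nielsen’s article. If the 1% who post differ systematically from the silent 99%, the average you read online diverges from the average your customers actually hold:

```python
import random

random.seed(42)

# A toy population: each user holds an opinion score in [0, 10].
# Assumption: the 1% who post skew more negative/extreme than the
# silent 99%; the exact distributions here are purely illustrative.
population = [random.gauss(7.0, 1.5) for _ in range(100_000)]  # mostly satisfied lurkers
posters    = [random.gauss(4.0, 2.5) for _ in range(1_000)]    # the vocal 1%

true_avg    = sum(population + posters) / (len(population) + len(posters))
visible_avg = sum(posters) / len(posters)

print(f"true average opinion:   {true_avg:.2f}")
print(f"average visible online: {visible_avg:.2f}")
# The gap between the two numbers is the unrepresentative sample
# Nielsen warns about in the customer-feedback bullet above.
```

Running it shows the visible average sitting several points below the true one – by construction, of course, but that is exactly the mechanism: reading only the postings tells you about the posters, not about your customers.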

Swickis tap communities for search…

ZDNet writes about Swicki tapping communities for search and really describes a new way of making search more relevant. This is the problem most search engines are currently trying to solve, and the Swicki model seems to be similar to Rollyo’s:

Rollyo offers the ability to search the content of a list of specified websites, allowing you to narrow down the results to pages from websites that you already know and trust.

… but then again, not quite, since they don’t only allow for predefined filters but also measure user behaviour to identify which results will be relevant in the future. This is also what might differentiate them from Google: it’s not just about what people “voted for” by placing a link, but also about what they actually visited.

Swickis combine Web crawling with filters defined by site owners and algorithms that analyze user behavior (keywords and pages accessed) anonymously and automatically, re-ranking results based on the community’s search actions.
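Here is how I imagine that could work under the hood – a hypothetical sketch combining the two ideas, Rollyo-style site filters plus Swicki-style behavioural re-ranking. The scoring formula, the click weight and all names are my own assumptions; neither service has published its actual algorithm:

```python
from collections import Counter

# Site filter, Rollyo-style: only results from these hosts are kept.
ALLOWED_SITES = {"blog-a.example.com", "blog-b.example.com", "news.example.org"}

# Crawler output: (url, base_relevance), as a plain keyword engine might score it.
crawled = [
    ("http://blog-a.example.com/post1", 0.80),
    ("http://spam.example.net/page",    0.95),  # high score, but not on the list
    ("http://news.example.org/story",   0.60),
    ("http://blog-b.example.com/post2", 0.55),
]

# Anonymous community click log for this query, Swicki-style.
clicks = Counter({
    "http://blog-b.example.com/post2": 40,
    "http://news.example.org/story":   5,
})

def rerank(results, clicks, allowed, click_weight=0.01):
    """Keep only allowed sites, then boost each result by community clicks."""
    kept = [(url, score) for url, score in results
            if url.split("/")[2] in allowed]          # index 2 of the split is the host
    return sorted(kept,
                  key=lambda r: r[1] + click_weight * clicks[r[0]],
                  reverse=True)

for url, score in rerank(crawled, clicks, ALLOWED_SITES):
    print(url, round(score, 2))
# blog-b's heavily clicked post overtakes blog-a's higher base score.
```

The interesting design choice is the click weight: set it too high and early popular results lock themselves in at the top; set it too low and the community signal never surfaces anything new.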

(Just as I typed this, I thought: who knows if Google isn’t already measuring our behaviour anyway? I mean, how would we be able to tell? In theory, they can measure our clicks on the pages of the search results – but in addition, they can track us on any page that carries Google AdSense advertising, which would mean a lot of pages across both the hit and niche websites of the web.)

Seems to be an interesting tool – if I have some time over the weekend, I might start my own Swicki search on this blog…