In the late '90s, webmasters became savvy to the mechanics of search engines like AltaVista, especially in leveraging their knowledge of how search engines assigned relevance and value to web pages. In a bid to gain as many visitors as possible, webmasters employed manipulative tactics to better position their sites within search engines, often for words and phrases that had nothing to do with their sites. As a result, Internet users had a difficult time finding what they were really looking for.
This created the perfect opportunity for Google to enter the market with a new information valuation system. Because of its radical break from the norm (and its great degree of complexity), it proved much harder to manipulate, and as a result it provided better, "cleaner" search results for its users.
However, even Google isn't perfect. Until just recently, Google had "thought," based on the mechanics of its algorithm, that the President's biography at whitehouse.gov was the most relevant result to return for the search "miserable failure." This was not the result of a political bias within Google's algorithm. Rather, the cause was a concerted (and successful) effort by individuals who were knowledgeable about Google's inner workings and who duped its algorithm into believing that "miserable failure" was genuinely the best term to describe the page.
The "Google bombing" of the President's biography articulates the problem with even the best search engines: no matter how complex or clever the algorithm, no matter how much its engineers have accounted for exploitative practices, there will always be people who find ways to manipulate algorithmically derived search results.
Echoing Google's own shakeup of the search scene a decade ago, user-regulated content has emerged as a powerful and accurate alternative.
Tapping users to regulate and assign value to content is an idea employed (and largely pioneered) by sites like Digg, Reddit, Flickr, and Del.icio.us. To reveal an inner Star Trek geek, these sites harness the "Collective" – real people, not algorithms – to rate and tag information, ensuring that the best and most relevant results for that community of users are returned.
While this approach is better, it faces the problem of bias. Anyone who frequents these communities will find that they, too, have inherent predispositions. The Digg community, for example, skews politically to the left and technologically toward Apple.
Like Digg or Reddit, Wize relies upon the opinions of consumers to determine which products are “the best,” but unlike those communities, we take a holistic approach to product reviews.
We understand that users, and the communities they build, coalesce around common beliefs and, therefore, unintentionally around certain biases. That’s why we don’t rely on any single viewpoint, but instead on a multitude of reviews culled from communities across the Web.
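To make the idea concrete, here is a minimal sketch of what "not relying on a single viewpoint" can mean in practice. This is purely illustrative – the function, the community names, and the 0–10 scale are all hypothetical, not Wize's actual method. The key design choice is that each community contributes one vote (its own average), so a large, opinionated community cannot drown out smaller ones.

```python
# Hypothetical sketch: combining product ratings from several review
# communities so that no single community's bias dominates the score.
# All names and numbers below are illustrative assumptions.

def aggregate_rating(community_ratings):
    """Return the mean of per-community average ratings (0-10 scale).

    community_ratings: dict mapping community name -> list of ratings.
    Each community is first collapsed to its own mean, then the means
    are averaged, giving every community equal weight regardless of size.
    """
    if not community_ratings:
        raise ValueError("no ratings supplied")
    community_means = [
        sum(ratings) / len(ratings)
        for ratings in community_ratings.values()
        if ratings  # skip communities with no reviews
    ]
    return sum(community_means) / len(community_means)

ratings = {
    "tech-forum": [9, 9, 10, 8],   # enthusiast community, skews high
    "consumer-site": [6, 7, 5],    # general shoppers, more critical
    "photo-blog": [7, 8],          # small niche community
}
print(aggregate_rating(ratings))   # 7.5
```

Averaging the communities rather than pooling all reviews into one pile means the enthusiast forum's four glowing reviews count no more than the photo blog's two – one simple way to keep any single community's predisposition from skewing the result.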