Google this week admitted that its staff will pick and choose what appears in its search results. It's a historic statement - and nobody has yet grasped its significance.
Not so very long ago, Google disclaimed responsibility for its search results by explaining that these were chosen by a computer algorithm. The disclaimer lives on at Google News, where we are assured that:
The selection and placement of stories on this page were determined automatically by a computer program.
A few years ago, Google's apparently unimpeachable objectivity got some people very excited, and technology utopians began to herald Google as the conduit for a new form of democracy. Google was only too pleased to encourage this view. It explained that its algorithm "relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value."
That Google was impartial was one of the articles of faith. For if Google were ever found to be applying subjective human judgment directly to the process, it would be akin to the voting machines being rigged.
For these soothsayers of the Hive Mind, the years ahead looked prosperous. As blog-aware marketing and media consultants, they saw a lucrative future in explaining the New Emergent World Order to the uninitiated. (That part has come true - Web 2.0 "gurus" now advise large media companies).
It wasn't surprising, then, that when five years ago I described how a small, self-selected number of people could rig Google's search results, the reaction from the people doing the rigging was violently antagonistic. Who lifted that rock? they cried.
But what was once Googlewashing by a select few now has Google's active participation.
This week Marissa Mayer explained that editorial judgments will play a key role in Google searches. It was reported by TechCrunch proprietor Michael Arrington - whom Nick Carr called the "Madam of the Web 2.0 Brothel" - but its significance wasn't noted. The irony flew safely over his head at 30,000 feet. Arrington observed:
Mayer also talked about Google’s use of user data created by actions on Wiki search to improve search results on Google in general. For now that data is not being used to change overall search results, she said. But in the future it’s likely Google will use the data to at least make obvious changes. An example is if “thousands of people” were to knock a search result off a search page, they’d be likely to make a change.
Now what, you may be thinking, is an "obvious change"? Is it one that is frivolous? (Thereby introducing a Google Frivolitimeter™ [Beta]). Or is it one that goes against the grain of the consensus? If so, then who decides what the consensus must be? Make no mistake, Google is moving into new territory: not merely making arbitrary, editorial choices - really no different to Fox News, say, or any other media organization. It's now in the business of validating and manufacturing consent: not only reporting what people say, but telling you how you should think.
Whose hand is upon the wheel, here?
None of this would matter, if it weren't for one other trend: a paralysing loss of confidence in media companies.