Defining the truth

News broke last week that Google is considering ranking web sites based on accuracy. This would in effect filter the sites that users see (all for “benign reasons”, of course):

For some time, those of us studying the problem of misinformation in US politics – and especially scientific misinformation – have wondered whether Google could come along and solve the problem in one fell swoop.

After all, if Web content were rated such that it came up in searches based on its actual accuracy – rather than based on its link-based popularity – then quite a lot of misleading stuff might get buried. And maybe, just maybe, fewer parents would stumble on dangerous anti-vaccine misinformation (to list one highly pertinent example).

It always sounded like a pipe dream, but in the past week, there’s been considerable buzz that Google might indeed be considering such a thing. The reason is that a team of Google researchers recently published a mathematics-heavy paper documenting their attempts to evaluate vast numbers of Web sites based upon their accuracy. …

This piece covers the same news (again in a positive frame):

I recently wrote that the search engine is now using curated results when people search on vaccine-preventable diseases, to prevent the rampant spread of misinformation.

It seems Google may be preparing to extend this science-based method of ensuring evidence-based results into other arenas, including climate change. Right now, Google uses various methods to rank search engine results, including the number and “authority” of sites that link to other sites. But this method is easy to game, giving pseudo- or anti-science sites higher credence in Google’s results page. Researchers at Google have published a paper proposing instead using “knowledge-based trust,” where the facts of the site are compared to what is commonly known to be true among trusted sources. As New Scientist says, “Facts the web unanimously agrees on are considered a reasonable proxy for truth.” You can claim, for example, that the Earth is flat, and get links from popular sites about it, but in this new system you won’t get much Google love. As you shouldn’t.

This has climate change deniers worried. As they should be. Since they rely on ridiculous, oft-debunked claims, their Google ranking could drop.
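Stripped down to a toy sketch, the idea is: extract checkable claims from a page, compare them against a trusted knowledge base, and score the page by agreement. (This is only an illustration of the concept; the actual Google paper jointly estimates the reliability of extractors and sources with a much more involved probabilistic model, and the knowledge base and sites below are made up.)

```python
# Toy "knowledge-based trust": score a site by the fraction of its
# extracted (subject, predicate) -> object claims that agree with a
# trusted knowledge base. NOT Google's actual algorithm.

TRUSTED_KB = {
    ("Earth", "shape"): "oblate spheroid",
    ("measles", "preventable_by"): "vaccination",
}

def kb_trust(site_facts):
    """Fraction of a site's checkable claims that match the KB.

    Claims the KB knows nothing about are ignored; returns None
    if the site makes no checkable claims at all.
    """
    checkable = [(k, v) for k, v in site_facts if k in TRUSTED_KB]
    if not checkable:
        return None
    agree = sum(1 for k, v in checkable if TRUSTED_KB[k] == v)
    return agree / len(checkable)

science_site = [(("Earth", "shape"), "oblate spheroid"),
                (("measles", "preventable_by"), "vaccination")]
flat_earth_site = [(("Earth", "shape"), "flat")]

print(kb_trust(science_site))     # 1.0
print(kb_trust(flat_earth_site))  # 0.0
```

Even this crude version shows where the hard questions live: everything depends on who curates `TRUSTED_KB`, and a page the knowledge base cannot check at all gets no score rather than a bad one.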

I have very mixed feelings about this. On the one hand, improving the quality of information returned by searches would be good. We would all benefit from higher-quality information about health, less climate-denier nonsense, and less visibility for hate blogs (as one wag observed, filtering for accuracy should pretty much make Whaleoil invisible).

But on the other hand, who decides what is accurate? (Who designs and weights the algorithms – will they find “the truth”, or just the best-funded / loudest voices?) How scientific, how open, how reliable will that process be? Google’s power to shape our lives is already immense; do I really want a profit-driven American company effectively defining the truth for me? There is something to be said for seeing the raw debate and developing the skills to assess the validity of information for ourselves.

Whatever – Google is going to do what Google is going to do. But they should brace themselves for lawsuits. A lot of people make money out of nonsense; if Google makes them invisible, they are going to sue. The Koch brothers and their denier industry won’t go quietly, for example. Interesting times.
