Written By: Anthony R0bins
Date published: 10:31 am, March 17th, 2015 - 12 comments
Categories: Deep stuff, internet
Tags: google, lies, truth
News broke last week that Google is considering ranking web sites based on accuracy. This would in effect filter the sites that users see (all for “benign reasons”, of course):
For some time, those of us studying the problem of misinformation in US politics – and especially scientific misinformation – have wondered whether Google could come along and solve the problem in one fell swoop.
After all, if Web content were rated such that it came up in searches based on its actual accuracy – rather than based on its link-based popularity – then quite a lot of misleading stuff might get buried. And maybe, just maybe, fewer parents would stumble on dangerous anti-vaccine misinformation (to list one highly pertinent example).
It always sounded like a pipe dream, but in the past week, there’s been considerable buzz that Google might indeed be considering such a thing. The reason is that a team of Google researchers recently published a mathematics-heavy paper documenting their attempts to evaluate vast numbers of Web sites based upon their accuracy. …
This piece covers the same news (again in a positive frame):
I recently wrote that the search engine is now using curated results when people search on vaccine-preventable diseases, to prevent the rampant spread of misinformation.
It seems Google may be preparing to extend this science-based method of ensuring evidence-based results into other arenas, including climate change. Right now, Google uses various methods to rank search engine results, including the number and “authority” of sites that link to other sites. But this method is easy to game, giving pseudo- or anti-science sites higher credence in Google’s results page. Researchers at Google have published a paper proposing instead using “knowledge-based trust,” where the facts of the site are compared to what is commonly known to be true among trusted sources. As New Scientist says, “Facts the web unanimously agrees on are considered a reasonable proxy for truth.” You can claim, for example, that the Earth is flat, and get links from popular sites about it, but in this new system you won’t get much Google love. As you shouldn’t.
This has climate change deniers worried. As they should be. Since they rely on ridiculous, oft-debunked claims, their Google ranking could drop.
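For the curious, the core idea in the Google paper – “knowledge-based trust” – boils down to checking the facts a site asserts against a reference knowledge base. Here is a toy Python sketch of that intuition only; the tiny hard-coded knowledge base, the names, and the simple fraction-based score are my own illustration, while the paper itself uses a joint probabilistic model that also has to estimate how reliable its own fact-extraction is:

# Toy illustration of "knowledge-based trust": score a site by how often
# its (subject, predicate) -> object claims agree with a reference
# knowledge base. Purely illustrative, not the paper's actual model.
KNOWLEDGE_BASE = {
    ("earth", "shape"): "oblate spheroid",
    ("measles", "prevented_by"): "vaccination",
}

def kb_trust(site_facts):
    """Fraction of a site's checkable claims that match the knowledge base."""
    checked = matched = 0
    for key, claim in site_facts.items():
        truth = KNOWLEDGE_BASE.get(key)
        if truth is None:
            continue  # claim not in the knowledge base: can't be checked
        checked += 1
        matched += (claim == truth)
    return matched / checked if checked else 0.5  # no evidence -> neutral

flat_earth_site = {("earth", "shape"): "flat"}
print(kb_trust(flat_earth_site))  # 0.0 -> ranked down, "as you shouldn't"

Note that a site making only claims the knowledge base can’t check lands on the neutral 0.5 – which points straight at the hard part: someone still has to decide what goes into the knowledge base. Which brings me to my reservations.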
I have very mixed feelings about this. On the one hand, improving the quality of information returned by searches would be good. We would all benefit from higher quality information about health, less climate denier nonsense, and less visibility for hate blogs (as one wag observed, filtering for accuracy should pretty much make Whaleoil invisible).
But on the other hand, who decides what is accurate? (Who designs and weights the algorithms – will they find “the truth” or just the best funded / loudest voices?) How scientific, how open, how reliable will that process be? Google’s power to shape our lives is already immense; do I really want a profit-driven American company effectively defining the truth for me? There is something to be said for seeing the raw debate and developing the skills to assess the validity of information for ourselves.
Whatever – Google is going to do what Google is going to do. But they should brace themselves for lawsuits. A lot of people make money out of nonsense; if Google makes them invisible, they are going to sue. The Koch brothers and their denier industry won’t go quietly, for example. Interesting times.
Does this mean that all the top links will be behind science journal paywalls? Or to the relevant Wikipedia entry?
There won’t be enough hours in the day to scroll down to find the National Party’s new ranking.
When Google results are just treated as a toy, or at best a survey of popular knowledge about a topic, this seems quite a good idea. However, many people (and seemingly this is Google’s intent) treat Google results as the beginning and end of research about a topic. I see this a lot in undergraduate research for assignments.
If anything, anointing Google results with a “truth” algorithm will discourage people even more from thinking about what they source via Google.
Presumably, the unscrupulous will eventually figure out how to game the “truth” algorithm too (and trigger a consequent arms race of modifications to the “truth” algorithm).
I often use Google as a research aid, at well past graduate level. If you know how to use it, it’s great. I have a few concerns about their truth algorithm, so I’ve downloaded the paper and I’ll see if I can understand it.
There’s a space for such a search function on some fronts, i.e., as the post reports, in scientifically verifiable areas, although even science has its orthodoxies (received wisdoms) and isn’t cut and dried.
It’s within the social/political sphere that I’d be more worried. Although, even as it stands, my searches and your searches are ‘siloed’ by Google so that we ‘get what we want’, as it were.
I believe that DuckDuckGo doesn’t personalise its results based on historical search data.
I’d settle for a search function that allowed a simple and obvious exclusion of certain domain names or regions, and that limited the results from any single site to some user-defined number. I don’t want a screed of (say) BBC results – just one or two will do, opening up the prominent result slots to more sources. And from that one or two, each from a range of sources, I can decide which sites, if any, to do a site search on. A sketch of the idea follows.
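That per-site cap is straightforward to express in code. Here is a minimal Python sketch, assuming results arrive as an ordered list of (url, title) pairs – the function name, the excluded-domain list, and the per-site limit are my own stand-ins for the user-defined knobs the comment asks for:

from collections import defaultdict
from urllib.parse import urlparse

def filter_results(results, excluded_domains=(), per_site_limit=2):
    """Drop excluded domains and cap how many hits any one site contributes."""
    counts = defaultdict(int)
    kept = []
    for url, title in results:
        domain = urlparse(url).netloc.lower()
        if any(domain.endswith(d) for d in excluded_domains):
            continue  # user has excluded this domain outright
        if counts[domain] >= per_site_limit:
            continue  # this site has used up its quota of result slots
        counts[domain] += 1
        kept.append((url, title))
    return kept

# e.g. nothing from example.com, at most one hit per remaining site:
# filter_results(raw_results, excluded_domains=("example.com",), per_site_limit=1)

(For the exclusion half, Google’s existing -site: operator already works today, e.g. appending -site:bbc.co.uk to a query; it’s the per-site cap that has no user-facing knob.)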
It would be quite a challenge. At first I would have thought they’d use citation weighting – the kind of thing we see with services like Google Scholar or Scopus – but that’s already factored into their web crawler. As a librarian, I encourage users to approach their resources with a critical eye. Google is a powerful tool, but it should be used wisely.
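For anyone wondering what the “link-based popularity” being replaced actually computes, here is a bare-bones PageRank power iteration – a textbook sketch over a toy graph, not Google’s production ranker:

def pagerank(links, damping=0.85, iterations=50):
    """Textbook power iteration over a {page: [pages it links to]} graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling page: its rank mass is simply dropped
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                if target in new:  # ignore links pointing outside the graph
                    new[target] += share
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(links))  # "c" collects the most rank: two pages cite it

Nothing in that computation knows or cares whether a page is true – it only measures who links to whom, which is why a well-linked flat-earth page can outrank a poorly-linked accurate one. That is the gap knowledge-based trust is meant to close.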
wisely
Indeed. Anyone who looks for information to support their existing beliefs will find it, which is why I make a point of searching for neutral or contradictory info.
Adding “pdf” to search terms helps too.
I reckon we’re there.
I fear we are witnessing the “death of expertise”: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all. By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields. Rather, what I fear has died is any acknowledgement of expertise as anything that should alter our thoughts or change the way we live.
http://thefederalist.com/2014/01/17/the-death-of-expertise/
An algorithm that defines “accuracy”?
Good science depends upon healthy scepticism and robust debate, which must, perforce, include judgement.
Google has overreached itself if it thinks it can define accuracy.
Accuracy requires testing of all relevant data.
Google has no capability of gathering all relevant data on any topic.
Was the gathering “accurate”?
This is sillycon valley going up its own fundament.
What about non-quantifiable, subjective topics such as religion, ethics, sexuality, politics, and economics?
This has been around for a while. In my opinion, Google are the US government’s pal. There is a lot to show they are hardly impartial, although they should be. I never use their search engine and never will.