Misinformation is so common after mass shootings that Google has had to tweak its algorithm to compensate, a senior search engineer at the company has revealed.
Pandu Nayak, who joined the company 14 years ago to work on its search engine, told the Guardian that mass shootings present a growing challenge to delivering accurate search results.
“In these last few years, there’s been a tragic increase in shootings,” Nayak said. “And it turns out that during these shootings, in the fog of events that are unfolding, a lot of misinformation can arise in various ways.
“And so to address that we have developed algorithms that recognise that a bad event is taking place and that we should increase our notions of ‘authority’, increase the weight of ‘authority’ in our ranking so that we surface high quality content rather than misinformation in this critical time here.”
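Nayak's description amounts to a ranking function whose weighting shifts when a crisis is detected. The sketch below is a hypothetical illustration of that idea only; the signal names, weights, and crisis flag are assumptions for the example, not Google's actual ranking code.

```python
# Hypothetical sketch: upweight an "authority" signal during a detected
# crisis event, so established sources outrank fresh but unvetted pages.
# All names and weights here are illustrative assumptions.

def rank_score(relevance: float, authority: float, crisis_mode: bool) -> float:
    """Combine a relevance signal and an authority signal into one score.

    In normal operation, authority contributes with a baseline weight;
    when a crisis is detected, its weight is boosted, as Nayak describes.
    """
    authority_weight = 0.7 if crisis_mode else 0.3
    return (1 - authority_weight) * relevance + authority_weight * authority

# During a crisis, a fresh but unvetted page (high relevance, low
# authority) is outranked by an established source (lower relevance,
# high authority).
unvetted = rank_score(relevance=0.9, authority=0.2, crisis_mode=True)
established = rank_score(relevance=0.6, authority=0.9, crisis_mode=True)
assert established > unvetted
```

The design point is simply that the same two signals are always present; only the blend between them changes when the system believes a "bad event" is unfolding.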
Authority, by Google’s definition, means pages that comply with the company’s search quality evaluator guidelines, a 166-page document (PDF) that the company distributes to its 16,000 search quality raters.
Those raters are responsible for checking tweaks to Google’s algorithm to ensure they produce the best results, scoring each search result on two scales. The first marks whether the searcher’s needs are met: for the query “Google Jobs”, for instance, a maps result showing the location of Google’s head office “fails to meet” the need, while the company’s careers page “fully meets” it. The second marks the page’s quality, defined over 80 pages of the guidelines by criteria such as “very high quality MC” (main content), a “very high level of E-A-T” (expertise, authoritativeness, trustworthiness) and a “very positive reputation”.
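The two-scale judgment described above can be modelled as a simple record. The enum labels below paraphrase the published guidelines' rating scales; the field and class names are assumptions for illustration, not part of any Google system.

```python
# Illustrative data model for a rater's judgment of one search result,
# combining the two scales from the evaluator guidelines: a needs-met
# rating and a page-quality rating. Names here are assumptions.
from dataclasses import dataclass
from enum import IntEnum

class NeedsMet(IntEnum):
    FAILS_TO_MEET = 0
    SLIGHTLY_MEETS = 1
    MODERATELY_MEETS = 2
    HIGHLY_MEETS = 3
    FULLY_MEETS = 4

class PageQuality(IntEnum):
    LOWEST = 0
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    HIGHEST = 4

@dataclass
class RaterJudgment:
    query: str
    result_url: str
    needs_met: NeedsMet
    page_quality: PageQuality

# The article's example: for the query "Google Jobs", the careers page
# fully meets the need, while a maps result for the head office fails.
careers = RaterJudgment("Google Jobs", "careers.google.com",
                        NeedsMet.FULLY_MEETS, PageQuality.HIGHEST)
maps = RaterJudgment("Google Jobs", "maps.google.com",
                     NeedsMet.FAILS_TO_MEET, PageQuality.MEDIUM)
assert careers.needs_met > maps.needs_met
```

Keeping the two scales separate matters: a well-made page from an authoritative source can still fail to meet a particular searcher's need, and vice versa.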
The search quality guidelines were first published in 2013, but the raters have long been a core part of how the company judges changes to its algorithm. Only recently, however, have they been explicitly turned towards keeping hate speech, misinformation and fake news out of search results. In 2017, Google gave raters the ability to explicitly flag search results as “upsetting-offensive”, after the Guardian and Observer began a series of stories showing how the search engine promotes extremist content.
One story in particular highlighted how a search for “did the Holocaust happen” returned, as its top result, a link to the white supremacist forum Stormfront, explaining how to promote Holocaust denial to others. That search is now included in the evaluator guidelines as an example of content to flag up, and Nayak highlighted the improvement.