- Google says it has found no evidence that Google Images was ranking the Pakistani flag in response to the toilet paper search.
- A search for the word “Bhikari” (beggar) returned images of Pakistan Prime Minister Imran Khan.
- Google says the results are due to news stories published about the claim, as those pages contain words relevant to the search.
Google has said that it is investigating a matter related to Google Images in which a quick search for “best toilet paper in the world” results in images of the Pakistan national flag, and a search for the word “Bhikari” (beggar) shows images of the country’s Prime Minister, Imran Khan. The matter blew up after people started sharing screenshots of the search results on social media following the February 14 attack in Kashmir’s Pulwama district that left at least 40 soldiers dead and several others injured.
“While we continue to investigate the matter, we have not found any evidence that Google Images was ranking the Pakistani flag in response to this particular search. Many news outlets wrote about an old screenshot from a meme website that is inconsistent with our UI and dates back to 2017, and we have not seen any independent verification that these results ever appeared as depicted. Since these news stories published, images from those articles are now ranking for this query, as the pages contain words relevant to the search,” Google said in a statement.
Google uses automated programmes called spiders or crawlers to organise information from webpages and other publicly available content into its Search index, which is essentially a huge index of keywords. Google’s ranking systems then sort through billions of webpages in the Search index to give users useful and relevant results. So if people initially searched for “Toilet Paper” and “Pakistan Flag” together, Google’s algorithms would associate those words with each other in the keyword index.
As more and more people ran the same query (fuelled by the propaganda on social media), the association between those search words grew stronger. Google’s algorithms offer search results in a range of formats, such as articles and images, to help people quickly find the information they are looking for. So, because of the published news stories, pages showing the Pakistan flag now match a query for “toilet paper”, and Google’s algorithms return those photos on Google Images. The same explanation applies to the results of the “Bhikari” search.
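The mechanism described above can be sketched in miniature. The following is an illustrative toy, not Google’s actual system: a simple inverted index that maps each word to the pages containing it, with pages ranked by how many query words they match. The page names and text are hypothetical, but the sketch shows why a news article containing both “toilet paper” and “Pakistan flag” ends up ranking for either phrase.

```python
from collections import defaultdict

def tokenize(text):
    """Split text into lowercase word tokens."""
    return text.lower().split()

def build_index(pages):
    """Build an inverted index: word -> set of page names containing it."""
    index = defaultdict(set)
    for name, text in pages.items():
        for word in tokenize(text):
            index[word].add(name)
    return index

def search(index, query):
    """Rank pages by how many of the query's words they contain."""
    scores = defaultdict(int)
    for word in tokenize(query):
        for page in index.get(word, set()):
            scores[page] += 1
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical pages: a news story about the meme, and an ordinary review page.
pages = {
    "news-article": "news story about toilet paper meme and pakistan flag",
    "paper-review": "best toilet paper reviews",
}
index = build_index(pages)
# The news article matches all four query words, so it ranks first.
print(search(index, "toilet paper pakistan flag"))
```

Because the news article contains both sets of words, it scores highest for the combined query; real ranking systems weigh many more signals, but the basic word-overlap effect the article describes is the same.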
This is not the first time that Google Images has returned controversial results. Back in 2015, Google search results for the “Top 10 Criminals of the world” featured pictures of Prime Minister Narendra Modi. At that time Google apologised and said that the image search results were drawn from multiple news articles which showed images of Prime Minister Modi alongside his statements about politicians with criminal backgrounds. The company also clarified that none of the news articles linked Modi to criminal activity.
A similar question was raised when Google CEO Sundar Pichai was testifying before Congress in December last year. A Congressman asked the executive why image results for “idiot” reveal a page of US President Donald Trump’s photos. To this, Pichai first distanced the company from any political bias, saying that Google’s algorithms do not favour any specific ideology or demographic and have always returned the most relevant results. He said that the photos on Google Images reflect “what’s happening out there”, that is, what people are searching for and which news stories are trending.
“What is important here is we use the robust methodology to reflect what is being said about any given topic at any particular time. And we try to do it objectively, using a set of rubrics. It is in our interest to make sure we reflect what's happening out there in the best objective manner possible. I can commit to you, and I can assure you we do it without regards to political ideology. Our algorithms have no notion of political sentiment in it,” Pichai explained. In the past, Pichai had clarified that it was “impossible” for any individual or group of individuals to manipulate its algorithms.