A new tool shows how Google’s results change around the world

A Google spokesperson said the differences in results were not caused by censorship and that content about the Tiananmen Square massacre is available through Google Search in any language or locale setting. Tourist images gain prominence in some cases, the spokesperson said, when the search engine detects an intent to travel, which is more likely for searches made closer to Beijing or written in Chinese. Searching for Tiananmen Square from Thailand or the US using Google’s Chinese-language setting also surfaces recent, clean images of the historic site.
“We localize results to your region and language so you can quickly access the most reliable information,” the spokesperson said. Google users can tune their results by adjusting their location and language settings.
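As a rough illustration of what “adjusting location and language” can mean in practice, the sketch below builds search URLs using the commonly seen hl (interface language) and gl (country) query parameters. The parameter names and their effect on results are assumptions for illustration only; this is not a description of Search Atlas or of Google’s internal systems.

```python
# Minimal, illustrative sketch: construct Google Search URLs with different
# locale hints. The "hl" (language) and "gl" (country) query parameters are
# commonly used, but their exact effect on ranking is an assumption here.
from urllib.parse import urlencode

BASE_URL = "https://www.google.com/search"

def locale_search_url(query: str, language: str, country: str) -> str:
    """Return a search URL that requests results for a given language/country pair."""
    return f"{BASE_URL}?{urlencode({'q': query, 'hl': language, 'gl': country})}"

# Compare how the same query might be localized under three hypothetical settings.
for lang, country in [("en", "US"), ("zh-CN", "CN"), ("th", "TH")]:
    print(locale_search_url("Tiananmen Square", lang, country))
```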
The Search Atlas collaborators also built maps and visualizations showing how search results differ around the world. One shows how searching for images of “God” yields bearded Christian imagery in Europe and the Americas, images of the Buddha in some Asian countries, and calligraphy of the word Allah in the Persian Gulf and northeastern Africa. A Google spokesperson said the results reflect how its translation service renders the English term “God” into words with more specific meanings in some languages, such as Allah in Arabic.
Other information boundaries surfaced by the researchers do not align neatly with national or linguistic borders. Results for “how to cope with climate change” tend to divide island nations from countries on continents. In European countries such as Germany, the most common words in Google’s results related to policy measures such as energy conservation and international agreements; in island nations such as Mauritius and the Philippines, results were more likely to cite the immediacy of the threat and harms such as rising sea levels.
Search Atlas was presented last month at the Designing Interactive Systems academic conference; its creators are testing a private beta of the service and considering how to widen access to it.
Search Atlas cannot reveal why different versions of Google portray the world differently. The company’s ranking systems are closely guarded, and Google says little about how it tunes results based on geography, language, or a person’s activity.
Whatever the specific reasons Google shows or withholds particular results, they carry a power that is easy to overlook, says Search Atlas cocreator Ye. “People ask search engines things they would never ask a person, and the things they happen to see in Google’s results can change their lives,” Ye says. “It could be ‘How do I get an abortion?,’ the restaurants around you, or how you vote or get vaccinated.”
Experiments conducted by WIRED showed how Google can channel people in neighboring countries toward very different information on a hot topic. When WIRED used Search Atlas to search for the ongoing war in the Tigray region of Ethiopia, Google’s Ethiopia edition pointed to Facebook pages and blogs criticizing Western diplomatic pressure to end the conflict, suggesting the US and others were trying to weaken Ethiopia. The results for neighboring Kenya and for the US version of Google more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.
Ochigame and Ye are not the first to point out that search engines are not neutral actors. Their project was inspired by the work of Safiya Noble, cofounder and codirector of UCLA’s Center for Critical Internet Inquiry. Her 2018 book Algorithms of Oppression examined how Google searches for terms such as “Black” or “Hispanic” yielded results that reflected and reinforced societal biases against marginalized groups.
Noble says the project could provide a way to explain the true nature of search engines to a wider audience. “It’s very hard to visualize the non-democratic ways search engines operate,” she says.