There’s a growing possibility that Google's artificial intelligence tools will amplify the many biases found in society, even unbeknown to its creators, writes Farhad Manjoo in the NYT. Read on:
Let’s get this out of the way first: There is no basis for the charge that President Trump leveled against Google this week — that the search engine, for political reasons, favored anti-Trump news outlets in its results. None.
Mr. Trump also claimed that Google advertised President Barack Obama’s State of the Union addresses on its home page but did not highlight his own. That, too, was false, as screenshots show that Google did link to Mr. Trump’s address this year.
But that concludes the “defense of Google” portion of this column. Because whether he knew it or not, Mr. Trump’s false charges crashed into a longstanding set of worries about Google, its biases and its power. When you get beyond the president’s claims, you come upon a set of uncomfortable facts — uncomfortable for Google and for society, because they highlight how in thrall we are to this single company, and how few checks we have against the many unseen ways it is influencing global discourse.
In particular, a raft of research suggests there is another kind of bias to worry about at Google. The naked partisan bias that Mr. Trump alleges is unlikely to occur, but there is real potential for hidden, pervasive and often unintended bias — the sort that once led Google to return links to many pornographic pages for searches for “black girls,” that offered “angry” and “loud” as autocomplete suggestions for the phrase “why are black women so,” or that returned pictures of black people for searches for “gorilla.”
I culled these examples — which Google has apologized for and fixed, but variants of which keep popping up — from “Algorithms of Oppression: How Search Engines Reinforce Racism,” a book by Safiya U. Noble, a professor at the University of Southern California’s Annenberg School for Communication and Journalism.
Dr. Noble argues that many people have the wrong idea about Google. We think of the search engine as a neutral oracle, as if the company somehow marshals computers and math to objectively sift truth from trash.
But Google is made by humans who have preferences, opinions and blind spots and who work within a corporate structure that has clear financial and political goals. What’s more, because Google’s systems are increasingly created by artificial intelligence tools that learn from real-world data, there’s a growing possibility that it will amplify the many biases found in society, even unbeknown to its creators.
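To make that concrete, here is a minimal, hypothetical sketch — not Google’s systems or data, just a toy Python example with an invented corpus — of how a model that “learns” purely from co-occurrence counts in real-world text inherits whatever imbalances that text contains.

```python
from collections import Counter
from itertools import combinations

# A tiny invented corpus standing in for "real-world data" scraped from the web.
# Any imbalance here is inherited directly by the counts below.
corpus = [
    "the engineer said he fixed the bug",
    "the engineer said he shipped the release",
    "the engineer said she reviewed the code",
    "the nurse said she checked the chart",
    "the nurse said she updated the notes",
    "the nurse said he took the shift",
]

# "Learn" associations by counting which words appear together in a sentence.
pair_counts = Counter()
for sentence in corpus:
    for a, b in combinations(sorted(set(sentence.split())), 2):
        pair_counts[(a, b)] += 1

def preferred_pronoun(role):
    """Return whichever pronoun co-occurs with `role` more often in the corpus."""
    he = pair_counts[tuple(sorted((role, "he")))]
    she = pair_counts[tuple(sorted((role, "she")))]
    return ("he", he, she) if he >= she else ("she", he, she)

print(preferred_pronoun("engineer"))  # ('he', 2, 1)  -- mirrors the skewed corpus
print(preferred_pronoun("nurse"))     # ('she', 1, 2)
```

Nothing in that code sets out to be biased; the skew comes entirely from the corpus it counts, which is exactly the worry researchers raise about systems trained on web-scale data.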
Google says it is aware of the potential for certain kinds of bias in its search results, and that it has instituted efforts to prevent them. “What you have from us is an absolute commitment that we want to continually improve results and continually address these problems in an effective, scalable way,” said Pandu Nayak, who heads Google’s search ranking team. “We have not sat around ignoring these problems.”
For years, Dr. Noble and others who have researched hidden biases — as well as the many corporate critics of Google’s power, like the frequent antagonist Yelp — have tried to start a public discussion about how the search company influences speech and commerce online.
There’s a worry now that Mr. Trump’s incorrect charges could undermine such work. “I think Trump’s complaint undid a lot of good and sophisticated thought that was starting to work its way into public consciousness about these issues,” said Siva Vaidhyanathan, a professor of media studies at the University of Virginia who has studied Google and Facebook’s influence on society.
Dr. Noble suggested a more constructive conversation was the one “about one monopolistic platform controlling the information landscape.”
So, let’s have it.
Google’s most important decisions are secret
In the United States, about eight out of 10 web searches are conducted through Google; across Europe, South America and India, Google’s share is even higher. Google also owns other major communications platforms, among them YouTube and Gmail, and it makes the Android operating system and its app store. It is the world’s dominant internet advertising company, and through that business, it also shapes the market for digital news.
Google’s power alone is not damning. The important question is how it manages that power, and what checks we have on it. That’s where critics say it falls down.
Google’s influence on public discourse happens primarily through algorithms, chief among them the system that determines which results you see in its search engine. These algorithms are secret, which Google says is necessary because search is its golden goose (it does not want Microsoft’s Bing to know what makes Google so great) and because explaining the precise ways the algorithms work would leave them open to being manipulated.
But this initial secrecy creates a troubling opacity. Because search engines take into account the time, place and some personalized factors when you search, the results you get today will not necessarily match the results I get tomorrow. This makes it difficult for outsiders to investigate bias across Google’s results.
A lot of people made fun this week of the paucity of evidence that Mr. Trump put forward to support his claim. But researchers point out that if Google somehow went rogue and decided to throw an election to a favored candidate, it would only have to alter a small fraction of search results to do so. If the public did spot evidence of such an event, it would look thin and inconclusive, too.
“We really have to have a much more sophisticated sense of how to investigate and identify these claims,” said Frank Pasquale, a professor at the University of Maryland’s law school who has studied the role that algorithms play in society.
In a law review article published in 2010, Mr. Pasquale outlined a way for regulatory agencies like the Federal Trade Commission and the Federal Communications Commission to gain access to search data to monitor and investigate claims of bias. No one has taken up that idea. Facebook, which also shapes global discourse through secret algorithms, recently sketched out a plan to give academic researchers access to its data to investigate bias, among other issues.
Google has no similar program, but Dr. Nayak said the company often shares data with outside researchers. He also argued that Google’s results are less “personalized” than people think, suggesting that search biases, when they come up, will be easy to spot.
“All our work is out there in the open — anyone can evaluate it, including our critics,” he said.
Search biases mirror real-world ones
The kind of blanket, intentional bias Mr. Trump is claiming would necessarily involve many workers at Google. And Google is leaky; on hot-button issues — debates over diversity or whether to work with the military — politically minded employees have provided important information to the media. If there were even a rumor that Google’s search team was skewing search for political ends, we would likely see some evidence of such a conspiracy in the media.
That’s why, in the view of researchers who study the issue of algorithmic bias, the more pressing concern is not about Google’s deliberate bias against one or another major political party, but about the potential for bias against those who do not already hold power in society. These people — women, minorities and others who lack economic, social and political clout — fall into the blind spots of companies run by wealthy men in California.
It’s in these blind spots that we find the most problematic biases with Google, like in the way it once suggested a spelling correction for the search “English major who taught herself calculus” — the correct spelling, Google offered, was “English major who taught himself calculus.”
Why did it do that? Google’s explanation was not at all comforting: The phrase “taught himself calculus” is a lot more popular online than “taught herself calculus,” so Google’s computers assumed that it was correct. In other words, a longstanding structural bias in society was replicated on the web, which was reflected in Google’s algorithm, which then hung out live online for who knows how long, unknown to anyone at Google, subtly undermining every female English major who wanted to teach herself calculus.
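The mechanism is easy to reproduce in miniature. Below is a speculative sketch in Python — the phrase counts are invented and the corrector is deliberately naive, not Google’s actual spelling system — showing how simply preferring the more frequent phrasing reproduces whatever skew exists in the underlying data.

```python
# Hypothetical phrase-frequency table standing in for counts harvested from the web.
# The skew between "himself" and "herself" is the invented premise of this example.
phrase_counts = {
    "taught himself calculus": 120_000,
    "taught herself calculus": 9_000,
}

def suggest_correction(query, counts, ratio=5):
    """Naive frequency-based 'did you mean': if the query contains a phrase from
    the table and some other tabled phrase is far more common, suggest swapping
    it in. (A real system would also require the phrases to be similar; that
    check is omitted here for brevity.)"""
    for phrase in sorted(counts, key=counts.get, reverse=True):
        for other, m in counts.items():
            if other in query and counts[phrase] > ratio * m:
                return query.replace(other, phrase)
    return None

query = "English major who taught herself calculus"
print(suggest_correction(query, phrase_counts))
# -> English major who taught himself calculus
```

The point of the sketch is that popularity does all the work: the “correction” is simply a vote for whatever the web already says most often.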
Eventually, this error was fixed. But how many other such errors are hidden in Google? We have no idea.
Google says it understands these worries, and often addresses them. In 2016, some people noticed that it listed a Holocaust-denial site as a top result for the search “Did the Holocaust happen?” That started a large effort at the company to address hate speech and misinformation online. The effort, Dr. Nayak said, shows that “when we see real-world biases making results worse than they should be, we try to get to the heart of the problem.”
Google has escaped recent scrutiny
Yet it is not just these unintended biases that we should be worried about. Researchers point to other issues: Google’s algorithms favor recency and activity, which is why they are so often vulnerable to being manipulated in favor of misinformation and rumor in the aftermath of major news events. (Google says it is working on addressing misinformation.)
Some of Google’s rivals charge that the company favors its own properties in its search results over those of third-party sites — for instance, how it highlights Google’s local reviews instead of Yelp’s in response to local search queries.
Regulators in Europe have already fined Google for this sort of search bias. In 2012, the F.T.C.’s antitrust investigators found credible evidence of unfair search practices at Google. The F.T.C.’s commissioners, however, voted unanimously against bringing charges. Google denies any wrongdoing.
The danger for Google is that Mr. Trump’s charges, however misinformed, create an opening to discuss these legitimate issues.
On Thursday, Senator Orrin Hatch, Republican of Utah, called for the F.T.C. to reopen its Google investigation. There is likely more to come. For the last few years, Facebook has weathered much of society’s skepticism regarding big tech. Now, it may be Google’s time in the spotlight.