OCT. 12, 2018
Updated Oct. 17, 2018 10:18 a.m.
Digital technology is reshaping media and culture. Our scholars explore how to build and use these new tools responsibly.
Safiya Umoja Noble had just begun researching the inner workings of search engines in 2009 when a colleague quipped, “You should see what happens when you Google ‘black girls.’”
Assistant Professor of Communication Safiya Noble.
“I am a black girl,” Noble recalled. “I have a daughter. I have lots of nieces. I immediately did the search.”
She got back a page full of pornography.
“That started me on a deeper line of inquiry about the way in which misrepresentation happens for women of color on the internet, and the broader social consequences of that,” said Noble, assistant professor of communication. “This was happening on the internet, a forum where people think the information they come across is objective, neutral and credible.”
Noble had previously spent 15 years in marketing and advertising, working for some of the largest Fortune 100 brands in the United States. As she was leaving corporate America and beginning graduate school at the University of Illinois at Urbana-Champaign in the late 2000s, she started scrutinizing the rise of digital technologies — Google in particular. She noticed that many of her peers were touting the “liberatory” effect that Google was having on the information space.
“I got a lot of pushback from people in academe and people in industry who said, ‘Google is the people’s company,’ and to critique them was unfair because they were doing so much more than any other company had done to make information accessible,” Noble said.
“But as this total diversion of public goods and knowledge into privately held organizations unfolded, there were questions to be asked — like, ‘Who will truly lose?’”
Noble is among several USC Annenberg faculty asking these kinds of probing questions about our evolving landscape of digital media and information systems. Their research explores the importance of bringing a strong ethical framework to bear on new modes of information-sharing — both on the part of the profit-seeking firms creating these new tools, and on the part of individuals as responsible and sophisticated users of the tools. While scholars differ on how these new-media ethics might be implemented, they agree that academia must have a role in developing workable solutions.
Noble’s research led to her latest book, Algorithms of Oppression: How Search Engines Reinforce Racism. One of her major findings: The results of Google searches are not value-neutral.
“In the book, I interrogate the fundamental building blocks of technology,” she said. “Computer language is a language, and as we know in the humanities, language is subjective and can be interpreted in myriad ways.
“It is the responsibility of companies, users and regulators to ask: What’s ethical, what’s moral, what’s right, what’s oppressive, what’s fair, what’s socially just, what fosters civil rights, and what erodes human rights? All of those conversations live in the domain of ethics.”
But most who work in the information and tech industries, she argues, don’t possess the ethical training, knowledge, expertise or, in many cases, the inclination to think about what they are stewarding and how they are influencing public opinion. If people are willing to shift their attention away from journalists and academic scholars and instead rely heavily upon these media platforms to provide trusted information, Noble wonders who is guiding them.
“One of the most important frameworks for computer scientists is the concept of universal design, yet the philosophical underpinnings of this universality often preclude women and people of color,” she said. “And these challenges, of course, are deeply tied to ethical structures.”
Noble recognizes that over the years some engineers have tried to apply a traditional sense of ethics as they design. Nevertheless, the specific ways that vulnerable people — women and people of color — are often in the crosshairs of some of the worst abuses of technology remain to be addressed, she said.
There is no golden age
Professor of Communication and Director of Doctoral Studies Tom Hollihan.
Tech companies are far from the first to face scrutiny for how the choices they make in disseminating information can sway thoughts and opinions. Ethics in communication, or the lack of them, can mean the difference between news and propaganda.
According to Thomas Hollihan, we can trace 21st-century concerns about ethics in mass communication back to the Sophists of ancient Greece, who not only insisted they could teach persuasion, but virtue as well.
An authority on political rhetoric and a former USC debate coach, Hollihan notes that, as they sought to attract and win over their pupils, Sophists increasingly relied upon a deliberate use of fallacious reasoning and exaggerated claims. This drew harsh criticism, especially from Plato.
“Plato was hostile to rhetoric,” Hollihan explained. “He thought that the orators had inflamed the passions of the people and didn’t think the public was really suited to evaluate arguments.”
Then along came Aristotle.
“He provided a much more pragmatic and systematic view of persuasion,” said Hollihan, professor of communication and director of doctoral studies. “He talked about the psychological characteristics of audiences and the topics that most influence them.”
Aristotle proclaimed rhetoric to be an instrument that can be either harmful or profoundly helpful, depending upon the virtue and value of the rhetor who uses it.
In other words, intent is key. Just as Aristotle underscored the need to assess the “virtue and value of the rhetor,” Hollihan believes that, when it comes to media platforms, a fundamental evaluation of their intrinsic goals should be the starting point.
Is the platform acting to fulfill narrow self-interests or promoting shared communal interests? Does it fundamentally operate with goodwill toward the people it is trying to influence? Is the platform honest and faithful in presenting the information people need to make a good decision, or has important information been withheld?
“The bottom line is, you need ethics, character and good purpose,” Hollihan said. “There are lots of examples in history of people who have been effective in winning over audiences. But, if you do it based on fear, anxiety of the other or willful ignorance, then I don’t think you can celebrate it as good rhetoric, even if it’s successful rhetoric.”
Turning to his expertise in political communication, Hollihan points to the emergence of propaganda studies during World War I.
When President Woodrow Wilson issued an executive order establishing the Committee on Public Information in 1917, he gave the new federal agency authority to use every medium available at the time to persuade Americans it was in their best interest to support the war.
“That was the first concentrated attempt to manipulate political opinion in the United States in a very systematic way,” Hollihan said. “We created this public effort to actually persuade people, in this case, to support the war: to buy Liberty Bonds, to enlist and to get behind the war effort with enthusiasm.”
At every step, society has seen technological advances leveraged to influence the masses. But the convenience or the entertainment value of these technologies has always been sufficient to overcome the initial anxiety innovation produces.
“We move on and adapt,” Hollihan said. “People adapt, society adapts. Technology adapts. And social science adapts. New developments will continue to occur, and eventually the entire situation we’re in now will turn out to have been a blip in history. I don’t think we want to be too nostalgic about a perfect golden era that never really existed.”
It’s all about your audience
Henry Jenkins agrees that as technologies emerge, we always find a way to push beyond them; that constant back-and-forth struggle is the nature of the world.
In considering the nuances of tech’s modern-day powers to persuade, however, he differs from some of his colleagues. Jenkins objects to ascribing too much power to one side — the creators of these digital tools — and treating the other — the users — as if they had no power to change the situation.
Jenkins, Provost Professor of Communication, Journalism and Cinematic Arts, flat-out rejects the premise that people are “hypnotized” or “captured” by their devices. Human beings, he argues, never engage in any activity that’s meaningless to them. The pursuit of meaning is a core human urge, Jenkins insists, and it’s the social scientist’s job to understand how and why an activity is meaningful.
His research documents how young people, far from being hapless victims hijacked by their devices, are bending digital media — and every other kind of media — to their will. Jenkins’ 2016 book, By Any Media Necessary, is based on interviews with 200 youth activists who are breaking new ground in political discourse through a media-saturated vocabulary rooted in pop culture.
Jenkins’ current project, funded by a MacArthur Foundation grant, looks at youth activism through the lens of civic imagination. His 16-person research team at USC Annenberg monitors more than 30 youth-run social movements around the world, documenting how they use popular culture and new media to further their political goals. A collaborative book, Pop Culture and Civic Imagination, will be released next year.
The media skills of youth activists were on full display last spring after the mass shooting at a high school in Parkland, Florida, as outraged students took America’s gun debate into their own hands. “They went seamlessly from social media to a CNN town hall meeting to a face-to-face meeting with the president to a march on Washington and speeches on the Mall,” Jenkins said. “They even communicated through the patches on Emma Gonzalez’s jacket.”
From his perspective, the present moment is not about corporations manipulating young people. Rather, young people are taking advantage of available resources, including digital technology, and using them effectively to bring about new kinds of networked change.
Provost Professor of Communication, Journalism, Cinematic Arts and Education Henry Jenkins.
“That meaning may be translated into cash and exploited by corporations, but the starting point is something that kids really deeply desire to do,” Jenkins said. “My own ethical commitment is to start to figure out what it means.”
Social media, for example, fills a hole in a social fabric frayed by hypermobility. Today, Americans relocate on average 12 times across their lifespan, a number that rose generation by generation throughout the 20th century. To Jenkins, people using Instagram, Snapchat or Facebook are not so much slaves to these platforms as social beings using technology to build up communal cohesion and maintain social ties across long distances.
“I think of it as bringing your social network with you wherever you go — like the turtle brings the shell on its back,” Jenkins said.
One practice that has media critics worried is unethical promotion of videos. The 15-second countdown that queues up the next YouTube video is not long enough, psychologists say, for the human mind to make a reasoned decision about stopping or continuing.
Again, where critics see manipulation, Jenkins is on the lookout for meaning. And here he speaks from personal experience, as an avowed Netflix binge-watcher.
“Yes, I sit and watch one show after another,” he admits, “but I’m not randomly watching. I’m exploring a list of 30 shows I want to watch, because there’s that much good TV being produced. I’ve chosen them from a broader range of media content than I have ever had available. And those shows are tied to all kinds of conversations I’m having as a fan within the web-based fan communities I participate in. Some other fans are rewriting the shows, remaking them as fan fiction art, and re-creating them as cosplay.”
The important point, Jenkins said, is that, persuasive technology notwithstanding, digital content consumers are constantly making self-interested choices and curating their playlists.
“Social media drives an awful lot of YouTube circulation,” he said. To Jenkins, far more interesting than the tricks platforms use to hold captive audiences is the logic by which viewers decide what video content to recommend through their social media network.
Focusing on the platforms gives a distorted view of their power to control our brains, he believes. Studying the audience side reveals a cascade of conscious and creative responses being made by consumers, and all of the choices are meaningful to the people making them.
“So, can I be distracted?” Jenkins asked. “Yes.
“Can I be fed misinformation? Yes, though online communities actively debunk false information that circulates through social media.
“Can I be confused, distracted, pulled in different directions? Yes. And companies can definitely make money off choices I’m making.
“Still, I see many, many potent examples of conscious choice-making throughout the media landscape. The kind of disempowering rhetoric that seems to be dominant at the current time is not helpful for understanding and explaining the behavior we’re actually seeing.”
More from USC Annenberg: https://annenberg.usc.edu/news/feature/search-ethics