New U-M Center Aims to Combat Spread of Fake News, Info Apocalypse

We’re all painfully familiar with the now-ubiquitous term “fake news.” It first popped up during the 2016 presidential election, when media observers began seeing a proliferation of dubious news stories online that seemed engineered for maximum disruption and chaos.

Since the election, the term “fake news” has evolved into an almost meaningless catchall used to dismiss any story the reader finds objectionable; anyone can now call anything fake if it suits their purposes. With so many competing narratives flying around, how does one find truth in a post-fact world? How will this country ever find its way if nobody can agree on a shared reality?

As we’ve seen with the recent school shooting in Florida, a segment of the U.S. population believes the horrific attack that killed 17 people at Stoneman Douglas High School was nothing more than political theater performed by government-supported crisis actors, despite mountains of evidence to the contrary. What do we do when even the most provable facts are dismissed by swaths of the citizenry, who are convinced those facts are part of a political agenda?

These are just some of the big questions that the University of Michigan’s new Center for Social Media Responsibility will wrestle with. The initiative is being led by Garlin Gilchrist II, a U-M alum and data scientist who served as the city of Detroit’s director of innovation and emerging technology for two years before resigning to run for Detroit City Clerk in 2017—a squeaker of a race he ultimately lost only after a recount. Prior to his gig in Detroit, he spent some time as a social media manager for the Obama administration.

Social media is a captivating tool that has the power to reunite high school sweethearts, rally help for people in precarious circumstances, and even spark revolution. Given the reach of social media—the latest Pew Research poll shows 67 percent of Americans get at least some of their news from social media—Gilchrist would like to see the tech industry examine how its products are affecting the national (and global) psyche, and agree on best practices for snuffing out fraud, misinformation, and wildly untrue political propaganda.

Gilchrist says social media companies are already taking small steps toward fixing the problem, but a few filters and algorithms aren’t going to stop what he characterizes as an “infocalypse,” or information apocalypse. The center defines infocalypse as “a state when fake news and altered videos on social media and elsewhere on the web effectively end social reality as we know it”—and Gilchrist warns that we’re fast approaching the point of no return.

“We do have time to stop it,” Gilchrist maintains. “We can take action to slow it or, perhaps more realistically, help people recognize it when they see it.”

U-M’s School of Information, which houses the new center, has years of social media research waiting to be “activated,” Gilchrist says. His team plans to build tools and offer usable data that help content creators, consumers, and social media platforms thwart misinformation and gain insight into what’s appearing in their news feeds.

“For example, we could tell them the percentage of fake news that is ending up in their networks,” he adds. “So many things in life are a matter of perception, so we need to measure and design a framework for trustworthiness.”

Gilchrist says the center will convene working groups to monitor progress, formulate principles, and create metrics—tools meant to push the owners of social media companies toward greater responsibility and to make social media users savvier.

The center also plans to define what social media responsibility is. Because so many people now get their news online—a share that will only grow as younger readers continue to turn away from traditional print and TV outlets—Gilchrist says his team will seek partnerships with social media companies as well as traditional media outlets. He also feels encouraged by recent developments indicating that companies like Twitter (NYSE: TWTR) are beginning to take their role as de facto media outlets more seriously.

“In the past few weeks, we’ve seen Twitter call for responses from [social media] researchers,” Gilchrist says. “We’re seeing companies make more public calls like that. I do think it’s an important time, and I’m thankful that U-M is organizing a center to respond.”

Gilchrist has a message for free speech advocates who worry that adding more social media controls will have a chilling effect on open discourse.

“It’s a matter of what you’re trying to design for,” he says. “Most people doing this work want different perspectives to be well represented and not silenced if they have an unusual view. We can optimize the design of social media systems to make sure diverse voices are heard.”

If you’ll be in Austin, TX, for SXSW next week, Gilchrist and Aviv Ovadya, a technologist he’s working with on social media responsibility, will present a session titled “Infocalypse: The Future of Misinformation and How We Stop It” at 2 p.m. CDT on March 13 as part of the conference’s Interactive track.

Author: Sarah Schmid Stevenson

Sarah is a former Xconomy editor. Prior to joining Xconomy in 2011, she did communications work for the Michigan Economic Development Corporation and the Michigan House of Representatives. She has also worked as a reporter and copy editor at the Missoula Independent and the Lansing State Journal. She holds a bachelor's degree in Journalism and Native American Studies from the University of Montana and proudly calls Detroit "the most fascinating city I've ever lived in."