Will AI destroy democracy?

Jay Yonamine

Some political scientists and democracy scholars think that it might. The thinking goes something like this: inequality will rise as jobs continue to be automated, which will cause distrust in the government and create fertile ground for authoritarianism.

Jay Yonamine is uniquely qualified to weigh in on this issue. He is a data scientist at Google and has a Ph.D. in political science. He has an interesting perspective on the relationship between automation and democracy, and the role that algorithms and platforms play in the spread of misinformation online.

In some ways, this conversation makes the counterargument to our conversation with Penn State’s Matt Jordan about the relationship between social media and democracy. The conversation with Matt is worth revisiting for two perspectives on some of the most complicated questions facing democracy today.

Additional Information

Episode with Matt Jordan: Facebook is not a democracy

Profile on Jay from Sync Magazine

The Fourth Age by Byron Reese – a look at the relationship between technology, humanity, and democratic values

Yuval Noah Harari on the relationship between technology and tyranny in The Atlantic

Discussion/Reflection Questions

  • What do you see as the relationship between AI and democracy?
  • Should Google and other platforms regulate the content that users see?
  • Do you feel that you have control over the content you see on Google and other sites?
  • Are you concerned about AI’s impacts on democracy?

Interview Highlights

[3:40] How do you define AI?

AI has to be something that’s not just a human brain relying on itself. Most of the time, when folks think about AI, what they mean is computers, which is to say a computer is doing the thinking or doing the analysis as opposed to a human brain. How I think of intelligence is the ability to make nontrivial, falsifiable, accurate predictions. I think most folks would agree that the physical action of a robot by itself is not necessarily artificial intelligence; the AI aspect of a robot would be the computer engine that interprets the world and makes predictions.

[6:25] What is the relationship between AI and democracy?

A few things have happened simultaneously that might not be as causal as we might believe. There’s definitely been an increase in populist politicians in the United States and abroad and a move toward more heavy-handed political ideologies. And then of course there’s also been a fairly rapid growth in the prevalence of AI and machine learning in our day-to-day lives. It’s not clear that those two are connected, but you can see why people draw those connections. I think primarily they revolve around news, around platforms, around the increased ease of sharing information, and around the increased ease of sharing disinformation.

[8:26] Does one influence the other?

What’s interesting to me as a political scientist, and someone who has studied the history of political institutions and political dynamics, is that for almost all of history, increased access to information, and increased ability to create and assimilate it, has almost always driven an increase in what you might call liberal democratic values: free speech, democracy, things that have generally been held up as good. And it’s almost always been some autocratic force that has fought against the spread of information, going back to the printing press.

What’s interesting now is that we’re seeing, for the first time, the possibility of that actually shifting. We’re starting to see that the ease of access to information, and the ease of creating and assimilating information, might actually now be contributing to the spread of more antidemocratic values.

[10:03] Is AI’s impact on democracy being discussed at tech companies?

The degree of regulation is definitely a hot issue. It’s an immensely complicated issue and one with no easy answers. There are folks arguing for increased regulation to decrease the spread of misinformation, create a better-informed populace, and counter some of the antidemocratic trends we’ve been seeing.

But the counter to that is that you don’t want some centralized control over what can be shared and by whom. There are definitely merits to that argument as well. It’s an immensely complicated challenge: if you got a team of experts in a room and gave them a handful of pieces of content, I suspect they would have a hard time even reaching consensus. And then when you imagine the scale that a lot of companies operate at, it’s tens of thousands, hundreds of thousands, even millions of pieces of content a day, a week, a month.

[13:24] How are companies balancing these big issues with their day-to-day work?

What a lot of companies are trying to do is hire or create teams, departments, and groups whose full-time job is just to think about these types of ethical issues, and then create scenarios where those voices have sufficient authority or discretion to actually impact product roadmaps. Companies are big, complex organisms, and it’s hard to introduce that type of thinking in a really productive way. It’s not like there’s a blueprint where you can say, “Oh, well, this is how company A did this in ’98,” or someone who has written the book on best practices for introducing ethics and normative guidelines into an AI-based product.

[18:31] How should candidates be talking about these issues in 2020?

It’s very easy to be optimistic about the societal benefit of technological adoption. Take the self-driving story: it’s feasible to imagine a world 50 years from now with one-hundredth of the car fatalities there are today. That, I think, is a pretty easy, legitimate story to tell about the benefits of innovation. The counterargument is that when someone comes up with some new device, it displaces a meaningful number of jobs, and what do you do with those people? To go back to self-driving cars, we could see a very quick reduction in the number of truck drivers needed in the coming years, and trucking is a major industry in a lot of places.

The optimist would say that new jobs will be created to do things like work on the self-driving cars and trucks, or do additional road maintenance as the quality of the roads becomes increasingly important. But it remains to be seen whether that will actually happen and those jobs will actually be created.

Facebook is not a democracy

Matt Jordan

We have access to more information now than at any other time in history, but we trust that information less than ever before. A Gallup survey recently found that 58 percent of respondents felt less informed because of today’s information abundance. As with a lot of things in life, too much of a good thing might not be so good after all.

If you’ve followed any of the recent news about Facebook — from Mark Zuckerberg’s comments about Holocaust deniers to the decision to ban InfoWars — you’ve probably heard the company make claims about giving its community a voice and other things that sound very democratic. However, as Matt Jordan explains in this episode, that is not the case at all.

At the end of the day, Facebook is a company and its goal is to make a profit. The result of that, Matt argues, is an algorithm-fueled avalanche of information that mixes news with opinion and fact with fiction to reinforce existing thoughts and feelings rather than exposing us to new ideas and perspectives.

Matt has also spent time studying the history of the term fake news and found that it goes back much farther than Donald Trump. He talks about how fake news in 2018 looks different than it did in 1918 and what responsibility journalists and news consumers have to push back against it.

Matt is an associate professor of media studies at Penn State and co-director of the Social Thought Program. For a look at how journalists are working in this media landscape, check out our interview from last season with Halle Stockton of PublicSource, a nonprofit news organization in Pittsburgh.

Note: This episode was recorded before Alex Jones and InfoWars were banned from Facebook, YouTube, iTunes, and other platforms.

Recommended Reading

Matt’s article on the history of fake news

How can democracy thrive in the digital age? From the Knight Foundation Commission on Trust, Media, and Democracy

Discussion/Reflection Questions

  • Do you think that Facebook’s mission of building community is a cover up for collecting and selling data?
  • Have the scandals surrounding Facebook and other social media over the past year changed your view of the platforms or how you use them?
  • What role should social media play in a democracy?
  • Do you think we’ll ever move away from a scenario where people use social media as their primary news source?

Interview Highlights

[3:23] How did you get drawn towards studying the concept of fake news or fake information?

Jordan: I’m always looking to see how structures of media are impacting cultural conversations. This concept of fake news has become a corrosive presence in our media that has continued through 2016 and into 2018.

[4:20] Has the impact of fake information and news continued to today?

Jordan: It has continued, like rust spreading from the body of a car into the interior.

[4:35] Can you tell us about your work chronicling the history of the term “fake news”?

Jordan: Although Trump claims credit for creating the term, which is a very “Trump” move, it actually came about at the end of the 19th century as a label for lying in the media. It was a creation of the muckrakers, who used it to call out the misleading and fake information that the media conglomerates of the time were spreading. They came up with this term to discredit both completely fabricated reporting and reporting that left out important information. The label of “fake news” thus became something that liberals and progressives used to discredit the large for-profit news sources.

[6:00] Are there examples of the use of fake news in this manner today?

Jordan: I think you can look at Fox News and use the label of “fake news” to describe what they do. For example, they oftentimes omit important information from their daily coverage. There will be a major story about the Mueller investigation, but Fox will spend their time covering a small student protest on a college campus and the pushback it got from liberals as an example of liberals not allowing free speech on campus.

[7:40] Media has changed considerably over the last one hundred years. What impact have search engines like Google had on the development of fake news?

Jordan: One example of how these creations have impacted our media consumption is a term that Mark Zuckerberg likes: information that is “relevant to us.” This is information that we would be predisposed to believe, given our personal sensibilities and opinions. This creates feedback loops where we’re just being exposed to information that we tend to believe or agree with ideologically. The new profit incentive is to keep us interacting.

[10:41] How do media outlets balance this dedication to the truth while allowing all people to access their platform?

Jordan: Facebook is shrewd enough that all of their talking points are related to democratic ideals. For example, their use of the term “voices” refers to the democratic ideal that we should allow more voices into the public discussion on a particular issue. This is where democratic theory gets difficult: to say that more “voices” will be involved in a conversation is not to say that the conversation will be fruitful or productive just because more sides are represented. Do we need more InfoWars? Do we need more Alex Joneses out there? Probably not. It is important to remember that outlets like Facebook benefit from having more Alex Jones–type characters getting people going emotionally. This makes the media stickier. It is interesting when they use democratic ideals like “voices” to cover up their own interest, which is to create more emotional news that increases their profits.

[12:33] Given that, what do you see as the path forward?

Jordan: Whenever you have an explosion of new media formats and sources of information, you’re going to have an explosion of misinformation and fake news. Each time this has happened in history, the government has had to come in and regulate who can use the platforms to spread information. Therefore, those who say that the market can clean up these news sources are simply wrong, mostly because markets have little financial incentive to limit certain sources of media as part of cleaning it up.

[15:36] In the Trump administration, do you see a place where government oversight can take place?

Jordan: The only way this would happen is if the Democrats sweep the midterms and the Trump administration ends up on the wrong side of the national mood. That would mean they would want to limit messages to try to get on the right side again. I simply don’t see this happening. The right has done a good job of making people think that regulation is simply bad across the board. It is going to be a difficult battle to get people to warm up to the idea of introducing government regulation to mass media platforms like Facebook. The system we have today of fact-checking media simply can’t keep up. For example, I heard recently, in a sort of ad hoc way, that we receive more information in a single day than someone alive in the 15th century would receive in an entire lifetime.

[19:17] What responsibility do journalists have in these situations?

Jordan: They have a huge responsibility. The purpose of freedom of the press has always been to limit the ability of those in power to get away with lying to the people. However, in order to win the public’s trust back, journalists have to show that they are working to find the truth rather than to protect the interests of their shareholders. The future for these journalists is at the local level. People really love their local news. While national coverage is an emotional wrestling match, it is the coverage of local affairs that people can relate to, and that gets them to come out and get involved in government. I think local media that can inform people of things that directly relate to them in their communities is how journalism can lead the fight against the current media climate in which fake news is flourishing.