Fake: Searching for Truth in the Age of Misinformation
07/01/20 | 56m 40s | Rating: TV-PG
Fake: Searching for Truth in the Age of Misinformation provides viewers with tools to help discern fact from fiction in news reports, drawing on common sense, expert opinions and the universal standards of journalism. Learn the media literacy skills to dissect breaking news and apply critical thinking before adopting radical stances.
This program is made possible with support from Connecticut Humanities. As misinformation and so-called fake news continue to be rapidly distributed on the internet, our reality has become increasingly shaped by false information. The so-called fake news. Fake news. You are fake news. Discussions of democracy and discussions of propaganda have always gone together. The digital space in general, when websites were sort of coming about, brought a massive change. Social media changed things even more, in the sense that there was a lot that people could come across by chance, and by others sharing with them. If we can't discriminate between serious arguments and propaganda, then we have problems. What does it mean to be a literate citizen in today's world? And from our perspective it's media literacy, right? You really need to understand all of our communication tools, all the different types of technology. We have to be able to consume and create using all of them. Imagine if you walked into a library today and instead of everything being arranged, there were 2 million pieces of paper just flying around in the air, and you grab one of them, and you have no idea who wrote it, who financed it. Is it legitimate? Is it illegitimate? That's the internet. You know, it seems hard to imagine that democracy can function that well when there's widespread disagreement on basic facts.
NARRATOR
The daily avalanche of legitimate news from cable TV, social media and endless websites often includes news reports that look real but are actually jokes, hoaxes or propaganda. Given the volume of news available, how can the average person separate fact from fiction? (indistinct chatter) There were several fatalities at the scene, both students and staff. The Sandy Hook Elementary School shooting shocked the country and, for a moment, like other tragic events in our recent history, brought the country together. Unfortunately, it didn't take long before Alex Jones, the founder of Infowars, began to peddle a conspiracy theory about the shooting. Jones has repeatedly claimed the massacre was a giant hoax, carried out by crisis actors in a broad scheme to trample on Second Amendment rights. Whether Jones's actions were motivated by greed, malice or, as he later claimed, psychosis, his story exposed the risks of misinformation in a deeply divided environment, where the breaking news cycle is thriving and tech companies control how we receive and distribute information. Two families of children who died in the 2012 Sandy Hook Elementary shooting are now suing radio host Alex Jones for defamation. The controversial host of Infowars has long suggested the media faked the information about this shooting. They staged Sandy Hook; the evidence is just overwhelming. It took me about a year with Sandy Hook to come to grips with the fact that the whole thing was fake. It's almost political theater. It's, you know, agitprop kind of performance art in the most bastardized and sadistic way. They've got the kids going in circles in and out of the building with their hands up. I've watched the footage, and it looks like a drill. Ultimately, someone like Alex Jones, given the architecture of the internet, is probably going to make more money as an outrage influencer than he would as a small local news organization.
It's not actually about children anymore; it's stripped of any notes of empathy or sympathy or cognitive understanding about how things happen. But what happens is there's an alternate framing that attempts to make sense out of, you know, an event in a way that fits their own worldview. Given how crazy this stuff seems, why is it that people could come to believe it? Shouldn't our reasoning abilities allow us to see that this content is obviously not true? There's a classic effect from cognitive psychology called the illusory truth effect: the finding that just hearing a statement repeated makes it seem more plausible. We had people read some stories, then do some distractor tasks and random surveys for five minutes, and then we had them rate the accuracy of a bigger set of stories, some of which we had shown them in the beginning and some of which we hadn't. And they rate the ones that we showed them at the beginning as more accurate than the ones we didn't. If they hadn't seen the headline before, about 18% of the headlines got rated as true, but if they had just been shown it five minutes earlier, that went up to 24%. When it comes to the kind of partisan misinformation that circulates on social media, which is what we've been focusing on, it's really not that reasoning powers are getting hijacked; it's just that people are not bothering to reason in the first place, and are just kind of going with their intuitive gut responses. And what we found is that when people stop and think a little bit more, they're actually substantially better at telling what's true versus false. One perspective on fake news when it comes to democracy is that it's just about electoral manipulation. It's about, like, duping voters and trying to steal elections. I think fake news and its effects are powerful, but also much more sophisticated and a little bit more subtle than that. We're here today to discuss online imposters and disinformation.
Researchers generally define misinformation as information that is false but promulgated with sincerity by a person who believes it is true. Disinformation, on the other hand, is shared with the deliberate intent to deceive. Members of the committee, thank you for having me here today. As you know, this problem is nuanced and complex. I've been looking at disinformation campaigns for many years. I want to highlight that while we tend to focus on fake content, the most sophisticated actors I have seen operate online actually tend to use authentic content weaponized against their targets. Today, I think there are so many different ways that an organized group can manipulate a conversation. What we use internally is called the ABC framework, and basically what it says is there are three different ways in which something can be disinformation. It can be disinformation because of deceptive actors, right? Perhaps it looks like it's just a normal activist, but in reality it's a Russian military officer; that's a deceptive actor. Or it can be disinformation because of deceptive behavior, right? The actor is exactly who they say they are, but the way the campaign is amplified, the way the message is amplified, is what makes it disinformation. That's when you use a troll farm to flood the internet with messages to make it look like there's some collective action, but really there is not; it's coordinated. And then there is C, for content. Sometimes the content itself is deceptive, right? It could be a fake image or a fake video, something in which the content itself is the vector for deception. So, in reality there are a lot of different ways to do disinformation. In the run-up to the 2020 presidential election, the United States is headed into what could be one of the most extraordinary years of claims and counterclaims, misinformation, and a renewed public perception that our democracy and truth itself are under attack.
When we ask Americans about made-up news, 50% say that it's a very big problem facing the country today, and that places it above things like the environment or terrorism or some of these other major issues. So they definitely see it as a very big problem; they think it's getting in the way of the country being able to function well, of leaders being able to effectively make decisions and do their work, of Americans being able to stay informed about current events. This is not a simple problem and it's not a new problem. It's the problem that so much of democratic political theory, in its 2,000-plus years, has been devoted to addressing. An epidemic of malicious fake news. We know that the Russians were propagating fake news through Facebook and other outlets. The expression "fake news" is a terrible expression and should never be used. "Fake news" suggests that there's some new thing that wasn't there before. What we face is the problem of propaganda. Propaganda as a concept begins at the very beginning of discussions of democratic political philosophy. In the 20th century, we have propaganda arising in the First World War. In wartime, propaganda is always needed in order to represent the enemy as some sort of uncommon villain, one so beyond the pale and so fearsome that you need to risk your life in order to protect your family. Then we have the National Socialists, who turned propaganda into an art form. So the problem of propaganda has always been central to any discussion of democracy, because democracy allows people to say what they want. Plato, in Book Eight of The Republic, says that democracy will lead immediately to tyranny, because democracy has at its core freedom of speech. You can't have democracy without the freedom of speech. How do we deal with this while remaining a democracy? This is the central problem of democracy.
We can actually remove a lot of the partisan politics from this and say, "It's not about the left, it's not about the right, it's about being able to trust your sources of information." The public puts most of the onus on journalists and the news media to solve the problem of made-up news, although most of the public thinks it will get worse over the next five years rather than get better. Decades earlier, we could all rely on and trust that what news anchors were telling us on television was socially accepted fact. But with the rise of social media, and going from web 1.0 to web 2.0 to web 3.0, where anyone can create a social footprint that mirrors that of a media organization, the ability to trust that information is almost completely lost on us. As tech giants and media conglomerates fight for control of the 24-hour cycle, independent newspapers, the backbone of local and regional journalism, remain the most trusted source of news around the country. Is it the threat of Russian maneuvering or the death of trusted local journalism that poses the bigger risk to reliable news in America? Even the best local newspapers have struggled to fully adapt their business models and newsrooms to this new media landscape. Thousands of reporters and editors have been cut in the past decade, greatly diminishing the capacity of independent newspapers to consistently cover their communities in depth. The Gazette's been around for 125 years; we just celebrated our 125th, and it has been a family newspaper the whole time. Right now we're owned and controlled by a brother and a sister. We really are, I think, the quintessential, you know, locally owned family business. It's definitely the worst it's ever been in terms of questioning whether the media is in bed with somebody or is corrupt.
When you ask questions about trust, you see that local news organizations tend to get the highest level of trust from Americans, with about 25% saying they have a lot of trust. National drops down to close to 20%, but when you ask about social media specifically, you're down in the single digits when it comes to trust. There's still not a lot of understanding, once you pull back the curtain, of the newspaper business, and I'd say the media business in general. As a reporter I don't have an opinion; it's not my job to have an opinion. You know, my job is to present the facts. A newspaper has an opinion section, which is different from the editorial section; not many people understand that that's not the opinion of the reporters, but just a small group of people who form those opinions.
GPS VOICE
Head southwest on Maxon Road Extension toward Van Der Bogart Street. Schenectady is a, it's a classic American Rust Belt city. A lot of the, you know, the housing stock in some of the city's neighborhoods is substandard. We've been reporting on these issues because several prominent buildings have been condemned, and we find that people are still living there despite safety issues. But where can they go if they are low-income? I know somebody is home 'cause we saw them come out as we were driving by. There's kids inside. This is local reporting, man; it's a lot of just waiting on people's porches and being inconvenient. Our paper, like a lot of papers once upon a time, made a lot of profit from our print advertising business. We had tons of very loyal advertisers. There was this shift to digital journalism and we were slow to respond. The digital space in general, when websites were sort of coming about, brought a massive change to the way news was structured, sort of breaking up that bundle. But social media changed things even more, in the sense that there was a lot more that people could come across by chance, and by others sharing with them. Sort of that network of sharers, and the bumping into news, as opposed to one specific time when I was going to sit down and take it in through this dedicated, organized fashion. We have fewer resources, that's a reality. We have far fewer editors. And so in many cases there are far fewer staff in newsrooms than there used to be, and the news cycle itself has become minute by minute. Breaking news tonight: the drama unfolding on Capitol Hill. We are breaking in on a very busy news day. Busy news night, busy news week. Fast-moving developments on fast-moving fires.
And so there's a constant feeling of the need to stay up to date with whatever is breaking or happening at the moment, and with fewer staff able to sort of turn away and spend time on a focused, dedicated story, it becomes harder for news organizations to do that kind of work.
NARRATOR
It would be impossible to understand today's news environment without understanding the role and impact of satire and late-night comedy in the current 24-hour news cycle. Thank you, and what kind of real news have you heard out there? (audience laughter) We've always wanted to do an exhibit about the power of satire and free expression in the First Amendment, and this gave us a perfect opportunity to do it. What better moment than the year before a major presidential election to talk about the power of politics and satire. And now for our continued comprehensive coverage of the final blow. (audience laughter) You're out of order, he's out of order, this whole trial is sexy. (audience laughter) President Clinton's historic impeachment trial begins Thursday, and the most important issue facing the United- Back 20 years ago, when Jon Stewart started doing his show, people dubbed it fake news because they weren't real journalists. They were doing a sort of reporting; they were interviewing people, gathering facts and adding humor to it, and people called it fake news. Well, that term has become a much more malevolent term these days. -Waterboarding is how we baptize terrorists. (crowd cheering) -Huh? (audience laughter) We start this exhibit actually back at the beginning of our country: British colonists were making fun of British rule and of King George, so sort of that element of wanting to make fun of people in power is part of our American DNA. And so Jon Stewart comes on the scene in 1999 and kind of makes it into a real cultural powerhouse. I think it evolved. Early on he was doing, I guess you could say, a little bit of that juvenile humor that, you know, is so much a part of shows like Saturday Night Live and others, and then he really kind of twisted it to get into, like, let's talk about some more important issues.
That's what I want to get to: that vision of what 21st-century government looks like outside of the polarization of it. Individuals have to be really careful about the way that they experience information. Does the headline support the facts? Is the story even a legitimate story? You have to ask, "What's the source for this information?" "Does this sound exaggerated?" "Does this sound ridiculous?" And then search a little bit more. Review it and think critically about it. And I think that's what shows like The Daily Show and John Oliver and Sam Bee and Hasan Minhaj are trying to get people to do. Take a look at the facts and then really think about it. Really look at the hypocritical nature that politicians have in some cases. Really ask hard questions about the information that you're getting, and about the people who are telling you things. If I walk over there and sit next to Mr. Johnson and carry my phone, does Google know that I was sitting here and then I moved over there? I genuinely don't know what type, knowing what type- -I'm shocked you don't know; I think Google obviously does. Are you familiar with the General Data Protection Regulation by the European Union? -I think the fear of what happened with the social media platforms, the fear that people now have of, like, privacy issues and data mining and all of these things, has just led people to find ways to counteract the problems that we're seeing. And a lot of people are turning to media literacy and the media literacy educators and community to be that answer. Okay, so first things first. Let's get to your folders. It's kind of a new and enhanced version of literacy, right? So it's really asking the question, you know, what does it mean to be a literate citizen in today's world? The idea is that you're teaching people not just how to read and write, but how to be able to negotiate all sorts of media forms.
All of our communication tools, all the different types of technology: we have to be able to consume and create using all of them. If you're going to start teaching about reading and writing, you should be teaching about how the digital environment operates, and, you know, you should be teaching all of that with an eye toward critical understanding. As students spend more and more time in the digital world, the concept of digital citizenship is seen as an important area of educational knowledge, not only in the United States but around the world. So, digital learning, the general definition, is any type of learning that incorporates technology usage within it. A more practical use of digital learning, or how digital learning is more formally known across education, is in the kinds of practices that teach kids how to use computers in a responsible and respectful manner. Go ahead and minimize the game so that we can see your code, Maya. And team, let's get ready to share some shout-outs and suggestions for Maya. I see several students want to give you a shout-out and suggestion. They don't know how to use this technology, and when they get online and start to use the internet, there's a whole world that's open to them, right? There are so many things that they're exposed to and so many things that they can explore. And unfortunately, without the proper guidance, kids tend to make bad decisions; they tend to not understand how their actions on the internet will affect them later on in their life. A lot of times we have the misconception that because our kids are so exposed to technology, they automatically know how to use it and they know what to do, and that is false. Kids need guidance in how to use technology, just like anything else.
As we examine the role of media literacy in the lives of our children, we must also reconsider its position in the lives of our parents and grandparents, who've witnessed the drastic shift from a print culture to a social media culture and are vulnerable to similar threats. I think that there is something very much to be said about the generational gap in how people have used the internet over time. I remember using Napster as a kid, and also understanding, like, what is aboveboard and what is not. You know, for older generations, and this is what we've seen in a lot of the academic literature, there is more of a kind of susceptibility or vulnerability to consuming more of it, and then also, over time, believing it. And that's not to say that older generations are searching for this content by any means; they are being targeted repeatedly. Over 65: that population shared more fake information than any other population during the 2016 election. And I think culturally and generationally they grew up in a time where, you know, if you read it, it was true. So I'm going to rest my finger where it says "open seven days a week," right on the word "open," and I'm going to let go. And now what I've done is- The dirty word that nobody wants to use is ageism, right? We live in a society where we have a bombardment of negative stereotypes about people who are older. And we have kind of accepted that somehow we're going to segregate our society on the basis of age. You know, maybe 15 or 20 years ago, seniors and older adults were driving a lot of our community and civic dialogues. So, when you went to a public meeting or a political event, it was 70-year-olds who were often the people who had the experience and the confidence to speak out on policy issues. But today we've had this sort of moment where all of those dialogues have shifted online, and a lot of the tools and environments that people are using are now digital.
The speed at which technology changes challenges us as a population and as a society to continue to educate people outside of formal education. So, the question is, how do we teach, you know, how do we deal with that divide? Because it's there. -I read, I learn how to do things that I'm interested in. If I'm looking for news, I go to look for news stories that I'm interested in; I use it, you know, for my daily life. Pulling that right handle either to the right and down or- What we need today is to get seniors trained and empowered and included in the digital conversations about our country's and our communities' futures, so that they can bring those perspectives back in. I think that a lot of media literacy education is focused on messages: the importance of interpreting messages and being mindful of the messages that you create. Her claim is that we perceive the printed word as more credible than the visual text. Do we agree with that? One of the reasons I love media literacy, I love teaching media literacy, I love talking about media literacy, is because it is so broad and there are so many different ways to practice it, but of course that makes it hard to scale, right? Because I can't just go into a school district and say, "This is the way you have to do media literacy." With that said, though, the fact that it is adaptable, the fact that it can be flexible for different communities in different contexts, is a positive. When we're online, we already have the notion that a lot of things are, like, fake news, that you can't trust what you see online. But when it comes to print, I feel like there are so many people who look it over, over and over again, before it gets printed. So I feel like, I guess it may not even be 100% true that print has more accurate news, but I think there's a perceived notion that it does. That's a very interesting point that you just made.
The process that something like this goes through to get to print, versus the ease with which we can share information online: I think it's an excellent point. The challenges of scaling media literacy often show up in this idea of "Oh, teachers have too much to do. Teachers in elementary school or middle school or high school in the United States have so much on their plate that you can't also, on top of it, give them media literacy to teach." So, the way that we frame it is: media literacy is a way to teach, it's not a subject to teach. What's been so interesting about the fake news conversation and the misinformation and disinformation conversation is that we're making an assumption that the problem is the misinformation and disinformation. The problem is much broader than that. Even if we eliminated everything that's fake, even if Facebook could magically, you know, make all the disinformation disappear, we would still need media literacy; we still have so much to understand. We should have always been asking questions around journalism, just because- It's hard to unify the country, though, with the news media being so split up. (people talking over each other) People tend to be connected to other similar-minded people, and that can lead to polarization, because, you know, if the two separate groups are only talking to themselves, they can sort of get feedback loops, so they get more and more extreme. If you have two parties in echo chambers, but one party is able to draw some members of the other party into its own echo chamber, then the people who get drawn in, even though they don't like that party, are more likely to vote in that direction, because they think everybody else is going to vote that way. But it is important to point out what is a blatant left-wing double standard in this country. The hypocrisy on the Republican side for the last few years, I can't even fathom.
As adults continue to find new ways to disagree, some kids are working hard to learn better ways to communicate and debate the more challenging topics of our time. The New Haven Urban Debate League, a student organization at Yale University, provides free training in debate and the communication skills, critical thinking, teamwork and advocacy that students will use for the rest of their lives. This practice is a little bit different than most practices, just because we're done in terms of tournaments for this semester. It's going to be a little bit more laid back. We'll go over what we saw in the last tournament and kind of what we learned from it. They want to win the round and they want to succeed. So, even if that means arguing against something they really believe in, they're going to try their best to see the argument from that side. So, I think in that way the competitive drive is helpful. (indistinct chatter) So you guys know that was it, that was our last tournament of the semester. (clapping) Debate is, you know, exactly what fights polarization. We have two kids debating against two kids. They get about 15 minutes to prep their arguments before they debate against each other, and they don't get to choose which side of the argument they're on. The issues that we give them, we talk about in advance, but they don't actually know the resolution, what they're actually arguing, until right before the round. Debate is learning about both sides of the argument. So, you know, no matter what you believe in, you're going to have to get up and talk about something, and try your hardest to win the argument from that side. Like, let's say that you get something wrong online about, like, a political candidate or something; that can affect how you vote in elections, and that can, like, really damage the kind of coherency of our news sources. -There's a competitive incentive to win.
So you're trying your hardest to find the best arguments for and against your position. And our first contention was that banning fake news or censoring fake news destroys the purpose of social media. So we did- Awesome. Okay, so if you look at your notes from the last round, we're going to do, like, a quick look at how the final round went, the arguments they made, and then we'll go over what everybody else said, if that's good. Debate is about being able to justify your opinions. So, you know, I remember in high school having really strong opinions about things, and then you go into a debate and you really have to explain why. And so I think that gives people a really healthy way of looking at their own opinions and the opinions of others. And I think another really, really important part of debate is learning effective communication. I see a lot of adults who get very riled up when they speak about something that they believe in, and they, you know, jump to arguments that really aren't arguments, and they start jabbing at others. And what I've learned is it's not just learning the information, understanding how to utilize it, how to process it very quickly; it's also how to very effectively communicate it, so that not only are you getting your point across, but you're creating a good conversation. I think we're very aware that we are the generation that is going to see the consequences of what happens to our democracy, what happens to our planet, and there is a growing sense of urgency. It's really up front and personal, in terms of them realizing that the things they're debating right now are things that are in their hands currently. I can be doing something about this if I wanted to, because I can tell right now that I have the brainpower to do something about this. So maybe I should.
The work that they're doing for an extracurricular activity kind of forces them to look at reality. -You know, we look around the industry and we're, I mean, for lack of a better word, horrified about what we see in terms of, you know, the involvement of hedge funds, or anybody who may be interested in the business from a purely business standpoint. The Daily Gazette has about 200 employees altogether, a mix of part-time and full-time. We still have to, you know, pay attention to the finances, but we're also exploring other avenues, and we're not alone in this, about finding ways to pay for reporters and staff members through various grant money. Right now we're in discussions with a few different philanthropies. So we're looking at all these kinds of models that might help. We're going to do this as long as we can, and we're going to try to, you know, do it independently as much as possible. The traditional media forgot two things in the last 20 years. First of all, they forgot that their content is valuable, and therefore, if it's valuable, they should not be giving it away for free. Once you give it away for free, you become solely dependent on advertising, which brings us to the second problem: if you're solely dependent on advertising, the most important thing is to have a catchy or deceptive headline in order to get a page view. Capitalism makes the news deviate from its mission. In successful democratic times, the news is boring. People are making concessions: one group wants this and the other group wants that, and the politicians work it out. No one's going to turn on the news to watch that; there is no money to be made in straightforwardly doing your democratic job. And it's only $4.99 per month. Now you can watch One America News Network live anytime, anywhere in the world. Go to our Facebook page and click here to become a supporter today.
One of the things about the architecture of the internet is that it provides a series of spaces for content creators to monetize not only the content that they create, but also the visitors that they bring to those websites. If this bill were to pass, would it prohibit the sale of the Bible, which teaches these things about sexual morality? Well, literally according to how this law was written, yes it would. The misinformation factory model is so successful because it can be easily replicated and streamlined, and often requires very little expertise to operate. Meanwhile, legitimate local news organizations, which often rely on similar ad-supported infrastructure and industries for their livelihood, are suffering. So, oftentimes a lot of the hyper-partisan social media pages and sites that portray themselves as news-like, but are actually just pushing disinformation, are able to use social media to bring people to the website, which then brings them money through paid banner advertisements that are usually programmatically placed by big ad exchanges like Google. The structure of the social networks controls what information you have access to. People who get into it for political reasons can make it much more of a career because of the financial prospects that are often involved. Sometimes what you really need is just good investigative journalism, right? A lot of these campaigns are exposed by investigative journalists and by the media, and what they've done is follow the money, pick up the phone, and try to understand who's really behind something. I 100% support everyone's right to free speech, but freedom of speech doesn't mean you're entitled to profit from that work. We've entered an era where brands don't know where their ad dollars are going. When a brand purchases an ad with Google, an algorithm based on keywords is used to target consumers. However, sometimes these ads can be placed on unfavorable websites without the brand knowing.
There was a study that came out earlier this year from the Global Disinformation Index that found that $235 million every year is going to fund disinformation. Each dollar that's going to fund disinformation is $1 that is not going to fund legitimate sources of news, and that's really problematic. The other thing is that brands never asked to be on disinformation sites, so how is it that that much money is going towards disinformation? There's something up. I went on Breitbart News, Breitbart.com, for the first time after hearing about it throughout the election cycle, and I wanted to see for myself what this website was about. And when I went on the site for the first time, I was shocked to see Ads for brands and companies that I shop with, that I frequent, that I'm a customer of, advertising on this website. I had a hunch, a very strong feeling, that they had no idea they were on this website. So I started by writing a Medium article, like just a blog post online bringing attention to the fact, and that's how I met my partner, Matt Rivitz, who had already started an account a couple of weeks before called Sleeping Giants. We would just take a screenshot of their Ad on Breitbart, and we would, you know, we would just tweet at them with that screenshot saying, "Hey, did you know you're appearing on this racist website?" And it really was just a question. It turns out nobody knew, not a single brand was aware that their Ads were appearing on that site. But almost across the board they were horrified to find out that they were. So we generally got responses pretty quickly, brands would just... It was Twitter so they would just respond to us saying, "Thanks for letting us know, we will be sure to take it down." Breitbart lost 90% of its Ad revenue within three months. It hit them really hard. We had no idea we were that effective. What we did know is that brands were slowly sort of waking up to where their advertising was going.
When they sign up and turn on their Google Ads or their Facebook Ads, those companies have promised them that they will not serve their Ads on any website or any publication that is objectionable or hateful, and they've really reneged on that promise. With Breitbart that was the first time that a lot of brands understood this, but the bigger problem here that I see is that there's hundreds of thousands of more Breitbarts out there that tech companies are not doing anything about, that they're not paying attention to. And unfortunately a lot of brands still don't understand just how massive this problem is. I think on a more structural level, you know, this is about freedom of speech versus freedom of reach. Do you have the right to say something online? Sure, one could argue that. However, do you have the right to monetize and amplify it by breaking the rules? That is an area that I think we can make significant progress on, more than we could five or 10 years ago. Consumers need to understand that we can make that difference just by speaking up. There's definitely this tension in the United States between the freedom to access and publish information, and the desire to have, you know, the issue of made-up news somehow be addressed, and there's a real tension that exists there. And in most cases when we ask this, the public did not want the government to take these kinds of steps if it was going to be a risk to the freedom to access and publish information; they were more willing for technology companies to do so. One man with total control of billions of people's stolen data, all their secrets, their lives, their futures. As you may have already realized, this is not Mark Zuckerberg. Computer-generated videos like this one are known as Deepfakes, after a 2017 Reddit user of the same name began posting doctored videos on the site. To address their concerns, their hopes and their dreams.
Obviously, the potential for serious harm with these Deepfakes is quite great, on elections, international states, for diplomatic purposes and even for our private lives. That's why we as a country need to take swift action and invest in the research and the tools for identifying and combating the deepfakes, and create a national strategy immediately, especially for election integrity and ahead of the 2020 presidential election. We already know Russia's intentional campaign to spread disinformation throughout the last one, and I don't even want to imagine what Russia or China or just private players, the havoc they could wreak on our elections and our personal lives. Thanks, Cheryl, for bringing attention to the problems of deepfake technology, and go Navy, beat Army. So I will say media manipulation is really not a new thing; doctored photographs appeared about 20 years after the first photograph was ever made in human history. You know, about 100 years later, with the computers, with the internet and digital cameras, we have Photoshop, and then we start to see fake photographs, and making fake photographs is much easier. Forrest Gump is one famous example where they actually put Gump into this video sequence. Congratulations, how do you feel? I got to pee. I believe he said he had to go pee. Making fake videos is difficult but also possible, so it's usually big Hollywood studios that can make them, with big budgets. My general research interest is in computer vision and machine learning, with a special focus on digital media forensics, which is essentially to tell if a piece of digital media, including image, video or audio, has been manipulated digitally in some way. With the abundance of online media we share, anyone is a potential target of a deepfake attack. Blinking is a subconscious activity; a normal person usually blinks every, you know, six to 10 seconds. And if you have a longer video where...
the person does not blink, that probably gives us some cue that this may not be a real video. The face, the head is moving in 3D, but the face is actually pasted on with a 2D transform, and a 2D transform always leaves some discrepancy. Well, I think there is no doubt that this technology is going to grow as time goes by. And there's a huge amount of interest actually on the research side of generating more realistic images, as close as possible to human voice or human faces. It can be used in either a good way or bad way, depending on who's going to use it. It's really hard to give a sense of the growing and global scale of the issue, but here are a few recent examples. Today a report by my colleagues over at the Oxford Internet Institute highlighted that more than 70 countries currently use computational propaganda techniques to manipulate public opinion online. Since October 2018 Twitter has disclosed information around more than 25,000 accounts associated with information operations in 10 different countries. On Facebook over 40 million users have followed pages that Facebook has taken down for being involved in what they call coordinated inauthentic behavior. Now politicians increase their spend, increase their budget for social media operations, and there's a budget for the legitimate official campaigns, so the maintenance of the official profiles on Facebook but also Instagram, on Twitter, and YouTube, but they also have a budget for these underground black-ops operations, where it's about creating these communities. In reality, if I'm a government trying to manipulate social media, what I want to do is I really want to hide inside organic groups of people, right? And it's going to be really hard to go and look at a group and say, "Actually, those accounts are not real people, are not part of an organic campaign, they're part of an information operation that a state is conducting."
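The blink cue the forensics researcher described a moment ago can be sketched as a simple check over per-frame eye-openness scores. The scores, the 0.2 "closed" threshold, and the 15-second cutoff are all hypothetical; a real detector would derive eye openness from facial landmarks:

```python
def longest_no_blink_run(eye_openness, fps, closed_threshold=0.2):
    """Longest stretch, in seconds, during which the eyes never close.

    eye_openness: one score per video frame; a dip below
    closed_threshold is treated as a blink."""
    longest = current = 0
    for score in eye_openness:
        if score < closed_threshold:
            current = 0            # blink observed: the run resets
        else:
            current += 1
            longest = max(longest, current)
    return longest / fps

def looks_suspicious(eye_openness, fps, max_gap_seconds=15.0):
    """Flag clips where the subject goes implausibly long without
    blinking, as early deepfakes often did."""
    return longest_no_blink_run(eye_openness, fps) > max_gap_seconds

# Hypothetical 30 fps clip: 600 frames (20 seconds) with eyes open throughout.
print(looks_suspicious([0.35] * 600, fps=30))  # True
```

A face that blinks normally, say once every five seconds, never accumulates a long enough run to trip the flag; as the transcript notes, a cue like this is one signal among several, not a proof on its own.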
So in order to do that we look at patterns at a large scale, because if you have a small number of accounts who are trying to replicate the organic diversity of a large number of accounts, there are different ways in which they're going to fail. And we say, "Okay, well, that campaign doesn't move like an organic campaign of people just coming online together." And so sometimes you have, you know, a campaign with people who are behind it who didn't try to hide very much. And so for instance you can look at the email addresses that are behind an account or the IP address where they're coming from. But sometimes it can be really difficult, because you can have actors who are really keen to hide their identities. In cyber security we call it OPSEC, operational security. It can be very difficult to tell who's behind an account. Digital forensics can be quite hard in some cases. To what's being called a bold move by Twitter CEO Jack Dorsey, announcing plans to ban all political Ads as the 2020 campaigns ramp up. Would I be able to run advertisements on Facebook targeting Republicans in primaries saying that they voted for the Green New Deal? I mean, if you're not fact-checking political advertisements, I'm just trying to understand the bounds here, what's fair game? Congresswoman, I don't know the answer to that off the top of my head. So you don't know if I'll be able to do that? You know, we talk about the disruptions in the media climate over the past 15 years and, you know, Facebook has been a huge contributor to that, you know, just the fact that it's played like a universal role in people's lives. What is the right intervention here? And one argument is let's just ban these people, let's just take this down, let's lobby Facebook to invest in content moderation, which is precisely about finding, like, hurtful words and banning those people forever from the platform.
And we say that that's important, that's one component of it, but that's also about taking down content and fake news that is already out there, already out in the wild, already circulated, right? This could be easily weaponized, particularly in communities, in national contexts where there are aspiring authoritarian leaders. And I think Duterte has taken out all the rules, he has green-lighted extrajudicial executions, he has harassed and bullied opponents. He's insulted the President of the United States and the United Nations. We see this in Malaysia, we see this in Indonesia, Thailand and the Philippines, where there are attempts to regulate fake news. And the solution to fake news is actually even worse than the actual disease: to actually silence the opposition, to muffle people who are expressing political dissent. And that's why in our research we're trying to argue against content regulation, because we can't rely on the government to tell us what is fake news or not. And so the trust level, that sort of gut reaction to trusting the news I get on social media, is very low. We also ask about, sort of, do you think it's generally accurate or inaccurate? And you have more people that would say it's largely inaccurate. A majority said also it's not really helping me be more informed about the events of the day. But at the same time, they turn to it, and the main reason they say they turn to it is for convenience. You know, there are sort of these caution flags that people may have in their head, but they're definitely still going to use it. In the newspaper you could skip from story to story, but once you start reading a story the default is to keep reading that story. Whereas in social media the default is to get immediately onto the next story, and you have to put in extra effort to then click out, open it up and go through the story in its entirety.
A lot of things in that space come to you, as opposed to you needing to seek them out. I think it's hard for the artificial intelligence algorithms to keep up with the continually changing target of what makes something misinformation. Every time we detect a new technique, we see people inventing another technique. What we have been working on a lot is the power of the Wisdom of Crowds. The classic example of this was, you know, from the early 1900s: at some county fair there was an ox, and a guy had people guess the weight of the ox, and any individual person had no idea. But once he averaged the answers of a whole bunch of people, it was exactly right. And this has been shown in all different kinds of domains, and so I think the big question right now is how well does the Wisdom of Crowds work in identifying fake news and misinformation? But I think that the advantage of the Crowd approach is you don't need trained experts. If you're Facebook, for example, you have a lot of money to throw at this problem if you are so inclined, and so you can hire a big crowd, and the crowd doesn't have to have any particular expertise or skills. Right now we have the destruction of truth itself. Right now we have people trying to say "there is no truth," it's just "Which side are you on?" That's just treating the information space as a complete game. There are always going to be very loud voices on the fringes of both sides advocating things that will generate, you know, conversation. But if we focus on how, you know, they can sustain financial operations by doing this, that's an area that I think everyone can agree deserves action. It has to be human first, machine second. On the one hand there is a role for automation, for artificial intelligence, for machine learning, because the scale of the problem calls for that. But on the other hand, it has to be guided by humans, of course, right?
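The ox story is Galton's 1906 county-fair experiment, and the averaging effect behind it is easy to simulate. The numbers below are stand-ins (1,198 lbs is the weight usually quoted from Galton's account; the crowd size and noise level are made up):

```python
import random

def crowd_estimate(true_value, n_guessers, noise_sd, seed=0):
    """Average many independent noisy guesses of one quantity.

    Each guesser is individually unreliable, but because the errors
    are independent, they largely cancel in the mean."""
    rng = random.Random(seed)
    guesses = [true_value + rng.gauss(0, noise_sd) for _ in range(n_guessers)]
    return sum(guesses) / len(guesses)

# 800 guessers, each off by roughly 150 lbs on average.
estimate = crowd_estimate(true_value=1198, n_guessers=800, noise_sd=150)
print(round(estimate))  # close to 1198 despite large individual errors
```

The open question the speaker raises is whether the same cancellation holds for "is this headline false?", where the cancellation argument only works to the extent that people's errors are independent rather than correlated.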
Like by deep expertise, by people who truly understand the nuances of the problem and the trade-offs that come with it. NewsGuard is a browser plugin currently, and it works on Safari and Chrome and Firefox and Edge, and it's the result of humans looking at news and information websites and trying to figure out whether or not those websites are credible or transparent. What we try to do at NewsGuard is to restore some order and some sense of context. We've hired dozens of journalists who read and review and rate websites. It's a process that typically involves five or six layers of people, starting with the person who drafts what we call the nutrition label and the rating. These different shields are our reviews, and if you hover over one, you can see that we think this website, USnews.com, is generally reliable, and we explain the different criteria, whether it's a yes or no for those things. And we also have a full report, we call it a nutrition label. We've rated about 3,700 news sites in the United States, Italy, France, Germany, and the United Kingdom. And in each of those countries those sites are responsible for at least 90% of the news and information consumed online. If you come to a website maybe you're not familiar with, you'll see our red shield, meaning we think that this website severely violates basic standards of credibility and transparency. And again we sort of explain all our rationale for any of those ratings. That also shows up in Facebook feeds and Twitter feeds. So, if you come across a website and a headline that you've not heard about or don't know anything about, we've probably rated it and you can, you know, make your own decision about whether you think that website is reliable or not. What we've designed this whole thing to be is the opposite of an algorithm. First of all we're completely accountable, all of our work is right there, we're completely transparent.
If any news site seems not to be living up to even one of the nine criteria, even if it's relatively minor, we call for comment. The last difference is, unlike an algorithm, we want people to game our system. We want news sites to see, well, if I did this, if I had a corrections policy for example, I can get a higher score from NewsGuard. And so far over 600 different news sites here and around the world have changed something about what they do in order to get a higher score. As the volume of data grows, so does the challenge of handling misinformation, which tests both machine and human ability to uncover the truth. So it's really important, I think, to zoom out and look at this problem set as something that's really about consumer protections and access to information. Too many factors, too many viewpoints, too many arguments. But what if there were right answers? What if they've been right here under our noses all this time, and we've been too busy trying to prove ourselves right to notice? The role of consumers is to become more knowledgeable consumers of information, to be more knowledgeable about all those headlines that they see on their newsfeed. That's always the important piece of it: you need to be asking a lot of questions about what it is that you're doing, what you are engaging with, whether it's a written text, a video text, a sound text. I'm hoping that a lot of these resources and strategies are more accessible to all schools, and all schools understand how to implement them better. If we pretend that the problem is entirely new, we will forget the old solutions. The old solutions tell us to eliminate inequality, to address poverty, to educate our citizenry, to make people less susceptible to fear and anxiety. Information campaigns right before elections are problematic and have been problematic for a long time.
It's like social media didn't invent that, but it's certainly possible that social media exacerbates it by making it easier for things to spread widely. Everyone is really intent on limiting the actual impact of these kinds of disinformation networks. However, the landscape is constantly changing, so it's a constant arms race between the disinformation networks and the Ad exchanges and other kinds of platforms that do not want disinformation spreading across the internet. Going forward, if you're going to deal with these informational problems, you're going to have to make people less susceptible to them. This program is made possible with support from Connecticut Humanities.