
Digital-age tools and technology give rise to fake videos

March 1, 2018

Cronkite School's Dan Gillmor and Eric Newton share their thoughts on the direction of fake video and what individuals and tech companies need to do

About the only thing more dangerous than a fake news story is a fake news video.

Fake news videos aren’t new, but they are on the rise and more realistic than ever due to technological advances. What used to be a fairly big production costing thousands of dollars can now be achieved with a selfie stick and a smartphone. That may not sound like a big deal, but when politics, propaganda and bad intentions enter the fray, the potential to cause harm is staggering and potentially irreparable.

ASU Now spoke to Dan Gillmor and Eric Newton, who launched News Co/Lab in October, a collaborative lab inside the Walter Cronkite School of Journalism and Mass Communication that aims to help the public find new ways of understanding and engaging with news and information. Gillmor, an internationally recognized author and leader in new media and citizen-based journalism, teaches digital media literacy and is director of News Co/Lab, which works with journalists, teachers, librarians, technologists and others to elevate news literacy and awareness in our culture. Newton, a global leader in the digital transformation of news, drives change and experimentation as innovation chief at Cronkite News, the news division for Arizona PBS, and also serves Knight Foundation as a consultant working on special projects and endowment grants. They believe fake videos soon will be “trivially easy, inexpensive, and all too believable.”

Dan Gillmor

Question: The Los Angeles Times recently reported that false videos will become so accurate “they will defy reality.” How long have fake videos been around, and what’s the usual tone and nature of them?

Dan Gillmor: Media hoaxes aren’t new. What’s new in the digital age is the advent of tools that make fraudulent photos, audio and video easy to make and easy to believe — putting words in people’s mouths that they didn’t say, and showing them doing things they didn’t do. Tom Hanks’ Forrest Gump character, in a film showing him with presidents, took time and money. Soon it’ll be trivially easy, inexpensive and all too believable.

Eric Newton: Of course, Hollywood and "War of the Worlds" are entertainment, not news. When believable fake video becomes common in the world’s news stream, we will be breaking new ground — and unless we have figured out what to do, we will be in real trouble. As the power of artificial intelligence increases, counterfeit audio and video files will become harder and harder for people to recognize. In time, people (including journalists) will simply not be able to tell a fake video from a real one. That’s a huge problem.

Q: Is there inherently more danger with a fake video than a fake news story, and if yes, why?

DG: Yes — for now, anyway, because people seem to think it shows something that really happened, such as videos of police shooting people and the countless scenes captured by witnesses with cameras during and after disasters. But people will have to adjust their thinking, and additional verification, beyond what’s apparently shown in the video, will become even more important.

EN: That’s right: Seeing is believing, until it isn’t. Fake news video could start a panic. Or worse. Example: Warren Buffett seems to say he is selling all his stocks (when he isn’t) and advises everyone to get out of the stock market right now (which he didn’t). A political leader calls on supporters to take up arms and occupy local police stations (when he or she didn’t). Worse still, in this confusing environment, an authentic video showing a police shooting can easily be dismissed as “fake” by politicians when it really happened. If no one believes anything, the worst of humanity can hide in plain sight. In the long run, that’s far worse than a subgroup believing in a conspiracy theory.

Q: The article speaks to a scenario where someone like North Korean leader Kim Jong Un could announce a missile strike or make a doomsday proclamation. What safeguards are in place right now to determine whether such a video is fake?

DG: The U.S. has all kinds of technologies that watch for missile launches, and Kim has already made plenty of belligerent statements. But even if the danger of a nuclear war isn’t huge in this scenario, a video of this kind — if spread and believed widely before debunking — could destabilize markets and cause other kinds of trouble. The safeguards are fundamentally our insistence (and especially our leaders’ insistence) on wanting proof before trusting such things.

EN: Military technology does not directly help the average person. And do we really want government employees dictating what’s true? We need detection software, available to everyone. If AI can create fake video, AI can detect the fakes. In the future, software that detects digital misinformation may be as common as anti-virus software, spam blockers, ad blockers, fraud blockers and the like. We can’t wait until the fakes walk among us; this needs development now.

Q: Platforms like Facebook, Google and Twitter have promised to police themselves when it comes to fake news. Have you seen evidence of this, and what advice should they heed in regard to fake videos?

DG: They’re trying, and not always succeeding. We have to ask whether we want the tech platforms to be arbiters of truth, however. For sure, the platforms should participate in, and maybe lead, a global initiative to develop better detection and verification tools. This is easy to say and almost certainly difficult to do. They should be helping their users (and third-party developers) create add-on tools to help us be the arbiters of what we see. And they should be much more transparent about what they do, and how they do it.

Eric Newton

EN: Tech companies can make a big difference. We wouldn’t partner with them or accept their funding if we thought otherwise. Tech companies have the capacity to lead the way in developing filtering tools that prevent their own products from being used for evil purposes. People should be given choices about how much filtering they want, just as people should have the right to examine and change data that private companies have acquired from them. For consumer choices to be effective, however, a lot of other folks need to step up: journalists can become more transparent and community-engaged, educators and librarians can make it their business to know and teach the fundamentals of all modern literacies, and each of us can learn to share news with more care and to push back against misinformation.

Q: How can the public better educate themselves on fake videos so they don’t get duped?

DG: Start by understanding that malicious actors are trying to deceive you — that they are talented and have time and resources. Be relentlessly skeptical of just about everything. The more sensational it is, the more skepticism is required. The more you want to believe something bad about someone or something you dislike, the more skeptical you should be. Wait for secondary evidence. Society needs to put critical thinking at the core of education, as a lifelong skill we constantly develop and improve.

EN: We should find and use news sources not because someone shared them with us, but because we know them and agree with how they verify and clarify news. Those sources should win and keep our trust by being clear about how and why they do what they do. When we hear or see a story that is “too good to be true” or in other ways makes us wary, we need to check it with one of our trusted sources.

Q: Are we doomed to live in a world where we can believe nothing, trust nothing?

DG: No. We have to trust someone along the way. We’re going to have to learn who’s more trustworthy and believable than not, recognizing that everyone makes mistakes. We have to demand better from institutions that want our trust, and we have to recognize our own responsibility in this changing information ecosystem.

EN: A good place to start is the News Co/Lab’s website, newscollab.org. Check out some of the best practices from newsrooms wanting to earn your trust — and ask your local news organizations to try them. Look at the best practices in education — and ask your local schools to teach the fundamentals of news and media literacy, civics literacy and digital literacy.

 

ASU's vision of future: Learning across lifespan — anytime, anywhere, any age

March 1, 2018

Crow: As change accelerates, ASU must be a place where people return again and again to build the skills for multiple, shifting careers


Arizona State University is setting out to disrupt the old concept of higher education by offering learning to everyone across the lifespan.

The university is already working to harness virtual reality and emerging technologies as it advances toward a “national service university” prototype, according to ASU President Michael Crow.

He laid out the university’s vision of the future Thursday in a TED Talk-style discussion at the new Student Pavilion on the Tempe campus.

“You cannot derive the kind of change you want. You have to build a new model,” he said.

After centuries of gradual progress, the velocity of change in our society is accelerating.

“Now we’re seeing change that occurs in a single generation — within a person’s lifetime as a worker,” Crow said, showing photos of shoebox-size mobile phones from the 1990s.

“Sixty-five percent of children entering primary school today will ultimately end up working in completely new job types that don’t exist yet” — not just different jobs, but completely different types of jobs, he said.

“Sixty-five percent of children entering primary school today will ultimately end up working in completely new job types that don’t exist yet,” ASU President Michael M. Crow said Thursday. Photo by Charlie Leight/ASU Now

Crow showed how far ASU has come, going back to 1988, when the university’s budget was $415 million, the four-year graduation rate was about 14 percent and ASU was ranked 105th in research expenditures (behind such schools as No. 69 Hawaii-Manoa and No. 94 Mississippi State). In 2018, the budget is $3.1 billion, the four-year graduation rate is 50 percent (71 percent for A students, 46 percent for B students) and ASU is ranked 22nd in research expenditures, with $545 million annually (ahead of No. 26 Ohio State University and No. 28 UCLA).

In 1988, the state invested $9,770 per student (in 2017 dollars); in 2018, it spends $3,141 per student.

Crow said that the current funding from the state covers about a third of the cost to educate an in-state student, and he would like to see that investment increase to about half the cost.

“We can make up the rest with the institution operating in the knowledge market, the research market and the international student market,” he said.

“And every once in a while, we’d like two miles of freeway. For the cost of two miles of freeway, we can build these ‘Star Trek’-type research buildings, and out of that comes everything you can possibly imagine,” he said.

“And after that we want to be held accountable for what we do, driven toward a certain set of goals and then to be left alone.”

Crow noted that over the past 30 years, ASU’s student body has evolved to more closely reflect the demographics of the state, with 50 percent of the current freshman class made up of underrepresented minorities, compared with about 10 percent in 1988. The percentage of undergraduates who qualify for the federal Pell Grant rose from less than 3 percent 30 years ago to 34 percent now.

Thirty years from now, by 2048, ASU will have created new ways of engaging with learners through technology, he said. But not recklessly.

“We do need to be careful about technology. We’re finding ways to enhance learning, not replace learning. We’re finding ways to enhance reality, not replace reality,” he said.

The new “national service university” model will be less rigidly connected to age than the current system of preschool and then K-12 followed by technical school or university and then a career.

“We’re evolving a model capable of being of service to all learners, at all stages of work and learning, from all socioeconomic backgrounds, through education, training and skill-building opportunities,” he said.

The university will be a “knowledge core” that offers its resources in different ways, digital or immersive, to typical and nontraditional students across the lifespan, whether for a short term or continuously. Content would be personalized, collaboration would be global and advising would rely on artificial intelligence.

ASU is already doing this. More than 10,000 students have accessed the Me3 app, which uses an algorithm to help students home in on a career and college path. The Global Freshman Academy is a series of online first-year university courses that can be taken for free and paid for only if credit is desired. An interdisciplinary student team is working on Axio, a “friendly AI companion.”

Video by Ken Fagan/ASU Now

The new way of learning would help people who never finished college, Crow said.

“There’s a pejorative term in higher education to make us feel superior to other people: dropout. Those are the people who should call us failures. We failed the student, and we make it almost impossible for that person to go back and finish college,” he said.

Crow said that with change happening so fast, the goal is to create lifelong learners, with the university a place where anyone, regardless of age or location, can learn “anytime, anywhere, anything that you want.”

“I hope to be part of a team capable of building these learning platforms where it doesn’t make any difference how old you are, you’re able to learn any topic at any speed.”

Top photo: ASU President Michael M. Crow speaks onstage at a community conversation at the Student Pavilion on the Tempe campus Thursday evening. Photo by Charlie Leight/ASU Now

Mary Beth Faller

Reporter, ASU News

480-727-4503