
ASU students win global hackathon with idea to control COVID-19 spread

December 11, 2020

Winning proposal combines thermal imaging, artificial intelligence to find 'hot spots'

Two Arizona State University undergraduates won an international hackathon competition for their idea to harness data to stop the spread of COVID-19.

Austin Brown and Hunter Silvey won the “data revolution” track of the ASU/Devex Global Student Hackathon, which was announced Dec. 10 during the 2020 Devex World virtual conference.

Their proposal, titled “A Thermal COVID Approach,” suggested combining artificial intelligence with images from internet-connected thermal cameras to predict when an area might experience an outbreak. The cameras would be mounted at high-traffic areas like grocery store entrances to collect aggregate thermal data. The algorithm would be trained to recognize patterns between spikes in the thermal data and COVID-19 testing results from the area.
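As a rough illustration of the kind of pattern such an algorithm might look for, aggregate thermal readings could be screened for statistically unusual spikes against a trailing baseline. This is only a sketch under assumed numbers — the function name, window size and threshold are illustrative, not the team's actual implementation:

```python
from statistics import mean, stdev

def flag_thermal_spikes(daily_avgs, window=7, z_threshold=2.0):
    """Flag days whose aggregate average skin temperature is
    anomalously high relative to a trailing baseline window.
    (Illustrative sketch only; parameters are assumptions.)"""
    flagged = []
    for i in range(window, len(daily_avgs)):
        baseline = daily_avgs[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_avgs[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Simulated daily averages (deg C) at a store entrance, with a jump on day 9.
readings = [33.1, 33.0, 33.2, 33.1, 33.0, 33.2, 33.1, 33.0, 33.1, 34.5]
print(flag_thermal_spikes(readings))  # → [9]
```

In a real system, flagged days would then be compared against local COVID-19 testing results to train the predictive model the students describe.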

“Representatives of a given area could then be given useful, accurate, real-time data about potential outbreaks in their area and be able to take measures to combat that, such as limiting occupancy or increasing social-distancing requirements or forcing lockdowns,” said Brown, who is majoring in computational mathematical sciences. Silvey is a mechanical engineering systems major.

Brown said that because the temperature data would be aggregated from everyone, individual privacy would be maintained – a key consideration and a requirement of the hackathon challenge.
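The aggregation Brown describes can be shown with a minimal sketch: only an interval-level average ever leaves the camera, and intervals with too few readings are suppressed so no individual stands out. The `min_count` rule and its value here are assumptions for illustration, not part of the students' proposal:

```python
from statistics import mean

def aggregate_interval(readings, min_count=20):
    """Reduce one interval's per-person thermal readings to a single
    average, and suppress intervals too small to stay anonymous.
    (Illustrative sketch; the minimum-count rule is an assumption.)"""
    if len(readings) < min_count:
        return None  # too few people: a published average could identify someone
    return round(mean(readings), 1)

print(aggregate_interval([33.0, 33.2, 33.1] * 10))  # → 33.1
print(aggregate_interval([38.5, 33.0]))             # → None
```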

Brown and Silvey competed against teams from universities around the world as part of the 2020 Devex World conference, of which ASU was a sponsor. The conference, originally scheduled to be held in person in June, instead was virtual. Devex is a media platform for development, health, humanitarian and sustainability professionals around the world.

With the virtual conference, ASU and Devex still wanted to showcase student innovation and came up with the hackathon idea, according to Stephen Feinson, associate vice president for international development at ASU.

Feinson, whose team pursues large-scale global research and training opportunities, worked on creating the hackathon with other units at ASU, including the Thunderbird School of Global Management and Devils Invent, a series of engineering design challenges.

After the pandemic hit, Anthony Kuhn, director of Devils Invent, organized several remote hackathons, and he drew on that experience to develop a playbook for the other universities participating in the Devex event.

Students around the world, including several teams at ASU, worked through one weekend to solve one of five challenges.

“It’s easy to forget how much can be done in one day,” Kuhn said. “It shows the students how when they buckle down and don’t have to switch contexts among different classes, they can get really far in the design process in 48 hours.”

Besides ASU, the other winners were Kwame Nkrumah University of Science and Technology, Ghana; University of Sao Paulo, Brazil; De La Salle University, Philippines; and Tec de Monterrey, Mexico.

Sanjeev Khagram, director general and dean of the Thunderbird School of Global Management at ASU, spoke at the conference during a session about how the pandemic is transforming higher education around the world.

“It’s an incredible opportunity,” he said. “As with every other industry, the pandemic has only accelerated what we call the fourth industrial revolution – the digital transformation of the world.”

To scale up digital educational opportunities, universities will have to form broad partnerships with other universities, the public sector and the private sector, including technology companies.

“We all know that if there’s a single lever that can improve the chances of everyone in the world, it’s education,” he said.

Top photo of a thermal imaging camera courtesy of

Mary Beth Faller

Reporter, ASU News



Social media fake news can be navigated with common sense tools

December 11, 2020

ASU expert says to take a breath and check sources before reposting

Staying current with reliable news about subjects like election security, pandemic mask effectiveness and vaccine safety is an overwhelming prospect for most people.

Few can follow the scientific journals and reputable — though competing — opinions in national news outlets. Social media offers a “quick glance” way to navigate the overwhelming volume of information. We rely on friends, family and co-workers on social media to do some of that legwork, and what they post can end up validating our cognitive biases, such as a belief that the pandemic is a hoax or that the election was stolen.

Nadya Bliss, executive director of Arizona State University’s Global Security Initiative, has an arsenal of tools that can be deployed to negotiate what can be a minefield of dangerous misinformation on social media. Bliss oversees the initiative's efforts to address complex, interdependent security challenges, including an effort to research and combat disinformation. As a member of the Computing Community Consortium, she recently contributed to a white paper identifying key research directions to detect and address the growing disinformation practices used to manipulate public opinion.

Recently, a report backed by photos, about an abandoned school bus outside Phoenix allegedly filled with ill-gotten voting machines, went viral on social media. It turned out the bus was actually filled with office equipment and not a single voting machine. Even though the local police department investigated and reported that the equipment had nothing to do with elections, the resharing and conspiracy theories continued to proliferate – with nearly 800 shares of the original post on Facebook and nearly 2,000 shares on Twitter.

ASU Now talked to Bliss about the strategies we can all use to avoid misinformation.

Question: How do we differentiate between what is speculation and what is news on social media?

Answer: Really, this comes down to the sources and how much you trust them. Is the source a trusted news outlet that you know and respect, and that adheres to the basic norms of journalistic integrity? Or, as in this case, is it a random person you’ve never met who happened to be at the gas station and jumped to a conclusion that fits his or her personal political beliefs?

In the case of the "bus full of voting machines" incident, checking with the Buckeye Police Department Facebook page shows that the bus contents were not voting equipment.

This is a perfect example of how important it is to cross-reference things you see on social media with trusted organizations and news outlets. If it’s a post about the pandemic, check the CDC website for COVID-19 or a reputable medical source like the Mayo Clinic. A good source for voter fraud information is your secretary of state’s information page and Rumor Control at the Department of Homeland Security’s Cybersecurity & Infrastructure Security Agency. Social media companies generally do not review posts for accuracy, while official institutions or news outlets do.

I know it can be labor intensive to take the time to check the accuracy of a post or a source, but it’s worth it to do the extra checking. The spread of false information, largely by social media, is a major problem in this country, and as citizens we should all do our best to avoid contributing to that problem. If you can’t validate a story through a trusted news source or institution, don’t share it.

A study by MIT a couple of years ago found that a true news story takes about six times as long as a false one to reach 1,500 people on social media. The purveyors of misinformation have become really good at manipulating both human emotions and the algorithms that govern what shows up on social media feeds.

Q: How do conspiracy theories, such as the assertion that COVID-19 was created as a bioweapon, continue to flood social media, and sometimes traditional media, channels?

A: Conspiracy theories flourish, at least in part, because they fit a narrative and because they help people make sense of a complicated, messy world. For example: If I already don’t trust politicians and think the government wants to control its citizens, and then see government-imposed quarantines and mask-wearing mandates because of a disease I have not personally been impacted by, it’s an easy jump to believing a story on social media about how the disease was manufactured in order to control the populace.

Conspiracy theories like these are also attractive because they offer a sense of inclusion – the believers in QAnon feel like they know a truth the rest of us don’t, and there is something empowering in that.

So a big component of why disinformation or conspiracy theories spread is human in nature. Another component today of course is the platforms used to spread false information. Social media is not the only way disinformation is spread – there are plenty of print and broadcast outlets that spread false information as well – but social media obviously plays a major role in the disinformation ecosystem.

Social media companies do pretty minimal vetting of the information posted to their sites. If a post is espousing violence, it may be blocked, for example. But most disinformation efforts today are crafted to avoid being blocked, and often include some elements of truth along with the lies to make them more believable and thus more likely to be shared widely.

There is an instinct to trust the information coming from your network – from your friends or co-workers – but they are just as susceptible to spreading misinformation as the rest of us. This is why it is so important to have trusted media sources to check and validate what you are seeing. For instance, NPR doesn’t depend on advertising revenue or reader clicks to drive its reporting, which usually makes it my go-to news source. There are many reputable media outlets, but sometimes it’s difficult to discern what’s fact and what’s opinion. It’s important for people to determine up front which category the article falls into.

Q: What’s your best advice to help people process the information they see on social media?

A: First, take a breath. If you see something that sparks a sense of outrage or an emotional response, don’t automatically hit “share.” Disinformation is often crafted to trigger an emotional reaction. Slow down, check the source, and make sure what you are about to share to your network is accurate.

If you can’t tell if it’s accurate or not, my advice would be to not share.

For example, a credible post will include a link to the original source. A repost of a screenshot with no identifiable source or date is a good indicator that the information may not be reliable.

The attributed source should also have the verified expertise or insider knowledge to make the assertion being presented.

Second, understand the incentive structures at play. The social media companies are businesses that make more money by increasing user engagement – more likes and shares and other clicks. False information spreads more rapidly than true information, in part because the people who crafted that misleading meme are aiming to trigger an emotional response. So while social media companies are making some effort to stem the spread of disinformation on their sites, their incentives are partly aligned with those of the purveyors of disinformation: both benefit when content drives engagement.

Another thing to understand is how much of a disadvantage we, as social media users, are at compared with the people trying to sow discord by spreading disinformation. They are using these powerful tools to spread their false information widely and make it seem credible, while all we have for defense is our minds.

That’s why there needs to be a national effort to build up our citizens’ defenses against these efforts – on topics like media literacy education and critical analysis skills. Disinformation is a real national security threat, and we all need to do our part to recognize it when it comes across our social media feeds and avoid spreading it further.

Top photo courtesy of

Terry Grant

Media Relations Officer, Media Relations and Strategic Communications