
Professor: How TikTok can affect marginalized communities

March 8, 2022

Panel series discussed the platform experience for Black, disabled, fat and queer creators

If you jumped on TikTok Monday morning, you might have seen a video of a toddler talking to herself in front of a mirror, a rooster wearing blue pants or a three-minute makeup tutorial: fun, harmless stuff from the social media site that boasts more than 800 million active users, including 60 million in the United States alone.

But in a panel series held over the last two weekends, the focus was not on silly 30-second videos but on the experience of marginalized creators on TikTok.

The Online Creators Association (TOCA), a labor advocacy group for TikTok content creators, organized the discussions about the experiences of Black, disabled, fat and queer creators.

The series was co-hosted by Sarah Florini, an associate professor of film and media studies in the Department of English at Arizona State University, and Elizabeth Grumbach, director of digital humanities and research at the Lincoln Center for Applied Ethics.

ASU News talked to Florini about the panels, how TikTok treats marginalized communities, and the TikTok research project she hopes to begin later this year in conjunction with TOCA.

Question: What was the overall purpose of having these panels?

Answer: A couple of years back, leaked documents revealed that TikTok was deliberately suppressing content from creators who were fat, visibly disabled or visibly queer, capping the number of views those creators could get. TikTok framed it as an anti-bullying measure: if people can’t see you, they can’t bully you, which attacks the wrong end of the problem. The panelists talked about their experiences on the platform over the past years: how it’s algorithmically curated, how content moderation works, the harassment they face, but also the wonderful communities they’ve been able to create.

Q: What did you learn from the discussions?

A: What we heard from all four of the panels was that content moderation meant to deal with harassment and bullying tends to disproportionately impact people from historically marginalized groups. So we see, again and again, that a Black person will be called a racial slur, and then the video they make responding to the slur, where they say the thing they were called, gets taken down for a community standards violation, but the original video calling them that doesn't. We're also seeing mass reporting campaigns against all of these marginalized groups. Let’s say you’re a disabled creator. A bunch of people report your video and say it's offensive, and TikTok will take it down. You can sometimes appeal to have it put back up, arguing it isn’t an actual violation of the guidelines, but even if it goes back up, it still counts as you having had something taken down. And TikTok has a rule: if you have enough videos taken down, your account gets banned.

Q: How does that mass reporting play out?

A: Let’s say I don't like you, and you make a video. I get together with a bunch of my friends and we all, from our own TikTok accounts, flag the video to tell TikTok it violates community guidelines. If we coordinate and everybody flags your video, TikTok might take it down. So you can see how the system is ripe for exploitation. And since having a video taken down counts against you, regardless of whether it should have been taken down, if my friends and I decide we really, really don't like you, we’re going to keep doing this. We can eventually get you to lose your account. And then the account you've spent a couple of years building up, with all those followers, you have to start over.
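The strike dynamics Florini describes can be made concrete with a small sketch. TikTok's real moderation pipeline and thresholds are not public, so the numbers and names below (FLAGS_TO_REMOVE, STRIKES_TO_BAN) are illustrative assumptions, not the platform's actual rules:

```python
# Toy model of coordinated mass reporting plus strike-based bans.
# All thresholds here are hypothetical, invented for illustration.

FLAGS_TO_REMOVE = 10  # hypothetical: reports needed to trigger a takedown
STRIKES_TO_BAN = 3    # hypothetical: takedowns before an account is banned

class Account:
    def __init__(self, name):
        self.name = name
        self.strikes = 0
        self.banned = False

    def receive_reports(self, n_reports, appeal_succeeds=False):
        """A coordinated group mass-reports one of this account's videos."""
        video_up = True
        if n_reports >= FLAGS_TO_REMOVE:
            video_up = False       # the video is taken down
            self.strikes += 1      # and the takedown counts as a strike
            if appeal_succeeds:
                video_up = True    # a successful appeal restores the video,
                                   # but (per the panelists) the strike stays
            if self.strikes >= STRIKES_TO_BAN:
                self.banned = True
        return video_up

creator = Account("targeted_creator")
for _ in range(3):  # three coordinated campaigns of a dozen accounts each
    creator.receive_reports(n_reports=12, appeal_succeeds=True)
print(creator.banned)  # True: every appeal succeeded, the account is gone anyway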

Q: Does that mass reporting on TikTok differ from other social media platforms?

A: It’s much harder to have content taken down on Facebook, Twitter or YouTube. I don’t necessarily object to having content taken down. It's just the uneven application, because there is a lot of disinformation, white supremacy... a lot of really, really misogynistic content that seems to stay up. So, like I said, it's a very uneven application, and the guidelines are very vague, and there seems to not be a lot of consistency. Because TikTok has been so aggressive about taking videos down, it becomes a really wonderful weapon to silence folks that you don't like.

Q: Have marginalized groups complained to TikTok, and what has the company’s response been?

A: Basically, TikTok has not been a lot of help. One of the reasons we wanted to have these panels is that TikTok has tried to organize similar panels, but people who have a lot of critical things to say about TikTok tend not to get on those panels, or don't get invited back. So TikTok will often recognize that there are problems, try to adjust the policies, say they're working on the problems, but the problems don't seem to get any better.

Anecdotally, we heard about how TikTok will just not respond to them, or how they just can't get through to anybody in the company to help them. One of the people on the panel was a female ocean scientist. That’s what her account is about. And she has been relentlessly targeted by people. There are videos telling her that she should commit suicide, saying she should take a toaster bath, all of these kinds of things. And she can't get them taken down. But when she posts a video responding to any of it, her videos get taken down. She said several people who emailed TikTok on her behalf about the situation got responses, but she didn't get any response. So they would respond to people asking about her situation, but not to her.

I don't know what's happening on the inside with TikTok, but creators have found TikTok to be responsive only in a macro (public relations) way: "Yes, we really want to do better, and we’re going to talk to these creators and see how we can do better." And I believe there are people in the company who really want that, but when it comes to how it actually plays out every day, they are not very responsive.

Q: You’re awaiting a grant from the National Science Foundation to begin your research project: developing a toolkit to better understand TikTok’s algorithm. Before we delve into the project itself, how does the algorithm work?

A: TikTok has an algorithm that selects content to show to people. When you open the app, you have what's called your “for you” page, your FYP, and those are videos that TikTok has decided you would like. It starts from generic signals, like a user's age and location, and then refines its picks based on how you engage with what it shows you. Compared to all of the other social media platforms, the algorithm is remarkably sensitive and effective at figuring out what you like and what you want.
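In rough terms, that pattern, cold-starting on coarse signals and then weighting in engagement, might look like the toy ranker below. This is not TikTok's actual algorithm, which is proprietary; every feature name and weight here is invented for the sketch:

```python
# Generic two-signal feed ranker: coarse demographics plus learned
# topic affinity. Purely illustrative; not TikTok's real system.

def score(video, user):
    """Higher score = more likely to appear on the user's FYP."""
    s = 0.0
    if video["region"] == user["region"]:
        s += 1.0  # coarse cold-start signal: location match
    # Learned signal: topics the user has engaged with in the past.
    s += 2.0 * user["topic_affinity"].get(video["topic"], 0.0)
    return s

user = {"region": "US", "topic_affinity": {"makeup": 0.9, "roosters": 0.1}}
candidates = [
    {"id": 1, "region": "US", "topic": "roosters"},
    {"id": 2, "region": "US", "topic": "makeup"},
]
fyp = sorted(candidates, key=lambda v: score(v, user), reverse=True)
print([v["id"] for v in fyp])  # [2, 1]: the makeup video is served first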

Q: In what ways will the toolkit help users?

A: So much of the content we get, media, news, as well as our social interactions, whose posts we’re seeing, who had a baby, whatever, is filtered by these algorithms. They’re deciding who they think you are and what they think you want to see, and giving you more of that. And particularly with the ongoing disinformation crisis we're having, it is really important for people to begin to understand how that filtering works. So having a toolkit that helps you figure out how to research what the algorithm is doing gives you a better sense of how it's working. How is it selecting things for me? How is that different over time? Because these companies will change and shift how that works. Particularly if you do it over a period of time, you can start to see the changes in what you're getting served. And for content creators, particularly people from marginalized communities who rely on social media as a mode of civic engagement and advocacy, being able to investigate the algorithm and how it's shifting over time will be really useful for getting their content seen.
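As a sketch of what such a self-audit could look like, the snippet below assumes a hypothetical log that a user keeps of what their "for you" page serves each day, tagged by topic. It is not the NSF toolkit itself, which has not been built yet; it only shows the kind of over-time comparison the toolkit is meant to support:

```python
# Compare snapshots of what the feed serves on different days to
# surface shifts in the algorithm. The log format is hypothetical.

from collections import Counter
from datetime import date

fyp_log = [
    {"day": date(2022, 3, 1), "topic": "makeup"},
    {"day": date(2022, 3, 1), "topic": "science"},
    {"day": date(2022, 4, 1), "topic": "makeup"},
    {"day": date(2022, 4, 1), "topic": "makeup"},
]

def topic_share(log, day):
    """Fraction of served videos per topic on a given day."""
    counts = Counter(entry["topic"] for entry in log if entry["day"] == day)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

# A topic quietly losing reach between March and April shows up
# as a shrinking share across snapshots.
print(topic_share(fyp_log, date(2022, 3, 1)))  # {'makeup': 0.5, 'science': 0.5}
print(topic_share(fyp_log, date(2022, 4, 1)))  # {'makeup': 1.0}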

Q: What’s the timeline on the project?

A: It’s a three-year project. We’re going to collaborate with TOCA to design the toolkit, and then we're going to have them test it out, take feedback, revise, etc. And then in the last six to eight months, we have a pretty extensive communication plan that uses traditional academic channels but also social media, particularly TikTok, to push out not only the toolkits but any findings that we have along the way.

Top photo: Courtesy cottonbro/Pexels

Scott Bordow

Reporter, ASU News


Study: Too much screen time harmful on molecular level

March 8, 2022

ASU study first to look at gut health, screen time connection

If you don’t work with your hands, chances are you stare at a screen most of the day. Then at the end of the day, you go home and stare at a bigger screen to relax. In between the two, you probably steal glances at a tiny screen.

Obesity, sleep problems, chronic neck and back problems, depression and anxiety have all been associated with spending too much time staring at screens.

Now, for the first time, a study shows that too much screen time is harmful on a molecular level.

“It's a really challenging problem because so much of our society is dependent on these devices with screens,” said Corrie Whisner, lead author of the study and an associate professor in Arizona State University’s College of Health Solutions.

Several studies have tried to correlate screen time and mental health, but this is the first study to examine the effects of high screen time at the molecular level.

The study found high screen time is “significantly” associated with Type I diabetes, obesity, chronic fatigue syndrome and various manifestations of inflammatory bowel disease.

The World Health Organization and the American Academy of Pediatrics recommend no more than one hour of screen time per day for children 5 and under. There is no consensus on a safe amount of screen time for adults, but research linking digital technology use to well-being has found harmful effects in people who engage in more than two hours of daily screen time.

“Screen time really matters for internal metabolic processes that we often don't think about,” Whisner said. “I think most of human nature is really driven towards instant gratification and what we do. And we don't think about how just everyday life really adds up and has these cumulative effects on health. And so knowing that microbes respond to our behavior and what we do matters for them as much as what they do matters for us, that we can think that, OK, well, if I don't care about screen time for myself, maybe I could think about what it's actually doing for all these tiny little partners that share a body space with me.”

The study examined 60 college students, collecting data on nutrition intake, screen time and physical activity. Researchers collected fecal samples to gather biological data.

With screen use rising in all age groups, there’s more public and scientific interest in the effects.

“This work is the first investigation of the effects of screen time at the molecular level,” said co-author Paniz Jasbi, a doctoral candidate in exercise and nutritional sciences at ASU. “Our results indicate that individuals with more than 75 minutes of daily screen time had microbiome and metabolome profiles consistent with obesity, Type I diabetes, myocardial infarction, chronic fatigue syndrome and a host of digestive disorders."

Top image: Courtesy FrankundFrei/Pixabay

Scott Seckel

Reporter, ASU News