Military veteran and mom pursues her passion for African American history


April 24, 2023

Editor's note: This story is part of a series of profiles of notable spring 2023 graduates.

After serving for 21 years in the United States Army, raising children and sending them off to college, Tashieka Russell felt it was important to finish the education she had started before joining the military.

Russell, who lives in Seattle, was inspired by her life experiences to pursue a degree in African and African American studies through ASU Online.

“My mother and grandmother were active in the civil rights movement. I grew up hearing stories of how my grandmother worked with Martin Luther King Jr., the march on Washington and their civil rights work in Pittsburgh,” she said. “Then in the military, I did a lot of work on the history of minorities and cultural heritage.”

She did not let being an online student deter her from experiencing all the opportunities and resources ASU offers. Russell studied abroad in Amsterdam, where she learned more about the global slave trade.

As the Dean’s Medalist for the School of Social Transformation, Russell is grateful to ASU for allowing her to continue and finish her education and for providing unforgettable experiences.

Question: What’s something you learned at ASU — in the classroom or otherwise — that surprised you or changed your perspective?

Answer:  Thanks to my academic advisor, I found out I could go to a community college in my area in person to get extra help in my math classes. I am not a math person and ... that was a huge help.

Q:  Why did you choose ASU?

A:  I knew I had to take online courses because being a wife and mother was my top priority. When I did my research, I was looking for a program that would allow me to pursue my education and not interfere with my life outside of school. ASU has a top reputation for online programs, and it was a good fit for the path I wanted to take.

Q:  Which professor taught you the most important lesson while at ASU?

A:  A lot of names stick out to me. A few of them challenged me to step out of my comfort zone when it came to learning. Professor Aribidesi Usman comes to mind; he was a hard teacher and expected a lot from us, but he challenged us to put in our best effort.

Q:  What’s the best piece of advice you’d give to those still in school?

A:  It’s not always easy, but I would tell them to establish relationships with classmates, professors, academic advisors, whoever. It’s part of your success; no one can do everything alone.

Q:  What are your plans after graduation?

A:  I will continue my education and pursue my master’s degree in museum studies and research contributions of African Americans within the U.S. military. My dream after that would be to work at the Smithsonian and share my love for history and African American culture with anyone who wants to learn.

Q:  If someone gave you $40 million to solve one problem on our planet, what would you tackle?

A:  I would invest in our youth — I mean every aspect of their lives by ensuring they have food, shelter and physical and mental health services. If I could put money into positive outcomes for our youth, I would do that.

Marketing and Communications Coordinator, The College of Liberal Arts and Sciences

ASU graduate aims to demystify AI through children's book


April 24, 2023

Editor's note: This story is part of a series of profiles of notable spring 2023 graduates.

How do we understand the story of artificial intelligence?

Kacy Hatfield is a student in the Herberger Institute for Design and the Arts who aims to make the story of AI accessible to all – and is doing so by writing a children’s book. Kacy is graduating this May with a degree in digital culture, and is an undergraduate researcher for the Lincoln Center for Applied Ethics.

After graduation, she will pursue a master's degree and has been invited to participate in the Machine Intelligence Group at Draper Labs.

She shared more about her college journey below. 

Question: Tell us a bit about your experience at ASU and how you came to study digital culture.

Answer: I actually came to ASU as a biochemistry major; I love chemistry and math, but the career path wasn't exactly what I wanted. I then explored other career and creative opportunities, found digital culture, and within just three days I was hooked and made the switch. There is still so much of what I love in studying AI, and I get to integrate my love for chemistry and math into it. 

I actually hadn't even heard of machine learning until spring 2021, and after my professor introduced it to us, I asked her for book recommendations. From then on, I was obsessed with AI.

Q: What inspired you to pursue undergraduate research? 

A: Well, I actually did my honors thesis shortly after I learned about AI and machine learning. I decided that I wanted to pursue it, even though I really didn’t know much about the subject, and I pitched it to several professors I wanted to work with, who all were very supportive. I defended my thesis almost exactly a year after I had first learned about machine learning, and I just had such an amazing time working on my thesis that I wanted to continue doing research. 

I then found the Lincoln Center for Applied Ethics, which had an undergraduate research opportunity on responsible AI. I met with Research Program Manager Erica O’Neil over Zoom, and I thought it would be the perfect continuation of my work. It’s amazing to keep doing research on this, not just to learn but to ultimately come away with more questions. 

Q: You're working on a really fascinating project, in which you’re developing a children’s book on AI. Can you share more on this project? 

A: The premise is an illustrated children’s book that tells the story of an algorithm named Pip — like the command in the Python programming language — and Pip has to classify seashells on the beach. How Pip classifies them starts out in very simple terms, and as waves wash up on the shore, more advanced terminology is revealed. There’s also a character named Epoch — a term from machine learning — as well as a character that represents the human in the loop. All of them are placed very strategically to represent what would take place if a machine learning algorithm were to be integrated in this setting. 

The goal is to help people feel less scared of machine learning. I often see AI described as a black box: something people can't see into and can't understand. But I think the test of a good machine learning algorithm, and of a good programmer, is to translate that black box into something that is easily understood.

Part of the reason I love machine learning is that even if I dedicate my entire life to studying AI, I will never have a fully comprehensive grasp of it, because it's always expanding and advancing so fast. I think that's key to why people feel uncertainty about machine learning, especially when the Hollywood narrative of AI is the humanoid robot that is going to take over. The thing is, these technologies are amoral, not immoral. 

My goal as a researcher is to start mitigating skepticism around the subject of machine learning through this book. And this starts with younger people, but the book is also meant to be used by people of all ages. 

Q: How has your time in the responsible AI research group related back to your work? 

A: I love being in this research group. It's actually my second semester; last semester I did a project on the risks and mitigations of AI-powered autonomous spacecraft, which is another one of my interests. It’s so awesome to be part of a group of people who have such different backgrounds and different approaches to AI. There are so many interdisciplinary perspectives and topics brought up in discussion. 

I think that in terms of responsible AI – and a lot of people may disagree with me on this – it is integral for a programmer to also be able to see the ethical implications of whatever they're deploying into the world. There’s often the argument that we should wait five years before evaluating those possible impacts; when I am working on programming, I'm immediately thinking about how it may affect the real world and be used. 

Machine learning is like a mirror: it's going to reflect whatever we give it, and humans are not perfect. This is why I think the conversations on ethics have to go hand in hand with the research itself, and it's really interesting to see how it comes about on all different fronts.

Q: What comes next for you in your career and future?

A: That’s the age-old question, isn’t it? I always have a list of problems that I can research! This may be a nerdy confession, but I love doing research even in my free time. I hope to direct that energy into the pursuit of a master's degree and possibly even a PhD. I have also been invited to join the Machine Intelligence Group at Draper Labs in summer 2023 as an undergraduate engineer, which is a very exciting opportunity. 

The amazing thing about this field is it's always changing, and in some regard, I will always feel like a student. And because the study of AI is so new, I feel like building the ethical and programming approaches together from the start will be a lot easier than retrofitting ethics onto something that's already established. I hope to keep these skills as best practices in the future. 

There is a lot of skepticism around AI and machine learning, and often I hear people say that it’s too complicated or complex. Everybody has the capability to understand AI, and it's not as scary as it seems. Even though it's been tremendously skewed for entertainment, which makes it easier to vilify, there are so many benefits to using machine learning, and we can employ it in the right ways to augment our human experience and not hinder it.

Karina Fitzgerald

Communications program coordinator, Lincoln Center for Applied Ethics

602-543-1225