ACM Scholar Spotlights: Ștefania Tudor

Welcome to ACM Scholar Spotlights, a series where we shine a light on the inspiring paths of women in computing who received the 2024 ACM-W scholarship. Through their own words, we hear how they found their place in tech, what motivates them, and how they’re making an impact. These stories reflect not only technical achievement, but also courage, curiosity, and community.

In this edition, we feature Ștefania Tudor, a student from Romania whose early curiosity, deep-rooted cultural awareness, and growing expertise in ethical AI are paving the way for her journey in technology.


Introduction

Hello! My name is Ștefania Tudor, and I have just finished the second year of my bachelor’s degree in Informatics at Transilvania University of Brașov.

The Spark of Curiosity

Computer science has always had an impact on me, ever since I was little. When I was three years old, my dad brought home our very first computer. He was very enthusiastic to show me all its parts and capabilities, and I was instantly curious to test its limits.

The spark lit that day snowballed, over time, into a passion for computing, one that was only fueled by two very capable teachers I had throughout high school. These two women soon became my mentors and role models, standing proud atop a male-dominated field and inspiring us to push through and be visible in all that we do.

Growing Up in a Paradox

My home country, Romania, is one of many paradoxes. One of these paradoxes is the very thing that has pushed me throughout my journey so far: the gender equality discourse. Romanian women are resilient, loud, and proud, so, on paper, we are among the fortunate few who get nationwide representation in STEM. This is reflected in the fact that many of our best educators are women. However, sexism and gender inequality are day-to-day factors you have to take into account at every step of your career.

Lessons from Reality

Once my parents learned I had a passion for computing, every conversation on the topic started with, “You have to work at least twice as hard as any man,” and they were correct, but not in the way I expected. For any first place you win in a physics competition, you have to grit your teeth twice while handling the man on the street yelling for your attention—once on your way there and once on your way back. For any job position you get, you have to bite your tongue twice—once when male colleagues call you a bootlicker and once when your salary does not match that of the man you share the office with. For every day you go to work, or school, or anywhere, you have to hold your tears back twice—once because of the pressure you face and the standard you’re held to, and once because, despite it all, you made it.

womENcourage 2024

In this context, I have always been very aware of my position. When I chose to attend the 2024 edition of the womENcourage conference, I seized the opportunity to deepen my understanding of the position women currently occupy in technical fields. One of the workshops that caught my eye was “Responsible AI: Advancing Gender Equity through Ethical Technology.”

This workshop was meant to open our eyes to how AI is trained on biased and unethically sourced datasets—and it achieved that very thing. Not only were we presented with a very real problem, but we were also given very real scenarios to problem-solve around. This meant a team effort to find feasible solutions that would satisfy our ethical standards while keeping in mind the very source of the problem (and the message we left the workshop with): datasets, and therefore AI, are biased because the very world we live in conducts itself in such a manner. Thanks to the teamwork involved, it was fascinating to see how different cultures view this problem and how it affects each country differently.

The thing that stuck with me the most, and the point that made me want to talk about this experience in the first place, is that the woman who led this workshop—a very gifted speaker, may I add—had an approach very characteristic of how women deal with issues: by offering solutions. The final part of the workshop introduced us to tools already in development that are meant to sift through training datasets and minimize the bias they carry.

This solidified the impact of my previous experiences with AI, but it also left me hopeful that society is learning, alongside the technology, to reconsider its biases and work twice as hard to keep them from affecting the final product.