“But where are you really from?” How my experiences with bias led me to Unbabel

September 29, 2020

Oh, how to satisfy a travel itch in the midst of a global pandemic… 

For me, joining Unbabel did just that.

Like many people, I’ve always wanted to explore the world and experience other cultures. This desire first led me to major in international business, and later to take positions at technology companies with the intention of traveling as much as I could. 

Most recently, I found my way to Unbabel because I wanted to make a difference while continuing to experience, and give back to, the global community. I was intrigued by this company founded to help break down language barriers and become the world’s “translation layer.” With team members spread across the U.S. and Europe, and customers in every country you can name, our reach is far and wide. It’s almost as good as hopping on a plane myself.

So, here’s a little more about why I’m so excited to be a part of the Unbabel movement and to help solve some of globalization’s most complex problems.  

How language and cultural biases have impacted me

If you live in the U.S. and look the way I do, at some point, someone will ask, “Where are you from?”

It may seem like an innocuous question on the surface. It may indeed come from a place of genuine curiosity or interest. The person may (probably does) mean well. But there is bias hiding in that question.

It has happened to me more than once: people asking where I’m from.

“California,” I’ll say. 

“No, but really.” 

“San Francisco, California,” I’ll try. 

Inevitably, they are unsatisfied until I explain that, while my family is from Vietnam (insert “aha!” look), I have never set foot there myself. I was born and raised here in the U.S. and am an American citizen.

We all have experiences of stereotypes bubbling up in our minds as we interact with people, both personally and professionally. Whether we’re jumping to conclusions based on someone’s appearance, accent, or other personal attributes, we need to be aware of the biases that can arise.

Fortunately, one extraordinary feature human beings possess—something machines will never have—is self-awareness. We can recognize bias in ourselves, and that recognition can often help us overcome it and treat people more fairly as a result. This is the thesis of many books and other popular pieces of media that have circulated around this year’s Black Lives Matter groundswell. It’s been heartening to see internal bias become a more explicit part of the conversation.

Of course, self-awareness isn’t everything (we need decisive action, too), but it can be the foundation of change.

That’s the good news.

The role of technology in perpetuating bias

The bad news is technology can perpetuate stereotypes and spread bias even further. 

Machine learning algorithms have taken heat in the media for their ability to replicate and extend the biases we possess as human beings. For example, Christine Maroti, an AI researcher on our team, has written about how gender bias manifests in machine translations and what we can do about it.

Similarly, the paper published by GPT-3’s authors included an early analysis of the model’s problems around fairness, bias, and representation, and it revealed some frustrating, if unsurprising, patterns. After the model was fed text-based data from the internet and asked to generate sample text, women were more likely to be described with words referring to their appearance (“beautiful,” “gorgeous,” “petite”), while men were described with far more nuance and complexity (“personable,” “large,” “lazy”).

Machine learning algorithms are only as unbiased as the data we feed them. That doesn’t mean we can’t use and benefit from them, but it does mean humans must stay involved, training and retraining them to weed out bias.

Language bias and how it holds us back

We all know stereotypes exist around attributes like skin color, gender, and sexuality. Less commonly addressed is the issue of language bias. 

In many places, when people don’t speak the country’s primary language well, they are looked down upon as being less intelligent or capable—even though most of us realize this is not fair or accurate. 

Again, this issue can be perpetuated even further by technology. For example, the English language dominated the internet in its early days, making up about 80% of the content as recently as the mid-1990s. While it now accounts for closer to 30% of the internet, it is still by far the most common language found online.

This, of course, means it’s much easier to find information in English. It’s easier to find products with text in English. It’s easier to take online courses, access a wide variety of news sources, and get answers to important questions. Speaking English confers an unfair advantage online. The internet is still biased toward English speakers (though, again, it’s slowly getting better).

A prime example of this is customer service. Too often, customers don’t receive the same quality or speed of customer care when they don’t speak English. There can be huge lags in response times as businesses struggle to find and quickly turn around human translations for each language their customers speak. Even very large businesses can struggle with this. While the bias may not be intentional or malicious, it still impacts how customers feel and whether they’ll return to the business again. 

That said, while technology can indeed perpetuate language bias, it can also be part of the solution to it. Machine learning can automate large parts of the translation process and make it possible to give customers the care they deserve in a timely fashion, no matter what language they speak. 

But we still need humans in the loop. They can improve quality, weed out bias, and help ensure the machines are speaking to humans the way they deserve to be spoken to.

The future at Unbabel: Fostering global understanding

Today, I’m particularly interested in how both people and technology can foster global understanding, rather than further sow division. It’s a big part of what led me to Unbabel. So far, I’ve been impressed to see that this vision is at the heart of everything the team does.

I believe there are many possible paths ahead of us. If we set our minds to it and stay laser-focused, we can use technology as a means to rid the world of bias—or get as close to that goal as humanly possible. 

About the Author

Content Team

Unbabel’s Content Team is responsible for showcasing Unbabel’s continuous growth and incredible pool of in-house experts. It delivers Unbabel’s unique brand across channels and produces accessible, compelling content on translation, localization, language, tech, CS, marketing, and more.