
A UW engineer explains how facial recognition tech erases trans people

After stints in law school and at Wikipedia, Ada Lovelace fellow Os Keyes is challenging tech companies to rethink the gender binary hardwired into their code.

This conversation has been lightly edited for clarity.

I mostly say I’m from London, but I’m actually from a tiny village called Betteshanger, which has 10 houses and a pub. And about 600 cows.

[My path to data science is] a really convoluted route. When I was about 14 or 15, I really wanted to be a physicist — specifically, I wanted to be a physics teacher because I had this physics teacher who was amazing, and I was like, "If I could grow up and turn into someone who could engage people like he does, then that would be awesome."

This all fell apart when I was 17 and I started my A levels, which are the British equivalent of APs. I enrolled in maths and physics and a few others. We got to the bit on hydrodynamics and my brain sort of turned inside out — I failed high school maths. I loathed the sciences with every fiber of my being. And I switched away from physics. I started off doing a politics degree, and then a law degree.

I was wandering around the library and I accidentally ended up in the law section. I ended up by Dewey decimal 720.1: English legal history. And here's all these biographies of lawyers and these histories of cases and all the rest. And I suddenly had this moment of like, “Huh. All of these people are talking about doing good and making an impact, and they're all quite clearly smart people who could've done a lot of things, and they all picked this. It can't be entirely bollocks.”

I did my law degree and found the experience of being a lawyer can be summarized by the comments of two judges — one fictional, and one real. The real is Rigby Swift, who was a British judge who said, "The job of the court is to dispense law, not justice." And then the fictional is from Dickens and it's "The law is an ass." So the job of the court is to be an ass. It sucked away my soul.

By sheer coincidence, at that point I’d been editing Wikipedia in my spare time for about 10 years, since I was 12 or 13. The thing that brought me to it was vandalizing it. But back then, there was a moment where I realized oh, you can edit. You can write things. And so I started just making small changes and all the rest.

Wikipedia's parent organization, the Wikimedia Foundation, was trying to deploy some software changes. They were like, "If only we had someone in the product process, in the engineering process, who knew how editors thought.” And somebody who I know was like, "I know an editor." And they hired me as a contractor, as the first community manager in engineering.

I taught myself to code in my spare time, until 2 in the morning, because I still worked in the U.K. at that point. I was working on a San Francisco schedule in the U.K. — your workday starts at 5 p.m. and ends at 2 a.m. It's great if you don't like friends, hobbies or ever seeing anyone you know. So I got into statistical programming and just basic stuff. From there, I ended up getting really into that.

I kept having this recurring thought [while at Wikipedia]: Hey, you don’t know what you’re doing after this. Maybe you should enroll in a Ph.D. And I'd have this thought every year for three years. And eventually I was just like, OK, they're not going to let you [into graduate school in the U.S.] because you have a bad law degree from a different country, and you don't even know how standardized tests work. The GRE [Graduate Record Exam] was the first standardized test of that form that I had ever taken.

I was already living in Seattle and I had just moved from Boston, so I applied to the University of Washington. UW was the only place that would actually offer funding that was worth a damn. I applied for the information school, which rejected me.

[I ended up in] human-centered design and engineering. It's kind of like a grab-bag department of people who are a bit too hands-on for the information school or a bit too social science for computer science, or a bit too computational for the applied psychology folk. It's really interesting. We've got people with doctorates in sociology, we've got designers, we've got old school neuropsych people who got into computers just before computers became a cool thing. The department produces some really interesting and weird work.

At the same time, I think only this past year have I run into students who are studying stuff anywhere near what I'm studying. Not just because there are five people in my entire field who study what I study, but also because everyone here is doing something different. Next door is an accessibility researcher who studies how design processes can include disabled people's narratives. This lab contains me, someone studying salmon fishing in Alaska, someone who is studying how scientists collaborate on water ecology projects and two people who are studying the logic of domains.

I'm studying how infrastructural systems consider and reshape our ideas of gender, and the resulting space they open up or close off, particularly for trans people. [My] current focus areas include the social implications of facial recognition, electronic medical systems and data science practices.

Facial recognition systems generally have a lot of problems when it comes to identities that aren’t static. So people whose appearances change over time, for example. People who develop facial recognition are aware of this in the context of aging. They talk about how the system doesn't work for really old or really young people, because apparently all wrinkles look the same and all chubby babies look the same.

The idea of transition, for example, just doesn’t factor in at all. Facial recognition developers seem entirely ignorant of the possibility that trans people exist and that, because we exist, who we are is not going to be a continuous, static, lifelong thing.

[The problem is facial recognition's] binary nature. It's also its physiological nature. And frankly, the technology doesn't just model gender, with consequences for people who don't fit the model well; it also shapes our notion of what gender is.

The Misgendering Machines [was about this and] was great to write, but people always ask, "So, what's the story of that paper?" Part one: nobody told me I wasn't expected to write a solo paper in the first year. And then part two: this is a technology that I ran into two years ago on Twitter. Someone from this field published a paper that used it, and it made me mad, so I tweeted about it. I was still mad, so I wrote a blog post about how the technology is bad, doesn't work, doesn't make sense.

And then two years later, I was still mad. So I decided to write a paper on how the technology is constructed and why it doesn't work. On the one hand, it's an amazing achievement to write a solo-authored, full-length paper as a first-year student. On the other hand, it was the world's most high-effort subtweet.

It's all based around this idea that gender is a thing that only has two categories, and that these two categories can be assessed physiologically. And it assumes that gender works the same way literally everywhere in the entire world, that all of the gendered signals and flags are exactly the same. On the one hand, it's horrific in the sense that you have something that produces obviously transphobic outcomes: generally speaking, when you think gender only has two categories and that gender is physiologically determined, you're probably not including trans people, nor can you ever, and that produces some hideous outcomes.

It's not just that it enforces this view of gender and then punishes people who deviate from it. It's also that the consequence of that is to shape our view of gender and what gender is. Values don't just go into systems — they also come out of them.  


Ph.D. student Os Keyes at the University of Washington in Seattle on May 12, 2019. Keyes is conducting research in the Data Ecologies Laboratory at the UW Department of Human-Centered Design and Engineering. (Photo by Dorothy Edwards/Crosscut)

It's the thing that I had to deal with when I came out as trans. People who were queer, who didn't fit in for a really long time, people who didn't have quite the right narrative — you know, I have [trans] friends who, as late as the ’90s and 2000s, just when the internet was starting to become a thing, would run into websites run by trans people that pushed this “true transsexual” narrative. It would say you're a trans woman if you have these 15 criteria. I have friends who were looking for help. They were teenagers who couldn't go anywhere else, and they ended up on the one website that talked about trans issues, read this, and were like, “Oh, I guess that's not me,” and ended up back in the closet for another 20 years. Not only the general perception of trans people, but also the trans perception of trans people, is often shaped by these narratives.

That's not to say that a lot of us don't fall out of [these perceptions] and things aren’t changing. But there's always that insecurity of not quite being real — like the trans man who doesn't want bottom surgery. Like nonbinary people, people who have dysphoria but wouldn’t describe it as disgust in their genitals so much as alienation from them, so they don’t “really count.” There’s this insecurity that plagues communities. It shapes our narratives and who can access care, and also who feels like they can’t. All because of a scale, an instrument to measure people on a yes-no basis: Do you fall into this category or do you not?

Imagine that we have this facial recognition system that detects gender and we start integrating it into advertising, which we already know communicates gendered messages. Integrating it into bathrooms in public life, so that people who don't present in a certain way can’t exist in public, meaning that they either change how they present to survive or vanish. Which means people can't see [trans people] about and realize that's a thing that they can be. [If we were to] integrate it into human analytics and all the rest, suddenly not only would trans people be rendered invisible in analytics of what's worth funding or resourcing, or who watches what content and who produces it, but so would any cis [people] who present non-normatively.

There's no such thing as an instrument that just measures. By definition, any instrument that measures and that is deployed into wider infrastructures of power and sorting and everything else also dictates who counts, what the categories are and what is associated with each category. That has a shaping effect. It's not just going to measure gender; there's no such thing as a thing that just measures gender.

When I say five or six people [are in this field with me], I mean trans people in my entire field whom I know. There’s a few of us doing this, but it's not a widespread thing. There are relatively few people in the sweet spot of studying how society works who also used to study how programming works.

Next year, there's going to be four people in this department studying technology from an identity perspective. That's going to be a really exciting moment. But when I started it was literally just me.

Critical approaches to technology and the study of gender and technology have been going on since the ’70s. The new thing, I think, is actually integrating social science, [which] has happened in the last 20 years with computing professionals. Last year, CHI published a paper in which they discovered that intersectionality existed as a concept, and it was introduced to the field so that they might use it. CHI, for reference, is the conference for computer scientists who are engaged with social issues, as opposed to the ones who just code. The only paper on race I've read in the field is a paper complaining that there's never any papers on race.

When we have something that looks like a reform, that’s not necessarily good. Let's say that we include trans people in these systems. It probably wouldn't change the structural relationship in any way. The developers are still the people with all the power. We've just handed them an extra tool to legitimize why their having all the power is not a problem and why this technology is not the problem.

And we know in practice that's not how it would work. One, because if they're still using this model, then really what they mean is trans people who pass — the trans people who are the most normal. Normality in this country has always been whiteness, and passing has always meant having access: having transitioned early, having had access to medical care, having the right combination of facial features so that people judge you just how you want to be judged.

I think coming from outside the field is a slight advantage because it means that I don't approach this as, “Hey, I'm a technologist. How do I make myself feel better?” I come from the polar opposite perspective of, like, prove to me the technology is right or technology is good, instead of prove to me that your concerns about technology are worth listening to.

I have more hope now than I did four months ago for a couple of reasons. The first is that if we can get this framework of actually being strategic, not just rolling over, then that means we can be more purposeful. We can keep making demands and we don't put out a press release for recognizing a change without recognizing at the same time that that isn't enough.

I also think it's a generational thing. Like when my Human-Computer Insurrection paper was presented last week, the room was standing room only. This is a paper on anarcho-communist computing at a CHI conference. When I went to add the tag that classifies it in the Association for Computing Machinery's equivalent of the Dewey decimal system, I found they've got this tag for political speech. They added it in 2012. This is the first paper that has ever used that tag in the entire ACM.

My Ada Lovelace Fellowship is three years long. The big advantage here is I get to focus on this work and also focus on teaching. I like having this opportunity to not only pursue the research that I want to do but also teach and engage in community-facing work.

Every time I teach a class, there's at least one trans person who wasn't out, and didn't know that they could be, or didn't know that they were real or who to talk to. I did a single guest lecture for an inclusive design class and someone reached out afterward. We got coffee, and we're getting coffee again next week. It’s doing stuff like that and providing some kind of hope and community for my community.

I'm showing that we do exist. It's not just you that feels this way. It’s really the best thing.

I want to teach a hell of a lot and bring more people in. And not only would I not publish solo-authored papers, I wouldn’t have to, because there'd be a whole crew of us working together and making community and making space and doing this work. And it will be better work for it.

I don't want to be an inspiration after I'm gone because I don't want to have to be an inspiration. I don't want people to have to point to someone on a pedestal who’s dead as evidence; I want people to be able to look around as proof. Hopefully one of these people is the person to do it, or one of their students, and I can be the person who gave hope to the person who gave hope to the person who gave hope to the person who was smart enough to get us out of this shit. And I would take that as a legacy any day. And I don't care if fixing it makes it not about me. Like, I'm a Pisces moon. I would in fact prefer if fewer things were about me.


About the Authors & Contributors

Manola Secaira

Manola Secaira is a reporter at Crosscut focused on environmental justice issues.