Dating apps can do more to fight racial bias and discrimination: report
Researchers who looked at apps including OKCupid, Grindr, Tinder and Coffee Meets Bagel call for dating and hookup platforms to discourage discrimination by tweaking algorithms to make race a less important factor
Nikki Chapman remembers receiving the first message from her now-husband Kay through online dating website Plenty of Fish in 2008.
“I looked at his profile and thought he was really cute,” she says. “He asked me who my favourite Power Ranger was, and that is what made me respond to him. I thought that was kind of cool – it was something that was near and dear to me from when I was a kid.”
The couple from Posen in the US state of Illinois now have two children of their own: son Liam is seven and daughter Abie is one and a half.
Chapman recalls the dating site asking about race, which she does not think should matter when it comes to compatibility. It did not for her; she is white, and Kay is African-American.
“Somebody has to be open-minded in order to accept somebody into their lives, and unfortunately not everybody is,” she says.
Researchers at Cornell University in New York looked to decode dating app bias in a recent paper titled “Debiasing desire: addressing bias and discrimination on intimate platforms”.
In it, they argue dating apps that let users filter their searches by race – or rely on algorithms that pair up people of the same race – reinforce racial divisions and biases. They say existing algorithms can be tweaked in a way that makes race a less important factor and helps users branch out from what they typically look for.
Jessie Taft, a research coordinator at graduate school Cornell Tech and one of the paper’s authors, explains that there is a lot of evidence that says people do not actually know what they want as much as they think they do.
“Intimate preferences are really dynamic, and they can be changed by all types of factors, including how people are presented to you on a dating site,” he says. “There’s a lot of potential there for more imagination, introducing more serendipity and designing these platforms in a way that encourages exploration rather than just sort of encouraging people to do what they would normally already do.”
For the paper, Taft and his team downloaded the 25 most popular dating apps (based on the number of iOS installs as of 2017) including OKCupid, Grindr, Tinder and Coffee Meets Bagel. They looked at each app's terms of service, sorting and filtering features, and matching algorithms to see how design and functionality decisions could affect bias against people of marginalised groups.
They found that matching algorithms are often programmed in ways that define a “good match” based on previous “good matches”. In other words, if a user had several good Caucasian matches in the past, the algorithm is more likely to suggest Caucasian people as “good matches” in the future.
Algorithms also often take data from past users to make decisions about future users – in a sense, making the same decision over and over again. Taft argues that this is harmful because it entrenches those norms. If past users made discriminatory decisions, the algorithm will continue on the same biased trajectory.
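The feedback loop the researchers describe can be illustrated with a minimal sketch. This is a hypothetical toy model, not the actual code of any app mentioned: candidates from a group that was "liked" more often in the past are scored higher, so they are shown more, and so they accumulate still more likes.

```python
# Hypothetical sketch of a self-reinforcing match-ranking loop.
# Not any real dating app's algorithm: it simply illustrates how
# scoring candidates by past "good matches" entrenches old patterns.
from collections import Counter

def rank_candidates(candidates, past_likes):
    """Score each candidate by how often their group appears among past likes."""
    group_counts = Counter(liked["group"] for liked in past_likes)
    total = sum(group_counts.values()) or 1
    return sorted(
        candidates,
        key=lambda c: group_counts[c["group"]] / total,
        reverse=True,
    )

# Past likes skew 3:1 toward group A ...
history = [{"group": "A"}] * 3 + [{"group": "B"}]
pool = [{"name": "p1", "group": "B"}, {"name": "p2", "group": "A"}]
ranked = rank_candidates(pool, history)

# ... so the group-A candidate is ranked first, which makes group A
# more visible and even more likely to be liked next round.
print([c["name"] for c in ranked])  # ['p2', 'p1']
```

Each round of this loop feeds its own output back in as training data, which is exactly the "making the same decision over and over again" dynamic Taft describes; a debiasing tweak would adjust the scoring so under-shown groups still surface.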
“When somebody gets to filter out a whole class of people because they happen to check the box that says [they’re] some race, that completely eliminates that you even see them as potential matches. You just see them as a hindrance to be filtered out, and we want to make sure that everybody gets seen as a person rather than as an obstacle,” Taft says.
“There’s more design-theory research that says we can use design to have pro-social outcomes that make people’s lives better rather than just sort of letting the status quo stand as it is.”
Other data shows that racial disparities exist in online dating. A 2014 study by OKCupid found that black women received the fewest messages of all of its users. According to the site’s co-founder Christian Rudder, Asian men had a similar experience.
Taft says that when users raise these issues to dating platforms, the platforms often respond by saying it is simply what users want.
“When what most users want is to dehumanise a small group of users, then the answer to that issue is not to rely on what most users want,” he says. “Listen to that small group of individuals who are being discriminated against, and try to think of a way to help them use the platform in a way that ensures that they get equal access to all of the benefits that intimate life entails. We want them to be treated equitably, and often the way to do that is not just to do what everybody thinks is most convenient.”
He says dating sites and apps are making progress. Some have revamped their community guidelines to explicitly state that their site is a discrimination-free zone, and users who send hateful messages are subsequently banned. Others are keeping the race/ethnicity filter but also adding new categories by which to sort.
Taft hopes the people making design decisions will read his team’s paper and at least keep the conversation going.
“There are a lot of options out there,” Nikki Chapman says. “I remember filling out on an app, ‘What hair colour are you interested in? What income level? What level of education?’ If you’re going to be that specific, then you need to go build a doll or something because life and love doesn’t work like that.”