Newswise — ITHACA, N.Y. – Mobile dating apps that allow users to filter their searches by race – or rely on algorithms that pair up people of the same race – reinforce racial divisions and biases, according to a new paper by Cornell University researchers. As more and more relationships begin online, dating and hookup apps should discourage discrimination by offering users categories other than race and ethnicity to describe themselves, posting inclusive community messages, and writing algorithms that don’t discriminate, the authors said.

“Serendipity is lost when people are able to filter other people out,” said Jevan Hutson, lead author of “Debiasing Desire: Addressing Bias and Discrimination on Intimate Platforms,” co-written with Jessie G. Taft, a research coordinator at Cornell Tech, and Solon Barocas and Karen Levy, assistant professors of information science. “Dating platforms have the opportunity to disrupt particular social structures, but you lose those benefits when you have design features that allow you to remove people who are different than you.”

The paper, which the authors will present at the ACM Conference on Computer-Supported Cooperative Work and Social Computing on Nov. 6, shows how simple design decisions could decrease bias against people of all marginalized groups. Although partner preferences are extremely personal, the authors argue that culture shapes our preferences, and dating apps influence our decisions.

Fifteen percent of Americans report using dating sites, and some research estimates that a third of marriages – and 60 percent of same-sex relationships – started online. Tinder and Grindr have tens of millions of users, and Tinder says it has facilitated 20 billion connections since its launch.

Research shows racial inequities in online dating are widespread. For example, black men and women are 10 times more likely to message white people than white people are to message black people. Letting users search, sort and filter potential partners by race not only allows people to easily act on discriminatory preferences, but also stops them from connecting with partners they may not have realized they’d like.

Users who get messages from people of other races are more likely to engage in interracial exchanges than they would have otherwise. This suggests that designing platforms to make it easier for people of different races to meet could overcome biases, the authors said.

The Japan-based gay hookup app 9Monsters groups users into nine categories of fictional monsters, “which may help users look past other forms of difference, such as race, ethnicity and ability,” the paper says. Other apps use filters based on characteristics like political views, relationship history and education, rather than race.

In addition to rethinking the way searches are conducted, posting policies or messages that encourage a more inclusive environment, or explicitly prohibiting certain language, could decrease bias against users from any marginalized group, including disabled or transgender people. For example, Grindr published an article titled “14 Messages Trans People Want You to Stop Sending on Dating Apps” on its media site, and the gay dating app Hornet bars users from referring to race or racial preferences in their profiles.

“Given that these platforms are becoming increasingly aware of the impact they have on racial discrimination, we think it’s not a big stretch for them to take a more justice-oriented approach in their own design,” Taft said. “We’re trying to raise awareness that this is something designers, and people in general, should be thinking more about.”


-30-