Dating apps have accountability and transparency problems.


As Valentine’s Day approaches, couples across the country are preparing for this long-standing tradition, and there’s a very good chance they met through online dating. But while dating apps can help people find a partner (or just a fun date), they can also subject users to serious hate and harassment. Although dating apps have accrued significant reach and influence, these companies provide very little transparency about how they keep users safe and how they moderate content. Much of the conversation around online platform accountability focuses on companies like Facebook and Google. But dating apps face many of the same issues.

The online dating sphere has changed radically since Match.com, founded in 1995, transformed the dating landscape by moving hundreds of thousands of meet cutes from cafés to chat boxes. Then came the “swipe right” apps. Grindr launched in 2009, followed by Tinder, Bumble, and many other apps that are now household names. As these apps grew in popularity, so did services catering to specific ethnic, racial, and religious communities, and even to particular interests. Looking for South Asian partners? Dil Mil is there for you. Want to find a partner whose résumé matches your expectations? There’s even an app for that.

But these apps can also put users in harm’s way. Stories of hate, harassment, sexual assault, and downright weird encounters, both on apps and during app-facilitated dates, have gained notoriety. Many of these anecdotes underscore the opaque process users have to go through when reporting an offending account, and platforms’ lackluster responses to those reports. Yet dating apps provide very little transparency and accountability around how they handle safety and content issues on their services. For example, very few of these platforms currently publish clear and easily accessible copies of their community guidelines. These policies are essential for understanding what kinds of content are permitted on a service.

Some dating apps have made some progress in this regard. Tinder’s high-level guidelines prohibit nudity and sexual content and hate speech, and bar users under 18 from the app, among other things. Coffee Meets Bagel’s guidelines include specific restrictions on photographs involving guns, drugs, and other things.

Others fall short. Some major services (here’s looking at you, Hinge) have nested their community guidelines in their Terms of Service, which are full of legal jargon and not accessible to the average user. This lack of clarity around content policies is especially visible the more niche the dating app is. A simple search for Dil Mil’s community guidelines leads you to a 15-page terms of use .docx file. Muslim Mingle’s high-level guidance on prohibited content is nested under the company’s privacy policy. The only clear exception to this is Grindr. The pattern likely reflects resources: the smaller a platform is, the more resource-constrained it is. However, providing at least a basic level of transparency and accountability around content policies should be a priority for all platforms, regardless of size. Without this information, a user in harm’s way has no point of reference to understand whether the harmful behavior is permitted, and a user who has been flagged has nowhere to turn to confirm that they are actually in the wrong. In addition, without these policies, it’s difficult to hold a platform accountable for keeping its users safe. Employees at Bumble have noted that although the company claims its policies make the platform less misogynistic, it has done little follow-up to map out if and how its enforcement has changed behavior.

Uber, another platform that brings people together in the offline world, publishes transparency reports outlining the volume and nature of safety incidents, such as sexual assaults, that occur during app-facilitated interactions in the offline world. (It’s the only tech platform that currently does something like this.) Social media companies also publish transparency reports that outline the scope and scale of their content policy enforcement efforts, including the removal of content that has been determined to contain hate speech, bullying and harassment, and graphic nudity. Despite calls for dating apps to follow suit, no major dating app publishes a transparency report.

Today, approximately one-third of young people in the United States say they use dating apps. These services are especially popular among certain marginalized groups, such as lesbian, gay, and bisexual adults. A failure to provide transparency and accountability around how a dating app moderates content can therefore create barriers to access and establish a system that fails to adequately protect the safety of its users. As research indicates, when vulnerable groups feel threatened online, they often begin to self-censor and may even leave an online platform altogether. Given that dating apps are prominent methods for building community and finding partners, friends, and even business connections, these companies must ensure that their services are welcoming and inclusive for all users.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.





