Grindr is partnering with Spectrum Labs, tapping the startup’s AI-based system to help filter postings on the LGBTQ dating service.
Between the lines: For years, Grindr has chosen not to implement an AI system for content moderation, not because it didn’t want to augment its keyword-based filtering system, but because it was concerned that the models weren’t sensitive enough to keep users safe without introducing other types of bias.
“Content moderation via machine learning is tricky, controversial and not always good,” Grindr spokesman Patrick Lenihan told Axios.
How it works: Rather than simply police content for certain words or phrases, Spectrum’s contextual AI service works to solve specific issues, such as identifying the sale of drugs and sex as well as trying to detect underage users.
Why it matters: While Grindr had understandable reasons for waiting to find a suitable AI system, not using one meant the company was heavily reliant on user reports. In addition to being reactive rather than proactive, the approach is also vulnerable to abuse.
The big picture: Dating apps have become the key method for matchmaking, but the rise in popularity has also made them a hotbed for harassment, illegal activity and scams.