Cybercriminals use genAI to scam users in dating apps, sites

Generative AI refers to the use of AI to create new content such as text, images, music, audio and video, automating tasks to reduce human error and improve efficiency in the workplace. But while it can be a boon for businesses, others use it to inflict emotional harm. Cybercriminals are using AI to create fake people online, generating fake photos and audio and programming bots to hold realistic text conversations, in order to trick Australians on dating apps.

According to 9News, the bots speak to people who believe they are chatting with a real person, developing a connection over a long period. The scammers then try to lure the user into transferring money.

This type of scam has been dubbed “LoveGPT,” a reference to the genAI tool ChatGPT.

The report added that scammers are using voice cloning to impersonate people and convince victims they are real when they are not.

The report cited commentary from Avast cybersecurity expert Stephen Kho, who explained that these AI bots have been trained to develop a particular personality and come across as likeable.

Kho told the publication that once scammers know dating app users are in a vulnerable situation, they start “talking” to them and building a relationship.

The rise of AI has led governments across the globe to create legislation on how to regulate it.

Last June, the European Union created the world’s first comprehensive AI law. Australia is trying to catch up.

In September, Australia’s eSafety Commissioner approved an industry code that will see search engines regulated to ensure protection against the risks of AI.

In August, it also outlined steps the tech industry can take to protect children from the dangers posed by genAI, as it revealed a proliferation of AI-generated child sexual abuse material and deepfakes being reported to the agency.

The Australian Competition and Consumer Commission (ACCC) reported that Australians lost a record $3.1 billion to scams in 2022.

Users who have been scammed can report the incident to Scamwatch, which also offers useful advice on how to avoid online fraud.
