An expert has warned that a single photo or a few seconds of a recording of a victim’s voice would be enough to generate fake material using AI technology. Photo: Shutterstock

Hong Kong scams using AI? Police issue warning as fake clip of bank CEO tricks victim, man flags blackmail attempt with doctored sex video

  • Japanese man reports being tricked by video showing ‘Hong Kong bank CEO’ describing new product
  • Police urge people to look for telltale signs in fake videos, especially in eye and mouth movements
Hong Kong police have received two reports of scammers using artificial intelligence (AI) to trick their victims, creating videos featuring cloned voices and doctored images.

A Japanese man reported losing HK$1,700 (US$217) worth of computer game credits in May after being fooled by a fake video interview apparently with the chief executive of a bank in Hong Kong.

A 25-year-old Hong Kong man made a police report in March after being threatened with a video in which his face was superimposed on the naked body of a stranger engaged in sex acts, even though he did not give in to the scammer’s demand to buy HK$10,000 worth of game credits.

The force said people should be on the alert for scammers using new ways to trick victims, while an expert who warned of a possible emerging trend noted that a single photo or mere seconds of a recording of a victim’s voice would be enough to generate fake material using AI technology.

Superintendent Chan Shun-ching, from the force’s cybersecurity and technology crime bureau, said police had yet to see scams involving true deepfakes, the high-quality synthetic videos that can show real people speaking or moving about in real time.

But he said the technology was already being used to cause confusion, as scammers could insert real people into fake video calls and recordings.

The Japanese man told police in his country that he was tricked after coming across an Instagram profile apparently of a Hong Kong retail bank chief executive, with a video of the CEO promoting an investment plan in an interview with an international broadcaster.

Senior Inspector Tyler Chan demonstrates a technique for detecting deepfakes on a video call. Photo: Xiaomei Chen

The man followed the account and began to chat with the scammer, who he thought was the bank CEO, and eventually bought HK$1,700 worth of in-game credits at the scammer’s request.

It was only after he bought the credits that he called the bank’s headquarters in Hong Kong to verify the authenticity of the CEO’s Instagram account, and realised he had been scammed.

The bank also contacted Hong Kong police over the incident.

Chan said scammers were able to collect face and voice data for their fake videos from genuine content available on social media.

He said AI software tools also enabled tricksters to “swap faces” and convincingly appear to be someone else in real-time video calls, confusing their victims, but no such case had been reported in the city yet.

The 25-year-old man who reported a scam attempt in March said it happened after he had a video call with the swindler on a sex live-streaming app of unknown origin.

He was threatened with a video showing him engaging in sex acts, but knew right away that it was fake.

He ignored the scammer’s demand to buy HK$10,000 worth of game credits and deleted the app and all their conversation records before reporting to the force.

Police did not investigate as the man had suffered no loss, but officers believe the scammer grabbed his image and voice during the video call to use in the fake sex clip.

Senior Inspector Tyler Chan Chi-wing, from the cybersecurity and technology crime bureau, said there were telltale signs in deepfakes used in video calls and people should look out for unnatural eye and mouth movements.

To test the authenticity, he said, the caller should be asked to move a finger across their face.

As the programme needed to adjust to the sudden movement, the imitated face would become blurred as the finger passed over it. In an authentic video, the same movement would cause no blurring.
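For readers curious how that occlusion check might be automated, the sketch below is a minimal illustration of the idea, not a tool used by police: it tracks the sharpness of the detected face region frame by frame and flags sudden drops of the kind the finger test is meant to provoke in a deepfake. It assumes Python with OpenCV; the function names, the Haar-cascade face detector and the drop_ratio threshold are illustrative assumptions, not a validated detector.

```python
import cv2

def sharpness(gray_roi):
    # Variance of the Laplacian: a common proxy for image sharpness.
    return cv2.Laplacian(gray_roi, cv2.CV_64F).var()

def flag_blur_drops(video_path, drop_ratio=0.5):
    """Flag frames whose face-region sharpness falls well below a
    running baseline -- the sudden dip the finger-across-the-face
    test is meant to provoke in a deepfake (illustrative heuristic)."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    baseline, flagged, frame_idx = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.1, 5)
        for (x, y, w, h) in faces[:1]:  # consider the first face only
            s = sharpness(gray[y:y + h, x:x + w])
            # Smooth the baseline with an exponential moving average.
            baseline = s if baseline is None else 0.95 * baseline + 0.05 * s
            if s < drop_ratio * baseline:
                flagged.append(frame_idx)  # sharpness dipped sharply
        frame_idx += 1
    cap.release()
    return flagged
```

In a genuine recording, sharpness stays roughly stable as a finger crosses the face; a rendering that struggles with the occlusion tends to blur, pushing those frames below the threshold.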

Francis Fong Po-kiu, honorary president of the Hong Kong Information Technology Federation, said he expected the use of AI by fraudsters to become a trend because it was so easy and inexpensive to get the tools to make convincing deepfakes.

A single photo and a voice recording of just a few sentences were enough to produce a convincing fake, and video calls were fertile ground for scammers to obtain the data needed for their sinister creations.

“It can take 30 seconds or a minute as the scammer records your face and voice to impersonate you,” he said.

He listed more than 20 face-swap and voice-cloning AI tools available from sites charging less than US$100 per month in subscription fees.

He advised people to ignore all video call requests from unknown sources.

Police also urged tech-savvy teenagers to remind older family members about the dangers of such technology.
