Chan Chi-wing, senior inspector with the Cyber Security Division of the Cyber Security and Technology Crime Bureau, demonstrates how to spot and prevent AI-related deceptions. Photo: SCMP / Xiaomei Chen
Opinion
Editorial
by SCMP Editorial

Deepfake warning is a timely wake-up call

  • Police say cheap artificial intelligence (AI) tools allowing scammers to “swap faces” even in real time are no longer the stuff of science fiction. Such awareness is the first step in avoiding being drawn in

Imagine getting a video call from a friend, relative or business contact, but later discovering that the person you saw and heard was not who they seemed. Instead, you were the target of a costly or even dangerous scam.

Police in Hong Kong are warning that such scenarios are no longer the stuff of science fiction. Cheap artificial intelligence (AI) tools allow tricksters to “swap faces” even in real time.

No such deepfake cases have been reported in the city yet, but officers have raised the alarm after scammers used AI software to produce clips featuring cloned voices and doctored images.

In May, a Japanese man reported losing HK$1,700 (US$217) worth of computer game credits after being fooled by a fake television interview apparently featuring the chief executive of a bank in Hong Kong.

In March, a 25-year-old Hong Kong man told police he was threatened with blackmail over a video in which his face was superimposed on the naked body of a stranger engaged in sex acts. The man knew the clip was fake and did not fall for the ruse, but investigators say the scammer probably made the fake clip by grabbing his image and voice during a video call.


The cases serve as a warning, according to experts at the Hong Kong Information Technology Federation, who predict AI fraud will become a trend. They said a single photo and a voice recording of less than a minute are enough to produce a convincing fake using the more than 20 face-swap and voice-cloning AI tools now online.

The police cybersecurity and technology crime bureau said scammers can also collect face and voice data for fake videos from genuine content on social media. Officers suggest ignoring video call requests from unknown sources, asking personal questions or looking for telltale signs such as unnatural eye or mouth movements. Asking a caller to move a finger across their face can also help: any blurring of the image could mean it is a fake, since the software often struggles to keep up with sudden movements.

Late last year, police called for everyone in the city to help fight a “vast and borderless” war against scams. Awareness is the first step in avoiding being drawn in. Their deepfake warning is a reminder that it is time to take real steps to stay ahead of the scammers.
