Deepfakes are video, audio and image content generated by artificial intelligence. This technology can produce false images, videos or sounds of a person, place or event that appear authentic.
In 2018, there were approximately 14,698 deepfake videos circulating online. Since then, the number has soared with the popularity of deepfake apps such as DeepFaceLab, Zao, FaceApp and Wombo.
Deepfakes are used in several industries, including filmmaking, video games, fashion and e-commerce.
However, the malicious and unethical use of deepfakes can harm people. According to research by cybersecurity firm Trend Micro, the "rise of deepfakes raises concern: It inevitably moves from creating fake celebrity pornographic videos to manipulating company employees and procedures."
Al Jazeera investigates the growing threat of deepfakes.
Our research found that organizations are increasingly vulnerable to this technology, and that the costs of this type of fraud can be high. We focused on two public case studies in which deepfakes were used to target senior company staff, with estimated losses of US$243,000 and US$35 million respectively.
The first case of fraud occurred at a British energy firm in March 2019. The firm’s chief executive officer received an urgent call from someone impersonating his boss, the chief executive of the firm’s German parent company, asking him to transfer funds to a Hungarian supplier within an hour. The fraud was presumably carried out using commercial voice-generating software.
The second case was identified in Hong Kong. In January 2020, a branch manager received a call from someone whose voice sounded like that of the director of the company. In addition to the call, the branch manager received several emails that he believed were from the director. The phone call and the emails concerned the acquisition of another company. The fraudster used deep voice technology to simulate the director’s voice.
In both cases, the firms were targeted for payment fraud using deepfake technology to mimic individuals’ voices. The earlier case was less convincing than the second, as it relied on voice phishing alone, while the second combined the faked call with supporting emails.