The Korean National Police Agency said Tuesday it had developed deepfake detection software to investigate crimes involving the technology.
Deepfakes, videos manipulated with AI to depict lifelike scenes of things that never happened, often featuring well-known public figures, have become a concern for authorities ahead of the general elections in April. A revision to the Public Official Election Act in December banned the use of fabricated content such as videos and images in election campaigns.
According to police officials, the newly developed software can analyze a video suspected of using deepfake techniques such as face swapping and determine its authenticity within five to 10 minutes of the video being uploaded to the system. It also generates a report as soon as the analysis is finished, so the results can be used immediately in police investigations.
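The article does not describe the tool's internals. As a rough illustration of the upload-analyze-report flow it describes, the sketch below samples frames from a suspect clip, scores each frame with a placeholder classifier, and prints a summary report. The file name, the scoring function and the decision threshold are all hypothetical stand-ins, not details of the police system.

```python
# Hypothetical sketch of a deepfake screening pipeline: read a suspect clip,
# score sampled frames with a stand-in face-swap classifier, then aggregate
# the scores into a short report. The real model used by the police is not public.
import cv2          # OpenCV, used here only for frame extraction
import statistics
import datetime


def score_frame(frame) -> float:
    """Placeholder for a real face-swap detector.
    Returns a probability in [0, 1] that the frame is manipulated."""
    return 0.5  # stand-in value; a real model would run inference here


def analyze_video(path: str, sample_every: int = 30) -> dict:
    cap = cv2.VideoCapture(path)
    scores = []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_every == 0:      # sample roughly one frame per second at 30 fps
            scores.append(score_frame(frame))
        idx += 1
    cap.release()
    mean_score = statistics.mean(scores) if scores else 0.0
    return {
        "file": path,
        "frames_scored": len(scores),
        "mean_manipulation_score": round(mean_score, 3),
        "verdict": "likely manipulated" if mean_score > 0.5 else "no manipulation detected",
        "generated_at": datetime.datetime.now().isoformat(timespec="seconds"),
    }


if __name__ == "__main__":
    report = analyze_video("suspect_clip.mp4")   # hypothetical file name
    print(report)   # a real system would render this as an investigative report
```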
The deepfake detection system previously used by the police was developed overseas and trained mostly on data featuring non-Asian faces, resulting in poor detection rates for deepfake videos of Koreans. The new software was trained on 5.2 million data points featuring 5,400 people, including 1 million data points from Koreans and 130,000 from people of other Asian ethnicities. The latest artificial intelligence models were also applied to the detection system to help it respond to new forms of deepfakes it was not trained on.
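To make the demographic gap concrete, one way such a weakness would show up is in a per-group breakdown of detection rates on known deepfakes, as in the illustrative snippet below. The records, field names and numbers are invented for the example and are not from the police evaluation.

```python
# Illustrative only: computing detection rates per demographic group,
# the kind of breakdown that would expose the weakness described above.
from collections import defaultdict


def detection_rate_by_group(results):
    """results: iterable of dicts with 'group' and 'detected' (bool) keys,
    covering clips that are known to be deepfakes."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    for r in results:
        totals[r["group"]] += 1
        hits[r["group"]] += int(r["detected"])
    return {group: hits[group] / totals[group] for group in totals}


# Hypothetical results for a detector trained mostly on non-Asian faces
sample = [
    {"group": "Korean", "detected": False},
    {"group": "Korean", "detected": True},
    {"group": "non-Asian", "detected": True},
    {"group": "non-Asian", "detected": True},
]
print(detection_rate_by_group(sample))  # e.g. {'Korean': 0.5, 'non-Asian': 1.0}
```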
As for its detection rate, the National Police Agency said it reaches up to around 80 percent. Because the results are not completely accurate, police officials will use them to set the direction of investigations rather than as evidence.
Police officials added that the software will be continuously upgraded so that it can accurately detect deepfake content not only related to the elections but also used in other offenses, such as sex crimes.
Additionally, as deepfake-related technologies are constantly evolving, the police plan to minimize the possibility of false detections for videos suspected of being deepfake content during the April general elections by cross-verifying the software's findings with AI experts in academia and industry.