The Korean National Police Agency (KNPA) is collaborating with the German police on a new study aimed at detecting and preventing false content on platforms such as social media. The project, launched in the 2020s, is a significant step toward shaping a secure and law-abiding online environment.

In 2026, the KNPA announced that it had begun development of a “false fabricated content authenticity determination system,” a joint effort with the police of North Rhine-Westphalia, Germany. The project is part of the first international joint research study between Korea and Germany in the field of science and security. The aim is to build a comprehensive dataset (a collection of analysis data) for identifying fake content, including deepfakes (AI-generated manipulated images or photos), deep voice (voice synthesized with AI), and fake news. The ultimate goal is to develop an integrated discrimination system by 2027.
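To make the dataset idea concrete, the sketch below shows one plausible way a multi-modality "false content" collection could be organized so that image, voice, and news samples can feed separate detectors. The field names, label scheme, and contributing-institution tags are illustrative assumptions, not the project's actual schema.

```python
# Hypothetical sketch: organizing a multi-modality fake-content dataset manifest.
# All names and fields are assumptions for illustration only.
from dataclasses import dataclass
from enum import Enum
from typing import Dict, List


class Modality(Enum):
    IMAGE = "deepfake"      # AI-manipulated images or photos
    AUDIO = "deep_voice"    # AI-synthesized voice
    TEXT = "fake_news"      # fabricated news articles


@dataclass
class Sample:
    path: str               # location of the media file or article
    modality: Modality
    is_fake: bool           # ground-truth label for training and evaluation
    source: str             # which partner institution contributed the sample


def split_by_modality(samples: List[Sample]) -> Dict[Modality, List[Sample]]:
    """Group samples so modality-specific detectors can be trained separately."""
    groups: Dict[Modality, List[Sample]] = {m: [] for m in Modality}
    for s in samples:
        groups[s.modality].append(s)
    return groups


if __name__ == "__main__":
    manifest = [
        Sample("img/0001.png", Modality.IMAGE, True, "Soongsil"),
        Sample("audio/0001.wav", Modality.AUDIO, False, "Wuppertal"),
        Sample("news/0001.txt", Modality.TEXT, True, "Yonsei"),
    ]
    for modality, items in split_by_modality(manifest).items():
        print(modality.value, len(items))
```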

The study involves multiple institutions. In Germany, the University of Wuppertal was selected as the joint research institute following field investigations and a recommendation from the North Rhine-Westphalia Criminal Investigation Agency. In Korea, Soongsil University was selected as the lead research institute through a public offering and evaluation process, with Sungkyunkwan University, Yonsei University, and HancomWITH also participating.

Official statements speak volumes about the collaboration. In 2023, Choi Joo-won, director of the Future Security Policy Bureau at the KNPA, spoke at length about the newly announced study. He acknowledged the urgency of addressing the growing problem of deepfakes and false content, stating: “This goes beyond simple cooperation; it is a joint response that combines the two countries’ capabilities against future security threats. We look forward to proposing solutions to the problem of false and misleading content around the world.”

Choi Joo-won emphasized collaboration in developing the dataset and integrating the detection models, stating that “we aim to create a diverse and fair dataset that allows for comprehensive detection of false content.” He also highlighted the importance of ensuring steady progress on the integrated system by working closely with partners across all stages, from model development to distribution and operation.
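As a rough illustration of what "integrating detection models" into a single discrimination system can mean, the sketch below combines the scores of separate per-modality detectors by simple averaging against a threshold. The detector interfaces, the fusion rule, and the threshold are assumptions made for this example, not the system's actual design.

```python
# Hypothetical sketch of an integrated discrimination step: fusing the outputs
# of separate image and audio detectors into one verdict. Interfaces and the
# 0.5 threshold are illustrative assumptions.
from typing import Callable, Dict

# Each detector returns the probability that its input is fabricated.
Detector = Callable[[bytes], float]


def integrated_verdict(inputs: Dict[str, bytes],
                       detectors: Dict[str, Detector],
                       threshold: float = 0.5) -> bool:
    """Average the per-modality fake probabilities and compare to a threshold."""
    scores = [detectors[m](data) for m, data in inputs.items() if m in detectors]
    if not scores:
        raise ValueError("no applicable detector for the supplied inputs")
    return sum(scores) / len(scores) >= threshold


if __name__ == "__main__":
    # Stub detectors standing in for trained models.
    detectors: Dict[str, Detector] = {
        "image": lambda data: 0.82,
        "audio": lambda data: 0.64,
    }
    suspect = {"image": b"...", "audio": b"..."}
    print("flagged as false content:", integrated_verdict(suspect, detectors))
```

In practice such fusion could be weighted or learned rather than a plain average; the point is only that per-modality detectors and a combining rule are the two moving parts of an integrated system.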

In the coming years, the KNPA is expected to publish a detailed report documenting the research process and its verifiable results. The issue underscores the need for comprehensive security systems at a global level.
