Amid growing concerns over sex crimes against minors, South Korea's national police agency announced Monday that it has established an international response platform to tackle the spread of online child sexual exploitation content.
Dubbed "InaRae," which stands for International Response Against Exploitation, the system allows law enforcement agencies in Korea and other member countries to mutually request the removal and blocking of child sexual exploitation content.
The platform draws on a database of legal and illegal site URLs registered by each member country. When the URL of a site containing child sexual exploitation content is entered into InaRae, the system shows which country regulates the site and whether the site is illegal there.
If child sexual exploitation content is posted on a site registered as legal, a country can identify the member state that regulates the site and request that the content be removed. If the site itself is deemed illegal, a country can request that all other member states block the site's URL.
Member states that receive a removal or blocking request through the platform will then eradicate the content according to their own domestic procedures.
Until now, the police could only ask the Korea Communications Standards Commission to remove and block child sexual exploitation content in cyber sexual violence cases, a step considered as important as apprehending the suspect.
However, given the borderless nature of cybercrime, the limits of relying on domestic agencies alone for removal and blocking prompted the launch of the international joint effort, the police agency explained.
To date, law enforcement agencies from Nepal, Taiwan, Singapore, Indonesia, Thailand and the United Arab Emirates have expressed interest in participating. The nonprofit National Center for Missing & Exploited Children (NCMEC) also said it would join the cross-border cooperation.
In South Korea, internet service providers reported more than 144,000 cases of illegal filming, fake sexual images and sexual exploitation of children and adolescents last year, of which 81,578 were deleted or had access to them blocked.