As suicide rate rises, so does 'suicide-promoting content,' say concerned regulators
Update: 2025-09-30
This article is by Nam Soo-hyoun.
Korea's suicide rate hit its highest level in 13 years last year, while reports of "suicide-promoting content" online have surged to record levels. But the rate at which such content is removed has dropped sharply, prompting calls for a more effective response system.
Particularly on portal sites, the removal rate of such content was found to be just 5 percent, and in the past three years not a single piece of reported video content - such as films or dramas - has been taken down, underscoring wide disparities in how different platforms handle reports.
Citizen monitoring teams reported a total of 401,229 items of suicide-promoting content last year - the highest ever recorded, according to data submitted by the Ministry of Health and Welfare to Rep. Han Zee-a of the People Power Party on Monday.
Posts on social networking services, including X, formerly Twitter, accounted for 366,907 of those reports, or 91.5 percent of the total.
Despite the record number of reports, actual removals declined. Of last year's social media-related reports, only 53,233 items were deleted, a removal rate of 14.5 percent - about half the 28 percent removal rate recorded in 2023.
Other platforms fared even worse. Of 2,703 reports involving portal sites, just 136 items, or 5 percent, were removed. Community sites removed 2,563 items out of 21,451 reports, or 11.9 percent.
The category of video content - including films and dramas - was added in 2022. Since then, 4,300 items were reported in 2022, 792 in 2023 and 1,093 last year, but none were deleted over the three-year period.
"While dramas produced by broadcasters are submitted to the Korea Communications Standards Commission for review, none have resulted in removal orders," said a representative from the Korea Foundation for Suicide Prevention. "Movies and online video services undergo only internal reviews by the providers and have no dedicated oversight body."
Currently, the system run by the Health Ministry and the foundation relies on a citizen monitoring team to report suicide-promoting content. Platforms such as social media and portals then review and delete posts based on their own internal rules, leaving decisions effectively up to each service provider.
"Even when we report, platforms often refuse to delete the content," said an official from a civic group involved in the monitoring effort. "Leaving suicide-promoting information online especially harms vulnerable teenagers."
For clearly defined suicide-promoting content, the Korea Communications Standards Commission can order deletion or blocking, but its reviews take an average of 99 days - enough time for information to spread widely.
The government's recently announced "2025 National Suicide Prevention Strategy" includes plans to allow written deliberations in order to speed up review, deletion and blocking, but no implementation date has been set.
Experts say the system needs urgent reform to ensure quick deletion or blocking of harmful content.
"Currently, information providers like portal sites face little penalty for ignoring reports," said Yu Hyun-jae, a professor of communications at Sogang University. "At minimum, platforms should be required to transparently disclose how many deletion requests they've refused to encourage cooperation."
However, some argue that blocking social media posts that express depression or vague thoughts of suicide could have negative consequences, and that more precise criteria are therefore needed.
"Clearly harmful content should be deleted, but right now the standards are ambiguous," said Yoo Gyu-jin, who leads a civic group that monitors social media for suicide prevention. "Unrestrained reporting and deletion could actually shrink the online space where teenagers express their feelings."
"We need trained professionals who can determine whether content should be deleted or whether it's a cry for help requiring intervention," Yoo said....