President Yoon Suk Yeol declares martial law at the presidential office in Yongsan, Seoul, Dec. 3. (Presidential office)
South Korea has descended into political turmoil following President Yoon Suk Yeol’s unexpected declaration of emergency martial law on the night of Dec. 3. The move shocked many, particularly after it was revealed that Yoon had ordered troops to the National Election Commission on that chaotic night.
Many suspect that Yoon’s actions were influenced by far-right extremist YouTube channels, which have long propagated election fraud conspiracy theories, including claims about irregularities in the 2020 and 2024 general elections.
Such suspicion gained traction after reports surfaced that it was an "open secret" that Yoon avidly watches these channels. It also resurfaced that Yoon had invited around 30 far-right YouTubers to his inauguration, including the operators of channels such as Lee Bong-gyu TV and Sisa Warehouse, which boast some 927,000 and 144,000 subscribers, respectively.
On Dec. 13, Lee Hae-min, a former Google employee and lawmaker, pointed out, “Yoon truly believes in the election fraud theories circulating on far-right YouTube channels. He sees them as the root cause of all problems.”
Critics, including former People Power Party leader Han Dong-hoon, warned that aligning with these conspiracy theorists and extremist YouTubers, "who produce fear commercially," could destroy conservatism in South Korea, signaling how such YouTube channels have infiltrated the country's politics.
As the controversy grows, the role of YouTube’s algorithms in steering political views is under increasing scrutiny, raising the question: How does YouTube shape its users’ beliefs?
Confirmation bias
Experts argue that the mechanical nature of YouTube's recommendation algorithm amplifies confirmation bias.
“When you enter YouTube’s main page, recommended videos appear on the right side. These recommendations are based on the user’s viewing history, with a certain bias in how they are suggested,” said Han Jeong-hun, a professor at the Graduate School of International Studies at Seoul National University, who authored the 2022 dissertation "Understanding Political Polarization Based on User Activity: A Case Study of Korean Political YouTube Channels."
Han continued, “Human nature also plays a role, as people naturally gravitate toward content they prefer. Combined, these factors create a cycle where users repeatedly engage with similar content, reinforcing a biased consumption of information that can influence people's behavior over time.”
Experts also note that content that is extreme or radical in nature is more likely to go viral. “YouTube's algorithm is a mathematical function. Provocative content gets promoted. Things with the characteristics of propaganda — whether from the right, left or even sexual — tend to be more visible because the platform's goal is to increase MAU, or monthly active users,” said Yu Hyun-jae, a journalism professor at Sogang University.
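The dynamic the two professors describe can be made concrete with a toy simulation. The Python sketch below is a hypothetical illustration, not YouTube's actual algorithm: the scoring rule, the "provocativeness" boost and every number in it are assumptions. It shows only how ranking by predicted engagement, combined with a viewing profile that shifts toward whatever was just recommended, can produce the self-reinforcing cycle Han and Yu point to.

```python
import random

# Toy model of an engagement-driven recommender feedback loop.
# All scoring rules and numbers below are illustrative assumptions;
# this is NOT YouTube's actual algorithm.

random.seed(42)

# Each video is (political_lean in [-1, 1], provocativeness in [0, 1]).
catalog = [(random.uniform(-1, 1), random.uniform(0, 1)) for _ in range(500)]

user_lean = 0.2  # the viewer starts near the political center
watched = []

def predicted_engagement(video, lean):
    video_lean, provocative = video
    similarity = 1 - abs(video_lean - lean) / 2  # closer to taste scores higher
    return similarity + 0.5 * provocative        # provocative content gets a boost

for _ in range(20):
    # Recommend the unwatched video with the highest predicted engagement.
    pick = max(catalog, key=lambda v: predicted_engagement(v, user_lean))
    catalog.remove(pick)
    watched.append(pick)
    # Watching nudges the viewer's profile toward the recommendation,
    # so the next round is ranked against an already-shifted taste.
    user_lean = 0.9 * user_lean + 0.1 * pick[0]

mean_provocative = sum(v[1] for v in watched) / len(watched)
print(f"final user lean: {user_lean:+.2f}, "
      f"mean provocativeness watched: {mean_provocative:.2f}")
```

In this toy setup, the loop keeps surfacing videos that are both close to the viewer's current taste and highly provocative, a simplified version of the "biased consumption of information" Han describes.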
YouTube among older users
Lee Bong-gyu, operator of the right-wing YouTube channel Lee Bong-gyu TV, which he claims President Yoon Suk Yeol watches, speaks in this screenshot. (Screenshot from Lee Bong-gyu TV)
In South Korea, YouTube’s influence is particularly concerning as older users, often unaware of the platform’s potential harms, become increasingly active on it.
"There has been a significant rise in people in their 50s, 60s and 70s turning to YouTube for political news and similar content," said Lee Moon-haeng, a media communication professor at Suwon University.
"The shift is mainly driven by convenience, as YouTube now offers content tailored to their preferences. Unlike their children who may offer different perspectives, older individuals are finding peer groups on YouTube that reinforce their views."
Another pressing concern is that disabling algorithmic recommendations in order to explore more diverse content is difficult in practice.
“Even if you block the algorithm, you can’t go a day without turning it back on. The convenience disappears, and you’ll end up reactivating it,” said Yu.
In a 2022 report, Mozilla researcher Jesse McCrosky pointed out, “YouTube continues to recommend videos even if users state they don’t want to see them, merely reducing the frequency slightly.” He added, “This means that users cannot meaningfully control YouTube’s algorithm.”
Urgent need for media literacy
Won Jae-yoon, the operator of Cat News (translated), a popular left-wing YouTube channel, speaks in this screenshot. (Screenshot of Cat News)
Right-wing YouTubers are among the most profitable creators on the platform, while active left-wing YouTubers have reportedly spread disinformation through major left-wing platforms such as The DDanziGroup and Ruliweb. Both now face a unique opportunity, as Korean politics is expected to become more chaotic and confusing in the aftermath of the short-lived emergency martial law.
Experts emphasize the critical need for media literacy in this environment. “Few countries have such a close connection between YouTube and politics as South Korea, thanks to its nearly unrestricted internet and robust infrastructure,” said Yu. “The best solution is media literacy education for the public.”
Lee agreed, emphasizing the harm that a lack of media literacy could do to society as a whole. "It’s not just about passively accepting information — it’s about sharing without verifying its truth, often without any desire to check. People see something they believe is important or accurate and spread it, amplifying misinformation."
Han echoed this sentiment, arguing that blocking algorithms or regulating content is not only unrealistic, but also detrimental to freedom.
“The public must raise its awareness. People should be able to filter content themselves, and if they suspect something is false, they should cross-check with other media sources. This is what citizens in a democratic society should do," said Han.