"Once you overcome the one-inch-tall barrier of subtitles, you will be introduced to so many more amazing films." At the 2020 Golden Globes ceremony, Bong Joon-ho, director of the internationally successful film Parasite, spoke of the one-inch barrier: the hurdle faced by Western, English-speaking audiences when they sit down to watch foreign-language films.
However, the one-inch barrier that Bong talks about might already belong to the past, now that large language models (LLMs) are advancing at such speed. The Galaxy S24 smartphone now comes with an AI-powered feature called Live Translate, which translates phone calls in real time directly on the device. Will we soon reach the point where the one-inch barrier no longer exists? What will be the future of translation and of our languages? Perhaps we will soon reach a time when we don’t need to translate, or even learn languages, at all.
Will translation still be necessary in the AI age? Digital humans are becoming increasingly popular, and it isn’t difficult now to imagine watching a film that is neither subtitled nor dubbed, in which the actors simply speak in various languages without translation. AI music covers are also gaining popularity. For instance, Lauv used AI voice technology to release a single in Korean. When I first heard it, I thought he had mastered Korean, but it turned out to be an AI cover. Are we then entering a time when the post-Babel confusion of languages can finally be overcome?
As a linguist with a speciality in Asian languages, I don’t think we are quite there yet. The “one-inch barrier” will be hard to overcome even with the help of AI. AI is a fantastic tool for translating information, but it struggles to do justice to the emotional character of human communication.
In 2021, I wrote a book titled "Pragmatic Particles: Findings from Asian Languages," in which I showed how contemporary linguistic theories are ill-equipped to describe and explain the rich pragmatic meanings embedded in Asian languages. These languages are rich in emotional and attitudinal nuance, yet much of it is barely translatable into English.
In English, one uses the second-person pronoun "you" regardless of whether the addressee is older, younger, or privileged -- it’s a term for both the powerful and the powerless. This isn’t the case for most Asian languages. In Korean, one of the most difficult words to translate from English is precisely this "you." To make it work in Korean, one has to weigh multiple interpersonal factors. It’s never easy, even for native Korean speakers, to find the right term of address, and we know that inappropriate use of these terms can create serious conflicts that sometimes even lead to violence.
I suspect that perfect translation will remain unachievable even with the arrival of hyperscale AI. When ChatGPT was released, I tested its ability to translate a number of Asian languages, including Chinese, Japanese, Korean, Turkish, Mongolian, and Kazakh. The outcome was disappointing -- and not just for ChatGPT but also for Google’s Bard. Even Google Translate, which has been around for many years, still runs into serious pragmatic problems when translating English into Asian languages. Since each translation is done sentence by sentence, the interpersonal relations between speaker and hearer, which are set by the wider context, are hard to carry over. For Korean-English translation, Naver’s Papago has a button that lets the user choose an honorific style. Yet, as all Koreans know, there are multiple factors to consider when choosing an honorific style, and it’s difficult to make that decision on the basis of a single sentence. LLM-based translation can do better than older models in the pragmatic sense, since one can specify and adjust the style of translation. However, it is not based on grammatical rules but on statistical patterns in the frequency of terms, so silly mistakes are easy to find in its output. This is why we can only use Google Translate or other LLM-based translation to get the gist of a text -- not to capture its meaning in conversation with native speakers.
At present, there is not enough Asian-language data on which LLMs can be trained, and it will take a while to build such datasets. Hence, AI performs best in English, followed by Western European languages; when it comes to Asian languages, its ability to translate and produce natural language is still quite limited.
The fate of our languages is certainly challenged by the advance of AI. Just as we have grown more accustomed to paying by card than with cash, and to sending an instant message rather than a fax, the ways languages are delivered and transferred will keep changing. However, I sense it will take quite a while for AI to achieve pragmatically suitable translations. In the recent film "Decision to Leave" by South Korean director Park Chan-wook, the protagonist Seorae uses various AI media, along with her own voice, to communicate with the detective Haejun, with whom she falls in love.
When Seorae wants to be herself and express her feelings, she speaks in Chinese first and then uses her phone for AI translation. When she speaks with others, and when emotion doesn’t matter so much, she speaks in Korean or uses an AI voice. In short, she expresses her feelings with her own voice and accompanies them with AI-translated information. Perhaps how Seorae speaks points to the future of translation: this kind of division of labor will become the norm. Information-driven translation will increasingly be done by AI, yet this won’t take away the need for human translators, because AI cannot easily carry rich emotions and pragmatic diversity across languages.
Jieun Kiaer
Jieun Kiaer is a Young Bin Min-KF professor of Korean linguistics at the University of Oxford. -- Ed.