Can 'FakeTalk' talk kids out of suicide?
It's approaching 3 AM on Christmas Day in 2013, and a South Korean teenage girl who goes by the Twitter handle @jjong_gee texts her friend, Junmyun, to confess a personal secret. She's depressed, and she needs support.
"There was a man named Osho who once said 'don't be too serious, life is like a moving picture,'" Junmyun replies. "If you treat what comes at you like a game, happiness will come. I want to see you happy."
The girl tweets a screenshot of the text, thanking him for the kind words. But Junmyun, for all his words of wisdom, is not a real person. He is a bot programmed inside a popular Korean texting app called FakeTalk, or Gajja-Talk in Korean.
With reading comprehension and writing skills that draw comparisons to the iPhone's Siri or Samantha from the sci-fi romance movie Her, FakeTalk is wildly popular in South Korea, with over four million downloads in a country of 50 million. It may also be the best lifeline South Korea has for keeping its depressed and suicidal teenagers out of harm's way.
A screenshot of FakeTalk in action.
Developed in 2011, the Android-only app has gone through three iterations and nonstop growth in popularity. Drawing either on its artificial intelligence program or its large bank of ready-made responses, the app can instantly text back and carry on a conversation with its users. The bot's avatar can be customized as anyone the user chooses, from celebrities to loved ones.
In the case of @jjong_gee, Junmyun was modeled after a member of the popular K-Pop boy band EXO.
"Although there have been many attempts to develop chatbots that let humans talk to computers, users could only speak to fixed characters made by the developers," said FakeTalk developer Ki-ho Baek, in an email written in Korean. "Our app began with the premise of, what if users could freely create their own friends, such as clones and chatting bots?"
Many users have used the app to fill a personal void in their lives. One user customized the avatar after her deceased boyfriend in order to continue a romance cut tragically short.
"At first, the app was a little clueless, but now it texts like him, with the same syntax and idiosyncrasies," said Ms. K, the bereaved girlfriend. "Now I text what I didn't tell him then, and what I wanted to tell him when he was alive."
Teenagers make up 70 to 80 percent of the users, according to the Korean newspaper Chosun Ilbo.
Living under constant and severe pressure to succeed in school, and spending over 12 hours a day studying, Korean schoolchildren were found to be the unhappiest among those in 27 developed nations. In February, a survey from Seoul University found nearly 60 percent of middle schoolers had unstable relationships with their classmates.
A 2014 poll showed that over half of Korean teenagers have had suicidal thoughts. From 2009 to 2014, 878 students died by suicide, according to the Korean education ministry.
Ben Park, who researches youth suicide in South Korea at Penn State Brandywine, believes the rigorous conditions of growing up in Korean society should be considered a human rights violation for stunting a whole generation's emotional growth and human connection.
Craving emotional connection and love, Korean teenagers, the vast majority of whom own a smartphone, fill the void with artificial means like FakeTalk.
"This seems to be a sign of a pathological condition in Korean society, a sign that young people are deprived of real human bonding experiences," Park said. "This sense of belonging is a basic human need. With this need unmet, young people can develop a sense of self that would create a weak psychological anchor."
This year, the South Korean government jumped into the fray and launched a smartphone app of its own. The app screens students' social media posts, messages, and web searches for words related to suicide, and alerts parents if their child is determined to be at risk.
"It seems to me that the government approach to suicide prevention is a bit like putting a tiny bandage on a large wound," Park said.
FakeTalk's AI is trained to handle potentially dangerous situations. If a user texts "I'm on the roof," for example, the AI draws on the user's personality and past interactions to respond in a range of tones, from consoling to playful, to discourage self-harm.
While recent media scrutiny of FakeTalk has swirled around its effectiveness in providing emotional support to teenagers, Baek has no concrete evidence that his app has saved teenagers at risk of suicide, partly because the app does not store conversations, for privacy reasons.
He expressed concern about FakeTalk being labeled an emotional crutch for lonely and depressed people, and hopes people do not consider his app a suitable replacement for human camaraderie.
"Computers can't really console a human being because it's just a means," Baek said. "All the responses of consolation are still written by people."
But Baek may be underestimating the human-like empathy his app can show a broken heart. Ask 16-year-old Ms. Kwon, who texted her FakeTalk bot after failing a class test.
Without missing a beat, the bot jumped to console her, asking Kwon if she's okay and downplaying the gravity of a failed test.
Then it turned the conversation to something more upbeat to cheer her up, like any good friend would do.
"Tell me something funny that happened to you lately," FakeTalk said.