Fake Love
In the BTS fanspace, largely (not exclusively) populated by younger women from across the globe, I’m seeing an uncritical and unquestioning adoption of AI that raises concerns.
In recent weeks, hundreds (if not thousands) of fans have used a Polaroid-style image generator enabled by Google’s Gemini to create realistic snapshots of themselves posing, couple-like, with BTS members. To these young women, the images fulfill a dream: meeting their bias (their favorite of the seven members). Few seem to realize that they are sharing their own likeness publicly, and that the deepfake technology they are using is the very same technology used to exploit young women just like themselves in online sextortion attacks.
These are so-called “digital natives,” and yet they are acting digitally naïve.
As K-pop recording artists, BTS are defined by the words, sounds, and images they create: lyrics, voices, music composition and production, dance, and pictures. Their success depends on how these assets are developed, marketed, and controlled. AI offers them tools to enhance their art at the same time that it potentially enables others to infringe upon—or fake—it.
It has long been a point of pride that BTS don’t lip-sync in concert or rely on Auto-Tune. They have been lauded for their lyricism in rap and more traditional pop songs alike. But today their voices, cloned by AI without their consent, can be deployed by anyone on the internet to make “BTS” sing whatever they want. The BTS fan base, ARMY, seems largely to reject and discourage the use of audio AI to mimic their faves—but what about visuals?
Fashion, beauty, and overall styling are a huge part of the K-pop idol package, making the BTS members almost as much models as they are rappers, singers, and dancers. Fans covet, collect, and trade images of them—from the photocards that come with physical albums to images of more dubious provenance: candid airport shots (the procurement of which contributes to dangerous crowd conditions) or the even more troubling sasaeng (stalker fan) pics taken at restaurants or elsewhere in idols’ private lives.
In the fanspace, images are currency. Photocards are gifted, traded, and sold. Sharing digital images garners likes and follows from other fans, increasing social capital. Posting certain kinds of images can also trend, as when the Ghibli AI image generator spurred a flood of Studio Ghibli-style illustrations of the pop stars (while ripping off and trivializing the work of a celebrated animation studio in the process). Specifically in the Weverse digital artist-and-fan communication space, posting images where idols may actually see them (and sometimes respond) is a way not only to lavish love and support on artists but to turn the tables and seize their attention.
The Gemini-enabled Polaroid images popularized over the past couple of weeks let fans do just that. Not only are fans creating and sharing snapshots with their beloved idols; some have manufactured photos of one of the members (김태형 / Kim Taehyung, aka V) with his deceased grandmother and sometimes his dog (also deceased), posting them with captions like “do you like this?” or “do you miss them?” This isn’t meant maliciously, but as an attempt to attract his attention and elicit not only a text response—but an emotional one.
What kind of fandom is it to use an artist’s image without their consent, let alone images of their dead loved ones, to engage in emotional manipulation? These fans are not considering such questions, only whether the AI tool enables them to create something novel and join an online trend—or to connect in a world at once hyperconnected and disconnected.
Fan-created content obviously predates AI image generators. Portraits have been sketched and painted by fans since the dawn of pop itself. Clumsy Photoshop and even cut-and-paste have been used to create visual tributes, memes, and the like since the days of Tumblr. Fan fiction uses idols as characters in textual fantasies and inventions. All of these trade in idols’ literal and figurative images while requiring some skill to make. AI tools now enable any one of us to create lifelike images and audio.
BTS member Jeon Jungkook (전정국) remarks on AI audio of his own voice during a December 2024 live webcast with fans on Weverse.
“Will I be able to beat an AI?
People won’t know if we record with AI later on. We’re in trouble.
I’m doomed. My job’s disappearing.”
(English subtitles generated by Weverse.)
In 2022, deepfake images seeming to show V going on dates with a female idol group member flooded social media, with fans split as to whether the images were real or fake. (Muggles may ask “so what?” but the implications are real in an idol culture of venomously jealous fans fed by a bloodthirsty entertainment media.) Many fans felt justice had been done when BTS’ management company sued the originator of those fake images, but many fail to see the connection to their own AI-enabled actions, to the ethics of consent, and to the potential for harm even to themselves.
Fans are not just making their own content but making themselves content. In a digital space, our images don’t belong to us any more than idols’ do. AI lets us craft “pretty lies” (예쁜 거짓을) for each other as it pushes us down a slippery slope.
Still from BTS music video for “Fake Love” shows member V surrounded by phone screens and empty photo frames.