Samsung envisions a world filled with AI assistants, but not the sort of chatbot-powered assistants to which we’ve become accustomed. In a press release late this evening, the Seoul-based company unveiled Neon, a project developed by subsidiary STAR Labs that ambitiously aims to deliver “immersive … services” that “[make] science fiction a reality.”
Pranav Mistry, a human-computer interaction researcher and former senior vice president at Samsung Electronics, explained that the Core R3 software engine underlying Neon animates realistic avatars designed to be used in movies, augmented reality experiences, and web and mobile apps. “[Core R3 can] autonomously create new expressions, new movements, [and] new dialog … completely different from the original captured data [with latency of less than a few milliseconds],” he wrote in a tweet.
Neon’s avatars look more like videos than computer-generated characters, and that’s by design — beyond media, they’re intended to become “companions and friends” and stand in for concierges and receptionists in hotels, stores, restaurants, and more.
In the future, businesses will be able to license or subscribe to Neon as a service, according to Mistry. “Movies are full of examples where AI is brought into our world,” he told LiveMint in a recent interview. “In Blade Runner 2049, Officer K develops a relationship with his AI hologram companion, Joi. While films may disrupt our sense of reality, ‘virtual humans’ or ‘digital humans’ will be reality. A digital human could extend its role to become a part of our everyday lives: a virtual news anchor, virtual receptionist, or even an AI-generated film star.”
It’s worth noting that AI-generated high-fidelity avatars aren’t exactly the most novel thing on the planet. In November 2018, during China’s annual World Internet Conference, state news agency Xinhua debuted a digital version of anchor Qiu Hao, dubbed Xin Xiaohao, capable of reading headlines around the clock. Startup Vue.ai leverages AI to generate on-model fashion imagery by sussing out clothing characteristics and learning to produce realistic poses, skin colors, and other features. Separately, AI and machine learning have been used to produce videos of political figures like Boris Johnson delivering speeches they never actually gave.
Neon brings to mind Project Milo, a prototypical “emotional AI” experience developed in 2009 by Lionhead Studios. Milo featured an AI structure that responded to spoken words, gestures, and several predefined actions, as well as a procedural generation system that constantly updated a built-in dictionary capable of matching words in conversations with voice-acting clips.
Milo never saw the light of day, but Samsung appears keen to commercialize the tech behind Neon in the coming years. Time will tell.