Broadcast From Around the World: Real-Time Live! Amazes at SIGGRAPH 2020

Screenshot of “Interactive Style Transfer
to Live Video Streams” captured during SIGGRAPH 2020 Real-Time Live!

On Tuesday, the annual SIGGRAPH conference favorite Real-Time Live! streamed remote demos for the first time ever. SIGGRAPH 2020 Real-Time Live! Chair Marc Olano kicked off the event by welcoming the contributing teams, who were dialed in from Seoul, Prague, and everywhere in between to present their jury-selected work.

Although Real-Time Live! looked a bit different from SIGGRAPHs past (there was no stage!), Olano still brought the fun (and puns) to presenting the nine featured projects, and helped viewers feel connected by inviting audience participation via Twitter for one of the demos.

Watch a retrospective of Real-Time Live! over the years.

Read on for a brief overview of each of the exciting demos presented during the SIGGRAPH 2020 Real-Time Live! show (and to find out the winners).

Chroma Tools: AI-enhanced Storytelling for Motor Racing Sports

Start your engines and head off to the races with this technology designed to automate overlay visuals for live motor racing on television. Chroma Tools enables dynamic overlays that track racers as they appear on screen in real time. The live demo showed how the system uses AI to identify each car and immediately adjust settings, letting the AI take the lead in telling the story.

Interactive Style Transfer to Live Video Streams

Best in Show (Two-way Tie)

Artists who want to sketch out their ideas quickly need a fast, interactive, and easy-to-use platform. Enter “Interactive Style Transfer to Live Video Streams.” During this team’s demo, artist Pavla painted over a stencil of Ondřej’s face, and the framework picked up Pavla’s additions, projecting them onto Ondřej’s face in a live, real-time video stream. The network can adapt to style changes in seconds. Major kudos to this team for not only tying for the Best in Show award, but also for burning the midnight oil to present live from Prague! Be sure to catch their accompanying Technical Paper on demand.

DrawmaticAR — Automagical AR Content From Written Words

Audience Choice

DrawmaticAR was one of the most interactive demos of the event. This project turns what you write on real paper into a 3D, AR, interactive story. To demonstrate this, contributor Yosun (along with her corgi) asked Real-Time Live! viewers to use the hashtag #DrawmaticAR_RTL to tweet ideas for food you’d find in a picnic basket. Viewers suggested hot dogs, pie, cheese, and wine — and each item appeared in real time! The app offers a bright future for interactive storytelling, and took home the coveted Audience Choice award.

SIGGRAPH 2019 Real-Time Live! Chair Gracie Arenas Strittmatter tweets praise to DrawmaticAR, Olano, and Yosun’s corgi.

Introduction to Real-time User Interaction in Virtual Reality Powered by Brain Computer Interface Technology

Looxid Link, from Looxid Labs, is a VR-compatible, brain-sensing technology that can be utilized as a natural user interface to connect users’ minds to VR. It can acquire biometric data to capture aspects of users’ minds like cognition and emotion, giving insight into the cognitive process. During this demo, the team showed how the technology can track attention and relaxation levels, as well as left and right brain activity.  

AI-synthesized Avatars: From Real-time Deepfakes to Photoreal AI Virtual Assistant

Did you hear that Will Smith stopped by? Actually, that was just a real-time, deep learning-based facial synthesis technology for photoreal AI avatars. This technology allows a user to create their own 3D face model and transform into the face of an actor, athlete, politician, musician, or anyone they can think of. As an entirely cloud-based AI solution, this technology can also enable an AI-based photoreal autonomous virtual companion — presenter Hao Li even conversed with a virtual avatar of his wife.  

Limitless Dynamic Landscapes of 1 mm per Pixel Density, Zooming Included

Take a trip to another land with “Limitless Dynamic Landscapes.” This new system fulfills a challenging combination of requirements, pairing very dense detail with huge terrain size. The demo covered distances from 8 km to 1,000 km with high-resolution data up to 1 mm per pixel. The technology, which took almost two years to create, presents a very detailed and specialized interaction with the landscape.

Sketch-to-Art: Synthesizing Stylized Art Images From Hand-drawn Sketches With No Semantic Labeling

Become your own Picasso with “Sketch-to-Art.” Sketch a landscape, still life, or composition, and AI translates it into a digital drawing. Then, creators can choose a style, artist, or reference image for the digital sketch to emulate, transforming it into a design that looks like a classic piece of art. We also saw a demo of “face mix,” which can blend photorealistic and illustrative faces to create a desired face. The example combined Kim Kardashian, Kanye West, and North West to predict what North will look like in her 20s.

Volumetric Human Teleportation

Best in Show (Two-way Tie)

In this time of social distancing, are you missing (safe) social contact? Volumetric Human Teleportation captures a completely clothed human body using a single webcam in real time. It dynamically adapts to changes in appearance, such as taking off a sweatshirt or adding a prop like a backpack, can handle everyday objects and lighting, and can automatically switch subjects with no template required. The teleportation aspect places subjects in new environments, and the setting is customizable, transporting users to the office or a fantasy world. Now, if only we could teleport together to celebrate this team’s tie for the Best in Show award!

The Technology Behind ‘Millennium Falcon’: Smugglers Run

All aboard! Last but not least, this demo took viewers behind the scenes of the Disney magic that powers the “Millennium Falcon: Smugglers Run” theme park ride. Walt Disney Imagineers created this fully immersive, interactive attraction to take guests inside “Star Wars.” The team created a high-fidelity and high-resolution view of the world, rendering content over five projectors and employing multi-GPU technology to achieve their goals. For more on the making of this attraction, read our interview with contributor Eric Smolikowski.


Congratulations to award winners “DrawmaticAR — Automagical AR Content From Written Words,” “Interactive Style Transfer to Live Video Streams,” and “Volumetric Human Teleportation”! The full SIGGRAPH 2020 Real-Time Live! livestream will be available soon. Check back here or on our YouTube channel to watch.

SIGGRAPH 2020 Real-Time Live! (edited) on YouTube.
