Siggraph 2016: The Scoop

Our rundown of Siggraph 2016 - the 43rd International Conference and Exhibition on Computer Graphics & Interactive Techniques.

Siggraph 2016 - the 43rd International Conference and Exhibition on Computer Graphics & Interactive Techniques - was held in Anaheim, California, from July 24th to the 28th. Four members of Team Pixvana attended the convention and gathered their highlights to share with our community: Bill Hensler (CPO), Sean Safreed (CMO), Scott Squires (CTO), and Travis Dorschel (VR Computer Vision Engineer).

VR Village

In the VR Village, attendees participated in VR demos and explored the possibilities of immersive storytelling in far-reaching applications, from health and education to design and gaming. One challenging aspect of the Village was that most of the interactions required tickets and, even then, you could wait quite a long time to actually get into an exhibit and see an experience.

Synesthesia Suit

The Synesthesia Suit provides an immersive, embodied experience in a virtual reality environment, with vibro-tactile sensations across the entire body. Each vibro-tactile actuator provides more than a simple vibration - instead, it delivers haptic sensations based on TECHTILE technology. It was a pretty cool demonstration of full-body haptic feedback, using a Vive and a custom suit to feel the interaction of objects in space, paired with sound generation. Though it was a pretty clumsy contraption, it definitely points the way to full-body VR immersion.

Pearl

Pearl is a short film from Google Spotlight Stories that follows a girl and her dad as they crisscross the country in a beloved hatchback, chasing their dreams. It's a story about the gifts we hand down and their power to carry love, and finding grace in the unlikeliest of places.

You can watch the film via YouTube 360; however, Team Pixvana highly recommends viewing it on the HTC Vive for the most compelling experience.

Real Time Live!

Real Time Live! showcased a number of interesting new methods for VR production in live sessions. Here are a few highlights:

From Previs to Final in Five Minutes: A Breakthrough in Live Performance Capture: Epic Games teamed up with Ninja Theory, Cubic Motion, and 3Lateral to create the world's first believable digital human driven live by an actress within an Unreal Engine game world. This won the prize for best real-time presentation and points the way to quickly creating immersive, character-driven 3D worlds and stories.

ILMxLAB Augmented Reality Experience: Using an iPad, the Vive motion-tracking system, ILM's Zeno application framework, and hardware-accelerated video encoding, Industrial Light & Magic has created an augmented reality experience that allows you to see and interact with characters from the Disney and Marvel universes. This was impressive for achieving room-scale tracking with a Vive controller and a linked iPad for AR display. It was a bit glitchy, and the content was kind of boring (Ant-Man?!), but it shows the way to AR.

Quill: VR Drawing in the Production of Oculus Story Studio's New Movie "Dear Angelica": The film has a unique illustrative look, created with a tool that allows artists to paint scenes in VR using Oculus Touch. This was presented by Inigo Quilez, the wizard behind the Quill software, and Wesley Allsbrook, the illustrator, who showed Quill in real time. Very impressive visuals!

Nvidia: A Siggraph sponsor, Nvidia provided a partner table for Pixvana at their LimeLight event. Travis, Scott, and Sean attended and presented our XR Guide: Field of View Adaptive Streaming.

3D Video Demos: While videogrammetry is improving with tools from the likes of 4D Views and Microsoft Research, it still has a ways to go.

4D Views: This French company showed results from their 360 capture system for transforming video into 3D models and animated textures (the term of art is videogrammetry). The results were good, with only slightly rough edges.

Live Intrinsic Video: This was a technical presentation of Microsoft's technology for doing real-time videogrammetry - the same technology behind their holoportation demo with the HoloLens AR goggles. The technique creates moving 3D models using six cameras (plus structured-light emission similar to the Kinect). The result is very impressive and is definitely state of the art for real-time 3D videogrammetry.

On the last day, Sean presented by invitation in the theater at the Nvidia booth - he introduced our FOVAS (Field of View Adaptive Streaming) solution and talked a little bit about the SuperNova system in the cloud. You can check out the demo and view the video here.
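
For the curious, here's a minimal sketch of the general idea behind field-of-view adaptive streaming: encode the 360 video as several variants, each concentrating quality around a different viewing direction, and have the player switch to whichever variant best covers where the viewer is looking. This is an illustrative toy in Python, not our actual SuperNova implementation - the variant layout, function names, and parameters are assumptions made for the example:

    # Hypothetical pre-encoded variants: the center yaw (in degrees) of the
    # high-quality region of each stream. Six variants cover the full circle.
    VARIANT_YAWS = [0, 60, 120, 180, 240, 300]

    def angular_distance(a, b):
        # Smallest absolute difference between two angles, in degrees.
        d = abs(a - b) % 360
        return min(d, 360 - d)

    def pick_variant(head_yaw):
        # Choose the stream whose high-quality region is closest to where
        # the viewer is currently looking.
        return min(range(len(VARIANT_YAWS)),
                   key=lambda i: angular_distance(head_yaw, VARIANT_YAWS[i]))

    # As the viewer turns their head, the player switches streams so that
    # full quality always lands inside the current field of view.
    for yaw in (0, 45, 95, 200, 350):
        print("head yaw %3d deg -> variant centered at %d deg"
              % (yaw, VARIANT_YAWS[pick_variant(yaw)]))

In practice the switching would be driven by live HMD pose data and the usual adaptive-streaming machinery (bitrate ladders, segment boundaries), but the core selection logic really is that simple.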

See you next year at Siggraph 2017!
