Interactive Dynamic Video enables visual designers to poke and prod objects in videos to see how they would react in their environment.
As AR and VR devices become more commonplace, the ability to create a life-like environment is one of the main focuses of research. Back in 2015, Springwise wrote about an AR game that uses a smartphone to turn any space into a haunted house. Now, Abe Davis, a PhD student at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), has developed Interactive Dynamic Video, a program that allows viewers to interact with the objects in videos to see how they move.
Designers learn a lot about objects by touching and manipulating them: pushing and pulling them to see how they react. IDV analyses the tiny, almost invisible vibrations of an object captured on video and uses them to create a simulation that users can virtually interact with, enabling them to, in a sense, reach in and touch objects in a video.
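To make that idea concrete, here is a rough conceptual sketch in Python of the general principle, not CSAIL's actual code: it fabricates a motion signal standing in for the sub-pixel vibrations extracted from video, estimates the dominant vibration frequencies, and then superposes damped-oscillator responses to approximate how a tracked point might move after a virtual poke. Every name, value, and parameter in it is an illustrative assumption.

```python
# Conceptual sketch of vibration-based interaction (illustrative only).
import numpy as np

FPS = 240  # assumed high-speed capture rate

# --- Step 1: stand-in for per-frame motion extraction -----------------
# A real system would extract sub-pixel motion signals from video frames.
# Here we fabricate one: a point on an object vibrating at two natural
# frequencies, plus a little sensor noise.
t = np.arange(0, 5, 1 / FPS)
motion = 0.02 * np.sin(2 * np.pi * 3.0 * t) + 0.01 * np.sin(2 * np.pi * 7.5 * t)
motion += np.random.normal(0, 0.002, t.size)

# --- Step 2: find dominant vibration modes via the spectrum -----------
spectrum = np.abs(np.fft.rfft(motion))
freqs = np.fft.rfftfreq(motion.size, 1 / FPS)
# Pick the two strongest nonzero-frequency peaks as modal frequencies.
peak_idx = spectrum[1:].argsort()[-2:] + 1
modes = freqs[peak_idx]
print("estimated modal frequencies (Hz):", np.round(modes, 2))

# --- Step 3: simulate the object's response to a virtual poke ---------
# Treat each mode as a damped harmonic oscillator; superposing the modal
# impulse responses approximates how the object would move when pushed.
def impulse_response(f_hz, damping=0.05, duration=2.0):
    w = 2 * np.pi * f_hz
    ts = np.arange(0, duration, 1 / FPS)
    return np.exp(-damping * w * ts) * np.sin(w * np.sqrt(1 - damping**2) * ts)

# Displacement over time of the tracked point after a virtual push.
poke = sum(impulse_response(f) for f in modes)
```

In a full system, displacements like these would presumably drive a warp of the original video frames, so the pixels themselves appear to respond to the user's push.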
The technology’s most immediate application is in visual effects for filmmaking. At the moment, 3D modeling requires filmmakers to use green screens and to build detailed models of virtual objects that can be synchronized with live performances. IDV makes this much easier and also makes it possible for virtual objects to engage with the real world. To take one example, whilst Pokemon Go can drop virtual characters into real-world environments, IDV goes a step further by enabling those virtual objects (Pokemon included) to interact with their surroundings in specific, realistic ways, such as bouncing off the leaves of a nearby bush.
The development also has potential applications in architecture, where it could help determine whether buildings are structurally sound. Are there more ways this technology could be used to test the safety of prototype projects?