NYU Holodeck Receives National Science Foundation Funding

Coming soon to NYU? (Image courtesy of Paramount Pictures/CBS Studios.)

An interdisciplinary team of researchers from New York University has received a $2.9M grant from the National Science Foundation (NSF) to develop a well-integrated software/hardware instrument incorporating visual, audio, and physical (haptics, objects, real-time fabrication) components, known as the NYU Holodeck.

“Our goal is to create an immersive, collaborative, virtual and physical research environment with unparalleled tools for intellectual and creative output,” said NYU associate professor Winslow Burleson.

The NYU Holodeck is a collaboration of researchers from across NYU’s schools and colleges, with interdisciplinary backing and an additional $1.2M cost share from NYU. The team is composed of researchers from the NYU-X Lab, based at NYU’s Rory Meyers College of Nursing; the Courant Institute of Mathematical Sciences; the Steinhardt School of Culture, Education, and Human Development; and the Media and Game Network (MAGNET), based at the NYU Tandon School of Engineering.

Among the collaborators is Ken Perlin, professor of computer science at the NYU Media Research Lab. In 1997 he received a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences for the development of Perlin Noise, a technique used to produce natural-appearing textures on computer-generated surfaces for motion picture visual effects.
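For readers unfamiliar with the technique, the sketch below illustrates the general idea of gradient noise in the spirit of Perlin Noise: pseudo-random gradients are assigned to lattice points and blended with a smooth fade curve to produce continuous, natural-looking variation. It is a minimal Python illustration of the concept, not Perlin’s reference implementation; the permutation table, gradient set, and sample patch are arbitrary choices for demonstration.

    import math
    import random

    # Minimal, hypothetical sketch of 2-D gradient ("Perlin-style") noise.
    # It illustrates the general technique, not Ken Perlin's reference code.

    random.seed(0)

    # Permutation table used to hash lattice coordinates to gradient directions.
    perm = list(range(256))
    random.shuffle(perm)
    perm += perm  # duplicate so indexing never runs out of range

    # A small set of unit gradient directions to assign to lattice corners.
    gradients = [(math.cos(2 * math.pi * i / 8), math.sin(2 * math.pi * i / 8))
                 for i in range(8)]

    def fade(t):
        # Smoothing curve 6t^5 - 15t^4 + 10t^3, so blending is seamless at lattice points.
        return t * t * t * (t * (t * 6 - 15) + 10)

    def lerp(a, b, t):
        return a + t * (b - a)

    def corner_influence(ix, iy, x, y):
        # Dot product of the corner's pseudo-random gradient with the offset vector.
        gx, gy = gradients[perm[perm[ix & 255] + (iy & 255)] % 8]
        return gx * (x - ix) + gy * (y - iy)

    def noise2d(x, y):
        # Lattice cell containing (x, y) and the smoothed fractional position inside it.
        x0, y0 = math.floor(x), math.floor(y)
        u, v = fade(x - x0), fade(y - y0)
        # Blend the four corner contributions into a smooth value in roughly [-1, 1].
        n00 = corner_influence(x0,     y0,     x, y)
        n10 = corner_influence(x0 + 1, y0,     x, y)
        n01 = corner_influence(x0,     y0 + 1, x, y)
        n11 = corner_influence(x0 + 1, y0 + 1, x, y)
        return lerp(lerp(n00, n10, u), lerp(n01, n11, u), v)

    if __name__ == "__main__":
        # Sample an 8x8 patch of noise values, e.g. to drive a procedural texture.
        patch = [[round(noise2d(i * 0.37, j * 0.37), 3) for i in range(8)]
                 for j in range(8)]
        print(patch[0])

Layering several such noise samples at different frequencies is the usual way such values are turned into the cloud-, marble-, or terrain-like textures the award recognized.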

“The NYU Holodeck will provide a compelling opportunity to explore and advance new types of science, permitting researchers from diverse disciplines to interact with theoretical models, real objects, robots and agents, engendering insights that may not be possible using current 2-D and 3-D representations and analytic techniques,” said Perlin.

The Holodeck project is an amalgamation of the collaborators’ prior work in disparate areas, ranging from Burleson’s astronaut robot mission simulators, affective learning companions, and teachable robots designed to help teach schoolchildren geometry, to Ken Perlin’s Holojam, a virtual reality experience that entails drawing and collaborative interactions.

“One of our collaborators, Agnieszka Roginska, PhD, clinical associate professor of music and music education, has a high-end Music and Audio Research Lab (MARL),” said Burleson. “We do a lot of motion capture work in the game development department,” Perlin added. “So individual pieces of the Holodeck already exist. It’s our intention to fuse these components into an integrated instrument.”

The NYU Holodeck has drawn comparisons to CERN’s particle accelerator in its potential to create new insights into fundamental natural phenomena, offering new tools for intellectual and creative output across disciplines. Others have likened the Holodeck’s flexibility, versatility, and power to that of Marvel’s Iron Man.

Imagine a biochemist working with a social robot, collaborating to create new forms of synthetic life: robotic arms print a DNA helix while a remote musician plays scales on the emerging base-pair xylophone, adding a transdisciplinary perspective to the evolution of future life. (Image courtesy of NYU-X Lab.)

A comprehensive Data Management Plan accompanies the proposal. The goal is to give every researcher the ability to discover and interact with datasets across all research sites. To that end, the Holodeck equipment and nodes will be connected via NYU’s high-speed internal data links (10 Gbps or faster) and Internet2 to permit seamless exchange of data and simulations.

The Holodeck team will use distributed representational state transfer (REST) data services both to support internet access and to allow service chaining with data-processing and analytical services, as well as direct use for visualizations, sound, and haptic interaction. This structure will allow collaborators to create data and analytical workflows that can be validated and reproduced.
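As a rough illustration of what such service chaining might look like, the Python sketch below fetches a dataset from one RESTful service, hands it to a downstream analytical service, and records enough provenance for the run to be validated and reproduced. The endpoint URLs, payload fields, and run_workflow helper are hypothetical placeholders, not actual Holodeck APIs.

    import requests  # third-party HTTP client, used here purely for illustration

    # Hypothetical endpoints; the real Holodeck data services are not described publicly.
    DATA_SERVICE = "https://data.example.nyu.edu/api/datasets"
    ANALYSIS_SERVICE = "https://analysis.example.nyu.edu/api/jobs"

    def run_workflow(dataset_id: str) -> dict:
        """Chain two RESTful services and return a record that makes the run reproducible."""
        # Step 1: retrieve the dataset from the data service.
        dataset = requests.get(f"{DATA_SERVICE}/{dataset_id}", timeout=30).json()

        # Step 2: hand the dataset to a downstream analysis service (service chaining).
        job = requests.post(ANALYSIS_SERVICE, json={"input": dataset}, timeout=30).json()

        # Step 3: log enough provenance (dataset id, services used, job id) that
        # collaborators can rerun the same workflow and validate its results.
        return {
            "dataset_id": dataset_id,
            "services": [DATA_SERVICE, ANALYSIS_SERVICE],
            "job_id": job.get("id"),
        }

    if __name__ == "__main__":
        print(run_workflow("sample-dna-helix-scan"))

The same pattern extends to visualization, sound, or haptic endpoints: each step is just another REST call whose inputs and outputs are logged as part of the workflow record.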

For more real-life Star Trek technology, find out how engineers developed a working tricorder.