In Through the Looking Glass: Holographic Startup Wins Big on Kickstarter

A startup called Looking Glass Factory recently wrapped up a successful Kickstarter campaign, raising over USD 840,000. The company spent four years of research and development creating a holographic display unit for the masses, and earlier this year it produced an initial version of the product, called the "HoloPlayer One." The team comprises a small mix of optical, electrical and mechanical engineers, as well as Unity developers and 3D modeling artists. After further iteration and experimentation, the HoloPlayer One evolved into the holographic display unit you see pictured below, called the Looking Glass.

Behold, the Looking Glass holographic display from the Looking Glass Factory, a small startup which recently completed an unusually successful Kickstarter campaign. (Image courtesy of the Looking Glass Factory.)

To make the holographic display unit user-friendly, the company's software developers built an SDK called the HoloPlay Unity SDK. Backers of the Kickstarter campaign get access to the homegrown SDK to create their own “holographic apps.”

Features

The company is touting a Model and Animation Importer App that will eventually enable users to import 3D files such as OBJ, FBX, glTF and STL. For now, the beta version only allows users to import glTF, GLB and OBJ files. Backers of the Looking Glass also have access to dozens of free downloadable holographic apps to get a feel for what the HoloPlay Unity SDK can do. There are volumetric video clip apps, holographic film short apps, 3D scan viewer apps, CT-scan/DICOM importer apps and a 3D model previewer app for 3D printing, among others.
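For readers curious what importing those formats involves on the desktop side, here is a minimal sketch (not Looking Glass Factory's code) that loads a glTF/GLB, OBJ or STL file with the open-source trimesh library in Python and reports its basic geometry; the file name is hypothetical.

```python
# Minimal sketch: load one of the 3D formats mentioned above (glTF/GLB, OBJ, STL)
# with the open-source trimesh library. Illustrative only -- this is not
# Looking Glass Factory's importer.
import trimesh

def load_model(path):
    """Load a 3D file and report basic geometry stats before handing it to a viewer."""
    mesh = trimesh.load(path, force="mesh")  # collapse multi-part scenes into one mesh
    print(f"{path}: {len(mesh.vertices)} vertices, {len(mesh.faces)} faces")
    print(f"bounding box extents: {mesh.bounding_box.extents}")
    return mesh

if __name__ == "__main__":
    load_model("example.glb")  # hypothetical file name
```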


The Looking Glass in action on the “Tested” YouTube channel. Looking Glass Factory's CEO Shawn Frayne walks through the latest updates and explains how the company’s team of engineers combined volumetric and lightfield imaging to try to create realistic holograms. (Video courtesy of Tested YouTube Channel.)

The company is also working on a display application that contains viewports from Maya, ZBrush, Blender, Tinkercad, and SOLIDWORKS.

Cost, Dimensions, Specs and System Requirements

For USD 600, backers get an 8.9-inch model, and for USD 3,000, they get a 15.9-inch model. The first 100 units are slated to ship this month, with the rest shipping in December of this year.

Pictured above, the Looking Glass does support hardware configurations like Apple’s MacBook Pro Core i5 2.7 system, but overall, PCs are better equipped to handle it. The company recommends the following configuration: Windows 10 64-bit with an Intel Core i5 CPU or better, and an NVIDIA GTX 1060 GPU or better driving an output of 2560 x 1600 or higher. You’ll also need at least 4 GB of RAM and at least 128 GB of storage. (Image courtesy of Looking Glass Factory.)

Why Would Product Designers or Engineers Be Interested in Three-Dimensional Displays like the Looking Glass?

One glaring disadvantage springs to mind when comparing the Looking Glass to augmented reality or virtual reality for visualizing virtual prototypes: the size limitation. In augmented reality and virtual reality, designers and engineers can show others their large-scale models at a 1:1 ratio for a realistic experience of larger virtual prototypes.

But perhaps three-dimensional holographic display systems like the Looking Glass could be right for different applications in the military and medical sectors of global industry. Product designers could also use them, especially if their product fits within the dimensions of either the larger or smaller Looking Glass display. If you aren't familiar with holographic display technology, you might ask yourself: where did this technology come from?

Did Holographic Displays Evolve from Three-Dimensional Display Technology?

An early form of three-dimensional imaging technology from the Victorian age was the Wheatstone stereoscope. The observer could see a three-dimensional image by simultaneously observing two different images, one for each eye.

To view the images on a Wheatstone stereoscope, the observer looked through a viewing device in which the right eye observed one picture via a mirror while the left eye did the same with another picture. The two pictures were of the same object, taken from vantage points spaced about 65 millimeters apart, roughly the average distance between a human being’s eyes. (Image courtesy of Wikimedia.)

The two images were recorded either simultaneously, by a camera with two lenses spaced 65 millimeters apart, or one after the other, by a camera with a single lens mounted on a rail or slide bar. Though the result appeared three-dimensional, it lacked what’s known as motion parallax, in which the view changes as the observer moves.
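To see why two views spaced roughly 65 millimeters apart convey depth, consider the horizontal disparity between the left-eye and right-eye images: nearby objects shift more between the two views than distant ones. The sketch below runs that similar-triangles calculation with assumed pinhole-camera numbers; none of the values come from an actual stereoscope.

```python
# Illustrative pinhole-camera estimate of binocular disparity for a stereo pair.
# Baseline and focal length are assumed values, not stereoscope specifications.
BASELINE_MM = 65.0   # separation between the two viewpoints
FOCAL_MM = 50.0      # assumed focal length of each pinhole camera

def disparity_mm(depth_mm):
    """Horizontal shift of a point between the left and right images (similar triangles)."""
    return BASELINE_MM * FOCAL_MM / depth_mm

for depth in (500.0, 1000.0, 2000.0, 5000.0):
    print(f"object at {depth / 1000:.1f} m -> disparity {disparity_mm(depth):.2f} mm")
```

The disparity falls off with distance, and that falloff is the cue the brain fuses into a sense of depth.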

Another form of stereoscopic display, the lenticular display, became a popular prize in Cracker Jack boxes and found use in the music industry during the 1950s. This display uses a lenticular sheet, an array of narrow cylindrical lenses, placed over a set of interleaved images taken by a camera in motion. Horizontal motion parallax is achieved because, as the viewer moves, the lenses present different stereo pairs, each corresponding to a view recorded by the camera at a particular instant. (Image courtesy of Pictorial Productions.)
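As a toy model of that behavior, the sketch below maps a viewer's horizontal angle to one of a handful of interleaved views behind a lenticular sheet; the number of views and the viewing fan are made-up numbers, but the horizontal-only dependence is the point.

```python
# Toy model of a lenticular sheet: each cylindrical lenslet steers a narrow range of
# horizontal viewing angles to one of N interleaved image strips. Numbers are illustrative.
NUM_VIEWS = 8           # how many interleaved images were printed
TOTAL_FAN_DEG = 40.0    # assumed total horizontal viewing fan of the sheet

def view_index(viewer_angle_deg):
    """Return which interleaved view a viewer at this horizontal angle sees."""
    half_fan = TOTAL_FAN_DEG / 2.0
    clamped = max(-half_fan, min(half_fan, viewer_angle_deg))
    fraction = (clamped + half_fan) / TOTAL_FAN_DEG   # map the angle into [0, 1]
    return min(NUM_VIEWS - 1, int(fraction * NUM_VIEWS))

for angle in (-20, -10, 0, 10, 20):
    print(f"viewer at {angle:+d} deg sees view {view_index(angle)}")
```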

The lenticular display is another step in the right direction, but it is missing one thing: vertical motion parallax. Next in line is the fly’s eye display. With a fly’s eye display, users experience both vertical and horizontal parallax because the image of an object is recorded through a rectangular array of fly’s eye lenses onto a photographic plate while the camera is in motion. The photographic plate is illuminated from behind, and the observer sees a different stereo pair depending on where in the rectangular array of fly’s eye lenses it was captured. The fly’s eye display has a shortcoming: it is challenging to reproduce electronically, precisely because the display has to achieve both horizontal and vertical parallax at the same time.
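Conceptually, the fly's eye display is the lenticular sketch above extended to two axes: a rectangular array of lenslets selects a view based on both the horizontal and vertical viewing angles. A toy version, again with made-up numbers:

```python
# Toy model of a fly's eye (integral) display: a rectangular array of lenslets selects
# a view along BOTH axes, giving horizontal and vertical parallax. Numbers are illustrative.
VIEWS_X, VIEWS_Y = 8, 8
FAN_X_DEG, FAN_Y_DEG = 40.0, 40.0

def axis_index(angle_deg, fan_deg, n_views):
    """Map a viewing angle along one axis to a view index along that axis."""
    half = fan_deg / 2.0
    clamped = max(-half, min(half, angle_deg))
    return min(n_views - 1, int((clamped + half) / fan_deg * n_views))

def view_index_2d(angle_x_deg, angle_y_deg):
    return (axis_index(angle_x_deg, FAN_X_DEG, VIEWS_X),
            axis_index(angle_y_deg, FAN_Y_DEG, VIEWS_Y))

print(view_index_2d(10.0, -5.0))   # moving up or down now changes the view as well
```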

Occluded goggle displays, in which the observer wears glasses with polarizing shutters that separate the images for the right and left eyes, became popular and evolved into the head-mounted display unit, in which a separate screen for each eye (originally television screens, then LCD, LED and now OLED) is provided, and sensor technology combined with computing power tracks the head as it moves, providing full motion parallax. This is the same basic methodology used to create the HTC Vive and Oculus Rift. (Image courtesy of Autodesk.)

But holograms take a slightly different approach to forming three-dimensional images, and that is part of the reason why the Looking Glass holographic display might be an interesting, if not exceptional, achievement.

A hologram is recorded by splitting a single laser beam into two smaller beams: one falls directly on the holographic plate, while the other is reflected from the object onto the plate. This is achieved by bouncing the laser off a mirror and then through a beam splitter. The first split beam is reflected off another mirror and onto the holographic plate. The second split beam is directed toward the object by a different mirror, and the object reflects it onto the holographic plate. The first beam is known as the reference beam, and the second is the object beam.

Spatial filters clean up the light source and control the size of each beam, since a beam spreads as it propagates away from its source. “Tuning” the spatial filters for the object beam and reference beam creates interference between them, and the resulting interference pattern, recorded onto the holographic plate, documents the wavefronts that existed at the time of recording.
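In essence, the plate records the intensity of the summed reference and object fields, and that intensity carries their relative phase. The sketch below is a one-dimensional plane-wave toy model of that recording step, with an assumed laser wavelength and beam angle rather than any real setup's parameters.

```python
# Toy 1D model of hologram recording: the plate records the intensity of the reference
# beam plus the object beam, which encodes their relative phase as fringes.
# Wavelength and beam angle are assumed values.
import numpy as np

wavelength = 633e-9                       # assumed HeNe laser wavelength, meters
k = 2 * np.pi / wavelength                # wavenumber
x = np.linspace(0, 50e-6, 2000)           # positions across a small patch of the plate

theta = np.radians(10.0)                  # assumed angle between reference and object beams
reference = np.exp(1j * k * np.sin(theta) * x)   # tilted reference plane wave
object_beam = 0.7 * np.exp(1j * 0.0 * x)         # simplified on-axis "object" wave

intensity = np.abs(reference + object_beam) ** 2  # what the plate actually records
fringe_spacing = wavelength / np.sin(theta)
print(f"fringe spacing of the recorded pattern ~ {fringe_spacing * 1e6:.2f} micrometers")
```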

The interference pattern recorded onto the holographic plate can later be illuminated by different types of light (monochromatic laser light, white light, etc.) and viewed by the observer on the opposite side of the plate. This “reconstruction beam” may contain different wavelengths of light, and for each wavelength, a different image is created. The observer then sees the different images appearing at different locations, with full motion parallax.
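One way to picture why each wavelength lands in a different place is to treat the recorded fringes as a diffraction grating. Continuing the toy numbers from the recording sketch above, the first-order grating equation steers each wavelength of the reconstruction beam to a slightly different angle:

```python
# Toy reconstruction calculation: the recorded fringes act like a diffraction grating,
# so each wavelength in the reconstruction beam is steered to a different angle.
# The fringe spacing is taken from the assumed numbers in the recording sketch above.
import numpy as np

fringe_spacing = 3.65e-6   # meters, from the toy recording example

for wavelength_nm in (450, 550, 633):
    wavelength = wavelength_nm * 1e-9
    angle = np.degrees(np.arcsin(wavelength / fringe_spacing))  # first-order grating equation
    print(f"{wavelength_nm} nm reconstructs at roughly {angle:.1f} degrees")
```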

There’s a lot more to the evolution, enough for a book on the subject, but if you had to create a list of the kinds of technology and methodology that went into the Looking Glass (since the company does not list them), it might look like this:

Possible Components and Tech of the Looking Glass Holographic Display Include:

- A flat-panel projection display
- A spatial light modulating apparatus to display stereoscopic images
- A far-field display
- A diffractive beam expander and a virtual display based on a diffractive beam expander
- An image displaying apparatus
- A laser scanning virtual image display
- Directional flat illuminators
- A stereoscopic display apparatus
- Reconfigurable spatial light modulators
- A new methodology for creating 3D images based on random constructive interference
- A light-modulation device for tracking users
- A new method for illuminating computer-generated 3D models

There would also likely be a number of patents used, including one for a proprietary hologram panel and one for a method of manufacturing the hologram panel.

Early units of the Looking Glass start shipping this month, so I’m sure we’ll hear more about it as the story and product continue to develop. It's hard to tell if this holographic display unit is really something special. Like all things related to augmented reality and virtual reality, you simply have to try it.