How Epic & ILM’s John Knoll Tried to Recreate the Moon Landing for Microsoft’s Build 2019 Keynote

In the end, it wasn’t meant to be. Microsoft had pulled out all the stops for its Build 2019 developer conference keynote Monday morning. The company had partnered with Epic Games and Industrial Light & Magic chief creative officer John Knoll for a hugely ambitious demo of its HoloLens 2 headset that aimed to recreate the Apollo 11 moon landing, 50 years after the fact, in mixed reality.

All had gone well during multiple rehearsals over the preceding days. But when Knoll and science journalist Andrew Chaikin, author of “A Man on the Moon,” were set to go on stage Monday, the demo just didn’t run. Microsoft stalled by extending its pre-show Imagine Cup competition until the show’s moderator ran out of questions to ask. Then Knoll and Chaikin went on stage and gave it another go, but the mixed reality overlays simply refused to appear.

Knoll and Chaikin during a May 5 rehearsal for the demo.

“Well, it seems that doing a live demo is actually harder than landing on the moon,” Chaikin quipped, and the duo walked off stage.

The failed demo wasn’t just an awkward way to start Microsoft’s annual event; it was also a sad ending to what could have been a major milestone on multiple fronts: The roughly seven-minute mixed reality presentation was supposed to be an impressive demonstration of the capabilities of the HoloLens 2 headset, a celebration of the new partnership between the tech giant and Epic Games, and, for one dedicated lunar enthusiast, the culmination of a labor of love 20 years in the making.

Despite ultimately failing on stage, it’s a story that deserves to be told.

A childhood dream, and lots and lots of data

John Knoll was only 7 years old in July of 1969, but he still very much remembers seeing the moon landing on television. “We had a big party at the house,” Knoll recalled in a recent interview with Variety. “It’s still a very vivid memory in my mind.”

In fact, the magic of the moment was so intense that it stayed with Knoll for decades, influencing a career that led to him becoming a visual effects supervisor on a number of “Star Wars” and “Star Trek” movies. All the while, Knoll kept seeking out videos and other source material from the Apollo 11 mission, only to be disappointed by the scarcity of visuals and the low quality of the sole camera capture of the spacecraft’s descent.

A photo from the May 5 rehearsal.

When the world celebrated the 30th anniversary of the moon landing 20 years ago, Knoll stumbled across a website that hosted a lot of the raw telemetry data from the Apollo 11 mission. “I found it really gripping,” he recalled. Wading through the data, he realized that it could be used to create a visual effects version of the moon landing. “I was already starting to visualize that in my mind,” he said. Excited by the possibilities, Knoll came to a realization: “Somebody should do it. I should do it.”

Thus began a hobby project in which Knoll used telemetry data and other publicly available records to build a visualization of the moon landing. He started working on the project back in 1999 and even presented some of the early results at an industry conference a few years later. However, work frequently forced him to take breaks; at one point, Knoll didn’t touch the project for several years. “I worked on this in fits and starts,” he said.

Then, eight years ago, he had a bit of a breakthrough. NASA’s Lunar Reconnaissance Orbiter started sending back 3-D imaging data from the moon’s surface, including high-resolution scans of the Apollo 11 landing site. Combined with the telemetry data he had already been working with, Knoll could suddenly visualize the critical moments of the Apollo 11 touchdown in vivid detail.
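How does raw telemetry turn into moving pictures? In broad strokes, time-stamped samples of the lander’s position are interpolated so that a renderer can place the spacecraft at any moment of playback. The Python sketch below illustrates that idea; the file name, column names, and units are invented for illustration and bear no relation to Knoll’s actual pipeline.

```python
# A minimal sketch of interpolating time-stamped telemetry samples so a
# renderer can place the lander at any playback time. The CSV layout here
# is hypothetical; real Apollo 11 telemetry is far richer than this.
import bisect
import csv

def load_telemetry(path):
    """Read (time_s, altitude_m, downrange_m) rows from a hypothetical CSV."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append((float(row["time_s"]),
                            float(row["altitude_m"]),
                            float(row["downrange_m"])))
    samples.sort(key=lambda s: s[0])
    return samples

def lander_state(samples, t):
    """Linearly interpolate the lander's (altitude, downrange) at time t."""
    times = [s[0] for s in samples]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return samples[0][1:]       # before the first sample: clamp
    if i >= len(samples):
        return samples[-1][1:]      # after the last sample: clamp
    (t0, a0, d0), (t1, a1, d1) = samples[i - 1], samples[i]
    u = (t - t0) / (t1 - t0)        # fraction of the way between samples
    return (a0 + u * (a1 - a0), d0 + u * (d1 - d0))
```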

From a breakfast to a live demo in four months

Earlier this year, Knoll shared some of his progress with his friend Kim Libreri, the chief technology officer of Epic Games. These days, Libreri’s company is best known for making “Fortnite.” However, Epic is also the developer of the Unreal game engine, which developers have long used to power real-time animation in their video games. More recently, Epic has been branching out, working with ILM and others to bring real-time animation to Hollywood, as well as to enterprise customers across a wide range of industries.

Knoll’s Apollo 11 project seemed like a great way for the company to demonstrate the capabilities of Unreal, and the two decided to team up and use the game engine for a real-time version of the moon landing. “It’s cool to relive the moon landing,” admitted Libreri. But the goal of the collaboration was also to show that Unreal could be used for scientifically accurate visualization that can be manipulated in real time without compromising on visual quality. Said Libreri: “We wanted to make sure the quality is up to his eye.”

Microsoft joined the project as a partner when Epic and the software giant decided to bring Unreal support to the HoloLens 2. From there, things suddenly started to move very fast, and Knoll’s labor of love was turned into a demo to open the Microsoft Build keynote within a mere four months.

HoloLens graphics on steroids, thanks to pixel streaming

Microsoft officially introduced the second version of its HoloLens headset in February. Coming three years after the launch of the original HoloLens, the new version features a number of significant improvements, including a wider field of view and advanced hand gestures, some of which Knoll and Chaikin meant to demonstrate on stage Monday while showing off different parts of the Saturn V rocket. The headset also makes use of spatial anchors, which can best be described as shared reference points that let multiple devices place virtual objects at the same spot in the real world.
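Conceptually, a spatial anchor is little more than an identifier paired with a real-world pose, which one device can publish and another can resolve so that both show a hologram in the same physical spot. The following Python sketch illustrates the concept; it is not the actual HoloLens API, and the in-memory store and function names are hypothetical stand-ins for what would be a cloud service in practice.

```python
# A conceptual sketch of what a spatial anchor represents. Not the real
# HoloLens API: the shared store below is a plain dict for illustration.
from dataclasses import dataclass

@dataclass
class SpatialAnchor:
    anchor_id: str
    position: tuple   # (x, y, z) in meters, in a shared world frame
    rotation: tuple   # orientation as a quaternion (x, y, z, w)

# Hypothetical shared store; in practice this would be a cloud service.
anchor_store = {}

def publish(anchor: SpatialAnchor) -> None:
    """Device A registers an anchor so other devices can find it."""
    anchor_store[anchor.anchor_id] = anchor

def resolve(anchor_id: str) -> SpatialAnchor:
    """Device B looks up the same anchor and aligns its hologram to it."""
    return anchor_store[anchor_id]

# Two headsets resolving "saturn-v-stage-1" would render the rocket stage
# at the same physical location on the stage floor.
publish(SpatialAnchor("saturn-v-stage-1", (0.0, 1.2, 2.5), (0, 0, 0, 1)))
print(resolve("saturn-v-stage-1"))
```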

But the most significant advancement behind the moon landing demo was support for pixel streaming, a feature of Epic’s Unreal engine that allows HoloLens headsets to display high-quality visuals that are rendered remotely.

A photo from the May 5 rehearsal.

“I’m super proud of what HoloLens can do,” said Microsoft technical fellow Alex Kipman, who leads the company’s mixed reality efforts, in a recent interview with Variety. But ultimately, the HoloLens is a mobile device, and as such it can pack only so much computing power. “The strategy for mobile processing is decimation,” he explained. “You dumb it down, you remove the polygons.”

HoloLens 2 can handle around 100,000 polygons, which is enough for games, but not for applications with much more visual data, like the moon landing scene that was meant to be shown off on Monday. With pixel streaming, HoloLens can suddenly run applications with up to 100 million polygons, which are rendered remotely on computers or servers optimized for graphics processing. “This is a whole new ballgame,” Kipman said.
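Those two figures frame the tradeoff neatly. As a back-of-the-envelope illustration, the Python sketch below decides between on-device rendering and remote rendering based on a scene’s polygon count; the budget constants come from the numbers above, while the function and its return values are purely illustrative and not part of any real HoloLens or Unreal API.

```python
# A back-of-the-envelope sketch of the tradeoff Kipman describes. The budget
# figures come from the article; the function is illustrative only.
LOCAL_POLYGON_BUDGET = 100_000        # roughly what HoloLens 2 handles on-device
REMOTE_POLYGON_BUDGET = 100_000_000   # achievable when frames are rendered remotely

def choose_renderer(scene_polygons: int) -> str:
    """Decide whether a scene fits on-device or needs pixel streaming."""
    if scene_polygons <= LOCAL_POLYGON_BUDGET:
        return "render on-device"
    if scene_polygons <= REMOTE_POLYGON_BUDGET:
        # Render on a GPU server and stream the encoded frames to the
        # headset, which composites them against the real world.
        return "pixel-stream from a remote GPU"
    return "decimate the model first"

print(choose_renderer(80_000))       # simple game asset -> on-device
print(choose_renderer(60_000_000))   # detailed lunar scene -> pixel streaming
```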

“The pixel streaming gives people a taste of the future,” agreed Libreri. “It looks awesome, it looks amazing.”

A commitment to openness

Adding features like pixel streaming to HoloLens via Unreal makes the headset a lot more appealing to enterprise customers with large data sets, but getting the two companies to collaborate wasn’t actually that easy. Epic didn’t support the first generation of the HoloLens due to what Epic CEO Tim Sweeney, in a recent interview with Variety, called “a structural disagreement.” In essence, Epic shunned HoloLens 1 because of a lack of openness.

Kipman agreed that his industry as a whole had moved toward closed platforms with the emergence of mobile computing and exclusive app stores a few years back. “We closed the ecosystem,” he admitted. But he also argued that Microsoft had seen the light since Satya Nadella took over as CEO some five years ago. Microsoft, the longtime arch-enemy of open software, suddenly began to contribute to open source projects, embraced Linux, and even acquired the open source code repository GitHub.

More recently, Kipman has tried to bring the same embrace of openness to mixed reality. He began talking to Epic last summer, trying to convince the company to bring Unreal to the HoloLens. “It was up to me to earn their trust,” he said.

All this culminated in Microsoft’s launch of HoloLens 2 at Mobile World Congress in February, which included a commitment to an open browser, multiple app stores, and APIs based on open standards. Said Sweeney: “Users will have a choice among stores.”

Getting ready to support billions of AR devices

This new commitment to openness was also a key reason for Sweeney to join Kipman on stage for the HoloLens 2 announcement. Another was that both companies wanted to show that they’re in it for the long haul. “In 10 or 15 years, there will be billions of AR devices,” said Sweeney. And Epic and Microsoft don’t want to wait around to get there. “It’s a great strategy to pursue this now,” he said.

Microsoft’s AR efforts have thus far focused squarely on the enterprise. HoloLens 2 costs $3,500 and is clearly not a consumer device, as Kipman freely admitted. “The consumer is in our future,” he said. “It is not now.”

A photo from the May 5 rehearsal.

However, the partnership is also meant to signal that this future may not be that far out, with Microsoft executives proclaiming that the path to consumer AR will be measured in years, not decades. Ultimately, the differences between those two markets may become meaningless. “This isn’t about enterprise or consumer. It’s about the future of all computing,” said Sweeney.

Getting to that future will require a series of steps that go beyond just building better headsets or smart glasses; it will also require ubiquitous visual information layers, with all their implications for everyday life. Said Sweeney: “This is a science fiction world we are talking about.”

Like deGrasse Tyson, but live

It’s almost as if the future of AR is a bit like the moon landing itself: a huge technical feat that sounded like science fiction not long before it became reality. And just like in space travel, things can go wrong when technology fails to cooperate.

“There are inherent risks in using pre-release technology to deliver a live demo in an unpredictable environment,” an Epic spokesperson told Variety Monday afternoon. “Although we were unable to show the Apollo 11 experience onstage today, we’re excited to help others understand the potential of using HoloLens 2 to learn and share stories in entirely new ways that have never been possible until now.”

If anything, Monday’s failed demo showed that getting AR right can be more complex than one might think. However, the demo, as captured during rehearsals, also showed the potential for AR and game engines to popularize science, with Epic director of advanced projects Francois Antoine likening it to the presentation style of Neil deGrasse Tyson, down to the scientific accuracy of the visualized landing.

“It’s the actual motion curve and the speed of the descent,” said Antoine. However, unlike something deGrasse Tyson might do, the objects were supposed to appear in front of the presenters, ready to be manipulated in real time. “This is all live and it’s actually happening,” said Knoll.

Until it’s not, one might add. But even before Monday’s demo failed, Knoll was certain that he wasn’t quite done with the Apollo 11 mission just yet, telling Variety: “I’d love to do an extended version.”
