VRguy podcast Episode 24: Nick Whiting, Technical Director of VR/AR at Epic Games

My guest today is Nick Whiting, Technical Director of VR and AR at Epic Games. This episode was recorded on Aug 16th, 2017.

Nick and I talk about Robo Recall, the unique challenges of VR and AR for game engines, what Epic learned from enterprise customers and much more.

 


Yuval Boger (VRguy):     Hello Nick, and welcome to the program.

Nick Whiting:    Thanks for having me.

VRguy:  Who are you and what do you do?

Nick:      My name is Nick Whiting, and I’m the technical director of AR, VR, and XR at Epic Games. I’m in charge of all the acronyms in the group. Yeah.

VRguy:  You’ve been doing this for how long?

Nick:      I’ve been active in VR for about four and a half years now. I basically got involved when Oculus was getting their Kickstarter ramped up. They said they had a cool little piece of hardware, and they needed some pretty visuals to show on it. After hours, I started hooking up our engine, Unreal Engine 4, to support Oculus, and the rest is history, as they say.

VRguy:  Now that you’re four and a half years into it, what’s unique about running VR in a game engine?

Nick:      I think now that we’re four and a half years into it, we’ve come to the point of maturity where we can stop making aggressive catch-up to the hardware our primary day-to-day task. The SDKs from the major manufacturers have matured and they’ve released commercial products, and other more specialized manufacturers like Sensics have a much firmer understanding of what the space means. We’re in the exciting portion where we get to experiment with what’s possible and what the potential of VR as a medium is. I’m pretty excited that now that we’ve done all the heavy lifting to get the car built, we can actually take it for a test drive and start experimenting.

VRguy:  What are the open issues for you in terms of VR and the game engine?

Nick:      Right now, I think the biggest thing is that everybody can always use more performance. Performance is unlike traditional games: with VR, you have the potential to make people sick if your product isn’t performing well, and hitting that bar is very, very hard. As a game industry, we’re just getting used to shipping standard 1080p high-def games at 30 and 60 frames per second. Now all of a sudden, you have to render at even higher resolutions, at even higher frame rates. It’s a hard problem. Even internally with Epic’s own games like Robo Recall, we have a team of experts that are constantly looking at performance. That’s their number one problem at the moment. We keep trying to innovate and come up with new ways to make it easier to actually hit the performance requirements of VR.
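
To put rough numbers on that gap, here is a minimal back-of-the-envelope sketch. It assumes the 2160x1200 combined panel resolution of first-generation Rift and Vive headsets; real VR pipelines also supersample before lens distortion, so the true cost is higher still:

```cpp
#include <cstdio>

int main() {
    // Frame budget shrinks as the refresh rate climbs.
    const double rates_hz[] = {30.0, 60.0, 90.0};
    for (double hz : rates_hz) {
        std::printf("%5.0f Hz -> %5.2f ms per frame\n", hz, 1000.0 / hz);
    }

    // Rough pixel throughput: a 1080p game at 60 Hz versus a
    // 2160x1200 HMD panel at 90 Hz (ignoring supersampling).
    const double flat_px_per_s = 1920.0 * 1080.0 * 60.0; // ~124M px/s
    const double hmd_px_per_s  = 2160.0 * 1200.0 * 90.0; // ~233M px/s
    std::printf("VR needs roughly %.1fx the pixel throughput\n",
                hmd_px_per_s / flat_px_per_s);
    return 0;
}
```

At 90 Hz the budget works out to about 11.1 ms per frame, which is the figure Nick quotes later when discussing the Weta Digital collaboration.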

VRguy:  How much of that comes from the game engine companies versus the hardware vendors, Valve or Oculus or others, in terms of performance optimizations?

Nick:      It’s actually nice in VR, I’ve noticed, compared to other, more traditional parts of the gaming industry. Everybody’s very interested in the medium itself succeeding, so everybody’s been very open. Valve and Oculus and so many others have all been very forthcoming with what they think are the best ways to get optimizations. They’ll do prototyping in our engine, give us the code, and share it freely with anybody, so that something developed at one company can be shared with all the different companies, because it’s in their best interest to see the medium as a whole succeed.

                That’s actually been one of my favorite parts about working on VR. Unlike the games industry, where sometimes it’s like we’re guarding the launch codes for the missiles with the amount of secrecy that we have, the VR industry has been much more forthcoming and open. At Epic, we feel like we need to give people a good baseline out-of-the-box experience so that they have the potential to write performant code, and then we try to work with them through education and examples and giving out our content, like the Robo Recall mod kit, to show people how we set things up to do so efficiently and stay within performance boundaries while still making things look good.

VRguy:  Game engines have grown over the years, physics is in the game engine, there’s more audio in the game engine. Where do you draw the boundaries? Do you anticipate a, I don’t know, object recognition kit in the game engine, or an eye-tracking analysis stack, or a SLAM algorithm?

Nick:      The way that we try to do it is to build APIs to let people plug in whatever they’re good at making. I don’t think there’s any way that a company like Epic, with a few hundred employees, could compete with somebody like Google or Facebook on problems like object recognition or deep learning. We just don’t have the resources to do that, but what we want to enable is for those companies to take our product and use it, not necessarily as a game engine but as a real-time rendering engine in a much more general sense, and be able to put their technology into ours. I think that’s where some of the most interesting collaborations that we have come into play.

VRguy:  Now you have me curious. Could you give a few examples of interesting collaborations that you could talk about?

Nick:      One of the first interesting collaborations that we did in VR specifically was with Weta Digital. Those are the guys that do visual effects for the film industry, and some of their more famous work was on The Hobbit and The Lord of the Rings. We actually worked with them to take their pipeline for creating their ultra-realistic, physically simulated dragon from The Hobbit movies and get it running at 90 frames a second. That was a really interesting collaboration, because they’re used to sending things off to a gigantic render farm, and hours or days later, sometimes, a result comes out the other end. With VR, we have 11 milliseconds to render something. We did a lot of work with them.

                I learned a heck of a lot about the visual effects pipeline for film and how we could take the things that they’re very good at, which is realism and accuracy and leveraging these vast amounts of computing power to make really detailed simulations, and then take our knowledge from the games industry and make those simulations render efficiently. That’s been one of my favorite collaborations, just because Hollywood has always seemed a bit magical to me, and it was nice to have a peek behind the curtains.

VRguy:  Right. Now, you guys make your source code visible. I don’t know if you call it open or open-source or not, but people can see the source code.

Nick:      Yeah. The source code is available: basically, if you go to unrealengine.com, sign up for an account, and link your GitHub account, you have access to everything we have. We started doing that a couple of years ago at GDC, and honestly, the first time we did it was terrifying, because all of a sudden, your baby is laid bare in front of the entire world, and you don’t know what people will think or what they’ll do. What we found is that it’s actually opened the door to so many more industries for us, especially with AR and VR. There’s been such an interest in what we call enterprise applications, which is basically everything that’s not games. Having the source open and out there lets them hook up the interesting technology that they have, like the film industry, for example, or engineering firms.

                They take those and really use the engine in ways that we haven’t previously envisioned. It’s been great for our business and it’s been great for diversifying what we’ve done. It’s one of those things that’s gotten me very excited again. When you make games for years and years on end, you’re solving a lot of similar problems, but all of a sudden, these new problem spaces that you haven’t even thought about before open up. It lends itself to much more interesting, new challenges.

VRguy:  How much of your time do you spend with industry, meaning outside consumer VR or gaming?

Nick:      It highly depends on the week. It goes back and forth, but I would say that more and more we’re trending towards non-gaming, more enterprise applications. For one thing, a lot of people from enterprise aren’t used to doing real-time interactive products. They’re very much used to offline visualization or large CAD systems that they render results out of, so we’re just educating them on some of the tips and tricks that we take for granted as game developers. It’s all very new knowledge for them.

                In turn, they teach us a lot of things about how they use stuff, and about interactive and user experience paradigms that we haven’t considered. It’s becoming more and more. I really believe that VR’s first killer app is not necessarily an entertainment product but some of these enterprise applications, because if you talk to engineering manufacturers or architects or other people, they’re quite literally saving millions and millions of dollars through the use of VR. While it might not be as sexy or glamorous as a film project or a game project, it’s making a tangible difference on their bottom line. To me, that’s the very definition of a killer application, right?

VRguy:  Yes. Can you still keep the same code base for both the AAA games as well as the enterprise apps?

Nick:      For the most part, the core rendering technology is all the same, but they do have different problem spaces. A couple of weeks ago at SIGGRAPH, we announced some of our enterprise plans, which are explicitly focused on how we get these gigantic data sets from external CAD software and Autodesk tools live into Unreal Engine 4, to minimize the friction between the two. We’re spinning up a new team in Montreal that’s exclusively focused on enterprise applications and solving the problems that are specific to that domain, and so far, I would say a large part of those issues are just how they plug the gigantic and very established pipelines that they’ve invested a lot of time and money into, into real-time rendering technology. Fortunately, we’ve hired a lot of very talented folks up there that can get them off on the right foot. We’re very excited about that project that we just announced a couple weeks ago.

VRguy:  The other thing that we’ve seen in enterprise applications over the years is that they have their unique tracking systems and input devices, and it’s not just what you can get at Best Buy these days; there are so many different devices. How do you handle that?

Nick:      That is a very interesting point, actually, because they have a lot more specialized equipment than we usually have access to or usually target with games. That’s one of the areas where we’ve really been pushing OpenXR, the open standard for VR and AR and MR and all the other R’s out there, to help close that gap. What we really want to do is have a platform that people can build content and applications on top of, and then they can combine these disparate forms of tracking and display technology and use the same application. One of the big parts of that is how do we make it future-proof? If a company invests a lot of time developing an application on their current hardware setup, what happens when the next version of something comes out, or if they have a more specialized application that they need to take advantage of?

                That’s one of the biggest things to fall out of OpenXR that I’m very excited about, and honestly, I think Sensics has done a pretty good job of that with OSVR leading up to this, because you can take an amalgam of different devices and technologies, smash them all together, and get data out the other side. It’s very flexible, and that’s the one thing I’ve learned with enterprise: you need to be supremely flexible with the technology and the pipeline in order to get good results with it.
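
As an illustration of the device-abstraction idea being described here, consider the hypothetical sketch below. None of these names come from OpenXR or OSVR; it only shows the shape of the approach: applications code against a generic interface, and vendor backends register behind it.

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// A pose as most tracking APIs report it: position plus orientation.
struct Pose {
    float position[3];
    float orientation[4]; // quaternion (x, y, z, w)
};

// Generic interface the application codes against.
class ITrackedDevice {
public:
    virtual ~ITrackedDevice() = default;
    virtual std::string Name() const = 0;
    virtual Pose GetPose() const = 0; // latest tracked pose
};

// Vendor plugins register concrete devices behind the interface.
class DeviceRegistry {
public:
    void Register(std::unique_ptr<ITrackedDevice> device) {
        devices_.push_back(std::move(device));
    }
    // Application code iterates devices without knowing who made them.
    const std::vector<std::unique_ptr<ITrackedDevice>>& Devices() const {
        return devices_;
    }
private:
    std::vector<std::unique_ptr<ITrackedDevice>> devices_;
};
```

The future-proofing Nick mentions falls out of this structure: a new headset or tracker ships as a new backend implementing ITrackedDevice, and existing applications pick it up without modification.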

VRguy:  Absolutely. You and I and others are collaborating on the OpenXR initiative, but unfortunately, we can’t talk about the inner workings publicly at the moment. Hopefully, in the not too distant future.

Nick:      Absolutely.

VRguy:  Now, we spoke a lot about VR, but let’s talk about AR a little bit. Do you already have or do you see in the future Unreal Engine running on Google Glass or an Epson Moverio or one of these AR devices?

Nick:      Yeah. We’re actually super excited about AR. To me, VR is great for replacing somebody’s reality, but augmenting it is something that I think will have a very broad market appeal, so it’s something that we’re paying a lot of attention to. Recently, with 4.17, we released our initial support for the Google Tango devices, which are, of course, AR devices with pass-through cameras, as well as ARKit. We continue refining them, so our work this year is largely focused on how we get AR up to speed with all the advances that we made with VR technology, because a lot of the problem space is the same: you have a tracked device of some sort.

                It’s on a closed system that requires a lot of performance optimization, consideration for battery life, and thermal constraints. A lot of these problems actually overlap very well with our current generation of VR tech, so we feel that we’re pretty well positioned to take advantage of that, but we’re where VR was a couple of years ago, where you have a bunch of different vendors with a bunch of different APIs. Again, my secret hope is that OpenXR comes along and everybody adopts it. It would save me a lot of work but, more importantly, would let us share the wins and the benefits from one platform to another a little bit more fluidly.

VRguy:  How low does the hardware go? It sounds like you mentioned phone-type devices and tablets, but do you see it on true wearables? Again, like a Google Glass, or does that just not have enough CPU power?

Nick:      I think it depends on the application. Google Glass is a pretty constrained platform, but it serves a lot of good, functional uses. I’m just not entirely sure that you would necessarily need something as powerful as Unreal Engine 4 on an application like that, when you only have a limited display space, but when people start making these glasses and the technology progresses to where the form factor gets smaller and smaller, I can definitely see augmenting … Any time you need to augment a real-world scene with virtual aspects and have high-fidelity lighting and tracking and stuff like that, that’s where Unreal Engine really shines.

                We’ve been going after photo-realistic rendering for so long, and there’s no bigger use case for photo-realistic rendering than putting something that’s fake into the real world. There’s a lot of subtlety to how you match the lighting and the shadows and the contact points and the ambient occlusion and all these little details. That’s where Unreal Engine really shines, so anything where we’re trying to take virtual objects and make them as real as possible, I think, is where our strong suit lies.

VRguy:  Now you guys released Robo Recall, which, I think, you were a big part of.

Nick:      Yeah. Yeah.

VRguy:  Could you talk a little bit about what the major challenges were there?

Nick:      Robo Recall started off … Previous to Robo Recall, we had done a lot of tech demo kind of things, where we would have a very small team. Usually at the beginning, it was me and one other tech artist or designer for a couple weeks, and we would make a demo. That was how we got our original Elemental VR, where you’re just walking around a castle; then Couch Knights, where you’re playing a game with another person; and Showdown, where you’re going slow-mo through an action scene; and finally Bullet Train, where we actually tried to make a real game with motion controller input.

                With Robo Recall, Oculus fortunately gave us the money to build up a small team of people and gave us a year to execute on it. We started asking, “What do we really need to do to productize and make a full-fledged experience in VR? What are the pain points of the engine?” As we started going through it, we decided rendering was one of the big issues that we had. We had a deferred renderer, which is great for a lot of applications, like digital humans. For the art style that we were going for, which was very mechanical and needed very sharp, crisp imagery, we really needed a forward renderer so that we could use MSAA on it.

                We actually implemented that, and one of the biggest things to come out of Robo Recall was just having a real game product that we had complete control over, going through the process of optimizing it over a period of a couple months, and figuring out where the actual pain points are. All that work made it back into Unreal Engine 4 and is available for everybody; the optimizations that we made for the game fed back into the engine. Our philosophy in developing the engine in general is that we need to build real content on top of it and use it ourselves in order to truly experience the pain points and bring the product to maturity, so that other people external to us can use it.
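
Since that forward-shading path shipped back into the engine, here is how a UE4 project typically opts into it. r.ForwardShading and r.MSAACount are the engine’s documented renderer settings; the sample count shown is just an example choice:

```ini
; DefaultEngine.ini -- enable the forward renderer with MSAA (UE4)
[/Script/Engine.RendererSettings]
r.ForwardShading=True
; Example MSAA sample count; 2, 4, and 8 are the typical options.
r.MSAACount=4
```

The same switches are exposed in the editor under Project Settings > Rendering, which is how most teams would flip them in practice.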

VRguy:  I think I know the answer to my next question, but if you had to do a Robo Recall again, would you use Unity or Unreal? If so, why?

Nick:      I think I would do Unreal Engine. Not that I’m biased or anything like that, but one of the real strengths was that we only had two gameplay programmers on the entire thing, and most of the gameplay code itself was actually written in Blueprint, which is a visual programming language that we have in Unreal Engine 4. The strength of that is that suddenly, instead of gameplay programmers being a bottleneck and being the only ones that can try out new ideas in a prototype sense, all of a sudden you democratize that across the entire team.

                Our designers were able to write Blueprint code, and our animators were able to write Blueprint code, and through that process, we democratized all the gameplay features. Some of the coolest bits of VR interaction and gameplay came from people just randomly playing around, adding a little bit of visual script, and that just made it into the game, because Blueprints are a shippable product. That’s really the differentiator for us. We feel pretty confident in our graphics ability and our support for VR platforms, but where we really shined was the fact that we were able to have such a small team, and such a very diverse team with very little programming help, actually make a full-fledged game that shipped.
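
For readers unfamiliar with how that split works in practice, the sketch below shows the usual pattern: a programmer exposes a C++ function as a Blueprint node, and designers then wire it up visually. UFUNCTION(BlueprintCallable) is Unreal’s real mechanism for this, but the class and function here are invented for illustration:

```cpp
// GrabComponent.h -- hypothetical UE4 component for illustration only.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "GrabComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UGrabComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Exposed as a node in the Blueprint graph editor, so designers
    // and animators can call it without writing any C++.
    UFUNCTION(BlueprintCallable, Category = "VR|Interaction")
    void GrabNearestActor(float MaxReachCentimeters);
};
```

The gameplay logic that calls GrabNearestActor then lives in Blueprint graphs, which is what lets a small programming team hand iteration over to designers and animators.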

VRguy:  As we start bringing this discussion to a close, if you had even more control over what Oculus and Valve and Sensics and Google and others are developing in terms of hardware or peripherals, what would you have us focus on in the next 18 months?

Nick:      The next 18 months? Wow. That’s a near-term time frame. One thing VR has given me an appreciation for is how long it actually takes to get hardware done and revised, but I think the biggest thing to focus on right now, if we’re truly going to grow the consumer market, is making these things more convenient and lowering the barrier to entry to set them up. I feel like the tech from a resolution standpoint is fairly good where it’s at right now. Making jumps there is going to pay off in the end, but it’s not going to be the difference between the consumer market where it is now and growing it into something that’s even more financially viable. It’s the ease of setup. I need to be able to sell this to my mom and have her set it up without having to call me, right? Whether that’s through standalone headsets or whether that’s just through more wireless technology and solutions like that, I think that’s the most important thing that we could do to grow the market at the moment.

VRguy:  Even though, if your mom is anything like mine, she’ll find some other reason to call. It wouldn’t be the setup.

Nick:      I’m a good son and call my mom every weekend.

VRguy:  Nick, how can people connect with you if they want to learn more about the work that you’re doing?

Nick:      I think the best way is to go to unrealengine.com. We try to play a pretty active role in putting blog posts out and being active on the forums and AnswerHub. It’s a great way for us to connect with people and share that knowledge with the masses. One of the things we try to do is have one-to-many communication, and whether that’s through going to conferences or posting blog posts online and answering questions on our forums, that’s really what we find to be the most effective way of sharing knowledge and interacting with the community.

VRguy:  Got it. All right. I guess I’ll see you next on the panel that we’re doing at VRDC. Thanks very much for coming to the program.

Nick:      Thanks for having me.

 
