VRguy podcast Episode 19: Dio Gonzalez, Principal Software Developer at Unity Labs

My guest today is Dio Gonzalez, Principal Software Engineer at Unity Labs. This episode was recorded on July 7th, 2016.

Dio and I talk about selecting game engines for VR, about the type of companies that will drive innovation, in-VR editors and much more.


Interview transcript

Yuval Boger (VRguy):     Hello, Dio and welcome to the program.

Dio Gonzalez:    Hello.

VRguy:  Who are you, and what do you do?

Dio:        My name is Dioselin Gonzalez. I go by Dio. I’m the Principal Software Developer at Unity Labs, specifically in the virtual reality crew. My research at this point- what we’re working on- is authoring tools for VR, inside VR.

VRguy:  Why does an authoring tool need to be inside VR?

Dio:        Yes. One of the issues- and it’s happening already- with regular development of virtual reality applications, if you’re using a headset like the Oculus or Vive, is that- and it happens very often- you have the headset on your head, and you have to take it off for anything you’re editing or programming, then put it on again for testing. That’s a big pain point that we’re addressing, and one that many developers have expressed. That’s the first reason: to make production more efficient and eliminate that cycle of taking the headset off, programming, and putting it on again to test.

                Now, there’s another point that we believe in strongly, which is that directly editing in virtual reality has a lot of benefits because of that extra dimension. Even when you’re working in three dimensions, regular editing happens through the projection on the screen. That’s an abstraction. If instead you are editing directly in the 3D world, using that extra dimension, we believe the authoring process can actually be more productive for certain things.

                For example, if you’re a level designer and you are placing the elements in the scene in virtual reality, you immediately have a sense of how that’s going to look to your user, and you can scale things directly- a certain piece of geometry, for example. Those are the main reasons.

VRguy:  So, it’s not just about avoiding a bad hair-day because I’m taking the headset on and off all the time.

Dio:        Yes, exactly. Another very interesting thing, by the way, is something one of my coworkers in Labs mentioned. He believes that everything in virtual reality can actually help non-virtual reality applications. I thought that was very fascinating. He’s in the graphics research group, and he’s doing work on photogrammetry at this moment. He needs to load very complex things- very high-quality, lots of polygons and really big textures- and he needs to analyze them, and see, and build a scene. He was telling me, “Even though my final application, my use case, is not a virtual reality application, I believe that just by exploring my scene in virtual reality while authoring, I can actually have a much better development process.” That was really fascinating. He’s actually one of them; he’s using it right now. He loads his model and sees it in virtual reality while he’s doing the authoring, even though he’s not developing for VR.

VRguy:  In your vision for a VR developer, would the developer spend, what, 50% of his time inside the headset? 100% of the time, 10%? What do you think is going to happen?

Dio:        I believe that almost 100% is very possible. The part, of course … there’s a programming aspect. Writing code, that’s a 2D activity. In that case, does it make sense that I’m programming in virtual reality? That’s the part I’m not sure about. If we build super-intelligent systems that even come to a point where programming is not typing on a keyboard … Maybe programming is voice-activated. I’m dreaming at this point, but it’s totally possible that you could program with voice; with some very innovative VR UI it’s totally possible to even do it 100%. I’m totally up for that. Lots of potential in VR.

VRguy:  I understand. Do you see the need, or the drive, towards a standardized GUI for VR? Is it going to come through Unity sample applications? Is it going to come through an operating system vendor, or maybe there’s no need for it? What’s your view?

Dio:        I believe there’s the need. It will just grow … It’ll happen. When we talk about standards … We already know about general standards for 2D desktop applications, right? We know that an ‘x’ means closing a window. The image of a floppy disk- even though nobody uses floppy disks anymore, we all know that means saving things. It’ll happen when virtual reality and augmented reality become more mainstream. There’s a need for it, because we don’t want to be reinventing and relearning every time. Just like when we buy a computer, we don’t want to relearn how to open a document and save a document.

VRguy:  Okay. My view is that there’s probably going to be a lot of experimentation in the next few months. Ultimately, though, developers will settle on a GUI language, or conventions for it.

Dio:        Yes. I come from classic virtual reality, from academia. I started a long time ago, in 2002; I am one of those early virtual reality developers. There have already been many people designing, implementing, and trying to come up with standards … UI languages, a UX language for virtual reality. Right now we have this amazing opportunity in that it’s becoming more and more popular, so maybe we will converge. There are so many people having ideas; maybe we will converge on one. Maybe a de facto standard UI language will just appear. The Internet, for example. We all know that TCP/IP is not necessarily the best protocol, right? It just became the de facto standard out of use. Maybe that will happen here too, just out of use.

                I think it will come from many people. Not only the technologists- the software developers or the hardware vendors- but maybe even the users. In academia they have run many user studies on what is natural in VR, but that was some time ago. It’s a different world, right? This new generation is used to many different types of GUI. I think there will be a conversation between the developers, the game engines like us at Unity, and also the consumers. Then a language will come out of it.

VRguy:  That makes sense. Probably a team effort, I suppose, as opposed to one vendor or one company driving this.

                If I’m a game developer … I think it’s understood what the criteria are for choosing a game engine for regular games. Do you think the criteria are the same for VR games? What are the criteria for choosing a game engine to develop a VR game?

Dio:        Well, first of course, a VR application is normally a 3D application, so the criteria for 3D games are all there. However, there are extra aspects to consider for virtual reality applications, like which headset, which platform, is going to be supported. You can see this as kind of an extension- like when you decided, “Okay, I want to create a game or a 3D app for Windows versus OSX,” and that drove the choice of game engine.

                With virtual reality it’s a similar issue. What platforms do you want to run on? It’s not only the hardware; it’s also the extra requirements. In virtual reality, as we know, the headsets need a higher frame-rate to be supported. Those are additional aspects you need to take into account when choosing an engine. Also, whether the engine offers the services around my application that I want- not just the development, but what I’m going to do after that. Whether it provides me with the tools for specific needs I have in virtual reality. When we talk about the UI, it’s not just 2D menus, right? Does the engine provide me with additional elements for doing 3D UI, for example?
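As a rough worked example of that frame-rate requirement (the 90 Hz figure below is the refresh rate the Vive and Rift targeted at the time; this is just arithmetic, not anything from the interview):

```python
# Per-frame render budget at a typical desktop refresh rate vs. a VR headset.
for hz in (60, 90):
    budget_ms = 1000 / hz  # milliseconds available to render one frame
    print(f"{hz} Hz: {budget_ms:.1f} ms per frame")
```

Missing that roughly 11 ms budget at 90 Hz means a dropped frame, which is far more noticeable in a headset than on a monitor.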

VRguy:  You mentioned choosing an HMD, and I think a lot of game developers would say, “Well, I want it to run on everything. I want it to run on HTC, and on Oculus, and on OSVR, and on whatever else.” The same would also be true for input devices- this controller, or that controller. Not too far into the future, a mid-range PC, a high-end PC … In light of that, how important do you think the creation of standards is, so a developer does not need to learn the specific API for every individual device?

Dio:        Yeah, totally. Definitely. That’s one thing … and of course I have to talk about the work that we’re doing at Unity. That’s one thing that is essential: just make the life of the developer less painful. Let them focus on creating a good application- very good for whatever needs they’re filling- and don’t make them worry about specific APIs for different hardware or, like you were saying, controllers. That’s a big thing as well.

                It’s similar to how graphics standards like OpenGL came out for graphics programmers: we don’t want to worry about the specifics of the hardware. We need the same for VR, of course, which is- like you were saying- “I just want it to run on any headset, on any platform. I just want to worry about solving the interaction problem,” for example.

VRguy:  I would imagine that the user would want to have this selection almost at run-time, just like I select which printer to print on. I don’t want to have an executable for this HMD, a different executable for that HMD, and then a third one for some other combination.

Dio:        Yes, exactly. Run-time configuration and detection, that’s big. Not only the headset, but even the controllers. For the same headset- let’s say you have an Oculus- you may want to use the Touch controllers, or the Xbox controller, or some other device … the Hydra, etc. That actually becomes important as well. Like you said, you don’t want to build and compile a different executable for every platform.
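A minimal sketch of what that run-time detection could look like, assuming a hypothetical device-abstraction layer (none of these class or device names come from Unity, OSVR, or any real API):

```python
# Hypothetical sketch: one binary, hardware chosen at run time.
# The application talks only to the registry, never to a vendor API.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    kind: str  # "hmd" or "controller"

class DeviceRegistry:
    """Enumerates whatever hardware is plugged in at launch."""
    def __init__(self, connected):
        self.connected = connected

    def find(self, kind):
        return [d for d in self.connected if d.kind == kind]

# Whatever is plugged in when the app starts:
registry = DeviceRegistry([
    Device("Generic HMD", "hmd"),
    Device("Gamepad", "controller"),
])
hmd = registry.find("hmd")[0]   # pick the first headset found
pads = registry.find("controller")
print(hmd.name, len(pads))
```

The point of the design is that the application only depends on the registry’s interface, so the same executable works regardless of which headset or controller is detected.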

VRguy:  There are a lot of open things … things that still need to be solved in VR, whether it’s improving rendering performance, or foveated rendering, or standardization, and so on. Where do you see most of that innovation coming from? Is it from game-engine vendors like you guys? Is it from HMD vendors, operating systems, graphics vendors, or is it really all of the above?

Dio:        I see it as both … Being a software person, I may be a little biased, because I’m a software engineer. A lot of it, I see as a marriage between the research in academia and the work in industry. The game industry specifically- they are experts in real-time applications, and they have done a lot of work on performance-driven applications. They are adding a lot of new research and new developments to virtual reality. As has academia, of course, for a long time. In general, I see it coming from the industries that have already been dealing with real-time issues for a long time: the simulation industry, right, and the game industry, of course. Does that make sense?

VRguy:  Absolutely. Of course it makes sense. I think that today VR games require a whole lot of local storage and a whole lot of local processing power. Even simple demos are many, many megabytes of download. Do you see some of that shifting into the Cloud- either the processing or the storage- or do you think it’s all going to continue to be local to the rendering machine?

Dio:        Yes and no. Actually, that reminds me … We are hitting some of that already in my project, what I’m working on at Unity: Carte Blanche. This is a platform that we want to build for creating virtual worlds, virtual environments. We want to allow people to essentially create pretty much any kind of geometry that they want. For that, we are integrating with the Unity Asset Store. That’s one of the things we’re implementing because, of course, we want very realistic, very high-quality assets. For that, like you were saying, we’re moving the storage to the Cloud- that’s going to be in the Unity Asset Store. Definitely, in the sense of the quality of the models and the assets, they’re getting bigger. We want to make them nicer, and that requires a lot of storage.

                On the other side … Actually, when I started in virtual reality, I worked with projection-based systems- specifically, the CAVE. For that, many years ago, we would build graphics clusters that would cost a lot of money. That’s one of the reasons why virtual reality was limited to the academic field. We would have big computers running the CAVE: one computer for each wall of the CAVE, plus another computer for driving the controllers, and another computer driving physics. However, that is becoming cheaper and cheaper. These days you can build a CAVE for much less. It’s still, you know, thousands of dollars, but much less than before.

                I was actually recently reading that some people are creating graphics clusters out of off-the-shelf NVIDIA cards. In that sense, in power, most desktop computers are actually really good these days. It’s becoming cheaper and cheaper. I can see people, maybe small teams, having their own graphics clusters locally.

VRguy:  I understand. If you could drive the work plan for Sensics, for instance- since we drive both Sensics products and a lot of the OSVR development- or if you could drive the work plan for Oculus and HTC and others, what would you have the major VR vendors on the hardware side focus on?

Dio:        One thing that we’ve been discussing is to offer more functionality at the virtual reality operating system level. For example, speech recognition. The Vive includes a microphone, for example; it would be great if we could have more functionality coming from what I’m calling the virtual reality operating system- from the APIs the vendors offer to us. Speech recognition, voice commands, more interaction and inputs. That’s all pointing, of course, at standardization, which is something we were talking about before: making the life of the developers less difficult. More standardization, set up like a suite of low-level tools for virtual reality applications. That includes not only the graphics cards, but everything. It could even be standardization of physics for VR, for example- something offered by the vendors. Make sense?

VRguy:  Absolutely. So, Dio, where could people connect with you to learn more about the work that you’re doing at Unity?

Dio:        Ah, yes. We have our website: labs.unity.com. Over there you can see all the information about our projects.

VRguy:  Excellent. Well, thank you so much for coming onto my podcast.

Dio:        Thank you. Thank you very much for inviting me.
