1 October 2020

Animation, raytracing, virtual production, VR: the future of Unity Technologies

We recently had the opportunity to exchange a few words with Adam Myhill, creative director at Unity Technologies. We discussed a few topics with him: animation and real-time graphics, raytracing, virtual production and, of course, what we can expect from Unity in the future.

3DVF: Last June, you released Sherman, an animated short powered by Unity. Is the future of animation real time? Which features are you trying to improve to make this a reality?

Adam Myhill – Unity: The future of animation certainly can be real time. There is a balance between quality and speed, and the quality side keeps improving: what things looked like a few years ago is vastly different from what they look like today, and from where we will be in a few years. For many shows, the acceptable level of quality versus how fast you can produce them makes real time attractive. We had breakthroughs on Sherman: we did real-time fur, which is a hard problem that two talented engineers attacked. Something we also included on Sherman was high-quality motion blur, developed on the show Baymax Dreams for Disney Television and Animation, for which we won an Emmy. The breakthroughs there were in visual fidelity. With Unity it's HDRP [High Definition Render Pipeline]; it was work we did on shaders and motion blur, and on Sherman we had all that plus fur. Thanks to the speed at which we did this in real time, Disney saved between 40 and 50% of the budgeted time.

There is a creative advantage; this might be the most powerful aspect of Unity, because we do not need to wait until the end to see if something is working. In a traditional pipeline there is layout, storyboard, animatics, lighting, render, composite and the stages in between. If you change your mind, you might have to go back up and re-render. In a real-time creation engine like Unity, there is no rendering, no post, no after. If the director says, “hold that shot longer, move it closer to camera, make those guys walk faster, make the sun a little bit lower, I want the shadows to come forward”, you do that. The what-if is not expensive; a what-if is just a what-if. You just make the change.

That is the magic: it is faster and the producers are happy. But I believe the real power is the creativity. You can see if a scene is working, if that shot is funny, if that joke works; you figure it out right away, and that is the magic of real time.

3DVF: We are, however, still miles away from being able to render a Pixar-like movie in real time…

On Sherman and Baymax Dreams, we went start to finish inside Unity and the final pixels came out of Unity. We used Maya for the animation and different programs for creating the assets; nevertheless it was basically start to finish with Unity. For a Pixar film, real time is getting closer, but it is still not at the level of a bunker full of computers spending hundreds of hours on frames. However, that does not mean real time cannot help their system. Real time can be used at the beginning of a project to block ideas out: you work fast, get the lighting and shadows, and reach 80% of the quality quickly, then you use a more traditional back end when you are ready to make the film. With support for USD, FBX, Alembic and other file formats, you can start off in Unity and use a traditional back end for the hyper-fidelity graphics, because you have thousands of machines spending hundreds of hours on frames. We see it as a spectrum: you can use real time for the whole project or for a bit of the pipeline.

3DVF: NVIDIA upgraded their GPU lineup, and real-time raytracing is at the heart of those new graphics cards. What's your approach to this new technology?

It is amazing; this was a dream not long ago. I was at Siggraph 20 years ago and we were having a conversation about real-time raytracing, and we were laughing about how it would never happen… and it happened. Imagine a world where, in real time, you are working with lighting fidelity that is authentic to reality. It is amazing. It is still early, and I think it will be a while before we are fully raytracing the entire frame. Currently we have a hybrid approach, where we are only raytracing the shiny things, the bits that will benefit from it. Over time we will raytrace more, visual fidelity will increase, and realism will take another step forward.
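The hybrid idea described above can be sketched at a very high level: rasterize the whole frame, and spend the per-pixel ray budget only on surfaces shiny enough to benefit. The following is an illustrative Python sketch of that selection policy, not Unity HDRP code; the `Material` model and the smoothness threshold are assumptions for the example.

```python
# Illustrative sketch of a hybrid raytracing policy: rasterize everything,
# but trace rays only for surfaces that actually benefit from it.
# The Material model and threshold below are assumptions, not HDRP code.
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    smoothness: float  # 0.0 = fully rough, 1.0 = mirror-like

RT_SMOOTHNESS_THRESHOLD = 0.7  # assumed cutoff for "shiny enough"

def reflection_technique(mat: Material) -> str:
    """Pick the reflection technique for a material."""
    if mat.smoothness >= RT_SMOOTHNESS_THRESHOLD:
        return "ray-traced"          # worth the per-pixel ray cost
    return "screen-space/probe"      # cheap rasterized fallback

scene = [Material("chrome bumper", 0.95),
         Material("matte concrete", 0.10),
         Material("wet asphalt", 0.75)]

for mat in scene:
    print(f"{mat.name}: {reflection_technique(mat)}")
```

Raising or lowering the threshold is exactly the "raytrace more of the frame over time" knob the answer describes: as GPUs get faster, more materials fall on the ray-traced side of the cut.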

3DVF: How is Unity being used in virtual production and in the media and entertainment industry?

We have so much virtual production happening now, and it is popular because it saves film and TV projects money.

I was at a VFX studio not long ago and one of the VFX artists said, “virtual production means more dollars hit the screen”. I said, “can you explain?” and he goes:
“Before, if we had a live-action scene where we were going to composite a dinosaur into the shot, they would film it, take marks off the camera and the lens, do all this stuff, and put the dinosaur in the shot later. However, they might then find ‘I wish we had seen that sooner’, or ‘there is a problem with this thing’, or ‘we should have put the dino here’; you don't see these issues until later. Now, on a virtual film set, someone flips open a laptop running Unity, puts a dino in, everybody looks at it and says ‘ah, maybe it should be bigger, maybe it should come in from the left, or whatever’, so on the set they shoot that shot appropriately, because they have immediate awareness of the visuals.”

The VFX artist spends more time making a good dinosaur instead of fixing other problems. Imagine virtual production where you do not have to build sets; everybody benefits, with multiple people sharing the same space. In virtual reality we load up a set; we are in LA and the DP is in London, but when he logs in, we are all walking the same space even though we are in different places in the world. There are different ways to use real time in virtual production, and we will see more. We will be seeing it not only in blockbuster movies but in high-end television, mid-budget television and eventually with indie artists: anyone who wants to start a film can block it out in a creation engine and see if the scenes will work.

We are about to release a package for everyone that runs real-time production on the phone and plugs right into Unity: explore the scenes, lay down cameras, figure out where everything needs to be and what angles look the best.

3DVF: Do you feel things are going too fast sometimes? You develop a technology and then another one comes along on top of it, challenging you all the time.

It is going terrifyingly fast. I love technology when it solves a creative problem. When it is just technology for technology's sake and people try to figure out what to do with it, that is when I say: that is interesting, but what is the story, and what is the best way to figure out if it is working?
We are in a world where, when someone says something outlandish, people will believe it. Technology has advanced to a point where it is hard to separate what is magic and what is not.

Euclideon Technologies in 2011: extraordinary claims that never really came to life.

3DVF: Real-time 3D is a very competitive environment. What's your take on the current market, and where do you see Unity in a few years?

The current market is dynamic; companies come and go. There are two major creation engines, and I think the competition between them makes both better and benefits the audience. There is so much work that needs to be done that no one company could handle all of it. I see Unity playing a powerful role in the future. We have exciting technology coming forward: our new visual scripting, Fireman, creates things that run faster than if you had coded them specifically. This is unique to Unity. We have a new programming language called DOTS, which is data-oriented; it makes things run 20 times faster than before, powering and rendering millions of objects on an iPhone. Nobody else can do this. I think Unity's future is incredible.
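The core idea behind a data-oriented approach like DOTS is storing each component in a contiguous array (structure-of-arrays) instead of one object per entity (array-of-structures), which is what lets the CPU cache, prefetcher and SIMD units stream through millions of entities. A minimal conceptual sketch of the two layouts, in plain Python rather than Unity's C#:

```python
# Conceptual sketch of data-oriented design (the idea behind DOTS):
# one flat array per component, instead of one object per entity.
# Plain Python for illustration only, not actual DOTS code.

# Array-of-structures: one object per entity (cache-unfriendly at scale).
class Particle:
    def __init__(self, x, vx):
        self.x = x
        self.vx = vx

def step_aos(particles, dt):
    for p in particles:
        p.x += p.vx * dt

# Structure-of-arrays: contiguous component arrays, the layout a
# data-oriented system iterates over in tight, vectorizable loops.
def step_soa(xs, vxs, dt):
    for i in range(len(xs)):
        xs[i] += vxs[i] * dt

aos = [Particle(float(i), 1.0) for i in range(4)]
xs, vxs = [float(i) for i in range(4)], [1.0] * 4

step_aos(aos, dt=0.5)
step_soa(xs, vxs, dt=0.5)
assert [p.x for p in aos] == xs  # both layouts compute the same result
```

Python itself won't show the speedup, but the layout difference is the point: in a compiled, job-scheduled system, the structure-of-arrays loop turns into linear memory traffic, which is where the kind of gains described above come from.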

3DVF: Last but not least, could you tell us about your vision of VR and the streaming market?

I think that world will change. Once we had a television with 12 channels, then we had hundreds, and then you would buy a cable package with the channels you wanted. Then Netflix came, and now there is also Disney+, Amazon, Hulu, Crave, and Apple is getting into it; I just want to subscribe to one thing. All of these different channels want content. I love VR as a medium, but I do not think VR is going to be what saves us or drives the growth. I think we will have a lot of second-screen experiences. I saw a Formula One race on TV where you could switch between the different drivers' cameras. I am convinced that media is going to be bi-directional: our content will be richer and more dynamic, and the viewer will have some agency over it. That is my prediction.

For more information: Unity's official website.
