
Reflecting on Real Time Ray Tracing


Unreal Engine real-time ray tracing demo of three Star Wars characters in a lift

Unreal's latest real-time ray-traced reflection demo

So, you may have already seen Unreal's veeery impressive demo showing off real-time ray-traced reflections (if you haven't, there's a link at the end of this short article). If you work in CG, or it's a passion or hobby of yours, then the thought of rendering everything in real time (the demo boasts 24 fps) instead of hours per frame must sound like a dream come true, right?


Certainly, watching the demo you can see what appears to be legitimate ray tracing of reflections. Take a look at the image above and you can see the ceiling lights and the two stormtroopers reflected in Captain Phasma's armour. In the video those reflections move as the stormtroopers move. Awesome!


So, no more long renders or render farms right?


In the future this certainly seems possible, though it's important to recognise one key fact (there are plenty more in this article, but this is the BIG one): the demo you see here is running on a server-grade Nvidia box worth $150,000. This piece of gear packs four of Nvidia's latest Volta-based V100 cards for a total processing power of 1,000 TFLOPS! To put this in perspective, for that money you could buy 50 Titan V cards.
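As a quick sanity check on that comparison, here's the arithmetic as a minimal sketch. Both figures come from this article ($150,000 for the box, and a Titan V street price of roughly $3,000, which is what the 50-card claim implies), not from any official price list:

```python
# Back-of-envelope check: how many Titan V cards the demo box's
# price tag would buy. Figures assumed from the article:
# a $150,000 server versus roughly $3,000 per Titan V card.
server_cost_usd = 150_000
titan_v_cost_usd = 3_000

equivalent_cards = server_cost_usd // titan_v_cost_usd
print(f"Equivalent Titan V cards: {equivalent_cards}")  # 50
```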


Powerful GPU by Nvidia


I'm just guessing here, but I'm fairly certain that putting 50 Titan V cards together would give you real-time performance in ordinary ray tracing engines, for SOME things. That's another important distinction to make: the demo only covers ray-traced reflections, with no refraction, subsurface scattering or, heaven forbid, caustics.


What's more, the demo also restricted ray tracing to certain parts of the scene, with the walls being rasterised (the technique game engines normally use to draw 3D objects, which is markedly faster than ray tracing).


Spoiler: real time ray tracing is not new


In some ways, the demos going around the internet now seem to have completely reset our collective memory of the fact that we've seen real-time reflection ray tracing before; going by a casual Google search on the subject, it's been around for quite a few years. Here's Nvidia's state-of-the-art demo for their Kepler architecture from some six years ago.



Another example, more technically impressive but perhaps less visually impressive than Unreal's demo, is Altera's demonstration of a scene of more than a million polygons ray traced in real time on a single FPGA chip. It's impressive for the number of polygons rather than for how it looks next to a recent flight simulator or FPS game.


Altera FPGA demonstration video of real-time reflection and shadow ray tracing

And here's a technical demo from Intel from 2010 showing "real time ray tracing" on their Nehalem-based Xeon 7500 processors, which looks pretty much like what we use in Blender's rendered viewport today. And the main focus of this demonstration? Ray-traced reflections!

Still image from a video of Intel's demonstration of real-time reflection ray tracing

So real-time ray-traced reflections were new in 2010? Nope. Work has been going on in this area since at least 2004, according to a thesis I found which, once again, focuses on the "easy" (or perhaps easier) task of real-time reflection ray tracing. I've given the link below if you're interested in reading about optimising ray tracers. Beware: it contains technical stuff!

To sum all this up, it seems we've been doing real-time reflection ray tracing for a long time: at least 14 years. To bring this back to the Unreal demo, what we're seeing is a highly polished example of great artwork, creative screenplay and a cool reference to the current Star Wars universe. Technically, though, I'm not seeing anything new.


Similar to what Huss describes in his article above, real-time ray tracing still requires equipment that is out of reach of the ordinary individual; nothing has changed.


OK, so when will we get real time ray tracing in games or 3D animation?


Gazing into the crystal ball, I can see that... no, I actually don't know the answer to this. Those developing the solution probably know better than I do when we'll have it, but here are some of my observations.


First, the gear this is running on is so expensive that it's really only available to the mega rich (and the mega crazy mega rich) or to corporations, governments and researchers. Given that the demo uses Unreal Engine, games could be made with this in the future, though it seems insane right now because, umm, the hardware costs $150k? Pretty steep even as a recommended system requirement, right?


Another problem is power consumption. The equipment that produced this demo consumes 3,200 W according to Nvidia. In countries where the mains voltage is 240 V AC, that works out to a current draw of over 13 A, which is more than a single power point will supply. So you have a slight power issue there; hope it comes with two power cables! To be fair, though, the fact that Nvidia has achieved this at all is quite impressive, thanks to Volta's 12 nm process, which reduced power consumption compared with the Pascal architecture before it.
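The current-draw claim is easy to verify with Ohm's-law arithmetic. A minimal sketch, assuming Nvidia's quoted 3,200 W figure and a 10 A rating for a single power point (typical of 240 V countries such as Australia; your local socket rating may differ):

```python
# Verifying the current draw of the demo machine on a 240 V AC supply.
# Assumed figures: 3200 W total draw (Nvidia's spec) and a 10 A
# rating for a single power point, as is common at 240 V.
power_w = 3200.0
mains_voltage_v = 240.0
socket_rating_a = 10.0

current_a = power_w / mains_voltage_v  # I = P / V
print(f"Current draw: {current_a:.1f} A")                    # 13.3 A
print(f"Exceeds one power point: {current_a > socket_rating_a}")  # True
```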


But to give gamers an experience like this, there needs to be another massive reduction in power consumption, mainly because of heat. If your computer pumps most of its power into the room as heat, that's not too bad at today's modest levels of a few hundred watts. But at even 1,000 W, you're effectively running a heater in your room. Gaming would become very uncomfortable and expensive on your power bill (though in winter this would be awesome: play games to keep warm!).


OK, but what about CG, animation, Blender, Cycles and all that? Blender is getting Eevee, an engine that works much like a game engine such as Unreal. It's not too far a stretch to imagine the tech from Unreal one day letting artists render ray-traced images in real time, making the days of waiting hours for a single frame history. Buuuut...


...this is likely to be a few years off yet, at best. First, the cost of the hardware is huge; for freelancers, small studios and enthusiasts, even a 10x reduction in price would still be beyond most budgets. Second, the software is an unknown: how fast will this type of real-time ray tracing be once you start adding volumetrics, subsurface scattering, refraction, caustics and blah blah blah... sorry ;P. And third, the time needed to integrate the tech into your favourite 3D application has to be factored in. Blender's Eevee engine didn't arrive overnight; these new features take time and some serious funding, as the Blender Foundation's current Code Quest campaign demonstrates.


Also, hang on a minute! Isn't Blender's rendered viewport a real-time ray tracer? According to the article linked above and the brief discussion around it, yes-ish, though the frames don't actually complete their total sample count in under a second. It would be nice though, wouldn't it?!


Faster Rendering now! I want it NOWWWWW!


For the time being, the way to faster rendering or more impressive renders and games is more and better hardware. That means, *sigh*, spending on either fancy GPUs or render farms. And of course, we've been working for a few years on a little addon that lets you connect up your own hardware and render your scene faster. That's one way that won't hurt your pocket, because our addon is free; just add computers :D, even $150k ones if you can spare the cash!


Props to:


:D please subscribe if you like this article and would like to know how we help artists get stuff done faster!



