Crowdrender and Denoising in Blender 2.79

June 28, 2017

 

200 samples at 1080p frame size - glass BSDF shader on the plate, diffuse BSDF on the plane with a checker texture

 

Initial Frustrations

 

Wouldn't it be nice if you could render with the denoiser in Blender 2.79 and still use multiple machines (for both single frames and sequences) without artefacts where the tiles meet? Man, I sounded like such a salesman there! Sorry :P

 

If you've never tried bucket rendering for single frames, you might not have experienced the blood-boiling frustration of artefacts in your final render - be they squiggles, fireflies or discontinuities. Getting rid of them is not an exact science even when you're not splitting a render into tiles. So when we hit artefacts while testing the denoiser with Crowdrender, it was certainly a show stopper, at first. I mean really, no denoiser support? Might as well pack up and go home.

 

 

Wait, what is a Crowdrender?

 

Ahhh, look at me, asking questions as if I were you reading this article! Sorry, that's a little too mainstream-internet-blog of me, thinking I'm a clever clogs. Ahem, I'll get to the point.

If you haven't heard, Crowdrender is a network rendering add-on for Blender; it lets you connect many computers together to render stills and sequences. In this article we're exploring how it works with Blender's new denoiser feature, coming in 2.79.

 

If you want the short story: it works. In version 0.1.2 of our add-on (as yet unreleased, just like 2.79) you can render frames using multiple computers and use the denoiser to get the same great results in less time.

 

If you love a little reading, then read on, my fine fellow/fair lady.

 

Delving a little deeper

 

So, for those of you who love to read, here is the academic paper behind the new denoiser (it's paywalled, so unfortunately you only get to read the abstract for free).

 

http://dl.acm.org/citation.cfm?id=2641762

 

If you prefer seeing results to indulging in theory, you can see Adaptive Rendering based on Weighted Local Regression in action in a video presentation at the link below.

 

http://sglab.kaist.ac.kr/WLR/

 

Anyway, the denoiser basically works by doing a lot of analysis of the data in the image plane - that's pixel data, as far as my limited understanding goes. The theory is that by sampling the data in the image you can remove noise while keeping detail. We know this works because, well, look at the image - it convinced me (the video above is also amazing, by the way).

The controls in Blender give a hint at how it works. Think of it as a very advanced blur: you get to control things like the radius and strength. The radius restricts which pixel data around any given location in the image is used to remove noise, and the strength affects how heavily that data is weighted. Basic enough explanation, I guess.
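If you'd rather poke at those controls from a script, here's a minimal sketch using Blender 2.79's Python API. To be clear, this is just an illustration, not Crowdrender's own code, and the render layer name "RenderLayer" is simply the default one, so it may differ in your file:

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# In 2.79 the denoiser settings live on the render layer, not on the scene.
layer = scene.render.layers["RenderLayer"]
layer.cycles.use_denoising = True

# Radius: how far (in pixels) around each pixel the filter looks for data to reuse.
layer.cycles.denoising_radius = 8

# Strength: how heavily that neighbouring data is weighted when smoothing.
layer.cycles.denoising_strength = 0.5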

 

In the image below you can see a render where we intentionally disabled denoising on one machine so you can see the difference. The image was rendered with 200 samples, and the clarity of the denoised image is quite striking.
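If you wanted to set up that kind of comparison yourself, one way is to key the denoising flag off an environment variable on each machine, as in the sketch below. This is purely illustrative; the CR_DISABLE_DENOISE variable is made up for this example and isn't part of our add-on:

import os
import bpy

scene = bpy.context.scene
scene.cycles.samples = 200  # same sample count as the image above

# Turn denoising off only on the machine where this variable is set.
layer = scene.render.layers["RenderLayer"]
layer.cycles.use_denoising = os.environ.get("CR_DISABLE_DENOISE") != "1"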

 

 

The catch with multi-tile rendering

 

We've come across a similar issue with compositing tiles coming from separate computers or processes. Initially we'd tried compositing each tile on the computers we were using, as opposed to sending all the tiles back to the user…