(Note: I'm relatively new to Blender, so I may be missing some very obvious points!)
I'm looking at setting up a small internal farm that might be used in a classroom situation.
We have a couple of Windows PCs and a couple of Linux VM servers that could be utilized. Students will use a mix of Mac, Windows, and Linux frontends; possibly even Raspberry Pis...
Given the Docker containers, it looks really simple to set up CPU-, possibly even GPU-, accelerated containers on Linux... and that's also where I have the most experience.
Are there any instructions/wrappers for setting up Blender + CR as a Windows service? The PCs are often used for other, less taxing tasks, so Blender needs to run out of (any) sight, in the background.
In terms of file access and permissions, I'm planning on setting up limited-access "CrowdRender" users that have read-only access to a network share, and having it auto-mounted on Windows under this specific user. The same user would run the Blender + CR headless service.
For render nodes, is read-only access enough?
And if users save their blend file on this network share, and structure their textures and other external assets to sit in sub-directories relative to the blend file, should this work, even cross-OS?
Or does the blend file need to be set to "automatically pack into blend" (or is that just a lot easier)?
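From what I've read, Blender stores relative asset paths with a `//` prefix meaning "relative to the blend file", which is OS-independent at the file-format level. A minimal sketch of how such a path would resolve on any host (the `resolve_blend_relative` helper and the example paths are mine for illustration, not Blender's own API):

```python
from pathlib import Path, PurePosixPath

def resolve_blend_relative(blend_file: str, asset_path: str) -> Path:
    """Resolve a Blender-style '//' relative path against the blend file's folder.

    Blender writes relative asset paths with a leading '//'; stripping that
    prefix and joining onto the blend file's directory reproduces the lookup.
    (Hypothetical helper for illustration only.)
    """
    if not asset_path.startswith("//"):
        # Absolute paths are what break when moving between OSes/shares.
        return Path(asset_path)
    relative_part = asset_path[2:].replace("\\", "/")  # normalise separators
    return Path(blend_file).parent / PurePosixPath(relative_part)

# The same stored path resolves correctly wherever the share is mounted:
print(resolve_blend_relative("/mnt/shared/project/scene.blend",
                             "//textures/wood.png").as_posix())
# → /mnt/shared/project/textures/wood.png
print(resolve_blend_relative("/srv/render/project/scene.blend",
                             "//textures/wood.png").as_posix())
# → /srv/render/project/textures/wood.png
```

If that's how it works in practice, keeping everything in sub-directories next to the blend file should survive the Windows/Linux mix, as long as every node can see the share.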
If relative paths don't work, is there a path-mapping feature, e.g. a way to use a regex to map "x:\shared\assets\..." to "/mnt/shared/assets/..."?
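I don't know whether CR has such a mapping built in, but the transformation itself is simple; a sketch of the kind of rule table I have in mind (the mapping table, mount points, and helper name are all made up):

```python
import re

# Hypothetical mapping table: Windows share prefix -> Linux mount point.
PATH_MAP = [
    (re.compile(r"^[Xx]:[\\/]shared[\\/]"), "/mnt/shared/"),
]

def map_path(path: str) -> str:
    """Rewrite a Windows-style asset path to its Linux equivalent."""
    for pattern, replacement in PATH_MAP:
        if pattern.match(path):
            mapped = pattern.sub(replacement, path)
            return mapped.replace("\\", "/")  # fix any remaining separators
    return path  # no rule matched: leave untouched

print(map_path(r"x:\shared\assets\wood\albedo.png"))
# → /mnt/shared/assets/wood/albedo.png
```

Even if CR doesn't offer this, a pre-render script on each Linux node could rewrite paths along these lines.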
(We discovered the Lily Surface Scraper for fetching and setting up textures. It's awesome, but the relative-paths-to-textures handling seems to be broken either in the plugin or in Blender... the materials all went purple (missing textures) when we moved the directory containing the project (and all assets) to a network share. I just discovered the `File -> External Data` menu; I'll have to play with that later and see if it fixes things.)
Thanks in advance; I'm just stunned at what even an absolute beginner like me can achieve, especially with the new 2.8 GUI... but now I want it faster ;-)