
View Full Version : Pros and Cons of Rendering Realtime



gamedesign1
11-09-2017, 09:17 AM
Hi All,

I noticed there was a little bit of a discussion about realtime rendering on another thread and so I thought it might be good to have a thread dedicated to this.
I for one am very interested in moving some of my projects to realtime rendering, but feel there are some drawbacks. One thing I have noticed (and I might be wrong here) is I cannot output an animation with an alpha channel or materials/object masks, which limits what I can do in compositing. As I say I might be wrong here. I would really like to know all of your thoughts about the pros and cons of realtime rendering.

Thanks :)

Sensei
11-09-2017, 09:58 AM
What do you mean exactly by real-time?
30 FPS is real-time?
1 FPS is real-time?
It all depends on how big the scene is..
Even the fastest renderer will crawl on a really big scene..
Even the slowest renderer can render a simple box scene in real time..

prometheus
11-09-2017, 11:06 AM
Not even Layout's OpenGL can display and let you move around a highly detailed, multi-million-poly displaced terrain environment, nor can Blender's OpenGL. So what does the greatness of real-time "rendering" amount to?
And as I understand it, it isn't even an option for game engines to produce the level of detail needed for really realistic results; the techniques and power to do so don't seem to be there yet.

ianr
11-09-2017, 11:18 AM
This was from an old thread; check the vids for yourself:


Quote Originally Posted by MichaelT View Post
Yeah, Swarm isn't the quickest, but it is worth the effort in the end. Many companies are using it for a wide range of cases outside games these days.
When it comes to ILM, they used a heavily modified version of UE4. There is a video here:

https://www.polygon.com/2017/3/1/147...star-wars-k2so
Thank you for another piece of ILM info. M-T

We wouldn't have LW lens flares if Mr. Knoll hadn't coded them.

everything trickles down, sooner or later!

But, but,
it's the Unreal Sequencer that's blowing my skirts up ATM.

It's very creative in a hyper-edit sense, just what we need!
https://www.youtube.com/watch?v=_wKOTmcHI84 (runtime to about 12:30)

Kaptive
11-09-2017, 11:53 AM
The thing is, it really isn't one thing over another, ever. Pre-rendered is to realtime what acrylics are to pencils. It is a medium with a particular result. That might be simplifying it, but it is the same thing: a choice. Oats Studio have already used Unity for realtime output, so it is already a thing. The use of a game engine could be a statement in itself as a storytelling component. Adam E2: The Mirror looks good, but unconvincing. Or to put it another way, it looks live-rendered, so very game-like.


https://www.youtube.com/watch?v=R8NeB10INDo

I imagine that realtime rendering could come into its own when used for live feedback of an effect driven film etc. In fact, I'm pretty sure that is already out there. In some ways, LW already supports elements of that (virtual studio/studio live.)
This reminds me that I really should try that out properly, it's a feature I rarely think about but looked great fun in the demos.

gamedesign1
11-09-2017, 12:05 PM
What do you mean exactly by real-time?
30 FPS is real-time?
1 FPS is real-time?
It all depends on how big the scene is..
Even the fastest renderer will crawl on a really big scene..
Even the slowest renderer can render a simple box scene in real time..

Sorry, I mean using something like Unreal or Unity to output pre-rendered graphics. The fact that those renderers can output frames very fast compared to traditional rendering is still very useful, I think, even if the scene can't literally play at 30 fps in real time in the game engine. You can still output each frame to an image at a very fast rate.
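A back-of-envelope sketch of that trade-off (all timings assumed for illustration): even when a scene can't hold 30 fps live, exporting frames at a few seconds each still dwarfs typical offline render times.

```python
# Hypothetical numbers: a 30-second shot at 30 fps, exported from a
# game engine at ~2 s/frame versus an offline renderer at ~5 min/frame.
FPS = 30
SHOT_SECONDS = 30
frames = FPS * SHOT_SECONDS          # 900 frames

engine_sec_per_frame = 2             # assumed engine export rate
offline_sec_per_frame = 5 * 60       # assumed offline render rate

engine_hours = frames * engine_sec_per_frame / 3600
offline_hours = frames * offline_sec_per_frame / 3600

print(f"engine export:  {engine_hours:.1f} h")   # → 0.5 h
print(f"offline render: {offline_hours:.1f} h")  # → 75.0 h
```

Even if the per-frame rates are off by a factor of a few, the gap stays enormous.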

gamedesign1
11-09-2017, 12:20 PM
I found this company a while back making FilmEngine, real-time software aimed at people outputting for film. Still in development though.
http://www.filmengine.com/

CaptainMarlowe
11-09-2017, 12:31 PM
Actually, you have quite a few options for exporting for compositing in Unreal Engine : https://docs.unrealengine.com/latest/INT/Engine/Sequencer/Workflow/CustomRenderPass/index.html

gamedesign1
11-09-2017, 12:35 PM
Actually, you have quite a few options for exporting for compositing in Unreal Engine : https://docs.unrealengine.com/latest/INT/Engine/Sequencer/Workflow/CustomRenderPass/index.html

That's great, thanks for the link. I wasn't aware this was an option :)

OlaHaldor
11-09-2017, 01:11 PM
This might pique your interest. Animated in Maya; shaded, lit and rendered in Unreal. Here's an article about it (http://www.cartoonbrew.com/tools/one-animator-making-cg-series-unreal-engine-153377.html).
I think this looks fantastic!


https://youtu.be/_GpOM-4uA2Q


Basically: animate in whatever tool you like, export baked skeleton data to the engine and apply it to the skeletal mesh.
It might not be as easy as it sounds, but that's the core technique for bringing it over to Unreal.

Let's say you have a cape on a character. It can be simmed in real time, simplified, but still look great. Just set the rules for the cape and the wind.
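The "set the rules and let it sim" idea can be sketched in miniature: one cape point hanging from a fixed anchor, integrated with Verlet under gravity plus a constant wind, then projected back to its rest length each step. This is a toy illustration with made-up numbers, not Unreal's actual cloth solver (which is far more sophisticated).

```python
import math

anchor = (0.0, 0.0)
rest_len = 1.0
pos = (0.0, -1.0)     # start hanging straight down
prev = pos
gravity = (0.0, -9.8)
wind = (3.0, 0.0)     # constant sideways wind force (per unit mass)
dt = 1.0 / 30.0
damping = 0.96        # crude velocity damping so the sim settles

for _ in range(600):
    ax, ay = gravity[0] + wind[0], gravity[1] + wind[1]
    # Verlet step with damping on the implicit velocity
    nx = pos[0] + (pos[0] - prev[0]) * damping + ax * dt * dt
    ny = pos[1] + (pos[1] - prev[1]) * damping + ay * dt * dt
    prev, pos = pos, (nx, ny)
    # "The rule": project the point back onto its rest length
    dx, dy = pos[0] - anchor[0], pos[1] - anchor[1]
    d = math.hypot(dx, dy)
    pos = (dx / d * rest_len, dy / d * rest_len)

# The point settles where wind and gravity balance, swung off vertical.
```

A real cape adds many points, inter-point constraints, and collision, but the loop structure (integrate, then satisfy constraints) is the same idea.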

gamedesign1
11-09-2017, 02:02 PM
This might pique your interest. Animated in Maya; shaded, lit and rendered in Unreal. Here's an article about it (http://www.cartoonbrew.com/tools/one-animator-making-cg-series-unreal-engine-153377.html).
I think this looks fantastic!


https://youtu.be/_GpOM-4uA2Q


Basically: animate in whatever tool you like, export baked skeleton data to the engine and apply it to the skeletal mesh.
It might not be as easy as it sounds, but that's the core technique for bringing it over to Unreal.

Let's say you have a cape on a character. It can be simmed in real time, simplified, but still look great. Just set the rules for the cape and the wind.

Thanks for the link :)
Yeah I tried bringing animated meshes into Unreal a while back and it worked great.

Danner
11-09-2017, 02:35 PM
I'm rendering with Unreal as we speak; it's our third animation with it. I am still slower to final product than rendering with LW, but I think it'll get better. It's a joy to render at 4K and not worry about render times, but setting up materials and figuring out how to remove some artifacts has been very time-consuming. Gotta get back, the render is finished =P

OlaHaldor
11-09-2017, 03:10 PM
but setting up materials and figuring out how to remove some artifacts has been very time consuming.

Doing anything special with the materials?
What kind of artifacts?

Danner
11-09-2017, 05:43 PM
Well, the first artifact was a very strong shadow on the floor that would not go away no matter how much light I put there. In the end it was reflection sphere coverage. That bug made no sense, so it took a while to get rid of.

A couple of emissive materials refused to cast light onto the scene. I had to export those pieces as separate FBX files, and then they started to work again.

Some Radiosity "lightmass" noise on geometry edges. For that I modified the UV and upped the lightmap res.

I'm using a few master materials, many instances, and a few regular materials.

OlaHaldor
11-09-2017, 11:57 PM
Interesting. Sounds like you've got a good thing going :)

Danner
11-10-2017, 01:14 AM
If social life doesn't get in the way, I might do a little tutorial this weekend on a LW-to-Unreal workflow.

gamedesign1
11-10-2017, 04:04 AM
If social life doesn't get in the way, I might do a little tutorial this weekend on a LW-to-Unreal workflow.

I would be interested in seeing that :)

tyrot
11-10-2017, 04:40 AM
eagerly waiting

Norka
11-10-2017, 04:40 AM
https://home.otoy.com/render/brigade/

Brigade will be here before too long (I think as soon as Octane 4 comes out, in the coming months). I haven't seen whether Brigade will be part of the whole Unity/Octane deal though... In case you don't know what that is: Unity is going to be bundling a free version of Octane soon (I think limited to either one or two GPUs) for baking shite... even the free version of Unity will have this Octane Jr.

So stay tuned and keep an eye out for Brigade, which looks like it is going to be pretty damn special...

gamedesign1
11-10-2017, 06:21 AM
Yeah, I looked at some examples of Brigade; it looks impressive :)

mummyman
11-10-2017, 08:17 AM
I remember seeing this a while back... makes it look too easy: https://www.youtube.com/watch?v=SYU2l_OPXOQ

MichaelT
11-10-2017, 10:41 AM
The thing is, it really isn't one thing over another, ever. Pre-rendered is to realtime what acrylics are to pencils. It is a medium with a particular result. That might be simplifying it, but it is the same thing: a choice. Oats Studio have already used Unity for realtime output, so it is already a thing. The use of a game engine could be a statement in itself as a storytelling component. Adam E2: The Mirror looks good, but unconvincing. Or to put it another way, it looks live-rendered, so very game-like.

...

I imagine that realtime rendering could come into its own when used for live feedback of an effect driven film etc. In fact, I'm pretty sure that is already out there. In some ways, LW already supports elements of that (virtual studio/studio live.)
This reminds me that I really should try that out properly, it's a feature I rarely think about but looked great fun in the demos.

Interesting that you bring this video up. It is (for those who don't know) going to be a series of episodes, directed by Neill Blomkamp (hope I spelled his name correctly).

jwiede
11-10-2017, 08:21 PM
It is worth noting that, as advanced as real-time display engines have become, when it comes to compositing outputs, break-out passes, and other editing workflow integration, traditional render engines still offer much better capabilities. Real-time display engines still have a long way to go in terms of efficient integration with existing editing workflows and toolsets.

Unless and until support for conventional editing workflows and toolsets improves significantly, I just cannot see real-time display engines significantly displacing traditional render engines in mainstream media production (other than within a very limited niche).

gamedesign1
11-11-2017, 05:11 PM
It is worth noting that, as advanced as real-time display engines have become, when it comes to compositing outputs, break-out passes, and other editing workflow integration, traditional render engines still offer much better capabilities. Real-time display engines still have a long way to go in terms of efficient integration with existing editing workflows and toolsets.

Unless and until support for conventional editing workflows and toolsets improves significantly, I just cannot see real-time display engines significantly displacing traditional render engines in mainstream media production (other than within a very limited niche).

Yeah, I agree. I'm really not looking to get exactly the same quality as a raytraced renderer; I'm just looking for quality that is perfectly good enough for a children's TV series. I think the answer is for me to just go ahead and try it out :)

tyrot
11-12-2017, 12:01 AM
For me there is a big issue: I use TAFA for all my facial animation, and I think it is almost impossible for me to export that MDD data into a game engine... :( Maybe morph export can be used, but I don't know.

I agree with JW on this one too; somehow I feel like it is too early. Darkside animation put a huge question mark in my mind though... gotta admit.

Surrealist.
11-12-2017, 12:16 AM
Unreal can import Alembic. You can export that from LW.

Danner
11-12-2017, 07:49 AM
I can confirm that Alembic point cache data works in Unreal.

https://media.giphy.com/media/3o6fIQaSX1yVTj6sda/giphy.gif

tyrot
11-12-2017, 11:02 AM
hmmm that may change EVERYTHING!

rustythe1
11-12-2017, 03:15 PM
It is worth noting that, as advanced as real-time display engines have become, when it comes to compositing outputs, break-out passes, and other editing workflow integration, traditional render engines still offer much better capabilities. Real-time display engines still have a long way to go in terms of efficient integration with existing editing workflows and toolsets.

Unless and until support for conventional editing workflows and toolsets improves significantly, I just cannot see real-time display engines significantly displacing traditional render engines in mainstream media production (other than within a very limited niche).

Unreal Engine comes quite close to LightWave for output now though, with a fair few render output options: https://docs.unrealengine.com/latest/INT/Engine/Sequencer/Workflow/CustomRenderPass/ I've been getting into Unreal for a while. I had a scene that LightWave was starting to struggle with (18 million polys and a lot of textures), but guess what: Unreal swallowed it up and runs it at about 30 fps, and that was after also adding a shedload of instanced grass to replace textures. To be honest, I was astounded at how it handled the geometry and textures, some of which are 8000x8000!

tyrot
11-12-2017, 03:21 PM
any screenshot ?

rustythe1
11-12-2017, 03:52 PM
NDA sorry

jwiede
11-12-2017, 04:07 PM
Unreal engine comes quite close to lightwave for output now though

Unfortunately, Lightwave's current level of render pass support and compositor integration is also quite basic, so probably not the best "bar" to use for comparison.

[Since pic was apparently NDA, I've removed specific comments about it.]

I will say that editing needs go beyond simple "global" (iow, "everything-in-scene") color and component outputs in most cases; the need for object/object-group and surface/surface-group breakouts of the same (for stuff like fore/mid/back adjustments) has become quite commonplace. Passes like individual and group coverage maps, masks, etc. are also commonly required.

tyrot
11-12-2017, 04:10 PM
Actually, I tried the LightWave-to-Unity Alembic workflow myself. Everything is working fine; it is just like MDD import.

But I could not figure out smoothing yet.

Textures, UVs and MDD-based facial animation are all there :)

Danner
11-12-2017, 04:27 PM
The thing with post-processing when rendering with Unreal is that it becomes less necessary. For example: in post you want to make your glass more reflective, so you could have an ID pass and a reflection pass and apply the appropriate masks to adjust it. In Unreal you just tweak the material and export again; in minutes you have the updated sequence. And while you are at it, adjust the refraction index, something you can't do (easily) in post.

gamedesign1
11-12-2017, 04:31 PM
The thing with post-processing when rendering with Unreal is that it becomes less necessary. For example: in post you want to make your glass more reflective, so you could have an ID pass and a reflection pass and apply the appropriate masks to adjust it. In Unreal you just tweak the material and export again; in minutes you have the updated sequence. And while you are at it, adjust the refraction index, something you can't do (easily) in post.

That's a very good point.

jwiede
11-12-2017, 05:03 PM
The thing with post-processing when rendering with Unreal is that it becomes less necessary. For example: in post you want to make your glass more reflective, so you could have an ID pass and a reflection pass and apply the appropriate masks to adjust it. In Unreal you just tweak the material and export again; in minutes you have the updated sequence. And while you are at it, adjust the refraction index, something you can't do (easily) in post.

Which sounds simple, on the surface, but really means completely redesigning large-scale industries' entire media production workflow and pipeline (and all the checks and balances therein). Expecting that to occur rapidly does not seem... wise.

Also, post-editing small sections is inherently much lower-risk than completely regenerating larger sections of source content (for editing media, code, text, anything really). The existing pipelines are as much about risk management and accountability as anything else. Failing to account for and address those requirements makes adoption less likely, not more so.

For indies, small shops, etc. (aka, small, loose approval chains) the inertia against adoption of real-time display engines is small, so it'll happen quickly. For the large-scale production industries (aka long, hierarchical, mandatory approval chains), the inertia against adoption is much, much greater, and that will have a huge impact on rate of adoption, as it always has before.

Danner
11-13-2017, 03:45 AM
That's correct. I don't see Unreal, Eevee, Unity or any other realtime rendering solution replacing traditional rendering just yet, but it might happen down the line. In my particular case it is useful for certain types of clients and animations: arch-viz stuff, when render time is the bottleneck and you are expecting lots of changes and adjustments, for example. Render passes are doable in Unreal; you can export AO, metallic, opacity, roughness, specular, normals, etc. separately, but one sorely missing pass is material IDs. You can fake it by giving everything a custom base color and using that render pass output, but it's a manual thing unless you know how to use Blueprints, or can find one that does what you want.

tyrot
11-13-2017, 03:57 AM
I found a render-passes asset/plugin for Unity. I will seriously invest time in this workflow; for a children's TV series I think this can be a game changer.
The only issue right now is that Alembic exports with smoothing off in my case. Let's see.

Danner
11-13-2017, 04:38 AM
You can re-generate the smoothing groups on import; look in the import settings.
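For the curious, "re-generating smoothing" essentially boils down to recomputing each vertex normal as the normalized average of the normals of the faces that share it (importers typically also split by an angle threshold). A minimal sketch on a flat quad split into two triangles:

```python
import math

verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
tris  = [(0, 1, 2), (0, 2, 3)]   # two triangles sharing an edge

def face_normal(a, b, c):
    # Cross product of the two edge vectors (left unnormalized,
    # which conveniently area-weights the average below)
    ux, uy, uz = b[0]-a[0], b[1]-a[1], b[2]-a[2]
    vx, vy, vz = c[0]-a[0], c[1]-a[1], c[2]-a[2]
    return (uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx)

# Accumulate each face's normal onto its three vertices
vnormals = [(0.0, 0.0, 0.0)] * len(verts)
for ia, ib, ic in tris:
    n = face_normal(verts[ia], verts[ib], verts[ic])
    for i in (ia, ib, ic):
        vnormals[i] = tuple(a + b for a, b in zip(vnormals[i], n))

# Normalize to get the smoothed per-vertex normals
vnormals = [tuple(c / math.sqrt(sum(x * x for x in n)) for c in n)
            for n in vnormals]
```

Real importers keep hard edges hard via the angle threshold; the averaging step above is what produces the "smoothed" shading that gets lost when an exporter writes faceted normals.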

tyrot
11-13-2017, 01:57 PM
Did you try it in Unity? If you did, which version? In 2017.2, when I try to reimport with smoothing groups: instant crash!

Sensei
11-13-2017, 02:50 PM
I don't see Unreal, Evee, Unity or any other realtime rendering solution replacing traditional rendering just yet, but it might happen down the line.

If something can be rendered in 1/30 second, in real-time,
it'll look 1800 times better rendered in 1 minute... ;)

Surrealist.
11-13-2017, 08:58 PM
Well I beg to differ.

With art, which is what rendering is, you can't put numbers on it. It isn't the same as scripting or creating expressions, where exact math may need to be evaluated as explicit values.

Rendering is valued on many other objective and subjective levels than that. And one of the largest variables in achieving an artistic effect or result is iteration.

Iteration plays a huge part in the value and quality of a work of art.

This is why large studios give production lighting and material artists, and even animators, daily access to the render farm.

This is why real-time feedback on textures, displacements and so on has been developed, such as Viewport 2.0.

The average artist's productivity, and ability to create an effect, increases to the degree that the time it takes to get feedback on the overall process he is engaged in decreases.

The results are intangible.

While you might be able to put values on a pixel by pixel basis, purely scientifically or technically, the end result of a rendering is the result of much more than that.

And that is all aside from the overall turnaround time, and the enormous ramifications of render time on a frame-by-frame basis when you calculate it out for an animation.

The numbers are staggering.

You want to talk numbers. Pull out a calculator on that.
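Taking that invitation literally, a quick sketch (all numbers assumed for illustration) of what feedback time does to iteration count over one working day:

```python
WORKDAY_SEC = 8 * 3600   # one 8-hour artist day

# Hypothetical feedback times per look-dev tweak
for label, feedback_sec in [("real-time (1 s)",     1),
                            ("fast preview (30 s)", 30),
                            ("full frame (10 min)", 600)]:
    iters = WORKDAY_SEC // feedback_sec
    print(f"{label}: {iters} iterations/day")
```

The gap between tens of iterations and tens of thousands per day is the intangible being pointed at here: more tries, not just cheaper pixels.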

MichaelT
11-14-2017, 12:53 AM
Unreal Engine comes quite close to LightWave for output now though, with a fair few render output options: https://docs.unrealengine.com/latest/INT/Engine/Sequencer/Workflow/CustomRenderPass/ I've been getting into Unreal for a while. I had a scene that LightWave was starting to struggle with (18 million polys and a lot of textures), but guess what: Unreal swallowed it up and runs it at about 30 fps, and that was after also adding a shedload of instanced grass to replace textures. To be honest, I was astounded at how it handled the geometry and textures, some of which are 8000x8000!

I don't know how LW handles it now (let's see how it works in the upcoming version), but I would guess that, unlike UE4, LW doesn't ignore unseen data as aggressively. The size of the texture doesn't matter as much these days either; id Tech 6 (for instance) can use textures limited only by the amount of memory in the computer.

MichaelT
11-14-2017, 01:02 AM
That's correct. I don't see Unreal, Eevee, Unity or any other realtime rendering solution replacing traditional rendering just yet, but it might happen down the line. In my particular case it is useful for certain types of clients and animations: arch-viz stuff, when render time is the bottleneck and you are expecting lots of changes and adjustments, for example. Render passes are doable in Unreal; you can export AO, metallic, opacity, roughness, specular, normals, etc. separately, but one sorely missing pass is material IDs. You can fake it by giving everything a custom base color and using that render pass output, but it's a manual thing unless you know how to use Blueprints, or can find one that does what you want.

I'm guessing you are referring to this?: https://www.youtube.com/watch?v=-WfTr1-OBGQ&t=9s (the link to the source code is in the comments)

Surrealist.
11-15-2017, 07:51 PM
The Adam project is quite impressive:


In just five months, the Oats team produced in real-time what would normally take close to a year using traditional rendering. “This is the future of animated content,” declares CG supervisor Abhishek Joshi, who was CG lead on Divergent and Game of Thrones. “Coming from offline, ray-traced renders, the speed and interactivity has allowed us complete creative freedom and iteration speed unheard of with a non-RT workflow.”

And it clearly makes the case for real time as a currently viable option, for some projects.

https://unity.com/madewith/adam#the-project

gamedesign1
11-16-2017, 05:48 AM
The Adam project is quite impressive:



And clearly makes the case for real time, currently as a viable option - for some projects.

https://unity.com/madewith/adam#the-project

+1...

rustythe1
11-16-2017, 06:15 AM
I don't know how LW handles it now (let's see how it works in the upcoming version) but I would guess that unlike UE4, LW doesn't ignore data that can't be seen as aggressively. The size of the texture doesn't matter as much these days either. ID Tech 6 (for instance) can use textures only limited by the amount of memory in the computer.

Well, the upcoming version should fix things. In part 2 of the mesh engine blog he tells us how the duality of the old engines creates problems with memory and data handling. Actually, it's interesting going back and reading them, as you suddenly realize how much work had already been done at that early point in time; the blog also states Layout is now a more advanced version of ChronoSculpt, so I'm eager to try that out. I think it could push LightWave in the direction of Clarisse for large scene handling (although not quite as big), as LightWave has always been one of those "throw more hardware at it and keep it from dying" apps, whereas other, bigger apps I have found can just choke when files get over a certain size.

Dillon
11-16-2017, 06:40 AM
Hyper-realistic real-time rendering is coming much quicker than anyone here thinks. Brigade was mentioned, but it looks like Otoy is about to disrupt the entire 3D/film industry.

Curiously, LightWave is mentioned several times in this presentation by the CEO of Otoy. Recorded a few weeks ago.... ;)

https://www.youtube.com/watch?v=XumHObY8lJc

gamedesign1
11-16-2017, 06:57 AM
Rendering via GPU clouds is something that is very interesting. I would love to have my VPR or Octane viewport rendering this way. I remember seeing some examples of Octane Render Cloud, and it was almost instant given the number of GPUs it was tied to.

gamedesign1
11-16-2017, 07:00 AM
Less fussing with render times will open up creativity, I think, especially for people with smaller studios and budgets.

MichaelT
12-08-2017, 05:37 PM
There is a new Adam video released by Blomkamp: https://www.youtube.com/watch?v=tSDsi2ItktY

jwiede
12-11-2017, 10:17 AM
Hyper realistic real time rendering is coming much quicker than anyone here thinks. There was mention of Brigade. But it looks like Otoy is about to disrupt the entire 3d/film industry.

Hopefully they'll spend some effort first to sort out the whole "OctaneVR|Lite / ORC" mess they've created. They've been promising to do so since Sept., yet no tangible differences to date.