
View Full Version : RENDERING TIMES are a thing of the past :)



silviotoledo
03-02-2012, 06:57 AM
Realtime Renderings:


UNREAL ENGINE:
http://youtu.be/h5XahF-3DWo

CRY ENGINE 3: a realtime Vue!
http://youtu.be/1Kvl31g77Z8

:D:D:D:D:D:D:D

silviotoledo
03-02-2012, 07:03 AM
3D applications have a lot to learn from game engines, for sure!

http://youtu.be/ZqNHJ-ekMR4


Films will be done in Cry Engine at realtime speed soon :)

Ernest
03-04-2012, 07:36 PM
Konami's Fox engine:


Japanese game developer Konami is pretty proud of the true-to-life visuals that can be created using the Fox Engine, which the company is using to develop the next game in the Metal Gear Solid series. It's so proud of those visuals, in fact, that it's challenging Internet users to see if they can tell a real photograph of their office from a realtime rendering of the same location.
http://arstechnica.com/gaming/news/2012/03/can-you-tell-reality-from-konamis-fox-engine.ars

The modelers left a massive visual clue, but that's not the engine's fault.

MediaSig
03-04-2012, 10:15 PM
;)

silviotoledo
03-09-2012, 06:47 AM
:)

silviotoledo
03-09-2012, 07:19 AM
UNREAL ENGINE REALTIME

http://youtu.be/ttx959sUORY
http://youtu.be/hmT9N5dUqcM


Unreal Engine uses all the powerful tools inside the Nvidia card for dynamics and shading. Can't LightWave use them too?

Sensei
03-09-2012, 09:21 AM
Can't LightWave use them too?

No.

lardbros
03-09-2012, 09:36 AM
But it could utilise some cool OpenGL shaders, which it currently isn't doing, and also have real-time shadows and other cool things in the viewport like LW CORE had. I'm sure this will come soon enough!

Sensei
03-09-2012, 10:11 AM
IMHO it's a waste of time improving the OpenGL perspective viewport display if the user has VPR and uses it..

If they did spend the time and make it, users would complain that it doesn't work with all lights, just mimics point and distant and nothing else.. And everybody uses Area, Dome, IES or DP Lights..

Once I reported that translucency is not shown in OpenGL (and it's a million times easier to implement than shadows), and it was closed the same day..
https://fogbugz.newtek.com/default.asp?18042_9eft46a2

silviotoledo
03-09-2012, 10:17 AM
Maybe the way forward is using some of these GPU features to accelerate VPR rendering too.

Turbulence 4D already uses the GPU to accelerate calculations inside LightWave, which shows it's possible to have hybrid calculation in LightWave.

The quality of the atmospheric effects (dust, rain, fire, ...) and the DOF effects rendered in realtime is really good, and those would take a long time per frame in VPR.

And this http://youtu.be/lhoYLp8CtXI is coming :). Wow!

jasonwestmas
03-09-2012, 10:29 AM
These game scenes are highly optimized. They would work for a lot of situations until you wanted close-ups of your assets. When GPUs are waaay cheaper, then maybe we'll see more GPU rendering for high-resolution geometry. It's just that the general cost-to-performance ratio is still in favor of CPU rendering.

Sensei
03-09-2012, 10:36 AM
It has nothing to do with cost.
It's about writing everything from scratch.

Not to mention the GPU can work only with what it has in gfx memory, which is at most 1-2 GB (and where do the textures etc. go?!), whereas the CPU can work with e.g. 12-24 GB of physical memory in everybody's machine, not to mention virtual memory on the hard disk..

jasonwestmas
03-09-2012, 10:40 AM
It has nothing to do with cost.
It's about writing everything from scratch.

Not to mention the GPU can work only with what it has in gfx memory, which is at most 1-2 GB (and where do the textures etc. go?!), whereas the CPU can work with e.g. 12-24 GB of physical memory in everybody's machine, not to mention virtual memory on the hard disk..

I didn't mean just LightWave and other software rendering environments, but I get your point. But don't you suppose the GPU could be designed with far more video RAM? Hence the cost comment. =)

Sensei
03-09-2012, 10:45 AM
When the GPU has 12 GB, the CPU & MB will have 256 GB on board.. ;)
And everybody will be making 3d with 100 million polygons.

jasonwestmas
03-09-2012, 10:49 AM
When the GPU has 12 GB, the CPU & MB will have 256 GB on board.. ;)
And everybody will be making 3d with 100 million polygons.

lol, Awesome! :D

silviotoledo
03-09-2012, 10:54 AM
And do not forget the stereo. Damn it!

Actually, Nvidia has a 6 GB card that costs far too much. Maybe Microsoft will bring us this tech at a popular price like they did with the Kinect. Hopefully.

Anyway, I still think there will be a way to use both CPU and GPU capabilities together. The games market, which earns more money, will have the bigger say.

Developers will need to be more flexible about porting in everything they code from now on.

jasonwestmas
03-09-2012, 11:02 AM
. . .and maybe bucket rendering for GPUs!!! ;)

Sensei
03-09-2012, 11:34 AM
Bucket rendering is TOTALLY not the way GPUs render graphics..

Tasks the GPU could be used for right now, with only days or hours of development:
- generating sub-patches and Catmull-Clark sub-d (can be done by a 3rd-party developer as long as NewTek releases information about the algorithm)
- calculating smoothing and vertex normal vectors (see the sketch after this list)
- calculating kd-trees
- transforming objects from local coordinate space to world coordinate space (as long as no deformation is applied to each vertex, so no bones, no displacements etc.)
- image manipulation such as changing brightness, gamma, contrast, etc.
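
To make the vertex-normal item concrete, here's a minimal CPU-side sketch of area-weighted vertex normals. It's purely my own illustration (the Vec3 type and function names are made up, not LightWave SDK code); the point is that the per-triangle loop is independent work that maps naturally onto one GPU thread per triangle.

// Hypothetical sketch: area-weighted vertex normals.
#include <cmath>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 add(Vec3 a, Vec3 b)   { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return len > 0.0f ? Vec3{v.x / len, v.y / len, v.z / len} : v;
}

// points: vertex positions; tris: flat index list, 3 indices per triangle.
std::vector<Vec3> vertexNormals(const std::vector<Vec3>& points,
                                const std::vector<int>& tris)
{
    std::vector<Vec3> normals(points.size());
    for (size_t i = 0; i + 2 < tris.size(); i += 3) {
        const Vec3& a = points[tris[i]];
        const Vec3& b = points[tris[i + 1]];
        const Vec3& c = points[tris[i + 2]];
        Vec3 faceN = cross(sub(b, a), sub(c, a)); // length ~ 2x triangle area
        normals[tris[i]]     = add(normals[tris[i]],     faceN);
        normals[tris[i + 1]] = add(normals[tris[i + 1]], faceN);
        normals[tris[i + 2]] = add(normals[tris[i + 2]], faceN);
    }
    for (Vec3& n : normals) n = normalize(n); // average weighted by face area
    return normals;
}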

Dexter2999
03-09-2012, 11:37 AM
It has nothing to do with cost.
It's about writing everything from scratch.

Not to mention the GPU can work only with what it has in gfx memory, which is at most 1-2 GB (and where do the textures etc. go?!), whereas the CPU can work with e.g. 12-24 GB of physical memory in everybody's machine, not to mention virtual memory on the hard disk..

But GPUs could work wonderfully as long as users understand these limitations. GPUs for wireframe, solid shade, weight shade and wireframe-shaded modes are all feasible. And if the software supported GPUs in SLI... even better!

I think even more sophisticated modes could be workable, but things like texture and light replacement would be a necessary evil. The user would have to understand that to get the gains from using the card, they must also confine themselves to the limitations of the card.

And of course we know that people won't do that. They will only cry and complain when they can't do whatever they want to do. There will always be cries of "why can't it do this?" And these cries are what push developers to try to figure out solutions for demands.

jasonwestmas
03-09-2012, 11:44 AM
Bucket rendering is TOTALLY not the way GPUs render graphics..



True, but there might be other techniques to render GPU frames in pieces, hypothetically. I see crazy new ways of processing data come out of the shadows all the time. Not saying this is ever going to happen, just saying.

Sensei
03-09-2012, 11:53 AM
The whole point of bucket rendering is not that it renders a region from x0,y0 to x1,y1, but that data that isn't visible is never generated, and if it goes unused for long enough (rays don't fly through it), it is freed from memory.. Then when rays reach that geometry again, it's rebuilt and takes memory again. That's why bucket renderers don't need instances made the way HD_Instance or LW v11 instances are... If they run out of memory, some geometry that hasn't been used for a long time is cleaned up, memory is allocated for the new geometry, and the rays carry on through it..

The GPU must have EVERYTHING built and ready in gfx memory.. All games have "wait.. loading.." stages (don't you see how long such loading can take???), when the level data is transferred to gfx memory and stays there until the player leaves the level..
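
As a rough sketch of the lazy scheme described above (purely illustrative, not any renderer's actual code), a bucket renderer's geometry store can be thought of as an on-demand cache with least-recently-used eviction: a chunk is built only when a ray needs it and thrown away when the memory budget is exceeded. Names like LazyGeometryCache and build() are invented for the example.

// Illustrative sketch of on-demand geometry with LRU eviction.
#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>
#include <vector>

struct GeometryChunk {
    std::vector<float> triangles; // tessellated data, built on demand
    std::size_t bytes = 0;
};

class LazyGeometryCache {
public:
    explicit LazyGeometryCache(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Called whenever a ray enters the region owned by chunk `id`.
    const GeometryChunk& fetch(int id) {
        auto it = index_.find(id);
        if (it != index_.end()) {            // already resident: mark as recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return it->second->second;
        }
        GeometryChunk chunk = build(id);     // (re)tessellate just this chunk
        used_ += chunk.bytes;
        lru_.emplace_front(id, std::move(chunk));
        index_[id] = lru_.begin();
        evictIfNeeded();
        return lru_.front().second;
    }

private:
    GeometryChunk build(int id) {
        // Placeholder: a real renderer would subdivide/displace the source
        // geometry for this region here.
        GeometryChunk c;
        c.triangles.resize(1024, float(id));
        c.bytes = c.triangles.size() * sizeof(float);
        return c;
    }

    void evictIfNeeded() {
        while (used_ > budget_ && lru_.size() > 1) {
            auto& victim = lru_.back();      // least recently hit by rays
            used_ -= victim.second.bytes;
            index_.erase(victim.first);
            lru_.pop_back();
        }
    }

    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<std::pair<int, GeometryChunk>> lru_;
    std::unordered_map<int, std::list<std::pair<int, GeometryChunk>>::iterator> index_;
};

A GPU pipeline, by contrast, expects the whole level to be resident up front, which is exactly the "wait.. loading.." stage mentioned above.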

Lightwolf
03-09-2012, 11:54 AM
Bucket rendering is TOTALLY not the way GPUs render graphics..
Except for PowerVR GPUs (tiled rendering), which are currently ruling the mobile market in terms of performance...

Cheers,
Mike

Dexter2999
03-09-2012, 11:55 AM
jason- get your drool on..

http://www.youtube.com/watch?v=WlVlX5jX9AQ&feature=results_video&playnext=1&list=PL50489E552BD60B89

116 Tesla cards and the joy of CUDA

jasonwestmas
03-09-2012, 12:15 PM
jason- get your drool on..

http://www.youtube.com/watch?v=WlVlX5jX9AQ&feature=results_video&playnext=1&list=PL50489E552BD60B89

116 Tesla cards and the joy of CUDA

Now that's what I'm talking about! GPU clusters/render farm capabilities! Thanks.

jasonwestmas
03-09-2012, 12:17 PM
The whole point of bucket rendering is not that it renders a region from x0,y0 to x1,y1, but that data that isn't visible is never generated, and if it goes unused for long enough (rays don't fly through it), it is freed from memory.. Then when rays reach that geometry again, it's rebuilt and takes memory again. That's why bucket renderers don't need instances made the way HD_Instance or LW v11 instances are... If they run out of memory, some geometry that hasn't been used for a long time is cleaned up, memory is allocated for the new geometry, and the rays carry on through it..

The GPU must have EVERYTHING built and ready in gfx memory.. All games have "wait.. loading.." stages (don't you see how long such loading can take???), when the level data is transferred to gfx memory and stays there until the player leaves the level..

Well that sucks, we should redesign that approach. ;)

pat-lek
03-14-2012, 01:49 PM
Another engine:

http://unigine.com/products/heaven/

With a free version for personal use:

Editions
There are two editions of the benchmark:

Basic (for personal use; available for download for free)
Professional (for commercial use; features command line automation, reports, stress testing mode and technical support)

And it works on Mac (Mac OS X 10.7) (I am still on Mac OS X 10.6) and Linux.

Elmar Moelzer
03-14-2012, 02:14 PM
3D game engines are great for making 3D games. But don't be fooled: there is a whole army of artists working on these scenes, making sure they get the most out of models that still have limited numbers of polygons, shaders etc. Even a small node in LW's nodal surfacing will bring even the fastest GPU to a screeching halt if you implement it as a shader.
Also, the moment a scene renders in realtime on given hardware, a user will find a way to make a more demanding scene. Just look at the way scenes have grown since SeaQuest and Babylon 5.
It's just the way it is. Because of that, until we have hardware capable of rendering fully raytraced scenes with complex shaders at 60 FPS at 16 times HD resolution, I doubt we will ever see really satisfying "realtime rendering".

archijam
03-14-2012, 02:52 PM
game engines coming to archviz : http://lumion3d.com/

Elmar Moelzer
03-14-2012, 04:31 PM
game engines coming to archviz : http://lumion3d.com/
This is a very good example of why realtime rendering alone won't cover all your needs. Lumion does look good, especially if you stay outdoors, but it can't compete with rendering done with LW or Vray. That is, however, not what it intends to do. It is meant for realtime and interactive presentations, maybe even while working in front of the client.

lardbros
03-20-2012, 07:33 AM
Recently we went to the Crytek offices in Frankfurt, and one thing that came out of it was that the tool is great for creating awesome-looking real-time stuff... but we asked if there was an incidence-angle shader for making glossy pearlescent car paint, and they used to have one but it was FAR too slow! So even simple things like that have to be cheated. GREAT game engine, but as Elmar says, it is HEAVILY, HEAVILY optimised. I really wouldn't want to have to create separate shadow proxies, physics proxies, collision proxies and level-of-detail models, be limited to 16 materials on a single object, have only a single UV channel... the list goes on... but this is how we have to work when using the CryEngine! It's all VERY streamlined and massively optimised!

jasonwestmas
03-20-2012, 09:01 AM
Ah yes, the incidence angle / Fresnel thing. Yeah, absolutely necessary for reflective shading, even if you are just using spec and gloss.
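
For reference, the incidence-angle term being discussed is usually approximated with Schlick's formula, F = F0 + (1 - F0) * (1 - cos theta)^5. A tiny illustrative sketch, not CryEngine's or LightWave's actual shader code:

// Schlick approximation of the Fresnel (incidence-angle) term.
#include <algorithm>
#include <cmath>

// f0:       reflectance at normal incidence (e.g. ~0.04 for a dielectric clear coat)
// cosTheta: dot product of surface normal and view direction, clamped to [0, 1]
float schlickFresnel(float f0, float cosTheta)
{
    cosTheta = std::clamp(cosTheta, 0.0f, 1.0f);
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

At grazing angles cosTheta approaches 0, so reflectance climbs toward 1, which is what gives car paint its bright edge reflections and why the term is expensive to fake once it's gone.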

Elmar Moelzer
03-20-2012, 02:33 PM
I am actually wondering whether the trend will go back to software rendering in the mid term. There are some developments slowly creeping up that may bring at least some game rendering back to the CPU.

jasonwestmas
03-20-2012, 03:09 PM
I am actually wondering whether the trend will go back to software rendering in the mid term. There are some developments slowly creeping up that may bring at least some game rendering back to the CPU.

Yeah, I heard about that CPU versus GPU idea from a programmer I work with for Unity occasionally. I never gave myself the chance to bug him about what he meant by that. =)