crytek engine 2



scratch33
05-31-2007, 07:13 AM
This real-time engine is awesome.

http://www.gametrailers.com/player.php?id=19967&type=mov&pl=game

Come on NewTek, buy it and put it in a viewport to preview LightWave scenes. :D

Phil
05-31-2007, 07:25 AM
I just wish Vue could deliver this kind of thing. I begin to cry with frustration at its slow rendering and hideous crawlies in animations.

katsh
05-31-2007, 07:39 AM
This scene requires a very high spec, especially the DX10 version.

And there are a lot of sprite tricks in there that aren't as good as you think.
A sprite trick (we call it a "billboard") gives us nothing and has no meaning in a 3D app.
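(For reference, a billboard is just a textured quad that gets re-oriented every frame to face the camera. A rough C++ sketch with minimal made-up vector helpers, purely to illustrate the trick:)

```cpp
#include <cmath>

// Tiny vector helpers, just enough for the example.
struct Vec3 { float x, y, z; };
Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 add(Vec3 a, Vec3 b)   { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 mul(Vec3 a, float s)  { return {a.x * s, a.y * s, a.z * s}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
Vec3 normalize(Vec3 v) { float l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); return mul(v, 1.0f / l); }

// Build the four corners of a camera-facing quad ("billboard").
// Rebuilt every frame from the camera position, so the flat sprite
// always faces the viewer and fakes a 3D object.
void billboardCorners(Vec3 center, Vec3 cameraPos, float halfSize, Vec3 out[4]) {
    Vec3 toCam = normalize(sub(cameraPos, center));
    Vec3 right = normalize(cross(Vec3{0, 1, 0}, toCam)); // world up x view direction
    Vec3 up    = cross(toCam, right);
    out[0] = sub(sub(center, mul(right, halfSize)), mul(up, halfSize)); // bottom-left
    out[1] = sub(add(center, mul(right, halfSize)), mul(up, halfSize)); // bottom-right
    out[2] = add(add(center, mul(right, halfSize)), mul(up, halfSize)); // top-right
    out[3] = add(sub(center, mul(right, halfSize)), mul(up, halfSize)); // top-left
}
```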

Mattoo
05-31-2007, 09:09 AM
There are a lot of limits in a video game engine, and the artists have to work within those known boundaries. That's what makes them so fast.

I'm sure NewTek could get the LW OGL viewports to be just as fast, as long as you were happy to spend hours creating shadow volumes and LOD objects, give up sub-Ds, NURBS and splines, and stick to a texture budget of 128 MB and no more... basically, the list goes on.

Simply put, it's not as easy as it looks and the reason why it looks so good is because it took a lot of people a lot of effort with a very limited and bespoke engine.
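To give a feel for one of those chores, distance-based LOD selection is about as basic as it gets; a rough C++ sketch (the thresholds and triangle counts are made up, purely for illustration):

```cpp
#include <vector>

// One art asset authored at several detail levels, plus the distance
// beyond which each level may be used.
struct LodLevel {
    float minDistance;   // use this level at or beyond this camera distance
    int   triangleCount; // just for illustration
};

// Pick the coarsest level whose threshold the camera distance has passed.
// Levels are assumed to be sorted by ascending minDistance.
int selectLod(const std::vector<LodLevel>& levels, float cameraDistance) {
    int chosen = 0;
    for (int i = 0; i < static_cast<int>(levels.size()); ++i) {
        if (cameraDistance >= levels[i].minDistance)
            chosen = i;
    }
    return chosen;
}

// Example: a tree with three hand-made LODs and a final billboard stand-in.
// std::vector<LodLevel> tree = {{0.f, 5000}, {30.f, 1200}, {80.f, 300}, {200.f, 2}};
// int lod = selectLod(tree, distanceToCamera);
```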

oDDity
05-31-2007, 10:16 AM
Yes, but you have to admire the trickery and workarounds they're coming up with to get the appearance of rendered scenes in real time.
I suppose it'll all meet in the middle some day soon and we'll have actual realtime rendered viewports to work in, and only have to render for a few extra bells and whistles.
What scratch means is he'd like that sort of approximation of, say, the fire he was working on, even if it's only sprites in the view window, and you'd still have to render for the high-quality volumetrics.

Wickster
05-31-2007, 10:24 AM
The way GPUs are accelerating faster than CPUs, I'm sure we'll see some of that technology in our 3D apps someday.

Mattoo
05-31-2007, 10:35 AM
Well, even though I've left games for a little while now I'm still too close to it to be impressed. I probably remember the headaches a little too well.


The funny thing is, I also remember having to put together these BS videos. And this video is full of it: DX10 in and of itself will not allow you to mysteriously add more scene geometry to the foreground, for instance, or add extra meteors. Unless I'm very much mistaken and there are additional meteor pipelines available in the DX10 spec.

...lol.... just looking at it.... it's clear they've just removed crap from the DX9 version and left it in for the DX10.

Mattoo
05-31-2007, 10:39 AM
The way GPUs are accelerating faster than CPUs, I'm sure we'll see some of that technology in our 3D apps someday.


I am mildly perplexed we haven't seen a widespread use of GPU assisted rendering. SIGGRAPH papers demonstrating this capability have been coming out ever since the original GeForce. Various features to do with raytracing and GI can be done a lot quicker on the GPU, leaving the CPU to worry about shaders and whatnot.
I believe Mental Ray is preparing something along these lines. Will be interesting to see what the real benefit is.

Sande
05-31-2007, 11:30 AM
Fast GPU rendering would be nice (and we may even get that, if PIM lives up to its promises), but I would really, really like to see a realtime physics engine implemented.

The current physics system in LightWave may be more accurate in some conditions, but I've seen that in Max and Blender (which both have realtime physics engines) you often get better results faster, when your whole day isn't spent tweaking numerous settings and calculating simulation previews.

Matt
05-31-2007, 12:36 PM
Was there a train coming in that clip? ;)

Looks awesome, I bet it's a scary moment in the game when you come across that thing coming for ya!

Extent
05-31-2007, 07:08 PM
I am mildly perplexed we haven't seen a widespread use of GPU assisted rendering.

Well, I know the AGP bus has always been a limitation. Now that PCI Express is mainstream, I hope we see some more development in that direction.

hrgiger
05-31-2007, 10:07 PM
Was there a train coming in that clip? ;)



That's what I was thinking. That and they were really trying to copy the War of the Worlds alien feel.

The DX10 version did look nice, though.

Elmar Moelzer
06-01-2007, 01:25 AM
Our VoluMedic has realtime rendering of volumetrics in LightWave's OpenGL viewports.
It is nice, but it can't even remotely be compared to the quality of the software rendering.
Right now, I only see it as being useful for camera navigation and for setting up our "bounding box" bounding objects.
The resolution is rather limited too at the moment. You can fit a 32-bit 3D texture with 400x400x400 voxels into the texture memory of a 256 MB graphics card, and this only works on Nvidia cards, since ATI has a driver problem that causes crashes at non-power-of-two 3D-texture resolutions...
With these settings, 400 slices (that's basically 400 sprites stacked behind each other to give the illusion of depth) and complex shading on, my poor 6600 chokes along pretty badly.
Luckily these settings are not needed for previewing, but if you wanted to come even remotely close to the quality of our software rendering, you would need at least these settings, if not better. Our software rendering manages a nice 640x480 render with raytraced shadows and all the bells and whistles in just 7 seconds on the same machine, so at the moment hardware rendering is still a bit away, IMHO.
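(If you want to check those numbers, here's a quick back-of-the-envelope sketch, assuming 4 bytes per voxel:)

```cpp
#include <cstdio>

// Back-of-the-envelope check of the 3D-texture memory figure quoted above:
// a 400x400x400 volume at 32 bits (4 bytes) per voxel.
int main() {
    const long long dim = 400;
    const long long bytesPerVoxel = 4;                       // 32-bit voxels
    const long long bytes = dim * dim * dim * bytesPerVoxel; // 256,000,000 bytes
    const double mebibytes = bytes / (1024.0 * 1024.0);      // ~244 MiB

    std::printf("400^3 x 4 bytes = %lld bytes (~%.0f MiB)\n", bytes, mebibytes);
    // ~244 MiB, which is why 400^3 is about the practical ceiling
    // on a 256 MB card.
    return 0;
}
```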
Also, as has been said, game engines need a lot of faking and tweaking, and the scenes in these engines are still fairly low-poly. I actually hate the fact that everyone simulates higher-poly scenes with complex shaders and normal mapping and whatnot, but when you look closely the characters still have heads with 8 corners or so (when looked at from the side).
Personally I would rather see them handle more polys and do less with shaders, but for this even the most modern graphics cards don't seem to have the power yet (no matter how many millions of polys they claim to process per second).
Then of course deformations etc. have to be precalculated so they can be handled by the graphics card (otherwise the CPU has to do that anyway).
CU
Elmar

Red_Oddity
06-01-2007, 07:52 AM
This scene requires a very high spec, especially the DX10 version.

And there are a lot of sprite tricks in there that aren't as good as you think.
A sprite trick (we call it a "billboard") gives us nothing and has no meaning in a 3D app.

Actually, since as far back as the third patch for the FarCry engine, the foliage engine can be set to use the geometric instancing features of GPUs, so the trees in the background are actual 3D (though LOD) trees.
And I can tell you, these instanced trees look so much sweeter in the game than the heavily mipmapped sprite 'cardboard' LODs.
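(Roughly, the idea behind geometric instancing, sketched on the CPU side in C++; the actual GPU calls differ per API, so treat this purely as an illustration:)

```cpp
#include <cstdio>
#include <vector>

// Per-instance data for geometric instancing: the tree mesh itself is
// uploaded to the GPU once, and only this small record is kept per copy.
struct TreeInstance {
    float position[3];
    float rotationY;   // random yaw so the copies don't look identical
    float scale;
};

int main() {
    // Hypothetical forest: thousands of trees sharing one mesh.
    std::vector<TreeInstance> forest;
    for (int i = 0; i < 10000; ++i) {
        forest.push_back({{(i % 100) * 5.0f, 0.0f, (i / 100) * 5.0f},
                          (i * 37 % 360) * 1.0f,
                          0.9f + (i % 5) * 0.05f});
    }

    // Without instancing: 10,000 separate draw calls, one per tree.
    // With instancing: the instance array is uploaded as one buffer and the
    // shared tree mesh is drawn forest.size() times in a single call.
    std::printf("Instances: %zu, draw calls with instancing: 1\n", forest.size());
    return 0;
}
```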

Cageman
06-01-2007, 03:56 PM
I believe Mental Ray is preparing something along these lines. Will be interesting to see what the real benefit is.

There is a renderer called Gelato that uses the GPU for rendering, available for Maya. Never tried it, though.

Elmar Moelzer
06-01-2007, 05:39 PM
Not too many people are using Gelato, simply because it is not much, if any, faster than a highly optimized pure software renderer.
One reason is that reading back and forth between the graphics card and the CPU is still comparatively slow, so you have to try to avoid it. However, doing everything on the GPU is not that great either, due to hardware limitations. So I guess they end up shoveling data back and forth, which reduces the advantages of the added hardware rendering support quite a bit.
Personally I haven't seen a really nice Gelato rendering yet either.
So I am still kind of in the dark on what Gelato's real purpose is, other than being a very nice tech demo from Nvidia.
CU
Elmar