Improved Clip Mapping



Matt
05-07-2003, 02:25 AM
Ran into clip map limitations again today . . .

Can we please have clip maps defined using the Surface Editor rather than the object's render properties?

While you're at it, it would be much more useful if clips were true solid boolean clips (or at the very least faked, nothing real in this world is made up of thin sheets!) and it would be even more useful if you could animate them too!

Ta!

Lightwolf
05-07-2003, 02:36 AM
Hi Matt (again :) )
there is a tutorial somewhere for fake boolean clip maps.
Basically you do this:
Load your object, apply an animated clip map.
Take your object into Modeler, flip all the polygons, and apply a surface to all of them (this will be the fake 'inside' of your object). Set luminosity to 100% and diffuse to 0%, since it won't shade properly.
Take that into Layout as well and apply the same clip map.
Done.

You might even get away with the same object, including the flipped surfaces.

You will have to make sure that your object encloses a true volume, though.
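If you'd rather script the flipping step, here's a rough Python sketch of the same idea on a plain polygon list (made-up data structures, not the LW SDK): duplicate every polygon, reverse its vertex order so the normal points inward, and give the copies a separate 'inside' surface that you then set to 100% luminosity / 0% diffuse in Layout, with the same clip map applied to both surfaces.

    # Minimal sketch: build a fake "inside" for boolean-style clip mapping.
    # A polygon is a tuple of vertex indices; reversing the order flips its normal.

    def add_fake_inside(polygons, outside_surface="Outer", inside_surface="Inner_Clip"):
        """Return (polygon, surface name) pairs: the originals plus flipped copies."""
        result = [(poly, outside_surface) for poly in polygons]
        for poly in polygons:
            flipped = tuple(reversed(poly))           # flipped normal = fake inside wall
            result.append((flipped, inside_surface))  # gets the same clip map in Layout
        return result

    if __name__ == "__main__":
        quads = [(0, 1, 2, 3), (4, 5, 6, 7)]
        for poly, surf in add_fake_inside(quads):
            print(surf, poly)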

Cheers,
Mike

Matt
05-07-2003, 10:24 AM
I remember that cheat, Lightwolf, but true booleans would be nice; failing that, clip maps via surface settings will do!

Elmar Moelzer
05-07-2003, 12:24 PM
Matt, I am with you on that one. Same for displacement maps; they also belong in the surface properties, IMHO!
Booleans in Layout should be doable using LW's raytracing functions.
We have had an idea or two concerning that one.
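The raytracing route usually comes down to interval arithmetic along each ray; here's a toy Python sketch (nothing to do with LW's actual internals) of subtracting one solid's hit interval from another's:

    # Toy CSG subtraction along one ray: A minus B, using (enter, exit) hit distances.
    # A real raytracer would get these intervals from its intersection tests.

    def subtract_intervals(a, b):
        """Return the parts of interval a = (enter, exit) not covered by interval b."""
        a0, a1 = a
        b0, b1 = b
        if b1 <= a0 or b0 >= a1:      # no overlap: A is untouched
            return [a]
        pieces = []
        if b0 > a0:                   # front piece of A survives
            pieces.append((a0, b0))
        if b1 < a1:                   # back piece of A survives
            pieces.append((b1, a1))
        return pieces

    if __name__ == "__main__":
        # Ray enters object A at t=1 and exits at t=5; the cutter B spans t=2..3.
        print(subtract_intervals((1.0, 5.0), (2.0, 3.0)))   # -> [(1.0, 2.0), (3.0, 5.0)]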
CU
Elmar

Lightwolf
05-08-2003, 02:53 AM
Originally posted by Elmar Moelzer
Matt, I am with you on that one. Same for displacement maps; they also belong in the surface properties, IMHO!
Hm, I'm not sure about the displacement maps actually, what happens to vertices that are on surface boundaries, how should they be treated by the displacement?

Booleans in Layout should be doable using LW's raytracing functions.
We have had an idea or two concerning that one.

I bet you do :D

Panikos
05-08-2003, 03:50 AM
Displacement map per surface is not possible.

Displacement maps work per object.

What is possible is to have a parameter per surface that influences the amount of displacement, just like MotionDesigner allows per-surface control.

Additionally, there are some serious restrictions on displacement, like low-poly constraints, non-planar polys, etc.

If someone wants displacement per surface, he can use SockMonkey: split the main object into pieces that stitch back together, the way Fori's old PuppetMaster worked.
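That per-surface amount could be as simple as a weight table consulted while displacing. A rough Python sketch of the idea (made-up names and data structures, nothing from the SDK or MotionDesigner); what to do with vertices shared between surfaces is exactly the open question discussed further down the thread:

    # Sketch: scale a global displacement by a per-surface amount.
    # vertex_surface maps each vertex index to the surface it belongs to.

    def displace(points, normals, vertex_surface, amounts, height=1.0):
        """Push each point along its normal, scaled by its surface's amount."""
        out = []
        for i, (p, n) in enumerate(zip(points, normals)):
            w = amounts.get(vertex_surface[i], 1.0)
            out.append(tuple(pc + nc * height * w for pc, nc in zip(p, n)))
        return out

    if __name__ == "__main__":
        pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
        nrm = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
        vsurf = {0: "Rock", 1: "Skin"}
        print(displace(pts, nrm, vsurf, {"Rock": 1.0, "Skin": 0.2}))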

amorano
05-08-2003, 12:32 PM
Originally posted by Panikos
Displacement map per surface is not possible.



Only because of the current core architecture in LW. There are displacement handlers and there are surface handlers. Lameo.

As to how you would cross boundaries on surfaces: easy, leave it up to the user. Options like Fade Out Edges (falloff), or just displacing them anyway, even if it breaks the surrounding points away from the other, non-displaced surfaces, would be fine for me.
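For example, Fade Out Edges could be nothing more than a per-vertex weight that drops to zero wherever a vertex is shared with a non-displaced surface. A rough Python sketch of that policy (made-up data layout, not an actual LW option; a real version would probably ramp the weight over a few edge rings instead of cutting straight to zero):

    # Sketch: fade displacement to zero at surface boundaries.
    # vertex_surfaces maps each vertex index to the set of surface names using it.

    def boundary_weights(vertex_surfaces, displaced_surface):
        """1.0 for vertices used only by the displaced surface, 0.0 elsewhere."""
        weights = {}
        for vid, surfs in vertex_surfaces.items():
            if displaced_surface not in surfs:
                weights[vid] = 0.0        # vertex not on this surface at all
            elif len(surfs) == 1:
                weights[vid] = 1.0        # interior vertex: full displacement
            else:
                weights[vid] = 0.0        # boundary vertex: faded out
        return weights

    if __name__ == "__main__":
        vs = {0: {"Sea"}, 1: {"Sea", "Beach"}, 2: {"Beach"}}
        print(boundary_weights(vs, "Sea"))    # -> {0: 1.0, 1: 0.0, 2: 0.0}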

Displacement should be a surface parameter. It is technically an attribute OF an object, not the object itself.

Certain things should be looked at from an OOP perspective.

The old HAS and IS. HAS = attribute. IS = Object.

An object has a surface. An object has a displacement.

An object is not a surface. An object is not a displacement, although that last statement could be argued if LW were a volumetric modeler.

Sensei
05-08-2003, 01:55 PM
Originally posted by amorano
Displacement should be a surface parameter. It is technically an attribute OF an object, not the object itself.

You are completely wrong. Displacement handlers definitely can't be a surface parameter; they basically work on the vertices that an object has. A surface, in LightWave terminology, is how the final polygon will look (texture, color, bumps, etc.).

A displacement handler is a Vertex Shader, whereas surface and shader handlers are Pixel Shaders. All the 3D graphics card manufacturers, companies like Microsoft (DirectX), and also the OpenGL authors make a distinction between them. Do you think they are all wrong and you are correct? ;-)

Theoretically, a displacement handler could check whether the currently evaluated vertex is used by some surface (remember that it could be used by a couple of them at the same time!), but in practice this would badly slow down the whole rendering process (or, more precisely, the initialization stage), as you would have to scan the whole surface list as many times as your object has points. Senseless. Additionally, this feature would have to be built into every displacement handler.


Certain things should be looked at from an OOP perspective.

Agreed, not in the case of displacement handlers, but for others, yes. But unfortunately it's really too late for that in LightWave... Making LightWave a truly OOP-like program would be a much bigger step than the one between 5.6 and 6.0...


An object is not a displacement.

An object is the raw object (which you created in Modeler) after displacing all of its vertices...

amorano
05-08-2003, 04:48 PM
Originally posted by Sensei
You are completely wrong. Displacement handlers definitely can't be a surface parameter; they basically work on the vertices that an object has. A surface, in LightWave terminology, is how the final polygon will look (texture, color, bumps, etc.).

Suppose I have been writing plugins (not just for LW) wrong for the last three years then; hope no one notices.

Surfaces are attributes of an object. Period. Surfaces are not functions of the object, unless you are talking about NURBS or volumetric modelers, neither of which is in LW.

Be that color, diffusion, bump, etc. Now, displacement can go either way. Instead of making fifty objects, all sliced into 50 layers, so I can apply displacement to various sections, let me do this by surface so I can maintain one single object but still easily manipulate the surface parameters. And there is no reason to throw away global displacement either.




A displacement handler is a Vertex Shader, whereas surface and shader handlers are Pixel Shaders. All the 3D graphics card manufacturers, companies like Microsoft (DirectX), and also the OpenGL authors make a distinction between them. Do you think they are all wrong and you are correct? ;-)

Yes I do :D

Seriously though, that is MS's implementation. LW is a 3D application with a rendering pipeline for production value, not a video card with an end value of playing games.




...surface list as many times as your object has points. Senseless. Additionally, this feature would have to be built into every displacement handler.


Not correct. A binary search, plus sorting, with another binary tree as an index to the vertices on that surface: very fast. Of course, again, three years of having already done some of those kinds of implementations must make me wrong.

Not to mention, LW already does this. Ever notice how it knows how many polygons are attached to a surface in the surface editor? And the SDK also exposes this data. So, no reinventing anything here that is not already there.
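To spell it out: one pass over the polygon list at load time gives you a point-to-surface index, so nothing has to scan the surface list per point afterwards. A rough Python sketch of the idea (plain dictionaries, not the SDK):

    # Sketch: build a vertex -> surfaces index once, instead of scanning the
    # surface list for every vertex during displacement.

    from collections import defaultdict

    def build_vertex_index(polygons):
        """polygons: list of (surface_name, (vertex indices...)) pairs."""
        index = defaultdict(set)
        for surface, verts in polygons:
            for v in verts:
                index[v].add(surface)
        return index                     # one linear pass, built at initialization

    if __name__ == "__main__":
        polys = [("Rock", (0, 1, 2)), ("Skin", (2, 3, 4))]
        idx = build_vertex_index(polys)
        print(idx[2])                    # vertex 2 is shared by both surfaces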



Making LightWave a truly OOP-like program would be a much bigger step than the one between 5.6 and 6.0...


Not really. If you have any SDK experience, you'll realize that most of the core wouldn't be hard to port to an OO design right now. This is obvious in the structure that has been hacked together to allow external plugins to be made -- presented like a bastardized version of an OO program.

Sensei
05-09-2003, 01:31 AM
Originally posted by amorano
Suppose I have been writing plugins (not just for LW)

So do I, so do I. For at least the last two years, commercial ones only. Currently I have a team of experienced programmers who do it. And I myself have been writing a lot of 2D and 3D graphics software for the last 15 years as well, including animation & image-processing and ray-tracing software...


wrong for the last three years then; hope no one notices.

I didn't say you write your plug-ins wrong; I just said your thinking about where displacement handlers should be put is wrong...


Surfaces are attributes of an object. Period.

I didn't say otherwise.


Be that color, diffusion, bump, etc. Now, displacement can go either way. Instead of making fifty objects, all sliced into 50 layers, so I can apply displacement to various sections, let me do this by surface so I can maintain one single object but still easily manipulate the surface parameters. And there is no reason to throw away global displacement either.

If vertices had been attached (in an OO environment) to surfaces instead of objects since the first LightWave version, everything you said would be correct. But they are not, and they won't be.

Polygons could be attached to surfaces right now, because they can't have multiple surfaces at the same time. But they probably just have a pointer to a surface instead.


Seriously though, that is MS's implementation. LW is a 3D application with a rendering pipeline for production value, not a video card with an end value of playing games.

I am wondering whether you have experience writing software rasterizers... The rendering pipelines of ray-tracing and real-time 3D graphics (whether hardware-accelerated or software rasterizers) are very similar. The main difference, for now, is that (in the actual rendering stage) ray-tracing software casts rays into the scene, which then bounce a couple of times from one surface to another until a threshold is reached, a completely opaque surface is hit, or the ray flies off into space...

Displacement handlers (Vertex Shaders) are executed not in the actual rendering stage, but at the initialization stage, the same one where all the vertices are transformed, rotated, and scaled...

The actual rendering stage is for Shader handlers (Pixel Shaders).
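A toy version of that split, just to show the shape of it (ordinary Python, nothing LightWave-specific): the geometry stage runs once up front and actually moves vertices, while the shading stage runs later, per shaded sample, and never touches geometry.

    # Toy pipeline: the 'vertex shader' stage (displacement) runs once on the
    # geometry; the 'pixel shader' stage (surfacing) runs per shaded sample.

    def displacement_stage(points, normals, height=0.5):
        """Initialization time: move the actual geometry."""
        return [tuple(pc + nc * height for pc, nc in zip(p, n))
                for p, n in zip(points, normals)]

    def shading_stage(base_color, light_intensity):
        """Render time: decide how a hit point looks; geometry is already fixed."""
        return tuple(c * light_intensity for c in base_color)

    if __name__ == "__main__":
        print(displacement_stage([(0.0, 0.0, 0.0)], [(0.0, 1.0, 0.0)]))
        print(shading_stage((0.8, 0.4, 0.1), 0.5))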


Not correct. A binary search, plus sorting, with another binary tree as an index to the vertices on that surface: very fast. Of course, again, three years of having already done some of those kinds of implementations must make me wrong.

Now I am slowly starting to understand why you are complaining so much... ;) I would do the same all the time if I had been writing displacement handlers for three years that have to keep an internal list of the surfaces they do or do not operate on ;)


Not to mention, LW already does this. Ever notice how it knows how many polygons are attached to a surface in the surface editor? And the SDK also exposes this data. So, no reinventing anything here that is not already there.

A polygon is an essential part of a surface; without it the surface wouldn't even exist. But displacement handlers do not work on polygons, they work on vertices! Making LightWave displace all the vertices of a surface's polygons is really not a big deal (without searching, binary trees, or whatever other vertex-processing slowdown technique), but the problem would be what to do with those vertices that are attached to a couple of surfaces at the same time!
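If someone really wanted per-surface displacement, one possible (and debatable) policy for those shared vertices would be to blend the amounts of all the surfaces using them. A rough Python sketch, purely illustrative, not how the SDK works:

    # Sketch: resolve vertices shared by several surfaces by averaging their
    # per-surface displacement amounts (other policies: max, min, user choice).

    def vertex_amount(surfaces_of_vertex, amounts):
        """Average the displacement amounts of all surfaces using this vertex."""
        if not surfaces_of_vertex:
            return 0.0
        vals = [amounts.get(s, 0.0) for s in surfaces_of_vertex]
        return sum(vals) / len(vals)

    if __name__ == "__main__":
        amounts = {"Rock": 1.0, "Skin": 0.0}
        print(vertex_amount({"Rock"}, amounts))           # interior vertex -> 1.0
        print(vertex_amount({"Rock", "Skin"}, amounts))   # shared vertex   -> 0.5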