
View Full Version : Amazing Normal Displacement Links



policarpo
08-20-2003, 01:38 PM
This stuff is just too freaking cool.

Guess it's time to learn another application, boys and girls.

Image of armor at the bottom
http://www.cgtalk.com/showthread.php?s=&threadid=77779&perpage=15&pagenumber=10


Glen Southern's Bust
http://www.cgtalk.com/showthread.php?s=&threadid=77779&perpage=15&pagenumber=12

Aboriginal Man
http://www.cgtalk.com/showthread.php?s=&threadid=77779&perpage=15&pagenumber=19

A Dragon and LightWave Normal Maps
http://www.cgtalk.com/showthread.php?s=&threadid=79121&perpage=15&highlight=southern&pagenumber=5

He's pink and awesome
http://www.cgtalk.com/showthread.php?s=&threadid=77992

I am just stunned.

cheers!

WilliamVaughan
08-20-2003, 01:40 PM
Looks like it works great with LightWave!

xtrm3d
08-20-2003, 01:41 PM
Me too, so stunned..
Looks like I can forget about needing time to go deeper into Maya..
If ZBrush really works this nicely with LW, and if LW8 is as cool as it seemed to be at SIGGRAPH, then.. then... nothing should be able to stop my plan for world domination!!! :D

papou
08-20-2003, 01:46 PM
Saw that, I was stunned too. Still stunned.

Limbus
08-20-2003, 01:50 PM
Originally posted by proton
Looks like it works great with LightWave!
Does that mean we will get micropoly displacements in LW8? ;)

I have been thinking about getting ZBrush since the first examples of work done with version 1.6 were posted. Very nice.

Florian

policarpo
08-20-2003, 01:50 PM
Originally posted by xtrm3d
Me too, so stunned..
Looks like I can forget about needing time to go deeper into Maya..
If ZBrush really works this nicely with LW, and if LW8 is as cool as it seemed to be at SIGGRAPH, then.. then... nothing should be able to stop my plan for world domination!!! :D

Calm down, Jack.

Have some more vodka. :P

xtrm3d
08-20-2003, 02:30 PM
Hello Florian?
It's about time we met up.. you know, to get to know each other.. or?

Do you have ICQ.. ?

Limbus
08-20-2003, 02:34 PM
Right!
My ICQ is down below.

labuzz
08-20-2003, 03:45 PM
Quote:

Does that mean we will get micropoly displacements in LW8?



A REAL DISPLACEMENT in LW8? :rolleyes:

policarpo
08-20-2003, 03:54 PM
Originally posted by labuzz
Quote:

Does that mean we will get micropoly displacements in LW8?



A REAL DISPLACEMENT in LW8? :rolleyes:


um...have you used Normal Displacement from Marvin Landis yet? It helps us to achieve these results. I haven't tested it with fine details like this yet...but let's cross our fingers. :)

The workflow would be as follows:
1. Model low-res mesh in LightWave
2. Import into ZBrush
3. Subdivide the hell out of it
4. Paint in all the geometry deformations
5. Generate Displacement Map
6. Generate Normal Map
7. Open low-res mesh in Layout
8. Apply Normal Displacement plugin from Marvin Landis
9. Render and achieve bliss

Check out Glen Southern's bit on the dragon link up above.

Cheers,

Limbus
08-20-2003, 04:00 PM
Originally posted by policarpo
um...have you used Normal Displacement from Marvin Landis yet? It helps us to achieve these results.

Check out Glen Southern's bit on the dragon link up above.

Cheers,

Micropoly displacements would still be better because the polys would only be generated by the render engine, and it still looks better than normal mapping in close-ups.
I guess I was just hoping for some render improvements. ;)
The current workflow should work fine and still makes ZBrush a must-have app.

Florian

labuzz
08-20-2003, 04:03 PM
I know this already (on this topic I want to add that with the current limitations in the surface editor in LW it's a pain to combine some bump on top of this. And I have tried the Marvin Landis tools and Microwave).

I just want to know if there's some micropoly displacement in LW8 (à la RenderMan: no need to subdivide, no need to worry about the mesh topology, renders as fast as bump).
Thanks for the info.

EyesClosed
08-20-2003, 04:13 PM
LightWave needs a good implementation of micropolygon displacement for things to work properly.

Mattoo
08-20-2003, 04:18 PM
Originally posted by policarpo
um...have you used Normal Displacement from Marvin Landis yet? It helps us to achieve these results. I haven't tested it with fine details like this yet...but let's cross our fingers. :)

Cheers,

People keep getting Normal Displacement and Normal Maps mixed up. They are two different things. What you mean here is Normal Maps.
It's a similar effect but the techniques are quite different.

Sorry to be so nit-picky.

I too am looking forward to trying ZBrush again for this stuff. I've tried painting bump maps in DeepPaint3D and applying them as Normal Displacements in LW with some success - but too much trial and error for my liking.
Mostly the problems came down to dealing with UV seams and the fact that they don't displace - you just get big long lines throughout your model.

policarpo
08-20-2003, 04:25 PM
Originally posted by Mattoo
People keep getting Normal Displacement and Normal Maps mixed up. They are two different things. What you mean here is Normal Maps.
It's a similar effect but the techniques are quite different.

Sorry to be so nit-picky.

I too am looking forward to trying ZBrush again for this stuff. I've tried painting bump maps in DeepPaint3D and applying them as Normal Displacements in LW with some success - but too much trial and error for my liking.
Mostly the problems came down to dealing with UV seams and the fact that they don't displace - you just get big long lines throughout your model.


right on....thanks for the clarification. :)

Limbus
08-20-2003, 04:27 PM
Originally posted by Mattoo
People keep getting Normal Displacement and Normal Maps mixed up. They are two different things. What you mean here is Normal Maps.

Would you be so kind as to enlighten me on the differences?
Thanks

Florian

Mattoo
08-20-2003, 04:42 PM
Originally posted by Limbus
Originally posted by Mattoo
People keep getting Normal Displacement and Normal Maps mixed up. They are two different things. What you mean here is Normal Maps.

Would you be so kind as to enlighten me on the differences?
Thanks

Florian

Sure.

Normal Displacement is where the mesh is displaced in 3D relative to the average normal of each polygon, as defined by a greyscale image - i.e. actual geometry is moved to create the height. Currently it's hugely expensive on RAM to use this effectively in LW.

Normal Maps are a bit cleverer: without using many polys at all, they fake having loads of geometric detail. Normal Maps are generated from a very dense mesh and look like a bizarre rainbow-coloured bump map. This image can then be applied to a low poly version with a Normal Map shader, such as Marvin Landis'.
Think of Normal Maps as better bump maps, I suppose - i.e. if you look at the edges of the model the simplistic real geometry can be seen and the illusion is lost somewhat, but they fake this particular aspect much better than old-fashioned bump maps.
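
To make the distinction concrete, here's a toy Python/NumPy sketch - purely illustrative made-up data, nothing like LW's actual internals:

import numpy as np

# three vertices of a triangle, their averaged normals, and greyscale
# heights sampled from a displacement image (all hypothetical numbers)
verts   = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
height  = np.array([0.10, 0.25, 0.05])

# Normal Displacement: the geometry itself moves
displaced = verts + normals * height[:, None]

# Normal Mapping: geometry is untouched - only the normal used for
# shading is swapped, decoded here from one RGB texel of the map
texel = np.array([200.0, 128.0, 230.0])       # a pinkish normal-map pixel
shading_normal = texel / 255.0 * 2.0 - 1.0    # [0,255] -> [-1,1]
shading_normal /= np.linalg.norm(shading_normal)

The first version changes the silhouette because the points really move; the second only changes the lighting.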

jin choung
08-20-2003, 05:44 PM
howdy fellows,

ok, this is to even further clarify.

recently, the term NORMAL MAP has become all the rage because of the unique new techniques invented to GENERATE them and the prominence of DOOM 3 in the media. but it ends up being a sort of red herring that gets confused with a CRAPLOAD of other stuff, so let me reiterate the IDENTITY of these things, let you guys in on a free program, and tell you why some of the coolest effects are not possible in lw.

------------------------------------------------------------------------------------

NORMAL MAP - DOOM 3 - BUMP MAP :

a normal map is a RENDERING TRICK and is essentially a 'pre-digested' version of a bump map. it is being lauded these days because modern games are using them IN REAL TIME, but the technology is not new. it is essentially the bump map technology that has been in non-rt renderers like maya and lw for years and years. it does not disturb, move or create geometry. it just fools the lights into thinking there's detail where there is not.

it is 'pre-digested' in that the information that we have given lw as a greyscale has been converted into a 24bit color image, with x, y, and z perturbations of the surface getting encoded into 3 discrete channels. in lw and maya etc., this has probably been done after we give them the greyscale image, but internally (and transparently to the user) on the CPU. using these 24bit NMs in games just speeds up the calculation process.
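
to get a feel for that 'pre-digestion': the slopes of the greyscale get packed into rgb. a rough python sketch of the general idea (my own guess at it, certainly not lw's actual code):

import numpy as np

def bump_to_normal_map(height, strength=1.0):
    """pack a greyscale height field (2D float array, 0..1) into a
    tangent-space normal map stored as 8-bit rgb."""
    dx = np.gradient(height, axis=1) * strength        # slope in x
    dy = np.gradient(height, axis=0) * strength        # slope in y
    n = np.dstack([-dx, -dy, np.ones_like(height)])    # un-normalized normals
    n /= np.linalg.norm(n, axis=2, keepdims=True)      # make unit length
    return ((n + 1.0) * 0.5 * 255.0).astype(np.uint8)  # [-1,1] -> [0,255]

nm = bump_to_normal_map(np.random.rand(64, 64))  # a bizarre rainbow image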

another big breakthrough that is only TANGENTIALLY and COINCIDENTALLY related to normal maps is a new methodology of GENERATING them.

what i mean is that before, when we created bump maps, we simply went into photoshop and started drawing pimples and boils and wrinkles and such using greyscale values to determine the amount of 'in' or 'out'.

however, the advent of normal maps coincided with a method of ENCODING the data of a high res mesh to a lower one by means of a bump map.

so, instead of going into photoshop, you load up two models into a specialized app or into your 3d app with a plugin - the low res mesh is uv mapped. you run the special procedure and it COMPARES THE SURFACE DIFFERENCES of your low res mesh to your high res mesh and it writes those differences as a NORMAL MAP!

imo, this is the big revolution. not the normal maps themselves. NMs are an evolutionary optimization, not a big leap in thought.


NORMAL DISPLACEMENT MAP - THE LORD OF THE RINGS - DISPLACEMENT MAP:

here, lw is a bit more confusing with its terminology because it uses the term NORMAL DISPLACEMENT to describe what we've always known as simple displacement maps. the only difference is that now lw can do displacement according to a uv mapped greyscale map. but other apps have ALWAYS been doing this, so it's not really a new thing that lw came up with - just catching up.

i know lw calls it 'normal' displacement but just think of it as a displacement map and you're good. this does not refer at all to a normal map which we discussed above.

again, the big revolution with displacement maps is not the technology itself - we've had displacement maps for years and years and years in lw, maya etc.

some games DO use DMs for things like terrain but it's still not a big feature for real time applications.

again, the big deal with old displacement maps is this new way of ENCODING the data of a high res mesh into a lower res mesh by means of an image map.

in this case, it is just the same as generating a normal map - the specialized program compares the surface of the high poly mesh to the low poly and writes in the differences. but this time, instead of spitting out a 24bit normal map, it spits out an instantly recognizable greyscale map that is simply applied as a displacement map.
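
in toy form - reduced to height fields so it actually runs; real bakers ray-cast a low res MESH against a high res one, but the principle of 'record the difference per texel' is the same:

import numpy as np

def bake_difference(high_surface, low_surface, res=256):
    """toy 'baker': both surfaces are functions z = f(u, v) over the
    uv square; write their signed difference out as 8-bit greyscale."""
    u, v = np.meshgrid(np.linspace(0.0, 1.0, res), np.linspace(0.0, 1.0, res))
    diff = high_surface(u, v) - low_surface(u, v)  # signed distance per texel
    rng = float(np.ptp(diff)) or 1.0               # avoid divide-by-zero
    return ((diff - diff.min()) / rng * 255.0).astype(np.uint8)

# e.g. a smooth dome vs. the same dome plus sculpted wrinkles:
low  = lambda u, v: np.sin(np.pi * u) * np.sin(np.pi * v)
high = lambda u, v: low(u, v) + 0.05 * np.sin(40.0 * u) * np.sin(40.0 * v)
displacement_map = bake_difference(high, low)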

but there's one other difference-

see, as we all should know, a displacement map DOES NOT CREATE ANY NEW GEOMETRY!!!! so, in the case of the low poly mesh being 100 polys and the high poly being 1 million, if you take the resulting image map and map it onto the low poly, it's gonna look like utter crap.

SO - the other difference is that the low poly model in this example is a HIGHER ORDER SURFACE - that is, it is not just straight polys. it must be either a NURBS model as in LOTR (why oh why they decided to model characters in NURBS is beyond me.... really, in this day and age) or an SDS model.

thus, your 'low density' mesh still enjoys the advantages of having a 'light' CONTROL CAGE.... but it gains the additional benefit of maintaining all the detail that was sculpted in - which would otherwise only be possible if you FROZE INTO ULTRA HIGH DENSITY POLYS and then went at it.

so for LOTR, they created a nurbs model that is relatively light, with large smooth expanses for all the major details like arms, legs, etc. then, they save that as the 'low density mesh'. then, they start adding detail like crazy, doing anything they like - converting to polys or even booleaning, trimming, blending, etc. then, they take that as the 'high density mesh', run the specialized app and voilà!

see? now when you weight for deformation and stuff, you're still just doing it on the control cage, but heck if the model doesn't look like an ultra dense poly model!

of course, in order for the displacement to work well, the subdivision levels have to be turned up appropriately high when you're rendering.

MICROPOLY DISPLACEMENT

and now for this final bit of confusion.... i think the only thing that is meant by this is renderman's ability to intelligently, ADAPTIVELY subdivide a higher order surface at render time, and be aware of the render resolution!

further, the adaptive subdivision takes into account distance to camera + render resolution, and subdivides so that, if desired, a facet is the size of a single pixel or less. what that means is no visible facets - ever.

that goes for higher order surfaces. if you have a low density poly cage, it will render that just as it is.
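
a back-of-envelope version of that 'dice until a facet is about a pixel' rule, assuming a simple pinhole camera (my own illustration, not renderman's actual algorithm):

import math

def subdiv_levels(edge_len, distance, focal_px, max_px=1.0):
    """how many times an edge must be halved so that its projection
    is at most max_px pixels. focal_px = focal length in pixel units."""
    projected = edge_len * focal_px / distance  # projected size in pixels
    if projected <= max_px:
        return 0
    return math.ceil(math.log2(projected / max_px))

# a 1 m edge, 5 m from the camera, 1000 px focal length: it projects
# to 200 px, so it needs ceil(log2(200)) = 8 halvings
print(subdiv_levels(1.0, 5.0, 1000.0))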

also, renderman is aware of WHERE on a model more subdivision is required (on high frequency, high amplitude details), so if you apply a displacement map, you still get perfectly smooth, non-faceted renders.

compare this to most renderers (lw included), which simply TESSELLATE (turn everything to TRIANGLES) at render time, completely non-adaptively and non-intelligently. you determine a setting at which lw will subdivide, and everything on that model gets rendered at that setting.

oh, one other difference is that renderman's micropolys are quads... most others tessellate to triangles.

as a side note, it is possible to RASTERIZE DIRECTLY FROM THE HIGHER ORDER SURFACE into pixels - that is, never creating approximations like micropolys or triangles at all.

at least one nurbs app does this (realsoft i think it's called).

but the big downside is that it is much much much slower doing this. and with renderman's adaptive subdivision, it's hardly necessary.

WHY THE DISPLACEMENT TECHNIQUE IS USELESS TO LW:

sds uv map distortion. in order for this to be usable in lw, we would have to be able to make our final, low density sds model, lay out the uvs - then freeze the model to polys.

then, we take that frozen thing into the special displacement generator app and make it compare between the sculpted high density model and the non sculpted, simply frozen version of the low density sds cage.

[we freeze because most external apps do not accept SDS models of any kind from anywhere - cuz everybody does this differently, unlike nurbs.]

but, when the external app spits out the displacement map, we simply apply it to our low density sds model that has not been frozen. it works because the uvs are laid out exactly the same so the image map should fit.

but since we cannot use uv mapping on sds models without distortion, this is not possible for us.
------------------------------------------------------------------------------------

but if it were possible, we could use:

http://www.soclab.bth.se/practices/orb.html

a completely free app that supports .lwo!!! the author needs meshes for testing so hook him up - it's the cool thing to do.

it also generates normal maps for us, like marvin landis' excellent free app does.

the big difference between the two is that ORB will also generate a standard greyscale image that can be applied as a bump or displacement map.

------------------------------------------------------------------------------------

hopefully that has clarified and not confused.

thank you.

jin

policarpo
08-20-2003, 08:34 PM
ummm......

ok


:D


thanks for the breathy explanation.

guess our coders need to get to work. :)

dwburman
08-20-2003, 09:50 PM
Would I be way off in thinking that a Normal Map is like a Bump Map and a Normal Displacement is like our Bump Displacement?

Not quite the same thing, but related in similar ways?

jin choung
08-20-2003, 10:18 PM
i think that would be the gist.

but i think i liked my way of saying it better. :)

oh, but hold on now... that's right, lw's version is called 'bump displacement'... but it's described in the manual as displacing along a poly's normal.... so then, the actual term 'normal displacement' is not to be found in the lw docs....

and so, if normal displacement is not simply a descriptive, then i have never heard that term before....

but it's probably a descriptive.

jin

Mattoo at work
08-21-2003, 05:24 AM
Originally posted by dwburman
Would I be way off in thinking that a Normal Map is like a Bump Map and a Normal Displacement is like our Bump Displacement?

Not quite the same thing, but related in similar ways?

Funnily enough, Normal Displacement in LightWave is called Normal Displacement.
If you don't believe me, take a look in your Object Displacement list and look for a plugin called.... drumroll please - "LW_NormalDisplacement". :D

It's been there all along, but many aren't even aware of it. It works pretty well, but as I said, UV seams aren't handled very well - you can work around them, though, if you know where they are.

riki
08-22-2003, 07:08 AM
The results look pretty amazing. Did everyone see the test animation?

http://www.pixolator.com/zbc-bin/ultimatebb.cgi?ubb=get_topic&f=1&t=011231

The flesh/muscle structure doesn't seem to move much.

Red_Oddity
08-22-2003, 07:54 AM
All LightWave's Normal_Displacement does is, instead of moving points along a predefined axis (as with the normal/default displacement mapping), move the displacement along the normals of the polygons (or points, not sure...), nothing more...

So instead of pushing a point along e.g. the X axis, it moves the points along the normals of the object (probably averaged...)
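
The whole difference fits in two lines, really (a pseudo-NumPy sketch with made-up data, just to show the idea):

import numpy as np

points  = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
heights = np.array([0.1, 0.2, 0.3])

# default displacement: every point moves the same way (here +X)
p_axis = points + np.array([1.0, 0.0, 0.0]) * heights[:, None]
# Normal_Displacement: each point moves along its own averaged normal
p_norm = points + normals * heights[:, None]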

Another note: this displacement is the most useful of LW's displacements in my opinion. It is, however, very....veeeerrrryyyy...sloooooooowwwww....I guess it precalculates the averages or directions before you can start adjusting the texture you use...

here's an example:

EyesClosed
08-22-2003, 02:44 PM
We want real micropolygon displacement! :D

fortress
08-23-2003, 03:23 AM
I like the tree, Red_Oddity. Can you post a wireframe of that?

Original1
08-23-2003, 08:39 AM
What I really want to see within LightWave is a ZBrush-type window which allows you to paint on your object in the same way as ZBrush.

So you could start off, say, by surface-baking a procedural into one or two layers, using it as the base layer for your displacement, and then paint the detail in in real time.

I don't want to go to another outside application for my workflow.

wacom
08-23-2003, 09:12 AM
Originally posted by Red_Oddity
All LightWave's Normal_Displacement does is, instead of moving points along a predefined axis (as with the normal/default displacement mapping), move the displacement along the normals of the polygons (or points, not sure...), nothing more...



here's an example:

So if I get this right - what we need is a "merged" version of these tools, no? One that uses normal maps on a mesh except when the incidence angle of the normal reaches a certain point - then it references the normal displacement mesh/map. Is this what we're wanting...so that we get great detail without all of the render hits? That way the profiles of objects would get the mesh detail (and produce truer shadows) and the rest would as well...but without the mesh overhead, thanks to the normal map use...

It would be great if we could set this to have an incidence angle tolerance too that is adjustable over time (i.e. distance from camera)...maybe with a gradient?

hmm....would be nice...I'd take that over edge weighting any day! :)

Karmacop
08-23-2003, 09:25 AM
Hmm, you could almost do that now. When you use bump displacement it displaces the vertices along their normals instead of the x, y, z axes. So then you could alpha this with an incidence layer. Of course, I don't think this would change render times even slightly, as it's rendering essentially the same thing.
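
The incidence idea in sketch form (a hypothetical illustration of the blend, not how LW's gradient layers are actually implemented):

import numpy as np

def incidence_weight(normals, view_dir):
    """0 where the surface faces the camera head-on, 1 at grazing
    angles - i.e. keep the real displacement only near silhouettes,
    where a flat normal map would give the fake away."""
    v = np.asarray(view_dir, float)
    v /= np.linalg.norm(v)
    facing = np.abs(normals @ v)  # 1 = head-on, 0 = edge-on
    return 1.0 - facing

# heights faded by incidence before displacing along the normals:
# verts += normals * (heights * incidence_weight(normals, v))[:, None]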

wacom
08-23-2003, 10:36 AM
Originally posted by Karmacop
Hmm, you could almost do that now. When you use bump displacement it displaces the vertices along their normals instead of the x, y, z axes. So then you could alpha this with an incidence layer. Of course, I don't think this would change render times even slightly, as it's rendering essentially the same thing.

Yes...but NT would have to combine these options in a different way - that way there is a seamless blending of the two. It would also mean that, unlike the solution you stated above, it would only render the part of the object that needs displacement mapping...not just fade it out. We should try your workaround though...maybe...just maybe we could get something to work? The Normal_Displacement plug-in, though, is tricky to work with...
I'm going to try to use ORB or NormalMapper to get a normal map and a displacement map and then experiment with what you just mentioned above...I for some reason have faith that it'll work! It's too bad that clip maps aren't usable on a surface-by-surface basis though...

Karmacop
08-23-2003, 10:41 AM
Clip maps would be great if they were done by surface instead of by object :)

wacom
08-23-2003, 10:45 AM
...I guess for tests the high poly count displacement mesh will have to remain a proxy until render...or else LW will be crawling on my machine...If you could test your theory too and post your results, we might be able to get something going...

jin choung
08-23-2003, 03:41 PM
nonononononono

no merging is necessary!!! we don't need a 'normal map' for displacement!

if you use ORB, you can create a grey scale displacement map already.

and lightwave already DOES have 'normal displacement'!

so we're GOOD TO GO there!

BUT

the problem is that this only really is worthwhile on HIGHER ORDER SURFACES like nurbs or sds.

the problem is that we do not have uv mapping of SDS without distortion.

that is the single problem that we have that prevents this from being a useful tool for us.

jin

wacom
08-23-2003, 05:39 PM
Originally posted by jin choung
nonononononono

no merging is necessary!!! we don't need a 'normal map' for displacement!

if you use ORB, you can create a grey scale displacement map already.

and lightwave already DOES have 'normal displacement'!

so we're GOOD TO GO there!

BUT

the problem is that this only really is worthwhile on HIGHER ORDER SURFACES like nurbs or sds.

the problem is that we do not have uv mapping of SDS without distortion.

that is the single problem that we have that prevents this from being a useful tool for us.

jin

I didn't mean to "merge" normal maps and normal displacement maps. What I was getting at was that it would be nice if these two "plugins" were merged into a more helpful plugin with one GUI. Even if you fix the UV problem it would still be nice to have everything in one spot, working in unison. We DO need a normal map if you want everything at a certain incidence angle to be using a normal map and a low res mesh, AND a high res mesh with a normal displacement map for the other incidence angles. Isn't this where you'd save render times and still have a high level of detail? Normal Displacement maps alone can't cut render times, can they? Am I wrong here?

I'd also like this plugin to be able to use proxies for Sub-D objects, and standard proxy meshes for the normal displacement object, which could be set by camera distance etc.

Karmacop
08-23-2003, 06:38 PM
Jin, when you convert an RGB normal map into a greyscale bump map you lose a lot of information. 2/3 to be exact :p

jin choung
08-23-2003, 10:09 PM
howdy karmacop,

aha! the debate begins anew.... i initially thought that that must be the case as well but i've been doing some informal numbers in my head and i'm not so sure anymore.

first, let's state the obvious:

8bit = 256 discrete values

24bit = 8bits red x, 8bits green y, 8bits blue z = 16,777,216 discrete values

----------------------------------

ok, but all the normals facing AWAY from you are encoded in this as well. this would be the equivalent of generating normal maps in WORLD SPACE, which is not used for characters - again because any normal facing away is useless and will plunge that area into darkness.

normal maps for characters and such should be generated in TANGENT SPACE.

but i'm not sure how that adds up in the numbers... does it halve the 16 million? does it eliminate the y component altogether, or halve that to 4bits? actually, it may reduce all the rgb values to 4 (useful) bits each - so do you get 12bits?

also, if a normal is facing away such that it cannot be seen (all the normals on the 'other side' of the sphere, including the normals perpendicular to the normal coming straight toward you), those encodings are wasted too.

also, the 8bit depth map seems to be a compressed form of encoding x,y,z perturbation anyway... because if you generate displacement data from a heightmap, the x,y,z perturbation is implicit.

but whether that compression NECESSARILY results in less 'accuracy'... again, i'm not sure but it may not actually be true.

anyhoo, whatever the case may be, i am quite certain that it is NOT a comparison of 8bits vs. 24bits. but yah, what the actual numbers are, someone else has got to come up with that.
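
for what it's worth, the raw counts are easy to pin down even if the 'useful bits' question isn't. back-of-envelope only - my numbers, so take them with salt:

import math

print(2 ** 8)    # 256 values in an 8-bit height map
print(2 ** 24)   # 16,777,216 rgb triples in a 24-bit normal map
# but a decoded normal must be unit length, so the legal colors lie on
# a thin spherical SHELL inside the rgb cube, not the whole cube:
print(int(4 * math.pi * 127 ** 2))  # ~202,000 - ballpark of usable triples
# tangent space keeps only the outward hemisphere: halve again, ~100k,
# i.e. roughly 17 bits' worth - a lot less than 24, a lot more than 8.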

jin

jin choung
08-23-2003, 10:16 PM
hey wacom,

right right. the displacement map does NOT reduce render times. it optimizes your workflow in that you can work with a low density mesh during skinning, animating, etc while rendering a superdense one.

if you want to do LOD kinda stuff, normal mapping a low poly model would work great. but then again, so would just run-of-the-mill bump mapping. the key to NON RT stuff is that generally, you're just rendering something with inherently less detail anyway because of distance to camera or whatever... in that case, any kind of approximation ain't gonna be all that noticeable anyway.

jin

Karmacop
08-23-2003, 10:43 PM
They are called Object space and Tangent space. I've never heard anyone say object space shouldn't be used for characters; I think it's just faster to compute ...

Anyway, tangent space is still 24 bit colour, it's just higher detail as it maps rgb from -90 to +90 instead of -180 to +180.

Anyway, if you rendered something with a normal map (rgb), then converted it to a bump map (greyscale) and rendered that, you'd see a big difference.

I'll try to post some shots once my site comes back online ...

jin choung
08-23-2003, 11:04 PM
hey karmacop,

yup, i'm quite certain that if you convert a normal map to grey scale, it will not look as good in a render. i've actually seen that before in a newtek forum.

but what i do wonder about however is using one of these derivation tools - take a hi poly and low poly model, and then from here, generate both a normal map and a bump map.

then render.

would it look markedly worse you think?

as for why object space would not be useful... actually, i've tried generating both types and o-space is certainly not useful because the normals are facing away! that means it's always in shadow effectively.

also, in tangent space, it does indeed seem to reduce the data... the distinctive pale blue/green/magenta of tangent space is much reduced from the much broader selection of colors derived from an o-space normal map.

so yah, the image you get is still 24bit but some colors are just not being used anymore... effectively reducing the (again - USEFUL) bit depth.

jin

Karmacop
08-24-2003, 12:04 AM
Yes it'd look worse. You may think they'd look the same but they don't. Just think about it this way, if you could have a greyscale bump map that would render exactly like an rgb normal map you'd have a great way to compress images ;)

I'm not sure what you mean about object space .. sure, there are normals facing away, but when you move the camera to the other side of the object those normals would then be facing towards you ... :confused:

And I thought tangent space still used the entire spectrum of colour? Maybe I'm wrong ... dunno ...

jin choung
08-24-2003, 02:59 AM
like i wrote in my dissertation above, the primary difference with the normal map is that it seems to be PRE-DIGESTED.

that is, when lw renders bump maps in non real time, it probably creates a normal map or something like it internally as part of the process of calculating.

creating the normal map probably just saves calculation steps. i think their prevalence now in gaming is NOT THAT THEY ARE MORE ACCURATE (!) - it's that they're FASTER.

actually,

for the normals facing away, try creating a hi-res wrinkled sphere with a low poly version to map onto.

if you generate in 'object space' (actually, i've learned it as being WORLD SPACE), it will NOT render correctly for one side. think about it: even the normals that ARE facing you represent a VECTOR DEVIATION from 'standard'. what is that standard? it's probably calculated from the NORMAL OF THE POLY TO WHICH THAT PIXEL BELONGS (on the low poly model). if that is so, and a normal is designated as FACING AWAY FROM YOU, it is designating that condition AS CALCULATED FROM THE POLY TO WHICH THAT PIXEL BELONGS - at all times!

meaning it is ALWAYS FACING AWAY no matter how you may rotate the camera.

also, if you do this test with the sphere, you can CLEARLY SEE that the color range is different and that tangent space's color space IS SMALLER and clipped.

it is NOT 24bits of just more values between -90 and +90. the 24bits would have represented -180 to 180, and now that you've reduced that range by half, it is probably some percentage smaller than the 24bits.

it may in fact be 12bits or even less.

again, the math eludes me but i think that it is NOT CLEAR that a normal map is vastly superior in quality to a bump map.

jin

Karmacop
08-24-2003, 03:20 AM
Yes, LightWave makes a normal map from the bump map. It needs to do this so it knows which way each pixel is facing. The reason games haven't done this before is that a) most cards didn't support dot3 bump mapping, and b) normal maps take up much more memory than a bump map. It's for this reason that games either didn't have bump mapping or just used simple emboss bump mapping (like how Photoshop does its bevels).
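
"Dot3" is literally a dot product per pixel. A sketch of the Lambert term the hardware evaluates (illustrative only, with made-up inputs):

import numpy as np

def dot3_lambert(texel, light_dir):
    """Per-pixel 'dot3' lighting: decode the texel back into a unit
    normal and take N . L, clamped at zero."""
    n = np.asarray(texel, float) / 255.0 * 2.0 - 1.0
    n /= np.linalg.norm(n)
    l = np.asarray(light_dir, float)
    l /= np.linalg.norm(l)
    return max(0.0, float(n @ l))

# a 'flat' texel (128, 128, 255) lit head-on comes out at ~1.0:
print(dot3_lambert([128, 128, 255], [0.0, 0.0, 1.0]))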

Simply, tangent space normal maps are based on the polygon's normal (they use it as a base) and object space is based on world co-ordinates. If your object isn't rendering correctly then you should change whether you're using object or tangent space to render. They aren't compatible, so set up the shader properly.

Normal maps (24 bit) are superior to bump maps (8 bit) as long as they are created properly.
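
One quick sanity check on the bit counting from earlier, back-of-envelope only: halving the encodable range costs one bit, not half the bits.

print(2 ** 24 // 2)  # 8,388,608 = 2**23 - a halved 24-bit range
print(2 ** 12)       # 4,096 - what '12 bits' would actually mean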

jin choung
08-24-2003, 03:27 AM
again,

i think it's more complicated than you're making it out to be. but i guess it's a point that we'll have to disagree about, because neither of us has the underlying math involved to prove anything.

jin

p.s. as for changing from tangent space to object space in the renderer - I'VE NEVER EVER EVEN SEEN THAT AS AN OPTION! not in the lw normal map renderer from marvin landis, not in orb, and not in the ati package that marvin's plugin is based on.

Karmacop
08-24-2003, 04:07 AM
For the ATI program you run "normalmapper w (info about models, textures etc here)"

Karmacop
08-24-2003, 05:44 AM
Top row is the same as the bottom row, except the bottom is just a grey/white colour with a wireframe over the top.

From left to right they go: High res mesh (81920 polys), Low res mesh (320 polys) with a normal map (object space), Low res mesh (320 polys) with a bump map.

As you can see, the greyscale bump map shows the smaller detail nicely but lacks the larger detail that the normal map shows quite well.

riki
08-24-2003, 06:46 AM
So what's the verdict?? Is it worth buying?? Any major drawbacks or limitations??

jin choung
08-24-2003, 04:59 PM
hey karmacop,

nice work. but how was the bump map derived? was it a conversion of the normal map or did the generating app create a grey scale?

cuz as i said, i was pretty sure that converting a normal map to grey scale would result in an inferior quality bump map (which i've actually seen before as well).

and actually, if you are converting a normal map to grey scale, HOW are you doing that? simply DESATURATING would not be an appropriate conversion....

afaik, orb is the only thing that will give you a greyscale from the comparison procedure. (oh, microwave too, but that's ludicrously expensive).

thanks

jin

Karmacop
08-24-2003, 08:02 PM
Maybe I'll have to download ORB to prove it to you then? ;)

But no, since it was a sphere I put a null inside and then gave it a gradient colour surface from that null, starting from the object's lowest point and going to the object's highest point. That was the best way I could figure out to do a bump map ...

jin choung
08-24-2003, 09:12 PM
aha!

well, you needn't download it and do it for my sake, but creating the bump map in that way is indeed an inferior method, since it measures distance from the null rather than from the nearest poly of the low poly model.

anyhoo, rock on.

jin