Modulating Normal effect from procedurals

07-01-2010, 02:15 AM
I'm trying to use 3D procedural textures as normal maps, but I can't seem to tweak the result.
Basically I plug an FBM's Bump into Normal; the result is nice, but no parameter in FBM will change the result's intensity.
Since these are vectors, I figured I should mix them with other vectors in order to "flatten" them, but every setup I came up with didn't do the job.
Any help? Basically I'd like to build a setup to tweak the normal map's intensity, as we commonly do with Bump.


EDIT - if I plug Bump into Normal or into Bump (changing intensity properly in the latter case), I get exactly the same result. I wonder what this means: can't we use a procedural's normals as a Normal Map at all?

07-01-2010, 06:17 AM
Well, if you got the same result by plugging into the Normal and the Bump inputs, that's weird. But you can tweak the normal intensity by using the bump-to-normal node from Denis Pontonnier; that way the Bump Amplitude value of your bump texture will affect the normal.

07-01-2010, 06:47 AM
Not sure I understand this right, but why don't you just plug the Bump output into the Bump input and use Bump Amplitude in FBM? You can't use bump on normal, because they represent different things mathematically.

If you absolutely want to use Normal, then use the Bump Normal (converter node) from DP Kit.

The intensity in the preview (i.e. the LW renderer) changes bump intensity; maybe it's an FPrime thing that it isn't working?

07-01-2010, 07:09 AM
Mucus - thanks, I'll give it a look.
Myagi - that's what I wanted to know. Since the Bump output in procedurals is blue, I thought it was a vector output, so I figured I could use it in Normal. I'll also check the FPrime thing.


Captain Obvious
07-01-2010, 07:14 AM
Blue is not *normals*, it's *vectors*.

07-01-2010, 07:16 AM
Yes, vectors, and normals are vectors too, AFAIK.
If that's right, why can't I use one in place of the other?


07-01-2010, 07:26 AM
A vector is just a (float) triplet, but what it actually represents depends on the context.

A normal is a direction vector (of length 1) pointing in a direction, defined as (x,y,z). An unbumped normal would still be length 1, but pointing in the same direction as the (smoothed) face normal.

Bump is more like an x,y,z offset/perturbation of the normal, where (0,0,0) would be unbumped.

Apples and oranges, even if both are represented as a value triplet (a vector).
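A rough Python sketch of the distinction above (the names and values are mine, not LightWave's): a normal stays unit length, a bump is an offset, and applying the bump means perturbing the normal and re-normalizing.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0] / length, v[1] / length, v[2] / length)

# A (smoothed) surface normal: unit length, pointing straight "up" here.
base_normal = (0.0, 1.0, 0.0)

# A bump vector: an offset/perturbation of the normal; (0, 0, 0) = unbumped.
bump = (0.3, 0.0, 0.1)

# Applying bump: perturb the normal, then re-normalize so the result
# is a valid unit-length direction again.
bumped = normalize((base_normal[0] + bump[0],
                    base_normal[1] + bump[1],
                    base_normal[2] + bump[2]))
```

Scaling `bump` before adding it is exactly the kind of intensity control the original question was after; feeding the raw bump offset straight into Normal skips that step.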

07-01-2010, 07:28 AM
thanks, (slightly) clearer now ;)


07-01-2010, 07:31 AM
an additional example :)

just like a scalar is a single value, but it can represent a distance, an angle, an offset, a percentage etc.

a vector is just a 3-pack of scalars; what the values represent depends on the context: a position in space, an x,y,z offset, an HPB rotation, a unit-length direction, etc.
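To make the analogy concrete, here's a tiny sketch (my own toy example, not LightWave code) of the same triplet meaning different things depending on where it's used:

```python
# The same triplet of floats can mean different things by context:
triplet = (0.0, 90.0, 0.0)

position = triplet   # a point 90 units up the y axis
rotation = triplet   # heading 0, pitch 90, bank 0 (degrees)
offset   = triplet   # "move something 90 units up"

# Nothing in the data itself says which interpretation applies;
# only the slot you plug it into (Position, Rotation, Normal, ...)
# gives it meaning.
```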

07-03-2010, 12:16 AM
I always wondered about normals, as defined in 3D. The RGB only gives one set of coords, but a vector needs 2 sets. So do LightWave and other packages use the smoothed normal at the mapped pixel's position as the 2nd set?

07-03-2010, 06:16 AM
What do you mean by 2 sets?

A normal (a unit-length direction vector) is specified by 3 scalar values [x y z]; [1 0 0] would be pointing straight to the right. Which is why it works with Color: that's also 3 scalar values.
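Since normal components run from -1 to 1 but color channels run from 0 to 1, normal maps conventionally remap between the two ranges. A minimal sketch of that remapping (function names are mine):

```python
def normal_to_color(n):
    """Map a unit normal's components from [-1, 1] to RGB in [0, 1]."""
    return tuple(0.5 * c + 0.5 for c in n)

def color_to_normal(rgb):
    """Inverse mapping, as used when reading a normal map texture."""
    return tuple(2.0 * c - 1.0 for c in rgb)

# [1 0 0], a normal pointing straight right, encodes as (1.0, 0.5, 0.5).
right = (1.0, 0.0, 0.0)
assert normal_to_color(right) == (1.0, 0.5, 0.5)
```

This is also why typical tangent-space normal maps look mostly light blue: an unperturbed normal (0, 0, 1) encodes to (0.5, 0.5, 1.0).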

For all lighting calculations etc, the smoothed normal is used. If you do advanced node stuff and use the Spot info node, then you also have access to the unsmoothed face normal ("Geometric Normal"), but that's something rarely needed.

When you plug something into Normal, the smoothed normal generated from geometry is essentially replaced with the new normal you supply (from for example a normal map). Thus all lighting calcs etc. use that one, and you get normal mapping.
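The replacement described above can be sketched with a basic Lambert diffuse term (a generic shading formula, not LightWave's actual shader): whatever normal ends up in the Normal input is what the lighting math sees.

```python
def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def lambert(normal, light_dir):
    """Diffuse term: max(N . L, 0), both vectors unit length."""
    return max(0.0, dot(normal, light_dir))

light    = (0.0, 1.0, 0.0)        # light shining straight down
smoothed = (0.0, 1.0, 0.0)        # geometry's smoothed normal
mapped   = (0.0, 0.7071, 0.7071)  # normal supplied by a normal map

# Same point, same light; shading follows whichever normal is supplied:
flat_shading   = lambert(smoothed, light)  # fully lit
mapped_shading = lambert(mapped, light)    # dimmer, surface "tilted away"
```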