
View Full Version : Procedural texture not showing



LoveMetal
02-15-2015, 01:38 PM
Hi, I'm completely new to LightWave and I've run into an annoying problem. I assume the solution is pretty simple and I've missed something obvious...

- I create a 3D model,
- I go into surface editor and I select my surface,
- I create a 3D procedural texture with nodes for this surface.

But then, my texture is not showing on the model. Same if I don't use nodes and I create a 2D procedural texture or a gradient in the texture editor. Basic color and image map are working though.

[Attachment 127045]

So, any idea how to get this working?

Thank you in advance!

Version: LightWave Modeler 2015.1

Sensei
02-15-2015, 04:05 PM
Procedural textures and nodes do show in VPR in Layout.
Pick the VPR viewport render mode (the last one in the list).

spherical
02-15-2015, 04:07 PM
It is best to do texturing in Layout and either use VPR in the camera viewport or a series of F9 renders. This will show you what the texture really looks like, because it will be refined using the same process that outputs the final image. Anything in Modeler, whether using the Multitexture or GLSL shading method (the latter shows more), will only be a vague representation, if it shows up at all. Just remember that textures are stored in the model file, so once you are satisfied with a result, switch over to Modeler and save the updated model.

Sensei
02-15-2015, 04:10 PM
If you have a modern computer, turn off Draft Mode in the VPR options. It will be slower, but with better quality.

sukardi
02-15-2015, 06:04 PM
Yup, OpenGL does not show node textures (not just procedurals). Hope it gets addressed in future releases...

jwiede
02-16-2015, 01:00 AM
I had kind of hoped that, in adding the Cgfx support, they would add a shader able to access nodal surface buffers and display them, but no luck so far.

Besides creating some kind of "Cgfx intercept" node that captures channel contents before handing them to the (final) Surface node (which, IMO, is hacky), there doesn't seem to be a decent way for third parties to implement such a thing. LW3DG would either need to provide some kind of internal "proxies" pulling from Surface node buffers that third parties could reference in existing Cgfx shaders, or provide a private Cgfx shader with the internal capability to do the same.

They need to do something to address it. Having modeling, vmaps, UVs, etc. handled in Modeler, but then having to go to Layout for decent display of nodal surfaces, is not a very efficient way to work, IMO.

MSherak
02-16-2015, 03:43 AM
I had kind of hoped that, in adding the Cgfx support, they would add a shader able to access nodal surface buffers and display them, but no luck so far.

Besides creating some kind of "Cgfx intercept" node that captures channel contents before handing them to the (final) Surface node (which, IMO, is hacky), there doesn't seem to be a decent way for third parties to implement such a thing. LW3DG would either need to provide some kind of internal "proxies" pulling from Surface node buffers that third parties could reference in existing Cgfx shaders, or provide a private Cgfx shader with the internal capability to do the same.

They need to do something to address it. Having modeling, vmaps, UVs, etc. handled in Modeler, but then having to go to Layout for decent display of nodal surfaces, is not a very efficient way to work, IMO.

There is CGFX support as a plugin in LW. You can download any CGFX shader for your model and display it in the viewport. But here is the catch, and the limiting factor of CGFX.

You have to have the entire model UV-mapped. You are in a fixed system when using CGFX. You are limited to one CGFX shader per surface. The shader determines the textures that can be applied, and you can only apply one texture per channel. You are limited by what the code in the shader can do with your inputs, and not all shaders are created equal. And depending on whether you have an Nvidia or an ATI card, the shaders may or may not work; it also depends on whether the video card's driver supports them.

So basically, nodes would have to be baked to the UVs and then linked to the channels that the shader has as inputs. CGFX does not work with renderers, only with the video card. Have you ever noticed that you can only see certain procedurals in OpenGL? That is because OpenGL is limited too, which is why VPR was born: it lets everything be seen in the viewport. Programs that do PBR viewports are using a fixed system for display. CGFX is great for games because of that fixed system; not so great for a rendering package.
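To make that "fixed system" concrete, here is an illustrative sketch in Python. The class and every name in it are invented for this post (nothing to do with LW's actual plugin API): a CGFX-style shader declares a fixed set of channel inputs, and each channel accepts exactly one texture.

```python
# Illustrative model of a "fixed system" shader (invented names, not LW's API):
# the shader's code defines which channel inputs exist, and each channel
# can have exactly one texture bound to it.

class FixedShader:
    def __init__(self, name, channels):
        self.name = name
        self.channels = set(channels)   # inputs the shader's code defines
        self.bound = {}                 # channel -> the single bound texture

    def bind(self, channel, texture):
        if channel not in self.channels:
            raise ValueError(f"shader {self.name!r} has no {channel!r} input")
        if channel in self.bound:
            raise ValueError(f"{channel!r} already has a texture bound")
        self.bound[channel] = texture

shader = FixedShader("metal", ["color", "normal", "specular"])
shader.bind("color", "metal_color.png")   # OK: one texture per channel
# shader.bind("color", "second.png")      # would raise: channel already taken
# shader.bind("glow", "glow.png")         # would raise: shader has no such input
```

An open-ended node tree simply has no slot in a system like this: everything the shader can display has to be reduced to one image per declared channel first.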

-M

Sensei
02-16-2015, 06:59 AM
They need to do something to address it.

That is not doable in any sensible way.

The texture would have to be regenerated every time a single vertex is dragged by the user, and the slowdown would be unbearable for users.

Nodes can also change appearance when the camera changes position, which means every spin/move/zoom of the viewport.

LoveMetal
02-16-2015, 09:11 AM
Thank you everybody for your answers, I'll do my work in Modeler.
However, I need to export my model to FBX (for import into UE4), so is there a way to convert this procedural texture to an image map (and to do the same for the bump map)?

Sensei
02-16-2015, 09:18 AM
Thank you everybody for your answers, I'll do my work in Modeler.
However, I need to export my model to FBX (for import into UE4), so is there a way to convert this procedural texture to an image map (and to do the same for the bump map)?

The same problem was recently discussed in this thread
http://forums.newtek.com/showthread.php?145618-Converting-nodes-into-maps&p=1421437
You basically need to bake surfaces in Layout.
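For anyone wondering what baking actually does, here is a minimal sketch in Python. The `checker()` procedural and all the names are illustrative stand-ins, not LightWave's API: baking just evaluates the procedural at every UV sample and stores the results as pixels, which is what the Surface Baking Camera automates for a real node tree.

```python
# Minimal sketch of what "baking" means: evaluate a procedural at every
# UV sample and store the results as pixels. checker() stands in for an
# arbitrary node tree; it is NOT LightWave's API, just an illustration.

def checker(u, v, tiles=8):
    """Toy procedural: 1.0 or 0.0 in a checkerboard pattern over UV space."""
    return float((int(u * tiles) + int(v * tiles)) % 2)

def bake(procedure, size=64):
    """Sample `procedure` over a size x size UV grid into a flat pixel list."""
    pixels = []
    for row in range(size):
        v = (row + 0.5) / size            # sample at texel centers
        for col in range(size):
            u = (col + 0.5) / size
            pixels.append(procedure(u, v))
    return pixels

baked = bake(checker, size=64)            # 64 * 64 = 4096 samples
```

The resulting pixel grid is what gets written out as an image map; once saved, any engine that understands plain textures (UE4 included) can use it, even though it cannot evaluate the original node tree.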

jwiede
02-16-2015, 09:37 AM
That is not doable in any sensible way.

The texture would have to be regenerated every time a single vertex is dragged by the user, and the slowdown would be unbearable for users.

Nodes can also change appearance when the camera changes position, which means every spin/move/zoom of the viewport.

As with everyone else's viewport previews, not every single aspect needs to be supported, obviously. As for having to deal with new/moved geometry, that is a factor, but one that other packages deal with just fine while still managing to provide MUCH more detailed surfacing previews, and at much higher performance throwing polys around.

Sensei
02-16-2015, 10:12 AM
As with everyone else's viewport previews, not every single aspect needs to be supported, obviously.

If not every single aspect is supported, then 90% of people will be disappointed with such a half-baked feature, complaining on forums once again... With nodes, the output from one node goes to another, where it is processed any way that node wants. If one step is wrongly implemented, the whole tree will be displayed wrongly, showing an unreliable, nonsensical result...

Basically, you could have only the color channel of a UV-mapped texture supported at decent quality, and almost nothing else...

You can't have specularity: it depends on the camera and light source angles (what camera, what light sources, in Modeler?!)
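The specularity point can be shown in a few lines of Python. Below is a standard Blinn-Phong specular term (generic graphics math, not LightWave's shading code): changing nothing but the view direction changes the result, and the view direction is exactly the information Modeler does not have.

```python
# Why specular needs a camera: a standard Blinn-Phong specular term.
# Generic graphics math, not LightWave's shading code.

def normalize(v):
    mag = sum(c * c for c in v) ** 0.5
    return tuple(c / mag for c in v)

def blinn_phong_spec(normal, light_dir, view_dir, shininess=32):
    """Specular intensity for vectors given as 3-tuples."""
    n, l, vd = (normalize(x) for x in (normal, light_dir, view_dir))
    h = normalize(tuple(a + b for a, b in zip(l, vd)))   # half vector
    n_dot_h = max(0.0, sum(a * b for a, b in zip(n, h)))
    return n_dot_h ** shininess

n = (0.0, 0.0, 1.0)                                  # surface normal
l = (0.0, 0.0, 1.0)                                  # light straight on
head_on = blinn_phong_spec(n, l, (0.0, 0.0, 1.0))    # camera head-on
off_axis = blinn_phong_spec(n, l, (1.0, 0.0, 0.2))   # camera off to the side
# Only view_dir changed, yet head_on is 1.0 and off_axis is far smaller:
# without a camera position there is no meaningful value to display.
```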



As for having to deal with new/moved geometry, that is a factor, but one other pkgs deal with just fine and still manage to provide MUCH more detailed surfacing previews, and do so at much higher performance throwing polys around.

Then what is the problem with using that package instead of LW?

How can we know how it was implemented? Maybe they have procedural textures with UVs, who knows. Do they have nodes that can be connected any way the user wants, in trees? That's completely different from a layer system.
Or are their procedural textures implemented on the GPU?
Maybe they generate procedural textures up front for a given resolution, and then just blend the buffers together in a layer system, the way Photoshop blends layers in real time?

Another issue with such a display is having to build the UV mapping on the fly while the user is dragging points...

The procedure to display nodes in OpenGL would go like this:

- the user drags points: the mesh changes shape
- build a new dynamic UV map in real time
- allocate a 4096x4096 32-bit-per-pixel texture
- for each row and each column, call the node evaluation function: 16.7 million times
- each node evaluation returns, e.g., an RGB result
- fill in the texture, one texel at a time, 16.7 million times
(or 4.2 million for 2K maps, or 1 million for 1K maps)

Run the Surface Baking Camera with Width=Height=4096 to check how long you would have to wait for such a texture to be built, just for OpenGL to have decent quality...
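Those texel counts check out with simple arithmetic; here is a quick Python back-of-envelope (the one-microsecond-per-evaluation figure is an assumption for illustration, not a measurement of LW's node engine):

```python
# Back-of-envelope for the per-texel cost of evaluating a node tree.
# The 1 microsecond per evaluation is an assumed figure, not a measurement.

def bake_cost(size, seconds_per_eval=1e-6):
    evals = size * size               # one node-tree evaluation per texel
    return evals, evals * seconds_per_eval

for size in (1024, 2048, 4096):
    n, t = bake_cost(size)
    print(f"{size}x{size}: {n:,} evaluations, ~{t:.1f} s at 1 us each")
# 4096x4096 works out to 16,777,216 evaluations -- the "16.7 million"
# above -- and that is for a single redraw of a single surface.
```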

MSherak
02-16-2015, 11:10 AM
As with everyone else's viewport previews, not every single aspect needs to be supported, obviously. As for having to deal with new/moved geometry, that is a factor, but one that other packages deal with just fine while still managing to provide MUCH more detailed surfacing previews, and at much higher performance throwing polys around.

What other packages? I know Maya does not do this, even with Viewport 2.0. By the way, Viewport 2.0 in Maya is their CGFX version of the viewport. If one's model is not set up as described above, it's just another GL window that will look ugly if you don't adhere to the display inputs.

LoveMetal
02-17-2015, 09:12 AM
The same problem was recently discussed in this thread
http://forums.newtek.com/showthread.php?145618-Converting-nodes-into-maps&p=1421437
You basically need to bake surfaces in Layout.

Thank you, it's exactly what I was looking for!