View Full Version : PBM materials question



djwaterman
05-24-2015, 05:08 AM
I've just been scanning the Modo forums to read the buzz about the latest release. One of the things that has been mentioned is its new Physically Based Materials system, or PBM as it's becoming known. My question is, isn't this something we've always had? It's just that people choose not to set up their surfaces that way in a lot of cases. I'm assuming the Modo system is just something that takes away the user's ability to make the wrong choices. Is this correct, or is there something else about PBM shaders that I'm not getting?

stiff paper
05-24-2015, 10:12 AM
The whole PBR and PBM thing has come from games (which sounds like a bad thing, but really it isn't; the games side has put more thought into this kind of thing than the pre-rendered crowd has, because games still have a very strong impetus to make their graphics sing, even in real time). It's essentially a rethought way of approaching how surfaces and shaders work, one that adds consistency, ease of use, and results that are much closer to looking realistic.

These two guides will probably help:
https://www.allegorithmic.com/pbr-guide

Allegorithmic and Quixel both have apps that make painting textures and controlling materials much easier than it's ever been before. In order to properly and fully use the output from these and get all the benefits, a 3D package needs to have compatible shader(s) built in, which is probably what the Modo news was about.

JoePoe
05-24-2015, 12:44 PM
So.... are PBMs analogous to our "energy conserving" materials?

stiff paper
05-24-2015, 03:09 PM
Well, I'm not an expert. I don't even use any of this stuff at the moment. Having said that...

LW's energy conserving bumf is more a kind of early, very limited nod in the direction of what's being done now. Quixel has a setup that they use to photograph physical materials in a way that gives them all the relevant real world values for materials in their PBM/PBR surfacing pipeline. Take a look, the results are nice:
http://quixel.se/megascans

Allegorithmic's systems seem less reliant on always using outright photographic information and more about using the correct values for any specific material, which seems to make it more flexible, as far as I can tell.

Looking at the Quixel stuff makes me think PBMs are bound to become normal in archvis, because... well, why would anybody ever not. After that, it looks like it's the future of surfacing in general (for now anyway), and it's almost made the leap from real-time into pre-rendered.

Does LW need this? Yes, of course. But... well... yeah. Who knows when, really. It's impossible for the dev team not to know about it, so I assume it's in the plan. At some point. Probably. I expect.

djwaterman
05-24-2015, 11:03 PM
I guess I was saying, is anything stopping us setting up our materials this way right now just using the node system?

lightscape
05-24-2015, 11:41 PM
I guess I was saying, is anything stopping us setting up our materials this way right now just using the node system?

You can probably build some LW node networks to make use of the different types of texture maps that Substance or 3D Coat PBR export. Maybe...
Estimate the values needed to match the look of the viewport in those two apps and use VPR in Layout.
Even Octane's physically correct materials can't translate well to LightWave, so who knows.

But think about it like linear workflow in LW. We've been able to do a linear workflow since maybe LW 9, but think how much faster and easier it is to do now in LW 11. Would anyone want to go back to how LW 9 tried to do linear workflow?

For me the PBR viewport in Modo looks great. It's a boost in productivity when you're trying to get a consistent look, not just in Modo but in all the other apps.

stiff paper
05-25-2015, 07:48 AM
I guess I was saying, is anything stopping us setting up our materials this way right now just using the node system?
Short answer: yes, everything.

Long answer: yes, everything. It's a different shading model. LW would need a new shader written for it.


You can probably build some LW node networks to make use of the different types of texture maps that Substance or 3D Coat PBR export. Maybe...
Hmm. Yes. Maybe. At best.

Maybe one of the node gurus can whip up some crazy network to utilize the maps. But it's definitely a maybe, and I bet it would take eight months to render a single frame full of those "materials."

I'm not saying LW can't do nice materials that obey some of the tenets of real-world materials... but it's a one-off: work at it and work at it again until you've got something good. That's pretty much the horrific, oh-my-god-please-make-it-stop version of the new PBR/PBM workflows, which are all about speed, ease of use, and joyous results.

Surrealist.
05-25-2015, 11:06 AM
Also, it is just another way to arrive at the same thing. You can also export maps to use with a standard shading model: diffuse, spec, normal. The advantage here is that these maps get generated based on the material, so even if you use a "last gen" shading model it is still attempting to automate the process of giving you maps that represent the surface properties you would expect. In LightWave you could do pretty well with those maps even in the standard layer system. If a map is not giving you what you want, you can always adjust its overall value. And, all that said, of course you can also just plug in the PBR maps and have a play.
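
To make that concrete, here is a minimal Python sketch of the kind of per-texel conversion involved, assuming the usual metalness/roughness conventions (a dielectric base reflectance of roughly 0.04, gloss as inverted roughness). The names are just illustrative; this isn't Quixel's, Substance's or LightWave's actual math.

DIELECTRIC_F0 = 0.04  # typical base reflectance for non-metals (assumed convention)

def pbr_to_legacy(albedo, metalness, roughness):
    """Illustrative only: albedo is an (r, g, b) tuple in 0..1, the rest are 0..1."""
    # Metals have (almost) no diffuse term; their color tints the specular instead.
    diffuse = tuple(c * (1.0 - metalness) for c in albedo)
    specular = tuple(DIELECTRIC_F0 * (1.0 - metalness) + c * metalness
                     for c in albedo)
    gloss = 1.0 - roughness  # many older shaders expect glossiness rather than roughness
    return diffuse, specular, gloss

if __name__ == "__main__":
    # A rough, warm-colored metal texel.
    print(pbr_to_legacy((0.9, 0.6, 0.3), metalness=1.0, roughness=0.4))

In practice you would run that over whole image maps, but the per-channel math stays the same.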

As for a real-time workflow, that is another subject. A lot of render solutions are starting to adopt this kind of shading model. It is just another way to interpret a surface, and until it arrives in LW it is probably better not to bother trying to copy it so much as simply use your eyes to do what looks right - as usual.

lightscape
05-25-2015, 11:46 PM
I'm not saying LW can't do nice materials that obey some of the tenets of real-world materials... but it's a one-off: work at it and work at it again until you've got something good.

That's why I've been posting requests for Substance, 3D Coat and PBR support in LightWave. LW needs to move forward.
Modo added the new shading model, and the stuff they showed with the "Advanced Viewport" looks really good.

Surrealist.
05-29-2015, 04:10 AM
My interpretation of the intent behind the question was that it was more about shading approaches than real-time rendering. I see them as two different things; PBR or PBM is not intrinsically tied to real-time rendering. I do know that both Quixel and SD can render in real time with either a "last gen" shading model or the more recent approach that has lately been associated with real-time rendering, but they are separate. RenderMan uses a very similar approach with its shaders, where, for example, you can adjust the roughness value much as you would in a PBM.

Most of my clients are still using last gen, because in the apps we are porting to for them PBR has not arrived - yet. On the other hand, I do have a client for whom I am porting stuff into UE 4, and there is even a preset for that output in Quixel. So I often switch between shading models from client to client based on the output, and in the case of Quixel I can change the output type at any time. So in the event that a client asks for a PBR version of the maps, it would simply be a matter of switching the output type and exporting the maps needed.

I have used these maps to render in Blender, LightWave and Maya/Mental Ray as well. It is just a matter of knowing what you want to get out of it and using the maps accordingly based on how the shaders in your target render solution are designed to look.

And you can do this in LightWave with any texturing approach you choose, be it nodes or classic layers or both.

Netvudu
05-29-2015, 06:36 AM
True, Richard, but you cannot see that in real time in the viewport...and it should be possible.

Surrealist.
05-29-2015, 06:43 AM
Absolutely agree. But I don't see that this was the original question, as LW does not have a real-time viewport. And clearly no one was expecting to be able to do that with nodes - I imagine. I think it had more to do with getting the look of PBM.

And going back to the original post as I read it again: yes, you have flexibility with either system, meaning you don't have to be tied into any one way of doing anything. It is just another way to interpret the same thing. And with nodes, layers, the shader tree or whatever you are using, you can plug nodes/layers into the chain to control any aspect of it.

mummyman
05-29-2015, 08:28 AM
PBRs also have the ability to change the resolution of the map on the fly. Nice feature! You can embed custom channels. Very nice. (I can't say from experience, just from watching Substance Designer videos.)

Amurrell
05-29-2015, 05:45 PM
So.... are PBMs analogous to our "energy conserving" materials?

Actually, the key to PBR materials is energy conservation. The key difference in the new PBR setups is that they take most of the heavy work you would otherwise have to do and boil it down to four simple parameters: base color (diffuse/albedo), normal, metalness/spec, and roughness/gloss. The software then does the proper calculations, so you can just create without having to keep a bunch of rules in mind. So yes, they are energy-conserving materials, but easier to set up in most cases.
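
As a rough illustration of that energy-conservation idea, here is a small Python sketch (a sketch under assumptions, not any particular renderer's shader; the roughness-to-exponent mapping is just one common convenience): both lobes are derived from the same four inputs, and whatever the specular term claims is removed from the diffuse term.

def shade(base_color, metalness, roughness, n_dot_l, n_dot_h):
    """Illustrative single-light sketch; base_color is (r, g, b), the rest are 0..1."""
    # Specular color: ~4% gray for dielectrics, tinted by the base color for metals.
    f0 = tuple(0.04 * (1.0 - metalness) + c * metalness for c in base_color)
    # Map roughness to a Blinn-Phong exponent (a common convenience mapping, assumed here).
    exponent = 2.0 / (max(roughness, 1e-3) ** 4) - 2.0
    spec_strength = max(n_dot_h, 0.0) ** exponent
    specular = tuple(f * spec_strength for f in f0)
    # Energy conservation: the diffuse lobe only gets what the specular term leaves.
    diffuse = tuple(c * (1.0 - metalness) * (1.0 - f)
                    for c, f in zip(base_color, f0))
    lit = max(n_dot_l, 0.0)
    return tuple((d + s) * lit for d, s in zip(diffuse, specular))

if __name__ == "__main__":
    # A mid-gray dielectric and a gold-ish metal under the same light.
    print(shade((0.5, 0.5, 0.5), 0.0, 0.6, n_dot_l=0.8, n_dot_h=0.95))
    print(shade((1.0, 0.8, 0.4), 1.0, 0.3, n_dot_l=0.8, n_dot_h=0.95))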

MSherak
05-29-2015, 09:39 PM
So.... are PBMs analogous to our "energy conserving" materials?

It's basically adding Fresnel to viewport rendering with a mini rendering engine and shader. It's a mini-engine because you have to calculate the light-to-camera angles, and vice versa, for the shader every frame.


True, Richard, but you cannot see that in real time in the viewport...and it should be possible.

The problem is that most real-time shaders require DirectX to use all the bells and whistles, and of course that only works on Windows machines. You can have an OpenGL version; it just takes more knowledge of the shader code in the mini-engine. Substance has two engines, depending on your card.


PBRs also have the ability to change the resolution of the map on the fly. Nice feature! You can embed custom channels. Very nice. (I can't say from experience, just from watching Substance Designer videos.)

Substance Designer is nice, but at the root it is just a nodal system that bakes textures. Video cards can only use textures for real-time rendering, and any model that does not have UVs can't be used; projections take calculations that chew up rendering time. The tweaks are for adjustments during the baking; you will not find that these are real-time parameters.


Actually, the key to PBR materials is energy conservation. The key difference in the new PBR setups is that they take most of the heavy work you would otherwise have to do and boil it down to four simple parameters: base color (diffuse/albedo), normal, metalness/spec, and roughness/gloss. The software then does the proper calculations, so you can just create without having to keep a bunch of rules in mind. So yes, they are energy-conserving materials, but easier to set up in most cases.

They boil it down to mixing the textures based on the Fresnel angle between the lights and the camera. Really it's nothing more than an "if >= then" based on an angle. So a 0.5 would be a mix of diffuse to the camera between 90-0 and reflection of the lighting between 0-90. Just mix between them, since if you have more reflection you have less diffuse. The other settings, like gloss, roughness and the normal, are the extra mix that breaks up the smooth transition.
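
That angle-based mix is close in spirit to Schlick's Fresnel approximation. A small Python sketch, purely for illustration (the function names and the 0.04 default are assumptions, not anything from LW or Modo):

import math

def schlick_fresnel(cos_theta, f0=0.04):
    """Schlick's approximation: reflectance rises sharply toward grazing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def mix_diffuse_reflection(diffuse, reflection, view_angle_deg, f0=0.04):
    """Blend a diffuse sample and a reflection sample by viewing angle.
    0 degrees = looking straight at the surface, 90 = grazing."""
    cos_theta = math.cos(math.radians(view_angle_deg))
    k = schlick_fresnel(cos_theta, f0)
    # More reflection automatically means less diffuse, and vice versa.
    return tuple((1.0 - k) * d + k * r for d, r in zip(diffuse, reflection))

if __name__ == "__main__":
    gray, sky = (0.5, 0.5, 0.5), (0.9, 0.95, 1.0)
    for angle in (0, 45, 80, 89):
        print(angle, mix_diffuse_reflection(gray, sky, angle))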



Check this out: http://forums.newtek.com/showthread.php?146223-PBR-Nodal Look at the nodes carefully and you will see that this is only being plugged into the diffuse shading and the normal; all the reflections, color, spec, roughness and gloss are mixed in. Again, this only works in the non-camera viewports with VPR, because the camera settings work differently once looking through a simulated lens is taken into account (mainly in how normals are computed). There are no such adjustable settings for non-camera viewports.

-M