Liberty 3D (Ubercam)... lighting and textures go wonky



cyclopse
02-12-2017, 11:12 PM
I'm grateful for any help.

I was under the impression that the Liberty 3D Ubercam was a simple replacement for the stock cameras for rendering out an Oculus-style (stereo immersive) view. But as you can see from the renders, the lighting goes wonky, and so do the textures (the window, for example, becomes 100% reflective).

Anybody else using this plugin?

135975
(Perspective Camera)

135976
(Liberty3D Stereo Immersive Camera)

cyclopse
02-13-2017, 12:22 AM
UPDATE: Ubercam does NOT play well with radiosity. Also, it doesn't seem to play well with fiberFX. I can live with the radiosity problem (with groans of disappointment of course). But this 8000 frame hair simulation took me 3 weeks to get just right. Any suggestions would be AMAZING!

135977
(you can see the fibers in the upper right rendering as if the camera were a standard 4k x 4k perspective camera)

MichaelT
02-13-2017, 04:23 AM
Maybe these will help you:

http://forums.newtek.com/showthread.php?144903-Best-and-fastest-way-to-render-360-Panorama-in-Lightwave&p=1414521&viewfull=1#post1414521
http://www.peteryu.ca/tutorials/lightwave/advanced_camera_panoramas

cyclopse
02-13-2017, 04:43 AM
Maybe these will help you:

http://forums.newtek.com/showthread.php?144903-Best-and-fastest-way-to-render-360-Panorama-in-Lightwave&p=1414521&viewfull=1#post1414521
http://www.peteryu.ca/tutorials/lightwave/advanced_camera_panoramas

Thanks. Yes, if I were doing just straight panos, that'd work well. I'm trying to do stereoscopic panos for Oculus, though, which can't just be two side-by-side sphere cams: at the sides the two views collapse together, and looking behind you the eyes end up inverted.
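
A tiny geometric sketch of that problem (the interpupillary distance and the toy coordinate frame below are illustrative assumptions, not from the thread): for correct stereo at every heading, the eye offset has to rotate with the viewing direction, which is exactly what a fixed side-by-side camera pair cannot do.

import math

IPD = 0.065  # assumed interpupillary distance in metres (illustrative)

def eye_positions(heading_deg):
    """Left/right eye positions in the ground plane for a viewer at the origin
    looking along heading_deg; each eye sits IPD/2 to the side, perpendicular
    to the current view direction."""
    h = math.radians(heading_deg)
    view = (math.sin(h), math.cos(h))   # forward direction
    side = (view[1], -view[0])          # unit vector to the viewer's right
    right = (side[0] * IPD / 2, side[1] * IPD / 2)
    left = (-right[0], -right[1])
    return left, right

# Facing 0 deg the eyes split the way a fixed side-by-side pair does;
# facing 180 deg the correct positions are mirrored, so a fixed pair
# ends up with its eyes swapped there.
for heading in (0, 90, 180):
    print(heading, eye_positions(heading))

That rotating baseline is what the workarounds discussed below are trying to approximate.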

Niko3D
02-13-2017, 05:32 AM
I use the Advanced Camera method just because UberCam (or LightWave, I don't know exactly which) doesn't support radiosity, and for me rendering without GI is out of the question.
Advanced Camera works very well for me... ;) ...maybe it's a bit slower than UberCam, but you can use GI!

Danner
02-13-2017, 05:44 AM
Niko, besides the speed, Advanced Camera doesn't do stereoscopic. I figured out a workaround to do stereoscopic with Advanced Camera by rendering a slit, rotating the camera, then reconstructing the image, but it's slow and you can't animate it (well, you could... but you'd go mad).

cyclopse
02-13-2017, 06:00 AM
Niko, besides the speed, Advanced Camera doesn't do stereoscopic. I figured out a workaround to do stereoscopic with Advanced Camera by rendering a slit, rotating the camera, then reconstructing the image, but it's slow and you can't animate it (well, you could... but you'd go mad).

Exactly... yeah... this is an 8000 frame animation (music video with several camera angles)

Niko3D
02-13-2017, 06:14 AM
Niko, besides the speed, Advanced Camera doesn't do stereoscopic. I figured out a workaround to do stereoscopic with Advanced Camera by rendering a slit, rotating the camera, then reconstructing the image, but it's slow and you can't animate it (well, you could... but you'd go mad).

Sorry, maybe I didn't understand...
But I use Advanced Camera and I've done a lot of stereo images for Oculus VR... and it works fine...

- - - Updated - - -

You need to enable "Stereo"... and then save out the left and right eyes...

cyclopse
02-13-2017, 06:17 AM
Sorry, maybe I didn't understand...
But I use Advanced Camera and I've done a lot of stereo images for Oculus VR... and it works fine...

- - - Updated - - -

You need to enable "Stereo"... and then save out the left and right eyes...

Doesn't that just create two sphere cameras side by side? (Then the views converge at the sides and the eyes actually invert when looking behind you.) Or is it true stereo for all 360 degrees?

Niko3D
02-13-2017, 06:18 AM
Here's one as an example (still in progress)... 135979

Usually I render in high res, 6144x6144...

MichaelT
02-13-2017, 06:48 AM
Can't you set up two Advanced Cameras and parent them to a null, then adjust the distance between them, and have both of them target a single null that is also parented to the first null? After all, you only need two images taken a small distance apart.

cyclopse
02-13-2017, 06:53 AM
Here's one as an example (still in progress)... 135979

Usually I render in high res, 6144x6144...

It seems to be what I was afraid of... I'll have to check the image more closely, but when looking behind, the image seems to be inverted (which is why your eyes seem to twitch when looking back). Try this one, and look behind you... notice it still stays clear and you don't get that twitch (called brain shear). Here's a render from Ubercam:

135980

I'll double-check the image in Photoshop though... it could be the resolution, but my eyes twitched out on the lamp behind me in your image.

cyclopse
02-13-2017, 06:56 AM
Nope, I was wrong... it's not inverted... the convergence is just set too far away.

Interesting though... mind sending me some screenshots of your settings so I can try it out?

cyclopse
02-13-2017, 06:58 AM
Can't you set up two Advanced Cameras and parent them to a null, then adjust the distance between them, and have both of them target a single null that is also parented to the first null? After all, you only need two images taken a small distance apart.

No... for 360, the cameras need to be that distance apart at every viewing angle... not just straight ahead.

MichaelT
02-13-2017, 07:00 AM
No... for 360, the cameras need to be that distance apart at every viewing angle... not just straight ahead.

But they will be. You rotate the main null to move and rotate both cameras; they are rigged, and that null goes in the middle between the cameras.
And depending on how advanced (and realistic) you need it to be, you just build on that rig.

Danner
02-13-2017, 07:05 AM
Think about it this way: if you separate the cameras, it will look fine looking straight ahead, but looking to the sides, one camera would be behind the other instead of beside it, which is incorrect. So what I did was parent both cameras to a null and rotate them all the way around while only rendering a thin slice of both spherical images, then put all the slices back together to form two spherical images (using a script in AfterFX). And lastly I had to correct the top and bottom of each image by removing the stereo effect, because when you look down or up (with the pitch at -90 or +90) you could be looking at it with your eyes at any rotation in heading; unless you remove the stereo effect there, it will look wrong as well.
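
As a rough sketch of the reassembly half of that workflow, here is a short Python/Pillow version instead of an AfterFX script (the filenames, slice width, and output size are hypothetical placeholders; it assumes one full spherical render has already been saved per heading step, for each eye):

from PIL import Image

OUT_W, OUT_H = 4096, 2048   # final equirectangular size (2:1 aspect)
SLICE_W = 16                # width of the thin vertical strip kept per heading step
STEPS = OUT_W // SLICE_W    # number of heading steps rendered

def assemble(prefix):
    """Paste the centre strip of each per-heading render into one panorama."""
    pano = Image.new("RGB", (OUT_W, OUT_H))
    for i in range(STEPS):
        frame = Image.open(f"{prefix}_{i:04d}.png")  # full render at heading step i
        x0 = frame.width // 2 - SLICE_W // 2         # keep only the centre strip
        strip = frame.crop((x0, 0, x0 + SLICE_W, frame.height))
        pano.paste(strip, (i * SLICE_W, 0))          # column for this heading
    return pano

assemble("left").save("left_eye_pano.png")
assemble("right").save("right_eye_pano.png")

This only covers stitching the slices back together; the top/bottom correction described above would still have to be applied afterwards.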

Niko3D
02-13-2017, 07:22 AM
No... for 360, the cameras need to be that distance apart at every viewing angle... not just straight ahead.

Hmm... I don't understand.
My example is a 360 image... when you view it on Oculus VR it works fine, stereoscopic and 360... you can turn wherever you want and it's OK.
The angle and focus depend on where your camera is, and in LW's Stereo options you need to set up the convergence point properly. That changes from scene to scene...
BUT of course... I'm only talking about still images, not animation or real time with Unreal, for example...

My settings are very simple:
-Use Advanced Camera (the resolution has to have an aspect ratio of 2, so for example 1000x500, but for the final image I normally use 6144x3072)
-In Camera Properties choose Stereo
-The camera must have 0,0,0 as its rotation
-LW will render the left and right eyes
-In PS put the left image on top and the right image underneath (so you end up with a doubled image: 1000x1000, or 6144x6144 for the final image)

That's all!
135982
135981
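
For anyone who prefers to script that last over/under stacking step instead of doing it in Photoshop, a minimal Python/Pillow sketch (the filenames are placeholders, not from the thread):

from PIL import Image

left = Image.open("render_L.png")   # e.g. 6144x3072 equirectangular, left eye
right = Image.open("render_R.png")  # same size, right eye

stacked = Image.new("RGB", (left.width, left.height * 2))
stacked.paste(left, (0, 0))              # left eye on top
stacked.paste(right, (0, left.height))   # right eye underneath
stacked.save("over_under.png")           # square over/under output, e.g. 6144x6144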

cyclopse
02-13-2017, 07:26 AM
Hmm... I don't understand.
My example is a 360 image... when you view it on Oculus VR it works fine, stereoscopic and 360... you can turn wherever you want and it's OK.
The angle and focus depend on where your camera is, and in LW's Stereo options you need to set up the convergence point properly. That changes from scene to scene...
BUT of course... I'm only talking about still images, not animation or real time with Unreal, for example...

My settings are very simple:
-Use Advanced Camera (the resolution has to have an aspect ratio of 2, so for example 1000x500, but for the final image I normally use 6144x3072)
-In Camera Properties choose Stereo
-The camera must have 0,0,0 as its rotation
-LW will render the left and right eyes
-In PS put the left image on top and the right image underneath (so you end up with a doubled image: 1000x1000, or 6144x6144 for the final image)

That's all!
135982
135981

Thanks, I'll give it a shot (I really want GI... I went through a LOT of trouble to get it just right on this scene). As for FiberFX... I heard back from Liberty3D... you've got to use the same trick as VPR... volume only.

Danner
02-13-2017, 07:32 AM
I didn't try it with the stereo camera... I know it seems obvious, but I didn't. I'll have to test this out; it would be much easier than the messy, complex process I came up with.

Niko3D
02-13-2017, 07:33 AM
You're welcome... these settings work fine for me.
Just be careful to get all the settings right... in Advanced Camera, pay attention: don't use Spherical, use Cylinder...

Danner
02-13-2017, 08:22 AM
Doesn't work, too bad. It does render two images, but the stereoscopy is wrong; it is correct facing forward only.

Niko3D
02-13-2017, 08:27 AM
You need to select STEREO.

MichaelT
02-13-2017, 08:30 AM
Great :)

But just to point out... the two images came out just as I expected. :)

You can cross your eyes to see the stereo effect.

135985

And another:

135986

135987

cyclopse
02-13-2017, 09:42 AM
So here are the tests:

Ubercam (no GI) - 48m 28s:
135984

Advanced Camera (GI turned on) - 1hr 10m 35s:
135991

That technique works, and I'll use it on a less complex scene sometime... but the extra render time isn't worth it on this project...

Thank you though! I'm keepin' that one in my back pocket

Danner
02-13-2017, 10:04 AM
135992
One image was rendered using the stereo camera, the other with a different technique. (Ignore the other differences, like the lack of reflections.) Can you guess which image looks wrong in VR?

Niko3D
02-13-2017, 10:13 AM
You're welcome...
Sure, it depends from scene to scene... but in my job (archviz) I absolutely need GI... ;)

cyclopse
02-13-2017, 10:41 AM
135992
One image was rendered using the stereo camera, the other with a different technique. (Ignore the other differences, like the lack of reflections.) Can you guess which image looks wrong in VR?

That's a no-brainer... the top one. Total brain shear there.

- - - Updated - - -

I can totally see that. Archviz is completely dependent on being pretty above all else. With me (entertainment or sci-viz) it's more about being physically accurate (as in physics-accurate) and getting done as fast as possible (balanced with quality).

Niko3D
02-13-2017, 10:49 AM
For best results you just need to find your settings in the Stereo panel... just play with it a bit...

MichaelT
02-13-2017, 11:56 AM
135992
One image was rendered using the stereo camera, the other with a different technique. (Ignore the other differences, like the lack of reflections.) Can you guess which image looks wrong in VR?

The point being? If you want to point out that the 'other method' is wrong... then a good tactic would be to mention the 'other method'. Otherwise all you're doing is creating a straw man. If you're implying that the method I'm using has that upper effect, then you are mistaken.

Danner
02-13-2017, 12:20 PM
It's not a matter of being mistaken; I'm showing what I get when using your method, and I can't figure out what I'm doing wrong. When I enable stereo it works as if I had just moved the camera to the side and rendered the sphere again, so the 3D is incorrect. I already mentioned my other method and it's a mess, so I'm trying to avoid it. Although I did come up with something better that is very promising; I'll do some tests tomorrow to make sure it works and will share it.

cyclopse
02-13-2017, 12:39 PM
It's not a matter of being mistaken; I'm showing what I get when using your method, and I can't figure out what I'm doing wrong. When I enable stereo it works as if I had just moved the camera to the side and rendered the sphere again, so the 3D is incorrect. I already mentioned my other method and it's a mess, so I'm trying to avoid it. Although I did come up with something better that is very promising; I'll do some tests tomorrow to make sure it works and will share it.

Did you mimic all the settings he posted? Because it worked fine for me (the only difference was that I set the convergence to 1m where he didn't have it set at all). And I checked the full 360 in my Oculus. The only differences between his and Ubercam were radiosity and a very long render time.

MichaelT
02-13-2017, 12:44 PM
It's not a matter of being mistaken; I'm showing what I get when using your method, and I can't figure out what I'm doing wrong. When I enable stereo it works as if I had just moved the camera to the side and rendered the sphere again, so the 3D is incorrect. I already mentioned my other method and it's a mess, so I'm trying to avoid it. Although I did come up with something better that is very promising; I'll do some tests tomorrow to make sure it works and will share it.

OK... because it is basically the same thing as using stereo rendering anyway, so it is better to use that. My method, although it gives more control, is also more tiresome: you have to make sure the rig is working properly, not twisting the cameras, etc. Using the Stereo settings solves that, and you only need one camera. So I am a bit curious about what you feel is wrong, as I don't really know what you're after. In any case, looking forward to seeing what you come up with :)

MichaelT
02-13-2017, 12:54 PM
Did you mimic all the settings he posted? Because it worked fine for me (the only difference was that I set the convergence to 1m where he didn't have it set at all). And I checked the full 360 in my Oculus. The only differences between his and Ubercam were radiosity and a very long render time.

Speaking of which... it is strange that your render takes such a long time. The scene doesn't (to me at least) look that complicated?

cyclopse
02-13-2017, 01:16 PM
Speaking of which... it is strange that your render takes such a long time. The scene doesn't (to me at least) look that complicated?

There are dynamics, NevronMotion, 25 lights, and a full-sized Earth with 1px=1m day, night (lights on), cloud, and haze layers. Also, the trusses for the lights are way too detailed (I kept them from a studio design I did that had to be accurate down to the bolt). There's still some optimization that needs to happen on polys. And almost every surface is reflective, subsurface (carbon fiber), etc. A lot of little things. At 1080p, the scene renders in 10 minutes per frame. At 4096x4096 for VR (two renders) it drags out. Also, it depends on how close I am to the robot (not only is it Nevron, but it's lip-synced with mocap to the song as well... tons of bones and morph targets, and lots of fibers). As I said... a lot of little things add up.

MichaelT
02-13-2017, 01:46 PM
There are dynamics, NevronMotion, 25 lights, and a full-sized Earth with 1px=1m day, night (lights on), cloud, and haze layers. Also, the trusses for the lights are way too detailed (I kept them from a studio design I did that had to be accurate down to the bolt). There's still some optimization that needs to happen on polys. And almost every surface is reflective, subsurface (carbon fiber), etc. A lot of little things. At 1080p, the scene renders in 10 minutes per frame. At 4096x4096 for VR (two renders) it drags out. Also, it depends on how close I am to the robot (not only is it Nevron, but it's lip-synced with mocap to the song as well... tons of bones and morph targets, and lots of fibers). As I said... a lot of little things add up.

:) Which is why I asked. But OK, I understand. What kind of setup are you rendering on? Because I'm thinking that (depending on your funds, of course) at some point it might be a good idea to get a few more CPUs to help out. Or GPUs, whichever suits your fancy :) Since close to an hour a frame is really a lot... and costs a lot too.

jwiede
02-13-2017, 04:45 PM
UPDATE: Ubercam does NOT play well with radiosity. Also, it doesn't seem to play well with fiberFX. I can live with the radiosity problem (with groans of disappointment of course).

Can you clarify what problem you're seeing between Ubercam and GI/radiosity? Is it confined solely to a specific type of 3D camera or such? I've used Ubercam with GI many times without problems (albeit on Mac), so I'm a bit surprised to hear about Ubercam having GI/radiosity problems.

cyclopse
02-13-2017, 08:51 PM
:) Which is why I asked. But OK, I understand. What kind of setup are you rendering on? Because I'm thinking that (depending on your funds, of course) at some point it might be a good idea to get a few more CPUs to help out. Or GPUs, whichever suits your fancy :) Since close to an hour a frame is really a lot... and costs a lot too.

I have an i7 5690X OC'd to 4.0GHz (octacore), 32GB, and a Titan X (it's a good machine)... but render time isn't all that much of a concern... I have a farm with 48 cores of Xeon power.


Can you clarify what problem you're seeing between Ubercam and GI/radiosity? Is it confined solely to a specific type of 3D camera or such? I've used Ubercam with GI many times without problems (albeit on Mac), so I'm a bit surprised to hear about Ubercam having GI/radiosity problems.

Any radiosity (even set at .05%) gives the results seen in the first post. It's with the "Stereo Immersive Camera." According to Kat at Liberty3D, that camera and at least one other in their list don't support GI or additive FX (like FiberFX on anything other than Volume, lens flares, etc.). She's been great and has been on email with me all morning/day to help me. Wow... refreshing to have a good support department.

Danner
02-14-2017, 01:17 AM
Did you mimic all the settings he posted? Because it worked fine for me (the only difference was that I set convergence to 1m where he had it not set at all). And I checked the full 360 in my Oculus. Only diff between his and Ubercam was radiosity, and very long render time.
That was it! With the convergence point enabled, it works!

The other method I was toying with yesterday was to apply a "time sweep" with the setting "Range across image start now" at 5.76; even without stereo selected, that method also works, but you have to set up two cameras correctly. The downside is that it takes a little longer to render and more setup time, so Niko3D's method wins out in the end. Just make sure you guys turn on the convergence point. Thanks Niko, Michael, and cyclopse for this conversation. This forum is one of the reasons I still use LW: I always learn something new, and in this case, just in time. =D

cyclopse
02-14-2017, 01:32 AM
That was it! With the convergence point enabled, it works!

The other method I was toying with yesterday was to apply a "time sweep" with the setting "Range across image start now" at 5.76; even without stereo selected, that method also works, but you have to set up two cameras correctly. The downside is that it takes a little longer to render and more setup time, so Niko3D's method wins out in the end. Just make sure you guys turn on the convergence point. Thanks Niko, Michael, and cyclopse for this conversation. This forum is one of the reasons I still use LW: I always learn something new, and in this case, just in time. =D

No worries. Update from Liberty3D support: YOU CAN USE GI! (Just untick "Interpolated".) Liberty3D is soooo much quicker on renders, I've gotta say. I highly recommend it, and it's not even all that expensive.

jwiede
02-16-2017, 10:01 AM
No worries. Update from Liberty3D support: YOU CAN USE GI! (Just untick "Interpolated".) Liberty3D is soooo much quicker on renders, I've gotta say. I highly recommend it, and it's not even all that expensive.

Ah, that's why I wasn't seeing it. Thanks!