View Full Version : Ubercam and the Future of VR



ConjureBunny
01-26-2016, 12:25 PM
Hi,

As you know, you can get into VR pretty inexpensively right now, via Ubercam. Definitely cheaper than with other solutions. But I've noticed most of the VR content is made on other platforms, and I'd like to help get more LW content out there.

In the current version of Ubercam, I'm focused on panoramic rendering and stereo VR panoramas.

Are there other types of VR that people are actively using? Is there something *YOU* could use that isn't on the market right now?

Is there anything else you'd like to see in Ubercam? This latest version is a full overhaul of the algorithm: it adds convergence and is both faster and more accurate. But I can't help wondering if I'm missing something huge.

Now, for the next version (v2.2, which will be a free update for registered users), I already have some stuff lined up...

1) I have a solution that reduces or eliminates the nausea you feel when looking straight up or down, which plagues the entire VR industry at the moment. This will be an Ubercam-exclusive technology, until everyone figures out how I did it :)
2) I almost have the Mac version of the Oculus Rift head display working (tracking of course works now, but not live viewing, like on the PC)
3) Adding a recommended size setting
4) Adding a visible indication of the convergence point in Layout.
5) Adding a setting to always keep the camera upright

Anything else you'd like to see?
What's stopping you from jumping on the VR bandwagon at the moment?
Is there some 'next big thing' I haven't considered here?

Thank you!
-Chilton Webb

spherical
01-26-2016, 04:59 PM
Perhaps an extension of sorts to live viewing, except with Layout VPR as the source. That would let you view, for example, an ArchViz scene and move objects within it; probably best done at first by using the workstation pointing device to translate objects accurately.

The same sort of link might be possible between Modeler and Oculus once Modeler Camera Emulation arrives in LightWave Next. That would allow a 3D Perspective viewport, so you could model while examining the mesh in 3D. Link it to a 3D Connexion device and we'll never sleep. :D

EDIT: "objects in the scene" obviously includes all items in a scene: textures, backgrounds, meshes, lights and cameras; the latter allowing change of POV.

pinkmouse
01-27-2016, 02:50 AM
What's stopping you from jumping on the VR bandwagon at the moment?

I have no VR device and I don't know anyone who does... ;)

fishhead
01-27-2016, 04:48 AM
Hi Chilton,
Well, I understand everybody is keen on VR for Oculus and the like... but I'd really love to see the stereo capabilities of the immersive camera ported over to the good ole spherical camera, so that we can actually use LightWave stereo for fulldome directly, without all the extra conversion steps. That would make my day/week/month, probably even year!!! ;-)
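For anyone curious what those conversion steps involve, here's a minimal sketch of the projection math (my own illustration, not anything from Ubercam or LightWave; the 180-degree dome and normalized image coordinates are assumptions): mapping a point in an equirectangular spherical render into a domemaster (angular fisheye) frame, the format fulldome theaters typically expect.

```python
import math

def equirect_to_domemaster(u_eq, v_eq):
    """Map a normalized equirectangular coordinate (u_eq, v_eq in [0, 1])
    to a normalized 180-degree domemaster (angular fisheye) coordinate.
    Returns None for directions below the dome's horizon."""
    # Equirectangular: u is longitude (-pi..pi), v is latitude (-pi/2..pi/2).
    lon = (u_eq - 0.5) * 2.0 * math.pi
    lat = (0.5 - v_eq) * math.pi          # +pi/2 at the top of the image
    if lat < 0.0:
        return None                        # below the horizon, off the dome
    # Angular fisheye: radius grows linearly with the angle from the zenith.
    r = (math.pi / 2.0 - lat) / (math.pi / 2.0)   # 0 at zenith, 1 at horizon
    u = 0.5 + 0.5 * r * math.sin(lon)
    v = 0.5 - 0.5 * r * math.cos(lon)
    return (u, v)
```

A real converter would run the inverse of this per output pixel with filtered sampling; this only shows why the extra step (and its resampling cost) exists at all.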

Cheers,

Lorenz

ConjureBunny
01-27-2016, 11:05 AM
Lorenz,

I'm not even sure where to start on that. I know very little about fulldome tech. Any suggestions or pointers?

-Chilton

Chrusion
01-27-2016, 12:41 PM
Not exactly addressing your original question, but since you're talking about future versions, allow me to ask this: I don't have UberCam, but from watching your demo vid, it seems the user has to manually copy/paste/enter the calculated values. Is it possible to have Ubercam populate the Height/Width fields of the selected camera in the DPI and the other Ubercam type instead?

ConjureBunny
01-27-2016, 02:56 PM
Not exactly addressing your original question, but since you're talking about future versions, allow me to ask this: I don't have UberCam, but from watching your demo vid, it seems the user has to manually copy/paste/enter the calculated values. Is it possible to have Ubercam populate the Height/Width fields of the selected camera in the DPI and the other Ubercam type instead?

I actually don't know. I'll take a look!

-Chilton

fishhead
01-28-2016, 07:49 AM
As Chilton asked for it, here's something only a tiny bit OT...
For fulldome tech, the following link might be a good starting point:

http://www.fulldome.org/forum/topics/stereoscopic-domemaster-images?page=12&commentId=2223423%3AComment%3A37379&x=1#2223423Comment37379

Cheers,


Lorenz

shrox
01-28-2016, 08:20 PM
Is it ready? I am ready.

Chrusion
01-29-2016, 07:47 AM
Regarding the issues with parallax convergence and reversal in doing VR in 3D apps, wouldn't the most obvious way to eliminate them be to internally rotate a stereo camera rig and create the render slit-scan style, similar to how iOS generates pano photos? That is, instead of rendering the entire 360 spherical environment with a single camera, render narrow strips that are 10 deg horizontal by +/- 30 deg vertical, stitching them together as you go. The first pass would have the camera rig (eyes separated by a set distance from a master null) rotate 360 on the horizontal plane, then tilt up/down 60 deg and spin 360 to render strips for the zenith/nadir.

IOW, it seems logical to internally simulate the way the human head looks at the world: always keep the eyes separated by the same distance no matter what the look direction is, and just "record" the scene through this narrow, vertical slit. But what do I know... <shrug>... I'm sure this was considered long ago, so there must be some technical limitation I'm oblivious to.

ConjureBunny
01-29-2016, 08:48 AM
Regarding the issues with parallax convergence and reversal in doing VR in 3D apps, wouldn't the most obvious way to eliminate them be to internally rotate a stereo camera rig and create the render slit-scan style, similar to how iOS generates pano photos? That is, instead of rendering the entire 360 spherical environment with a single camera, render narrow strips that are 10 deg horizontal by +/- 30 deg vertical, stitching them together as you go. The first pass would have the camera rig (eyes separated by a set distance from a master null) rotate 360 on the horizontal plane, then tilt up/down 60 deg and spin 360 to render strips for the zenith/nadir.

IOW, it seems logical to internally simulate the way the human head looks at the world: always keep the eyes separated by the same distance no matter what the look direction is, and just "record" the scene through this narrow, vertical slit. But what do I know... <shrug>... I'm sure this was considered long ago, so there must be some technical limitation I'm oblivious to.

No, you're not oblivious here; that's pretty much how stereo rendering is currently done. Ubercam does a bit more than that, but it's mostly cleanup to address issues that spherical VR normally creates. Convergence is still an issue, though, because what changes slightly is the 'forward' direction in which a ray is cast through that slit.
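To make that rotating-slit idea concrete, here's a minimal sketch (my own illustration, not Ubercam's actual code; the `ipd` default and the flat 2D simplification are assumptions) of how an omni-directional stereo renderer picks a ray per panorama column: the eyes stay a fixed distance apart, offset perpendicular to whatever the look direction is for that column.

```python
import math

def ods_ray(column, width, ipd=0.064, eye=+1):
    """Ray origin and direction (in the horizontal x/z plane) for one
    column of an omni-directional stereo panorama.
    eye = +1 for the right eye, -1 for the left."""
    # Yaw for this column: 0..2*pi across the full image width.
    yaw = 2.0 * math.pi * column / width
    # Forward direction through the "slit" at this column.
    dx, dz = math.sin(yaw), math.cos(yaw)
    # Offset the eye half an interpupillary distance, perpendicular to
    # forward, so the stereo baseline rotates with the look direction.
    ox = eye * (ipd / 2.0) * dz
    oz = eye * (ipd / 2.0) * -dx
    return (ox, oz), (dx, dz)
```

Each eye's image is built by sweeping this over every column, which is exactly the "narrow vertical slit" rotated through 360 degrees; the per-column change in the forward vector is where the convergence subtlety lives.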

-Chilton

lightscape
01-29-2016, 09:54 PM
Hi,

What's stopping you from jumping on the VR bandwagon at the moment?


I tried it; it's not for me. The entertainment value faded.
360 VR must be for die-hard gamers. FPS games are nice, but most of them today are so generic and uninspiring.
I like story-driven content better, and that doesn't need 360 VR.

shrox
01-29-2016, 10:15 PM
I tried it; it's not for me. The entertainment value faded.
360 VR must be for die-hard gamers. FPS games are nice, but most of them today are so generic and uninspiring.
I like story-driven content better, and that doesn't need 360 VR.

Oh, something very cool is coming...from me!

cresshead
03-17-2017, 04:12 PM
Hi,

Just started using my client's new HTC Vive today and I'm amazed; it's so cool. We're going to mainly use it for virtual products online and virtual showrooms with a selection of products on show, using Sketchfab as the web delivery platform.

To view Sketchfab we're using WebVR with the experimental version of Chrome; once you enable that setting (WebVR) in the browser, it loads straight into the HTC headset.

So any advancements in the area of VR have my ears pricked up!