
Thread: Ubercam and the Future of VR

  1. #1
    Registered User
    Join Date
    Mar 2010
    Location
    SATown
    Posts
    665

    Ubercam and the Future of VR

    Hi,

    As you know, you can get into VR pretty inexpensively right now, via Ubercam. Definitely cheaper than with other solutions. But I've noticed most of the VR content is made on other platforms, and I'd like to help get more LW content out there.

    In the current version of Ubercam, I'm focused on panoramic rendering and stereo VR panoramas.

    Are there other types of VR that people are actively using? Is there something *YOU* could use that isn't on the market right now?

    Is there anything else you'd like to see in Ubercam? This latest version is a full overhaul of the algorithm: it includes convergence, is faster, and is more accurate. But I can't help but wonder if I'm missing something huge.

    Now, for the next version (v2.2, which will be a free update for registered users), I already have some stuff lined up...

    1) I have a solution that reduces or eliminates the nausea you feel when you look straight up or down, which plagues the entire VR industry at the moment. This will be Ubercam-exclusive technology until everyone figures out how I did it.
    2) I almost have the Mac version of the Oculus Rift head display working (tracking of course works now, but not live viewing, like on the PC)
    3) Adding a recommended size setting
    4) Adding a visible indication of the convergence point in Layout.
    5) Adding a setting to always keep the camera upright

    Anything else you'd like to see?
    What's stopping you from jumping on the VR bandwagon at the moment?
    Is there some 'next big thing' I haven't considered here?

    Thank you!
    -Chilton Webb

  2. #2
    Super Member spherical's Avatar
    Join Date
    Dec 2004
    Location
    San Juan Island
    Posts
    4,686
    Perhaps an extension of sorts of live viewing, except the source is Layout VPR. This would enable viewing, for example, an ArchViz scene and moving objects within it; probably best done at first by using the workstation pointing device to accurately translate objects in the scene.

    The same sort of link could be created between Modeler and Oculus when the Modeler Camera Emulation arrives in LightWave Next. That would allow a 3D Perspective viewport, so you could model while examining the mesh in 3D. Link it to a 3Dconnexion device and we'll never sleep.

    EDIT: "objects in the scene" obviously includes all items in a scene: textures, backgrounds, meshes, lights and cameras; the latter allowing change of POV.
    Last edited by spherical; 01-26-2016 at 06:26 PM.
    Blown Glass · Carbon Fiber + Imagination

    Spherical Magic | We Build Cool Stuff!

    "When a man loves cats, I am his friend and comrade, without further introduction." - Mark Twain

  3. #3
    Vacant, pretty vacant pinkmouse's Avatar
    Join Date
    Aug 2003
    Location
    South Yorkshire
    Posts
    1,703
    Quote Originally Posted by ConjureBunny View Post
    What's stopping you from jumping on the VR bandwagon at the moment?
    I have no VR device and I don't know anyone that does...
    Al
    "I conceive of nothing, in religion, science or philosophy, that is more than the proper thing to wear, for a while." Charles Fort

    My Website
    My Lightwave Tutorials

  4. #4
    Frequenter
    Join Date
    Sep 2003
    Location
    Munich
    Posts
    393
    Hi Chilton,
    well, I understand everybody is overly keen on VR for Oculus and the like... But I'd really love to see the stereo capabilities of the immersive camera ported over to the good ole spherical camera, so that we can actually use LightWave stereo for fulldome directly, without all the extra conversion steps. That would make my day/week/month, probably even year!!! ;-)

    Cheers,

    Lorenz

  5. #5
    Registered User
    Join Date
    Mar 2010
    Location
    SATown
    Posts
    665
    Lorenz,

    I'm not even sure where to start on that. I know very little about fulldome tech. Any suggestions or pointers?

    -Chilton

  6. #6
    gold plated 3D Chrusion's Avatar
    Join Date
    Mar 2003
    Location
    Chattanooga, TN
    Posts
    1,080
    Not exactly addressing your original question, but since you're talking about future versions, allow me to ask this: I don't have UberCam, but having watched your demo vid, would it be possible to have Ubercam populate the Height/Width fields of the selected camera in the DPI and the other Ubercam type, rather than having the user manually copy/paste/enter the calculated values?
    Dean A. Scott, mfa
    Senior 3D Animator and Graphic Design Illustrator, @ Astec, Inc.
    Owner / Lead Artist @ chrusion | FX

  7. #7
    Registered User
    Join Date
    Mar 2010
    Location
    SATown
    Posts
    665
    Quote Originally Posted by Chrusion View Post
    Not exactly addressing your original question, but since you're talking about future versions, allow me to ask this: I don't have UberCam, but having watched your demo vid, would it be possible to have Ubercam populate the Height/Width fields of the selected camera in the DPI and the other Ubercam type, rather than having the user manually copy/paste/enter the calculated values?
    I actually don't know. I'll take a look!

    -Chilton

  8. #8
    Frequenter
    Join Date
    Sep 2003
    Location
    Munich
    Posts
    393
    As Chilton asked for it only a tiny bit OT...
    For Fulldome Tech the following link might be a good starting point:

    http://www.fulldome.org/forum/topics...23Comment37379

    Cheers,


    Lorenz

  9. #9
    Man of many cells. shrox's Avatar
    Join Date
    Aug 2006
    Location
    Eureka, CA
    Posts
    6,962
    Is it ready? I am ready.
    shrox www.shrox.com
    -----------------------
    Heavy Metal Landing


    -----------------------
    I build the best spaceships, the biggest spaceships, they're great, you'll love them.

  10. #10
    gold plated 3D Chrusion's Avatar
    Join Date
    Mar 2003
    Location
    Chattanooga, TN
    Posts
    1,080
    Regarding the issues with parallax convergence and reversal in doing VR in 3D apps, wouldn't the most obvious way to eliminate them be to internally rotate a stereo camera rig and create a render using slit scan, similar to how iOS generates pano photos? That is, instead of rendering the entire 360 spherical environment using a single camera, render narrow strips that are 10 deg horizontal by +/- 30 deg vertical, stitching them together as you go. The first pass would have the camera rig (eye separation distance from a master null) rotate 360 on the horizontal plane, then tilt up/down 60 deg and spin 360 to render strips for the zenith/nadir.

    IOW, it seems logical to internally simulate the way the human head looks at the world by always keeping the eyes separated by the same distance no matter what the look direction is, and just "record" the scene through this narrow, vertical slit. But what do I know... <shrug>... I'm sure this was considered a long time ago, so there must be some technical limitation I'm oblivious to.
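    The strip-stitching idea above, taken to its limit of infinitely narrow slits, is what the graphics literature calls omnidirectional stereo (ODS): each image column gets its own pair of eye positions on a circle of radius IPD/2, with rays fired along that column's look direction. A minimal sketch of the per-pixel geometry (the function name and the left-handed, Y-up coordinate convention are my assumptions, not Ubercam's actual code):

    ```python
    import math

    def ods_ray(theta, phi, ipd=0.065, eye=+1):
        """Ray origin and direction for one pixel of an omnidirectional
        stereo (ODS) panorama -- the infinitely-thin-slit limit of the
        strip-stitching idea above.

        theta: longitude in radians (0 = forward)
        phi:   latitude in radians (-pi/2 nadir .. +pi/2 zenith)
        ipd:   interpupillary distance in metres
        eye:   +1 for the right eye, -1 for the left

        Assumes a left-handed, Y-up coordinate system (+Z forward,
        +X right), as LightWave uses.
        """
        r = eye * ipd / 2.0
        # Each eye sits on a circle of radius ipd/2, offset sideways
        # (perpendicular to this column's look direction).
        origin = (r * math.cos(theta), 0.0, -r * math.sin(theta))
        # The ray fires along the column's look direction, tilted by latitude.
        direction = (math.sin(theta) * math.cos(phi),
                     math.sin(phi),
                     math.cos(theta) * math.cos(phi))
        return origin, direction
    ```

    Note that the eye separation stays constant for every look direction, which is exactly the head-simulating behaviour described above. The well-known catch is that this is only correct for roughly horizontal gaze, which is part of why stereo collapses toward the zenith/nadir in the first place.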
    Dean A. Scott, mfa
    Senior 3D Animator and Graphic Design Illustrator, @ Astec, Inc.
    Owner / Lead Artist @ chrusion | FX

  11. #11
    Registered User
    Join Date
    Mar 2010
    Location
    SATown
    Posts
    665
    Quote Originally Posted by Chrusion View Post
    Regarding the issues with parallax convergence and reversal in doing VR in 3D apps, wouldn't the most obvious way to eliminate them be to internally rotate a stereo camera rig and create a render using slit scan, similar to how iOS generates pano photos? That is, instead of rendering the entire 360 spherical environment using a single camera, render narrow strips that are 10 deg horizontal by +/- 30 deg vertical, stitching them together as you go. The first pass would have the camera rig (eye separation distance from a master null) rotate 360 on the horizontal plane, then tilt up/down 60 deg and spin 360 to render strips for the zenith/nadir.

    IOW, it seems logical to internally simulate the way the human head looks at the world by always keeping the eyes separated by the same distance no matter what the look direction is, and just "record" the scene through this narrow, vertical slit. But what do I know... <shrug>... I'm sure this was considered a long time ago, so there must be some technical limitation I'm oblivious to.
    No, you're not oblivious here; that's pretty much how stereo rendering is currently done. Ubercam does a bit more than that, but it's mostly cleanup to address the issues spherical VR normally creates. Convergence is still an issue, though, because the 'forward' direction a ray is cast through that slit is what changes slightly.
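    To illustrate that last point: one simple way per-slit 'forward' can shift with convergence is a toe-in scheme, where each eye's ray is aimed at a point a fixed distance away along the column's central look direction instead of being fired parallel to it. A sketch under that assumption (the function name, the toe-in method, and the fixed convergence distance are illustrative guesses, not Ubercam's actual implementation):

    ```python
    import math

    def converged_ray(theta, ipd=0.065, eye=+1, conv_dist=2.0):
        """Per-column stereo ray with toe-in convergence.

        Rather than firing each eye's ray parallel to the column's
        central look direction, aim it at a point conv_dist metres
        away along that direction, so each eye's 'forward' shifts
        slightly. Same left-handed, Y-up convention as LightWave.
        """
        r = eye * ipd / 2.0
        # Eye offset sideways from the column's central look direction.
        origin = (r * math.cos(theta), 0.0, -r * math.sin(theta))
        # Convergence point on the central look direction.
        target = (conv_dist * math.sin(theta), 0.0, conv_dist * math.cos(theta))
        dx, dy, dz = (target[0] - origin[0],
                      target[1] - origin[1],
                      target[2] - origin[2])
        n = math.sqrt(dx * dx + dy * dy + dz * dz)
        return origin, (dx / n, dy / n, dz / n)
    ```

    With this scheme the two eyes' rays cross exactly at the convergence distance, which is where on-screen parallax goes to zero; objects nearer than that point pop out, objects farther recede.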

    -Chilton

  12. #12
    Registered User
    Join Date
    Aug 2006
    Location
    desktop
    Posts
    1,290
    Quote Originally Posted by ConjureBunny View Post
    Hi,

    What's stopping you from jumping on the VR bandwagon at the moment?
    I tried it; it's not for me. The entertainment value faded.
    360 VR must be for die-hard gamers. FPS games are nice, but most of them today are so generic and uninspiring.
    I like story-driven content better, and that doesn't need 360 VR.

  13. #13
    Man of many cells. shrox's Avatar
    Join Date
    Aug 2006
    Location
    Eureka, CA
    Posts
    6,962
    Quote Originally Posted by lightscape View Post
    I tried it; it's not for me. The entertainment value faded.
    360 VR must be for die-hard gamers. FPS games are nice, but most of them today are so generic and uninspiring.
    I like story-driven content better, and that doesn't need 360 VR.
    Oh, something very cool is coming...from me!
    shrox www.shrox.com
    -----------------------
    Heavy Metal Landing


    -----------------------
    I build the best spaceships, the biggest spaceships, they're great, you'll love them.

  14. #14
    Hi

    Just started using my client's new HTC Vive today, and I'm amazed; it's so cool. We're going to use it mainly for virtual products online and virtual showrooms, with a selection of products on show, using Sketchfab as the web delivery platform.

    To view Sketchfab we're using WebVR and the experimental version of Chrome; once you enable that setting (WebVR) in the browser, it loads straight into the HTC headset.

    So any advancements in the area of VR have my ears pricked up!
    stee+cat
    real name: steve gilbert
    http://www.cresshead.com/

    Q - How many polys?
    A - All of them!
