View Full Version : Ubercam 2.5 and our future VR/AR roadmap

11-20-2016, 05:05 PM


Ubercam 2.5 will ship to existing users in the next 24 hours. Testing is done, and I believe the crashing issues caused by file permissions are all nailed down now. This new version also lets you use multiple license keys, stored in separate files, so you can more easily manage keys across a render farm and local machines. These were issues I didn't know about, because I only ever use my laptop and never a render farm. I'm hoping these requested changes are of help to everyone. Ubercam should also crash much less, or not at all, since the crashes came down to odd file-permission behavior in Windows 10 and the way Ubercam's underlying code was checking for a valid license. So this is a bugfix version and nothing else. I didn't add any new features (that I know of).

With that in mind, I'd like to take a moment to talk about where we're going with this.

Ubercam is a combination of something like 50 different cameras, plugins, and device managers. I'm specifically looking at the VR and AR stuff at the moment.

VR Cameras

Currently we have one of the fastest panoramic cameras on the market. It's dead simple: you just drop in the plugin, choose the immersive camera, and hit render.

New features will include...
1) The ability to render only the forward-facing part of the sphere, so your renders will take at most half as long, and faster still if I can pull off what I'm trying to do. This produces a render that is black everywhere except forward, which lets you more easily set up and test scenes with the interesting bits facing forward. Then when you're ready to do the final render, you just tick a checkbox and the whole thing will render.

2) Gyro stabilization, where the camera will always face the right direction but never roll on the Z axis. Again, this will just be a checkbox. I want to make this as simple for you as possible.

3) An FOV marker, so you can more easily see the area that will be immediately visible, for a given FOV, when the user is looking straight ahead. This won't affect the render itself; it just shows you what the viewer will see when facing forward.
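As a rough illustration of how the FOV marker in (3) could work (a hypothetical sketch, not Ubercam's actual code): on an equirectangular render, longitude maps linearly to x and latitude to y, so a forward-centered field of view becomes a simple pixel rectangle. The same math hints at why the forward-only render in (1) halves the work: a 180° horizontal FOV covers exactly half the image width.

```python
# Hypothetical sketch of the FOV-marker math for an equirectangular
# panorama that spans 360 degrees horizontally and 180 vertically, with
# the forward direction at the center of the image.

def fov_rect(width, height, hfov_deg, vfov_deg):
    """Return (x0, y0, x1, y1) pixel bounds of the forward-facing FOV."""
    half_w = width * (hfov_deg / 360.0) / 2.0   # half the FOV's width in pixels
    half_h = height * (vfov_deg / 180.0) / 2.0  # half the FOV's height in pixels
    cx, cy = width / 2.0, height / 2.0          # forward direction = image center
    return (int(cx - half_w), int(cy - half_h),
            int(cx + half_w), int(cy + half_h))

# A 4096x2048 panorama with a 90-degree by 90-degree forward view:
print(fov_rect(4096, 2048, 90, 90))  # (1536, 512, 2560, 1536)
```

A marker overlay would just draw that rectangle on the preview; a forward-only render would black out everything outside the rectangle returned for a 180°×180° FOV.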

If you can think of anything else you'd like for the mono immersive camera, please let me know!!

Stereo immersive will also get some love, which will be the following...
1) One click to only render right or left eye, drastically cutting down on render time.
2) One additional click to only render the forward part of that eye's render area, also cutting down on time for previs and other shots where you don't need to see everything.
3) FOV markers as explained above.

Again if you can think of anything for the stereo camera, I'm all ears!

VR Headset Support

This is the virtual studio system for use with the Oculus Rift. You can currently look around one of your scenes in realtime, in VR, through your Oculus Rift headset. Since it's wired into the Virtual Studio tools, you can also record your head movements as camera motion in your scene. I don't know why you'd want to, but if nothing else it's a super easy way to control the camera. Just look around :)
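The recording side of this boils down to sampling the headset's orientation once per frame and storing the samples as camera keyframes. A minimal sketch under that assumption, with `poll_orientation` standing in for whatever the headset SDK actually provides (this is not the Virtual Studio code):

```python
# Hypothetical sketch: capture one heading/pitch/bank sample per frame,
# producing a keyframe list a plugin could write onto the scene camera.

def record_head_motion(poll_orientation, frame_count):
    """Capture one orientation keyframe per frame from poll_orientation,
    a callable returning (heading, pitch, bank) in degrees."""
    keyframes = []
    for frame in range(frame_count):
        heading, pitch, bank = poll_orientation()
        keyframes.append({"frame": frame, "heading": heading,
                          "pitch": pitch, "bank": bank})
    return keyframes

# Example with a dummy source that always looks 10 degrees left, 5 down:
frames = record_head_motion(lambda: (-10.0, -5.0, 0.0), 3)
print(len(frames), frames[0])
```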

Currently the Mac version does not have a viewer. That will be addressed, and the Mac version will work with the latest drivers Oculus provides. Grumble, grumble, I wish Apple would throw us a frikkin' bone here on the video card side, grumble, grumble.

Oculus Rift meanwhile has moved on, and we're not moving quickly enough with them. The next version will (hopefully) support direct display mode. That's the white whale I'm currently chasing. As soon as I get my fancy Oculus Rift paddles, I'll add those too, so at that point you'll be able to record the paddle movements in the virtual studio as well.

HTC Vive support will be coming shortly as well, with the same concepts. The big advantage here is room-scale VR. So on this one, I think I'll probably have some kind of scale feature in the virtual tool, so you can scale your virtual set down to whatever size you want, whether it's something you want to walk around *in* or something you want to set down on your desk. You'll then be able to look at your set, and record movements through it, with your Vive equipment.

I also have an OSVR headset, and I'm going to add support for that as well. I don't know much about the OSVR SDK, but I've been assured by their devs it's easy to put together.

That's the current roadmap for the hardware. If you have any suggestions there, ask away.

These will all be part of Ubercam.

Most likely these will be released as 2.7, 2.8, 2.9, 3.0, etc. I don't have any huge feature set I've decided on for 3.0, and I figure some of you need this stuff YESTERDAY, so I'm going to roll them out as quickly as I can.

There is also the VR Companion app, which will be released very, very soon. It will let you quickly test your VR renders directly on hardware by simply switching to the app: it watches a folder for content and auto-loads it when you switch over. This will also let you easily show off your work to others. It's a separate product, but Ubercam users will get a discount on it.
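The watch-folder behavior can be sketched as a simple polling loop. This is an assumption about the approach, not the Companion app's actual implementation (a real app might instead use OS file-change notifications):

```python
# Hypothetical sketch of a watch folder: compare each file's modification
# time against what we saw last scan, and report anything new or changed
# so the viewer can auto-load it.

import os

def scan_folder(path, seen):
    """Return files in `path` that are new or changed since the last scan.

    `seen` is a dict of filename -> mtime from previous scans; it is
    updated in place.
    """
    changed = []
    for name in sorted(os.listdir(path)):
        full = os.path.join(path, name)
        if not os.path.isfile(full):
            continue  # skip subdirectories
        mtime = os.path.getmtime(full)
        if seen.get(name) != mtime:
            seen[name] = mtime
            changed.append(name)
    return changed
```

A viewer would call `scan_folder` on a timer and load whatever it returns, which is all "auto-load when you switch to the app" really needs.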

Hmm. That's all I have for you at the moment.

Questions / concerns / feedback welcomed and encouraged.

-Chilton Webb

11-20-2016, 05:41 PM
I don't work with VR, but what you've done with Ubercam is very impressive. Congratulations on the hard work.

11-21-2016, 09:26 AM

02-16-2017, 10:05 AM
Chilton, any comments on what's going on w.r.t. Ubercam 3D cameras and (interpolated) GI per this (http://forums.newtek.com/showthread.php?152826-Liberty-3D-(Ubercam)-lighting-and-textures-go-wonky&p=1498268&viewfull=1#post1498268) and this (http://forums.newtek.com/showthread.php?152826-Liberty-3D-(Ubercam)-lighting-and-textures-go-wonky&p=1498289&viewfull=1#post1498289) post? Can you provide a more thorough characterization of the problem, whether it affects both FG & MC, etc.?

Any chance we're likely to see fixes prior to 3.0 release?