
ANN: UberCam 2.0: 360 videos, VR headset, Oculus Rift support, and more




ConjureBunny
02-19-2016, 08:11 PM
If it says 4098 in the Ubercam manual that's a typo. 4096 is the magic number.

spherical
02-19-2016, 10:35 PM
Yep. 4098 is what is written.

Dillon
02-23-2016, 11:32 AM
Holy cow....!

https://www.youtube.com/watch?v=ok9xA1fzTu4

pinkmouse
02-24-2016, 03:37 AM
Holy cow....!

Still not seeing anything that wouldn't be deeply annoying to work with after the initial novelty had worn off.

Kaptive
02-24-2016, 04:55 AM
Holy cow....!

https://www.youtube.com/watch?v=ok9xA1fzTu4

Looks kind of fun, but I'll be honest, creating objects in a VR space seems a bit pointless. I say this because it is proven that 3d artists can pretty much create anything using standard current methods on a flat screen. This developer is wasting precious time trying to make a VR modelling environment when, really, the true benefits would be in scene creation, i.e. create all assets in the standard way (or buy prefabs from turbosquid etc.) and then create the world in VR by bringing those items in.
Creating a VR scene in goggles (with the end use being a VR product) would be useful and largely practical, as you'll be seeing what the end user sees. But modelling... even if it manages to eventually match what can already be done, it won't surpass it or make it any faster. If anything, it'll consume more energy and make the eyes tired.

I do appreciate what this guy is doing from a technical level, but I feel he is approaching it from the wrong direction.

Dillon
02-24-2016, 03:11 PM
Looks kind of fun, but I'll be honest, creating objects in a VR space seems a bit pointless. I say this because it is proven that 3d artists can pretty much create anything using standard current methods on a flat screen. This developer is wasting precious time trying to make a VR modelling environment when, really, the true benefits would be in scene creation, i.e. create all assets in the standard way (or buy prefabs from turbosquid etc.) and then create the world in VR by bringing those items in.
Creating a VR scene in goggles (with the end use being a VR product) would be useful and largely practical, as you'll be seeing what the end user sees. But modelling... even if it manages to eventually match what can already be done, it won't surpass it or make it any faster. If anything, it'll consume more energy and make the eyes tired.

I do appreciate what this guy is doing from a technical level, but I feel he is approaching it from the wrong direction.

I'm thinking of all the traditional artists that are waiting on the sidelines. Creating 3D content using a keyboard/mouse is quite difficult for traditional artists.

I personally think he's approaching the design of this VR OS perfectly. It's brilliant design, and lays down a tremendously powerful system to create assets in collaboration together.

Kaptive
02-24-2016, 03:35 PM
But traditional artists are drawn to traditional media, just as we are drawn to the digital. Are there really any artists sat on the sidelines? My dad is a traditional artist and has little interest in digital/3d (but enjoys what others do with it). My niece is an amazing pencil illustrator; she is doing pretty well and enjoying it, considering how young she is... I've never heard her mention a desire to jump into 3d, nor feeling like it is inaccessible. I mean, though I'd love to think I'm special because I can use 3d software, it isn't all that hard, is it? As with any medium, it is the mind and creativity behind it that really elevates it.
I don't know, maybe I've just not walked over to the right sideline, but I've not yet met anyone like the people you are talking about.

By the way, I'm not arguing... I'm just not sure I see it in the same way as yourself. I'm also not saying I'm right either........who knows :)

spherical
02-24-2016, 06:22 PM
I'm thinking of all the traditional artists that are waiting on the sidelines. Creating 3D content using a keyboard/mouse is quite difficult for traditional artists.

Not this one. Be careful spouting absolutes. They most probably aren't. Heck, I just got a Gold Record for an album cover I painted using traditional media. Yet, here I am, happy as I could be working with a keyboard and trackball creating paintings that move and blown glass commission studies to fully visualize a sculpture from all sides and in multiple environments before we start production. Did I abandon traditional media? No. Do I find working in 3D difficult because I don't have physical tools? No. The traditional artists that I know who are not using digital media, other than Photoshop to process images (which operates with a keyboard and mouse and they have no trouble) are just not interested in 3D; and not because the "tools are hard to use". They simply don't have the aptitude for it. Most of their concern is learning the complexities of these applications that have zero to do with how you manipulate stuff. The latter is not a stumbling block for them. Getting their heads around Hypervoxels, IOR, Refraction and Reflection Blur, GI, Keyframes, Deformations and Displacement, Rigging, Texturing, Lighting and the blizzard of other aspects of asset and content creation; and the costs of these applications, plugins and hardware to run them, is. Adding an Oculus Rift into the mix makes this even less appealing to them.

They don't see the point in going down that road, other than it's neat to animate things. Most really enjoy the tactile connection to the media. The feel and sound of brush moving across a canvas or graphite across handmade text paper. That cannot be economically reproduced. Someone may develop a vibrating graphics tablet that mimics this experience but, like the Cintiq, it would likely be way expensive and, as such, what's the point? It's like getting on a treadmill every day to exercise. Take a walk instead. You'll get fresh air and better experience your environment to boot.

Zerowaitstate
02-25-2016, 02:54 AM
I was in VRChat > Gunters Universe where Fruxious was being interviewed.

He was nervous about showing the demo and how it would be received.

What he is showing here is the underlying infrastructure that allows multiple people to work in the same environment in real time, in sync. His thoughts were that you would be able to have multiple artists working on a scene at the same time. As I understand it, it is like a shim to Unity.

So you can bring in ready-made assets to a scene and have people working at the same time to modify lighting, textures, geometry, etc.

Also, unless I'm mistaken, it plugs into the Wolfram Alpha back end so it can interpret stuff into concepts and apply results that are relevant.

I am sure the full video will be posted in a week or so. It's a blast meeting people around the world in VR discussing the future of VR development tools. If you have a Rift, check out VRChat and Gunters Universe. You need an HMD to attend, though you can view in 2D.

Kaptive
02-25-2016, 03:09 AM
Getting their heads around Hypervoxels, IOR, Refraction and Reflection Blur, GI, Keyframes, Deformations and Displacement, Rigging, Texturing, Lighting and the blizzard of other aspects of asset and content creation; and the costs of these applications, plugins and hardware to run them, is. Adding an Oculus Rift into the mix makes this even less appealing to them.

Actually, that's it, on the nose... as they say. The basic functions of 3d are actually reasonably accessible. I mean, ZBrush is what a more traditional sculptor would gravitate towards, Photoshop/Painter or such for artists with a tablet, but the functions of these programs are not going to get any simpler or more accessible because of VR. I could maybe see ZBrush adding some kind of "VR sculpt mode"... but will that make it any easier to use? No access to menus without lots of fumbling etc.
I'm just trying to be realistic. I'm not without enthusiasm for VR, it looks like great fun, but I'm not convinced it will open up previously inaccessible realms for anyone. If anything, it is more about pushing the frontier for those at the cutting edge of 3d, isn't it?

Of course, that isn't to say that these guys (who you linked to, Dillon) won't come up with some incredible killer app that all of a sudden makes 3d design very accessible to anyone with a VR headset... and ultimately gives them the bug for it... but because of its limitations, the user may eventually take the goggles off and end up working in the more digitally traditional way ...mouse and keyboard... as they need access to more tools and functions. But who is to say. The future is very hard to predict.

I mean, being as realistic as possible, where would functionality begin and end in your mind, Dillon? Just sculpting? Texturing? Node interface? Menu systems? Animation... with a timeline, rigging? You'd have to have a lot of controls. Not that this is totally impossible! But it would require some very clever design. I also feel that how long you can comfortably spend inside goggles is a bit of a limit. I can sit at a screen with keyboard and mouse for many hours, no problem. But if I have to be stood up, moving about with sweaty goggles on, that time window is going to be much reduced. This is why I think it'd be better for scene creation than modelling. ...check out the vid below... Voxel Farm... not created for VR, but more akin to what I'd like to be able to do in VR.


https://www.youtube.com/watch?v=Bj1UQ7yL4mw

pinkmouse
02-25-2016, 03:39 AM
I fully expect VR to become as popular as 3D TV. :)

Kaptive, that Voxel Farm demo looks fun, but I have to say, without that interface and with normal tools, that scene could probably have been built twice as fast. What studio will use something that slows down productivity?

spherical
02-25-2016, 04:45 AM
I was in VRChat > Gunters Universe where Fruxious was being interviewed.

What he is showing here is the underlying infrastructure that allows multiple people to work in the same environment in real time, in sync. His thoughts were that you would be able to have multiple artists working on a scene at the same time.

Yeah. That sure sounds like a compatible environment to create in.... destined to end in creative differences. Scenes in large films are not created "at the same time"; nor are great creations borne from committees. Here's another situation where we see a "new thing" washing away millennia of interpersonal dynamics that, heretofore, had stymied advancement. Right...

No matter what anyone may tout, in the end it is individuals working independently in their own strengths, who later on may bring their contribution to a team, that add to the mix that results in a successful product; be it the next great game, blockbuster film, revolutionary application or stunning art. Few, if any, great things come from committees.

Dillon
02-25-2016, 08:31 AM
Yeah. That sure sounds like a compatible environment to create in.... destined to end in creative differences. Scenes in large films are not created "at the same time"; nor are great creations borne from committees. Here's another situation where we see a "new thing" washing away millennia of interpersonal dynamics that, heretofore, had stymied advancement. Right...

No matter what anyone may tout, in the end it is individuals working independently in their own strengths, who later on may bring their contribution to a team, that add to the mix that results in a successful product; be it the next great game, blockbuster film, revolutionary application or stunning art. Few, if any, great things come from committees.

The full actualization of working collaboratively in VR is hard to envision. The video I linked is alpha footage from one developer who is hell-bent on creating tools to enable everyone to create the VR Metaverse, including gamifying aspects of environments, all the way to total directorial control of what your VR audience experiences as they move through your created world. Believe it or not, I guarantee that Facebook, Google, and Apple are also working just as vigorously to create exactly the same tools.

This will include lightfield films as well. While this technology is also still nascent, it is coming. If you're unfamiliar with lightfield as a format, think of it as a volumetric recording of real-world space and performers. This will enable the viewer to walk around the recording. Put this same recording into a game engine, and now you have an interactive film in which you are a participant.

I don't think that working in VR collaboratively will be a hindrance whatsoever. In fact, it's going to be quite liberating for many. Creating the interactive immersive VR film will require collaborating in VR.

The full potential of VR will be actualized once the convergence of several technologies is complete: input fidelity (sub-mm-accuracy tracking of the full body), VR UI design tools (they're being conceptualized and experimented with now), and file formats compressed enough to enable realtime streaming of virtually any kind of media (film, lightfield, motion data).

The video I posted is just one dude. I guarantee that Facebook, Google, and Apple want to engage the masses in the VR world for the same reason. There is something incredibly powerful about experiencing the presence of another person in a VR space. If you haven't felt it, you won't know what people are talking about, and you won't understand why Facebook bought Oculus for 2 billion dollars almost 2 years ago.

If you have a chance to, I strongly encourage you to try out the Gear VR headset and drop in on a VR chat room. I recommend the vTime VR chat app; it has multiple CGI environments in which a handful of people sit and get to observe different environmental aspects meant to provoke wonder and surprise. The result is a bona fide, full-on experience of actually being in the presence of the other people in that space. It's a physical sensation, visceral, and very meaningful.

While my first passion has been filmmaking for the last 36 years, animation and 3D animation have always been a part of it. I started with LightWave on the Video Toaster on the Amiga 3000T in 1998. I've since taught both filmmaking and 3D animation at 3 major universities in the Bay Area, and have been tinkering with VR development since Oculus released their DK1 almost 3 years ago. Currently, I work at an art college, where I support the digital artists making their way into "new media". This includes video, 3D animation, website creation, and now, VR. I've witnessed many traditional artists become flat-out frustrated at the process of trying to work in even a 2D environment (editing video) using a keyboard/mouse. It's counterintuitive to many people's processes. Those who grow frustrated working with a keyboard/mouse to create in 2D never even attempt 3D. This is a major chasm.

Once VR matures (and it has quite a ways to go), it's going to be transformative for those artists, specifically. The VR UI tools, which have yet to be developed, much less polished, along with sunglasses-sized VR headsets (you'll be able to wear them all day) and full-body sensing/capturing, will enable a casual artist to pick up a headset, dive into creating in 3D, and get nearly instantaneous results when they begin, because they're working inside the environment they're playing/creating in. It's a visceral experience. Presence is incredibly powerful, and is ultimately what will capture people to start working in this medium. Presence and social presence have enormous utility in content creation, education, therapy, and other aspects yet to be discovered.

I know I sound like a fanboy with my ongoing VR rantings. And perhaps I am. But I am coming from a place of over 35 years' experience in filmmaking, 18 years' experience working in 3D (LightWave, mostly), and 19 years working in education, mostly with artists seeking to explore these mediums. I see a spectacular convergence of technologies occurring, and I see it having an enormous and profound effect on the human condition. It's my particular POV that artists are going to have a bonanza of new opportunities to explore, create, communicate, and inspire people, simply because VR will force a democratization of access to tools that were previously inaccessible. The iPhone is an example of this kind of innovation and revolution. You could start using the iPhone nearly instantly, without having to read a manual. Whereas, before that, many people had to consult a manual for a bit before getting their "smart" phone to begin exploiting its potential.

True VR has yet to be introduced to the masses. The Gear VR is not a full VR experience, and yet I'm surprised to find that it evokes a powerful and potent sense of social presence. This surprise was a revelation. It actually takes very little computer horsepower to provoke a sense of being in the presence of another person.

One cannot overstate the importance of social presence in VR.

Sorry for the super long response, guys. I wanted to address some questions I saw in your replies, that I felt deserved a more in depth answer.

If you have the time, perhaps take a look at this video; made by the same developer, but an unedited 1 hour exploration into this working space, and the groundwork he's laying into this program to enable so many more people to participate in this new medium (which is based on the 3D platform). https://www.youtube.com/watch?v=FreMCnIHr7Y You'll see one person working with lights, while another person works with props in the same 3D scene together. This is programming in the making; at very, very early alpha stage.

We live in interesting times.

Kaptive
02-25-2016, 12:08 PM
When the technology gets there, it will become its own medium. Artists don't draw with pencils, but they might. I think where your last response and previous ones differ is your long-term view. Before, when you expressed VR opening up 3d to traditional artists, it paints a picture (excuse the pun) of illustrators and fine artists all of a sudden getting the bug and being allowed in. Whereas actually, what will really happen is that artists will find ways of expressing themselves in the new medium. Rather than those who have their own medium migrating to it, I see those who grow up with it in front of them finding ways to create within it, and maybe more traditional formats becoming more rare.

I mean, dance is an art form, and I could see VR dance performances taking on new forms that can be shared with others. That is just one example, but it is a good one regarding artistic expression combined with the new medium.

Anyway... just thoughts off the top of my head!

Dillon
02-25-2016, 12:38 PM
When the technology gets there, it will become its own medium. Artists don't draw with pencils, but they might. I think where your last response and previous ones differ is your long-term view. Before, when you expressed VR opening up 3d to traditional artists, it paints a picture (excuse the pun) of illustrators and fine artists all of a sudden getting the bug and being allowed in. Whereas actually, what will really happen is that artists will find ways of expressing themselves in the new medium. Rather than those who have their own medium migrating to it, I see those who grow up with it in front of them finding ways to create within it, and maybe more traditional formats becoming more rare.

I mean, dance is an art form, and I could see VR dance performances taking on new forms that can be shared with others. That is just one example, but it is a good one regarding artistic expression combined with the new medium.

Anyway... just thoughts off the top of my head!

Oh, yes, definitely. VR itself is a brand new art medium - it can manipulate perceptions of presence. Incredibly powerful.

And it's all developing at light speed. I think we'll start seeing the true revolution of VR within the next 3 years. Seriously.

Danner
02-25-2016, 03:24 PM
All of this is very new, so it will inevitably improve. We won't even need controllers; we'll just use our hands, as demonstrated here:
https://www.youtube.com/watch?v=PA5nKnAk1t8

Dillon
02-25-2016, 03:29 PM
All of this is very new, so it will inevitably improve. We won't even need controllers; we'll just use our hands, as demonstrated here:
https://www.youtube.com/watch?v=PA5nKnAk1t8

YUP!!

The Orion update is blowing everyone's minds in the VR community. It won't be too long before the whole human body can be tracked the same way (including the face) in a single app. There are multiple apps concentrating on various aspects, but not yet a unified app for tracking the body. It's coming, though. Guarantee it.

Zerowaitstate
02-25-2016, 05:30 PM
Dillon, I find it hard sometimes to understand the resistance I perceive from the community here. It's as though banging on about VR is an attack on LW or the traditional 3D graphics toolset/pipeline.

It's obvious it's in a nascent state. I would expect those in the field to be embracing this technology and, instead of pooh-poohing the idea, coming up with ideas around how to better interact with the data sets we create; we are, after all, the 3D natives who understand better than most, from years of experience, the challenges of asset and environment creation. What could be more fulfilling than sharing your idea not just through a 2D window (big or small) but by putting your audience IN the environment we have created?

I too started out in Amiga land. LW has come a long way since then, and so too will VR this time....

I have observed first-hand the free sharing of ideas and concepts and lessons learned from others in the industry, in a way no other medium has been able to to date.

Now is the time to get on board and get our ideas incorporated into the paradigms that will be used in this new medium.

In a sense this thread has been hijacked, but Chilton seems to be the only LW dev/plugin creator that has made tools plugging LW into this future medium.

\end spleen vent

ConjureBunny
02-28-2016, 11:09 PM
Good news!

[attached screenshot]

I decided to name things what they were, rather than a simpler version of them. The simple reason is that we're rapidly going to hit a point where standardization is vital to move the industry forward, and I plan on riding the hell out of that wave.

IPD is the distance between the centers of the pupils. This is an important one. The average in the US is different for kids and adults, and varies quite a bit. This one's measured in meters, because everything is in meters. Here's more info on IPD
https://en.wikipedia.org/wiki/Interpupillary_distance

Convergence Distance is the distance from the center point between the eyes to where the eyes converge at rest. In most humans, it's somewhere between 20 and 40 meters.

Polar Realignment Angle is the first of many cool tricks I have up my sleeve. As the user looks up or down, the stereo effect is reduced, to make the result less nauseating. The reason this problem exists has to do with the specifics of how viewer software works. It's impossible to fix this entirely, but it is, IMO, something we should endeavor to reduce.
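
To make those two settings concrete, here's a rough Python sketch of how an IPD and a convergence distance could translate into per-eye offsets and a toe-in angle. This is purely illustrative; the function name and formula are my own assumptions, not UberCam's actual code.

import math

def eye_setup(ipd, convergence_distance):
    # Toy example: derive per-eye lateral offsets (meters) and the toe-in
    # angle (radians) each eye needs so the optical axes cross at the
    # convergence distance. Everything is in meters, matching the plugin.
    half_ipd = ipd / 2.0
    left_offset, right_offset = -half_ipd, half_ipd
    toe_in = math.atan2(half_ipd, convergence_distance)
    return left_offset, right_offset, toe_in

# A 0.064 m IPD converging at 20 m needs only ~0.09 degrees of toe-in.
print(eye_setup(0.064, 20.0))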

This version ends a pretty massive rewrite of the overall series of algorithms. To get it right, I had to first create a raytracing 'emulator' that could be used to visually play back the entire render process, one ray at a time, so I could check them for position, rotation, and timing. Fun stuff :) Here's a picture of what the first half of a pass looks like, if I have it displaying all of the rays cast in a render up to around 50% down the screen:
[attached image]

Now that I have this all done, in the coming weeks I'm going to push out some improvements that I haven't seen done before. I think these will make Ubercam VR renders some of the most visually pleasing renders in the industry.

Samsung is bundling GearVR FREE with their new phones. They showed it off during the Oscars tonight. Google continues to throw their weight into Cardboard. Facebook owns Oculus Rift, and they're expected to do something there soon. Facebook now allows panoramic images, as does YouTube. And Apple is up to something, I'm pretty sure of it.

It's a hell of a good time to get into VR if you aren't already.

-Chilton

Danner
02-29-2016, 09:17 AM
Everyone in VR-land is clamoring for stereoscopic 360 video that doesn't suck. All videos in VR are monoscopic, except for a few that are really just 180 and only work well looking at the center. We have been testing solutions for this lately, making several stereoscopic renders and videos at different angles and then merging them together; besides being complex, there are issues with seams and chromatic aberration distortion. Keep us posted!

ConjureBunny
02-29-2016, 10:12 AM
Yeah, it's tricky. Ubercam creates the best stereoscopic VR you can get, but it's not perfect, and a big part of that is a limitation of the viewers.

One of the problems I'm faced with is camera matching. There's no way to create the exact same stereoscopic image that a camera creates without creating a vastly inferior software camera, to match the real world camera.

Since Ubercam is a software camera, I can do some things that a real camera can't do, like reposition the stereo cameras in an infinite (well, limited by memory) number of different locations and directions per image, to create the final image. In the real world, the best you can get at the moment would be a massive rig of hundreds of cameras, and you'd then stitch them together. It's just not physically possible. Well, maybe possible if you have a camera that can move infinitely fast :D
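
For anyone curious what "repositioning the stereo cameras per direction" can look like, here's a minimal sketch of the usual omni-directional-stereo idea: for each column of the equirectangular output, the two eyes sit on a small circle around the rig center and are offset tangentially to that column's viewing direction. This is my own illustration of the general technique, not the plugin's actual algorithm.

import math

def ods_eye(longitude, ipd, eye):
    # For an equirectangular column at 'longitude' (radians), place the eye
    # on a circle of radius IPD/2 around the rig center, offset sideways
    # (tangent to the view direction). eye = +1 for right, -1 for left.
    # Latitude is ignored here for brevity.
    r = ipd / 2.0
    view = (math.sin(longitude), math.cos(longitude))   # ray direction (x, z)
    origin = (eye * r * view[1], -eye * r * view[0])    # tangential offset
    return origin, view

# Right-eye origin and ray direction for the column facing straight ahead:
print(ods_eye(0.0, 0.064, +1))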

-Chilton

Kaptive
02-29-2016, 10:32 AM
Dillon, I find it hard sometimes to understand the resistance I perceive from the community here. It's as though banging on about VR is an attack on LW or the traditional 3D graphics toolset/pipeline.

It's obvious it's in a nascent state. I would expect those in the field to be embracing this technology and, instead of pooh-poohing the idea, coming up with ideas around how to better interact with the data sets we create; we are, after all, the 3D natives who understand better than most, from years of experience, the challenges of asset and environment creation. What could be more fulfilling than sharing your idea not just through a 2D window (big or small) but by putting your audience IN the environment we have created?


\end spleen vent

I don't think it is so much about an attack on LW or traditional methods. It's more about what is most efficient. You've been in the game for a long time, so you know just how long we sit at these screens creating content. So to imagine spending that same period of time in goggles, waving arms around to achieve the same results seems a bit silly until someone invents a viable solution. So far, I've not seen one. Now this isn't to say it wouldn't be great fun, interesting to experiment with etc.... it's just practicality, not aversion. This is why I've said a couple of times that designing the scenes within the environment (especially if VR is the intended final destination) would be a very practical approach. But modelling and texturing in it? I don't see the benefit. It's like 3d TV/monitors... it wouldn't enhance the workflow for LW for example. It'd be fun to see my objects in a true 3d way, but it's not going to make anything faster or better.

So, I hope you understand my point of view here. I'm not poo pooing VR, just taking a very practical approach to it. I think it's cool that people are developing interesting ideas and uses for it, but it's all in its infancy. A to B to C to D. Some developers seem to be jumping in at D, when really getting B right first would be more practical... wait for the input devices to catch up. I can think of a tonne of "before their time" programs and devices that were too forward thinking, and were trying to run before they had the legs to support it.

Anyways, hope that helps explain at least my view. It is exciting, but VR tech hasn't actually moved on that far since the 80s. Quality and precision are much better, but there is still a long, long way to go, especially to make it accessible/affordable. The big push in VR now should bring in the investment needed to take us to the next step, and that can only be a good thing.

pinkmouse
02-29-2016, 11:00 AM
...but VR tech hasn't actually moved on that far since the 80s. Quality and precision are much better, but there is still a long, long way to go, especially to make it accessible/affordable....

Indeed. For odd bits of presentation, for a short period, people will put up with the headset and handwaving, (hard core gamers aside :) ), and I can see stuff like virtual architectural projects or product prototypes doing really well. But can you really see anyone not directly involved in content creation really wanting to sit around all day in goggles? Would you? Honestly? And if you would at work, how about at home, sitting on the sofa cuddling your "significant other"?

edit: And I do see it being similar to 3D TV. Just remember all the hype a few years ago, manufacturers jumping on the bandwagon, spending shedloads of money on developing and implementing technologies, broadcasters punting special events etc. etc. Yet the consumers spoke, they couldn't even be bothered wearing the equivalent of a pair of plastic sunglasses!

VR is niche, and always will be, but that doesn't mean that LW shouldn't be a significant tool for creating it. ;)

ConjureBunny
02-29-2016, 12:07 PM
Just a year from now, I can easily see VR as being a force multiplier in the real estate world. We already have Ubercam users doing walkthrough videos for their clients. The ability to walk through a fully rendered building, and look around it while you walk, can't be touched with traditional archvis techniques.

Medical visualization is another huge area. I have four different users, that I know of, who say their entire company is embracing VR now, when it comes to showing how their products work inside the body. A movie is great, but a movie you can immerse yourself in is greater.

It's a niche market for now, but there are some really big companies throwing their weight into this. For a niche, it's growing fast in a lot of simultaneous directions.

Maybe it's boredom with traditional media. Who knows. People seem to love this VR stuff though.

-Chilton

Dillon
02-29-2016, 12:14 PM
Just a year from now, I can easily see VR as being a force multiplier in the real estate world. We already have Ubercam users doing walkthrough videos for their clients. The ability to walk through a fully rendered building, and look around it while you walk, can't be touched with traditional archvis techniques.

Medical visualization is another huge area. I have four different users, that I know of, who say their entire company is embracing VR now, when it comes to showing how their products work inside the body. A movie is great, but a movie you can immerse yourself in is greater.

It's a niche market for now, but there are some really big companies throwing their weight into this. For a niche, it's growing fast in a lot of simultaneous directions.

Maybe it's boredom with traditional media. Who knows. People seem to love this VR stuff though.

-Chilton

Yes. Unless you've personally experienced "presence" in VR, you can't understand how powerful VR is. It really is a brand new medium.

Markc
02-29-2016, 12:49 PM
Samsung is bundling GearVR FREE with their new phones. They showed it off during the Oscars tonight. Google continues to throw their weight into Cardboard. Facebook owns Oculus Rift, and they're expected to do something there soon. Facebook now allows panoramic images, as does YouTube. And Apple is up to something, I'm pretty sure of it.

And now there's Microsoft HoloLens......only $3000 for developer version 8/

SteveH
02-29-2016, 01:15 PM
VR isn't something that will be niche, in my opinion. I agree with Dillon that you can explain it to someone, but it isn't until you actually put on a headset that you understand how powerful it can be. I recently took my Gear VR to work and showed people some images I had made with LW and UberCam. I had them sit down in a swivel chair and they looked at an image with the Gear VR. They were like "oh cool" in a not-that-impressed way. Then I swiveled the chair they were sitting on - and 100% of the time they were just blown away. "OH, so THAT is what they are talking about in VR - I see now" was a common feeling from them. And that was a basic Gear VR - Oculus can be 10 times more immersive. I think VR is a major game changer. Maybe not right away for our actual production of 3D - but for viewing the final output - I think it's going to be major.

ConjureBunny
02-29-2016, 01:44 PM
And now there's Microsoft HoloLens......only $3000 for developer version 8/

Yeah I got that announcement this morning. I'm on the fence on that one.

But on the other hand, you can get those Golden Arches for your eyes (http://3dvrcentral.com/2016/02/29/mcdonalds-is-now-making-happy-meal-boxes-that-turn-into-virtual-reality-headsets/).

-Chilton

Kaptive
02-29-2016, 04:23 PM
HoloLens in concept sounds like a great idea, and the advert made it seem like it was going to be affordable by one and all. But then it's $3,000, and the viewing area is meant to be pretty small. This one is going to take quite a while to cook until it is truly ready. I have to say, though, I kind of like the concept. One problem I have with VR (especially when we are talking here about working in that environment) is shutting out the world. HoloLens does seem to be finding the balance.

ConjureBunny
02-29-2016, 05:28 PM
VR isn't something that will be niche, in my opinion. I agree with Dillon that you can explain it to someone, but it isn't until you actually put on a headset that you understand how powerful it can be. I recently took my Gear VR to work and showed people some images I had made with LW and UberCam. I had them sit down in a swivel chair and they looked at an image with the Gear VR. They were like "oh cool" in a not-that-impressed way. Then I swiveled the chair they were sitting on - and 100% of the time they were just blown away. "OH, so THAT is what they are talking about in VR - I see now" was a common feeling from them. And that was a basic Gear VR - Oculus can be 10 times more immersive. I think VR is a major game changer. Maybe not right away for our actual production of 3D - but for viewing the final output - I think it's going to be major.

This is so true. When someone tries it for the first time, and looks around, it just blows them away. Every single time.

Except VFX people. They're harder to impress :D

-Chilton

ConjureBunny
02-29-2016, 05:31 PM
HoloLens in concept sounds like a great idea, and the advert made it seem like it was going to be affordable by one and all. But then it's $3,000, and the viewing area is meant to be pretty small. This one is going to take quite a while to cook until it is truly ready. I have to say, though, I kind of like the concept. One problem I have with VR (especially when we are talking here about working in that environment) is shutting out the world. HoloLens does seem to be finding the balance.


I agree--augmented reality will be where we do a lot more work.

I can absolutely see a 'we can almost do this now' scenario for augmented reality:
Imagine being able to use a coffee table in front of you to lay out objects in a scene. Then imagine working with other effects people, or a director, or actors, all interacting with virtual pieces in scene on a very real table. It's going to be freakin' awesome.

I'm not sold on virtual reality for that scenario, but augmented reality? absolutely.

-Chilton

Paul Brunson
03-12-2016, 01:15 PM
I have a little thread going at the liberty3d forums with UberCam Immersive Stereo animations and stills. Nothing amazing, mostly just technical tests. But I thought I might share the tests for other people learning about and interested in VR.

http://www.liberty3d.com/forums/viewtopic.php?f=34&t=1108&p=7076#p7076

(Sorry for resurrecting the thread. But I thought it better to post here than start a new thread for a link)

jwiede
03-12-2016, 01:35 PM
I agree--augmented reality will be where we do a lot more work.

NASA apparently agrees. (https://blogs.windows.com/devices/2016/02/20/microsoft-hololens-in-space-making-science-fiction-mixed-reality/)

ConjureBunny
03-12-2016, 10:14 PM
So, we've got version 2.4 coming out, which will have the aforementioned pole flattening feature.

The trick here is that I already have a better solution, which we'll be sending out, soon.

And now that I've got that one in the bag, I can think of an even better solution than that.

And there's the floodgate. Ubercam is way beyond 'making 360 stereo images' and we're now in the 'making them *better*' phase. I have no idea where this is going, or when it will stop. To be honest, we're limited by the playback engines (and I'm tempted to take on that part as well). The playback engines in place now are a result of trying to match real world cameras, and we can do so much more. So I'm going to keep pushing out new tweaks as I can come up with them.

My apologies to anyone making long movies with this, only to find you have to redo older parts because the newer parts look better. I guess that's the march of progress--I will try to make sure new tweaks have some kind of backwards compatibility, but I can't promise that.

-Chilton

jwiede
03-12-2016, 11:49 PM
I will try to make sure new tweaks have some kind of backwards compatibility, but I can't promise that.

I wouldn't worry much beyond reasonable defensive coding against poison files/settings and perhaps updating settings on load (with notification so user can decide whether to save new settings). Output improvements are a part of software life, and users need to decide how much risk and change they can manage in their production schedules. Anyone using software in production should know not to switch to newer versions without planning/testing -- if they don't, it's a lesson they need to learn. Focusing more on documenting changes and known issues will enable users to better make their own decisions.
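
Purely as a generic illustration of that "update settings on load" suggestion (nothing here is UberCam-specific; the field names and version number are invented), something along these lines is usually enough:

CURRENT_VERSION = 3
DEFAULTS = {"version": CURRENT_VERSION, "ipd": 0.064, "convergence": 20.0}

def load_settings(raw):
    # Defensive load: unknown versions, missing fields, and poison values
    # fall back to defaults instead of crashing, and the caller is told
    # whether anything was migrated so the user can choose to re-save.
    settings, migrated = {}, False
    for key, default in DEFAULTS.items():
        value = raw.get(key, default)
        if not isinstance(value, type(default)):
            value, migrated = default, True
        settings[key] = value
    if settings["version"] != CURRENT_VERSION:
        settings["version"], migrated = CURRENT_VERSION, True
    return settings, migrated

# An old file with a bad value comes back usable, flagged for re-saving:
print(load_settings({"version": 1, "ipd": "oops"}))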

Glad to hear about the further improvements coming! Don't forget, though, that while operational improvements are great, they are of minimal value if customers have no efficient way to understand/optimize their workflows to make use of those operational improvements, or come up with "non-magical" (read as: user-understood) workflows in the first place.

There are still areas where workflow and settings are pretty sub-optimal from a UX perspective, and UI and documentation improvements could help make that UX substantially better. For example, putting up fields labeled with obscure terms, with no tooltip or hover help to provide explanations, reinforces discoverability/usability issues. Documentation that goes into the theory of _why_ users should use certain workflows, and the relative benefits therein, would also enhance the overall UX. Right now, documentation and help at both workflow-level and specific field-/control-level are both still a bit... sparse.

ConjureBunny
03-13-2016, 02:33 PM
Documentation that goes into the theory of _why_ users should use certain workflows, and the relative benefits therein, would also enhance the overall UX. Right now, documentation and help at both workflow-level and specific field-/control-level are both still a bit... sparse.

I absolutely agree with this. I'm hoping to do a fairly in-depth explanation of the new tools for the next update of the docs.

Additionally, I've been trying to figure out a way to cram some kind of simple documentation into the product itself, so you have some info on the camera screens themselves. On this next version I think I've spent more time trying to shoehorn a solution into the UI than on actual product code, but I think it will be worth it, if I can make it simpler to understand.

-Chilton

jwiede
03-13-2016, 08:38 PM
On this next version I think I've spent more time trying to shoehorn a solution into the UI than on actual product code, but I think it will be worth it, if I can make it simpler to understand.

Sounds really promising! Those kinds of additional explanatory content, etc. in-app/plugin, are often what elevate a product to "great" from simple "good enough".

I really wish LW3DG would put some effort into boosting the abilities of the UI toolkit, etc. to support such help content, because it's so important both for LW itself and for third-party devs. Even if they could just add a CORE-style "webkit pane view" and tooltips, those would allow substantial improvements over what's currently possible with native LW UI options.

ConjureBunny
03-16-2016, 08:42 PM
New build, 2.4 in da haus...

http://www.liberty3d.com/2016/03/ubercam-2-4-update/

Adds polar convergence start and end points. Basically, these are the angles at which it starts to align the toe-in and draw the eye IPD together; eventually the eyes are at the same point, facing the same direction in perfect parallel.

This addresses a limitation in the playback systems where looking down looks crazy. Now, not so crazy.

Why is this a setting? Because there's no 'right answer' to how this should work, at the moment. Sure, I can make some suggestions (the default values), but one of you intrepid souls might find a better combination. If so, lemme know...
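
To picture what those start and end angles do, here's a toy sketch of the kind of blend they imply: full stereo below the start angle, none past the end angle, and a smooth fade in between. The smoothstep and the example values are my own guesses for illustration, not the actual implementation.

def effective_ipd(pitch_deg, ipd, start_deg, end_deg):
    # pitch_deg: how far the view is tilted toward a pole (0 = horizon).
    # Full IPD below start_deg, zero IPD (parallel, coincident eyes) at
    # end_deg and beyond, with a smooth blend in between.
    p = abs(pitch_deg)
    if p <= start_deg:
        return ipd
    if p >= end_deg:
        return 0.0
    t = (p - start_deg) / (end_deg - start_deg)
    t = t * t * (3.0 - 2.0 * t)   # smoothstep to avoid a visible kink
    return ipd * (1.0 - t)

# Made-up angles: stereo untouched up to 45 degrees, gone by 80.
print(effective_ipd(60.0, 0.064, 45.0, 80.0))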

Thanks!
-Chilton

rdolishny
03-17-2016, 08:45 PM
I just bought UberCam 2.4 with the current coupon code... the manual alone is worth the price of admission. A lot of work went into it, and the joy and enthusiasm for seamless one-click 360 immersive animation rendering cannot be overstated.

Many of the features are Lightwave exclusive.

And I'm just starting to explore the other cameras (like a faster Perspective camera if you're not using DOF and MB).

But the one-click 360 camera has to be seen to be believed. And it works in VPR (turn off Draft rendering). Astonishing.

I'll have an animation ready tomorrow. It took me 90 seconds to set up after I invested the 20 minutes to read the manual thoroughly. It's worth a second read.

LWers: this plugin opens up an entire world of VR and 360 animation that people are craving. Go out and make some content.

Thanks Kat and Liberty3D and whoever else is involved with this seamless plugin.

www.ardee.xyz

jeric_synergy
03-18-2016, 01:21 AM
I'm liking the product, but to me there's WAYYYYyyyyy too much preamble in the manual. It's kinda backwards: a bunch of fluff up front, followed by stuff you will actually want to know.

ConjureBunny
03-19-2016, 01:42 PM
I've received some excellent feedback on this version, but if you have any questions or complaints, speak up!

What would make your pipeline easier?

-Chilton

Markc
03-19-2016, 04:27 PM
Chilton, any progress on the Mac Rift Viewer? :thumbsup:

SteveH
03-19-2016, 06:05 PM
I've received some excellent feedback on this version, but if you have any questions or complaints, speak up!

What would make your pipeline easier?

-Chilton

I believe a super model to hold my beer would really help my pipeline Chilton! :D
Couldn't hurt at any rate.

Markc
03-20-2016, 08:43 AM
There are only two plugins in the OSX folder.
Should Mac users still use the old version of L3DStereoFOSX?

Playing with Version 2.4 Immersive Cam at 3840x3840
Aspect 1.0 - bad result
Aspect 1.16 - good result

ConjureBunny
03-20-2016, 10:49 AM
There are only two plugins in the OSX folder.
Should Mac users still use the old version of L3DStereoFOSX?

Playing with Version 2.4 Immersive Cam at 3840x3840
Aspect 1.0 - bad result
Aspect 1.16 - good result

Aspect should always be 1.0. You can change it if you are super weird and need super weird results. I think we have one user who does need that, which is why I left it in.

I mean, you can halve the dimensions and set the aspect to .5, but I'm not sure what that actually buys you, if anything.

There should be one folder in the Mac folder, which installs everything, I think.

I'm still working on Oculus Rift support for the Mac, but an update should be along soon.

And lastly, I'm in the middle of solving a programming problem at the moment, so if this message sounds cold and mechanical, that's why :)

...will return to human mode later.

-Chilton

Markc
03-20-2016, 11:04 AM
Understood.
When you're back in human mode: my render doesn't show properly at 1.0 aspect, but it does if I tweak it.
This is not urgent :)

ConjureBunny
03-20-2016, 01:45 PM
Understood.
When you're back in human mode: my render doesn't show properly at 1.0 aspect, but it does if I tweak it.
This is not urgent :)

Hi,

OH. I just noticed what you said. I read it wrong the first time.

WTH. I'll take a look!

-Chilton

ConjureBunny
03-20-2016, 04:58 PM
BTW, we're doing a 25% off sale at the moment.

http://www.liberty3d.com/2016/03/its-the-liberty3d-com-spring-training-sale-get-25-off/

So if you're thinking of buying it, don't buy it without using the coupon code.

-Chilton

JamesCurtis
03-20-2016, 06:14 PM
Any idea if the crashing experienced with the plugin when you have a render, setting, or other window on a second monitor, can be fixed. I'm using Windows 8.1 and I like to have a render window open on my second monitor because otherwise it hides the camera view.

ConjureBunny
03-20-2016, 08:31 PM
I didn't know there was a second monitor problem. I'll take a look!

-Chilton

ConjureBunny
03-21-2016, 12:45 AM
Hi,


There are only two plugins in the OSX folder.
Should Mac users still use the old version of L3DStereoFOSX?

Playing with Version 2.4 Immersive Cam at 3840x3840
Aspect 1.0 - bad result
Aspect 1.16 - good result

Bigger question I forgot to ask: why are you shooting at 3840x3840 instead of 2048 or 4096?

-Chilton

spherical
03-21-2016, 12:52 AM
Out of my depth in 360 video ATM, but 3840 is the native resolution of this 4K monitor. Yes, it will do 4096, but that's a bit overkill, no? Doesn't YouTube specify 3840? Or am I all wet? :D I mean, you're da man on 360 Vid around here, so your question caught my eye, as it doesn't line up with what I thought I knew. Wouldn't be the first time.... :)

spherical
03-21-2016, 01:31 AM
New build, 2.4 in da haus...

Have the final build email links gone out? I see people playing with it, so I'm guessing "yes". I received a message from Kat on 2/29/2016 about the new version. It contained an attached beta but, since then, zip. It was a reply to an earlier email conversation, so not part of the database email blast stream. Perhaps the database may not be updated to the email address that we prefer to use, but nothing has arrived at the old email either. We run our own email server, so checking spam traps is a default. I have an internal project that would be a great use for it. Thanks, guys.

ConjureBunny
03-21-2016, 11:44 AM
Hi,

Bah. Everyone knows 4k is 4000 x 4000

:D

I'm coming at this from the other end--this time last year I was helping design hardware, and the assumption there was that all video would be 4096 (an even multiple of 256 or something--I forget what that's called) so the lower-end hardware could push it around faster. 4096 is the cap for some Droid devices.

But then, what I know about the way the video is displayed in the apps is that it can be any even multiple. So yours should work fine.

I've just been using 4096 out of habit, but now that you made me think about it, hmmm.....

So the S7 is the current flagship. It displays a vertical (during landscape orientation) size of 1440 pixels.

But you only see part of the image at a time.

There's a way to solve this with math but my head's not in the game at the moment. If anyone else wants to figure it out, I'm all ears.

So I guess there are two things to take from this stream of consciousness post...
1) I use 4096 out of habit.
2) It should look right at any square size, even 3123 x 3123, and it doesn't. That's a bug I need to fix.

-Chilton

spherical
03-21-2016, 04:25 PM
IIRC, when doing VR and/or planar movies in QuickTime ages ago, the recommendation was that any pixel dimension was OK as long as it was divisible by 8 in each direction, in order to make downsampling faster.

The question I have is: you say you use 4096 out of habit. If 2K is 1920 then 4K is 3840. Why render the extra? I think that the rounded-up label is making folks think that "true" 4K is, well, 4 x 1024 = 4096. Yes, it is. Doesn't mean that the display device's native resolution equals that, however.

ConjureBunny
03-21-2016, 06:09 PM
IIRC, when doing VR and/or planar movies in QuickTime ages ago, the recommendation was that any pixel dimension was OK as long as it was divisible by 8 in each direction, in order to make downsampling faster.

The question I have is: you say you use 4096 out of habit. If 2K is 1920 then 4K is 3840. Why render the extra? I think that the rounded-up label is making folks think that "true" 4K is, well, 4 x 1024 = 4096. Yes, it is. Doesn't mean that the display device's native resolution equals that, however.

That makes sense to me.

I meant habit, as in programming habit. Everything's in powers of 2. 1024, 2048, 4096, etc.

There are so many issues with VR at the moment and pixel display that I think the rulebook can be thrown out, ripped apart, burned, and snorted.

The lens only uses a subset of the screen.
The lens distortion only allows a subset of that to be really usable. The rest is blurry.
The software only shows a subset of the entire image.
The focal length and field of view dictate how much of that is visible, and how zoomed in it is.

And all of that neglects the really fun stuff like distortion due to dominant eye preference, visual saturation during movement, IPD blurring, and stuff like that.

-Chilton

spherical
03-21-2016, 06:45 PM
Figured that to be the case. Finding the lower limit of all that could lead to a sweet spot that yields a faster render in both the source application and the VR playback. IOW, if only 20% of the image can be displayed at one time and the viewport is only 1920 wide, a lower source resolution may yield output that is indistinguishable from the full-res version, and a faster frame rate.
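
Rough back-of-the-envelope version of that sweet-spot idea (my own numbers, not a measured spec): match one source pixel to one display pixel across the visible field of view, and anything wider than that is wasted at playback time.

def sweet_spot_width(viewport_pixels, visible_fov_deg):
    # The full equirectangular source spans 360 degrees horizontally, but
    # the viewer only ever shows visible_fov_deg of it across the viewport.
    return viewport_pixels * 360.0 / visible_fov_deg

# e.g. a ~1280-pixel-wide per-eye view over ~96 degrees wants a source
# about 4800 pixels across; a 1920 view over 90 degrees about 7680.
print(sweet_spot_width(1280, 96.0), sweet_spot_width(1920, 90.0))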

ConjureBunny
03-21-2016, 07:09 PM
Figured that to be the case. Finding the lower limit of all that could lead to a sweet spot that yields a faster render in both the source application and the VR playback. IOW, if only 20% of the image can be displayed at one time and the viewport is only 1920 wide, a lower source resolution may yield output that is indistinguishable from the full-res version, and a faster frame rate.

Exactly!

Until they make better headsets and you have to re-render everything :D

-Chilton

spherical
03-21-2016, 09:39 PM
Well, there is that.... :)

Paul Brunson
03-22-2016, 02:05 PM
I'm rendering a series of tests at 4096 x 4096; it definitely takes a while. Once I've got the renders done I'll be using Fusion to scale them down and do some resolution tests. The idea is to start at the highest res currently supported and go down from there, looking for a lower resolution that looks 'good enough'. (I'll do some 30 fps and 60 fps versions as well.)

In my tests on stills I've actually found that sometimes higher resolutions can make things look worse. For example, stepping patterns along edges due to the image being scaled down.


Question for Chilton:
I notice that the contest entries for Otoy and Oculus "render the metaverse" are in cubic format: 12 images, a stereo pair for each of the 6 sides. Granted, these are stills. Are there any Oculus video formats that support cubic mapping? It seems like it would allow for no distortion at the poles.

Some links:
https://render.otoy.com/vr_gallery.php
https://www.reddit.com/r/oculus/comments/2v02d1/carmack_says_cube_maps_the_ideal_target_format/

ConjureBunny
03-22-2016, 02:34 PM
Question for Chilton:
I notice that the contest entries for Otoy and Oculus "render the metaverse" are in cubic format: 12 images, a stereo pair for each of the 6 sides. Granted, these are stills. Are there any Oculus video formats that support cubic mapping? It seems like it would allow for no distortion at the poles.



You know, I noticed that. That seems odd to me. I need to find out how this format is rendered back, because as far as I know, it's just a cube map. If it's just a cube map, it might not have polar distortion, but it certainly can't do stereo poles, since the way the players work doesn't allow that. So I'm not sure what the benefit is, if there is one. I need to do some research though to find out.

-Chilton

spherical
03-22-2016, 05:07 PM
but it certainly can't do stereo poles,

Can you elaborate, please?

ConjureBunny
03-23-2016, 03:01 PM
New build en route, v2.4.1, which fixes an issue with the location of the start of the ray from the eye for each pass.
It's a trivial issue, but one I wanted to make sure was precise.

[attached image]

-Chilton

ConjureBunny
03-23-2016, 06:21 PM
Also, I think this build fixes Markc's problem:

[attached image]

-Chilton

Nicolas Jordan
03-24-2016, 11:59 AM
I use UberCam mainly for the Panoramic Camera and find version 2.4 to be very stable compared to previous versions. I haven't had any crashes yet! :thumbsup:

spherical
03-24-2016, 10:39 PM
Is Radial Shift intended to be interactive in the camera viewport? I get no feedback.

spherical
03-25-2016, 03:54 AM
Now we're talkin' the future of VR:

http://www.pcworld.com/article/3047549/virtual-reality/pornhubs-wild-free-vr-porn-channel-will-blow-your-mind.html

Paul Brunson
03-25-2016, 10:15 AM
Now we're talkin' the future of VR

Personally I'm hoping other industries see and grab onto the potential of VR. I hope VR doesn't get an "isn't that tech mainly used for porn" stigma attached to it.

I'm still angry at Hollywood for completely botching the 3D experience in theaters. All of the terrible "stereo in post" work has really reduced it to just a gimmick.

ConjureBunny
03-25-2016, 10:17 AM
Is Radial Shift intended to be interactive in the camera viewport? I get no feedback.

I ... don't know. Interesting.

IIRC that was a camera added by popular request, and I'm not completely sure if it was meant to be interactive or not. I'll take a look.

-Chilton

ConjureBunny
03-25-2016, 10:20 AM
Personally I'm hoping other industries see and grab onto the potential of VR. I hope VR doesn't get an "isn't that tech mainly used for porn" stigma attached to it.

That's an interesting point. If the industry doesn't take off, people might assume anyone watching VR content is watching porn. Hmm.

-Chilton

Paul Brunson
03-25-2016, 10:23 AM
In other news! Oculus recently updated their Photo 360 viewer on the Gear VR. MASSIVE improvement in the viewing quality at the poles of images rendered in equirectangular format (i.e. UberCam's Stereo Immersive 360 camera).

At first I thought my most recent render had somehow come out amazingly better, and had to check my prior renders on the Gear VR again. It's really quite a huge improvement; it brought the quality level at the poles up to a similar level as the cubic-mapped images.

Paul Brunson
03-25-2016, 10:38 AM
That's an interesting point. If the industry doesn't take off, people might assume anyone watching VR content is watching porn. Hmm.

Motivates me to try and create other cool content. Outside my various test scenes, I'm working on a model of a family cabin we're building. Everyone has different opinions on paint colors etc. I'm hoping I can have them stand in the unfinished rooms, put the Gear VR on and see a preview of what it might look like.

I think architectural clients would really enjoy being able to go to a location, put on the headset, and see their new house, building, renovation, etc. right there on the spot.

ConjureBunny
03-25-2016, 10:46 AM
In other news! Oculus recently updated their Photo 360 viewer on the Gear VR. MASSIVE improvement in the viewing quality at the poles of images rendered in equirectangular format (i.e. UberCam's Stereo Immersive 360 camera).

At first I thought my most recent render had somehow come out amazingly better, and had to check my prior renders on the Gear VR again. It's really quite a huge improvement; it brought the quality level at the poles up to a similar level as the cubic-mapped images.

Interesting! I hadn't looked closely. Going to go do that now.

spherical
03-25-2016, 02:38 PM
I ... don't know. Interesting.

IIRC that was a camera added by popular request, and I'm not completely sure if it was meant to be interactive or not. I'll take a look.

Thanks. That's one camera that needs to be interactive. It's difficult to control perspective otherwise. Usually it's a fairly fine adjustment in swing, and dialing that in through a blizzard of F9s would be tedious.

While we're on the subject... is there a possibility of combining the Radial Shift and Real Lens cameras? I have made a swing/shift camera from the native Advanced Camera using the "shoot through a plane" trick and it works great. It's just like a view camera, where the swing happens all on its own, without having to set a factor, rotate the camera back into position to reframe the shot, see if it looks good, rinse, repeat. IOW, the perspective just changes right there with the camera locked down. All well and good. BUT it would be a boon to be able to take the library of real-lens data and put it on the Advanced Camera while using the "shoot through a plane" trick.

Having the ability to rotate the lens plane and film plane also allows for DOF skewing. I don't think this is possible with the current Shift cameras, as there's no rotation of either; only shift and recenter. IIRC there's a plugin from Cool Museum with a rotate that works well, but it's 32-bit only.
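
For anyone curious why the "shoot through a plane" trick behaves like a view camera: a pure shift is just an off-centre viewing frustum rather than a camera rotation, which is why the perspective changes while the camera itself stays locked down. A rough sketch of that idea (textbook asymmetric-frustum math, nothing to do with how UberCam or the Advanced Camera actually implement it):

import numpy as np

def shifted_projection(fov_deg, aspect, near, far, shift_x=0.0, shift_y=0.0):
    """Perspective matrix (OpenGL glFrustum convention) with the window slid off-centre.

    shift_x / shift_y are fractions of the half-width / half-height of the
    near-plane window, so shift_y = 0.5 behaves like a rising front on a view
    camera: the film plane stays parallel to the subject and only the viewing
    window moves, which keeps verticals vertical with the camera locked down.
    """
    half_h = near * np.tan(np.radians(fov_deg) / 2.0)
    half_w = half_h * aspect
    # Slide the near-plane window instead of rotating the camera.
    left, right = -half_w + shift_x * half_w, half_w + shift_x * half_w
    bottom, top = -half_h + shift_y * half_h, half_h + shift_y * half_h
    return np.array([
        [2 * near / (right - left), 0.0, (right + left) / (right - left), 0.0],
        [0.0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

Tilting the lens or film plane (the Scheimpflug-style DOF skew mentioned above) is a genuinely different operation that this matrix can't express, which is presumably why it needs its own rotation controls rather than just a shift value.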

Markc
03-27-2016, 08:45 AM
Hey Chilton,

Here are a couple of screen grabs which may be useful for the aspect problem.

The 2D Imm Cam seems to extend to the boundary of the environment (which renders properly).
The 3D Imm Cam doesn't extend as far (which doesn't render properly at Aspect 1.0, but does at Aspect 1.16).
But I also noticed while taking these screen grabs that Aspect 1.0 shows a 'broken' segment in the camera!

ConjureBunny
03-27-2016, 09:28 AM
Interesting. That looks like more of that wonky frame setting. Aspect 1.0 should wrap around perfectly, under all conditions, but right now it's affected by frame size.

What is Frame set to in these?

-Chilton

Markc
03-27-2016, 09:59 AM
Frame is set to the default 0.5079 (my default, anyway).

On a separate issue: if I clone a 2D Imm Cam in the Scene Editor and then hit Save/Save Increment/Save As, it bombs out LW. If I add a new camera instead, it's fine.

Since LW bombed a couple of times, after reopening it and adding a new camera, it defaults to Frame 0.5906, Aspect 1.0, and displays fine in the 2D/3D Imm cams... strange... BUT there is something weird going on depending on how I select a camera.
It will show differently in VPR depending on whether I pick the camera from the pulldown at the bottom of the LW interface, the Camera panel, the Scene Editor, or the camera selection at the top of the interface. For example, if I select a 2D camera in the Camera panel, then select Current Camera in the LW interface, then select the 3D camera from that same pulldown, it displays fine at Aspect 1.0; if I re-select the 3D camera in the Camera panel, it doesn't.

Sorry if this is a bit confusing.
...

Markc
03-27-2016, 10:06 AM
OK, I might have it sussed.
Aspect set to 1.0, Frame at 0.5906.
If Current Camera is selected in the Camera pulldown in the LW interface and I select from the bottom Camera pulldown menu, all is fine.
The trouble seems to be when selecting from the Camera panel or Scene Editor.

ConjureBunny
03-27-2016, 11:11 AM
Frame is set to the default 0.5079 (my default, anyway).

On a separate issue: if I clone a 2D Imm Cam in the Scene Editor and then hit Save/Save Increment/Save As, it bombs out LW. If I add a new camera instead, it's fine.

Since LW bombed a couple of times, after reopening it and adding a new camera, it defaults to Frame 0.5906, Aspect 1.0, and displays fine in the 2D/3D Imm cams... strange... BUT there is something weird going on depending on how I select a camera.
It will show differently in VPR depending on whether I pick the camera from the pulldown at the bottom of the LW interface, the Camera panel, the Scene Editor, or the camera selection at the top of the interface. For example, if I select a 2D camera in the Camera panel, then select Current Camera in the LW interface, then select the 3D camera from that same pulldown, it displays fine at Aspect 1.0; if I re-select the 3D camera in the Camera panel, it doesn't.

Sorry if this is a bit confusing.
...

Not confusing. A wee bit irritating though. None of this should be this hard. I'll see what I can do to overcome the frame issue.
As for it crashing, that's a bug, and I need to figure that one out. The fact that it happens on your Mac is encouraging, since I prefer the Mac debugger over the Windows offerings.

Does that happen on an empty scene, and if not, can you send me a scene it happens in?

Thank you!
-Chilton

Markc
03-28-2016, 05:00 AM
Hey Chilton,

Just tried a fresh scene; it only bombs out on save if I clone a 2D Immersive Camera (haven't tried any of the other L3D cams, but the 3D Stereo Imm Cam is OK).
(BTW, I am on LW 2015.3)

Markc
03-29-2016, 03:39 PM
Hi Chilton,

Do you know if Ubercam works with the SNUB Launcher?
I tried a render yesterday and the frames were black.
I have asked Mike from Dreamlight, who says that if Ubercam works with LWSN it should work.

ConjureBunny
03-29-2016, 05:47 PM
Well.

I'll check with Mike on that and get it figured out. Thanks for letting me know!

-Chilton

Zerowaitstate
03-30-2016, 08:24 AM
I just jumped into runtime 1.3 to check out the Oculus Store side of things on the DK2. Is this going to kill my UberCam, or conversely, will updating my UberCam to the latest version kill the 1.3 runtime?

Markc
03-30-2016, 12:37 PM
AFAIK, the Oculus runtime should not affect LW/Ubercam (the Mac version is only 0.5.0.1, since they dropped Mac support).
It will probably only affect what content you can view from the store.

jeric_synergy
04-08-2016, 01:29 PM
Chilton, does UC have issues with Lens Flare representation in the UI?? Like, an offset?

Markc
04-10-2016, 08:09 AM
Chilton, is the Mac Rift Viewer any closer to working?
Even a beta!

ConjureBunny
04-11-2016, 03:41 PM
Hi,


Chilton, does UC have issues with Lens Flare representation in the UI?? Like, an offset?

I think that's done as a post process, in which case it wouldn't quite work right. I'm not sure if there's anything I can do about it, but if you can throw together a sample scene that includes it, I'll take a look.

In case you're wondering why I can't put together something simple like that, here's why: I am lazy, and if I have to take more than a few steps to set something up, I'll find some way to procrastinate around it. If you make it real easy for me to replicate, I'll open it, think, "huh. that's weird", and then there's a much better chance it will get addressed :)


-Chilton

- - - Updated - - -


Chilton, is the Mac Rift Viewer any closer to working?
Even a beta!


Well, as for *closer*, who knows. But I was working on it a few minutes ago before I started checking the forums. It's my #1 annoyance right now, so it'll get done soon.

-Chilton

Markc
04-12-2016, 12:56 PM
Thanks, much appreciated.

jwiede
04-12-2016, 01:16 PM
Chilton, is the Mac Rift Viewer any closer to working?
Even a beta!

Seconded! Some of us are getting quite desperate. Any update?

Thanks!

- - - Updated - - -


Well, as for *closer*, who knows. But I was working on it a few minutes ago before I started checking the forums. It's my #1 annoyance right now, so it'll get done soon.

Fair enough! Thanks again!

Markc
04-17-2016, 09:14 AM
I'm trying to compare images on the Oculus Rift to the Gear VR.
I have copied a 360 image and a stereo 360 image to the gallery on my S6 phone, but they don't show up in Oculus 360 Photos on the Gear VR.
Do they need to be a particular format or resolution?

SteveH
04-17-2016, 09:21 AM
The images need to go in the Oculus/360 photos folder to view them.
Is that where you put your renders?

There should be no spaces - so it's oculus/360photos

Hope this helps - if not let us know.

Markc
04-17-2016, 09:39 AM
No, they are just in the gallery application.
Can I access that folder from the phone?
I don't have any Samsung software on my Mac to access the phone at present.

SteveH
04-17-2016, 09:46 AM
Try pressing on the picture and holding until you get a check mark on it. Then tap the three little circles and choose Copy To or Move To, then tap the plus sign, and I *think* you can make a folder from there in the Oculus folder. I say "think" because, to be honest, I'm old and phones are a practical mystery to me most of the time. If they were more like a computer it would be way easier. Man, I'm old......

Markc
04-17-2016, 10:32 AM
Thanks, that pointed me in the right direction.
I needed a folder called 360Photos in the Oculus folder.
If the name isn't exact, it doesn't work :thumbsup:
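
In case it helps anyone else on a Mac without the Samsung software: a quick sketch of pushing renders over USB with adb instead of fiddling with the phone's gallery. This assumes USB debugging is enabled and that internal storage is exposed at /sdcard (both assumptions on my part); the folder name still has to match exactly.

import subprocess

# Assumed destination on the phone -- the Gear VR gallery only scans this exact
# folder name, and the /sdcard root is an assumption that may differ per device.
DEST = "/sdcard/Oculus/360Photos/"

def push_render(local_path: str) -> None:
    """Copy one rendered 360 image onto the phone via adb (adb must be on the PATH)."""
    subprocess.run(["adb", "shell", "mkdir", "-p", DEST], check=True)
    subprocess.run(["adb", "push", local_path, DEST], check=True)

if __name__ == "__main__":
    push_render("stereo_360_test.jpg")  # hypothetical filename, use your own render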

SteveH
04-21-2016, 05:04 PM
Chilton

How is that new viewer coming along?
I tried to send you a private email - but "somebody" hasn't cleaned out their messaging folder so they are too full to receive any more...:D

ConjureBunny
04-21-2016, 08:02 PM
oh crap I forgot that's even possible.

Yeah it's coming along really well. We'll be sending a beta out to registered users soon. Probably this weekend.

My goal with this app is to give Ubercam users something they can use to actively promote their services to their clients. It's pretty cool. More info soon.

-Chilton

omichon
04-22-2016, 01:18 AM
My goal with this app is to give Ubercam users something they can use to actively promote their services to their clients. It's pretty cool. More info soon.

-Chilton
That would be very welcome :thumbsup:

Markc
08-07-2016, 04:36 AM
Chilton, any updates to the beta :D

jwiede
08-08-2016, 02:34 PM
Chilton, any updates to the beta :D

And/or regarding the Mac viewer for Oculus?

tonyrizo2003
08-08-2016, 04:58 PM
wow!! very cool!!

Lito
08-09-2016, 08:21 AM
Does this work with the HTC Vive? I am thinking about getting one.

Danner
08-10-2016, 10:29 AM
Yes, pretty much any stereoscopic 360 video can be played on the Vive; you'll need to download a viewer, though.

metallo
09-09-2016, 12:58 PM
I rendered this using UberCam: http://bit.ly/Sully3D. At about half the time per frame compared to the built-in Advanced camera in Layout, I'm thankful to have had the plugin. Roughly 60 days of total computer time (i7-4930 and i7-3770), monovision (NOT stereo), countless 32MB TIFF frames (uncompressed, with alpha), 30fps, 4096 x 2048, radiosity background only. Yes, my radiosity should have been cached and it's pretty ugly, but by the time I figured out my mistake, it was too late to restart everything.

I considered a full stereo render, but again time was an issue and I just don't think stereo is convincing or practical yet with this type of render (willing to be proven wrong). The issue of phasing the left/right offsets for all possible head angles is not trivial. I also looked into GPU rendering, but with very large textures and not much time to test I wasn't overly confident... plus my video cards are a couple of generations old already.
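
As a side note, the per-frame size quoted above checks out exactly as raw RGBA, which is worth knowing before you commit disk space to a render like this. A quick sanity check, just arithmetic on the numbers above:

# 4096 x 2048 pixels, four 8-bit channels (RGB + alpha), no compression.
width, height, channels, bytes_per_channel = 4096, 2048, 4, 1
frame_bytes = width * height * channels * bytes_per_channel
print(frame_bytes, "bytes =", frame_bytes / 2**20, "MiB per frame")    # 33554432 bytes = 32.0 MiB
print(frame_bytes * 30 / 2**30, "GiB per second of footage at 30fps")  # ~0.94 GiB/s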

I composited the frames in Adobe Premiere, exported to an MP4 container, and used the YouTube spatial metadata injector (https://support.google.com/youtube/answer/6178631?hl=en) to add the tags that get it recognized as 360 on upload. If you don't have a headset, get a cheap Google Cardboard type viewer (~$12 on Amazon) and enjoy this new 360 world.

More on the project at http://bit.ly/15493D


SteveH
09-09-2016, 02:14 PM
Awesome job on it. I love the audio as well as the visuals - surprised there was so little audio from the plane itself. I never heard him say he was going to ditch in the river, but maybe it was there. It would have been cool to hear what he said to the passengers (and what he was saying to the co-pilot) as well. No clue if that is publicly available, though.

What was your camera distance from the plane itself? Was there a reason you chose to put it so far off the side of the plane as opposed to, say, the front of the nose? Just curious. Not sure how I'd go about viewing this on my Gear VR, but I think it would be very cool. Great use of UberCam!

metallo
09-09-2016, 03:04 PM
In this video I focused a little more on some of the audio among the ground facilities, as that has often been overlooked. The majority of the transmissions from 1549 are still there. The audio from inside the cockpit is not public; the only way we can review that is via the transcript. It's difficult to read, but the full transcript is on the floating radar screen underneath you in this 360 video. I have a couple of other versions on YouTube that are much better in terms of following the accident timeline and dialog.

The camera position is really just meant to give some "dramatic angles" at various times. I experimented with separate camera locations, e.g. one along the runway so you could watch takeoff, then one in the air following, then one at the bird impact, etc., but the jarring between spatial locations is difficult to follow when you're trying to view in 360. Continuity was, for me, important. I would say the camera is mostly a couple hundred meters out.

Regarding Gear VR and YouTube, I had no idea until just now that it wouldn't let you watch YT videos. What an incredibly crippling "feature". I found a number of ways people try to work around this, one of which is this: https://play.google.com/store/apps/details?id=com.kunkunsoft.cardboardappforgearvr

Dillon
09-15-2016, 12:31 AM
After reading the comment thread on Facebook about the ominous silence coming from LW3D Group, it appears that Kelly/Kat Meyers is very, very, VERY unhappy with the team developing (are they?) LightWave. I think it's safe to assume that UberCam / Oculus VR development is dead for LW. With the vitriol spewing out of Kat's mouth, I can't imagine he's interested in developing plugins for LightWave anymore :(

Sh!t. I am moving in the direction of VR. I was really hoping that LightWave would be part of this transition. Dammit.

jeric_synergy
09-15-2016, 01:34 AM
I think Chilton has a say in it too. I wouldn't worry.

Dillon
09-15-2016, 02:09 AM
Neither Chilton nor Kelly has participated in this thread for almost 5 months now....

About the same time LW3D Group went silent.

Ugh.

Zerowaitstate
09-15-2016, 02:33 AM
Touching base on this: I am attending the Connect 3 conference in a few weeks. If there is any information I can gather or contacts I can make while there to help get this product through more iterations, please let me know.

I think it would be very hard to invest time in LW-dependent plugins, knowing that a huge engine upgrade is in the works and not knowing if the man-hours invested are being pissed up the wall.

jeric_synergy
09-15-2016, 10:27 AM
Neither Chilton nor Kelly has participated in this thread for almost 5 months now....
Both C & K are quite active on FB.

And maybe not in this thread, but I'm pretty sure Chilton has posted in others.

Kevbarnes
09-15-2016, 12:53 PM
@metallo
Brilliant - love the render. Take a look at www.littlstar.com

It's a VR platform that allows downloads.