
HDRI challenge



Blochi
06-23-2003, 06:11 AM
Hello,

This is an open call to take part in an exciting challenge about HDRI. You get a scene file, backdrop, and matching environment - all set up already.
This one is perfect for HDRI and IBL newbies, as well as for people who want to dig deeper into the whole thing.

This is the stage:

http://www.Blochi.com/gfx/hdri/pics/Apartment_Example_01.jpg


And this is where you get the scene:

www.Blochi.com/gfx/hdri


And - most important - this is the board where you can post your results and discuss them:

http://hdri.cgtechniques.com/~blochi

Please, DO post there - I put a lot of work into those HDRIs, you get them for free, and all I ask in exchange is that you submit a render.


The interesting part is that we are trying to get this HDRI setup converted and rendered in as many different packages as possible. So, spread the word ....

and have fun!

Blochi

riki
06-23-2003, 06:51 AM
I'd love to play around with HDR, but being on a Mac, my options are pretty limited at the moment. I'll check out your links though.

best

r

facial deluxe
06-27-2003, 02:51 AM
I've been looking for this for ages! Where the hell did you get a decent chrome ball???

Blochi
06-27-2003, 02:57 AM
Hehe,

This one was from Home Depot, called a "Beautiful Gazing Ball". It's far from being perfect, but good enough.

I've heard of people using pinballs and ball bearings (Kings Bearings is linked on the HDRShop website), and even a quality soup ladle is supposed to work.

-Blochi

EDIT: Good enough means that dents and scratches on such a large ball have less effect on image quality than on a small glass ball. It's quite easy to compensate by taking 3 instead of 2 views and comping the good parts together....

Arnie Cachelin
06-27-2003, 10:44 AM
According to Paul Debevec, the glass 'Contemplation Spheres' are only about 70% reflective. He recommended taking a well-lit image of the sphere with some white paper in frame, so the paper is captured both directly and in the reflection. You can then see how much brighter the direct white paper is than the reflected version. Once you know the reflection percentage, you can brighten all your exposures to compensate for the loss.
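That compensation step could be sketched like this (the measured paper values here are made up for illustration, not from an actual shoot):

```python
import numpy as np

# Hypothetical measurements: the average pixel value of the white paper
# photographed directly, and of its reflection in the sphere, taken
# from the same exposure.
direct_paper = 0.92
reflected_paper = 0.64

# Reflectivity of the sphere = reflected brightness / direct brightness.
reflectivity = reflected_paper / direct_paper  # roughly 0.70 for glass

# Brighten every exposure to undo the loss in the reflection.
hdr = np.array([[0.1, 0.5],
                [1.2, 3.0]])                   # toy radiance values
compensated = hdr / reflectivity
```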

Blochi
06-28-2003, 07:41 AM
Thanks Arnie,

This IS a good note, since I totally forgot about that already. :D
There is a weird brightness issue with HDRShop-generated HDRIs anyway, because they always tend to be very dark and have a comparatively small range. It seems to align the composed range to the darkest source image, totally ignoring the actual 'world' values. So I had to put a lot of eyeballing into the environments - usually stopping up by 5 or 6. PhotoSphere doesn't seem to have this problem, by the way....

Another usability issue seems to be the gamma.

I mean, by definition an HDRI is linear.
But there is a lot of confusion about how a rendering engine is supposed to work with that information. If you light a scene with that linear HDRI, you always get extremely harsh lighting with a lot of color bleeding. It usually looks much better if you set a gamma of 1.8 in the Image Editor. Then you start getting all the subtle lighting effects you would expect. And from what the HDRI challenge shows so far, it's the same in other 3D packages.
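For concreteness, a gamma setting of 1.8 amounts to raising each linear value to the power 1/1.8 - a minimal sketch, assuming the Image Editor gamma behaves like a standard power curve:

```python
import numpy as np

linear = np.array([0.05, 0.18, 0.5, 1.0])  # linear HDR pixel values

# A gamma of 1.8 maps each value v to v**(1/1.8), lifting the mids
# and softening the harsh, high-contrast look of raw linear light.
gamma = 1.8
corrected = linear ** (1.0 / gamma)
```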

I've tried to explain this standpoint on this challenge site: HDRI challenge Forum - Tipps (http://hdri.cgtechniques.com/~blochi/show.php?id=326&PHPSESSID=fed4470a7d80cfdcc32978ef6addcef2)
I would love to hear Arnie's opinion on this.

Could it be that all engines are used to working with gamma-encoded LDR images? Maybe they work in a linear fashion under the hood, and so the actual render algorithms are made to compensate for that gamma mapping. So they try to compensate on a linear HDRI, too - which effectively pulls the gamma back down somehow...

It definitely is a rather confusing thing to reverse-engineer... But for some reason doing it isn't as simple as watching Magic Paul doing it. That's the actual reason why the whole HDRI challenge looks so similar to Paul's IBL show - but done by ordinary people with ordinary software.

best wishes,
Blochi

Arnie Cachelin
06-28-2003, 05:20 PM
There should be no gamma correction done on the images used by rendering packages for illumination or even texturing. I know LW takes the values directly and renders them directly. The only place for gamma correction is between the final rendered FP pixel values and the display device. That said, it is quite possible that the incoming images have been subjected to some correction to make them look good. This could give less accurate results, or require preprocessing to undo that effect, if possible.

That's my opinion.

Blochi
06-29-2003, 05:08 AM
I apologize for the rude tone of my last post. It was not meant to be offensive - on the contrary: I am a huge fan of Paul Debevec's work; he is one of the brightest heads in this game. Referring to basketball terminology was meant as an honor.

Would it be possible to take this discussion private?
There are definitely more open questions on my side.


Christian Bloch

[email protected]

fabmedia
07-15-2003, 07:35 AM
Okay, so my question to this post is how do you take a picture of the globe, what do you need to do to the image, AND what program is on the Mac that helps create an HDR image?

Arlen

Blochi
07-19-2003, 06:52 AM
Hi Arlen,

You need to take bracketed exposures of the globe - for best coverage, from 3 different viewpoints 120° apart, even though some people recommend 2 viewpoints 90° apart, which works as well. Then you have to combine them into an HDRI, unwarp it, and stitch it into a spherical image. More in-depth information can be found here:

http://www.cgtechniques.com (browse the articles)

As an alternative to the globe, you can also use a 180° fisheye.

For dealing with HDRs on the Mac you need this:
Photosphere (OS X only): http://radsite.lbl.gov/pickup/photosphere.tar.gz

Retouching, painting and such can be done in Shake on the Mac, but I don't think you can use it to unwarp HDRIs...

-Blochi

fabmedia
07-19-2003, 01:08 PM
Now you would remove the camera and tripod from within the globe shot. BUT how would you keep the brightness values when adjusting the photo(s)?

Arlen

Blochi
07-19-2003, 02:28 PM
hmm ... I'm not quite sure what you mean.

You take a lot of pictures with varying exposures, from 2 to 3 viewpoints. Then you basically have two roads you can take: you can first stitch a panorama for every exposure and then combine them into an HDRI, or you can first combine all exposures of the individual viewpoints and use HDR-capable software only (like HDRShop and Photogenics) to stitch them into a panorama. The absolute brightness values are preserved all the way, either as JPEG pixel brightness in combination with the corresponding exposure value (in the JPEG header data), or as HDR values in an HDRI.
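The "pixel brightness plus exposure value" idea can be sketched in a few lines - here a plain gamma-2.2 curve stands in for a real, calibrated camera response, which is an assumption for illustration only:

```python
def radiance(pixel_8bit, exposure_seconds, gamma=2.2):
    """Recover a relative scene-radiance value from an LDR pixel plus
    its exposure time. The gamma curve is only a stand-in for a real,
    calibrated camera response curve."""
    linear = (pixel_8bit / 255.0) ** gamma   # undo the encoding curve
    return linear / exposure_seconds         # normalize by exposure

# The same surface shot at half the exposure implies twice the radiance:
r1 = radiance(255, 1 / 30)
r2 = radiance(255, 1 / 60)
```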

But, you are right, as soon as the final HDR environment is treated with a gamma correction, the absolute brightness values are lost. I think this is what Arnie pointed out.

Anyway, the link to PhotoSphere has changed to: http://www.anyhere.com/ (sorry for posting a dead link)

Also, I have a new link for 'facial deluxe', where you can buy gazing globes online:
http://www.outdoordecor.com/cgi-local/SoftCart.exe/online-store/scstore/c-GazingGlobes.html?E+scstore

take care,
Blochi

fabmedia
07-19-2003, 02:32 PM
So how would you remove the camera from the globe shot(s) without screwing up the exposure settings of the image itself across the different exposures? This is what I don't understand. Or would you just leave the camera in there?

Arlen

Blochi
07-19-2003, 02:53 PM
ooo.
Now I get you.
Well, it seems to be common to read out the JPEG header once and put the exposure time in the name of each file, because otherwise you would need to read it out several times and would get confused very easily.
Check out Greg Downing's tutorial on HDRI stitching; he explains this quite well.

http://www.gregdowning.com/HDRI/stitched/
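The renaming idea could be sketched like this (the naming scheme is made up here, not Greg Downing's exact convention; in practice the exposure would be read once from the EXIF ExposureTime tag in the JPEG header):

```python
import os
from fractions import Fraction

def exposure_name(original, exposure_seconds):
    """Build a filename that embeds the exposure time, so the header
    only has to be read out once: 'IMG_001.jpg' shot at 1/60 s becomes
    'IMG_001_1-60s.jpg' (hypothetical scheme)."""
    base, ext = os.path.splitext(original)
    frac = Fraction(exposure_seconds).limit_denominator(8000)
    return f"{base}_{frac.numerator}-{frac.denominator}s{ext}"
```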

-Blochi

fabmedia
07-19-2003, 03:15 PM
And since we're on the topic... what size should an HDR backdrop image be? So if I was to output an animation to film, what resolution would I need?

Do you know of any resources on this one?

Arlen

fabmedia
07-19-2003, 03:17 PM
And... how many exposures are needed for an HDR image? I sometimes see 10, 12, 6. I'm thinking of the minimum.

Arlen

Blochi
07-19-2003, 04:54 PM
Well, if you want to use the HDR as a backdrop, it would need to be the size of film itself. Speaking of 2K, I think it's 1920 by ... something. If you just want to use it for lighting, as a spherical environment, then a really small image works best. Small and blurry, basically - that helps to reduce radiosity artefacts.

But if you want both - a spherical HDR that covers the full sphere at film resolution - you will have to multiply your camera pixel resolution by a factor determined by 360/FieldOfView (you can read this out in the camera panel). But be warned: it's gonna be huge. There's no way to get such a big HDR environment except to go exactly Greg Downing's way. Keep reading up on that; it's a hell of a lot of work to stitch this way...
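As a hypothetical worked example of the 360/FieldOfView rule (the frame size and field of view are assumed numbers):

```python
# A 1920-pixel-wide frame shot with a 40-degree horizontal field of view.
frame_width = 1920
fov_degrees = 40.0

# 360/FOV tells you how many frame-widths fit around the full circle.
pano_width = frame_width * (360.0 / fov_degrees)  # 17280 px across
pano_height = pano_width / 2                      # lat/long maps are 2:1
```

At that size it is easy to see why the stitched panorama gets huge.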

-Blochi

Blochi
07-19-2003, 04:56 PM
2 exposures are the minimum, 6 are better, and 12 will work in any situation.

richpr
07-19-2003, 06:28 PM
Great thread.... verrry interesting! ;)

chewey
07-20-2003, 12:12 PM
I'd like to see a lot more info on the hdri thang myself.

fabmedia
07-20-2003, 01:05 PM
I would figure that you would need an odd number like 3 or 5, so that you would have 2 bracketed above and below your "display" image. Hmmm. This is very interesting. I'd love to get a decent fisheye lens and create a couple. I know that if you set up a long exposure with a small aperture, you can make a scene on a busy street appear vacant. That would be interesting indeed. I guess all of the traditional rules of image editing would apply. 2 shots per image, bracketing 5 or 7 times, would yield a very quick and high-quality image.

Ah-ha! I have a REALLY good question now. What about capturing a moving environment, like a street or a mall? Now tell me if it's just plain stupid to think of such nonsense. It's not as if I'm going to try it, but is it conceivable?

I guess it would be if you were to composite via a blue screen.

Arlen

Blochi
07-20-2003, 02:19 PM
Well, 2 shots are the theoretical minimum for the merging algorithms to work at all. That doesn't mean they lead to useful quality; in practical application you would need about 6. As a rule of thumb, every pixel should be clearly visible in at least 2 exposures, so if you take your series 2 f-stops apart, from the darkest to the brightest, you'll be fine. You get even better quality with 1-f-stop spacing, because the merging process naturally reduces image noise.
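The merging rule of thumb could be sketched as a weighted average - the hat-shaped weighting and the gamma response below are common stand-ins for illustration, not any specific tool's algorithm:

```python
import numpy as np

def merge_hdr(images, exposure_times, gamma=2.2):
    """Merge bracketed 8-bit shots into one HDR radiance map. A hat-
    shaped weight trusts mid-range pixels most, so near-black and
    near-white (noisy or clipped) pixels contribute little; the gamma
    curve stands in for a calibrated camera response."""
    num = np.zeros(images[0].shape, dtype=float)
    den = np.zeros_like(num)
    for img, t in zip(images, exposure_times):
        z = np.asarray(img, dtype=float) / 255.0
        w = 1.0 - np.abs(2.0 * z - 1.0)      # peaks at mid-gray
        num += w * (z ** gamma) / t          # radiance estimate per shot
        den += w
    return num / np.maximum(den, 1e-8)

# Two bracketed shots of the same pixel at different exposures:
shots = [np.array([[200.0]]), np.array([[100.0]])]
radiance_map = merge_hdr(shots, [1 / 30, 1 / 60])
```

The noise reduction Blochi mentions falls out of this naturally: each output pixel averages several independent measurements of the same radiance.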

Capturing a moving environment in one shot is a very good question indeed. Possible future solutions are coming from CCD dev teams, so you'd be able to shoot HDRIs in one shot - ergo capture video (but most certainly not to tape :) ). But I've also seen a low-tech approach, consisting of a video camera, a bearing ball mounted on a long stick, and a split-beam lens (you know, the one from 80's music videos, with 5 identical images in one frame). Every facet of that effect filter is coated with an ND gel, effectively resulting in 5 f-stops in one frame. .... cannot find the link right now ... sorry ... but that's basically the idea.

-Blochi

fabmedia
07-20-2003, 02:58 PM
I never thought of that, being an 80's child and all. Wow, those were the days. Hee hee. I wonder how many other low-tech approaches could be thought of... I have trouble thinking about some of this stuff.

I'll figure out something else to question about in a while...

Arlen

nuclearchutney
07-20-2003, 06:48 PM
Originally posted by fabmedia
And... how many exposures are needed for an HDR image? I sometimes see 10, 12, 6. I'm thinking of the minimum.

Arlen

If you are shooting the bracketed images for the first time, you wanna get at least 5 images over and under the metered exposure.

Then take those images into Paul Debevec's HDRShop to generate a camera response curve. You need to clamp where the curves start to deviate from each other.
(Like Chris mentioned, check out Greg Downing's site. His writeup on HDRI, and the section in it on camera curve generation, is pretty good. http://www.gregdowning.com)

Once you have generated a response curve for your camera, you can generate HDRs by taking only three images: a metered shot, a 3-stop overexposure, and a 3-stop underexposure.

fabmedia
07-20-2003, 07:08 PM
Ah yes. The dreaded HDRShop. I'm on a Mac - I can't remember the name, but I believe I downloaded a copy.

I can't wait to try it. So regular, one over, and one under, by 3 stops. Cool.

Thanks!!!
Arlen

nuclearchutney
07-20-2003, 07:26 PM
Originally posted by fabmedia
Ah yes. The dreaded HDRShop. I'm on a Mac - I can't remember the name, but I believe I downloaded a copy.

I can't wait to try it. So regular, one over, and one under, by 3 stops. Cool.

Thanks!!!
Arlen

Yeah, it might be dreaded, but there's relief on the way. :)
There are a few people working on a MacOSX port of HDRShop.
Oh my! I have said too much already :)

In the meantime, use Greg Ward's Photosphere. It's a great program. One great thing that Photosphere can do that HDRShop can't is correct slight registration problems (camera movement). Plus it can output OpenEXRs :)

The one thing that keeps me from using Photosphere is the fact that it has no camera curve calibration functions.

Blochi
07-21-2003, 01:08 AM
Hi nuclearchutney,

Well, I am a big PhotoSphere fan. It does create a camera curve; you just don't have to generate it yourself.... I find PhotoSphere so much more convenient. And, yes, I was concerned about quality at first, too.

Until I did the test:
A - PhotoSphere, B - HDRShop, C - Photogenics

http://www.Blochi.com/gfx/hdri/pics/33_HDR_Vergleich1.jpg

http://www.Blochi.com/gfx/hdri/pics/34_HDR_Vergleich_CloseUp.jpg

Apparently the Automatic Align function DOES make a difference.
But - Photosphere likes it best if you have your exposures one stop apart.

Cheers,
Blochi

kenneth
07-29-2003, 04:13 AM
Hey folks. I ported mkhdr (the command line tool) to OS X around a year ago. I posted it in the old forums but I guess word didn't get around. =)

http://www.kennethwoodruff.com/digitalart/files/mkhdr_OSX_beta2.sit

Original1
07-29-2003, 04:20 AM
Originally posted by fabmedia
Ah yes. The dreaded HDRShop. I'm on a Mac - I can't remember the name, but I believe I downloaded a copy.

I can't wait to try it. So regular, one over, and one under, by 3 stops. Cool.

Thanks!!!
Arlen


If you are looking for something to manipulate HDR images try

http://cinepaint.sourceforge.net/

Open source but a good cross platform tool

fabmedia
07-29-2003, 09:47 AM
Ah, the goodness of kind-hearted people. I've been out of touch for a bit, but thanks for all of the great information. I've been trying to finish off a building in the last little while, and hopefully I'll be able to light it with an HDR image.

Here's a quick question for you. Can you simulate an exposure with one image in Photoshop? AND I use Eki's SkyGen to create panoramic skies in LW. Do you know how to take multiple exposures with LW, or how I can fake it?

Arlen

Original1
07-29-2003, 10:34 AM
Originally posted by fabmedia
Ah the goodness of kind hearted people. I've been out of touch for a bit, but thanks for all of the great information. I've been trying to finish off a building in the last little while and hopefully I'll be able to light it with an HDR image.

Here's a quick question for you. Can you simulate an exposure with one image in Photoshop? AND I use Eki's SkyGen to create panoramic skies in LW. Do you know how to take multiple exposures with LW, or how I can fake it?

Arlen

Try saving a RefGen image as .FLX, which will retain the floating-point information. Map this onto a skydome and use it for your radiosity illumination; you can also use it for your reflection maps.

In a couple of cases I used a similar trick to surface-bake some of the static shadows into the background geometry, and then used a light in the same position as the sun to illuminate only the moving objects and cast shadows (though I guess I could have used Overcaster). You have to render out your HDRI sky and bake your geometry first, but you can then switch radiosity off, and the animated part renders really quickly.

fabmedia
07-30-2003, 11:05 AM
Could you use image world instead of mapping it to a dome?



Arlen

Blochi
07-30-2003, 11:31 AM
You need to convert it from latitude/longitude into an angular map first. HDRShop can do that, but not on an .FLX - so you have to use .HDR.
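The latitude/longitude-to-angular-map transform could be sketched like this. Axis conventions are an assumption here (they differ between tools), and the sampling is plain nearest-neighbour, so this is an illustration of the geometry rather than HDRShop's actual resampler:

```python
import numpy as np

def latlong_to_angular(env, size):
    """Resample a 2:1 latitude/longitude map into a square angular
    (light-probe) map. Assumes +Y up and -Z forward."""
    h, w = env.shape[:2]
    v, u = np.mgrid[0:size, 0:size]
    x = 2.0 * (u + 0.5) / size - 1.0         # probe coords in [-1, 1]
    y = 1.0 - 2.0 * (v + 0.5) / size
    r = np.sqrt(x * x + y * y)
    theta = r * np.pi                        # angle from the view axis
    phi = np.arctan2(y, x)
    dx = np.sin(theta) * np.cos(phi)         # direction per probe pixel
    dy = np.sin(theta) * np.sin(phi)
    dz = -np.cos(theta)
    lon = np.arctan2(dx, -dz)                # direction -> lat/long
    lat = np.arcsin(np.clip(dy, -1.0, 1.0))
    px = ((lon / (2 * np.pi) + 0.5) * w).astype(int) % w
    py = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)
    out = env[py, px]
    out[r > 1.0] = 0.0                       # black outside the circle
    return out

probe = latlong_to_angular(np.full((8, 16), 3.0), 16)  # toy constant map
```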

cheers,
Blochi

fabmedia
07-30-2003, 11:32 AM
Thanks!

Cheers mate!
Arlen

Original1
07-30-2003, 12:50 PM
yeah,

It will need some tweaking - sometimes the HDR exposure - and in some cases it may be a bit pixelated, so mapping onto a sphere solves that problem; you use a light probe or Image World for the illumination.

fabmedia
07-31-2003, 01:58 PM
Okay, I might be flogging a dead horse here, but I used Eki's SkyGen to create a sky and used RefGen to create a spherical map (which somehow gets pinched when mapped to an environment dome). BUT I turned the luminosity of the sun to 500%, which I'm sure is not enough. When I go to render out a scene, it's heavily blue. Now I wouldn't think that this should be a problem, but it's as if a blue mask has been put over the whole image. I'm not too sure what to do about this. Damn, I keep hitting some horrible pitfalls.

Any suggestions?

Arlen

Blochi
09-09-2003, 02:10 PM
Now that sounds interesting.
I'm not quite sure if the original range is carried all the way through his multiple render/stitching process - I got it to work with an interior scene once, but never tried anything outdoor. I know that it uses FBX as an in-between format, but I don't know if the stitching - using textures and rendering again - might mess it up.
You could try composing the environment by hand-stitching the source files into an HCross/cube projection map, and doing the transform in HDRShop.
I'd love to take a look at your scene, and especially the map.

Blochi