View Full Version : Between the CPU and the Dream

05-21-2003, 12:04 PM
At the moment our visions are impeded by the time it takes to render.
Imagine how much our industry would change with near-realtime rendering. Everyone would be pumping out imagery that would otherwise have been impossible. OpenGL radiosity? Accurate shading and shadows from all lights in the viewport..

I guess my question is .. will it ever be possible?

Imagine manipulating photo-quality 3D worlds in real-time.
Because at the moment I look back on my three years of using LightWave and realize a lot of what is holding me back is the hardware.. I end up creating lovely-looking still shots.

Digital artists these days have to find a balance between quality and rendering time.

I could mumble on all day.. but I'd like to know what everyone else thinks about this.


05-21-2003, 04:59 PM
Well, it's a two-fold problem.
First, yes, the computers in a year or few will allow you to do everything you do now in real time.

BUT... in a year or few, just think how many more functions the software people will add for the computers to process.

It's like the Internet. I remember when I got my first 28.8 modem. Man it kicked arse over its predecessor!
Then the websites got more complex.
56K was the savior.
Then the websites and downloads/uploads got bigger.
Then DSL kicked arse.

Now DSL isn't fast enough for me. Not because the technology is bad, it's because our demands outgrow the technology.

I'm afraid we'll keep adding just a little more to the software than the hardware can handle, until the end of time. :(

05-24-2003, 09:26 AM
Frankly, I think we're a loooong way off from real-time rendering... I'm doing an animation right now. By not using radiosity, keeping the poly count lower than I'd like, and using a whole bag of other tricks, I hope.... HOPE... to keep my renders to around ten minutes a frame. Well... let's see. For my highly optimised scenes to render in real time, let's do the math:

10 minutes per frame
30 frames per second
60 seconds per minute

result = for this particular project, my computer would have to be 18,000 times faster than it is now to achieve real-time rendering. And this is on a fairly simple project. No radiosity. Low poly count. No special hair plugins or anything like that. Low res too. For TV. Never mind film or HDTV. I don't even want to consider how long my poor computer (an AMD Athlon 2000+) would need for that. We're stuck with long render times for the foreseeable future, I'm afraid.
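That figure checks out; here is a quick back-of-the-envelope sketch of the same arithmetic, using the numbers from the post:

```python
# Required speedup for real-time rendering, using the figures above.
render_seconds_per_frame = 10 * 60  # 10 minutes per frame
target_fps = 30                     # TV frame rate

# A real-time renderer gets 1/fps seconds per frame, so the machine
# must be render_time * fps times faster than it is now.
speedup = render_seconds_per_frame * target_fps
print(speedup)  # 18000
```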

Real-time previewing, though, is another thing. All kinds of shortcuts and cheats can be developed to give us better previews as we work. OpenGL, Viper and G2 are just the tip of the iceberg of what is to come, I think. Maybe 3D apps should ditch OpenGL and embrace DirectX now that it's getting all the fancy pixel shaders, shadow techniques and all sorts of other real-time goodness. God bless games for bringing us at least a limited form of real-time 3D rendering.

OK, I don't even know where I'm going with this anymore so I'll shut up now.

05-24-2003, 01:28 PM
It is already being done in games, especially with Cg and HLSL. Just go over to nVidia and take a look at FairyGirl. Two years ago you couldn't even get close to that.

I'm with everybody else here in that it's a double answer. You can do real-time rendering with VFX, but we are always pushing the envelope. If you settle on the norm, where it takes seconds instead of hours per frame, everything begins to look the same. As technology advances there is always the need to make things more complex, and therefore make them take longer to render or compute. Even realtime computing comes at a price [around $2.5 M to start and tops off around $200 M].

05-25-2003, 02:08 AM
Well whatever happened to those parallel processing boards that were supposed to allow simultaneous processing on groups of pixels? They're using that configuration for AI research in visual pattern recognition as we speak (and have been for a few years).

Why can't I buy a "PCI parallel processing video board" that augments the power of my AGP card?

Like Gabe said, 18,000 is a huge factor. Even with Moore's Law in effect, with a doubling time of 18 months, it'll take (pulls out calculator) around 254 months to get from here to there.
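A sketch of that Moore's Law estimate, assuming a clean performance doubling every 18 months (which is optimistic for real-world speedups):

```python
import math

# Months until an 18,000x speedup, assuming performance
# doubles every 18 months (Moore's Law, loosely applied).
speedup_needed = 18000
months_per_doubling = 18

doublings = math.log2(speedup_needed)   # about 14.1 doublings
months = doublings * months_per_doubling
print(round(months))  # 254 -- over 21 years
```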

Wasn't parallel processing supposed to allow an "end run" around the limitations of straight serial processing?

04-12-2004, 08:53 PM

FPrime brings us closer

04-13-2004, 12:36 AM
Since we brought this topic back to life, I wish the 3D companies would get together on a standard so that we artists could use hardware devices such as the volumetric rendering cards you see used in medical imaging for things like real-time 3D rendering of X-rays [CT and MR scans]. I did some research on this type of hardware, and I can see where it could benefit the industry.

The thing is to have all the companies form an oversight group to set the standard; then, once the hardware begins to flow, the prices would drop.

04-15-2004, 01:05 AM
I agree that we are a long way off from real time rendering, unless someone comes up with dedicated hardware to take some of the load off the cpu (like in some video editing suites). Even taking Moore's law into account (processor speed doubles every 18 months) we still have a long wait.

The answer to everything is quantum computing. I don't know that much about it, but supposedly the more instructions it has to process, the faster it works.

We'll all just have to wait for that 256-bit, 500 GHz CPU.

04-16-2004, 12:21 PM
Contrary to popular belief, I think realtime rendering would not impact the creative process that much. After all, I spend a lot more time working on other aspects than on rendering. You would more likely need a realtime imagination-to-computer interface. For example, live-action shooting is a realtime process, yet it is out of reach for most because of the amount of work involved, not the cost. Yes, you can buy a cheap camera and shoot a short film for next to nothing, but most films done this way are basically terrible. If you really want to make something outstanding there's more to it than that. You have to dress up your sets, light them carefully, write a compelling script, work with the actors, compose your shots, storyboard complex sequences, edit your scenes, etc... It's a lot of work, even if the actual capture process is in itself a realtime event. The same can be said about animation. You still have to model, surface, light, animate, direct, etc... And that is AFTER you have a compelling script to start with. So in short, no, I don't think realtime rendering will make much of a difference; if you had a really good project in your head it would most probably already be done or being worked on anyway. Don't blame your laziness on the CPU :)

04-16-2004, 05:37 PM
cholo, short stories and animations are only a part of 3D and rendering. I did a fairly basic walk-round of a store visualisation last week; they gave me a deadline that was about a third of the time it would take to render on the computers I have. Realtime rendering would've saved me a lot of the hassle of getting new hardware, etc.
Realtime rendering won't make any difference to the effectiveness of a script or its direction, but it will overcome technical problems with limited hardware.

Realtime realistic rendering will never arrive soon enough, but when it does I'm sure it will be a slow entry, and perhaps we're seeing the start of things to come with FPrime, for example: 'realtime' (as realtime as your processor is!) previewing to this extent is probably the first noticeable step, I'd say.

If I've got everything I want today, what am I going to want tomorrow? There'll always be something that could make it all easier. Advancement is human nature.

04-16-2004, 07:18 PM
Contrary to popular belief, I think realtime rendering would not impact the creative process that much.
Actually, I think it would impact the creative process very substantially. I'm constantly making compromises with lighting and modeling detail simply because of render-time concerns. If I didn't have to worry about render times, I could and would do much more.

Additionally, real-time rendering would mean real-time feedback for lighting setups, textures and so forth. That would definitely improve and streamline my 3D sets and models.

04-18-2004, 10:54 PM
Don't blame your laziness on the CPU

I don't think anyone is blaming their laziness on their CPU.. hmm .. well, maybe a few ;) At the moment there is so much guesswork involved in setting up textures, lighting, etc. that only recently, thanks to FPrime, have I figured out how some things work and how things look (e.g. lighting falloff, soft-edge angle, layered surface maps). Adjusting them and seeing the results in FPrime has given me a greater understanding of what the software is doing and what effects I can achieve..
Now to have realtime fur, hypervoxels, volumetrics... yummy! :)

04-19-2004, 12:05 AM
One of the shortcuts, I suppose, would be to find ways to optimize the rendering process, i.e. achieve better renders in less time through the renderer itself, not through CPU power. This seems more likely and a better way to go. It is almost inevitable as competition amongst high-end 3D packages and renderers becomes more fierce. When a certain quality level is reached, what other features make your renderer better than others? One would obviously be speed, a feature that everybody wants (as long as it doesn't compromise quality).

The next step may be additional hardware that aids the rendering process, maybe optimising certain aspects of the render (radiosity etc.).

That's about all we can really hope for, for a long while.

04-21-2004, 01:08 AM
Yes, for setting things up and seeing how textures will work when you change a setting, FPrime simply is a G*DSEND. There has been a debate as to who will win the computer-efficiency race: the hardware people or the software people.

I think that FPrime is an excellent example by Mr. Worley of how really good, efficient software can run on current hardware. FPrime doesn't use any fancy new processor technology to get its speed improvements. It simply uses some tricks, good coding, and clever algorithms to produce its impressive speed and feedback.

It makes me think back to my Amiga days, when I could have LightWave rendering while I downloaded something, talked to a buddy on a chat program, and played some animations, all at the same time. Sure, you can do it with today's machines, but not well. Efficiency and well-coded software made impressive things possible then, and efficient coding will play a part now. New hardware will always be on the horizon, but software (and the people who make it) need to improve as well.