View Full Version : ATI Radeon 8500 LE

02-16-2003, 07:46 PM
First of all, I'm really glad to have joined this forum; I've been a LW user since the Amiga days.

I purchased an ATI Radeon 8500 LE video card. It's already been installed and working fine, but I'm not seeing any change in my render times. Do I have to reinstall LW, or should I not expect any decrease in render times?


02-17-2003, 04:30 AM
It's normal; graphics cards are for the display only, so they don't affect render times. Only the CPU does. So if you want faster renders, change your CPU..

02-17-2003, 07:45 AM
The video card will only speed up your display while manipulating scenes or objects. If you have a huge high-poly model and rotate it around, you'll see the difference (in theory).

02-17-2003, 08:21 AM
Can anybody explain to me why so many (noobs?) believe a new video card makes rendering faster?

02-18-2003, 04:12 AM
Originally posted by Lynx3d
Can anybody explain to me why so many (noobs?) believe a new video card makes rendering faster?

Well, they are being marketed as graphics accelerators, right?

Also, you can play these cool 3D games, and they do everything in realtime and look so great - why shouldn't something as simple as a 3D app render faster? :D

On the other hand, I'd like a decent OGL Preview render option that supports more current features and saves to disk :cool:

02-18-2003, 07:07 AM
Yeah, you're right; I wouldn't mind multitexturing support, real transparency, bump mapping, etc. in the viewports...

02-18-2003, 07:14 AM
I wouldn't mind either. It doesn't even have to be realtime. As long as I could use openGL to render in (almost...) any resolution and save to disk, I'd be happy.

Stuff like motion blur and DOF would be quite easy to do; bump mapping etc. would be a bit harder, though.

Oh well... :(

02-19-2003, 06:48 AM
How about the "Cg" features of the upcoming GeForce FX boards? Couldn't many of these boards' FX features really enhance the view windows and VIPER if LW were coded to use them?

02-19-2003, 10:05 AM
We have just purchased 16 Radeon 9100 cards going into new machines. Apparently it's the same as the 8500 series, just rebadged, but it doesn't work with LightWave under Windows 2000: the OpenGL preview modes in Modeler just "crack up". The ATI website is pants - can any of you wonderful people suggest a solution? Thanks :confused:

02-21-2003, 05:06 PM
I am having a mare in Layout... When I press the play button (next to the preview button) to preview an animation, all the objects in the viewport disappear and are replaced by nigh-on useless bounding boxes. It happens whatever render level I try to preview in... wireframe, shaded...

I'm using a dual P3-1GHz, 1GB RAM, Radeon 8500 64MB with all the latest drivers and Windows updates. The bounding box flies about really quickly, which suggests the system is up to the job of previewing. If I untick the "play at exact rate" button in Display Options, the (bounding-boxed) 60-frame preview happens in a fraction of a second.

Anyone know how to make a Radeon keep drawing objects in Layout when you press the play button?

02-21-2003, 05:22 PM
Cat, hit d and set the bounding box threshold higher.

Sounds like u may have it set as low as 0.


02-21-2003, 05:33 PM
Wow! Swift reply, thanks for the suggestion. My bounding box threshold was at 1000, though.

Everything else is pretty much at default settings, if that's any use to anyone.

02-21-2003, 06:02 PM
If u have it set to 1000, then whenever u drag an object or the time slider, if there are more than 1000 polys in the scene it will drop objects down to bounding box level and only display up to 1000 polys. 1000 is quite low, to be honest; u should set it high enough to show most objects, but not so high that it makes LW sluggish. A lot of ppl use it at 0 regardless, so u get super fast response when moving things around. Myself, I have it set to 10000, which my PC can handle without too much of a problem.
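The threshold behaviour described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not LightWave's actual code; the function name and poly counts are made up for the example:

```python
# Rough sketch of the Bounding Box Threshold logic described above
# (an illustration only - NOT LightWave's real implementation).

def display_mode_while_dragging(scene_poly_count: int, threshold: int) -> str:
    """Return the draw mode Layout would use while you drag an object
    or the time slider, given the Bounding Box Threshold (the 'd' panel)."""
    if scene_poly_count > threshold:
        # Too many polys for realtime feedback: drop to bounding boxes.
        return "bounding_box"
    # Under the limit: keep the full shaded OpenGL preview.
    return "full_shaded"

# A 60,000-poly scene with the threshold at 1000 drops to boxes while dragging:
print(display_mode_while_dragging(60_000, 1_000))   # bounding_box
# Raising the threshold to 10000 keeps an 8,000-poly scene fully shaded:
print(display_mode_while_dragging(8_000, 10_000))   # full_shaded
```

Setting the threshold to 0 makes every drag fall back to bounding boxes (maximum responsiveness), which matches the "a lot of ppl use it at 0" advice above.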


02-21-2003, 06:04 PM
James, You are a star!

10000 has worked a treat, and I have an idea of what the setting means now too :)

Thank you!!!

02-21-2003, 06:52 PM
No probs :)

The speed at which Layout draws the OpenGL preview is one of the things that made me go with LightWave over all the other apps. Even with an insane amount of polygons in a scene, dragging the slider instantly changes to bounding boxes with no delay at all, and as soon as u stop dragging the full GL preview comes back. Programs like Max just slow to a crawl in these situations, and even Maya/XSI are nowhere close. It really makes life so much easier when you're animating; there's nothing worse than trying to animate when your interface lags whenever u try to move something.


02-21-2003, 08:12 PM
Originally posted by Lynx3d
Can anybody explain to me why so many (noobs?) believe a new video card makes rendering faster?

What's there to explain? It's only logical. I thought the same thing on my first computer (the ole 25 MHz 486DX with 4 MB EDO RAM!). I thought for sure that upgrading my graphics card would speed up my rendering... No one is born with knowledge; they've got to learn it one way or another - whether by trial and error, asking on a forum, or whatever. :cool:

02-21-2003, 08:20 PM
Would be nice if someone came up with a way to make the processing power on your gfx card somehow merge with your CPU whenever you're doing anything that's not graphics intensive, so it increases your CPU's power. That would help with render times. Prolly never happen tho.