How does a video card relate to my LightWave work environment?

02-21-2008, 04:34 PM

I am about to buy a laptop, and I am discovering that most affordable laptops
have integrated video cards (e.g., Intel GMA X3100 with 384 MB of allocated RAM,
or ATI Radeon X1250 with 512 MB of allocated RAM).

I understand that these integrated video cards use the machine's RAM via "sharing."

QUESTION: is this an issue for me?
I use Modeler with sometimes between 5,000 and 20,000 points spread among dozens and dozens of layers, creating UV maps for texture baking and normal map application.

I use Layout to set up scenes for baking the textures and exporting the .lwo files to .w3d.


02-21-2008, 05:03 PM
I don't know diddley about laptops, but yeah, anything using RAM from your system takes away from what LW can deliver when rendering.
But it depends on what you're rendering. Lots of hi-res textures are a RAM killer, and if you're baking procedurals into hi-res images, that's going to add up. Also, further subdividing objects in the Object Properties panel in Layout can *really* add up fast and suck all your RAM away.
Depends also on how much RAM your machine has left over after the video steals what it needs.
5,000 - 20,000 points really isn't a lot. You could probably get by with 2 GB of RAM if all you're doing is baking textures, provided you don't have too much else going on.
The render window in Layout shows how much RAM is being used, so you should be able to gauge it off that.
Still, get as much RAM as you possibly can get into whatever laptop you're going with.
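To put some rough numbers on the RAM factors mentioned above (hi-res bake targets and subdivision levels), here's a back-of-envelope sketch. The assumptions are mine, not LightWave internals: uncompressed 32-bit RGBA images at 4 bytes per pixel, and quad subdivision roughly quadrupling polygon count per level.

```python
# Rough RAM estimates for the two big memory eaters in a baking workflow.
# Assumptions: 4 bytes/pixel (uncompressed RGBA), x4 polygons per
# subdivision level. Real usage varies with format and engine overhead.

def texture_mb(width, height, bytes_per_pixel=4):
    """Approximate RAM for one uncompressed image, in megabytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

def subdivided_polys(base_polys, levels):
    """Approximate polygon count after N levels of quad subdivision."""
    return base_polys * 4 ** levels

# A single 4096x4096 bake target is already ~64 MB uncompressed:
print(texture_mb(4096, 4096))       # 64.0

# A 20,000-poly object pushed to subdivision level 3 balloons to 1.28M:
print(subdivided_polys(20000, 3))   # 1280000
```

So a handful of 4K bake targets plus a couple of extra subdivision levels can eat hundreds of MB before the renderer even starts, which is why shared video memory on a budget laptop stings twice.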

02-21-2008, 06:33 PM
IMI answered that pretty well, so I'll just say: don't pray for any LightWave miracles from a budget laptop with shared memory.

02-21-2008, 07:47 PM
Remember: in addition to what IMI said, a video card does not add to rendering speed in LightWave. It merely increases the ease and responsiveness of working in the viewports when modeling, etc. It's an OpenGL thing. All in the OG.