newbie needs help understanding LW memory limitations

11-13-2007, 03:43 PM
Hi, I would appreciate it if someone could help me with the basics on understanding how LW reads high poly models or models with SubDs.

I have been provided with a model with 1,990,367 polys and want to turn some of these into subds. I can import one instance of this model into Lightwave and render with no problems, although if I try to run both LW and Modeler at the same time I crash with a not enough memory message.

I need to import 6 or more instances of this model into my scene file, and after two are imported, LW tells me I am out of memory and won't let me either import more or clone the existing models.

Unfortunately I'm inexperienced and just don't know what the limitations of high poly models are, or where this model qualifies in the range of high poly models. Is it extravagant in detail?

I've been through my pdfs of the manual over and over again and all I could find was the possibility of freezing my SubDs. I'm trying this method as I write this query.

I only know vaguely that you're supposed to limit the number of polys and use smoothing instead, but not what the limits are. And I've already discovered that smoothing isn't enough on this model, as the curves show as angled segments unless SubDs are used.

I'm using a Mac with two 2.55 GHz Dual-Core Intel Xeon processors and 8 GB of RAM. I'm running LW 9.0, universal binary, Mac 32-bit software. I'm not running any other software of significance, just a web browser and iTunes.

The video card is not ideal -- NVIDIA GeForce 7300 GT, but I think it meets the minimum requirements.

Thanks for the help!

11-14-2007, 02:15 AM
9.0 is not the universal binary version, but 9.3 is, so I guess you made a typo there. With models as large as this I would suggest loading only one into Layout and then cloning it there, rather than loading it several times. This doesn't help with OpenGL, but it will help with memory usage for geometry in Layout itself and at render time. Does your model really need all ~2 million polys? You might be better off trying to clean it up first if it was imported from another package.

The next question is whether these models are textured. If they use image maps, you should be aware of how much memory each texture takes - it is rarely the same as the file size on disk. (Have a look at this tutorial (http://www.lightwiki.com/Optimised_image_use).)
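As a rough illustration of that point (my own back-of-the-envelope sketch, not taken from the linked tutorial): an image's in-memory footprint is roughly width x height x channels x bytes per channel, no matter how well it compresses on disk.

```python
def texture_memory_bytes(width, height, channels=3, bytes_per_channel=1):
    """Rough uncompressed in-memory size of one image map.

    Compressed formats such as JPEG expand to about this size once
    loaded, so the file size on disk is a poor guide.
    """
    return width * height * channels * bytes_per_channel

# A 4096x4096 RGB texture saved as a 2 MB JPEG still needs ~48 MB in RAM:
print(texture_memory_bytes(4096, 4096) / 1024 ** 2)  # -> 48.0
```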

PS. Some more details would help us help you.

11-14-2007, 12:14 PM
I was told that 9.0 was universal binary by the person who sold it to me.

I don't know anything about 9.3...

I don't know for sure if I need all these polys -- I didn't make the model and I just don't know enough about how the program works. I do know that smoothing was not enough to make the curves clean, so I needed to use SubDs.

I also have no idea how to clean the model up, other than merge points which I've already done.

LW won't let me clone the model either -- I get the same memory error.

Imported images on the textures I'm using are minimal, just one small HDRI image.

11-14-2007, 01:21 PM
Two million polys is already more than most existing desktop machines can easily handle. Even if you're only using 2x2 subpatches, that's effectively eight million polys per model, and you're talking about six copies of it.
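The arithmetic behind that estimate, as a sketch (assuming each cage polygon becomes roughly level x level polygons at render time, which is what gives the eight-million figure above):

```python
def effective_polys(cage_polys, subpatch_level):
    """Approximate polygon count after subdivision: each cage patch
    becomes roughly level x level polygons at render time."""
    return cage_polys * subpatch_level ** 2

per_model = effective_polys(2_000_000, 2)  # the thread's ~2M-poly model at level 2
print(per_model)      # -> 8000000
print(per_model * 6)  # six copies: 48 million polys in the scene
```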

My guess is you're going to have to find some other non-brute-force solution to the problem. If you post the specific details, someone here might be able to help you out.

11-14-2007, 04:16 PM
I don't know what more details would be helpful, since I'm a beginner.

It does help to know definitively that this is an unusually large poly count, so at least I know what the problem is.

I may have to render 6 separate passes with the same camera, with just one model in place in each pass.

11-14-2007, 08:18 PM
If it's a 64-bit CPU, which I think it is, install 64-bit Windows XP or Vista and it should work. In the worst case it'll render using virtual memory, but at least you could use the full physical memory, which is currently not available to a 32-bit system/app.

Also try qemloss3 (http://amber.rc.arizona.edu/lw/qemloss3.html).
It's a tool for reducing the number of polys without significant loss of detail.
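For intuition about what such poly-reduction tools do (qemloss3 itself uses quadric error metrics, which preserve detail far better; this is a much cruder vertex-clustering sketch of my own, just to show the idea):

```python
def cluster_decimate(vertices, triangles, cell):
    """Crude vertex-clustering decimation: snap vertices to a grid of
    spacing `cell`, merge vertices sharing a grid cell, and drop
    triangles that collapse to a line or point."""
    key_of = {}        # grid cell -> new vertex index
    remap = []         # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in key_of:
            key_of[key] = len(new_vertices)
            new_vertices.append((x, y, z))
        remap.append(key_of[key])
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if a != b and b != c and a != c:   # skip degenerate triangles
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles
```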

11-14-2007, 10:54 PM
...mac users are stuck at 32 bits for the moment.

I am upgrading to 9.3 and hoping that will help.

11-16-2007, 05:05 PM
My question would be: if the model already has such a high poly count, why are you trying to use Sub-Ds? Also, if you need six copies in Layout, I would think the clone tool would work better than loading more copies.

As far as how LW handles Sub-Ds, I believe it is the same as other programs: the Sub-D cage is a control mesh from which a higher-resolution mesh is interpolated. This interpolation level is set in Modeler in the options panel, and in Layout in the Object Properties > Geometry tab. On that tab you can set both a display Sub-D level (i.e., what you see in the viewport) and a render level, which can be further refined using the APS options.

Once you hit the render button, though, all geometry is converted to triangles. This is true of Sub-Ds, NURBS, and regular geometry. For most models the time required to calculate all those triangles is negligible, but it may be much longer for a million-plus-poly model. You may be surprised to find that triangulating the polys ahead of time produces a slightly faster render by removing this conversion time.
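To put numbers on that conversion step (a sketch, assuming a quad-based cage where each subdivided quad tessellates into two triangles):

```python
def render_triangle_count(cage_polys, render_level, quads=True):
    """Rough render-time triangle count for a subpatch cage:
    subdivision yields about level * level polys per patch, and each
    quad splits into two triangles when the renderer tessellates."""
    polys = cage_polys * render_level ** 2
    return polys * 2 if quads else polys

# The ~2M-poly model at render level 2:
print(render_triangle_count(2_000_000, 2))  # -> 16000000
```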

I guess what I'm getting at here is that in most production environments you would never need a model with a resolution higher than the million+ polys you have.

You can also try increasing LW's memory segment limit in the Layout options panel. The default is usually way too low for most machines these days.

How is the object surfaced? Texture maps or just surface settings?

Hope some of this helps.


11-16-2007, 05:30 PM
I'm not sure why there are so many polys, since I didn't make the model, but the curves were rendering as chunky segments without the SubDs.

I applied subds in modeler and then froze them. That helped a lot. I also installed LW 9.3, which also helped a lot.

I was able to clone two times before running out of memory on the third clone. (Before it wouldn't clone at all -- would just crash).

The surfacing I put on there is very simple and not texture heavy, but since the errors were occurring just loading the objects, and specifically said "not enough memory for polygon data", the model's high poly count was the likely culprit.
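For what it's worth, here is some back-of-the-envelope arithmetic on why a 32-bit app can hit that error even on an 8 GB machine (the bytes-per-poly figure is a hypothetical ballpark of mine, not LW's actual internal cost):

```python
ADDRESS_SPACE = 2 ** 32          # a 32-bit process can address at most 4 GiB
USABLE = ADDRESS_SPACE // 2      # in practice roughly 2-3 GiB is usable

BYTES_PER_POLY = 100             # hypothetical ballpark for poly + vertex data
polys = 3 * 8_000_000            # three ~8M-poly frozen copies of the model

geometry_bytes = polys * BYTES_PER_POLY
print(geometry_bytes / 2 ** 30)  # ~2.2 GiB: geometry alone nears the 32-bit ceiling
```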

I haven't tried triangulating the models, but it's an interesting idea. I'll probably do that this weekend.

11-16-2007, 07:24 PM
Why so many polys I'm not sure about since I didn't make the model, but the curves were rendering as chunky segments without the subds.

Can you post a render or screen capture of the model? That might help people here advise you.

Andrew March
11-17-2007, 02:25 AM
You really should either A) comp the separate passes of the model, as you've already suggested yourself, or B) use a low-poly version for the models that are in the background.

It might help if we knew what the model was. Is it a spaceship, or something that interacts with the environment, like a car?

11-17-2007, 11:10 AM
The model is of DNA, and low-poly for the background strands is quite a good idea.

The weird thing is that it renders fast; it just moves slowly in the program. I'm sure to an experienced person it would be easy to manage. But upgrading to 9.3 has made an enormous difference.

Andrew March
11-18-2007, 09:36 AM
2 million polys for DNA sounds a bit excessive. I think that most if not all of your issues could be resolved with some poly reduction.

Is the model a native LightWave one, or imported from another format?