View Full Version : DPI setting?

07-20-2006, 05:47 PM
I was wondering what sets DPI. Is there a tool that sets it?
I need to make some fliers and need to know if I can set DPI in LightWave.
Anyone know how?

Captain Obvious
07-20-2006, 06:26 PM
DPI is not a relevant measurement for 3D rendering. DPI, or dots per inch, is a measurement of how small the smallest possible element is in something like a printer or scanner, and has precisely NOTHING to do with how high a resolution you need.

There is something called PPI, or pixels per inch, which is probably what you're after. Basically, it's exactly what it sounds like. If you have an image that's 1000x1000 pixels, and you print it at 100 PPI, it will be 10"x10". In Photoshop, if you go to Image -> Image size, you can change the printed size there (as well as the resolution). Just make sure to uncheck the "change resolution" checkbox.

Dave Jerrard
07-21-2006, 06:41 PM
DPI and PPI are ratios only. Saying an image is 300 DPI says nothing about the image's size or resolution. Saying an image is 10" also says nothing about its resolution, or how many pixels are in it. Only the pixel dimensions give you the full set of info.

When rendering, you're only dealing with pixels. DPI and PPI have no value until you add in something else, like an output format. If you're rendering for something that's going to be 10" wide, well, you can render anything at any size and print it out at 10". Pixels Per Inch is the ratio of how many pixels in the image will fill an inch at that output size. I can render an image that's 100 pixels wide, print it 10" wide, and I will have an output resolution for that image of 10 PPI. If I print it 1 inch wide, then it's 100 PPI. If I look at it at full scale on my monitor, I'll have a different PPI, depending on how large my monitor is and what the display resolution is set to. PPI alone is meaningless, though. You have to combine it with something else - the output size or the pixel dimensions. If I tell you I have an image that's 48,000 PPI, that might sound impressive. But for all you know, I could be talking about a single pixel printed at a size of 1/48,000 of an inch.

PPI is also commonly confused with DPI. DPI is a hardware-specific number, where PPI is a variable, as I pointed out above. DPI - Dots Per Inch - is a hard value, specific to input and output devices. Scanners have a set number of sensors per inch that they use to scan with. Usually there are about 300 of these sensors per inch; better scanners have 600 or 1200, or higher. Printers have print heads that produce dots of ink that are always the same size. This is listed as the printer's DPI. On printers, this is the minimum spacing of the dots, and frequently they'll have two values, because the printer can advance the paper in smaller increments and essentially double its vertical resolution. Low-end printers only have a DPI of about 600, but most common printers are in the range of 1440 to 2800 DPI. This means they print that many dots of a specific color in each inch. This is also the smallest detail the printer can produce.

Some monitors have a DPI value as well. Actually, ALL digital monitors - mostly LCD, but I'll include digital TVs here as well, including Plasma, SED, DLP, LCoS, etc. - that have addressable pixels in them do have a hardware-defined DPI value. It's literally the number of pixels on the screen, per inch. This can't change, since it's a physical property of the monitor. If you want more or fewer pixels, then you get another monitor. Right now, I'm using a 21" LCD monitor with a 1600x1200 native resolution. The display area of this monitor is about 16.25 inches across. This means this monitor has a display resolution of nearly 100 DPI (98.46 DPI, to be precise).

Analog TVs and monitors, as well as projectors, are a different matter. Analog TVs and monitors can change their scanning frequency as well as their image size, so they don't actually have a set DPI value. It can be adjusted because there are no physical pixels being addressed. Some monitors are more precisely calibrated, though. Older Mac monitors, for example, were set to display 72 DPI no matter how big the monitor was; a larger monitor meant you could fit more on screen. This is where the default 72 DPI comes from in various publishing and image processing software when it's given an image that doesn't contain a DPI value (which should really be PPI).

Projectors are another matter again. Digital projectors have a set physical resolution on their image chips, but like other digital displays, only the total number of pixels is given (1024x768 or 1920x1080, etc.). Since these are projected onto a screen, and the distance between the projector and screen determines the size of the image, there is no DPI value associated with these. It can literally be different each time you use it. You can always figure it out if you want to: just measure the image on the screen and divide the number of pixels by that.

Now, thanks to Adobe throwing this DPI thing around so much over the years and actually embedding it into their image files, very few people actually understand it. They seem to think it's an actual value that affects the image. It doesn't. Yet they put it in their software, and then have their software complain if it loads an image that doesn't contain this information, like a LightWave render, or they just toss in the default old Mac monitor DPI of 72. Now people looking at this will say it's too big or not big enough, because Adobe is telling them that at 72 DPI, the image will be X inches big! Like it's a hardcoded size!

First of all, Adobe used the wrong term. They should be using Pixels Per Inch, because there's no specific hardware device being referred to. The image is made up of X by Y pixels. The PPI comes into play when you start talking about output sizes. As I mentioned before, you can print the same image out at all kinds of sizes and get a whole bunch of different PPI values, but they're all from the same image, and the number of pixels in that image has not changed. You're just printing them bigger or smaller. Printing an image at 600 PPI will not magically give you more detail than printing it at 300 PPI. It will be the same image; the 300 PPI print will just cover an area four times larger than the 600 PPI one.

To add more detail, you need more pixels. That's it. Now, if you want to output an image so that you can get a certain sharpness to it, then PPI comes into play. Remember, it's just a ratio - how many pixels in the image will fill an inch on the page. It also determines how big those pixels will be: 100 PPI means each pixel is 1/100th of an inch. More pixels per inch also means smaller pixels. For most print work - say brochures, magazines, etc. - 300 PPI is about all you'll need for color images, and 600 PPI for B&W. You can go higher, but the printing processes start to get finicky as you do. At 300 PPI, the individual pixels are extremely hard to make out in the printed image. For B&W images, they can still be visible, so it's recommended to go a bit higher for B&W. B&W film also has a higher resolution than color film, by the way.

For presentation material, like higher-end magazines and books, an even higher level of detail is desirable; 600 PPI is usually used for these. Special printing is also usually used for this type of material, frequently including higher screen frequencies, extra colors, and spot gloss, in addition to special printing stock. For larger material, like posters, murals, and billboards, the PPI can be much lower, because these are designed to be viewed from a distance. The angular size is still about the same as holding the material at typical reading distance, but the actual resolution in PPI is much lower. In fact, for billboards, it can be as low as 5 PPI, maybe even lower.

These various PPI values actually come about from an older printing technology called halftoning (http://en.wikipedia.org/wiki/Halftone), which has been around way longer than pixels. Since ink is either printed or not printed, you can only get two shades on a page: solid or nothing. Halftoning uses a screen with a soft-edged crosshatch type of pattern in it, called a Magenta Screen (because it's magenta colored). These screens come in different resolutions, or frequencies, ranging from 65 lines per inch (LPI) to 300 lines.

These screens are placed over lith film (very high contrast film that has virtually no grey tones), which is then exposed to artwork through a process camera. The image is projected through the screen, onto the film. The screen casts a grid-like shadow onto the film, but the trick is the soft-edged lines in it. Where the lines intersect, the screen is virtually opaque, but the lines fade to fully transparent in the centers of the cells of the grid. A little light will only get through these small spaces, but more intense light will be able to pass through the semi-transparent edges of these lines. So the brighter areas of an image will create larger dots on the film than dimmer areas. The result is a negative of the artwork, made up of small opaque dots of various sizes.

Generally, screens of 65 to 133 lines are used, with 133 usually being reserved for magazines and brochures. To convert an image to a halftone screen (which can be generated by special printers called imagesetters, which have a very high DPI of about 24,000 or more), best results are achieved if the PPI of the image is at least double the LPI of the screen. So, for most work, which will use a 65-line screen, 130 PPI will do quite well. The highest quality work can be printed using screens of 300 LPI, so double that and you get a PPI of 600. I've never seen a screen higher than 300 LPI, but they may exist. There are much lower ones as well. Billboards are printed with a screen frequency of around 8. They have some pretty big dots in them when you see them up close.

These are just rules of thumb, though, so don't worry about getting that image to be EXACTLY 150 PPI. If it's 151 or 149, or even 140, it's not like anyone's really going to notice such a small difference. I've printed posters at 50 PPI that looked better than other posters printed at higher resolutions. Text is a special case, and needs a pretty high resolution to look good, but this is generated at that higher resolution automatically when you use publishing software. So you can have artwork that prints out at 50 PPI and yet have text over it that's at a full 300 PPI. (Usually the text, which is vector based, is printed at the output device's native DPI, so it's as sharp as possible.)

He Who Hopes That Cleared Up Some Of The Confusion Without Adding To It.

07-21-2006, 09:37 PM
Use this L-Script:


08-21-2006, 03:38 PM
Ah ha! In the course of learning C++, I have written a program that automatically calculates the resolution for me.

08-21-2006, 05:14 PM
Doesn't the print assistant do that?

08-21-2006, 05:50 PM
I took a look at the print assistant and didn't understand it. Anyway, here is my simple app:

08-21-2006, 07:32 PM
OT: Love your wallpaper. :D

08-21-2006, 07:55 PM
It is the result of a tutorial in the Inside LightWave [8] book.

08-21-2006, 08:17 PM
Really? I guess I need to reread the book again... Amazing the things I forget about.

08-22-2006, 11:27 AM
Remember that all of the fields in LightWave can do math. In your Camera Properties panel, just type in the inches you want times the PPI you want (e.g. 8.5*300 for an 8.5-inch width at 300 PPI) and it will automatically calculate the resolution for you.

08-22-2006, 11:40 AM
Oh, I remember now... Well, at least I got some programming practice in.

08-23-2006, 05:25 PM
Mmmmm programming. I can do it but I have to pull my hair out in the process. As my friend says there is a place for OCD people, and that place is in programming :)

08-24-2006, 11:16 AM
As posted earlier, just use this LScript by Eki; it sets the camera up afterwards:


(Not tried in LW9, just in case it's broken with the new cameras!)

08-24-2006, 11:59 AM
Thanks, I saw that, but I just wanted to set myself the challenge of writing my very first program without the use of a book.