In the world of imaging, PPI and DPI are used almost interchangeably, yet they are two quite different things. The misunderstanding of these two terms leads to more confusion than almost any other single thing in digital imaging.
The first thing to understand is that in digital imaging, the only attribute of a file that really counts is how many pixels it contains. Terms like megapixels, DPI and file sizes in megabytes only confuse the issue. In the end, all digital images are simply X pixels by Y pixels big. In almost all cases, unless you are talking about how your printer physically lays ink down on the paper, you are actually dealing with PPI - pixels per inch.
Dots Per Inch is an old printing term that is a measure of how many tiny droplets of ink a printer is laying down in its dither pattern to form one inch of a print. When printing, you tell the printer what DPI or mode to print in. In the bigger printers you can usually choose the DPI directly, but in other printers this is usually camouflaged by modes such as 'Photo', 'Best Photo', 'Photo RPM' or 'Fine', 'SuperFine'.
Almost all printers operate best for general photographic usage at around 1440 DPI, which is commonly 'Best Photo' mode in most Epson printers. The higher DPI modes like 'Photo RPM' are useful if you're printing really high key (i.e. all light toned) shots. They can be detrimental for general printing though, as they can lay down too much ink on the paper, resulting in impaired shadow detail.
And of course the more ink you use, the more ink you pay for! We suggest testing your available printing modes, using both high key and low key images to determine which work best with your printer.
Pixels Per Inch is a description of the logical number of pixels from your original image that will be used to tell the printer to print one inch on paper. Assuming a sharp original shot with good technique, the higher the PPI, the better the quality of print you can achieve. There have been claims that 360 PPI is the most you need, but 720 PPI images can easily be seen to be much sharper in print, if this data is available at good quality from the original file. In our testing, once you're over 720 PPI there's really no further perceptible difference in sharpness to be gained.
PPI is a logical or abstract term - changing the PPI of a particular file does not in any way affect the file itself, it is simply a decision about how many pixels of the available pixels you will use to print an inch on page. You can choose any number you like - from 1 to infinity. The existing standard for high quality, photographic printed images is 300 PPI, meaning that for each inch of the printed image, there must be 300 source pixels to use.
This is why the 'resample' check box, in the Image Size dialogue, is the single most important, and also dangerous, control in Photoshop.
When you resample an image, you are actually changing the number of pixels in your image, either adding some or throwing them away. You should only do this if you are making an explicit and informed decision to do so, as nothing else will affect the quality of information available to you from your file as much as this.
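The arithmetic behind resampling is simple: the new pixel count is just the target print size multiplied by the target PPI. A minimal sketch (the function name is our own, purely illustrative - this is not Photoshop's API):

```python
def resampled_dimensions(print_width_in, print_height_in, target_ppi):
    """Pixel dimensions a file would be resampled to for a given
    print size (inches) at a given target PPI."""
    return (round(print_width_in * target_ppi),
            round(print_height_in * target_ppi))

# Resampling a 5400 x 3600 pixel scan for a 12 x 8 inch print at 300 PPI
# throws pixels away: the file becomes 3600 x 2400.
print(resampled_dimensions(12, 8, 300))  # (3600, 2400)
```

Note that the original file had 450 pixels available per inch at this print size, so resampling to 300 PPI here discards a third of the data on each axis.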
For example, if we scan a 35mm transparency at 4000 PPI, this will result in a file of 5400 by 3600 pixels.
To make a print of 12 by 8 inches, we look at the PPI we have available from our file for a print of this size.
5400 pixels divided by 12 inches = 450 PPI (5400/12 = 450) or 3600/8 = 450.
This means we can use 450 pixels to represent one inch of our print.
The printer driver will now translate those 450 logical pixels per inch (PPI) into 1440 physical dots per inch (DPI) and produce a very high quality print for us.
If we want to know the maximum print size we can achieve at good quality, and we know from experience that, for a sharp original output on an Epson inkjet, 240 PPI is sufficient, we can calculate it by taking the total number of pixels available to us (5400 on the long edge) and dividing by the required PPI, giving the print size in inches:
5400/240 = 22.5 inches
3600/240 = 15 inches
Our final print will be 22.5 by 15 inches, with 240 pixels used to represent each inch. And the printer will use its 1440 dots per inch (DPI) to actually print the image on paper.
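The same calculation run the other way - pixels divided by required PPI - can be sketched as (again, a hypothetical helper of our own):

```python
def max_print_size(pixels_long, pixels_short, required_ppi):
    """Largest print size in inches achievable at a given minimum PPI."""
    return (pixels_long / required_ppi, pixels_short / required_ppi)

# A 5400 x 3600 pixel scan at a minimum of 240 PPI:
print(max_print_size(5400, 3600, 240))  # (22.5, 15.0)
```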
To complicate things further, monitors have their own terms. The most commonly used term is Pixel Density, but you will also see PPI used with monitors (or even PPCM - Pixels Per Centimetre). Again, it's a measure of how many dots (full pixels) there are per inch - and of course with monitors that is a fixed physical characteristic.
Historically, monitors were about 72 to 96 PPI, but modern displays (so-called 'retina' displays) now reach much higher figures - from 150 all the way up to (and beyond) 300 PPI. With a 4K 24 inch screen, for example, the pixel density is very high, and thus the display is very sharp - excellent in particular at rendering small, detailed things like text and image thumbnails.
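A display's pixel density follows from its resolution and diagonal size: the pixel count along the diagonal (by Pythagoras) divided by the diagonal in inches. A quick sketch (our own helper):

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 4K (3840 x 2160) panel with a 24 inch diagonal:
print(round(pixel_density(3840, 2160, 24), 1))  # 183.6
```

At roughly 184 PPI, such a screen has about twice the linear pixel density of a traditional 96 PPI monitor.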
It's also further confused by modern LCDs technically using sub-pixels - so in fact one 'pixel' is made up of three sub-pixels (usually one for each of R, G and B). The mixing of these three colours makes one visible 'dot' of the monitor's image. However, because these sub-pixels are individually addressable, sub-pixel level manipulation can be done to, for example, improve font rendering by anti-aliasing the edges of fonts for smoother results (AKA ClearType or LCD font smoothing). So in fact the PPI figure tells only one part of the story of how the entire system of computer + operating system + software + display will render the final output.