The Evolution of DPI Over the Years (2024)

Deepak Shankaran | Updated on: January 2, 2023

What is DPI?

DPI (dots per inch) is a term originally coined for print media to describe the resolution of printed output – a higher DPI indicates more dots of ink per inch of the output and hence sharper text and images.

In the initial days of computers, DPI was used to indicate the density of dots (or, in this case, virtual pixels) per inch of the screen. Technically, the term that accurately describes this metric is PPI (pixels per inch), but the use of DPI, which started with early Macintosh and Windows computers, still prevails today.

There are two kinds of PPI values that are of potential interest:

Logical PPI – The PPI that the operating system assumes is provided by the display – it can be thought of as a property of the virtual screen assumed by the OS

Physical PPI – The physical pixels per inch that a display monitor provides (corresponds to its native resolution)
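
For illustration, physical PPI can be computed from a monitor's native resolution and diagonal size. The sketch below is a hypothetical example (the 27-inch, 2560 x 1440 figures are illustrative, not taken from this article) applying the usual formula of diagonal pixel count divided by diagonal length in inches:

    #include <cmath>
    #include <cstdio>

    // Physical PPI = diagonal resolution in pixels / diagonal size in inches.
    double physicalPpi(int widthPx, int heightPx, double diagonalInches) {
        double diagonalPx = std::sqrt(double(widthPx) * widthPx + double(heightPx) * heightPx);
        return diagonalPx / diagonalInches;
    }

    int main() {
        // A 27-inch 2560 x 1440 monitor works out to roughly 109 physical PPI.
        std::printf("%.1f PPI\n", physicalPpi(2560, 1440, 27.0));
        return 0;
    }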

A brief history of DPI processing on Windows

At the onset of the personal computer age, the problems to be solved when it came to displaying content digitally on a screen were quite unique. Early display systems had a physical PPI of 72 pixels per inch. This made 72 points per inch a natural choice for the Macintosh OS (‘points’ are a physical unit of measure in typography, dating from the days of printing presses, where 1 point by the modern definition is 1⁄72 of the international inch), mirroring the physical PPI of the prevalent display systems. This meant that text seen at 72 PPI on the display had the same physical dimensions as the 72 DPI later seen on a printout, with 1 dot in printed text equal to 1 pixel on the display screen. This also meant that some of the early display sizes corresponded to the current standard office paper sizes.

As a result of Apple’s design of one-to-one mapping of display units with print units, the standard 10-point font size from the typewriter era would be rendered using 10 pixels on the physical display. This led to a couple of problems:

  • It made the then-standard 10-point fonts display poorly and made them difficult to read, particularly the lower-case characters
  • Moreover, since computer displays were typically viewed at a greater distance than print media, the same 10-point font was perceived as smaller on the computer display than on print media.

Microsoft attempted to solve both problems with what could essentially, in hindsight, be called a hack. It wrote its display software to treat the screen as having 4/3 of its actual PPI. Since most screens at the time provided 72 PPI, Microsoft essentially wrote its software to assume that every screen provides 96 PPI (as 72 x 4/3 = 96).
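
To make the arithmetic concrete, here is a minimal sketch (an illustration of the conversion, not code from either operating system) of mapping typographic points to pixels: a 10-point character maps to 10 pixels at 72 PPI, but to roughly 13 pixels under the assumed 96 PPI.

    // Convert a size in typographic points (1/72 of an inch) to pixels at a given PPI.
    int pointsToPixels(double points, double ppi) {
        return static_cast<int>(points / 72.0 * ppi + 0.5);  // round to the nearest pixel
    }

    // pointsToPixels(10, 72) == 10  -> one pixel per printed dot on a 72 PPI display
    // pointsToPixels(10, 96) == 13  -> one-third more pixels under the 96 PPI assumption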

This hack resulted in the following short-term benefits:

  • Since the software assumed that one-third more pixels were available for rendering, fonts were rendered with greater detail
  • On displays that provided 72 PPI, every character of text would be rendered at a size one-third larger than it should be, thereby allowing a person sitting at a comfortable distance from the screen to view the rendered text clearly

On the flip side, it led to the following negative effects:

  • Exaggerated sizes of the rendered elements meant less available screen real estate, in a relative sense
  • The 1:1 relationship between display and print was lost. This led to problems that became more exaggerated as display screens of different sizes and PPIs started to emerge.


To solve these, Microsoft came up with a virtual screen mechanism wherein the software programs render the text and images onto a virtual screen, and the OS translates the virtual screen onto the physical screen. The logical PPI is associated with this virtual screen, and this allows older programs to still run properly irrespective of the actual PPI of the screen.
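
As a rough illustration (my own sketch, not actual Windows code), the translation from the virtual screen to the physical screen amounts to multiplying lengths and coordinates by the ratio of the effective DPI to the 96 DPI baseline:

    // Scale a length laid out against the 96 DPI virtual screen to physical pixels.
    // Win32 code commonly uses MulDiv(value, dpi, 96) for the same computation.
    int scaleForDpi(int logicalLength, int effectiveDpi) {
        return logicalLength * effectiveDpi / 96;
    }

    // Example: at a 144 DPI (150%) setting, a 200-unit-wide element becomes 300 physical pixels.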

High-DPI displays started emerging in the post-Windows XP era and became mainstream around the Windows 8 era. Although display scaling via a custom DPI value had been available since Windows 95, it became an easily accessible feature in Windows 8, where scaling was chosen as a percentage instead of entering custom DPI values.

With the introduction of the GDI+ library, resolution-independent text scaling was made possible. Additionally, Windows Vista allowed programs to declare themselves as being DPI-aware via a manifest file or using an API. For programs that did not declare themselves to be DPI-aware, the OS had a compatibility feature called DPI virtualization – which essentially meant that programs would continue to assume 96 DPI (irrespective of the actual DPI setting), and the Desktop Window Manager would then scale the application’s window to match the actual DPI setting. This led to an effect akin to zooming on an image – thereby resulting in a reduction in the sharpness of the rendered images and text.
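
For illustration, the API route looks roughly like the sketch below; the manifest route declares the same intent in the application manifest instead. This is a minimal sketch of the Vista-era call, not a complete program:

    #include <windows.h>

    int APIENTRY wWinMain(HINSTANCE, HINSTANCE, PWSTR, int) {
        // Declare the process DPI-aware before any windows are created, so the OS
        // skips DPI virtualization (bitmap stretching) for this application.
        SetProcessDPIAware();

        // ... register the window class, create windows, run the message loop ...
        return 0;
    }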

Windows 7 made the DPI settings even more accessible, with the ability to effect changes with only a log-off and log-in instead of a full restart, and made it a per-user setting. Another big upgrade was that the default DPI value was now set automatically by reading and matching the physical pixel density of the display monitor. In Windows 10, manual control over DPI scaling was introduced for individual monitors.

Making an application DPI-aware

As stated above, the default scaling of non-DPI-aware application windows results in less than optimal output, namely blurry or incorrectly sized windows in many common usage scenarios. To fully utilize the DPI technology of the OS, it becomes imperative to make applications DPI-aware. This helps achieve proper scaling of the window and its contents when the DPI settings change, and an overall sharper rendering of the text and other UI elements in the application window.

Applications built on the Universal Windows Platform (UWP) have dynamic scaling built in, and no additional code is necessary to make them DPI-aware. Applications built using other technologies, such as Win32, Windows Forms, and Windows Presentation Foundation (WPF), will require additional development to make them DPI-aware and hence scale properly with the DPI.

Several legacy Windows desktop applications incorrectly assume that the DPI setting will not change during the lifetime of the application. The DPI can, in fact, change in the following scenarios:

  • When switching between different monitors (with different default DPI settings)
  • Moving the application between monitors in a multi-monitor setup
  • Connecting/disconnecting an external monitor to a laptop
  • Connecting via remote desktop from a high DPI device to a low DPI device (or vice versa)
  • Making DPI changes while the application is running

If applications do not dynamically respond to these changes, it can result in blurry or incorrectly sized windows.

In order to bring in DPI awareness, the application needs to first inform the OS about its level of DPI awareness:

DPI unaware – This is the default mode, assumed if the application doesn’t inform the OS about its DPI awareness. In this case, the OS will simply perform bitmap stretching of the window to the expected size whenever the DPI setting is anything other than 96 (100% scale). This results in a blurry render.

System DPI awareness – Applications that declare themselves as system DPI-aware read the value of the DPI setting at the time of startup and then use it to determine the dimensions and positions of the window as well as the size of the UI artifacts being rendered. This ensures bitmap scaling is not done by the OS at that DPI setting. However, if the DPI setting is changed (or the window is moved to a monitor with a different DPI), the OS will perform bitmap scaling of the application, and the output will still appear blurred.

Per-monitor DPI awareness – Declaring itself per-monitor DPI-aware allows an application to dynamically render correctly whenever the DPI changes – for whatever reason. For such applications, whenever the DPI changes, the OS will not bitmap-scale the window but will instead send the WM_DPICHANGED message to the application window. This gives the application an opportunity to recalculate all its dimensions, positions, text sizes, etc., essentially scaling everything to the newly changed DPI setting and redrawing the window. This is the recommended method to allow for a rich and dynamic window experience.
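
A minimal sketch of handling the message in a window procedure is shown below; it assumes the application has already declared itself per-monitor DPI-aware, and the actual layout-recalculation work is only hinted at in a comment:

    #include <windows.h>

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
        switch (msg) {
        case WM_DPICHANGED: {
            // The low and high words of wParam carry the new DPI (X and Y are identical).
            UINT newDpi = HIWORD(wParam);

            // lParam points to a RECT with the window size/position that Windows
            // suggests for the new DPI; applying it keeps the window visually stable.
            const RECT* suggested = reinterpret_cast<const RECT*>(lParam);
            SetWindowPos(hwnd, nullptr,
                         suggested->left, suggested->top,
                         suggested->right - suggested->left,
                         suggested->bottom - suggested->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);

            // Recalculate fonts, control sizes and positions, cached bitmaps, etc.
            // against newDpi here, then force a repaint.
            (void)newDpi;
            return 0;
        }
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }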

To make existing applications DPI-aware, the following changes need to be made:

  • Mark the application as being per-monitor DPI-aware in its manifest file
  • Handle the WM_DPICHANGED message. This will require re-dimensioning of the key UI artifacts and re-assigning of defaults that can impact the size and position of the UI artifacts, such as text size
  • Replace any legacy Win32 APIs that do not work with a specific DPI context. Some of these APIs are listed below, along with their DPI-aware counterparts:

Single DPI version        Per-Monitor version
GetSystemMetrics          GetSystemMetricsForDpi
AdjustWindowRectEx        AdjustWindowRectExForDpi
SystemParametersInfo      SystemParametersInfoForDpi
GetDpiForMonitor          GetDpiForWindow
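
As an illustration of the difference, the per-monitor variants take an explicit DPI value, typically obtained from GetDpiForWindow. The sketch below (assuming a window handle hwnd on Windows 10 version 1607 or later, where these APIs are available) retrieves a system metric scaled for the monitor the window currently occupies:

    #include <windows.h>

    // Width of a vertical scroll bar for the monitor a given window currently sits on.
    int scrollBarWidthFor(HWND hwnd) {
        UINT dpi = GetDpiForWindow(hwnd);                  // DPI of the window's monitor
        return GetSystemMetricsForDpi(SM_CXVSCROLL, dpi);  // metric scaled to that DPI
        // The legacy GetSystemMetrics(SM_CXVSCROLL) returns a value tied to the system
        // DPI, which can be wrong in a mixed-DPI, multi-monitor setup.
    }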

FAQs

Is 1200 DPI overkill?

The necessity of 1200 DPI largely depends on the type of printing you are doing. For images printed at 100% or smaller, 1200 DPI may be excessive. However, for larger images, say 200% or more, it might be appropriate. For high-end art books requiring fine details, 1200 DPI might be slightly high but not excessive.

Is 1200 DPI better than 300 DPI?

Generally, 300 dpi is a high-res print and 1200 dpi is ultra high-res, beyond what is often used even for extremely detailed fine art applications.

What does DPI mean?

What does DPI stand for? DPI stands for Dots per Inch, referring to the number of ink droplets a printer will produce per inch while printing an image.

Is 600 DPI better than 300?

This means that the more dots per inch (dpi), the higher the print resolution. For example, a 300 dpi printer can print 300 dots per inch of page space, whereas a 600 dpi printer can print double that amount, creating a much higher quality print.

Is 3200 DPI overkill?

Most modern gaming mice have a maximum DPI setting much higher than anything a gamer will want to use day-to-day. An extremely high setting might be fun for trick shots or oddball situations but in general, a DPI setting of up to 3200 is enough for most players. Nearly every modern gaming mouse can handle that.

Is 10000 DPI overkill?

For most users, 10000 DPI is excessively high and may result in overly sensitive mouse movements, making precise control difficult.

What happens if DPI is too high?

Text and icons may appear blurry: If the DPI is too high for the screen size, text and icons may appear blurry or pixelated. Some apps may not work properly: Some apps may not be designed to work with high DPI settings, and may not display correctly or may crash.

Is 1600 DPI really better?

Some tests claim that 1600 dpi gives a better initial response, so it can be better to set the mouse to 1600 dpi and lower the in-game sensitivity instead.

Which is clearer, 600 DPI or 1200 DPI?

The higher the dpi, the better the resolution and the better the copy/print quality. For example, 1200 x 1200 dpi will give you better resolution and better halftones than 600 x 600 dpi. Almost all Kyocera copiers and printers are 1200 x 1200 dpi capable.

Does increasing DPI increase quality?

The DPI/PPI of an image is important for two main reasons: Print quality - The higher the DPI/PPI, the better the print quality will be. This is because there are more dots or pixels per inch, so each dot or pixel can be printed at a higher resolution. This results in sharper images and smoother gradients.

What is the DPI of a JPEG?

When you display or print the image, the DPI is the number of pixels divided by the physical size the image is displayed or printed at. For example, if the JPEG image is 300 x 500 pixels and it is printed as a 3″ x 5″ picture, the DPI is 300 / 3 = 100.

What is the best DPI for color printing?

For printing, the recommended resolution for all images and art files is 300 dpi. The offset press cannot accurately reproduce resolutions above 300, so it is the industry standard.

Is it better to scan old photos at 300 or 600 DPI?

Is 300 dpi or 600 dpi better? If you have 4x6 snapshots, then 300 dpi scans are perfect for simple archiving or printing. You can still print a good-looking 7x10 enlargement. For small wallet-sized pictures, scan at 600 dpi so you can enlarge them and retain more detail.

Can you print 72 DPI images?

The internet displays images at 72 dpi so that the images appear quickly over an internet connection, but under no circumstances should they be used for printing. If you submit low-resolution files for printing, you will not be happy with the quality of your printing.

What is the difference between DPI and PPI?

These two acronyms are often used interchangeably although they do have different meanings. PPI (Pixels Per Inch) refers to display resolution, or how many individual pixels are displayed in one inch of a digital image. DPI (Dots Per Inch) refers to printer resolution, or the number of dots of ink on a printed image.

Is a 1200 DPI scan good?

Best DPI for Colour or Greyscale Scanning

For online use, businesses usually choose a smaller resolution of 200 dpi. Colour photographs are often scanned at higher resolutions to capture their detail – typically from 600 to 1,200 dpi, with the higher resolution used for the best archival images.

Is 12000 DPI good for gaming?

Games that call for increased precision on the fly can make great use of this function. Its maximum setting might only be 12,000 DPI but this added utility more than makes up for it.

How long does a 1200 DPI scan take?

It takes about 1:15 to 2:00 for a 4x6 print at 1200 DPI. But if you arrange three 4x6 prints on the scanner glass and do a preview with the Epson scan software, you can then crop each image so that the scanner will scan just those images and create a separate file for each image.
