Tuesday, June 26, 2007

Windows' ClearType Engine more accurate at 100 DPI


Today I spent a lot of time figuring out how to generate a nice bitmap of a monospaced font that I can use as a font template in our mobile phone applications. In other words, I render a bitmap of a font that has a fixed character width, and then cut it up into little pieces (glyphs) that I can draw on the phone's screen.
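
For illustration, here's a minimal sketch of that slicing step in Python, using PIL. The file name glyphs.png, the cell width, and the character layout are my own assumptions, not details from our actual toolchain:

    # Sketch: cut a monospaced font strip into per-character glyph images.
    # Assumes "glyphs.png" holds characters ' ' (0x20) through '~' (0x7E)
    # laid out left to right, each exactly CELL_W pixels wide. The file
    # name, cell width, and layout are hypothetical.
    from PIL import Image

    CELL_W = 12   # assumed fixed character width in pixels
    FIRST  = 0x20 # first character in the strip (space)

    strip = Image.open("glyphs.png")
    cell_h = strip.size[1]

    glyphs = {}
    for i in range((0x7E - FIRST) + 1):
        left = i * CELL_W
        glyphs[chr(FIRST + i)] = strip.crop((left, 0, left + CELL_W, cell_h))

    # Drawing a string is then just pasting glyph tiles side by side.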


When I started out by rasterizing Lucida Console and measuring character widths, I was very surprised to find that the characters' rendered widths were inconsistent. After closer inspection, I came to the conclusion that each character was, say, 12 pixels plus a fractional pixel in width. This left me a little perplexed, but then I remembered a few interesting tidbits from my history in desktop publishing.
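
A toy calculation shows where the inconsistency comes from: if each character advances by a fractional number of pixels (12.4 here is an assumed value, not a measurement of Lucida Console), the rounded start positions drift and the apparent widths alternate.

    # Toy example: with a fractional advance, glyph start positions land
    # on different sub-pixel offsets, so the on-screen widths alternate.
    advance = 12.4  # assumed fractional advance in pixels
    starts = [int(i * advance + 0.5) for i in range(6)]
    widths = [b - a for a, b in zip(starts, starts[1:])]
    print(starts)  # [0, 12, 25, 37, 50, 62]
    print(widths)  # [12, 13, 12, 13, 12] -- the "inconsistent" widths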

I grew up with the Macintosh. The best thing about the Mac was that the output on paper closely resembled what was displayed on the screen; it was always remarkably accurate. And today I remembered why: the screen resolution of the Mac was generally 72 pixels per inch.

72 ppi is special. In typography, most small distances are measured in points, which are 1/72 of an inch. This means that on the Mac, a 12-point monospaced font generally displays as 12 pixels in height. As far as I can tell, typographers like to place font features on a whole-point grid, so everything comes out very nicely on the Mac, where one point equals one pixel, and the font doesn't have to be so aggressively hinted.

Not true on the PC. Because display devices have increased in resolution over time, a Windows machine generally defaults to a display resolution of 96 ppi. Since 96 ppi is a third larger than 72 ppi, each point maps to 1 1/3 pixels, and we end up with this dangling third-of-a-pixel everywhere -- a remainder that has to be accounted for somehow, probably by the font's author, who hand-tweaked the font's hinting.
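
The arithmetic is easy to check with a one-line conversion (my own illustration, not any font API): a point is 1/72 of an inch, so px = pt * ppi / 72.

    # points-to-pixels: one point is 1/72 inch, so px = pt * ppi / 72.
    def pt_to_px(pt, ppi):
        return pt * ppi / 72.0

    print(pt_to_px(12, 72))  # 12.0 -- whole pixels on the Mac
    print(pt_to_px(12, 96))  # 16.0 -- the em itself fits, but...
    print(pt_to_px(1, 96))   # 1.333... -- interior features placed on
                             # the point grid land on thirds of a pixel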

This notion that Windows was off-kilter led me to notice something seemingly remarkable about how ClearType works. After I rasterized Lucida Console and dropped the resulting bitmap onto a grid in the GIMP, I saw a gradual misalignment that kept the font from adhering to any pixel-based grid. And then the display-resolution issues above came back to mind.

After several tweaks of the Display Properties to change the screen resolution (which, in turn, required multiple reboots of my system), I finally discovered a dots-per-inch value that just seems to work (terminology note: I use dots and pixels interchangeably here). Apparently, Windows uses a 100 ppi grid internally when rasterizing ClearType fonts, or something like it. Take a look at the bitmap of 12-point Lucida Console below. The grey text was screen-captured with my display resolution set to 103% of "normal size," or 99 DPI. The black text, which adheres to the grid much more reliably, was taken with my display resolution set to 104%, or 100 DPI. Of course, things were much worse at the Windows default of 96 DPI.
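
For reference, the percentages in that Display Properties dialog are relative to the 96 DPI default, so the values above work out like this:

    # XP's custom DPI percentages are relative to the 96 DPI default.
    for pct in (100, 103, 104):
        print(pct, round(96 * pct / 100.0))  # 100 -> 96, 103 -> 99, 104 -> 100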

Anyway, if you use Windows XP, I suggest you set your DPI to 100 if you use ClearType and do a lot of work in fixed-width fonts (programming, data processing, etc.). Your eyes will thank you.
