Around 1770, François-Ambroise Didot used slightly larger ciceros to fit the standard French “foot.” Didot’s cicero was 0.1776 inches long and divided evenly into 12 increments. Today we call them points.
In 1886, the American Point System fixed the “pica” at 0.166 inches. Six of these picas come to 0.996 inches, just shy of a true inch.
None of these systems ever strayed far from the same ratio: 12 points per pica and 6 picas per inch, or 72 points per inch. That ratio was an established standard by 1984, when Apple prepared to introduce the first Macintosh computer. The Mac’s interface was designed to help people relate the computer to the physical world. Software engineers used the metaphor of a desk to describe the arcane workings of a computer, right down to “paper,” “folder” and “trash” icons.
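The arithmetic behind these ratios can be checked in a few lines of Python. This is just a sketch of the conversions described above; the constant and variable names are mine, not part of any standard.

```python
# Traditional typographic ratios (names are illustrative).
POINTS_PER_PICA = 12
PICAS_PER_INCH = 6
POINTS_PER_INCH = POINTS_PER_PICA * PICAS_PER_INCH  # 12 * 6 = 72

# The 1886 American pica was 0.166 in, so six picas fall just short of an inch.
american_pica_in = 0.166
six_picas_in = 6 * american_pica_in  # ≈ 0.996 in

print(POINTS_PER_INCH)  # 72
```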
Each pixel on the original Mac’s 9-inch (diagonal), 512 x 342 pixel screen measured exactly 1 x 1 point. Hold a ruler to the glass, and you’d see that 72 pixels really did fill 1 inch. That way, if you printed an image or piece of text and held it next to the screen, the image and the hard copy would be the same size.
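At one pixel per point, the original Mac’s physical screen dimensions follow directly from its pixel dimensions. A quick sanity check, assuming the 72 pixels-per-inch figure from the text:

```python
import math

PPI = 72  # original Macintosh: one pixel per typographic point
width_px, height_px = 512, 342

width_in = width_px / PPI    # ≈ 7.11 in
height_in = height_px / PPI  # 4.75 in
diagonal_in = math.hypot(width_in, height_in)  # ≈ 8.55 in, marketed as a 9-inch display
```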
But early digital pictures were clunky and jagged. As screen technology and memory improved, computers could display more pixels on the same size monitor. Matching a printout to the screen became even less certain once raster and vector apps let users zoom in and examine pixels closely. By the mid-1990s, Microsoft Windows could switch between 72 and 96 pixels per inch on screen. This made smaller font sizes more legible, because more pixels were available for each point size.
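The legibility gain is easy to quantify: a point is always 1/72 inch, so the number of pixels spanning a given point size scales with the screen’s pixels-per-inch setting. A minimal sketch (the function name is mine):

```python
def pixels_for_point_size(point_size, ppi):
    # A point is 1/72 inch, so pixels = points * (ppi / 72).
    return point_size * ppi / 72

# A 12 pt letterform gets a third more pixels at the Windows setting:
at_72 = pixels_for_point_size(12, 72)  # 12.0 pixels
at_96 = pixels_for_point_size(12, 96)  # 16.0 pixels
```

Those four extra pixels per em are what made small text noticeably crisper at 96 ppi.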