Wait, what!?! Reading this article made me feel ... old.
I've worked with (young) people who didn't believe me (until I showed them via Google) when I told them that computer screens used to work by shooting electrons through a vacuum to strike phosphorescent material that glowed. These were cathode-ray tubes (CRTs), familiar to anyone over about 30 years old – but literally museum pieces today. Those CRTs seemed magical to us in the '70s, when they started to replace teletypes as the human interface to computers. At first CRTs were monochrome only (e.g., black & white); then in the '80s full-color CRTs started to become common.
Those CRTs were far from perfect. The displays were a little fuzzy, the devices were large, heavy, and power-hungry, and the fidelity near the edges was notably worse than in the center. Still, they let us put text and pictures on a screen, and they were way better than those damned teletypes.
Then flat liquid-crystal displays (LCDs) came along in the '90s. They got better and cheaper very quickly, and in the blink of an eye they totally replaced CRTs. They're smaller, lighter, sip electricity, and (best of all) their fidelity is darned near perfect. The fuzziness is gone ... text is crisp and clear from edge to edge. Lines are straight. The screen is flat, not curved. They are so much better than CRTs that nobody cried when CRTs went the way of the dodo.
And then ... these nuts go to a whole bunch of trouble to make a video game played on a modern flat screen look like those very imperfect CRTs. Because nostalgia, or something less charitable.
And here I am remembering how magical it seemed the first time I saw a teletype printing computer output.
I feel old.