Discussing computer displays without mentioning the graphics and central processing units (CPUs) that developed alongside them would be like discussing automobile engines without touching on the body, chassis, and suspension; they work hand in hand, and an improvement in one usually answers or prompts an improvement in the others.
Displays belong to the video portion of a computing system’s output devices. From roughly 1945 to 1979, the era of the mainframe, output was more likely to be printed text than anything else. The earliest example of video output comes from the Whirlwind project at MIT circa 1950: a round, monochrome cathode ray tube (CRT) that resembled an oscilloscope. Whirlwind was dedicated to military flight simulation and air-defense work for NORAD. Printed output remained more common because it was transportable (you could take your data with you), while CRT displays were expensive, large, and heavy. The development of the integrated circuit (IC) in 1958 would begin to reduce computer size and cost.
Out of a joint project with General Motors on computer-aided design for cars, IBM developed the first commercially available software for mechanical drafting, along with the IBM 2250 display. Although GE would build a full-color, real-time flight simulator for NASA in 1967, through the 1960s and much of the 1970s mainframe graphic display was limited to monochrome CRT and LED representations of numbers and some text, similar to the onboard Apollo computer displays, pocket calculators, and scoreboards at sporting events. While a bit dull, this era helped standardize the pixel as the basic unit of display just as computer graphics began to grow more sophisticated. The first widely available breakthrough in motion graphics came from arcade video games, which, while entertaining, were not programmable by the user. And they cost a quarter per play, much like a legal slot machine that never paid out.
In 1979 things began to change, as the computer and its graphics came into the home. The Apple II Plus was released, Motorola shipped the 68000 processor (16-bit externally, 32-bit internally), and IBM produced the 3279 color terminal. The Atari 2600 home gaming system flew off store shelves. The convenient thing about the Atari, and later the Commodore systems, was that an ordinary color TV could serve as a decent display. The video output was admittedly crude, but before this, home video output wasn’t available at all, let alone in color. The IGES graphics file format was also standardized, which meant that different platforms could potentially handle the same data.
During the 1980s, display and graphics processing advancements that weren’t monopolized by the military tended to stay in the hands of filmmakers and television advertisers. Commodore computers competed with gaming consoles in the retail market, and CPU advancement was gradual as graphics grew incrementally better, from 8 colors to 256 colors. Hard disk drives were becoming common on Apple and IBM PCs but cost roughly ten times as much as a gaming system. Realistic graphics, even for filmmakers, were still technologically out of reach.
The 1990s saw a boom in CPU capability that directly improved graphics quality. Films like “Terminator 2: Judgment Day” hinted at the potential waiting in home PCs. The Intel 486 processor was finally fast enough to carry some of the video load without dropping other processing tasks. Video cards that took over the exclusive workload of drawing the screen and rendering 3D scenes emerged by 1995. TVs were no longer fast enough to keep up with the graphics output of PCs. Gaming consoles re-emerged with the Nintendo 64, for which a TV was still an acceptable display.
Displays have since developed in their own right, with flat-panel LCD technology giving way to high-definition LED screens. The current trend is toward virtual reality and more realistic 3D. I do not think it unrealistic to anticipate a commercial environment that operates in a fashion similar to the holodeck presented as crew recreation in the TV series “Star Trek: The Next Generation.”