Tales In Tech History: Computer Monitors


Computer monitors began as small monochrome CRT displays but now come in all sorts of wonderful shapes, sizes and even curves

It is hard to imagine but early computers often did not have a monitor to facilitate human interaction with the machine.

Indeed, these early computers instead utilised a range of different interfaces, including a panel of light bulbs, plain old punch cards, or even a television.

But as computers became more powerful, the need to interact with the machine grew ever more pressing, which led to the arrival of the computer monitor as we know it today.

Journey To Mainstream

Monitors were first used on computers involved in data processing tasks. Computers associated more with entertainment, such as the Sinclair Spectrum or Commodore machines, instead opted to use televisions for their visual displays.

It took the advent of the IBM compatible PC in the 1980s for the computer monitor to become a staple piece of equipment in most offices and homes.

And it should be remembered that early monitors were bulky affairs, mostly because they used cathode ray tubes (CRTs). Modern displays are of course much more svelte, thanks to thin-film-transistor liquid crystal displays (TFT-LCDs) or flat-panel LED displays.

Early monitors were also monochrome, giving the user a stunning choice of green, orange, or sometimes white text. They were also only able to display a limited amount of information, hence a dot matrix printer was often the principal output device, while the monitor was used to drive the computer’s operations.

Some experts point out that early Apple computers in the 1970s drove the delivery of more sophisticated displays that could show more information. That is true, but Apple was a (sizeable) niche player at the time, and it was IBM that took up the monitor mantle in the 1980s.

For example, in 1981 IBM introduced the Colour Graphics Adapter (CGA), which could display four colours at a resolution of 320 x 200 pixels, or two colours at 640 x 200 pixels. Then in 1984 it produced the Enhanced Graphics Adapter (EGA), which was capable of producing 16 colours at a resolution of 640 x 350.
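That trade-off between resolution and colour depth was no accident: both CGA graphics modes were designed to fit the same amount of video memory. A back-of-the-envelope sketch (the helper function here is illustrative, not any real API) shows why four colours at 320 x 200 costs exactly as many bytes as two colours at 640 x 200:

```python
# Video memory needed for a mode: width * height * bits_per_pixel / 8.
# 4 colours need 2 bits per pixel; 2 colours need 1 bit per pixel.
def framebuffer_bytes(width, height, colours):
    bits_per_pixel = (colours - 1).bit_length()  # 4 -> 2 bits, 2 -> 1 bit
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 4))  # 16000 bytes
print(framebuffer_bytes(640, 200, 2))  # 16000 bytes, the same footprint
```

Either way the framebuffer comes to roughly 16 KB, so doubling the horizontal resolution meant halving the colour count.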

But people wanted their screens to display more data, and by the end of the 1980s colour CRT monitors that could display 1024 x 768 pixels became more affordable.

Typically these units were connected to the computer via VGA ports, but later monitors offered connections on a range of ports (e.g. HDMI, DisplayPort, Thunderbolt), depending on the machine being used.

Bulky CRT-based monitors also remained the norm into the new millennium, partly because of their cheap manufacturing costs and viewing angles of nearly 180 degrees.

LCD screens had arrived in the 1990s but were simply too costly. The life of the CRT monitor was further extended because early LCD screens tended to provide a poor viewing experience compared to CRT units.

But LCD manufacturers worked hard, gradually perfecting their displays and lowering prices, and in 2003 TFT-LCDs finally outsold CRTs for the first time.

The advantages of TFT-LCD displays were obvious: they took up less desk space, consumed less power, and were much lighter than CRT screens.

But technology development is relentless, and LCD screens now face a challenge from organic light-emitting diode (OLED) monitors, which provide higher contrast and better viewing angles than LCDs.

Some have questioned the long-term viability of traditional computer monitors with the advent of virtual and augmented reality. Microsoft’s HoloLens, for example, utilises augmented reality to project a display into a real-world setting, such as your living room.

If developments like this become the norm, it may be that the days of the traditional computer monitor are numbered.
