Intro to Computer Systems

Chapter 1: Introduction

History of Computing

Computers are not a new invention; computational devices in one form or another have been with us for thousands of years. It was not until the harnessing of electricity, however, and in particular the invention of the integrated circuit, that computers became commonplace in our landscape.

Mechanical Computers

Long before electricity was harnessed, humankind crafted tools to help perform basic calculations. The most well-known of these early devices is the abacus, which dates as far back as approximately 2000 BC.

Modern wooden abacuses.

Abacuses can be used not only for simple counting, but also for more complex mathematical operations such as multiplication, division, and the extraction of roots.

One of the most elaborate of these mechanical computers is the Difference Engine, devised by Charles Babbage in 1822. This machine was able to tabulate polynomial functions automatically. Babbage followed up his design with the far more complex Analytical Engine in 1834, which was capable of conditional logic, a significant step towards a general-purpose computer.

Babbage's Difference Engine was a mechanical computer that could perform complicated mathematics.

Due to the manufacturing limitations of the time, Babbage was never able to build these machines himself, but in the decades that followed others produced working machines based on his Difference Engine design.
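
The Difference Engine worked by mechanising the method of finite differences: because the n-th differences of a degree-n polynomial are constant, an entire table of values can be cranked out using nothing but repeated addition. The short Python sketch below illustrates the idea; the example polynomial is one Babbage himself used in demonstrations, but the function names and structure are purely illustrative, not a model of the actual machine.

    # Method of finite differences: tabulate a polynomial using only addition.
    # For a degree-n polynomial the n-th differences are constant, so once the
    # machine is "seeded" with an initial column of differences, every further
    # value follows from additions alone -- exactly what gears can do.

    def tabulate(poly, start, count):
        """Return count values of the polynomial, from x = start upwards."""
        degree = len(poly) - 1

        def evaluate(x):  # direct evaluation, used only to seed the engine
            return sum(c * x**i for i, c in enumerate(poly))

        # Build the initial column of differences from degree+1 seed values.
        row = [evaluate(start + i) for i in range(degree + 1)]
        diffs = []
        while row:
            diffs.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]

        # Crank the engine: each new table entry needs only additions.
        values = []
        for _ in range(count):
            values.append(diffs[0])
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return values

    # x^2 + x + 41, the polynomial Babbage used to demonstrate his engine
    print(tabulate([41, 1, 1], start=0, count=5))  # [41, 43, 47, 53, 61]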

Valved Electrical Computers

The sheer complexity of a general-purpose logic machine was simply too much for a feasible mechanical design. Some success was had with electromechanical designs (which utilised electric relays and switches), but fast, practical Turing-complete computers had to wait for the development of vacuum tubes.

A "Turing-complete" machine (named after its discoverer, Alan Turing) is one that, directly or indirectly, has the ability to perform any sort of computational logic task.
The ENIAC computer, developed from 1943 for the U.S. Army. Photo: U.S. Army

The British defence and intelligence services developed the Colossus machine in 1944 to help decipher high-level encrypted German military messages, and the United States Army funded the development of the ENIAC machine between 1943 and 1946 for artillery calculations. For decades, these were regarded as the first general-purpose digital computers.

Konrad Zuse's Z3 computer pre-dated Colossus and ENIAC by a number of years.

However, it was later established that the German engineer Konrad Zuse had developed the Turing-complete Z3 computer back in 1941 for civil engineering tasks. Unlike Colossus and ENIAC, it was built from older-style electromagnetic relays and switches.

Integrated Circuits

Development of the transistor in 1947 by Bardeen, Brattain and Shockley of Bell Labs ushered in a new era of computation. Unlike relays and vacuum tubes, transistors were solid state -- they contained no moving parts or fragile filaments. This greatly increased their reliability, resistance to abuse, and energy efficiency compared to vacuum tubes. Computers were now only the size of whole desks, rather than entire rooms or gymnasiums!

The first ever integrated circuit. Photo: Texas Instruments

Further research into miniaturisation led to the development of the integrated circuit in 1958 by Jack Kilby of Texas Instruments. This led to the creation of the modern "microchip", which integrates many thousands (and now, hundreds of millions) of transistors onto a single compact package.

A modern processor chip alongside a noticeboard pin for scale. Photo: IBM

Mainframes and Supercomputers

Until the arrival of the personal computer in the 1970s, computers were incredibly expensive tools -- only within the reach of governments and large businesses. Their primary benefit to these customers was data processing: the ability to organise, calculate and process huge volumes of data. These systems are known as mainframes.

A small (!) IBM System/360 mainframe computer. Photo: IBM

First released in 1964 and still available today, IBM's System/360 family (now known as zSeries) is the most enduring mainframe system. The System/360 has a legendary reputation for the scalability and consistency of its architecture, attributes which epitomise the mainframe philosophy.

A system's scalability refers to its ability to be upgraded from very small to very large installations.

A system's consistency ensures that it can be maintained and upgraded while protecting the customer's existing investment: for example, System/360 software written in the 1960s can run without modification on a zSeries mainframe ordered today.

Supercomputers, on the other hand, are specialised systems which are designed to perform a particular type of calculation incredibly quickly. They are used mainly in the scientific and military sectors, for tasks such as weather prediction and code-breaking. Under this definition, the early World War II computer systems could be classified as "supercomputers".

The Cray-2 supercomputer. Photo: U.S. Army

The name Cray is synonymous with supercomputing; Seymour Cray was a computer architect who devoted his life to building supercomputers. The first Cray-1 was installed in 1976, and to this day the company he founded continues to develop supercomputer systems.

More modern, civilian examples include the Earth Simulator, a Japanese supercomputer used for weather prediction, and Deep Blue, a chess-playing supercomputer famous for defeating the reigning world champion, Garry Kasparov, in 1997.

The Personal Computing Revolution

Personal computing simply didn't exist until 1975, when Ed Roberts of MITS sold a hobbyist computer kit called the Altair 8800. The Altair, like other similar products of the time, was not really recognisable as a computer in the contemporary sense; it was a machine of switches and lights, strictly destined for hobbyist and enthusiast work benches.

The Altair 8800, the first "personal" computer. Photo: Tom Carlson

Personal computing in the contemporary sense started soon after, though, with the Apple I developed by Steve Jobs and Steve Wozniak in 1976. (To market their computer, they founded Apple Computer.) These machines came pre-built and allowed the use of a monitor and keyboard.

A complete Commodore 64 home computer system.

What followed was an explosion in home computing. Radio Shack, Commodore, Atari, and the BBC, among others, developed home computer platforms. The most successful of these was the Commodore 64, which sold a reported 22 million units in its lifetime -- more than any other single computer model in history.

Many of these systems lived on through the 1980s, but most eventually died out, leaving two market victors: IBM's Personal Computer (introduced in 1981) and Apple Computer's Macintosh (introduced in 1984).

The original IBM PC. Photo: IBM

The IBM PC was originally marketed to small-business managers, with spreadsheet tools as its compelling feature. The heart of IBM's PC -- its BIOS -- was soon reverse-engineered by Compaq Corporation, which paved the way for "IBM-compatible" clone systems to enter the market.

The cloning of the IBM PC and the open development of standards turned the architecture into a commodity market, in which several vendors offer equivalent and equally compatible components. This, along with IBM's reputation in the business field, helped the IBM PC platform become the dominant desktop computing system of today.

IBM no longer builds PCs; instead they have decided to concentrate on servers and services, which are more profitable to the company. (Their PC-manufacturing arm was sold to Chinese manufacturer Lenovo in 2004.)

The Macintosh Plus, combined with the LaserWriter, introduced the concept of desktop publishing. Photo: Apple Computer

The Macintosh, billed as a computer for "the rest of us", was the first commercially successful computer system with a mouse and graphical user interface. In 1985, Apple's LaserWriter desktop laser printer (along with newer, more powerful Macintosh models) created a new market for personal computers: desktop publishing. Apple continues to focus heavily on the creative sector, a focus which has earned it a significant market share in the fields of photo, print and film.

Pocket Computing

Before the early 1990s, little attention was paid to electronic personal organisers. They were often difficult and fiddly to use, and provided little in the way of compelling new features.

An original PalmPilot (left) with an Apple Newton (right). Photos: U.S. Robotics, Apple Computer

Starting with Apple's Newton in 1993, the concept of the personal digital assistant (PDA) took shape, but it took the 1996 introduction of the PalmPilot for the class of devices to become widely successful. In the following years the Palm platform (as well as Microsoft's Windows Mobile) grew to be the dominant force in the sector at the turn of the century.

Four modern smartphones (left to right): Apple iOS, Google Android, Microsoft Windows Phone 7, RIM BlackBerry. Photo: Gizmodo

It wasn't too long before these personal organiser devices merged with mobile telephones to create smartphones -- cellular phones which include much, if not all, of the functionality of a PDA. Smartphones, along with tablet computing, have become the "new frontier" of the consumer technology world. The market currently hosts a number of platforms: