Dawn of the digital information era

Successive waves of computing technology over the past 50 years have led to huge changes in business and social life. But the internet revolution is just beginning, writes Paul Taylor.

Thomas Watson, who founded one of the giants of the information technology world, could not have been more wrong. In 1946, the head of International Business Machines said: "I think there is a world market for maybe five computers." Today, half a century later, as we head towards 1bn people with access to the internet, the true scale of his miscalculation is apparent.

Computers, and the semiconductors that power them, have invaded almost every aspect of our lives and become the engine for perhaps the greatest changes since the industrial revolution - the dawn of a digital information era based upon the ones and zeros of computer binary code.

The last 50 years have seen at least three phases of computing, each building on, rather than replacing, the last.

These "waves" have included mainframes and departmental mini- computers, the PC era and client/server computing and, most recently, the emergence of the internet computing model built around the standards and technologies of the internet.

Each wave has enabled a shift in business processes: mainframes have automated complex tasks, personal computers have provided users with personal productivity tools and internet computing promises to deliver huge gains in productivity and efficiency, as well as the ability to access huge volumes of information.

The technological foundations for these changes began to be laid more than 350 years ago by Blaise Pascal, the French scientist who built the first adding machine, which used a series of interconnected cogs to add numbers. Almost 200 years later, in Britain, Charles Babbage, the "father of the computer", began designing the steam-powered analytical engine, which would have used punched cards for input and output and included a memory unit, had it ever been completed.

But the modern computer age was really ushered in by Alan Turing, who in 1937 conceived the concept of a "universal machine" able to execute any algorithm - a breakthrough which ultimately led to the building of the code-breaking Colossus machine by the British during the second world war.

In 1946, the Electronic Numerical Integrator and Calculator (ENIAC), a computer which contained 18,000 vacuum tubes, was built in the US. Two years later scientists at Manchester completed "Baby", the first stored-program machine, and ushered in the commercial computing era.

Since then, computer architecture has largely followed principles laid down by John von Neumann, a pioneer of computer science in the 1940s, who made significant contributions to the development of logical design and advocated the bit as a measurement of computer memory.

In 1964, IBM introduced the System/360, the first mainframe computer family, ushering in what has been called the first wave of computing.

From a business perspective, the mainframe era enabled companies to cut costs and improve efficiency by automating difficult and time-consuming processes.

Typically, the mainframe, based on proprietary technology developed by IBM or one of a handful of competitors, was housed in an air-conditioned room which became known as the "glasshouse" and was tended by white-coated technicians.

Data were input from "green screen" or "dumb" terminals hooked into the mainframe over a rudimentary network.

The mainframe provided a highly secure and usually reliable platform for corporate computing, but it had some serious drawbacks. In particular, its proprietary technology made it costly and the need to write custom-built programs for each application limited flexibility.

The next computing wave was led by the minicomputer makers, which built scaled-down mainframe machines dubbed departmental minis or mid-range systems. These still used proprietary technology, but provided much wider departmental access to their resources via desktop terminals.

Among the manufacturers leading this wave of computing were Digital Equipment, with its VAX range of machines, and Wang, which developed a widely used proprietary word-processing system.

A key factor driving down the cost of computing power over this period was a series of significant advances in the underlying technology, in particular semiconductors.

In 1947, scientists at Bell Telephone Laboratories in the US had invented the "transfer resistance" device, or "transistor", which would eventually provide computers with a reliability unachievable with vacuum tubes.

By the end of the 1950s, integrated circuits had arrived - a development that would enable millions of transistors to be etched onto a single silicon chip and collapse the price of computing power dramatically.

In 1971, Intel produced the 4004, launching a family of "processors on a chip", leading to the development of the 8080 8-bit microprocessor three years later and opening the door for the emergence of the first mass-produced personal computer, the Altair 8800.

The development of the personal computer and personal productivity software - the third wave of computing - was led by Apple Computer and
