Dawn of the digital information era

Successive waves of computing technology over the past 50 years have led to huge changes in business and social life. But the internet revolution is just beginning, writes Paul Taylor.

Thomas Watson, who founded one of the giants of the information technology world, could not have been more wrong. In 1946, the head of International Business Machines said: "I think there is a world market for maybe five computers." Today, half a century later, as we head towards 1bn people with access to the internet, the true scale of his miscalculation is apparent.

Computers, and the semiconductors that power them, have invaded almost every aspect of our lives and become the engine for perhaps the greatest changes since the industrial revolution - the dawn of a digital information era based upon the ones and zeros of computer binary code. The last 50 years have seen at least three phases of computing, each building on, rather than replacing, the last.

These waves have included mainframes and departmental mini-computers, the PC era and client-server computing and, most recently, the emergence of the internet computing model built around the standards and technologies of the internet.

Each wave has enabled a shift in business processes: mainframes have automated complex tasks, personal computers have provided users with personal productivity tools, and internet computing promises to deliver huge gains in productivity and efficiency, as well as the ability to access huge volumes of information.

The technological foundations for these changes began to be laid more than 350 years ago by Blaise Pascal, the French scientist who built the first adding machine, which used a series of interconnected cogs to add numbers.
Almost 200 years later, in Britain, Charles Babbage, the "father of the computer", began designing the steam-powered analytical engine, which would have used punched cards for input and output and included a memory unit, had it ever been completed.

But the modern computer age was really ushered in by Alan Turing, who in 1937 conceived the concept of a universal machine able to execute any algorithm - a breakthrough which ultimately led to the building of the code-breaking Colossus machine by the British during the second world war.

In 1946, the Electronic Numerical Integrator and Calculator (ENIAC) computer, which contained 18,000 vacuum tubes, was built in the US. Two years later, scientists at Manchester completed "Baby", the first stored-program machine, and ushered in the commercial computing era.

Since then, computer architecture has largely followed principles laid down by John von Neumann, a pioneer of computer science in the 1940s who made significant contributions to the development of logical design and advocated the bit as a measurement of computer memory.

In 1964, IBM introduced the System/360, the first mainframe computer family, and ushered in what has been called the first wave of computing.

From a business perspective, the mainframe era enabled companies to cut costs and improve efficiency by automating difficult and time-consuming processes.

Typically, the mainframe, based on proprietary technology developed by IBM or one of a handful of competitors, was housed in an air-conditioned room which became known as the "glasshouse" and was tended by white-coated technicians.

Data were input from "green screen" or dumb terminals hooked into the mainframe over a rudimentary network.

The mainframe provided a highly secure and usually reliable platform for corporate computing, but it had some serious drawbacks.

In particular, its proprietary technology made it costly, and the need to write custom-built programs for each application limited flexibility.

The next computing wave was led by the minicomputer makers, which built scaled-down mainframe machines dubbed departmental minis or mid-range systems.

These still used proprietary technology, but provided much wider departmental access to their resources via desktop terminals.

Among the manufacturers leading this wave of computing were Digital Equipment, with its Vax range of machines, and Wang, which developed a widely used proprietary word-processing system.

A key factor driving down the cost of computing power over this period was significant advances in the underlying technology and, in particular, semiconductors.

In 1947, scientists at Bell telephone laboratories in the US had invented the "transfer resistance" device, or transistor, which would eventually provide computers with a reliability unachievable with vacuum tubes.

By the end of the 1950s, integrated circuits had arrived - a development that would enable millions of transistors to be etched onto a single silicon chip and collapse the price of computing power dramatically.

In 1971, Intel produced the 4004, launching a family of "processors on a chip" and leading to the development of the 8080 8-bit microprocessor three years later, opening the door for the emergence of the first mass-produced personal computer, the Altair 8800.

The development of the personal computer and personal productivity software - the third wave of computing - was led by Apple Computer and IBM, in conjunction with Microsoft, which provided IBM with the operating system for the first IBM PC in 1981.

This year, an estimated 108m PCs will be sold worldwide, including a growing number of sub-$500 machines which are expanding the penetration of PCs into households which previously could not afford them.

Sometimes, however, software development has not kept pace. As Robert Cringely, the Silicon Valley technology guru, notes: "If the automobile had followed the same development as the computer, a Rolls-Royce would today cost $100, get a million miles per gallon and explode once a year, killing everyone inside."

Nevertheless, for businesses, the arrival of desktop PCs, built around relatively low-cost standard components, put real computing power into the hands of end-users for the first time. Individual users could create, manipulate and control their own data, and were freed from the constraints of dealing with a big IT department.

However, the limitations of desktop PCs as "islands of computing power" also quickly became apparent.

In particular, people discovered they needed to hook their machines together with local area networks to share data and peripherals, as well as to exchange messages.

By the start of the 1990s, a new corporate computer architecture called client-server computing had emerged, built around desktop PCs and more powerful servers linked together by a local area network.

Over the past few years, however, there has been growing dissatisfaction, particularly among big corporate PC users, with the client-server model, mainly because of its complexity and high cost of lifetime ownership.

As a result, there has been a pronounced swing back towards a centralised computing model in the past few years, accelerated by the growth of the internet.

The internet has its origins in the 1970s and work undertaken by Vinton Cerf and others to design systems that would enable research and academic institutions working on military projects to co-operate.

This led to the development of the Ethernet standard and TCP/IP, the basic internet protocol. It also led Bob Metcalfe to promulgate Metcalfe's Law, which states that the value of a network is proportional to the square of the number of nodes attached to it.

But arguably, it was not until the mid-1990s and the commercialisation of the internet that the true value of internetworking became apparent.
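The square relationship in Metcalfe's Law is easy to see with a little arithmetic. The sketch below is purely illustrative (the function name and the proportionality constant are not from the article); it shows why doubling a network's membership quadruples its value:

```python
def metcalfe_value(nodes: int, k: float = 1.0) -> float:
    """Metcalfe's Law: network value is proportional to the square
    of the number of attached nodes (value = k * n**2).
    The constant k is arbitrary and purely illustrative."""
    return k * nodes ** 2

# Doubling the nodes quadruples the value:
small = metcalfe_value(50)    # 2,500 units
large = metcalfe_value(100)   # 10,000 units
print(large / small)          # prints 4.0
```

This compounding is why, as the article notes, the value of internetworking only became obvious once the network reached mass scale.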

The growth of the internet, and the world wide web in particular, since then has been astonishing. With the help of tools like web browsers, the internet was transformed in just four years from an arcane system linking mostly academic institutions into a global transport system with 50m users.

Today, that figure has swollen to about 160m, and estimates for the electronic commerce that it enables are pushed up almost weekly.

According to the latest Goldman Sachs internet report, the business-to-business e-commerce market alone will grow to $1,500bn in 2004, up from $114bn this year and virtually nothing two years ago.

Two inter-related technologies have been driving these changes: semiconductors and network communications.

For more than 25 years, semiconductor development has broadly followed the dictum of Moore's Law, laid down by Gordon Moore, co-founder of Intel. This states that the capacity of semiconductor chips will double every 18 months or, expressed a different way, that the price of computing power will halve every 18 months.
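The compounding behind that dictum is striking. A short sketch (illustrative arithmetic only; the function names are hypothetical) shows how doubling every 18 months plays out over a decade:

```python
def capacity_multiple(months: float) -> float:
    """Moore's Law as stated above: chip capacity doubles every
    18 months, so it grows by 2 ** (months / 18) over a period."""
    return 2 ** (months / 18)

def price_multiple(months: float) -> float:
    """The equivalent statement: the price of computing power
    halves every 18 months."""
    return 0.5 ** (months / 18)

print(capacity_multiple(18))   # prints 2.0  (one doubling)
print(capacity_multiple(120))  # roughly a hundredfold over a decade
print(price_multiple(36))      # prints 0.25 (two halvings)
```

The roughly hundredfold gain per decade is what carried computing from the 4004 era to today's processors.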

Moore's Law is expected to hold true for at least another decade, but around 2012 scientists believe semiconductor designers will run into physical, atomic roadblocks as they continue to shrink the size of the components and lines etched onto silicon chips.

At that stage, somecomputer scientists believe it will be necessary to look for alternatives tosilicon-based computing.

Research into new materials and computer architectures is mostly focusing on the potential of quantum computing. Meanwhile, the deadline keeps being pushed back by improvements to existing processes.

At the same time, there have been big leaps in communications technologies and, in particular, fibre optics and IP-based systems.

Today, one strand of Qwest's US network can carry all North America's telecoms traffic, and in a few years the same strand of glass fibre will be able to carry all the world's network traffic. "We are going to have so much bandwidth, we are not going to know what to do with it," says John Patrick, vice-president of internet technology at IBM. "I am very optimistic about the future."

He believes this telecoms capacity will enable the creation of a wide range of new internet-based services, including digital video, distributed storage and medical systems. But he cautions: "The evolution of the internet is based upon technical things, but in the end it is not about technology itself, it is about what the technology can enable."