
Early Computing Machines and Inventors


The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer. The device allows users to make computations using a system of sliding beads arranged on a rack. Early merchants used the abacus to record trading transactions. But as the use of paper and pencil spread, particularly in Europe, the abacus lost its importance. It took nearly 12 centuries, however, for the next significant advance in computing devices to emerge. In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the ten's column - one place. When the ten's dial moved one revolution, the dial representing the hundred's place moved one notch, and so on. The drawback to the Pascaline, of course, was its limitation to addition.
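
The carry idea described above can be illustrated with a short sketch. This is only a toy model of the base-ten carry, not a description of Pascal's actual mechanism; the function name is invented for the example.

# Toy model of the Pascaline's base-ten carry: each dial holds a digit 0-9,
# and a full revolution of one dial advances the next dial by one place.

def add_on_dials(dials, amount, position=0):
    # `dials` is a list of digits, least significant dial first.
    while amount and position < len(dials):
        total = dials[position] + amount
        dials[position] = total % 10     # the dial's new reading
        amount = total // 10             # complete revolutions carry to the next dial
        position += 1
    return dials

# Adding 7 to a machine reading 995 (least significant dial first):
print(add_on_dials([5, 9, 9, 0, 0, 0, 0, 0], 7))   # -> [2, 0, 0, 1, 0, 0, 0, 0], i.e. 1002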

In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Partly by studying Pascal's original notes and drawings, Leibniz was able to refine his machine. The centerpiece of the machine was its stepped-drum gear design, which offered an elongated version of the simple flat gear. It wasn't until 1820, however, that mechanical calculators gained widespread use. Charles Xavier Thomas de Colmar, a Frenchman, invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, presented a more practical approach to computing because it could add, subtract, multiply and divide. With its enhanced versatility, the arithmometer was widely used up until the First World War. Although later inventors refined Colmar's calculator, together with fellow inventors Pascal and Leibniz, he helped define the age of mechanical computation.

The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage (1791-1871). Frustrated at the many errors he found while examining calculations for the Royal Astronomical Society, Babbage declared, "I wish to God these calculations had been performed by steam!" With those words, the automation of computers had begun. By 1812, Babbage noticed a natural harmony between machines and mathematics: machines were best at performing tasks repeatedly without mistake, while mathematics, particularly the production of mathematical tables, often required the simple repetition of steps. The problem centered on applying the ability of machines to the needs of mathematics. Babbage's first attempt at solving this problem was in 1822, when he proposed a machine to perform differential equations, called a Difference Engine. Powered by steam and as large as a locomotive, the machine would have a stored program and could perform calculations and print the results automatically. After working on the Difference Engine for 10 years, Babbage was suddenly inspired to begin work on the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King, Countess of Lovelace (1815-1852) and daughter of English poet Lord Byron, was instrumental in the machine's design. One of the few people who understood the Engine's design as well as Babbage, she helped revise plans, secure funding from the British government, and communicate the specifics of the Analytical Engine to the public. Also, Lady Lovelace's fine understanding of the machine allowed her to create the instruction routines to be fed into the computer, making her the first female computer programmer. In the 1980's, the U.S. Defense Department named a programming language ADA in her honor.

Babbage's steam-powered Engine, although ultimately never constructed, may seem primitive by today's standards. However, it outlined the basic elements of a modern general-purpose computer and was a breakthrough concept. Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions and a "store" for memory of 1,000 numbers of up to 50 decimal digits long. It also contained a "mill" with a control unit that allowed processing instructions in any sequence, and output devices to produce printed results. Babbage borrowed the idea of punch cards to encode the machine's instructions from the Jacquard loom. The loom, produced in 1820 and named after its inventor, Joseph-Marie Jacquard, used punched boards that controlled the patterns to be woven.

In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count, and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data, which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data and they helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding Tabulating Machine Company in 1896, later to become International Business Machines (IBM) in 1924 after a series of mergers. Other companies such as Remington Rand and Burroughs also manufactured punch card readers for business use. Both business and government used punch cards for data processing until the 1960's.

In the ensuing years, several engineers made other significant advances. Vannevar Bush (1890-1974) developed a calculator for solving differential equations in 1931. The machine could solve complex differential equations that had long left scientists and mathematicians baffled. The machine was cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other. To eliminate this bulkiness, John V. Atanasoff (b. 1903), a professor at Iowa State College (now called Iowa State University), and his graduate student Clifford Berry envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. This approach was based on the mid-19th century work of George Boole (1815-1864), who clarified the binary system of algebra, which stated that any mathematical equation could be stated simply as either true or false. By extending this concept to electronic circuits in the form of on or off, Atanasoff and Berry had developed the first all-electronic computer by 1940. Their project, however, lost its funding, and their work was overshadowed by similar developments by other scientists.
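
The on/off view of Boole's true/false algebra can be made concrete with a few lines of code. This is a present-day sketch of the idea only, not Atanasoff and Berry's circuitry; the gate functions are written out purely for illustration.

# Boolean algebra treats every value as true ("on") or false ("off");
# electronic circuits realise the same algebra with switching elements.

ON, OFF = True, False

def AND(a, b):   # output is on only if both inputs are on
    return a and b

def OR(a, b):    # output is on if at least one input is on
    return a or b

def NOT(a):      # inverter: on becomes off and vice versa
    return not a

def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    # Adds two one-bit values and returns (carry, sum).
    return AND(a, b), XOR(a, b)

print(half_adder(ON, ON))   # (True, False): binary 10, i.e. 1 + 1 = 2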

 

 

Five Generations of Modern Computers

First Generation (1945-1956)

 

With the onset of the Second World War, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects hastened technical progress. By 1941 German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers. In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war.

American efforts produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electromechanical calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer. It used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change), but it could perform basic arithmetic as well as more complex equations.

Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia. Developed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC, unlike the Colossus and Mark I, was a general-purpose computer that computed at speeds 1,000 times faster than Mark I.

In the mid-1940's John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years. Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program as well as data. This "stored memory" technique, together with the "conditional control transfer" that allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances. Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election, Dwight D. Eisenhower.
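
The stored-program idea behind the von Neumann design can be sketched in a few lines: instructions and data sit in one memory, and a single control loop (the CPU) fetches and executes them in sequence. The instruction names and memory layout below are invented for illustration and do not reproduce EDVAC's actual instruction set.

# Toy stored-program machine: one memory holds both instructions and data.

memory = [
    ("LOAD", 6),     # copy the value at address 6 into the accumulator
    ("ADD", 7),      # add the value at address 7
    ("STORE", 8),    # write the accumulator back to address 8
    ("HALT", None),
    0, 0,            # unused cells
    2, 3,            # data at addresses 6 and 7
    0,               # address 8: the result will be written here
]

def run(memory):
    accumulator, pc = 0, 0              # pc = program counter
    while True:
        op, addr = memory[pc]           # fetch the next instruction
        pc += 1
        if op == "LOAD":
            accumulator = memory[addr]
        elif op == "ADD":
            accumulator += memory[addr]
        elif op == "STORE":
            memory[addr] = accumulator
        elif op == "HALT":
            return memory

print(run(memory)[8])   # -> 5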

First generation computers were characterized by the fact that operating instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program called a machine language that told it how to operate. This made the computer difficult to program and limited its versatility and speed. Other distinctive features of first generation computers were the use of vacuum tubes (responsible for their breathtaking size) and magnetic drums for data storage.

 

Second Generation Computers (1956-1963)

 

By 1948, the invention of the transistor greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers. As a result, the size of electronic machinery has been shrinking ever since. The transistor was at work in the computer by 1956. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors. The first large-scale machines to take advantage of this transistor technology were early supercomputers: Stretch by IBM and LARC by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly, however, and tended to be too powerful for the business sector's computing needs, thereby limiting their attractiveness. Only two LARCs were ever installed: one in the Lawrence Radiation Labs in Livermore, California, for which the computer was named (Livermore Atomic Research Computer), and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.

Throughout the early 1960's, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were also of solid state design, and contained transistors in place of vacuum tubes. They also contained all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401, which was universally accepted throughout industry, and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second generation computers.

It was the stored program and programming language that gave computers the flexibility to finally be cost effective and productive for business use. The stored program concept meant that the instructions to run a computer for a specific function (known as a program) were held inside the computer's memory, and could quickly be replaced by a different set of instructions for a different function. A computer could print customer invoices and minutes later design products or calculate paychecks. More sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) came into common use during this time, and remain in use to the current day. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer. New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second generation computers.

 

Third Generation Computers (1964-1971)

 

Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts. The quartz rock eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc, which was made from quartz. Scientists later managed to fit even more components on a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto the chip. Another third-generation development included the use of an operating system that allowed machines to run many different programs at once with a central program that monitored and coordinated the computer's memory.

 

Fourth Generation (1971-Present)

 

After the integrated circuit, the only place to go was down - in size, that is. Large scale integration (LSI) could fit hundreds of components onto one chip. By the 1980's, very large scale integration (VLSI) squeezed hundreds of thousands of components onto a chip. Ultra-large scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers. It also increased their power, efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip. Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.

Such condensed power allowed everyday people to harness a computer's power. Computers were no longer developed exclusively for large business or government contracts. By the mid-1970's, computer manufacturers sought to bring computers to general consumers. These minicomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack and Apple Computers. In the early 1980's, arcade video games such as Pac Man and home video game systems such as the Atari 2600 ignited consumer interest in more sophisticated, programmable home computers.

In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools. The 1980's saw an expansion in computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers (which could fit inside a briefcase) to palmtop computers (able to fit inside a breast pocket). In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984. Notable for its user-friendly design, the Macintosh offered an operating system that allowed users to move screen icons instead of typing instructions. Users controlled the screen cursor using a mouse, a device that mimicked the movement of one's hand on the computer screen.

As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software and information, and to communicate with each other. As opposed to a mainframe computer, which was one powerful computer that shared time with many terminals for many applications, networked computers allowed individual computers to form electronic co-ops. Using either direct wiring, called a Local Area Network (LAN), or telephone lines, these networks could reach enormous proportions. A global web of computer circuitry, the Internet, for example, links computers worldwide into a single network of information. During the 1992 U.S. presidential election, vice-presidential candidate Al Gore promised to make the development of this so-called "information superhighway" an administrative priority. Though the possibilities envisioned by Gore and others for such a large network are often years (if not decades) away from realization, the most popular use today for computer networks such as the Internet is electronic mail, or E-mail, which allows users to type in a computer address and send messages through networked terminals across the office or across the world.
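
At the level of a single message, sending text across a network can be sketched with a few lines of standard socket code. The host name and port below are invented for illustration; any reachable server listening on that port would do.

# Minimal sketch of sending a text message over a network connection.

import socket

def send_message(host, port, text):
    with socket.create_connection((host, port)) as conn:
        conn.sendall(text.encode("utf-8"))

# Example call (hypothetical host and port):
# send_message("mail.example.org", 2525, "Hello from across the office - or the world.")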

 

Fifth Generation (Present and Beyond)

 

Defining the fifth generation of computers is somewhat difficult because the field is in its infancy. The most famous example of a fifth generation computer is the fictional HAL9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most humans on board.)

Though the wayward HAL9000 may be far from the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers may be able to accept spoken word instructions and imitate human reasoning. The ability to translate a foreign language is also a major goal of fifth generation computers. This feat seemed a simple objective at first, but appeared much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.

Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. One such engineering advance is parallel processing, which replaces von Neumann's single central processing unit design with a system harnessing the power of many CPUs to work as one. Another advance is superconductor technology, which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow. Computers today have some attributes of fifth generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development before expert systems are in widespread use.
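
The parallel-processing idea mentioned above - many CPUs sharing one job instead of a single central processing unit - can be illustrated with ordinary Python multiprocessing. This is only a sketch of the concept, not a fifth-generation design.

# Split one computation across several CPUs instead of a single processor.

from multiprocessing import Pool

def heavy_calculation(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [100_000, 200_000, 300_000, 400_000]
    with Pool() as pool:                     # by default, one worker per available CPU
        results = pool.map(heavy_calculation, tasks)
    print(results)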

 

          Exercise:

 

1. Look through the whole text.

2. Name the early computing machines and inventors, and write them down in your notebook.

3. Point out the main idea of each paragraph connected with the five generations of modern computers.

4. Write out the most important terms used in the text with their Russian equivalents and try to remember them.

5. Compile a list of computing terms.

 

Text 2

The history, development and classification of robots

 

The concept of robotics, although not referred to by that term until relatively recently, has captured man's imagination for centuries. One of the first automatic animals - a wooden bird that could fly - was built by Plato's friend Archytas of Tarentum, who lived between 400 and 350 B. C. In the second century B. C., Hero of Alexandria described in his book, De Automatis, a mechanical theater with robot-like figures that danced and marched in temple ceremonies.

The precursors of programmable robots are classified as automata, in contrast to toys, because of the length and complexity of their operating cycles. Two examples of automata are shown in Figure 3-1.

A French engineer, Jacques de Vaucanson (1709-1782), was elected to the Academie des Sciences for his work, which included the creation of a life-size, flute-playing shepherd. Pierre and Henri Jacquet-Droz constructed life-like automata driven by springs (Spilhaus, 1932).

In 1921, Karel Capek, the Czech playwright, novelist and essayist, wrote the satirical drama R.U.R. (Rossum's Universal Robots), which introduced the word "robot" into the English language. The playwright coined the word to mean forced labor; the machines in his play resembled people, but worked twice as hard. Capek pictured robots as machine-like human lookalikes, with arms and legs and personalities. The fact that this image still prevails today is illustrated by the character C3PO from the 1977 movie Star Wars, although the industrial robots in today's factories look nothing like humans. The Germans were the first to put robots on the screen, in a 1926 movie called Metropolis. In 1939, Elektro, a walking robot, and his dog Sparko were displayed at the New York World's Fair. In the same year science fiction writer Isaac Asimov started writing stories about robots. Asimov's stories fired the imagination of a Columbia University physics student named Joseph F. Engelberger. In 1956 Engelberger had a conversation with George C. Devol, the inventor of something he called a programmed articulated transfer device. By the time Devol's patent application was granted in 1961, Engelberger had started Unimation Inc., which bought the rights and built developmental versions of Devol's device, now called a robot.

In the early 1960's, George Devol and Joseph Engelberger introduced the first industrial robot through Unimation, Inc. (Froehlich, 1981). The idea was to build a machine that was flexible enough to do a variety of jobs automatically and could be easily taught or programmed, so that if the part or process changed, the robot could adapt to its new job without the expensive retooling required by hard automation. It was this mating of a computer to a flexible manipulator that has opened the door to new methods of manufacturing. Dr. James Albus, in a recent book on the effects of computers and robots, wrote:

The human race is now poised on the brink of a new industrial revolution which will at least equal, if not far exceed, the first industrial revolution in its impact on mankind. The first industrial revolution was based on the substitution of mechanical energy for muscle power. The next industrial revolution will be based on the substitution of electronic computers for the human brain in the control of machines and industrial processes. ("Automatic Factory: Identify Its Fingerprints" 1981).

 

Table 3-1. A Chronology of Developments in Robotics

 

1770 Pierre and Henri Jacquet-Droz construct life-like automata that can write, draw, and play musical instruments and are controlled by cams and driven by springs.

1801  A programmable loom is designed by Joseph Jacquard in France.

1946 George Devol develops the magnetic controller which is a playback device.

J. Presper Eckert and John Mauchly build the ENIAC computer at the University of Pennsylvania.

Bullard Company develops and sells the MAN-AU-TROL automatic machine control system.

1952 The first numerically controlled machine is built at MIT.

1953 George Devol develops the first programmable robot.

1962 General Motors installs its first robot from Unimation.

1968 An intelligent mobile robot, Shakey, is built at SRI.

1973  Richard Hohn develops T3, the first commercially available robot from Cincinnati Milacron.

1976  NASA's Viking 1 and 2 landers perform on Mars with their sample-collecting arms.

1978  The first PUMA arm is shipped to General Motors by Unimation.

1982 Unimation is acquired by Westinghouse.

 

A chronology of developments in robotics is given in Table 3-1.

The use of computers, sensors, and mechanical actuators or manipulators as a coordinated system utilized in manufacturing systems has been a subject of study and application for several decades. Manually controlled manipulators for space systems and for nuclear fuel control have been designed and implemented for over 25 years. Robots combine computer intelligence, modern sensors, and manipulator arms to provide flexible devices that can economically increase the productivity of manufacturing processes.

 

Definitions of robots

 

The Electric Machinery Law of Japan defines an industrial robot as an all-purpose machine equipped with a memory device and a terminal device (for holding things), capable of rotation and of replacing human labor by automatic performance of movements. Japan classifies industrial robots by the method of input information and teaching as follows:

1. Manual manipulator - a manipulator that is worked by an operator.

2. Fixed sequence robot - a manipulator which repetitively performs successive steps of a given operation according to a predetermined sequence, condition and position, and whose set information can be easily changed (a brief sketch of such a replayed sequence follows this list).

3. NC (numerical control) robot - a manipulator that can perform a given task according to the sequence, conditions and position, as commanded via numerical data. The software used for these robots includes punched tapes, cards, and digital switches. This robot has the same control mode as an NC machine.

4. Intelligent robot - this robot uses sensory perception (visual and/or tactile) to independently detect changes in the work environment or work condition and, by its own decision-making faculty, proceed with its operation accordingly.
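
A brief sketch of what the "fixed sequence" category above means in practice: the controller simply replays a predetermined list of target positions. The joint names and angles below are invented for illustration and do not describe any real manipulator.

# Toy illustration of a fixed-sequence robot: the controller replays a
# predetermined list of target positions, step by step.

program = [
    {"shoulder": 30, "elbow": 45, "gripper": "open"},
    {"shoulder": 30, "elbow": 90, "gripper": "closed"},   # pick up the part
    {"shoulder": 75, "elbow": 60, "gripper": "closed"},   # move it
    {"shoulder": 75, "elbow": 60, "gripper": "open"},     # release it
]

def run_cycle(program, move_to):
    for step in program:
        move_to(step)          # the "set information" is changed simply by editing `program`

# In place of a real manipulator, just print each commanded position:
run_cycle(program, move_to=print)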

 

 

Exercises:

 

1. Read the text; be sure you understand every item of it.

2. Divide the text into five main parts and name them.

3. Write down the names of the people who took an active part in creating robots.

4. Make up a plan for the text.

5. Write a summary of the text.

 

TEXT 3

A brief history of the Internet

 

The Internet has revolutionized the computer and communications world like nothing before. The invention of the telegraph, telephone, radio, and computer set the stage for this unprecedented integration of capability, a mechanism for information dissemination, and a medium for collaboration and interaction between individuals and their computers without regard for geographic location.

The Internet represents one of the most successful examples of the benefits of sustained investment and commitment to research and development of information infrastructure. Beginning with the early research in packet switching, the government, industry and academia have been partners in evolving and deploying this exciting new technology. Today, terms like "leiner@mcc.com" and "http://www.acm.org" trip lightly off the tongue of the random person on the street.

This is intended to be a brief, necessarily cursory and incomplete history. Much material currently exists about the Internet, covering history, technology, and usage. A trip to almost any book store will find shelves of material written about the Internet.

In this paper, several of us involved in the development and evolution of the Internet share our views of its origins and history. This history revolves around four distinct aspects. There is the technological evolution that began with early research on packet switching and the ARPANET (and related technologies), and where current research continues to expand the horizons of the infrastructure along several dimensions such as scale, performance, and higher level functionality. There is the operations and management aspect of a global and complex operational infrastructure. There is the social aspect, which resulted in a broad community of Internauts working together to create and evolve the technology. And there is the commercialization aspect, resulting in an extremely effective transition of research results into a broadly deployed and available information infrastructure.

The Internet today is a widespread information infrastructure, the initial prototype of what is often called the National (or Global or Galactic) Information Infrastructure. Its history is complex and involves many aspects - technological, organizational, and community. And its influence reaches not only to the technical fields of computer communications but throughout society as we move toward increasing use of online tools to accomplish electronic commerce, information acquisition and community operations.

Origins of the Internet

 

The first recorded description of the social interactions that could be enabled through networking was a series of memos written by J. C. R. Licklider of MIT in August 1962 discussing his "Galactic Network" concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA, starting in October 1962. While at DARPA he convinced his successors at DARPA, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.

Leonard Kleinrock at MIT published the first paper on packet switching theory in July 1961 and the first book on the subject in 1964. Kleinrock convinced Roberts of the theoretical feasibility of communications using packets rather than circuits, which was a major step along the path towards computer networking. The other key step was to make the computers talk together. To explore this, in 1965, working with Thomas Merrill, Roberts connected the TX-2 computer in Massachusetts to the Q-32 in California with a low speed dial-up telephone line, creating the first (however small) wide-area computer network ever built. The result of this experiment was the realization that the time-shared computers could work well together, running programs and retrieving data as necessary on the remote machine, but that the circuit switched telephone system was totally inadequate for the job. Kleinrock's conviction of the need for packet switching was confirmed.
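
The packet idea that Kleinrock argued for - and that the circuit-switched telephone line of the 1965 experiment lacked - can be sketched briefly: a message is cut into small, numbered, addressed pieces that travel independently and are reassembled at the destination. The field names and the sample message below are invented for illustration.

# Packet switching in miniature.

def packetize(message, destination, size=8):
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dst": destination, "seq": n, "data": chunk}
            for n, chunk in enumerate(chunks)]

def reassemble(packets):
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("LOGIN REQUEST FROM TX-2", destination="Q-32")
print(packets[0])            # {'dst': 'Q-32', 'seq': 0, 'data': 'LOGIN RE'}
print(reassemble(packets))   # the original message, even if packets arrive out of order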

 

Exercises:

 

1. Read the text.

2. Divide the given text into main parts.

3. Write down the historical chronology of the Internet.

4. What is Internet today?

5. What aspect does it involve?

6. To what type of invention can the Internet be referred?

 

Text 4

Computer assisted language learning

 

Perhaps not surprisingly, the world of language teaching has now well and truly entered the computer age. Normal word-processing programs are being used to help develop learners' writing skills. Computer simulations have also proved their worth in providing the basis for extensive and natural language practice in relation to real-life situations of various kinds. However, the most welcome development so far seems to be so-called "authoring programs", which enable teachers to produce exercises of various kinds without any knowledge of programming or other computer skills being necessary. If a teacher can type (even with only two fingers), he or she can learn to use the programs in a very short time. A series of eight such programs is now being published for IBM and IBM-compatible personal computers by the Foundation for European Language and Educational Centres (EUROCENTRES).

 

Infinite patience

 

The great attraction of authoring programs is that they enable teachers to turn out materials of various kinds (depending on the nature of the individual programs) which look thoroughly professional and which are targeted on the needs of their own learners. These types of programs concentrate on what computers can help with most - developing learners' accuracy. And computers have infinite patience; unlike teachers, they don't feel like exploding when a learner makes the same mistake yet again!

 

Autonomous learning

 

Because learners generally find it fun to work with computers, motivation in connection with language learning is rarely a problem when these programs are in use. Admittedly, it can be difficult to get the students away from the computer once they start, but that is hardly something for teachers to complain about. These programs come into their own especially when a school has computer facilities which students can use outside class time. The setting of homework assignments in such circumstances is made easy for the teacher, and the work really does get done. In a company setting, people can use their personal computer to help them maintain their language skills as a break from typing letters or bringing the accounts up to date. And with the gradual spread of personal computers into the home, people can work at their leisure on different kinds of exercises. The day of the autonomous learner has really arrived, though that does not mean working alone. In fact, the success and fun derived from working on a computer exercise is even greater when there are one or two friends or colleagues to work with in a small group.

 

"Ready-made" materials, too

 

Some hard-pressed language teachers may well be saying, "Yes, but how will I ever find the time to write the material, no matter how easy it might be?" Bearing in mind both this point, and the fact that many learners will want to be able to buy computer-based language-learning exercises for themselves, Eurocentres will also be offering ready-made materials prepared for their series of authoring programs. Materials will be available for English from early 1989, and for French, German, Italian and Spanish some time later that year.

 

 

Ready now

 

At present, the first two programs of the series, Storyboard and Choicemaster, are available in all the languages mentioned. Storyboard is a text reconstruction program; Choicemaster is a multiple-choice program with special features which help users to learn from their mistakes. The whole series of programs will be published by 1999. For further details you can contact John Arnold, Eurocentres Learning Service, Seestrasse 247, 8038 Zurich, Switzerland.

 

Exercises:

 

1. Look through the text.

2. Give the translation of the titles.

3. Make a summary of the text.

4. Express the main idea of the article by one sentence.

 

Text 5

A chip off the old block

 

The robots which weld and paint in a typical car assembly line of today are extremely stupid; they only do precisely what they are programmed to do, always assuming that the same car will arrive in precisely the same position to be worked on. In Japan recently a man was killed by one of these unthinking, unseeing robots. Similarly, today's computers are extremely stupid, as anyone who has been sent a telephone bill for 0.00 will testify.

Workers in Artificial Intelligence dream of the day when intelligent robots will be able to see and feel each workpiece, and adapt their movements accordingly, when we will simply tell computers what to do, without telling them exactly how to do it in slavish detail; and when we will talk to computers in English or Italian rather than some stylised computer language such as COBOL or BASIC. Until recently these were just pipe dreams. However, twenty years' steady progress in computer hardware and software has brought them much closer to reality, and has produced some striking demonstrations of intelligent computers.

Artificial intelligence (known as AI) is the attempt to give computers some of the flexibility and creativity which people display when solving problems or when conversing with one another. This attempt has produced a wide range of different styles of computing - some motivated by psychology, some by mathematics and logic - with different degrees of success. In the early days, researchers tried to find powerful general methods of inference which would solve all problems; this effort failed and the field was discredited for several years, especially in Europe. In Britain, the Lighthill Report on Artificial Intelligence research expressed this disillusionment with the field and effectively cut off research funding for several years.

Meanwhile in the United States a different approach developed, which concentrated not on powerful general methods of deduction, but on endowing computers with much of the practical problem-solving knowledge of an expert. These developments, called "expert systems" or "knowledge based systems" were more successful, and are the mainstream of current AI. There are now Artificial Intelligence programs which can play chess almost to international level, which have played and beaten the world champion at backgammon, which have discovered new theorems in pure mathematics and important deposits of minerals underground, which can diagnose certain diseases as well as the best medical experts, which can understand and answer questions in everyday English. These successes have triggered an explosion of interest in applied artificial intelligence; governments are funding grandiose research programs, and many companies are building "Expert systems" to help them plan or invest or manufacture.
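
A minimal sketch can show what "encoding an expert's practical knowledge" looks like in code: a handful of explicit if-then rules rather than a general deduction procedure. The rules and symptoms below are invented purely for illustration and are not a real medical system.

# A tiny rule-based "expert system": knowledge is stored as explicit rules.

RULES = [
    ({"fever", "cough", "aches"}, "suspect influenza"),
    ({"sneezing", "runny nose"}, "suspect a common cold"),
    ({"fever", "stiff neck"}, "refer to a specialist urgently"),
]

def diagnose(symptoms):
    symptoms = set(symptoms)
    conclusions = [advice for condition, advice in RULES if condition <= symptoms]
    return conclusions or ["no rule applies - gather more information"]

print(diagnose(["fever", "cough", "aches", "headache"]))   # -> ['suspect influenza']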

Artificial Intelligence has also raised a host of philosophical and psychological questions which will be debated for years to come. If these computers can mimic so many of our highest intellectual achievements, do they also mimic our internal mental processes? Are they in fact the best model we have for understanding our own psychology? Might they even be conscious in the same way as we are, and should we already be considering their rights as individuals? There are some who even believe that computers are somehow conscious today, and philosophers who think they can prove that computers will never be conscious. To most workers in the field, it does not matter.

There are good reasons to believe, however, that the term "Artificial Intelligence" is premature as a description of these computer programs, and that their functioning is very different from the functioning of our minds. Consider the basic paradox of Artificial Intelligence - that while we can successfully mimic expert performance in rarified fields such as chess and mathematics, we cannot even approach the skill of a three-year-old child in learning a language, recognising a face or running across a room. The successes of Artificial Intelligence have all been in narrow and well defined domains; broad general knowledge and common sense are well beyond our grasp. We have only just taken the first faltering steps towards truly intelligent computers.

 

Exercises:

 

1. Translate the title of the text.

2. Divide the paper into main parts.

3. Give the title to every part.

4. Find a key sentence in every paragraph.

5. Write an abstract of the text.

Text 6

The Electronic Tongue

 

Our sense of taste results from our tongue's ability to identify sweet, salty, bitter, and sour substances. Different substances stimulate unique combinations of these four characteristics, and our tongue can distinguish subtleties in these combinations with great accuracy. Although our tongues can easily differentiate various flavors of ice cream, for example, they usually cannot identify the chemical composition of the ice cream. Nor can they perform complicated medical tests.

To chemically identify substances, scientists around the world are trying to develop artificial taste sensors that mimic the human tongue. Recently researchers at the University of Texas in Austin have developed an electronic sensor that has the potential to detect taste as well as to identify the chemicals of any substances. It has uses for food and beverage development as well as medical applications.

Besides the tongue, an electronic nose has also recently been developed to mimic the sense of smell, but it can only detect volatile molecules in the air. Since many chemicals of interest, such as those in food and beverages, are not easily transported into vapor phase, there needs to be a way of detecting a combination of them in solution, such as the electronic tongue.

A team of engineers and chemists in Austin has come up with a prototype of the artificial tongue which resembles the mammalian tongue in some ways. The surface of the human tongue contains cavities which hold chemical receptors known as taste buds. The artificial tongue consists of an array of tiny chemical sensors on a square-centimeter chip. The sensors are actually polymer microbeads placed inside micromachined wells on a silicon wafer; these beads mimic real taste buds on a human tongue. Each bead responds to specific conditions (for example, high acidity or charged ions) by changing color. A special camera records those colors, which the researchers can then monitor on a computer.
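
The readout step can be sketched as follows: each bead reports a response, and the combined pattern is matched against stored reference profiles. All bead names, numbers and reference profiles below are invented for illustration; they are not the Texas group's actual data.

# Toy illustration of matching a sensor-array reading against known profiles.

REFERENCE_PROFILES = {
    "orange juice": {"acidity_bead": 0.8, "sugar_bead": 0.7, "salt_bead": 0.1},
    "mineral water": {"acidity_bead": 0.2, "sugar_bead": 0.1, "salt_bead": 0.4},
}

def identify(sample):
    def distance(profile):
        return sum((sample[k] - profile[k]) ** 2 for k in profile)
    return min(REFERENCE_PROFILES, key=lambda name: distance(REFERENCE_PROFILES[name]))

reading = {"acidity_bead": 0.75, "sugar_bead": 0.65, "salt_bead": 0.15}
print(identify(reading))   # -> 'orange juice'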

One obvious application of the electronic tongue is in the rapid testing of new foods and beverages; the results could be quickly compared with databases of known popular consumer tastes. When developed further, the electronic tongue should be able to analyze chemical processing streams, biological fluids and other complex mixtures without exposing human beings to possibly harmful substances such as antigens, toxins, and bacteria. For the tasting of ice cream, though, we will likely continue to use our own tongues.

 

Text 7

Bill Takes it on the Chin

 

In a blow to the heart of Microsoft, a judge finds that Gates's empire is a monopolistic force whose tactics hurt competition and consumers. So what will happen next?

By Steven Levy and Jared Sandberg

Judge Thomas Penfield Jackson had let it be known that it would be late on a Friday afternoon - after Wall Street shut down for the weekend - when he would finally break his silence on the biggest antitrust showdown in the digital age. By midday, everybody sensed it: after 76 catfighting days in court, more than 100 teeth-pulling depositions and enough pages of evidence to paper a certain billionaire's megamansion on Lake Washington, this was that Friday. Jackson would release his findings of fact, the first step toward judgment in the Department of Justice's sweeping case against the $ 500 billion software company Microsoft. At company headquarters in Redmond, Wash., PR infantry, lieutenants and generals organized a "rapid response team," with guides assigned to connect targeted journalists to the proper spinmeisters. At the DOJ's antitrust division, which had been working for two years to prove that the world's most successful software company played dirty, the vigil was "what it's like for a father waiting for a baby to be born," said its leader, Joel Klein. His team was confident that the judge would side with its version of events, but it also expected Judge Jackson to throw Microsoft some bones. "You almost never have someone's head on a spike," said one Justice insider.

But in a tersely worded 207-page ruling, Judge Jackson seemed to have done just that. In the course of his tome, bound by the Government Printing Office and instantly downloaded by thousands of Internet looky-loos, he slickly mounted one tousle-haired, bespectacled billionaire's noggin on the halberd of the evidence. Presumably these findings of fact are only a prelude to the actual ruling of law expected early next year. But considering that Judge Jackson documents at length how Microsoft is a monopolistic violator that not only bullies its competitors but also rips off the public by stifling innovation and overcharging for its software, there's little doubt about his subsequent ruling. Bring me the head of Bill Gates!

Here are some of Jackson's facts: the way to determine whether Microsoft is a monopoly is not the overall computer marketplace but only the market for Intel-compatible computers that are almost solely Microsoft's domain. Therefore Microsoft is a monopoly. What's more, the monopoly is self-sustaining and unlikely to be challenged. (Forget about those threats from Linux software, Internet computers or palmtops.) Even Microsoft's huge expenditures on R&D don't mean that it's providing innovations for customer benefit: it's done to "push the emergence of competition even farther into the future." In fact (as the judge has it) Microsoft's actions "have harmed consumers in ways that are immediate and easily discernible." By suppressing the competition, he concludes, Microsoft has made computers less innovative, more expensive, more troublesome and harder to use-all to the detriment of the schmoes behind the keyboard.

This last contention pleased the government the most. Many observers had believed that the trial established how a Microsoft monopoly bulldozed its competitors. It was a shakier proposition whether computer users suffered from Microsoft's actions. But Judge Jackson concluded that even building Internet software into Windows, gratis, was no bonanza for consumers. It made the system run slower, he griped, and caused more crashes.

The Feds, of course, were exuberant. "This is a great victory for consumers," crowed Attorney General Janet Reno. Before going to bed on Friday, the wife of Reno's hired litigator, David Boies, told him, "Today was a great career for you."

It was also a banner day among the legions of Microsoft haters in Silicon Valley. "It's a vindication of what we've been saying all along," says Jim Barksdale, former CEO of Netscape, who rates Judge Jackson's handiwork "11 on a scale of 10". Some Microsoft critics didn't even wait until the bits were downloaded before declaring that the only proper judicial remedy for such misdeeds was a dismantling of the Campus That Bill Built. Bill Campbell, the acting CEO of Intuit, declared that "nothing short of a lasting structural remedy might suffice."

And how did the alleged monopolists respond? In what could have stood for a paradigm of Microsoft's tone-deafness throughout this legal and public-relations debacle, it offered a not-very-convincing declaration of business as usual. Bill Gates - who has privately raged about the government's attacks on him - took apparent pains to appear only mildly perturbed. Rushing back from a semiannual "Think Week", where he brainstorms Microsoft's future moves, he offered a boilerplate reaffirmation of his company's virtues. "Microsoft competes vigorously and fairly", he said. It gave the impression of the owner of a burning house insisting that the foundation was sound.

More telling were the comments of Microsoft's chief lawyer, William Neukom. After reluctantly admitting that the judge's fact-finding "was more consistent with the government side than ours," he spoke about where Microsoft obviously believes the case is going - to an appeals court. The only time Neukom became ruffled was at a question about all the competitors now predicting that Judge Jackson will play sushi chef, chopping Microsoft into slices of tekka maki business units. "Relief, if any, has to be commensurate (with the violations)", said Neukom. "What does structural relief have to do with (the judge's conclusions)?"

In some respects, Judge Jackson's brief was reminiscent of the notorious Starr Report. Not that the judge veered toward sensationalism; he studiously avoided recounting the numerous humiliations of Microsoft's witnesses by the government's grand inquisitor Boies. But like Ken Starr's heavily tilted brief, "Court's Findings of Fact" reads like a narrative driven by a central character whose actions are invariably cast in the worst possible light. During the trial Microsoft's lawyers continually warned observers not to read too much into the government's apparent success in coming up with incriminating e-mail or memos. These were just "snippets," we were told, trivia that would not tilt the judge away from what Microsoft considered the bedrock truth of the case: it played hard but fair. But now it seems clear that Jackson determined that Microsoft's witnesses were simply not credible, and in the "Rashomon"-like contradictions that arose, Jackson almost invariably preferred accounts coming from the government side. As a result, he wrote "a Reader's Digest version of the government's findings of fact," says William Kovacic, law professor at George Washington University.

The judge even found misbehavior in actions that Microsoft lawyers considered obviously justifiable competitive practices. Could Microsoft's decision to give the Explorer browser away possibly be an expression of Internet economics and an effort to please customers? No, it was done solely to crush Netscape.

How much is Microsoft hurt by this? Judge Jackson's compelling litany of misdeeds will be difficult to refute, try as it might both in the law courts and in the court of public opinion. During the trial, the government trotted out witnesses testifying to various alleged abuses of Microsoft's power, giving the impression that Microsoft was a serial mugger, bopping Apple on the head to get it to carry its browser, sticking a gun in Intel's ribs to make it back off a potentially competitive software effort and punching Sun Microsystems in the gut in order to wrest control of Sun's Java software. But the narrative thrust of the document portrays Microsoft's brain trust as a conniving illuminati, pulling strings and shattering legal boundaries in a consistent strategy to maintain its crooked monopoly by any means. Loaded with specific dates, imbued with a newfound grasp of technical niceties and backed not only by testimony but often damning e-mail, the judge weaves a series of vignettes where Microsoft forced its partners to structure business dealings to promote the company's internal goals.

It's a characterization that Microsoft vehemently denies. And though it's very difficult to challenge findings of fact, Microsoft is expected to try. If so, it will probably start by questioning the judge's overall approach - which consisted mainly of adopting the points of view of government witnesses without specifically explaining why Microsoft's witnesses weren't credible. It will question why the judge found as factual certain items that were arguably not admissible. (Possible example: a Bill Gates quote dismissing AOL as a competitor - its source was not sworn testimony but the margin-scrawlings of an unnamed Microsoft employee.) It will likely charge that the conclusions that the judge made weren't backed up by incontrovertible facts. And it will almost certainly claim that Judge Jackson is acting as a czar of software design in robes: a role that an appeals-court ruling deemed inappropriate for jurists.

Whether or not the appeal works, it will buy time. Microsoft counsel Neukom joyfully recounted how long it would take to exhaust every avenue. By the time the Supreme Court got hold of the matter, he figured, it would probably be 2003. He didn't mention it, but long before then the new president taking office might have replaced Joel Klein with a trustbuster who views Microsoft more sanguinely. When George W. Bush addressed an audience of high-tech executives in Arizona last month, he promised them less interference from Washington, vowing to "always take the side of innovation over litigation." One of Bush's highest-profile supporters is Microsoft chief operating officer Bob Herbold, who has hosted the candidate on the company's campus.

However, even if Microsoft manages to stave off judgment in the appeals process - thus allowing it to continue its world-championship run-up of profits and record market valuations - an unfavorable verdict may encourage the company's unhappy competitors to launch their own civil suits against Microsoft. "I think and hope that it will happen," says Sun Microsystems general counsel Michael Morris. The list of potential litigants includes AOL, the online giant that bought Netscape, the prime casualty of the browser war. One insider confirms that AOL's board of directors is aware that seeking damages might be considered by shareholders as a fiduciary responsibility. "The only way Microsoft can limit their exposure on that is to settle quick," says Jamie Love, director of the Consumer Project on Technology.

If Microsoft does come to the table, it had better be prepared to make concessions that actually limit its powers. In the last year, says California A. G. Bill Lockyer, "when there were very preliminary discussions of a settlement, Microsoft's offers were always very, very lowball offers." But after Microsoft's Freaky Friday, the price just got higher.

 

Text 8

The turning of telecoms

 

A once-natural monopoly is now a natural for competition.

Blame Maximilian. Since 1505, when the head of Europe's Habsburg dynasty awarded exclusive mail-carrying rights to Italy's Taxis family, the communications business has been a haven for monopolists. By the late 19th century, post had been joined by telegraph and telephone, a trio amalgamated by European governments eager to provide "universal" communications services. In America, by contrast, three monopolies held sway: the US Postal Service, Western Union and American Telephone and Telegraph (AT&T). Both strategies created vast communications infrastructures which shut out competitors for a century. Thus was the natural monopoly born.

Yet such monopolies need not last for ever. Today, state postal services have been eroded by private couriers, the fax machine and other forms of electronic data transmission. Telegraph services have all but disappeared, felled by the same fax. And telephone services? The old AT&T is no more, split in 1984 into one long-distance carrier (still known as AT&T) and seven regional telephone companies, known as the Baby Bells. Britain's BT was privatised and opened up to competition in the same year. And foreign competitors are nipping at the heels of many of Europe's state-owned telecoms giants, despite webs of regulation designed to keep them at bay.

So far, however, this has left intact the cornerstone of the telephone companies' market power - their physical networks of transmission cables and exchanges. The advantages of owning 100 years' worth of paid-up infrastructure are enormous. Indeed, the barrier to competitive entry that they create is what makes a natural monopoly natural. It costs too much to duplicate a network of water pipes, or electricity pylons - or telephone cables. Despite the efforts of a tough regulator and of Mercury and other new competitors, BT still enjoys a 95% share of Britain's telecoms market. AT&T has two-thirds of America's long-distance market, despite fierce competition from MCI and Sprint. And the Baby Bells - beneficiaries, like AT&T, of an infrastructure honed and expanded since 1885 - still hold an average of 75% of their regional markets.

This has led to worries that deregulation and privatisation are likely simply to allow natural monopolists to increase their profits. Yet the most vital stage is only just beginning: the challenge of wireless technology to the old monopolists' infrastructure. Far from retreating, the task for regulators is to hasten this final demise of the natural monopoly in telecoms.

Until recently, there was only one way to deliver a telephone call: down a cable. But a new generation of cheap, digital cellular telephones is turning that logic on its head. That is why AT&T is spending $3.8 billion (see page 71) on a one-third stake in McCaw Cellular Communications, America's biggest cellular-telephone operator, and why MCI wants to set up a consortium to run a national digital "personal communications network", offering cheaper wireless calls. AT&T's long-term strategy is to build a wireless network which could compete with state telecoms monopolies worldwide. Shorter term, the company will settle for taking on the Baby Bells, which is why they are squealing.

 

Farewell to the socket

Taking the cellular route to challenging the monopoly of wire-based networks makes sense for both operators and customers. A high-capacity digital wireless network covering the whole of Britain could cost under $2 billion to build, several times less than a wire-based network - and would not require streets to be dug up or ugly overhead cables to be installed. For consumers, competition is starting to make wireless telephones look affordable. In Britain, for instance, the threat of competition from new digital cellular-telephone services has this year pushed down the price of some analogue cellular services by a third. As wired and wireless prices converge, few people will choose a telephone wired to a socket when they could easily be freed from that inconvenience.

Regulators can ease this transition. In America, the Federal Communications Commission should ignore the whining of the Baby Bells. The court judgment which broke up Ma Bell did not forbid the reborn AT&T from straying on to the local firms' patches (unless it tried to do so by buying their assets); moreover, the Baby Bells could do with a dose of competition. Elsewhere, regulators can do two things. First, they should ignore their own telemonopolists' pleas for protection from wireless competitors, and instead hand out cellular licences liberally. Second, they should consider the benefits of breaking up those monopolies in the style of Ma Bell. It was that move which first set AT&T on the path to becoming the nimble and innovative competitor it is today: one powerful firm fighting other powerful firms. Nothing could be more natural than that.

 

Text 9

Advances in Radar Imaging

 

What is radar?

 

Radar, a contraction of the words radio detection and ranging, is an electronic device for detecting and locating objects. It operates by transmitting a particular waveform pattern and detecting the nature of the echo (return) signal. Radar is used to extend the capability of man's senses, especially that of vision. We can think of radar as being a substitute for the eye, although it can do so much more: it can see objects through conditions impervious to ordinary vision, such as darkness, haze, fog, rain, and snow, because its wavelengths are much longer than those of visible or infrared light. The human eye works as a passive device, since the object is illuminated by sunlight or other light sources. Radar, however, produces its own illumination via electromagnetic waves, which means that it is an active device.
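To make the ranging principle above concrete, here is a minimal numerical sketch (an illustrative assumption, not part of the original text; the function name and figures are invented for the example). Because the transmitted pulse travels to the object and back, the distance to the object is the speed of light times half the round-trip delay.

# Minimal sketch of the ranging principle described above (assumed example).
# A pulse travels out to the object and back, so the one-way distance is c * t / 2.

SPEED_OF_LIGHT = 3.0e8  # metres per second (approximate)

def range_from_echo_delay(round_trip_seconds: float) -> float:
    """Return the distance to the object, in metres, for a measured echo delay."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# An echo received 200 microseconds after transmission corresponds
# to an object roughly 30 km away.
print(range_from_echo_delay(200e-6))  # -> 30000.0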

 

Applications of radar and radar imaging

 

Radar is used in civilian applications as air-traffic-control radar to guide aircraft to a safe landing, and in commercial aircraft as radar altimeters to determine height, for weather avoidance, and as wind-shear radars to navigate in severe weather conditions.

The military uses radar for surveillance and weapons control; examples of such radars are DEW (Distant Early Warning) and AEW (Airborne Early Warning) radars, which detect aircraft, as well as long-range search radars and guided-missile radars.

Research scientists use radar as a measurement tool. Radars have been placed on satellites, space modules, and shuttles to explore meteors, planets, and other objects in the solar system.

In the case of an imaging radar, the radar travels along an airplane's or a space shuttle's flight path. The area underneath is illuminated by the radar, and the radar architecture builds the image as it moves on top of its footprint. A finer image resolution is achieved by using a very long antenna array to focus the transmitted and received energy into a sharp beam. The beam's sharpness defines the resolution. Similarly, such optical systems as telescopes require large apertures (mirrors or lenses that are analogous to the radar antenna) to obtain fine imaging resolution. Synthetic Aperture Radar (SAR) is a common and very popular technique in radar imaging that achieves a very fine resolution. In the following sections, we introduce and explain different types of SAR imaging techniques.
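The aperture argument in the paragraph above can be put into rough numbers. The sketch below is a simplified illustration under stated assumptions (the beamwidth is taken as roughly the wavelength divided by the antenna length, and the example figures are invented); it shows why a physically short antenna gives a very coarse footprint from orbit, while a long synthetic aperture gives fine resolution.

# Rough beamwidth and footprint estimate for the aperture argument above
# (simplified illustration; real antenna patterns include additional factors).

def beamwidth_rad(wavelength_m: float, aperture_m: float) -> float:
    """Approximate beamwidth of an antenna, in radians (diffraction limit)."""
    return wavelength_m / aperture_m

def cross_range_resolution(wavelength_m: float, aperture_m: float, range_m: float) -> float:
    """Size of the beam footprint at the given range, in metres."""
    return beamwidth_rad(wavelength_m, aperture_m) * range_m

# A 10 m physical antenna at 3 cm wavelength, imaging from 800 km:
print(cross_range_resolution(0.03, 10.0, 800e3))    # ~2400 m - far too coarse
# The same geometry with a 4 km synthetic aperture:
print(cross_range_resolution(0.03, 4000.0, 800e3))  # ~6 m - fine imaging resolution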

 

Synthetic aperture radar (SAR)

 

SAR refers to a technique that synthesizes a very long antenna by combining echoes received by the radar as it travels. Typically, SAR is used to produce a two-dimensional (2-D) image. One dimension in the image is called range (or cross track), and is a measure of the "line-of-sight" distance from the radar to the target. Range is determined by precisely measuring the time from a pulse's transmission to the reception of its echo from a target. The range resolution is determined by the transmitted pulse's width (i.e., narrow pulses yield fine range resolution).

The other dimension is called azimuth (or along track), and is perpendicular to range.
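The statement above that range resolution follows from the transmitted pulse's width can be checked numerically. The following is a minimal sketch with assumed example values (not from the original text): echoes from two objects can be separated in range only if they do not overlap in time, which gives a resolution of the speed of light times the pulse width divided by two.

# Range resolution from pulse width, as described above (assumed example values).
# Two echoes are distinguishable only if the objects are separated in range
# by more than c * tau / 2 for a pulse of duration tau.

SPEED_OF_LIGHT = 3.0e8  # metres per second (approximate)

def range_resolution(pulse_width_seconds: float) -> float:
    """Smallest range separation, in metres, that a pulse of this width can resolve."""
    return SPEED_OF_LIGHT * pulse_width_seconds / 2.0

print(range_resolution(1e-6))   # 1 microsecond pulse -> 150.0 m
print(range_resolution(10e-9))  # 10 nanosecond pulse -> 1.5 m (finer resolution)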
