
History of computer technology. Computing devices from antiquity to the present day

The history of the development of computer technology is conventionally divided into five generations.

1st generation (1945-1954) - the period of the formation of machines with the von Neumann architecture (John von Neumann), based on storing a program and its data in the memory of the computer. During this period, the typical set of structural elements that make up a computer took shape. A typical computer consists of the following components: a central processing unit (CPU), random access memory (RAM) and input/output (I/O) devices. The CPU, in turn, consists of an arithmetic logic unit (ALU) and a control unit (CU). Machines of this generation were built on vacuum tubes, which is why they consumed huge amounts of energy and were very unreliable. They were used mainly to solve scientific problems. Programs for these machines could already be written not in machine language but in assembly language.
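A minimal illustrative sketch in Python of the von Neumann principle described above: one shared memory holds both the program and its data, a control unit fetches instructions, and an accumulator stands in for the ALU. The opcodes (LOAD, ADD, STORE, HALT) and the memory layout are invented here purely for illustration and do not correspond to any historical machine.

# Hypothetical von Neumann cycle: program and data share one memory.
memory = {
    0: ("LOAD", 10),    # load the word at address 10 into the accumulator
    1: ("ADD", 11),     # add the word at address 11 to the accumulator
    2: ("STORE", 12),   # write the accumulator back to address 12
    3: ("HALT", None),  # stop the machine
    10: 2, 11: 3, 12: 0,
}

def run(memory):
    acc = 0                    # accumulator inside the ALU
    pc = 0                     # program counter inside the control unit
    while True:
        op, addr = memory[pc]  # fetch the instruction from shared memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

print(run(memory)[12])  # prints 5

Running the sketch prints 5: the sum of the two data words, written back into the same memory that holds the program.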

2nd generation (1955-1964). The change of generations was determined by the emergence of a new component base: instead of bulky vacuum tubes, miniature transistors began to be used in computers, and delay lines as elements of random access memory were replaced by memory on magnetic cores. This ultimately led to a reduction in size and an increase in the reliability and performance of computers. The computer architecture now included index registers and hardware for performing floating-point operations. Commands for calling subroutines were developed. High-level languages appeared - Algol, FORTRAN, COBOL - which created the prerequisites for portable software that does not depend on the type of computer. With the advent of high-level languages came compilers for them, libraries of standard routines and other things well known to us today. An important innovation was the appearance of input/output processors. These specialized processors made it possible to free the CPU from I/O control and to perform I/O on a specialized device simultaneously with the computation process. Operating systems (OS) began to be used for the effective management of machine resources.

3rd generation (1965-1970). The change of generations was again due to an update of the component base: instead of transistors, integrated circuits of varying degrees of integration began to be used. Microcircuits made it possible to place dozens of elements on a plate several centimeters in size. This, in turn, not only increased the performance of computers but also reduced their size and cost. The increase in computer power made it possible to execute several programs simultaneously on one computer. To do this, it was necessary to learn to coordinate simultaneously performed actions, for which the functions of the operating system were expanded. Along with active developments in the field of hardware and architectural solutions, the share of developments in the field of programming technology grew. At this time, the theoretical foundations of programming methods, compilation, databases, operating systems, etc. were actively developed. Application program packages were created for a wide variety of areas of human activity. There was a tendency to create families of computers, that is, machines became upward compatible at the software and hardware level. Examples of such families were the IBM System/360 series and its domestic counterpart, the ES series of computers.

4th generation (1970-1984). Another change in the component base led to a change of generations. In the 1970s, work was actively underway on the creation of large-scale and very-large-scale integrated circuits (LSI and VLSI), which made it possible to place tens of thousands of elements on a single chip. This resulted in a further significant reduction in the size and cost of computers. In the early 1970s, Intel released the i4004 microprocessor (MP). And if before this there had been only three directions in the world of computing (supercomputers, large computers (mainframes) and minicomputers), a fourth was now added to them - microprocessor-based computers.

A processor is a functional unit of a computer designed for the logical and arithmetic processing of information on the basis of the principle of microprogram control. By hardware implementation, processors can be divided into microprocessors (which fully integrate all processor functions) and processors of low and medium integration. Structurally, this means that microprocessors implement all the processor functions on a single chip, while other types of processors implement them by connecting a large number of chips.

The 5th generation can be called the microprocessor generation. In 1976, Intel completed development of the 16-bit i8086 microprocessor. It had a fairly large register width (16 bits) and a 20-bit system address bus, due to which it could address up to 1 MB of RAM. In 1982, the i80286 was created. This microprocessor was an improved version of the i8086. It already supported several operating modes: real mode, in which the address was formed according to the rules of the i8086, and protected mode, which implemented multitasking and virtual memory management in hardware. The i80286 also had a wider address bus - 24 bits versus 20 for the i8086 - and could therefore address up to 16 MB of RAM. The first computers based on this microprocessor appeared in 1984. In 1985, Intel introduced the first 32-bit microprocessor, the i80386, which was upward hardware-compatible with all of the company's previous microprocessors. It was much more powerful than its predecessors, had a 32-bit architecture and could directly address up to 4 GB of RAM. The i386 also supported a new operating mode - virtual i8086 mode - which ensured not only greater efficiency for programs developed for the i8086 but also allowed several such programs to run in parallel.
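The relation between address-bus width and addressable memory quoted above is a simple power of two; the following small Python sketch, given here only as illustrative arithmetic, reproduces the figures of 1 MB, 16 MB and 4 GB.

# Illustrative arithmetic: an n-bit address bus distinguishes 2**n byte addresses.
for name, bus_bits in [("i8086", 20), ("i80286", 24), ("i80386", 32)]:
    addressable_bytes = 2 ** bus_bits
    print(name, addressable_bytes // 2 ** 20, "MB")  # 1 MB, 16 MB, 4096 MB (= 4 GB)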

Ministry of Education and Science of the Russian Federation

Federal Agency for Education

GOU VPO "Ural State the University of Economics»

Department of Economics and Law

USUE branch in N. Tagil

Test

by discipline:

"Computer science"

Option 8___

Topic: “History of the development of computer technology”

Executor:

student gr. 1EKIP

Gorbunova A.A.

Teacher:

Skorokhodov B.A.

Introduction

1 Stages of development of computer technology

2 Characteristics of computer generations

3 The role of computer technology in human life

Conclusion

Introduction

Knowledge of the history of the development of computer technology is an integral component of the professional competence of a future specialist in the field of information technology. The first steps in the automation of mental work concerned precisely the computational activity of man, who at the very earliest stages of civilization began to use instrumental aids for calculation.

At the same time, it should be borne in mind that well-proven early computing aids are still used by people to automate various kinds of calculations.

Automated systems are an integral part of any business and production. Almost all management and technological processes use computer technology to one degree or another. Even a single computer can significantly improve the efficiency of enterprise management without creating additional problems. Today, personal computers are installed at every workplace and, as a rule, no one doubts their necessity. The large volume of computer equipment and its special role in the functioning of any enterprise pose a number of new tasks for management.

This work examines the history of the development of computer technology, which helps in understanding the essence and significance of computers.

1 Stages of development of computer technology

Several stages can be distinguished in the development of computer technology, and tools from each of them are still in use today.

The manual stage of development of computer technology.

The manual period of computing automation began at the dawn of human civilization and was based on the use of various parts of the body, primarily the fingers and toes.

Finger counting goes back to ancient times and is found in one form or another among all peoples even today. Famous medieval mathematicians recommended finger counting as an auxiliary tool that allowed fairly effective counting systems. Counting results were recorded in different ways: by notching, with counting sticks, knots, etc. For example, knot counting was highly developed among the peoples of pre-Columbian America. Moreover, the system of knots also served as a kind of chronicle and annals, having a rather complex structure. However, using it required good memory training.

Counting with the help of grouping and rearranging objects was the predecessor of counting on the abacus - the most developed counting device of antiquity, preserved to this day in the form of various kinds of counting frames.

The abacus was the first developed calculating device in the history of mankind; its main difference from previous methods of calculation was that calculations were performed by digits. Thus, the use of the abacus already presupposes some positional number system, for example decimal, ternary, quinary, etc. The centuries-long path of improvement of the abacus led to a calculating device of complete classical form, used until the heyday of desktop keyboard calculating machines. Even today it can still be found in some places helping with settlement transactions. Only the advent of pocket electronic calculators in the 1970s created a real threat to the further use of the Russian, Chinese and Japanese abacus - the three main classical forms that have survived to this day. The last known attempt to improve the Russian abacus by combining it with a multiplication table dates back to 1921.

Well adapted to addition and subtraction, the abacus turned out to be insufficiently efficient for multiplication and division. Therefore, the discovery of logarithms and logarithmic tables by John Napier at the beginning of the 17th century was the next major step in the development of manual computing. However, in practical work the use of logarithmic tables has a number of inconveniences, so John Napier proposed, as an alternative, special counting rods (later called Napier's rods), which made it possible to perform multiplication and division directly on the original numbers. Napier based this method on the lattice method of multiplication.

Along with sticks, Napier proposed a counting board for performing multiplication, division, squaring, and square root operations in the binary system, thereby anticipating the benefits of such a number system for automating calculations.

Logarithms served as the basis for the creation of a wonderful computing tool - the slide rule, which has served engineers and technicians around the world for more than 360 years.

Mechanical stage of development of computer technology.

The development of mechanics in the 17th century became a prerequisite for the creation of computing devices and instruments using the mechanical principle of calculation. Such devices were built on mechanical elements and provided automatic carrying into the higher digit.

The first mechanical machine was described in 1623 by Wilhelm Schickard; it was implemented in a single copy and was intended to perform the four arithmetic operations on six-digit numbers.

Schickard's machine consisted of three independent devices: adding, multiplying and recording numbers. Addition was carried out by sequentially entering the addends using dials, and subtraction by sequentially entering the minuend and the subtrahend. The entered numbers and the results of addition and subtraction were displayed in reading windows. The idea of lattice multiplication was used to perform the multiplication operation. The third part of the machine was used to record a number no more than six digits long.

Blaise Pascal's machine used a more complex carry scheme, which was rarely used later; but the first working model, built in 1642, and then a series of about 50 machines, contributed to the fairly wide popularity of the invention and to the formation of public opinion about the possibility of automating mental work.

The first arithmometer allowing all four arithmetic operations was created by Gottfried Leibniz as the result of many years of work. The culmination of this work was the Leibniz arithmometer, which allowed an 8-digit multiplicand and a 9-digit multiplier to be used to obtain a 16-digit product.

A special place among the developments of the mechanical stage is occupied by the work of Charles Babbage, who is rightfully considered the founder and ideologist of modern computer technology. Two main directions are clearly visible in Babbage's work: the Difference Engine and the Analytical Engine.

The difference machine project was developed in the 20s of the 19th century and was intended for tabulating polynomial functions using the finite difference method. The main impetus for this work was the urgent need to tabulate functions and check existing mathematical tables, which were riddled with errors.

Babbage's second project was the Analytical Engine, which used the principle of program control and was the predecessor of modern computers. This project was proposed in the 1830s, and in 1843 Ada Lovelace wrote the world's first fairly complex program - for calculating Bernoulli numbers on Babbage's machine.

Charles Babbage used a mechanism in his machine similar to the mechanism of the Jacquard loom, using special control punch cards. According to Babbage's idea, control should be carried out by a pair of Jacquard mechanisms with a set of punched cards in each.

Babbage had surprisingly modern ideas about computing machines, but the technical means at his disposal were far behind his ideas.

Electromechanical stage of development of computer technology.

The electromechanical stage of development of computer technology was the shortest and covers only about 60 years. The prerequisites for the creation of projects at this stage were both the need to carry out mass calculations (economics, statistics, management and planning, etc.) and the development of applied electrical engineering (electric drive and electromechanical relays), which made it possible to create electromechanical computing devices.

The classic type of means at the electromechanical stage was a counting and analytical complex designed for processing information on punched card media.

The first counting and analytical complex was created in the USA by Herman Hollerith in 1887 and consisted of: a manual punch, a sorting machine and a tabulator. The main purpose of the complex was the statistical processing of punched cards, as well as the mechanization of accounting and economic tasks. In 1897, Hollerith organized a company that later became known as IBM.

Building on Hollerith's work, a number of models of counting and analytical complexes were developed and produced in several countries; the most popular and widespread were the complexes of IBM, Remington and Bull.

The final period (the 1940s) of the electromechanical stage of development of computer technology is characterized by the creation of a number of complex relay and relay-mechanical systems with program control, distinguished by algorithmic versatility and capable of performing complex scientific and technical calculations automatically at speeds an order of magnitude higher than that of electrically driven adding machines.

Konrad Zuse pioneered the creation of a universal computer with program control and storage of information in a memory device. However, his first model, the Z-1 (which marked the beginning of the Z series of machines), was conceptually inferior to Babbage's design - it did not provide for conditional transfer of control. The Z-2 and Z-3 models were developed later.

The last major project in relay computer technology should be considered the relay computer RVM-1, built in 1957 in the USSR and operated until the end of 1964, mainly for solving economic problems.

Electronic stage of development of computer technology.

Due to its physical and technical nature, relay computing technology did not allow a significant increase in the speed of calculations; this required a transition to high-speed, inertia-free electronic elements.

The English Colossus machine, created in 1943 with the participation of Alan Turing, can be considered the first electronic computer. The machine contained about 2,000 vacuum tubes and had a fairly high speed, but was highly specialized.

The first general-purpose electronic computer is considered to be the ENIAC (Electronic Numerical Integrator And Computer), created in the USA at the end of 1945. Initially intended for solving ballistics problems, the machine turned out to be universal, i.e. capable of solving a variety of problems.

Even before ENIAC went into operation, John Mauchly and Presper Eckert, commissioned by the US military, began a project for a new computer, EDVAC (Electronic Discrete Variable Automatic Computer), which was more advanced than the first. This machine featured a large memory (1,024 44-bit words; a 4,000-word auxiliary data memory was added at completion) for both data and program.

The EDSAC computer (Great Britain, 1949) marked the beginning of a new stage in the development of computing technology - the first generation of general-purpose computers.

2 Characteristics of computer generations

Since 1950, every 7-10 years the design-technological and software-algorithmic principles of constructing and using computers have been radically updated. In this regard, it is legitimate to talk about generations of computers. Conventionally, each generation can be given 10 years.

First generation of computers: 1950-1960s

Logic circuits were created using discrete radio components and electronic vacuum tubes with a filament. Random access memory devices used magnetic drums, acoustic ultrasonic mercury and electromagnetic delay lines, and cathode ray tubes. Drives on magnetic tapes, punched cards, punched tapes and plug-in switches were used as external storage devices.

The programming of this generation of computers was carried out in the binary number system in machine language, that is, the programs were strictly focused on a specific model of the machine and “died” along with these models.

In the mid-1950s, machine-oriented languages such as symbolic coding languages appeared, which made it possible to use abbreviated verbal (letter) notation and decimal numbers instead of the binary notation of commands and addresses.

Computers, starting from UNIVAC and ending with BESM-2 and the first computer models "Minsk" and "Ural", belong to the first generation of computers.

Second generation of computers: 1960-1970s

Logic circuits were built on discrete semiconductor and magnetic elements. Printed circuit boards were used as the design and technological basis. The modular principle of machine design became widely used, which made it possible to connect a large number of different external devices to the main units, providing greater flexibility in the use of computers. Clock frequencies of electronic circuits increased to hundreds of kilohertz.

External drives on hard magnetic disks and floppy disks began to be used - an intermediate level of memory between magnetic tape drives and RAM.

In 1964, the first computer monitor appeared - the IBM 2250. It was a monochrome display with a 12 x 12 inch screen and a resolution of 1024 x 1024 pixels. It had a frame rate of 40 Hz.

Control systems created on the basis of computers required higher performance and, most importantly, reliability. Error detection and correction codes and built-in control circuits became widely used in computers.

The second generation machines were the first to implement batch processing and teleprocessing modes of information.

The first computer that partially used semiconductor devices instead of vacuum tubes was a machine created in 1951.

In the early 60s, semiconductor machines began to be produced in the USSR.

Third generation of computers: 1970-1980s

The logic circuits of 3rd generation computers were already entirely built on small integrated circuits. Clock frequencies of electronic circuits have increased to several megahertz. The supply voltage (units of volts) and the power consumed by the machine have decreased. The reliability and performance of computers have increased significantly.

Random access memories used smaller ferrite cores, ferrite plates, and magnetic films with a rectangular hysteresis loop. Disk drives have become widely used as external storage devices.

Two more levels of storage devices appeared: ultra-fast memory devices on trigger registers, which had enormous speed but small capacity (tens of numbers), and high-speed cache memory.

Since the widespread use of integrated circuits in computers, technological progress in computing can be tracked using the well-known Moore's law. One of the founders of Intel, Gordon Moore, formulated in 1965 the empirical rule according to which the number of transistors on a single chip doubles approximately every 1.5 years.
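As illustrative arithmetic only, the following Python sketch projects what the doubling rule quoted above implies; the starting year and transistor count are example figures (roughly those usually cited for the first microprocessor), not data from this text.

# Illustrative projection of a "doubling every 1.5 years" rule.
start_year, start_transistors = 1971, 2300     # example starting point
for year in range(start_year, start_year + 16, 3):
    doublings = (year - start_year) / 1.5
    print(year, round(start_transistors * 2 ** doublings))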

Due to the significant complexity of both the hardware and logical structure of 3rd generation computers, they often began to be called systems.

In third-generation computers, significant attention was paid to reducing the complexity of programming, the efficiency of program execution, and improving communication between the operator and the machine. This was ensured by powerful operating systems, developed programming automation systems, efficient program-interrupt systems, time-sharing modes, real-time modes, multiprogramming modes and new interactive communication modes. An effective video terminal device for communication between the operator and the machine also appeared - the video monitor, or display.

Much attention was paid to increasing the reliability and dependability of computer operation and to facilitating their maintenance. Reliability and dependability were ensured by the widespread use of codes with automatic error detection and correction (Hamming codes and cyclic codes).

Fourth generation of computers: 1980-1990s

A revolutionary event in the development of computer technology of the fourth generation of machines was the creation of large and ultra-large integrated circuits, a microprocessor and a personal computer.

Logic integrated circuits in computers began to be created on the basis of unipolar field-effect CMOS transistors with direct connections, operating with lower amplitudes of electrical voltages.

Fifth generation of computers: 1990-present

Briefly, the basic concept of a fifth-generation computer can be formulated as follows:

Computers based on ultra-complex microprocessors with a parallel-vector structure that simultaneously execute dozens of sequential program instructions.

Computers with many hundreds of parallel working processors, allowing the construction of data and knowledge processing systems, efficient network computer systems.

Sixth and subsequent generations of computers

Electronic and optoelectronic computers with massive parallelism, neural structure, with a distributed network of a large number (tens of thousands) of microprocessors modeling the architecture of neural biological systems.

3 The role of computer technology in human life.

The role of computer science in general is constantly increasing under modern conditions. The activities of both individuals and entire organizations increasingly depend on their awareness and ability to use available information effectively. Before taking any action, a great deal of work must be done on collecting, processing, comprehending and analyzing information. Finding rational solutions in any area requires processing large amounts of information, which is sometimes impossible without special technical means. The introduction of computers and modern means of processing and transferring information into various industries marked the beginning of a process called the informatization of society. Modern material production and other areas of activity increasingly need information services and the processing of enormous amounts of information. Informatization based on the introduction of computer and telecommunication technologies is society's response to the need for a significant increase in labor productivity in the information sector of social production, where more than half of the working population is concentrated.

Information technologies have entered all areas of our lives. The computer is a means of increasing the efficiency of the learning process, is involved in all types of human activity, and is indispensable in the social sphere. Information technologies are hardware and software tools based on computer technology that provide the storage and processing of educational information, its delivery to the student, interactive interaction between the student and the teacher or pedagogical software, and the testing of the student's knowledge.

It can be assumed that the evolution of technology, in general, continues natural evolution. If the development of stone tools helped the formation of human intellect, metal ones increased the productivity of physical labor (so much so that a separate layer of society was freed up for intellectual activity), machines mechanized physical work, then information technology is designed to free a person from routine mental work and enhance his creative capabilities.

Conclusion

It is possible to live as an educated person in the 21st century only with a good command of information technology. After all, people's activities increasingly depend on their awareness and ability to use information effectively. To navigate information flows freely, a modern specialist of any profile must be able to receive, process and use information with the help of computers, telecommunications and other means of communication. Information is beginning to be spoken of as a strategic resource of society, a resource that determines the level of development of the state.

By studying the history of the development of computer technology, one can understand the structure and significance of computers in human life. This will help you understand them better and take in new, progressive technologies easily, because we must not forget that computer technology progresses almost every day, and without understanding the structure of the machines that existed many years ago it will be difficult to master the current generation.

This work has shown where the development of computer technology began, how far it has come, and what an important role computers play for people today.

Start

A calculator and a computer are far from the only devices with which you can carry out calculations. Humanity began to think quite early on how to make the processes of division, multiplication, subtraction and addition easier for itself. One of the first such devices can be considered balance scales, which appeared in the fifth millennium BC. However, let's not dive so far into the depths of history.

Andy Grove, Robert Noyce and Gordon Moore. (wikipedia.org)

The abacus, the ancestor of the counting frame familiar to us, appeared around 500 BC. Ancient Greece, India, China and the Inca state may all lay claim to being its homeland. Archaeologists suspect that there were even computing mechanisms in ancient cities, although their existence has not yet been proven. However, the Antikythera mechanism, which we already mentioned in the previous article, may well be considered a computational mechanism.

With the advent of the Middle Ages, the skills to create such devices were lost. Those dark times were generally a period of sharp decline in science. But in the 17th century, humanity again began to think about computing machines. And they were not slow to appear.

The first computers

Creating a device that could perform calculations was the dream of the German astronomer and mathematician Wilhelm Schickard. He had many different projects, but most of them failed. Schickard was not discouraged by failure, and he eventually achieved success. In 1623, the mathematician designed the "Counting Clock" - an incredibly complex and cumbersome mechanism which, however, could perform simple calculations.

"Chiccard's counting clock." Drawing. (wikipedia.org)

The "Counting Clock" was of considerable size and weight, and it was difficult to use in practice. Schickard's friend, the famous astronomer Johannes Kepler, jokingly noted that it was much easier to do the calculations in one's head than to use the clock. Nevertheless, it was Kepler who became the first user of Schickard's clock. It is known that he carried out many of his calculations with its help.

Johannes Kepler. (wikipedia.org)

The device got its name because it was based on the same mechanism that works in a wall clock. Schickard himself can be considered the "father" of the calculator. Twenty years later, the family of calculating machines was expanded by an invention of the French mathematician, physicist and philosopher Blaise Pascal. The scientist presented the "Pascaline" in 1643.

Pascal's adding machine. (wikipedia.org)

Pascal was then 20 years old, and he made the device for his father, a tax collector who had to deal with very complex calculations. The adding machine was driven by gears. To enter the required number into it, you had to turn the wheels a certain number of times.

Thirty years later, in 1673, the German mathematician Gottfried Leibniz created his project. His device was the first in history to be called a calculator. The principle of operation was the same as that of Pascal's machine.

Gottfried Leibniz. (wikipedia.org)

There is one very interesting story connected with Leibniz's calculator. At the beginning of the 18th century, the machine was seen by Peter I, who was visiting Europe as part of the Grand Embassy. The future emperor was very interested in the device and even bought it. Legend has it that Peter later sent the calculator to the Kangxi Emperor of China as a gift.

From calculator to computer

The work begun by Pascal and Leibniz was carried further. In the 18th century, many scientists attempted to improve calculating machines. The main goal was to create a commercially successful device. Success ultimately came to the Frenchman Charles Xavier Thomas de Colmar.

Charles Xavier Thomas de Colmar. (wikipedia.org)

In 1820, he launched mass production of calculating instruments. Strictly speaking, Colmar was more a skilled industrialist than an inventor. His "Thomas machine" was not much different from Leibniz's calculator. Colmar was even accused of stealing someone else's invention and trying to make a fortune from someone else's labor.

In Russia, serial production of calculators began in 1890. The calculator acquired its current form already in the twentieth century. In the 1960-1970s, this industry experienced a real boom. The devices were improved every year. In 1965, for example, a calculator appeared that could calculate logarithms, and in 1970 a calculator that fit in a person’s hand was first released. But at this time the computer age had already begun, although humanity had not yet had time to feel it.

Computers

Many consider the French weaver Joseph Marie Jacquard to be the person who laid the foundations for the development of computer technology. It's hard to say whether this is a joke or not. However, it was Jacquard who invented the punch card. Back then people didn’t yet know what a memory card was. Jacquard's invention may well claim this title. The weaver invented it to control the loom. The idea was that a punch card was used to create a pattern for the fabric. That is, from the moment the punch card was launched, the pattern was applied without human intervention - automatically.

Punch card. (wikipedia.org)

Jacquard's punch card, of course, was not an electronic device. The appearance of such objects was still very far off, for Jacquard lived at the turn of the 18th and 19th centuries. However, punched cards later became widely used in other areas, going far beyond the famous loom.

In 1835, Charles Babbage described an analytical engine that could be based on punched cards. The key operating principle of such a device was programming. Thus, the English mathematician foresaw the appearance of the computer. Alas, Babbage himself was never able to build the machine he invented. The world's first analog computer was born in 1927. It was created by Massachusetts Institute of Technology professor Vannevar Bush.

Ancient man had his own counting instrument - the ten fingers on his hands. He bent his fingers to add and straightened them to subtract. Then man realized that anything at hand could be used for counting - pebbles, sticks, bones. Later, people began to tie knots on ropes and make notches on sticks and planks (Fig. 1.1).

Fig. 1.1. Knots (a) and notches on tablets (b)

The abacus period. The abacus (Greek abax - board) was a board covered with a layer of dust on which lines were drawn with a sharp stick and objects were placed in the resulting columns according to the positional principle. In the 5th-4th centuries BC the oldest known counting board was created - the "Salamis board" (named after the island of Salamis in the Aegean Sea), which was called the abacus by the Greeks and in Western Europe. In Ancient Rome the abacus appeared in the 5th-6th centuries AD and was called calculi or abaculi. The abacus was made from bronze, stone, ivory and colored glass. A bronze Roman abacus, on which pebbles moved in vertically cut grooves, has survived to this day (Fig. 1.2).

Fig. 1.2.

In the 15th-16th centuries, counting on lines - counting tables with tokens placed on them - was common in Europe.

In the 16th century the Russian abacus with a decimal number system appeared. In 1828, Major General F. M. Svobodskoy exhibited an original device consisting of many abaci connected in a common frame (Fig. 1.3). All operations were reduced to the actions of addition and subtraction.

Fig. 1.3.

The period of mechanical devices. This period lasted from the early 17th to the late 19th century.

In 1623, Wilhelm Schickard described the design of a calculating machine in which the operations of addition and subtraction were mechanized. In 1642, the French mechanic Blaise Pascal designed the first mechanical calculating machine - “Pascalina” (Fig. 1.4).

In 1673, the German scientist Gottfried Leibniz created the first mechanical computing machine performing all four arithmetic operations (addition, subtraction, multiplication and division).

Fig. 1.4.

In 1770 in Lithuania, E. Jacobson created a summing machine that determined quotients and was capable of working with five-digit numbers.

In 1801-1804, the French inventor J. M. Jacquard was the first to use punched cards to control an automatic loom.

In 1823, the English scientist Charles Babbage developed the "Difference Engine" project, which anticipated the modern program-controlled automatic machine (Fig. 1.5).

In 1890, a resident of St. Petersburg, Vilgodt Odhner, invented an adding machine and launched its production. By 1914, in Russia alone there were more than 22 thousand Odhner adding machines. In the first quarter of the 20th century, these adding machines were the only mathematical machines widely used in various areas of human activity (Fig. 1.6).


Fig. 1.5. Babbage's machine. Fig. 1.6. Adding machine

Computer period. This period began in 1946 and continues today. It is characterized by the combination of advances in the field of electronics with new principles for constructing computers.

In 1946, under the leadership of J. Mauchly and J. Eckert, the first computer, ENIAC, was created in the USA (Fig. 1.7). It had the following characteristics: length 30 m, height 6 m, weight 35 tons, 18 thousand vacuum tubes, 1,500 relays, 100 thousand resistors and capacitors, and a speed of 3,500 op/s. At the same time, these scientists began work on a new machine, EDVAC (Electronic Discrete Variable Automatic Computer - an electronic automatic computer with discrete variables), whose program was to be stored in the computer's memory. Mercury delay tubes of the kind used in radar were intended to serve as its internal memory.

Fig. 1.7.

In 1949, the EDSAC computer with a program stored in memory was built in Great Britain.

The question of which was the first computer is still a matter of dispute. Thus, the Germans consider the first computer to be a machine for artillery crews created by Konrad Zuse in 1941, although it worked on electric relays and was thus not electronic but electromechanical. For Americans it is ENIAC (1946, J. Mauchly and J. Eckert). Bulgarians consider the inventor of the computer to be John (Ivan) Atanasov, who in 1941 in the USA designed a machine for solving systems of algebraic equations.

The British, having rummaged through secret archives, stated that the first electronic computer was created in 1943 in England and was intended to decrypt negotiations of the German high command. This equipment was considered so secret that after the war it was destroyed on Churchill's orders and the plans were burned to prevent the secret from falling into the wrong hands.

The Germans conducted secret everyday correspondence using Enigma encryption machines (Latin enigma - riddle). By the beginning of World War II, the British already knew how Enigma worked and were looking for ways to decipher its messages, but the Germans had another encryption system intended only for the most important messages. It used the Schlüsselzusatz-40 machine (the name translates as "cipher attachment"), manufactured by Lorenz in a small number of copies. Externally, it was a hybrid of an ordinary teletype and a mechanical cash register. The teletype translated the text typed on the keyboard into a sequence of electrical pulses and the pauses between them (each letter corresponds to a set of five pulses and gaps). In the "cash register", two sets of five gear wheels rotated, adding two more sets of five pulses and gaps to each letter in a pseudo-random manner. The wheels had different numbers of teeth, and this number could be changed: the teeth were made movable and could be moved aside or pulled back into place. There were two more "motor" wheels, each of which rotated its own set of gears.

At the beginning of the transmission of an encrypted message, the radio operator informed the recipient of the initial position of the wheels and the number of teeth on each of them. This setting data was changed before each transmission. By placing the same sets of wheels in the same position on his machine, the receiving radio operator ensured that the extra letters were automatically subtracted from the text, and the teletype printed the original message.
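The additive principle described above can be illustrated with a small Python sketch: each letter is treated as a 5-bit teletype code, two key streams are added to it bit by bit (modulo 2), and, because this addition is its own inverse, applying the same wheel settings on the receiving side removes the key. The codes and key values below are invented for illustration and are not the real teletype alphabet or Lorenz wheel patterns.

# Simplified illustration of adding and removing two key streams modulo 2.
plain = [0b10100, 0b00110, 0b11001]          # three 5-bit "letters"
key_wheels_1 = [0b01010, 0b11100, 0b00011]   # first set of five wheels
key_wheels_2 = [0b10001, 0b01101, 0b10110]   # second set of five wheels

cipher = [p ^ k1 ^ k2 for p, k1, k2 in zip(plain, key_wheels_1, key_wheels_2)]
decoded = [c ^ k1 ^ k2 for c, k1, k2 in zip(cipher, key_wheels_1, key_wheels_2)]
print(decoded == plain)  # True: the extra pulses cancel out on the receiving machine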

In 1943, the mathematician Max Newman in England developed the electronic machine "Colossus". The wheels of the German machine were modeled by 12 groups of electron tubes - thyratrons. By automatically running through different states of each thyratron and their combinations (a thyratron can be in two states - passing or not passing electric current, i.e. giving a pulse or a pause), "Colossus" worked out the initial setting of the gears of the German machine. The first version of "Colossus" had 1,500 thyratrons, and the second, which went into operation in June 1944, had 2,500. In an hour the machine "swallowed" 48 km of punched tape, onto which operators punched the rows of ones and zeros from German messages; 5,000 letters were processed per second. This computer had a memory based on the charging and discharging of capacitors. It made it possible to read the top-secret correspondence of Hitler, Kesselring, Rommel and others.

Note. A modern computer works out the initial position of the wheels of the Schlüsselzusatz-40 twice as slowly as the Colossus did, so a problem that in 1943 was solved in 15 minutes takes a Pentium PC 18 hours! The point is that modern computers are designed as universal machines intended to perform the most varied tasks, and they cannot always compete with old computers that could do only one thing, but did it very quickly.

The first domestic electronic computer, MESM, was developed in 1950. It contained more than 6,000 vacuum tubes. This generation of computers includes the BESM-1, M-1, M-2, M-3, Strela, Minsk-1, Ural-1, Ural-2, Ural-3, M-20, Setun, BESM-2 and Hrazdan (Table 1.1). Their speed did not exceed 2-3 thousand op/s, and the RAM capacity was 2 K, or 2,048 machine words (1 K = 1024), 48 binary digits long.

Table 1.1. Characteristics of domestic computers

The table compared first- and second-generation domestic machines by addressing, machine word length (in binary digits), speed, and RAM element base (ferrite cores).

About half of the total volume of data in the world's information systems is stored on mainframe computers. For these purposes, IBM began producing the IBM/360 and IBM/370 computers back in the 1960s (Fig. 1.8), and they became widespread throughout the world.

With the advent of the first computers in 1950, the idea arose of using computer technology to control technological processes. Computer-based control makes it possible to keep process parameters in a mode close to optimal. As a result, the consumption of materials and energy is reduced, productivity and quality are increased, and rapid reconfiguration of equipment to produce a different type of product is ensured.


Fig. 1.8.

The pioneer of the industrial use of control computers abroad was Digital Equipment Corp. (DEC), which in 1963 released the specialized PDP-5 computer for controlling nuclear reactors. The initial data were measurements obtained by analog-to-digital conversion with an accuracy of 10-11 binary digits. In 1965, DEC released the first miniature computer, the PDP-8, the size of a refrigerator and costing 20 thousand dollars, whose element base included integrated circuits.

Before the advent of integrated circuits, transistors were manufactured individually and had to be connected and soldered by hand when the circuits were assembled. In 1958, American scientist Jack Kilby figured out how to create several transistors on one semiconductor wafer. In 1959, Robert Noyce (the future founder of Intel) invented a more advanced method that made it possible to create transistors and all the necessary connections between them on one plate. The resulting electronic circuits became known as integrated circuits, or chips. Subsequently, the number of transistors that could be placed per unit area of ​​the integrated circuit approximately doubled every year. In 1968, Burroughs released the first integrated circuit computer, and in 1970, Intel began selling memory integrated circuits.

In 1970, another step was taken on the path to the personal computer: Marcian Edward Hoff of Intel designed an integrated circuit similar in its functions to the central processor of a mainframe computer. This is how the first microprocessor, the Intel 4004, appeared; it went on sale at the end of 1970. Of course, the capabilities of the Intel 4004 were much more modest than those of a mainframe's central processor - it worked much more slowly and could process only 4 bits of information at a time (mainframe processors processed 16 or 32 bits at a time). In 1973, Intel released the 8-bit Intel 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which until the end of the 1970s was the standard for the microcomputer industry (Table 1.2).

Table 1.2. Generations of computers and their main characteristics

Generation: First; Second; Third; Fourth (since 1975).
Computer element base: electronic tubes and relays; transistors and parametrons; integrated circuits; ultra-large-scale ICs (VLSI).
CPU performance: up to 3·10^5 op/s; up to 3·10^6 op/s; up to 3·10^7 op/s; 3·10^7 op/s and above.
Type of random access memory (RAM): triggers and ferrite cores; miniature ferrite cores; semiconductor; semiconductor, more than 16 MB.
Characteristic types of computers of the generation: small, medium, large and special machines; mini- and microcomputers; supercomputers, PCs, special and general-purpose machines, computer networks.
Typical models of the generation: IBM 7090, BESM-6; IBM PC/XT/AT, PS/2, Cray, networks.
Characteristic software: codes, autocodes, assemblers; programming languages, dispatchers, automated control systems, process control systems; application program packages, DBMS, CAD, high-level languages, operating systems; databases, expert systems, parallel programming systems.

Generations of computers are determined by the element base (vacuum tubes, semiconductors, microcircuits of varying degrees of integration (Fig. 1.9)), the architecture and the computing capabilities (Table 1.3).

Table 1.3. Features of computer generations

Generation

Peculiarities

I generation (1946-1954)

Application of vacuum tube technology, use of memory systems on mercury delay lines, magnetic drums, cathode ray tubes. Punched tapes and punched cards, magnetic tapes and printing devices were used for data input and output

II generation (1955-1964)

Use of transistors. Computers have become more reliable and their performance has increased. With the advent of memory on magnetic cores, its operating cycle decreased to tens of microseconds. The main principle of the structure is centralization. High-performance devices for working with magnetic tapes and magnetic disk memory devices appeared

III generation (1965-1974)

Computers were designed on the basis of integrated circuits of a low degree of integration (small-scale ICs, 10 to 100 components per chip) and a medium degree of integration (medium-scale ICs, 10 to 1,000 components per chip). At the end of the 1960s, minicomputers appeared. In 1971, the first microprocessor appeared

IV generation (since 1975)

The use of large integrated circuits (LSI from 1000 to 100 thousand components per chip) and ultra-large integrated circuits (VLSI from 100 thousand to 10 million components per chip) when creating computers. The main emphasis when creating computers is on their “intelligence”, as well as on an architecture focused on knowledge processing


Fig. 1.9. Computer element base: a - vacuum tube; b - transistor; c - integrated circuit

The first microcomputer was the Altair-8800, created in 1975 by a small company in Albuquerque (New Mexico) based on the Intel-8080 microprocessor. At the end of 1975, Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to write programs quite simply.

Subsequently, the TRS-80, Commodore PET and Apple computers appeared (Fig. 1.10).

Fig. 1.10.

The domestic industry produced DEC-compatible computers (the DVK-1 to DVK-4 interactive computing systems based on the Elektronika MS-101, Elektronika 85 and Elektronika 32 computers) and IBM PC-compatible computers (EC 1840 - EC 1842, EC 1845, EC 1849, EC 1861, Iskra 4861), which were significantly inferior in their characteristics to those listed above.

Recently, personal computers produced by US companies have become widely known: Compaq Computer, Apple (Macintosh), Hewlett-Packard, Dell, DEC; by the UK companies Spectrum and Amstrad; by the French company Micra; by the Italian company Olivetti; and by the Japanese companies Toshiba, Panasonic and Partner.

Personal computers from IBM (International Business Machines Corporation) are currently the most popular.

In 1983, the IBM PC XT computer with a built-in hard drive appeared, and in 1985, the IBM PC AT computer based on the 16-bit Intel 80286 processor (Fig. 1.11).

In 1989, the Intel 80486 processor was developed with modifications 486SX, 486DX, 486DX2 and 486DX4. Clock frequencies of 486DX processors, depending on the model, are 33, 66 and 100 MHz.


IBM's new family of PC models is called PS/2 (Personal System 2). The first models of the PS/2 family used the Intel 80286 processor and actually copied the AT PC, but based on a different architecture.

In 1993, Pentium processors with clock frequencies of 60 and 66 MHz appeared.

In 1994, Intel began producing Pentium processors with clock frequencies of 75, 90 and 100 MHz. In 1996, the clock speed of Pentium processors increased to 150, 166 and 200 MHz (Fig. 1.12).


Fig. 1.12. Multimedia computer configuration (system unit, mouse-type manipulator and other devices)

In 1997, Intel released a new Pentium MMX processor with clock frequencies of 166 and 200 MHz. The abbreviation MMX meant that this processor was optimized for working with graphics and video information. In 1998, Intel announced the release of the Celeron processor with a clock frequency of 266 MHz.

Since 1998, Intel has offered a version of the Pentium® II Xeon™ processor with a clock frequency of 450 MHz (Table 1.4).

Table 1.4. IBM computers: computer model, CPU, clock frequency (MHz), RAM.

For a long time, processor manufacturers - primarily Intel and AMD - raised clock speeds to improve processor performance. However, at clock frequencies above 3.8 GHz the chips overheat and the benefits are lost. New ideas and technologies were required, and one of them was the idea of multi-core chips. In such a chip, two or more processors operate in parallel, providing greater performance at a lower clock frequency. The program being executed divides its data processing tasks between both cores. This gives the maximum effect when both the operating system and the application programs are designed for parallel operation, as, for example, in graphics processing.

Multi-core architecture is a variant of processor architecture that places two or more “execution,” or compute, Pentium® cores on a single processor. A multi-core processor is inserted into the processor socket, but the operating system treats each of its execution cores as a separate logical processor with all the corresponding execution resources (Figure 1.13).

This implementation of the internal processor architecture is based on the "divide and conquer" strategy. In other words, by dividing the computational work that in traditional microprocessors is performed by a single Pentium core among several Pentium execution cores, a multi-core processor can do more work in a given time interval. For this, the software must support distributing the load between several execution cores. This functionality is called thread-level parallelism, or threaded processing, and the applications and operating systems that support it (such as Microsoft Windows XP) are called multithreaded.

Fig. 1.13.

Multi-core also affects the simultaneous operation of standard applications. For example, one processor core may be responsible for a program running in the background, while an antivirus program takes up the resources of the second core. In practice, dual-core processors do not perform calculations twice as fast as single-core processors: although the performance gain is significant, it depends on the type of application.
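As a minimal sketch of the thread-level parallelism described above, the following Python fragment hands two independent, invented placeholder tasks to separate threads; on a multi-core processor a multithreaded operating system may schedule them on different cores. (Note that in standard CPython, CPU-bound threads do not actually run simultaneously because of the global interpreter lock; the sketch only illustrates how work is split into threads for the OS to schedule.)

import threading

def background_task(name, steps):
    # Stands in for a long-running job such as a scan or a recalculation.
    total = 0
    for i in range(steps):
        total += i
    print(name, "finished, result:", total)

t1 = threading.Thread(target=background_task, args=("scan", 1_000_000))
t2 = threading.Thread(target=background_task, args=("spreadsheet", 1_000_000))
t1.start(); t2.start()   # both tasks are started and run concurrently
t1.join(); t2.join()     # wait for both to finish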

The first dual-core processors appeared on the market in 2005. Over time, more and more successors appeared. Therefore, “old” dual-core processors have seriously fallen in price today. They can be found in computers starting at $600 and laptops starting at $900. Computers with modern dual-core chips cost about $100 more than models equipped with “older” chips. One of the main developers of multi-core processors is Intel Corporation.

Before the advent of dual-core chips, manufacturers offered single-core processors with the ability to run multiple programs in parallel. Some Pentium 4 series processors had a Hyper-Threading function, which returned a byte value containing the logical and physical identifiers of the current process. It can be seen as a predecessor of the dual-core architecture consisting of two optimized mobile execution cores. Dual-core means that while one core is busy running an application or, for example, checking for virus activity, the other core is available for other tasks - for example, the user can surf the Internet or work with a spreadsheet. Although the processor had one physical core, the chip was designed so that it could execute two programs simultaneously (Fig. 1.14).

Fig. 1.14. Scheme for using multiprocessing in a control panel: a single copy of the QNX Neutrino RTOS runs the command-line interface, routing, management/administration/maintenance, and hardware monitoring of the panel on cores 0 and 1.

The operating system recognizes such a chip as two separate processors. Conventional processors process 32 bits per clock cycle. The latest chips manage to process twice as much data in one clock cycle, i.e. 64 bits. This advantage is especially noticeable when processing large amounts of data (for example, when processing photographs). But in order to use it, the operating system and applications must support the 64-bit processing mode.

Specially designed 64-bit versions of Windows XP and Windows Vista can run both 32-bit and 64-bit programs, depending on what is needed.

Early counting aids and devices

Humanity learned to use the simplest counting devices thousands of years ago. The most common need was to determine the number of items used in barter trade. One of the simplest solutions was to use the weight equivalent of the item being exchanged, which did not require an exact count of its components. For these purposes the simplest balance scales were used, which thus became one of the first devices for the quantitative determination of mass.

The principle of equivalence was widely used in another simple counting device familiar to many, the counting frame, or abacus. The number of items counted corresponded to the number of beads of this instrument that were moved.

A relatively complex counting device could be the rosary, used in the practice of many religions. On the beads of the rosary, as on an abacus, the believer counted the prayers said, and on passing a full circle of the rosary moved special counter beads on a separate strand, indicating the number of circles counted.

With the invention of gear wheels, much more complex devices for performing calculations appeared. The Antikythera mechanism, discovered at the beginning of the 20th century at the site of the wreck of an ancient ship that sank around 65 BC (according to other sources, even 87 BC), could even simulate the movement of the planets. Presumably it was used for calendar calculations for religious purposes, for predicting solar and lunar eclipses, and for determining the time of sowing and harvesting. The calculations were performed by more than 30 interconnected bronze wheels and several dials; to calculate the lunar phases a differential gear was used, whose invention researchers long dated to no earlier than the 16th century. However, with the passing of antiquity the skills of creating such devices were forgotten; it took about one and a half thousand years for people to learn again how to create mechanisms of similar complexity.

"Counting Clocks" by Wilhelm Schickard

This was followed by machines by Blaise Pascal (Pascalina, 1642) and Gottfried Wilhelm Leibniz.

ANITA Mark VIII, 1961

In the Soviet Union at that time, the most famous and widespread calculator was the Felix mechanical adding machine, produced from 1929 to 1978 at factories in Kursk (Schetmash plant), Penza and Moscow.

The emergence of analog computers in the pre-war years

Main article: History of analog computing machines

Differential Analyzer, Cambridge, 1938

The first electromechanical digital computers

Z-series by Konrad Zuse

Reproduction of the Zuse Z1 computer in the Museum of Technology, Berlin

Zuse and his company built other computers, each of whose names began with the capital letter Z. The most famous machines were the Z11, sold to the optical industry and universities, and the Z22, the first computer with magnetic memory.

British Colossus

In October 1947, the directors of Lyons & Company, a British firm that owned a chain of shops and restaurants, decided to take an active part in the development of commercial computers. The LEO I computer went into operation in 1951 and was the first computer in the world to be regularly used for routine office work.

The Manchester University machine became the prototype for the Ferranti Mark I. The first such machine was delivered to the university in February 1951, and at least nine others were sold between 1951 and 1957.

The second-generation IBM 1401 computer, released in the early 1960s, captured about a third of the global computer market, with more than 10,000 of these machines sold.

The use of semiconductors improved not only the central processor but also the peripheral devices. The second generation of data storage devices made it possible to store tens of millions of characters and digits. A division appeared into fixed storage devices, connected to the processor by a high-speed data link, and removable devices. Replacing a disk pack in a removable drive took only a few seconds. Although the capacity of removable media was usually lower, their interchangeability made it possible to store an almost unlimited amount of data. Magnetic tape was commonly used for archiving data, because it provided more storage capacity at a lower cost.

In many second-generation machines, the functions of communicating with peripheral devices were delegated to specialized coprocessors. For example, while the peripheral processor reads or punches cards, the main processor performs calculations or program branches. One data bus carries data between memory and the processor during the instruction fetch and execution cycle, and other data buses typically serve the peripheral devices. On the PDP-1, a memory access cycle took 5 microseconds; most instructions required 10 microseconds: 5 to fetch the instruction and another 5 to fetch the operand.
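As purely illustrative arithmetic based on the timing figures quoted above, a 10-microsecond instruction works out to roughly 100,000 instructions per second:

# Illustrative arithmetic for the PDP-1 timing figures quoted above.
memory_cycle_us = 5                   # one memory access: 5 microseconds
instruction_us = 2 * memory_cycle_us  # instruction fetch + operand fetch
print(1_000_000 / instruction_us, "instructions per second")  # 100000.0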

The best domestic computer of the 2nd generation is considered to be BESM-6, created in 1966.

1960s onwards: third and subsequent generations

The rapid growth in the use of computers began with the so-called "3rd generation" of computers. This began with the invention of the integrated circuit, made independently by Nobel Prize winner Jack Kilby and by Robert Noyce. This later led to the invention of the microprocessor by Ted Hoff (Intel).

The advent of microprocessors led to the development of microcomputers, small, inexpensive computers that could be owned by small companies or individuals. Microcomputers, members of the fourth generation, first appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, one of the founders of Apple Computer, became known as the developer of the first mass-produced home computer, and later the first personal computer. Computers based on microcomputer architecture, with capabilities added from their larger cousins, now dominate most market segments.

In the USSR and Russia

1940s

In 1948, under the supervision of Doctor of Physical and Mathematical Sciences S. A. Lebedev, work began in Kyiv on the creation of a MESM (small electronic calculating machine). In October 1951 it came into operation.

At the end of 1948, I. S. Brook and B. I. Rameev, employees of the Krzhizhanovsky Energy Institute, received an author's certificate for a computer with a common bus, and in 1950-1951 they built it. This machine was the first in the world to use semiconductor (cuprox) diodes instead of vacuum tubes. From 1948, Brook worked on electronic digital computers and on control with the help of computer technology.

At the end of the 1950s, the principles of parallelism of calculations were developed (A.I. Kitov and others), on the basis of which one of the fastest computers of that time was built - the M-100 (for military purposes).

In July 1961, the USSR launched the first semiconductor universal control machine "Dnepr" (before that there were only specialized semiconductor machines). Even before the start of serial production, experiments were carried out with it on controlling complex technological processes at