
MINISTRY OF EDUCATION AND SCIENCE OF THE RUSSIAN FEDERATION

State educational institution of higher professional education

Russian State Trade and Economic University

Ufa Institute (branch)

Faculty of Law and Distance Learning

Year 1 (5.5-year program)

Speciality 080507.65 “Organization management”

Department "Management of internal

and international trade»

Zhuravlev Sergey Vladimirovich

History of the development of computer technology. Brief historical background. Generations of computers. Prospects for the development of computer technology.

Test in the discipline "Informatics"

Approved for defense:

Head: Zakiryanov F.K._____________

(signature)

_________________

Defense assessment

_______________________________________

Date_________Signature__________

Introduction

The initial stage of development of computer technology

The beginning of the modern history of electronic computing

Generations of computers

Personal computers

What's ahead?

Conclusion

Bibliography

Introduction

The word “computer” means “calculator”, i.e., a device for calculations. The need to automate data processing, including calculations, arose long ago. More than 1,500 years ago, counting sticks, pebbles and the like were used for counting.

Nowadays it is difficult to imagine how one could make do without computers. But not so long ago, until the early 1970s, computers were available to a very limited circle of specialists, and their use, as a rule, remained shrouded in secrecy and little known to the general public. However, in 1971 an event occurred that radically changed the situation and, with fantastic speed, turned the computer into an everyday work tool for tens of millions of people. In that undoubtedly significant year, the almost unknown company Intel, from a small American town with the beautiful name of Santa Clara (California), released the first microprocessor. It is to the microprocessor that we owe the emergence of a new class of computing systems - personal computers, which are now used by essentially everyone, from primary-school students and accountants to scientists and engineers.

At the end of the 20th century, it is impossible to imagine life without a personal computer. The computer has firmly entered our lives, becoming man's main assistant. Today in the world there are many computers from different companies, different complexity groups, purposes and generations.

In this essay we will look at the history of the development of computer technology and give a brief review of the possibilities of modern computing systems and of further trends in the development of personal computers.

The initial stage of development of computer technology.

It all started with the idea of teaching a machine to count, or at least to add multi-digit integers. Around 1500, the great Renaissance figure Leonardo da Vinci developed a sketch of a 13-digit adding device, the earliest attempt to solve this problem that has come down to us. The first working summing machine was built in 1642 by Blaise Pascal, the famous French physicist, mathematician and engineer. His 8-digit machine has survived to this day.

Fig.1. Blaise Pascal (1623 – 1662) and his calculating machine

Almost 250 years passed between Pascal's machine, which contemporaries regarded as a wonderful curiosity, and the creation of a practically useful and widely used device - the adding machine (a mechanical calculating device capable of performing the four arithmetic operations). By the beginning of the 19th century, the level of development of a number of sciences and areas of practical activity (mathematics, mechanics, astronomy, engineering, navigation and others) was so high that they urgently demanded a huge volume of calculations, beyond the capabilities of a person unarmed with appropriate technology. Both outstanding, world-famous scientists and hundreds of people whose names have not come down to us devoted their lives to designing and improving mechanical computing devices.

Back in the 1970s, mechanical adding machines and their "closest relatives" equipped with an electric drive - electromechanical keyboard calculators - still stood on store shelves. As often happens, for quite a long time they coexisted, surprisingly, with technology of a completely different level - automatic digital computing machines (ADCMs), which in everyday speech are more often simply called computers (although, strictly speaking, these concepts do not quite coincide). The history of ADCMs dates back to the first half of the 19th century and is associated with the name of the remarkable English mathematician and engineer Charles Babbage. In 1822 he designed, and over almost 30 years built and improved, a machine first called the Difference Engine and then, after numerous improvements to the design, the Analytical Engine. The Analytical Engine embodied principles that became fundamental to computer technology.

1. Automatic execution of operations.

To perform large-scale calculations, it is important not only how quickly an individual arithmetic operation is performed, but also that there are no “gaps” between operations that require direct human intervention. For example, most modern calculators do not meet this requirement, although they perform every operation available to them very quickly. It is necessary that operations follow one after another without stopping.

2. Work according to the program entered “on the fly”.

To automatically perform operations, the program must be entered into the actuator at a speed commensurate with the speed of operations. Babbage proposed using punched cards, which by that time were used to control looms, to pre-record programs and enter them into the machine.

3. The need for a special device - memory - for storing data (Babbage called it a “warehouse”).

Fig. 2. Charles Babbage (1792 – 1871) and his “Analytical Engine”

These revolutionary ideas ran up against the impossibility of implementing them with the mechanical technology of the time: almost half a century remained before the appearance of the first electric motor, and almost a century before the first electronic vacuum tube! They were so far ahead of their time that they were largely forgotten and had to be rediscovered in the next century.

Automatic computing devices appeared only in the mid-20th century. This became possible through the use of electromechanical relays alongside mechanical structures. Work on relay machines began in the 1930s and continued with varying success until, in 1944, under the leadership of Howard Aiken, an American mathematician and physicist, the Mark-1 machine was put into operation at IBM (International Business Machines); it was the first to implement Babbage's ideas (although the developers were apparently not familiar with them). Mechanical elements (counting wheels) were used to represent numbers, and electromechanical elements were used for control. One of the most powerful relay machines, the RVM-1, was built in the early 1950s in the USSR under the leadership of N.I. Bessonov; it performed up to 20 multiplications per second on fairly long binary numbers.

However, relay machines appeared hopelessly late, and they were very quickly superseded by electronic ones, which were far more productive and reliable.

The beginning of the modern history of electronic computing

A true revolution in computing came with the use of electronic devices. Work on them began in the late 1930s simultaneously in the USA, Germany, Great Britain and the USSR. By that time, vacuum tubes, which became the technical basis of devices for processing and storing digital information, were already widely used in radio engineering.

The first operational electronic computer was ENIAC (USA, 1945 – 1946). Its name, formed from the first letters of the corresponding English words, stands for "Electronic Numerical Integrator and Computer". Its creation was led by John Mauchly and Presper Eckert, who continued the work of John Atanasoff begun in the late 1930s. The machine contained about 18 thousand vacuum tubes and many electromechanical elements. Its power consumption was 150 kW, quite enough to supply a small factory.

Almost simultaneously, work on creating computers was under way in Great Britain. It is associated above all with the name of Alan Turing, a mathematician who also made a great contribution to the theory of algorithms and coding theory. In 1944, the Colossus machine was put into operation in Great Britain.

These and a number of other first computers did not have the most important quality from the point of view of the designers of subsequent computers - the program was not stored in the machine’s memory, but was typed in a rather complex way using external switching devices.

One of the greatest American mathematicians, John von Neumann, made a huge contribution to the theory and practice of creating electronic computer technology at the initial stage of its development. “Von Neumann’s principles” have forever entered the history of science. The combination of these principles gave rise to the classical (von Neumann) computer architecture. One of the most important principles, the stored program principle, requires that the program be stored in the machine's memory in the same way that the original information is stored in it. The first stored program computer (EDSAC) was built in Great Britain in 1949.

Fig. 3. John von Neumann (1903-1957). Fig. 4. Sergei Alekseevich Lebedev (1902-1974)

In our country, up to the 1970s, computers were created almost entirely independently of the outside world (which itself was almost completely dependent on the United States). The fact is that from the very beginning electronic computer technology was considered a top-secret strategic product, and the USSR had to develop and produce it on its own. Gradually the secrecy regime was relaxed, but even at the end of the 1980s our country could buy only outdated computer models abroad (while the most modern and powerful computers are still developed and produced by the leading manufacturers, the USA and Japan, under conditions of secrecy).

The first domestic computer, MESM ("small electronic calculating machine"), was created in 1951 under the leadership of Sergei Alekseevich Lebedev, the greatest Soviet designer of computer technology, later an academician and laureate of state prizes, who led the creation of many domestic computers. The high point among them, and one of the best machines in the world for its time, was BESM-6 ("large electronic calculating machine, model 6"), created in the mid-1960s; for a long time it was the basic machine for defense, space research, and scientific and technical research in the USSR. Besides the machines of the BESM series, computers of other series were also produced - "Minsk", "Ural", M-20, "Mir" and others, created under the leadership of I.S. Bruk, M.A. Kartsev, B.I. Rameev, V.M. Glushkov, Yu.A. Bazilevsky and other domestic designers and theorists of computer science.

    As soon as man discovered the concept of "quantity", he immediately began to select tools that would optimize and facilitate counting. Today, super-powerful computers, based on the principles of mathematical calculation, process, store and transmit information - the most important resource and engine of human progress. It is not difficult to get an idea of how the development of computer technology took place by briefly considering the main stages of this process.

    The main stages of the development of computer technology

    The most popular classification proposes to highlight the main stages of the development of computer technology on a chronological basis:

    • Manual stage. It began at the dawn of the human era and continued until the middle of the 17th century. During this period the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus, counting boards, and later the slide rule) that made calculation by digits possible.
    • Mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of development of science during this period made it possible to create mechanical devices that perform basic arithmetic operations and automatically remember the highest digits.
    • The electromechanical stage is the shortest of the stages that make up the history of the development of computer technology: it lasted only about 60 years, from the invention of the first tabulator in 1887 until 1946, when the very first computer (ENIAC) appeared. The new machines, whose operation was based on an electric drive and electric relays, made it possible to perform calculations with much greater speed and accuracy, but the counting process still had to be controlled by a person.
    • The electronic stage began in the second half of the last century and continues today. This is the story of six generations of electronic computers - from the very first giant units, which were based on vacuum tubes, to the ultra-powerful modern supercomputers with a huge number of parallel working processors, capable of simultaneously executing many commands.

    The stages of development of computer technology are divided according to a chronological principle rather arbitrarily. At a time when some types of computers were in use, the prerequisites for the emergence of the following were actively being created.

    The very first counting devices

    The earliest counting tool known to the history of the development of computer technology is the ten fingers on human hands. Counting results were initially recorded using fingers, notches on wood and stone, special sticks, and knots.

    With the advent of writing, various ways of recording numbers appeared, and positional number systems were invented (decimal in India, sexagesimal in Babylon).

    Around the 4th century BC, the ancient Greeks began to count using an abacus. Initially, it was a clay flat tablet with stripes applied to it with a sharp object. Counting was carried out by placing small stones or other small objects on these stripes in a certain order.

    In China, in the 4th century AD, a seven-bead abacus appeared - the suanpan. Wires or ropes - nine or more - were stretched across a rectangular wooden frame. Another wire (rope), stretched perpendicular to the others, divided the suanpan into two unequal parts. In the larger compartment, called "earth", five beads were strung on each wire; in the smaller one, called "heaven", there were two. Each of the wires corresponded to a decimal place.

    The traditional soroban abacus became popular in Japan in the 16th century, having arrived there from China. Around the same time, the abacus appeared in Russia.

    In the 17th century, based on logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmond Gunter invented the slide rule. This device was constantly improved and has survived to this day. It allows you to multiply and divide numbers, raise to powers, determine logarithms and trigonometric functions.

    The slide rule became a device that completed the development of computer technology at the manual (pre-mechanical) stage.

    The first mechanical calculating devices

    In 1623, the German scientist Wilhelm Schickard created the first mechanical "calculator", which he called a counting clock. The mechanism of this device resembled an ordinary clock, consisting of gears and sprockets. However, this invention became known only in the middle of the last century.

    A quantum leap in the field of computing technology was the invention of the "Pascaline" adding machine in 1642. Its creator, the French mathematician Blaise Pascal, began work on this device before he was even 20 years old. The "Pascaline" was a mechanical device in the form of a box with a large number of interconnected gears. The numbers to be added were entered into the machine by turning special wheels.

    In 1673, the Saxon mathematician and philosopher Gottfried von Leibniz invented a machine that performed the four basic mathematical operations and could extract the square root. The principle of its operation was based on the binary number system, specially invented by the scientist.

    In 1818, the Frenchman Charles (Karl) Xavier Thomas de Colmar, taking Leibniz's ideas as a basis, invented an adding machine that could multiply and divide. Two years later, the Englishman Charles Babbage began constructing a machine capable of performing calculations to an accuracy of 20 decimal places. This project remained unfinished, but in 1830 its author developed another: an analytical engine for performing accurate scientific and technical calculations. The machine was to be controlled by a program, and punched cards with different arrangements of holes were to be used to input and output information. Babbage's project anticipated electronic computing technology and the problems that could be solved with its help.

    It is noteworthy that the fame of the world's first programmer belongs to a woman - Lady Ada Lovelace (nee Byron). It was she who created the first programs for Babbage's computer. One of the computer languages ​​was subsequently named after her.

    Development of the first computer analogues

    In 1887, the history of the development of computer technology reached a new stage. The American engineer Herman Hollerith managed to design the first electromechanical computing machine - the tabulator. Its mechanism included a relay, as well as counters and a special sorting box. The device read and sorted statistical records made on punched cards. Subsequently, the company founded by Hollerith became the backbone of the world-famous computer giant IBM.

    In 1930, the American Vannevar Bush created a differential analyzer. It was powered by electricity, and vacuum tubes were used to store data. This machine was capable of quickly finding solutions to complex mathematical problems.

    Six years later, the English scientist Alan Turing developed the concept of an abstract machine that became the theoretical basis for modern computers. It had all the main properties of modern computing technology: it could perform, step by step, operations programmed in its internal memory.

    A year after this, George Stibitz, a scientist from the United States, invented the country's first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra - the mathematical logic created in the mid-19th century by George Boole, which uses the logical operators AND, OR and NOT. Later, the binary adder would become an integral part of the digital computer.

    In 1938, Claude Shannon, then at the Massachusetts Institute of Technology, outlined the principles of the logical design of a computer that uses electrical circuits to solve Boolean algebra problems.

    The beginning of the computer era

    The governments of the countries involved in World War II were aware of the strategic role of computing in the conduct of military operations. This was the impetus for the development and parallel emergence of the first generation of computers in these countries.

    A pioneer in the field of computer engineering was Konrad Zuse, a German engineer. In 1941, he created the first computer controlled by a program. The machine, called the Z3, was built on telephone relays, and programs for it were encoded on perforated tape. This device was able to work in the binary system, as well as operate with floating point numbers.

    The next model of Zuse's machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

    In 1942, the American researchers John Atanasoff and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

    In 1943, in a British government laboratory, in an atmosphere of secrecy, the first computer of its kind, called "Colossus", was built. Instead of electromechanical relays, it used 2 thousand vacuum tubes for storing and processing information. It was intended to crack and decrypt the code of secret messages transmitted by the German Enigma cipher machine, which was widely used by the Wehrmacht. The existence of this device was kept in the strictest secrecy for a long time. After the end of the war, the order for its destruction was signed personally by Winston Churchill.

    Architecture development

    In 1945, John von Neumann (János Lajos Neumann), an American mathematician of Hungarian origin, created the prototype of the architecture of modern computers. He proposed writing the program in the form of code directly into the machine's memory, implying the joint storage of programs and data in the computer's memory.

    Von Neumann's architecture formed the basis of ENIAC, the first universal electronic computer, which was being created at that time in the United States. This giant weighed about 30 tons and occupied 170 square meters of floor space. The machine used 18 thousand vacuum tubes and could perform 300 multiplication operations or 5 thousand additions in one second.

    Europe's first universal programmable computer was created in 1950 in the Soviet Union (Ukraine). A group of Kyiv scientists, led by Sergei Alekseevich Lebedev, designed a small electronic calculating machine (MESM). Its speed was 50 operations per second, it contained about 6 thousand vacuum tubes.

    In 1952, domestic computer technology was replenished with BESM, a large electronic calculating machine, also developed under the leadership of Lebedev. This computer, which performed up to 10 thousand operations per second, was at that time the fastest in Europe. Information was entered into the machine's memory using punched paper tape, and data was output via photo printing.

    During the same period, a series of large computers under the general name "Strela" was produced in the USSR (the author of the development was Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal computer "Ural" began in Penza under the leadership of Bashir Rameev. The later models were hardware- and software-compatible with each other and offered a wide selection of peripheral devices, allowing machines of various configurations to be assembled.

    Transistors. Release of the first serial computers

    However, vacuum tubes failed very quickly, making it very difficult to work with the machine. The transistor, invented in 1947, solved this problem. Using the electrical properties of semiconductors, it performed the same tasks as vacuum tubes, but occupied much less space and consumed far less energy. Along with the advent of ferrite cores for organizing computer memory, the use of transistors made it possible to significantly reduce the size of machines and make them even more reliable and faster.

    In 1954, the American company Texas Instruments began mass-producing transistors, and two years later the first second-generation transistor-based computer, the TX-0, appeared in Massachusetts.

    In the middle of the last century, a significant part of government organizations and large companies used computers for scientific, financial, engineering calculations, and working with large amounts of data. Gradually, computers acquired features familiar to us today. During this period, plotters, printers, and storage media on magnetic disks and tape appeared.

    The active use of computer technology led to an expansion of its areas of application and required the creation of new software technologies. High-level programming languages appeared (Fortran, Cobol and others) that made it possible to transfer programs from one machine to another and simplified the process of writing code. Special translator programs appeared that converted code written in these languages into commands the machine could perceive directly.

    The emergence of integrated circuits

    In 1958-1960, thanks to the US engineers Robert Noyce and Jack Kilby, the world learned of the existence of integrated circuits. Miniature transistors and other components, sometimes up to hundreds or thousands of them, were mounted on a silicon or germanium crystal base. The chips, just over a centimeter in size, were much faster than discrete transistors and consumed much less power. The history of the development of computer technology links their appearance with the emergence of the third generation of computers.

    In 1964, IBM released the first computer of the SYSTEM 360 family, based on integrated circuits. The era of mass production of computers can be counted from this time. In total, more than 20 thousand copies of this computer were produced.

    In 1972, the USSR developed the ES (Unified System) computers. These were standardized complexes for the operation of computing centers that had a common command system. The American IBM 360 system was taken as the basis.

    Minicomputers also appeared: DEC's PDP-8, released in 1965, was the first commercially successful minicomputer project. The relatively low cost of minicomputers made it possible for small organizations to use them.

    During the same period, software was constantly improving. Operating systems were developed to support the maximum number of external devices, and new programs appeared. In 1964, BASIC was developed, a language designed specifically for training novice programmers. Five years later Pascal appeared, which proved very convenient for solving many applied problems.

    Personal computers

    After 1970, production of the fourth generation of computers began. The development of computer technology at this time is characterized by the introduction of large-scale integrated circuits into computer production. Such machines could now perform thousands of millions of computational operations per second, and their RAM capacity grew to 500 million bits. A significant reduction in the cost of microcomputers meant that the opportunity to buy them gradually became available to the average person.

    Apple was one of the first manufacturers of personal computers. Its founders, Steve Jobs and Steve Wozniak, designed their first PC model in 1976, calling it the Apple I. It cost only $500. A year later the company's next model, the Apple II, was presented.

    The computer of this time for the first time came to resemble a household appliance: in addition to its compact size, it had an elegant design and a user-friendly interface. The proliferation of personal computers at the end of the 1970s led to a marked fall in demand for mainframe computers. This seriously worried their leading manufacturer, IBM, and in 1979 the company set out to enter the PC market.

    Two years later, the company's first microcomputer with an open architecture appeared, based on Intel's 16-bit 8088 microprocessor. The computer was equipped with a monochrome display, two drives for five-inch floppy disks and 64 kilobytes of RAM. At the commissioning company's request, Microsoft specially developed an operating system for this machine. Numerous IBM PC clones appeared on the market, which stimulated the growth of industrial production of personal computers.

    In 1984, Apple developed and released a new computer - the Macintosh. Its operating system was extremely user-friendly: it presented commands in the form of graphic images and allowed them to be entered using a mouse. This made the computer even more accessible, since now no special skills were required from the user.

    Some sources date computers of the fifth generation of computing technology to 1992-2013. Briefly, their main concept is formulated as follows: these are computers created on the basis of highly complex microprocessors, having a parallel-vector structure, which makes it possible to simultaneously execute dozens of sequential commands embedded in the program. Machines with several hundred processors working in parallel make it possible to process data even more accurately and quickly, as well as create efficient networks.

    The development of modern computer technology already allows us to talk about sixth generation computers. These are electronic and optoelectronic computers running on tens of thousands of microprocessors, characterized by massive parallelism and modeling the architecture of neural biological systems, which allows them to successfully recognize complex images.

    Having examined all the stages of the development of computer technology in sequence, we should note an interesting fact: inventions that proved themselves well at each stage have survived to this day and continue to be used successfully.

    Classes of Computers

    There are various options for classifying computers.

    By purpose, computers are divided into the following groups:

    • universal - those that are capable of solving a wide variety of mathematical, economic, engineering, technical, scientific and other problems;
    • problem-oriented - solving problems of a narrower direction, associated, as a rule, with the management of certain processes (data recording, accumulation and processing of small amounts of information, performing calculations in accordance with simple algorithms). They have more limited software and hardware resources than the first group of computers;
    • specialized computers usually solve strictly defined tasks. They have a highly specialized structure and, with a relatively low complexity of the device and control, are quite reliable and productive in their field. These are, for example, controllers or adapters that control a number of devices, as well as programmable microprocessors.

    Based on size and productive capacity, modern electronic computing equipment is divided into:

    • ultra-large (supercomputers);
    • large computers;
    • small computers;
    • ultra-small (microcomputers).

    Thus, we have seen that devices, invented by man first to keep track of resources and valuables and then to carry out complex calculations and computational operations quickly and accurately, have constantly developed and improved.

    The history of the development of instrumental counting tools allows us to better understand the operation of modern computers. As Leibniz said: "Whoever wants to limit himself to the present without knowledge of the past will never understand the present." Therefore, studying the history of the development of computing technology is an important and integral part of computer science.

    Since ancient times, people have used various devices for counting. The first such "device" was our own fingers. A complete description of finger counting was compiled in medieval Europe by the English monk Bede the Venerable (7th century AD). Various finger-counting techniques were used right up to the 18th century.

    Knotted ropes were used as instrumental counting tools.

    The most widespread counting device in ancient times was the abacus, known since the 5th century BC. The numbers in it were represented by pebbles arranged in columns. In ancient Rome a pebble was called calculus, hence the words denoting counting (English "calculate").

    The abacus, widely used in Rus', is similar in principle to the abacus.

    The need to use various devices for counting was explained by the fact that written calculation was difficult: first, because of the complex systems for writing numbers; second, because few people knew how to write; and third, because writing materials (parchment) were very expensive. With the spread of Arabic numerals and of paper (12th-13th centuries), written calculation began to develop widely, and the abacus was no longer needed.

    The first device that mechanized counting in the sense familiar to us was the calculating machine built in 1642 by the French scientist Blaise Pascal. It contained a set of vertically positioned wheels with the digits 0-9 marked on them. When such a wheel made a full revolution, it engaged with the adjacent wheel and turned it by one division, carrying over from one digit to the next. Such a machine could add and subtract numbers and was used in the office of Pascal's father to calculate the amount of taxes collected.
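    As a rough, purely illustrative sketch (not a description of Pascal's actual mechanism), the behaviour of such digit wheels with carry can be imitated in a few lines of Python:

```python
# Decimal digit wheels with carry, in the spirit of the Pascaline.
# Each wheel holds a digit 0-9; a full revolution advances the next wheel by one.
# All names and the data layout here are illustrative assumptions.

def add_on_wheels(wheels, addend):
    """wheels: list of digits, least significant first; addend: non-negative integer."""
    digits = [int(d) for d in str(addend)][::-1]   # addend digits, least significant first
    carry = 0
    for i in range(max(len(wheels), len(digits))):
        if i >= len(wheels):
            wheels.append(0)                       # grow capacity if needed
        total = wheels[i] + (digits[i] if i < len(digits) else 0) + carry
        wheels[i] = total % 10                     # the wheel shows the remainder
        carry = total // 10                        # a full turn advances the next wheel
    if carry:
        wheels.append(carry)
    return wheels

wheels = [0] * 8                                   # an 8-digit machine, like Pascal's
add_on_wheels(wheels, 745)
add_on_wheels(wheels, 379)
print(int("".join(map(str, wheels[::-1]))))        # 1124
```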

    Various designs and even working models of mechanical calculating machines were created before Pascal's machine, but it was Pascal's machine that became widely known. Pascal took out a patent for his machine and sold several dozen examples; nobles and even kings took an interest in it - one of the machines, for example, was presented to Queen Christina of Sweden.

    In 1673, the German philosopher and mathematician Gottfried Leibniz created a mechanical calculating device that not only added and subtracted but also multiplied and divided. This machine became the basis of mass-produced calculating instruments - arithmometers. Production of mechanical calculating machines was established in the USA in 1887 and in Russia in 1894. But these machines were manual, that is, they required constant human participation: they did not automate counting, they only mechanized it.

    Of great importance in the history of computing are attempts to “force” technical devices to perform any actions without human intervention, automatically.

    Such mechanical automata, built on the basis of clockwork mechanisms, were highly developed in the 17th and 18th centuries. Particularly famous were the automata of the French mechanic Jacques de Vaucanson, among which was a flute player that looked like an ordinary person. But these were just toys.

    The introduction of automation into industrial production is associated with the name of the French engineer Jacquard, who invented a control device for the weaving loom based on punched cards - pieces of cardboard with holes. By punching holes in the cards in different ways, it was possible to produce fabrics with different weaves of threads on the same loom.

    The father of computer technology is considered to be the 19th-century English scientist Charles Babbage, who first attempted to build a calculating machine that worked according to a program. The machine was intended to help the British Admiralty in compiling nautical tables. Babbage believed that the machine should have a device for storing the numbers involved in the calculations ("memory"). At the same time, it should hold instructions about what to do with these numbers (the "stored program principle"). To perform operations on numbers, the machine had to have a special device, which Babbage called the "mill"; in modern computers it corresponds to the ALU. Numbers were to be entered into the machine manually and output to a printing device ("input/output devices"). Finally, there had to be a device controlling the operation of the whole machine (the "control unit"). Babbage's machine was mechanical and worked with numbers represented in the decimal system.

    Babbage's scientific ideas captivated Lady Ada Lovelace, the daughter of the famous English poet George Byron. She wrote programs by which the machine could perform complex mathematical calculations. Many of the concepts introduced by Ada Lovelace in describing those first programs in the world, in particular the concept of a "loop", are widely used by modern programmers.

    The next important step towards automation of calculations was made approximately 20 years after Babbage's death by the American Herman Hollerith, who invented an electromechanical machine for computing using punched cards. The machine was used to process census data. Punch cards were manually punched with holes based on responses to census questions; a sorting machine made it possible to distribute cards into groups depending on the location of the punched holes, and a tabulator counted the number of cards in each group. Thanks to this machine, the results of the 1890 United States Census were processed three times faster than the previous one.

    In 1944, in the USA, under the leadership of Howard Aiken, an electromechanical computer known as the "Mark-1" was built, followed by the "Mark-2". This machine was based on relays. Since relays have two stable states, and the idea of abandoning the decimal system had not yet occurred to the designers, numbers were represented in binary-coded decimal: each decimal digit was represented by four binary digits and stored in a group of four relays. The operating speed was about 4 operations per second. At the same time several more relay machines were created, including the Soviet relay computer RVM-1, designed in 1956 by Bessonov and operating successfully until 1966.

    The starting point of the computer era is usually taken to be February 15, 1946, when scientists at the University of Pennsylvania commissioned the world's first vacuum tube computer, the ENIAC. The first use of ENIAC was to solve problems for the top-secret atomic bomb project, and then it was used mainly for military purposes. ENIAC did not have a program stored in memory; “programming” was carried out by installing jumper wires between individual elements.

    Since 1944, John von Neumann has been involved in the development of computers. In 1946, his article was published, which formulated two of the most important principles underlying all modern computers: the use of the binary number system and the stored program principle.

    Computers also appeared in the USSR. In 1952, under the leadership of Academician Lebedev, BESM, the fastest computer in Europe, was created, and in 1953 production of the serial computer "Strela" began. Serial Soviet machines were at the level of the world's best models.

    The rapid development of computing technology had begun.

    The first computer using vacuum tubes (ENIAC) had about 20 thousand vacuum tubes, was located in a huge hall, consumed tens of kW of electricity and was very unreliable in operation - in fact, it only worked for short periods of time between repairs.

    Since then, the development of computing technology has come a long way. Several generations of computers are distinguished. A generation is understood as a certain stage in the development of the equipment, characterized by its parameters, the manufacturing technology of its components, and so on.

    1st generation – early 1950s (BESM, Strela, Ural). Based on vacuum tubes. High power consumption, low reliability, low performance (about 2,000 operations per second), small memory capacity (a few kilobytes); there were no means of organizing computing processes - the operator worked directly at the console.

    2nd generation – late 1950s (Minsk-2, Hrazdan, Nairi). Semiconductor elements and printed circuit boards; speeds of 50-60 thousand operations per second; external magnetic storage devices appeared, along with primitive operating systems and translators from algorithmic languages.

    3rd generation – mid-60s. Built on the basis of integrated circuits, standard electronic components were used; speed up to 1.5 million op/s; developed software tools appeared.

    4th generation – built on microprocessors. Computers became specialized, and different types appeared: supercomputers for solving very complex computational problems; mainframes for solving economic and accounting problems within an enterprise; PCs for individual use. PCs now occupy the predominant part of the computer market, and their capabilities are millions of times greater than those of the first computers.

    The first Altair 8800 PC appeared in 1975 at MITS, but its capabilities were very limited, and there was no fundamental change in the use of computers. The revolution in the PC industry was carried out by two other companies - IBM and Apple Computer, whose rivalry contributed to the rapid development of high technology, improving the technical and user qualities of PCs. As a result of this competition, the computer has become an integral part of everyday life.

    The history of Apple began in 1976, when Steven Jobs and Steven Wozniak (both in their early twenties) assembled their first PC in a garage in Los Altos, California. However, real success came to the company with the release of the Apple II computer, which was built around a Motorola microprocessor, resembled an ordinary household appliance in appearance, and was affordable for the average American.

    IBM dates back to 1914 and at first specialized in the production of office typewriters. In the 1950s, the company's head, Thomas Watson, reoriented it toward the production of large computers. In the PC area the company initially took a wait-and-see approach. Apple's wild success alarmed the giant, and the first IBM PC was created as quickly as possible and introduced in 1981. Using its enormous resources, the corporation literally flooded the market with its PCs, focusing on the most capacious area of their application - the business world. The IBM PC was based on the latest Intel microprocessor, which significantly expanded the capabilities of the new computer.

    To conquer the market, IBM pioneered the use of "open architecture" principles. The IBM PC was not manufactured as a single unit, but was assembled from individual modules. Any company could develop a device compatible with the IBM PC. This brought IBM enormous commercial success. But at the same time, many computers began to appear on the market - exact copies of the IBM PC - the so-called clones. The company responded to the appearance of “doubles” by sharply reducing prices and introducing new models.

    In response, Apple created the Apple Macintosh, which featured a mouse, a high-quality graphics display and, for the first time, a microphone and a sound generator. Most importantly, its software was convenient and easy to use. The Mac went on sale and enjoyed some success, but Apple failed to regain leadership in the PC market.

    In an effort to approach the ease of use of Apple computers, IBM stimulated the development of modern software. Microsoft's creation of the Windows 95 operating system played a huge role here.

    Since then, software has become ever more convenient and intuitive. PCs are being equipped with new devices and are turning from a tool for professional activity into "digital entertainment centers" that combine the functions of various household appliances.

    At all times, starting from antiquity, people have needed to count. At first they used their own fingers or pebbles for counting. However, even simple arithmetic operations with large numbers are difficult for the human brain. Therefore, already in ancient times the simplest counting instrument was invented - the abacus, which appeared more than 15 centuries ago in the Mediterranean countries. This prototype of the modern abacus was a set of beads strung on rods and was used by merchants.

    The abacus rods represent, in the arithmetic sense, decimal places. Each bead on the first rod has a value of 1, on the second rod 10, on the third rod 100, and so on. Until the 17th century the abacus remained practically the only counting instrument.

    In Russia, the so-called Russian abacus appeared in the 16th century. It is based on the decimal number system and allows arithmetic operations to be performed quickly (Fig. 6).

    Fig. 6. Abacus

    In 1614, mathematician John Napier invented logarithms.

    A logarithm is the exponent to which a number (the base of the logarithm) must be raised to obtain another given number. Napier's discovery was that any number can be expressed in this way, and that the sum of the logarithms of any two numbers is equal to the logarithm of the product of those numbers. This made it possible to reduce the operation of multiplication to the simpler operation of addition. Napier created tables of logarithms: to multiply two numbers, you look up their logarithms in the table, add them, and then find the number corresponding to this sum in the inverse table of antilogarithms. Based on these tables, in 1654 R. Bissacar and, independently, in 1657 S. Partridge developed the rectangular slide rule, the engineer's main calculating device until the middle of the 20th century (Fig. 7).
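    A small numerical check of this idea (ordinary Python floats stand in here for Napier's printed tables; the numbers chosen are arbitrary):

```python
import math

# Multiplication reduced to addition of logarithms, as on a slide rule:
# log10(a * b) = log10(a) + log10(b), so a * b = antilog(log10(a) + log10(b)).
a, b = 37.0, 54.0
log_sum = math.log10(a) + math.log10(b)   # "add the logarithms"
product = 10 ** log_sum                   # "look up the antilogarithm"
print(product, a * b)                     # both are 1998.0 (up to rounding)
```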

    Fig. 7. Slide rule

    In 1642, Blaise Pascal invented a mechanical adding machine using the decimal number system. Each decimal place was represented by a wheel with ten teeth, indicating the digits from 0 to 9. There were 8 wheels in total, that is, Pascal's machine was 8-digit.

    However, it was not the decimal number system that won out in digital computing, but the binary system. The main reason is that in nature there are many phenomena with two stable states - "on/off", "voltage present/no voltage", "false statement/true statement" - while phenomena with ten stable states are absent. Why, then, is the decimal system so widespread? Simply because a person has ten fingers on two hands, and they are convenient for simple mental counting. But in electronic computing it is much easier to use a binary number system, with only two stable states of its elements and simple addition and multiplication tables. In modern digital computers the binary system is used not only to record the numbers on which computations are to be performed, but also to record the commands for these computations and even entire programs of operations. In this case all calculations and operations are reduced in the computer to the simplest arithmetic operations on binary numbers.
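    As a brief illustration of this point (plain Python, not tied to any particular machine), the same quantities can be written with just the two digits 0 and 1, and their arithmetic does not change:

```python
# Decimal numbers and their binary representations.
for n in (5, 12, 1998):
    print(n, format(n, "b"))     # 101, 1100, 11111001110

# Binary addition is still ordinary addition; only the notation changes.
x, y = 0b1011, 0b0110            # 11 and 6 written in binary
print(format(x + y, "b"))        # 10001, i.e. 17
```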



    One of the first to take an interest in the binary system was the great German mathematician Gottfried Leibniz. In 1666, at the age of twenty, in his work "On the Art of Combinatorics", he developed a general method that allows any thought to be reduced to precise formal statements. This opened up the possibility of transferring logic (Leibniz called it the laws of thought) from the realm of words to the realm of mathematics, where the relations between objects and statements are defined precisely and definitely. Thus Leibniz became a founder of formal logic. He studied the binary number system and endowed it with a certain mystical meaning: he associated the number 1 with God and 0 with the void. From these two figures, in his opinion, everything arose, and with the help of these two digits any mathematical concept can be expressed. Leibniz was the first to suggest that the binary system could become a universal logical language.

    Leibniz dreamed of building a "universal science". He wanted to single out the simplest concepts, with whose help, according to definite rules, concepts of any complexity could be formulated. He dreamed of creating a universal language in which any thought could be written down in the form of mathematical formulas, and he thought about a machine that could derive theorems from axioms and about turning logical statements into arithmetical ones. In 1673 he created a new type of adding machine - a mechanical calculator that not only adds and subtracts numbers, but also multiplies, divides, raises to powers, and extracts square and cube roots. It used the binary number system.

    The universal logical language was created in 1847 by the English mathematician George Boole. He developed the propositional calculus, which was later named Boolean algebra in his honor. It is formal logic translated into the strict language of mathematics. The formulas of Boolean algebra are similar in appearance to the formulas of the algebra we know from school. However, this similarity is not only external but also internal: Boolean algebra is a fully fledged algebra, subject to the set of laws and rules adopted at its creation. It is a notation system applicable to any objects - numbers, letters and sentences. Using this system, you can encode any statements that need to be proved true or false, and then manipulate them like ordinary numbers in mathematics.

    George Boole (1815–1864) - English mathematician and logician, one of the founders of mathematical logic. Developed the algebra of logic (in the works “Mathematical Analysis of Logic” (1847) and “Study of the Laws of Thought” (1854)).

    The American mathematician Charles Peirce played a huge role in the spread of Boolean algebra and its development.

    Charles Peirce (1839–1914) was an American philosopher, logician, mathematician and natural scientist, known for his work on mathematical logic.

    The subject of consideration in the algebra of logic is the so-called statements, i.e. any statements that can be said to be either true or false: “Omsk is a city in Russia,” “15 is an even number.” The first statement is true, the second is false.

    Complex statements obtained from simple ones using the conjunctions AND, OR, IF...THEN and the negation NOT can also be true or false. Their truth depends only on the truth or falsity of the simple statements that form them, for example: "If it's not raining outside, then you can go for a walk." The main task of Boolean algebra is to study this dependence. It considers the logical operations that allow complex statements to be constructed from simple ones: negation (NOT), conjunction (AND), disjunction (OR) and others.
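    A minimal sketch of these operations in Python (the statements and their truth values are invented for the example):

```python
# Simple propositions; their truth values are assumptions for the example.
it_is_raining = False
walk_is_allowed = True

# "If it is not raining outside, then you can go for a walk."
# An implication A -> B is equivalent to (NOT A) OR B.
antecedent = not it_is_raining                 # "it is not raining outside"
consequent = walk_is_allowed                   # "you can go for a walk"
implication = (not antecedent) or consequent
print(implication)                             # True

# A small truth table for the basic operations.
for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", a and b, "OR:", a or b, "NOT a:", not a)
```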

    In 1804, J. Jacquard invented a weaving loom for producing fabrics with large patterns. The pattern was programmed using a whole deck of punched cards - rectangular cards made of cardboard. Information about the pattern was recorded on them by punching holes (perforations) arranged in a certain order. When the loom was operating, these punched cards were felt by special pins, and in this mechanical way the information needed to weave the programmed fabric pattern was read from them. Jacquard's loom was the prototype of the program-controlled machines created in the twentieth century.

    In 1820, Thomas de Colmar developed the first commercial adding machine capable of multiplying and dividing. Since the 19th century, adding machines have become widespread when performing complex calculations.

    In 1830, Charles Babbage set out to create a universal analytical engine that was to perform calculations without human intervention. For this purpose, programs pre-recorded on punched cards of thick paper, using holes made in them in a certain order, were to be introduced into it (the word "perforation" means "punching holes in paper or cardboard"). The principles of programming for Babbage's Analytical Engine were developed in 1843 by Ada Lovelace, the daughter of the poet Byron.


    Fig. 8. Charles Babbage


    Fig. 9. Ada Lovelace

    An analytical engine must be able to remember data and intermediate results of calculations, that is, to have a memory. The machine was to contain three main parts: a device for storing numbers set with gear wheels (the memory), a device for operating on numbers (the arithmetic unit), and a device for controlling the sequence of operations using punched cards (the program control device). Work on creating the Analytical Engine was never completed, but the ideas embodied in it helped to build the first computers in the 20th century (translated from English, this word means "calculator").

    In 1880 V.T. Odner in Russia created a mechanical adding machine with gear wheels, and in 1890 he launched its mass production. Subsequently, it was produced under the name “Felix” until the 50s of the 20th century (Fig. 11).


    Fig. 10. V.T. Odner


    Fig. 11. Mechanical adding machine "Felix"

    In 1888, Herman Hollerith (Fig. 12) created the first electromechanical calculating machine - the tabulator - in which information recorded on punched cards (Fig. 13) was read by electric current. This machine made it possible to reduce the counting time for the US Census several times over. Hollerith's invention was first used in the 11th American Census, in 1890. The work that 500 employees had previously taken as long as 7 years to complete was done by Hollerith and 43 assistants on 43 tabulators in one month.

    In 1896, Hollerith founded the Tabulating Machine Co. In 1911 this company merged with two other companies specializing in the automation of statistical data processing, and in 1924 it received its modern name, IBM (International Business Machines). It became an electronics corporation, one of the world's largest manufacturers of all types of computers and software and a provider of global information networks. The founder of IBM was Thomas Watson Sr., who headed the company in 1914, essentially created the IBM Corporation and led it for more than 40 years. From the mid-1950s IBM took a leading position in the world computer market. In 1981 the company created its first personal computer, which became the industry standard. By the mid-1980s IBM controlled about 60% of the world production of electronic computers.


    Fig. 12. Thomas Watson Sr.

    Fig. 13. Herman Hollerith

    At the end of the 19th century, punched tape was invented - paper or celluloid film, on which information was applied with a punch in the form of a set of holes.

    Wide punched paper tape was used in the Monotype, a typesetting machine invented by T. Lanston in 1892. The Monotype consisted of two independent devices: a keyboard and a casting apparatus. The keyboard served to compile a typesetting program on punched tape, and the casting machine produced the type, in accordance with the program previously compiled on the keyboard, from a special typographic alloy (type metal).

    Fig. 14. Punched card

    Fig. 15. Punched tapes

    The typesetter sat at the keyboard, looked at the text on the copy stand in front of him and pressed the appropriate keys. When one of the letter keys was struck, the needles of the punching mechanism used compressed air to punch a code combination of holes in the paper tape. This combination corresponded to the given letter, sign or space between them. After each keystroke the paper tape moved one step of 3 mm. Each horizontal row of holes on the punched tape corresponds to one letter, sign or space. The finished (punched) spool of punched tape was transferred to the casting machine, in which, also with the help of compressed air, the information encoded on it was read and a set of type was automatically cast. The Monotype is thus one of the first program-controlled machines in the history of technology. It belonged to the hot-metal typesetting machines and in time gave way first to phototypesetting and then to electronic typesetting.

Somewhat earlier than the monotype, in 1881, the pianola (or phonola) was invented - an instrument for automatically playing the piano. It also operated on compressed air. In a pianola, each key of an ordinary piano or grand piano has a corresponding hammer that strikes it. Together the hammers form a counter-keyboard that is attached over the piano keyboard. A wide paper punched tape wound on a roller is inserted into the pianola. The holes on the tape are made in advance while a pianist is playing - they are a kind of "score". When the pianola operates, the tape is rewound from one roller to another, and the information recorded on it is read by a pneumatic mechanism. The mechanism activates the hammers corresponding to the holes on the tape, causing them to strike the keys and reproduce the pianist's performance. Thus, the pianola was also a program-controlled machine. Thanks to the surviving punched tapes, it has been possible to restore and re-record with modern methods the playing of such remarkable pianists of the past as the composer A.N. Scriabin. The pianola was used by the famous composers and pianists Rubinstein, Paderewski and Busoni.

    Later, information was read from punched tape and punched cards using electrical contacts - metal brushes, which, when contacted with a hole, closed an electrical circuit. Then the brushes were replaced with photocells, and information reading became optical, contactless. This is how information was recorded and read in the first digital computers.

Logical operations, familiar from everyday reasoning, also underlie the circuits of computing machines.

Using one two-input OR element, two two-input AND elements and one NOT element, you can build the logic circuit of a binary half-adder, which adds two one-digit binary numbers according to the rules of binary arithmetic:

0 + 0 = 0;  0 + 1 = 1;  1 + 0 = 1;  1 + 1 = 0 with a carry of 1 into the next bit (the circuit produces this carry on a separate output).

However, such a circuit has no third input to which the carry signal from the previous bit of the sum could be applied. Therefore, the half-adder is used only in the least significant bit of a circuit for summing multi-bit binary numbers, where there can be no carry from a previous bit. A full binary adder adds two bits together with the carry signal coming from the addition in the previous binary bit.

    By connecting binary adders in a cascade, you can obtain a logical adder circuit for binary numbers with any number of digits.

With some modifications, these logic circuits are also used to subtract, multiply and divide binary numbers. They form the basis of the arithmetic units of modern computers.
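    To make these rules concrete, here is a small illustrative sketch in Python (the choice of language and the function names are mine, not part of the historical material): the half-adder and the full adder are built only from AND, OR and NOT, and full adders are then cascaded into a multi-bit ripple-carry adder.

    # Half-adder, full adder and a cascade of full adders,
    # modelled with the logic operations AND (&), OR (|), NOT (~).
    def xor(a, b):                      # a XOR b = (a OR b) AND NOT (a AND b)
        return (a | b) & ~(a & b) & 1

    def half_adder(a, b):
        """Add two one-bit numbers: returns (sum bit, carry bit)."""
        return xor(a, b), a & b

    def full_adder(a, b, carry_in):
        """Add two one-bit numbers plus the carry from the previous bit."""
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2

    def ripple_carry_add(x_bits, y_bits):
        """Cascade full adders; the bit lists are least significant bit first."""
        carry, result = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return result

    # 011 (3) + 011 (3) = 0110 (6); bits are listed starting with the lowest.
    print(ripple_carry_add([1, 1, 0], [1, 1, 0]))   # -> [0, 1, 1, 0]

    The least significant bit could in fact be handled by a half-adder alone, exactly as described above; a full adder with a zero carry input gives the same result.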

    In 1937, George Stibitz (Fig. 16) created a binary adder from ordinary electromechanical relays - a device capable of performing the operation of adding numbers in binary code. And today, the binary adder is still one of the main components of any computer, the basis of its arithmetic device.


Fig. 16. George Stibitz

    In 1937–1942 John Atanasoff (Fig. 17) created a model of the first computer that ran on vacuum tubes. It used the binary number system. Punched cards were used to enter data and output calculation results. Work on this machine was almost completed in 1942, but due to the war, further funding was stopped.


Fig. 17. John Atanasoff

In 1937, Konrad Zuse (Fig. 18) created his first computer, the Z1, based on electromechanical relays. The initial data were entered from a keyboard, and the results of the calculations were displayed on a panel of small lamps. In 1938, K. Zuse created the improved model Z2, into which programs were entered on punched tape made by punching holes in used 35 mm photographic film. In 1941, K. Zuse built the working computer Z3, and later the Z4, based on the binary number system. They were used for calculations in the design of aircraft and missiles. In 1942, Konrad Zuse and Helmut Schreyer conceived the idea of converting the Z3 from electromechanical relays to vacuum tubes. Such a machine was expected to work 1000 times faster, but it proved impossible to build - the war got in the way.


Fig. 18. Konrad Zuse

In 1943–1944, at one of IBM's plants, in collaboration with scientists from Harvard University led by Howard Aiken, the Mark-1 computer was created. It weighed about 35 tons. The Mark-1 was based on electromechanical relays and operated on numbers encoded on punched tape.

When creating it, the ideas laid down by Charles Babbage in his Analytical Engine were used. Unlike Stibitz and Zuse, Aiken did not appreciate the advantages of the binary number system and used the decimal system in his machine. The machine could handle numbers up to 23 digits long; multiplying two such numbers took 4 seconds. In 1947, the Mark-2 machine was created, which already used the binary number system. In it, addition and subtraction took on average 0.125 seconds, and multiplication 0.25 seconds.

The abstract science of the algebra of logic turned out to be close to practical life: it makes it possible to solve a wide variety of control problems.

    The input and output signals of electromagnetic relays, like statements in Boolean algebra, also take only two values. When the winding is de-energized, the input signal is 0, and when current is flowing through the winding, the input signal is 1. When the relay contact is open, the output signal is 0, and when the contact is closed, it is 1.

It was precisely this similarity between statements in Boolean algebra and the behavior of electromagnetic relays that was noticed by the famous physicist Paul Ehrenfest. As early as 1910, he proposed using Boolean algebra to describe the operation of relay circuits in telephone systems. According to another version, the idea of using Boolean algebra to describe electrical switching circuits belongs to Peirce. In 1936, Claude Shannon, the founder of modern information theory, combined the binary number system, mathematical logic and electrical circuits in his thesis.

It is convenient to describe the connections between electromagnetic relays in circuits using the logical operations NOT, AND, OR, REPEAT (YES), etc. For example, a series connection of relay contacts implements an AND operation, and a parallel connection of these contacts implements a logical OR operation. The operations AND, OR and NOT are performed in the same way in electronic circuits, where the role of the relays that close and open electrical circuits is played by contactless semiconductor elements - transistors, created in 1947–1948 by the American scientists J. Bardeen, W. Brattain and W. Shockley.
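    As a tiny illustration (an idealized model, not a description of real relay hardware), one can check by truth table that contacts in series behave like AND and contacts in parallel like OR; here 1 means "contact closed, current flows":

    # Idealized relay contacts: 1 = closed (current flows), 0 = open.
    def series(c1, c2):
        return c1 & c2        # current flows only if both contacts are closed

    def parallel(c1, c2):
        return c1 | c2        # current flows if at least one contact is closed

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "series (AND):", series(a, b), "parallel (OR):", parallel(a, b))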

Electromechanical relays were too slow, so already in 1943 the Americans began developing a computer based on vacuum tubes. In 1946, Presper Eckert and John Mauchly (Fig. 19) built the first electronic digital computer, ENIAC. It weighed 30 tons and occupied an area of 170 square meters. Instead of thousands of electromechanical relays, ENIAC contained 18,000 vacuum tubes. The machine counted in the decimal system and performed 5000 addition operations or 300 multiplication operations per second. Not only the arithmetic unit but also the storage unit of this machine was built on vacuum tubes. Numerical data were entered using punched cards, while programs were entered by means of plugs and patch panels: thousands of contacts had to be connected for each new program. As a result, it could take up to several days to prepare for solving a new problem, even though the problem itself was solved in a few minutes. This was one of the main shortcomings of the machine.


Fig. 19. Presper Eckert and John Mauchly

    The work of three outstanding scientists - Claude Shannon, Alan Turing and John von Neumann - became the basis for creating the structure of modern computers.

Claude Shannon (1916–2001) was an American engineer and mathematician, the founder of mathematical information theory.

In 1948, he published the work "A Mathematical Theory of Communication," containing his theory of the transmission and processing of information, which covered all types of messages, including those transmitted along nerve fibers in living organisms. Shannon introduced the concept of the amount of information as a measure of the uncertainty of the state of a system that is removed when information is received. He called this measure of uncertainty entropy, by analogy with the similar concept in statistical mechanics. When the observer receives information, the entropy, that is, the degree of his ignorance about the state of the system, decreases.

Alan Turing (1912–1954) was an English mathematician. His main works are on mathematical logic and computational mathematics. In 1936–1937 he wrote the seminal paper "On Computable Numbers," in which he introduced the concept of an abstract device later called the "Turing machine." In this device he anticipated the basic properties of the modern computer. Turing called his device a "universal machine," since it was supposed to be able to solve any admissible (theoretically solvable) mathematical or logical problem. Data are entered into it from a paper tape divided into cells. Each cell either contains a symbol or is empty. The Turing machine can process the symbols read from the tape and change them, that is, erase them and write new ones, according to instructions stored in its internal memory.
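    As a rough sketch of this idea (the rule table below is an arbitrary illustrative example, not a machine from Turing's paper), a Turing machine can be modelled as a tape of cells, a head position, a current state and a table of transitions:

    # Minimal Turing machine model: rules map (state, symbol) to
    # (new state, symbol to write, head movement L/R).
    def run_turing_machine(tape, rules, state="q0", blank="_"):
        cells = {i: s for i, s in enumerate(tape)}   # unbounded tape as a dictionary
        head = 0
        while state != "halt":
            symbol = cells.get(head, blank)
            if (state, symbol) not in rules:
                break                                # no rule: the machine stops
            state, write, move = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in range(min(cells), max(cells) + 1)).strip(blank)

    # Example rules: scan a binary word left to right, invert every digit, halt on blank.
    invert = {
        ("q0", "0"): ("q0", "1", "R"),
        ("q0", "1"): ("q0", "0", "R"),
        ("q0", "_"): ("halt", "_", "R"),
    }
    print(run_turing_machine("1011", invert))   # -> 0100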

John von Neumann (1903–1957) was an American mathematician and physicist, a participant in the development of atomic and hydrogen weapons. Born in Budapest, he lived in the USA from 1930. In his report, published in 1945 and regarded as one of the first works on digital electronic computers, he identified and described the "architecture" of the modern computer.

In the next machine, EDVAC, a more capacious internal memory was capable of storing not only the original data but also the program of the calculation. This idea - to store programs in the memory of the machine - was put forward by the mathematician John von Neumann together with Mauchly and Eckert. He was the first to describe the structure of a universal computer (the so-called "von Neumann architecture" of the modern computer). For versatile and efficient operation, according to von Neumann, a computer should contain a central arithmetic-logical unit, a central unit controlling all operations, a storage device (memory) and an input/output device, and programs should be stored in the computer's memory.

    Von Neumann believed that a computer should operate on the basis of the binary number system, be electronic, and perform all operations sequentially, one after another. These principles are the basis of all modern computers.
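    A toy sketch (a made-up instruction set, purely to illustrate the stored-program idea) shows how a control unit can fetch instructions one after another from the same memory in which the data are kept:

    # Program and data share one memory; instructions are executed sequentially.
    def run(memory):
        acc = 0                      # working register of the arithmetic-logical unit
        pc = 0                       # program counter: address of the next instruction
        while True:
            op, *args = memory[pc]   # fetch the instruction
            pc += 1
            if op == "LOAD":         # decode and execute it
                acc = memory[args[0]]
            elif op == "ADD":
                acc += memory[args[0]]
            elif op == "STORE":
                memory[args[0]] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program, cells 4-6 the data: compute memory[6] = memory[4] + memory[5].
    memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",), 2, 3, 0]
    print(run(memory)[6])            # -> 5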

    A machine using vacuum tubes worked much faster than one using electromechanical relays, but the vacuum tubes themselves were unreliable. They often failed. To replace them in 1947, John Bardeen, Walter Brattain and William Shockley proposed using the switching semiconductor elements they invented - transistors.

John Bardeen (1908–1991) was an American physicist, one of the creators of the first transistor (1956 Nobel Prize in Physics together with W. Brattain and W. Shockley for the discovery of the transistor effect) and one of the authors of the microscopic theory of superconductivity (a second Nobel Prize, in 1972, jointly with L. Cooper and J. Schrieffer).

Walter Brattain (1902–1987) was an American physicist, one of the creators of the first transistor, winner of the 1956 Nobel Prize in Physics.

    William Shockley (1910–1989) - American physicist, one of the creators of the first transistor, winner of the 1956 Nobel Prize in Physics.

    In modern computers, microscopic transistors in an integrated circuit chip are grouped into systems of “gates” that perform logical operations on binary numbers. For example, with their help, the binary adders described above were built, which allow adding multi-digit binary numbers, subtracting, multiplying, dividing and comparing numbers with each other. Logic gates, acting according to certain rules, control the movement of data and the execution of instructions in the computer.

    The improvement of the first types of computers led in 1951 to the creation of the UNIVAC computer, intended for commercial use. It became the first commercially produced computer.

    The serial tube computer IBM 701, which appeared in 1952, performed up to 2200 multiplication operations per second.


    IBM 701 computer

    The initiative to create this system belonged to Thomas Watson Jr. In 1937, he began working for the company as a traveling salesman. He only stopped working for IBM during the war, when he was a pilot in the United States Air Force. Returning to the company in 1946, he became its vice president and headed IBM from 1956 to 1971. While remaining a member of the IBM board of directors, Thomas Watson served as the United States Ambassador to the USSR from 1979 to 1981.


    Thomas Watson (Jr.)

In 1964, IBM announced the creation of six models of the IBM 360 family (System/360), which became the first third-generation computers. The models had a single instruction set and differed from each other in the amount of RAM and in performance. When creating the family, a number of new principles were used that made the machines universal: they could be used with equal efficiency both for solving problems in various fields of science and technology and for processing data in management and business. IBM System/360 (S/360) is a family of mainframe-class general-purpose computers. The IBM/360 was further developed in the 370, 390, z9 and zSeries systems. In the USSR, the IBM/360 was cloned under the name ES Computers (Unified System of Computers). The clones were software-compatible with their American prototypes, which made it possible to use Western software despite the underdevelopment of the domestic "programming industry."


    IBM/360 computer


T. Watson (Jr.) and V. Learson at the IBM/360 computer

The first computer in the USSR, the Small Electronic Calculating Machine (MESM), using vacuum tubes, was built in 1949–1951 under the leadership of Academician S.A. Lebedev. Independently of foreign scientists, S.A. Lebedev developed the principles of building a computer with a program stored in memory; MESM was the first such machine. In 1952–1954, under his leadership, the High-Speed Electronic Calculating Machine (BESM) was developed, performing 8,000 operations per second.


    Lebedev Sergey Alekseevich

The creation of electronic computers was led by the largest Soviet scientists and engineers I.S. Brook, V.M. Glushkov, Yu.A. Bazilevsky, B.I. Rameev, L.I. Gutenmacher and N.P. Brusentsov.

The first generation of Soviet computers included the tube computers "BESM-2", "Strela", "M-2", "M-3", "Minsk", "Ural-1", "Ural-2" and "M-20".

The second generation of Soviet computers includes the semiconductor small computers "Nairi" and "Mir"; the medium-sized computers for scientific calculations and information processing with a speed of 5–30 thousand operations per second "Minsk-2", "Minsk-22", "Minsk-32", "Ural-14", "Razdan-2", "Razdan-3", "BESM-4" and "M-220"; the control computers "Dnepr" and "VNIIEM-3"; and the ultra-high-speed BESM-6 with a performance of 1 million operations per second.

    The founders of Soviet microelectronics were scientists who emigrated from the USA to the USSR: F.G. Staros (Alfred Sarant) and I.V. Berg (Joel Barr). They became the initiators, organizers and managers of the microelectronics center in Zelenograd near Moscow.


    F.G. Staros

Third-generation computers based on integrated circuits appeared in the USSR in the second half of the 1960s. The Unified System of Computers (ES Computers) and the System of Small Computers (SM Computers) were developed and put into mass production. As mentioned above, the ES system was a clone of the American IBM/360 system.

Sergey Alekseevich Lebedev was an ardent opponent of the copying of the American IBM/360 system (known in its Soviet version as the ES Computers), which began in the 1970s. The role of the ES Computers in the development of domestic computers is ambiguous.

    At the initial stage, the emergence of ES computers led to the unification of computer systems, made it possible to establish initial programming standards and organize large-scale projects related to the implementation of programs.

The price of this was the widespread curtailment of original domestic developments and complete dependence on the ideas and concepts of IBM, which were far from the best at that time. The abrupt transition from easy-to-use Soviet machines to the much more complex hardware and software of the IBM/360 meant that many programmers had to overcome difficulties associated with the shortcomings and errors of the IBM developers. The initial models of the ES Computers were often inferior in performance to the domestic computers of that time.

At a later stage, especially in the 1980s, the widespread introduction of the ES Computers turned into a serious obstacle to the development of software, databases and interactive systems. After expensive and centrally planned purchases, enterprises were forced to operate obsolete computer systems. In parallel, systems were developed on small machines and on personal computers, which became more and more popular.

At a still later stage, with the beginning of perestroika, from 1988–89 the country was flooded with foreign personal computers, and no measures could stop the crisis of the ES Computer series. Domestic industry was unable to create analogues or substitutes for the ES Computers on a new element base. By that time the economy of the USSR could no longer afford to spend gigantic financial resources on creating a microelectronics industry, so there was a complete transition to imported computers. Programs for the development of domestic computers were finally curtailed, and problems arose of transferring technologies to modern computers, modernizing them, and employing and retraining hundreds of thousands of specialists.

S.A. Lebedev's forecast proved correct: both in the USA and throughout the world, developers subsequently followed the path he had proposed. On the one hand, supercomputers are created, and on the other, a whole range of less powerful computers aimed at various applications: personal, specialized and so on.

The fourth generation of Soviet computers was implemented on the basis of large-scale integrated circuits (LSI) and very-large-scale integrated circuits (VLSI).

    An example of large fourth-generation computer systems was the Elbrus-2 multiprocessor complex with a speed of up to 100 million operations per second.

In the 1950s, the second generation of computers, based on transistors, was created. As a result, the speed of the machines increased by a factor of 10, and their size and weight were significantly reduced. Storage devices on magnetic ferrite cores began to be used, capable of retaining information indefinitely even when the computer is switched off; they were designed by Jay Forrester in 1951–1953. Large amounts of information were stored on external media, such as magnetic tape or a magnetic drum.

The first hard disk drive in the history of computing was developed in 1956 by a group of IBM engineers led by Reynold B. Johnson. The device was called the 305 RAMAC (Random Access Method of Accounting and Control). The drive consisted of 50 aluminum disks with a diameter of 24 inches (about 60 cm) and a thickness of 2.5 cm each. A magnetic layer, onto which recording was carried out, was applied to the surface of each disk. This entire stack of disks on a common axis rotated in operating mode at a constant speed of 1200 rpm, and the drive itself occupied an area of 3 x 3.5 m. Its total capacity was 5 MB. One of the most important principles used in the design of the RAMAC 305 was that the heads did not touch the surface of the disks but hovered above it at a small fixed distance. For this purpose, special air nozzles directed a flow of air to the disk through small holes in the head holders and thereby created a gap between the head and the surface of the rotating platter.

The hard drive ("winchester") gave computer users the ability to store very large amounts of information and at the same time to retrieve the necessary data quickly. After the advent of the hard drive, magnetic tape gradually lost its role as the main working storage medium.

In 1959, J. Kilby, J. Hoerni, K. Lehovec and R. Noyce (Fig. 14) invented the integrated circuit (chip), in which all the electronic components, together with the conductors, were placed inside a single silicon wafer. The use of chips in computers made it possible to shorten the current paths during switching, and the speed of calculations increased tenfold. The dimensions of the machines also decreased significantly. The appearance of the chip made it possible to create the third generation of computers. In 1964, IBM began producing IBM-360 computers based on integrated circuits.


Fig. 14. J. Kilby, J. Hoerni, K. Lehovec and R. Noyce

In 1965, Douglas Engelbart (Fig. 19) created the first "mouse" - a hand-held computer manipulator. It was later used in Apple personal computers and became widely known with the Apple Macintosh, released in 1984.


Fig. 19. Douglas Engelbart

In 1971, IBM began producing the floppy disk, invented by Yoshiro Nakamatsu - a removable flexible magnetic disk for the long-term storage of information. Initially the floppy disk had a diameter of 8 inches and a capacity of 80 KB; later came the 5.25-inch format. The modern 3.5-inch, 1.44 MB floppy disk, first released by Sony in 1982, is housed in a hard plastic case.

In 1969, the creation of a defense computer network began in the United States - the progenitor of the modern worldwide Internet.

    In the 1970s, dot matrix printers were developed to print information output from computers.

    In 1971, Intel employee Edward Hoff (Fig. 20) created the first microprocessor, the 4004, by placing several integrated circuits on a single silicon chip. Although it was originally intended for use in calculators, it was essentially a complete microcomputer. This revolutionary invention radically changed the idea of ​​computers as bulky, ponderous monsters. The microprocessor made it possible to create fourth-generation computers that fit on the user's desk.


Fig. 20. Edward Hoff

    In the mid-1970s, attempts began to create a personal computer (PC), a computing machine intended for the private user.

    In 1974, Edward Roberts (Fig. 21) created the first personal computer, Altair, based on the Intel 8080 microprocessor (Fig. 22). But without software it was ineffective: after all, a private user does not have his own programmer “at hand” at home.


Fig. 21. Edward Roberts


Fig. 22. The first personal computer, Altair

In 1975, Bill Gates, then a student at Harvard University, and his friend Paul Allen (Fig. 23) learned about the creation of the Altair PC. They were the first to realize the urgent need to write software for personal computers, and within a month they created a BASIC interpreter for the Altair. That same year they founded Microsoft, which quickly became a leader in personal computer software and grew into one of the richest companies in the world.


Fig. 23. Bill Gates and Paul Allen


Fig. 24. Bill Gates

    In 1973, IBM developed a hard magnetic disk (hard drive) for a computer. This invention made it possible to create large-capacity long-term memory, which is retained when the computer is turned off.

    The first Altair-8800 microcomputers were just a collection of parts that still needed to be assembled. In addition, they were extremely inconvenient to use: they had neither a monitor, nor a keyboard, nor a mouse. Information was entered into them using switches on the front panel, and the results were displayed using LED indicators. Later they began to display results using a teletype - a telegraph machine with a keyboard.

In 1976, the 26-year-old engineer Steve Wozniak of Hewlett-Packard created a fundamentally new microcomputer. He was the first to use a typewriter-like keyboard for data entry and an ordinary TV set for displaying information. Symbols were displayed on its screen in 24 lines of 40 characters each. The computer had 8 KB of memory, half of which was occupied by the built-in BASIC language and half of which the user could use for his own programs. This was far superior to the Altair-8800, which had only 256 bytes of memory. S. Wozniak also provided his new computer with a connector (a so-called "slot") for attaching additional devices. Steve Wozniak's friend Steve Jobs was the first to understand and appreciate the prospects of this computer (Fig. 25). He proposed setting up a company for its serial production. On April 1, 1976, they founded the Apple company, officially registering it in January 1977. They called the new computer the Apple-I (Fig. 26). Within 10 months they managed to assemble and sell about 200 copies of the Apple-I.


Fig. 25. Steve Wozniak and Steve Jobs


Fig. 26. Apple-I personal computer

At this time Wozniak was already working on an improved version, which received the name Apple-II (Fig. 27). The computer was housed in a plastic case and received a graphics mode, sound, color, expanded memory and 8 expansion connectors (slots) instead of one. A cassette recorder was used to save programs. The basis of the first Apple II model was, as in the Apple I, the MOS Technology 6502 microprocessor with a clock frequency of 1 megahertz. BASIC was recorded in permanent memory. The 4 KB of RAM could be expanded to 48 KB. Information was displayed on a color or black-and-white TV operating in the NTSC standard used in the USA. In text mode, 24 lines of 40 characters each were displayed, and in graphics mode the resolution was 280 by 192 pixels (six colors). The main advantages of the Apple II were the ability to expand its RAM to 48 KB and the 8 connectors for attaching additional devices. Thanks to its color graphics, it could be used for a wide variety of games (Fig. 27).


Fig. 27. Apple II personal computer

    Thanks to its capabilities, the Apple II has gained popularity among people of various professions. Its users were not required to have knowledge of electronics or programming languages.

    The Apple II became the first truly personal computer for scientists, engineers, lawyers, businessmen, housewives and schoolchildren.

    In July 1978, the Apple II was supplemented with the Disk II drive, which significantly expanded its capabilities. The disk operating system Apple-DOS was created for it. And at the end of 1978, the computer was improved again and released under the name Apple II Plus. Now it could be used in the business sphere to store information, conduct business, and help in decision making. The creation of such application programs as text editors, organizers, and spreadsheets began.

    In 1979, Dan Bricklin and Bob Frankston created VisiCalc, the world's first spreadsheet. This tool was best suited for accounting calculations. Its first version was written for the Apple II, which was often purchased only to work with VisiCalc.

    Thus, in a few years, the microcomputer, largely thanks to Apple and its founders Steven Jobs and Steve Wozniak, turned into a personal computer for people of various professions.

    In 1981, the IBM PC personal computer appeared, which soon became the standard in the computer industry and displaced almost all competing personal computer models from the market. The only exception was Apple. In 1984, the Apple Macintosh was created, the first computer with a graphical interface controlled by a mouse. Thanks to its advantages, Apple managed to stay in the personal computer market. It has conquered the market in education and publishing, where the outstanding graphics capabilities of Macintoshes are used for layout and image processing.

Today, Apple controls 8–10% of the global personal computer market, and the remaining 90% is IBM-compatible personal computers. Most Macintosh computers are located in the United States.

    In 1979, the optical compact disc (CD) appeared, developed by Philips and intended only for listening to music recordings.

    In 1979, Intel developed the 8088 microprocessor for personal computers.

Personal computers of the IBM PC model, created in 1981 by a group of IBM engineers led by William C. Lowe, became widespread. The IBM PC had an Intel 8088 processor with a clock frequency of 4.77 MHz, 16 KB of memory expandable to 256 KB, and the DOS 1.0 operating system (Fig. 28). The DOS 1.0 operating system was created by Microsoft. In just one month, IBM managed to sell 241,683 IBM PCs. Under the agreement with Microsoft, IBM paid the creators of the program a certain amount for each copy of the operating system installed on an IBM PC. Thanks to the popularity of the IBM PC, Microsoft's heads Bill Gates and Paul Allen soon became billionaires, and Microsoft took a leading position in the software market.


Fig. 28. IBM PC personal computer

    The IBM PC applied the principle of open architecture, which made it possible to make improvements and additions to existing PC designs. This principle means the use of ready-made blocks and devices in the design when assembling a computer, as well as the standardization of methods for connecting computer devices.

    The principle of open architecture contributed to the widespread adoption of IBM PC-compatible clone microcomputers. A large number of companies around the world began assembling them from ready-made blocks and devices. Users, in turn, were able to independently upgrade their microcomputers and equip them with additional devices from hundreds of manufacturers.

    In the late 1990s, IBM PC-compatible computers accounted for 90% of the personal computer market.


    Over the last decades of the 20th century, computers have greatly increased their speed and the volume of information processed and stored.

In 1965, Gordon Moore, one of the founders of Intel Corporation, a leader in the field of computer integrated circuits ("chips"), suggested that the number of transistors in them would double every year. Over the next 10 years this prediction came true, and then he suggested that the number would double every 2 years. Indeed, the number of transistors in microprocessors has doubled roughly every 18 months. Computer technology specialists now call this trend Moore's law.
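    As a rough numerical illustration (the starting point of 2,300 transistors for the Intel 4004 of 1971 and the 18-month doubling period are taken only as example assumptions), the exponential growth described by Moore's law can be computed directly:

    # Expected transistor count assuming doubling every 18 months.
    def transistors(year, start_year=1971, start_count=2300, months_per_doubling=18):
        doublings = (year - start_year) * 12 / months_per_doubling
        return start_count * 2 ** doublings

    for year in (1971, 1981, 1991, 2001):
        print(year, round(transistors(year)))

    The absolute numbers are only indicative; what matters is that the count grows by roughly two orders of magnitude per decade.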


Fig. 29. Gordon Moore

A similar pattern is observed in the development and production of RAM and information storage devices. By the time this text is read, many of the figures quoted here for capacity and speed will no doubt already be out of date.

    The development of software, without which it is generally impossible to use a personal computer, and, above all, operating systems that ensure interaction between the user and the PC, has not lagged behind.

In 1981, Microsoft developed the MS-DOS operating system for IBM personal computers.

    In 1983, the improved personal computer IBM PC/XT from IBM was created.

    In the 1980s, black-and-white and color inkjet and laser printers were created to print information output from computers. They are significantly superior to dot matrix printers in terms of print quality and speed.

In 1983–1993, the global computer network Internet and e-mail took shape and came to be used by millions of people around the world.

In 1992, Microsoft released the Windows 3.1 operating system for IBM PC-compatible computers. A windowed operating system allows you to work with several documents at once and provides a so-called "graphical interface" - a system of interaction with the PC in which the user deals with "icons": pictures that he can control with the computer mouse. Such a graphical interface and window system was first created at the Xerox research center in 1975 and used in Apple PCs.

In 1995, Microsoft released the Windows 95 operating system for IBM PC-compatible computers, more advanced than Windows 3.1; in 1998 came its modification Windows 98, in 2000 - Windows 2000, and in 2001 - Windows XP. A number of application programs were developed for them: the Word text editor, the Excel spreadsheet, the Internet Explorer program for using the Internet and e-mail, the Paint graphics editor, standard applications (calculator, clock, dialer), the Microsoft Schedule diary, a universal media player and a laser disc player.

    In recent years it has become possible to combine text and graphics with sound and moving images on the personal computer. This technology is called “multimedia”. Optical CD-ROMs (Compact Disk Read Only Memory - i.e. read-only memory on a CD) are used as storage media in such multimedia computers. Outwardly, they do not differ from audio CDs used in players and music centers.

The capacity of one CD-ROM reaches 650 MB; in terms of capacity it occupies an intermediate position between floppy disks and a hard drive. A CD drive is used to read CDs. Information is written to such a CD only once, under industrial conditions; on a PC it can only be read. A wide variety of games, encyclopedias, art albums, maps, atlases, dictionaries and reference books are published on CD-ROM. All of them are equipped with convenient search tools that allow you to quickly find the material you need. The memory capacity of two CD-ROMs is enough to hold an encyclopedia larger in volume than the Great Soviet Encyclopedia.

    In the late 1990s, write-once CD-R and rewritable CD-RW optical compact discs and drives were created, allowing the user to make any audio and video recordings to their liking.

In 1990–2000, in addition to desktop personal computers, "laptop" PCs in the form of a portable case and even smaller pocket "palmtops" (handhelds) were released - as their name suggests, they fit in a pocket or on the palm of your hand. Laptops have a liquid-crystal display located in the hinged lid; in palmtops it is on the front panel of the case.

    In 1998–2000, miniature solid-state “flash memory” (without moving parts) was created. Thus, Memory Stick memory has the dimensions and weight of a piece of chewing gum, and SD memory from Panasonic has the size and weight of a postage stamp. Meanwhile, the volume of their memory, which can be stored indefinitely, is 64–128 MB and even 2–8 GB or more.

    In addition to portable personal computers, supercomputers are being created to solve complex problems in science and technology - weather and earthquake forecasts, rocket and aircraft calculations, nuclear reactions, deciphering the human genetic code. They use from several to several dozen microprocessors that perform parallel calculations. The first supercomputer was developed by Seymour Cray in 1976.

In 2002, the NEC Earth Simulator supercomputer was built in Japan, performing 35.6 trillion operations per second. At that time it was the fastest supercomputer in the world.


Fig. 30. Seymour Cray


Fig. 31. Cray-1 supercomputer


Fig. 32. Cray-2 supercomputer

    In 2005, IBM developed the Blue Gene supercomputer with a performance of over 30 trillion operations per second. It contains 12,000 processors and has a thousand times more power than the famous Deep Blue, with which world champion Garry Kasparov played chess in 1997. IBM and researchers from the Swiss Polytechnic Institute in Lausanne have attempted to model the human brain for the first time.

    In 2006, personal computers turned 25 years old. Let's see how they have changed over the years. The first of them, equipped with an Intel microprocessor, operated with a clock frequency of only 4.77 MHz and had 16 KB of RAM. Modern PCs equipped with a Pentium 4 microprocessor, created in 2001, have a clock frequency of 3–4 GHz, RAM 512 MB - 1 GB and long-term memory (hard drive) with a capacity of tens and hundreds of GB and even 1 terabyte. Such gigantic progress has not been observed in any branch of technology except digital computing. If the same progress had been made in increasing the speed of aircraft, then they would have been flying at the speed of light long ago.

    Millions of computers are used in almost all sectors of the economy, industry, science, technology, pedagogy, and medicine.

The main reasons for this progress are the unusually high rate of microminiaturization of digital electronic devices and the advances in programming that have made "communication" between ordinary users and personal computers simple and convenient.

The first device designed to make counting easier was the abacus. With the help of the abacus beads it was possible to perform addition and subtraction and simple multiplication.

    1642 - French mathematician Blaise Pascal designed the first mechanical adding machine, the Pascalina, which could mechanically perform the addition of numbers.

    1673 - Gottfried Wilhelm Leibniz designed an adding machine that could mechanically perform the four arithmetic operations.

    First half of the 19th century - English mathematician Charles Babbage tried to build a universal computing device, that is, a computer. Babbage called it the Analytical Engine. He determined that a computer must contain memory and be controlled by a program. According to Babbage, a computer is a mechanical device for which programs are set using punched cards - cards made of thick paper with information printed using holes (at that time they were already widely used in looms).

    1941 - German engineer Konrad Zuse built a small computer based on several electromechanical relays.

1943 - in the USA, at one of IBM's plants, Howard Aiken created the computer "Mark-1". It allowed calculations to be carried out hundreds of times faster than by hand (using an adding machine) and was used for military calculations. It used a combination of electrical signals and mechanical drives. The "Mark-1" was about 15 x 2.5 m in size and contained 750,000 parts. The machine was capable of multiplying two 32-bit numbers in 4 seconds.

1943 - in the USA, a group of specialists led by John Mauchly and Presper Eckert began to build the ENIAC computer based on vacuum tubes.

    1945 - mathematician John von Neumann was brought in to work on ENIAC and prepared a report on this computer. In his report, von Neumann formulated the general principles of the functioning of computers, i.e., universal computing devices. To this day, the vast majority of computers are made in accordance with the principles laid down by John von Neumann.

1947 - Eckert and Mauchly began development of the first electronic serial machine, UNIVAC (Universal Automatic Computer). The first model of the machine (UNIVAC-1) was built for the US Census Bureau and put into operation in the spring of 1951. The synchronous, sequential UNIVAC-1 computer was created on the basis of the ENIAC and EDVAC computers. It operated at a clock frequency of 2.25 MHz and contained about 5,000 vacuum tubes. Its internal storage, with a capacity of 1,000 numbers of 12 decimal digits each, was implemented on 100 mercury delay lines.

1949 - the English researcher Maurice Wilkes built the first computer to embody von Neumann's principles.

1951 - J. Forrester published an article on the use of magnetic cores for storing digital information. The Whirlwind-1 machine was the first to use magnetic-core memory, consisting of two cubes of 32 x 32 x 17 cores, which stored 2,048 words of 16-bit binary numbers with one parity bit.

1952 - IBM released its first industrial electronic computer, the IBM 701, a synchronous parallel computer containing 4,000 vacuum tubes and 12,000 diodes. The improved IBM 704 machine was distinguished by its high speed; it used index registers and represented data in floating-point form.

    After the IBM 704 computer, the IBM 709 was released, which in architectural terms was close to the machines of the second and third generations. In this machine, indirect addressing was used for the first time and input-output channels appeared for the first time.

1952 - Remington Rand released the UNIVAC 1103 computer, which was the first to use program interrupts. Remington Rand employees used an algebraic form of writing algorithms called "Short Code" (the first interpreter, created in 1949 by John Mauchly).

1956 - IBM developed floating magnetic heads on an air cushion. Their invention made it possible to create a new type of memory - disk storage devices, whose importance was fully appreciated in the subsequent decades of the development of computer technology. The first disk storage devices appeared in the IBM 305 and RAMAC machines. The latter had a stack of 50 metal disks with a magnetic coating, which rotated at a speed of 1,200 rpm. The surface of each disk contained 100 tracks for recording data, each holding 10,000 characters.

1956 - Ferranti released the Pegasus computer, in which the concept of general-purpose registers (GPRs) was first implemented. With the advent of GPRs, the distinction between index registers and accumulators was eliminated, and the programmer had at his disposal not one but several accumulator registers.

1957 - a group led by J. Backus completed work on the first high-level programming language, FORTRAN. The language, implemented for the first time on the IBM 704 computer, helped to expand the scope of computer applications.

1960s - 2nd generation of computers: logic elements were implemented on the basis of semiconductor transistor devices, and algorithmic programming languages such as Algol, Pascal and others were developed.

1970s - 3rd generation of computers: integrated circuits containing thousands of transistors on a single semiconductor wafer. Operating systems and structured programming languages began to be created.

    1974 - several companies announced the creation of a personal computer based on the Intel-8008 microprocessor - a device that performs the same functions as a large computer, but is designed for one user.

    1975 - the first commercially distributed personal computer Altair-8800 based on the Intel-8080 microprocessor appeared. This computer had only 256 bytes of RAM, and there was no keyboard or screen.

    Late 1975 - Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to simply communicate with the computer and easily write programs for it.

    August 1981 - IBM introduced the IBM PC personal computer. The main microprocessor of the computer was a 16-bit Intel-8088 microprocessor, which allowed working with 1 megabyte of memory.

    1980s - 4th generation of computers built on large integrated circuits. Microprocessors are implemented in the form of a single chip, mass production of personal computers.

    1990s — 5th generation of computers, ultra-large integrated circuits. Processors contain millions of transistors. The emergence of global computer networks for mass use.

    2000s — 6th generation of computers. Integration of computers and household appliances, embedded computers, development of network computing.