Computer PC History



A computer is a programmable information-processing system, in the sense defined by Turing: it works by sequentially reading a set of instructions, organized into programs, that make it perform logical and arithmetic operations. In its current physical form, all of its processing is based on binary logic and on numbers represented in binary. As soon as it is switched on, a computer executes, one after the other, instructions that make it read, manipulate, and then rewrite a set of data determined by a boot read-only memory. Conditional tests and jumps allow it to branch to an instruction other than the next one, and thus to behave differently depending on the data or on the needs of the moment or of the environment.
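To make this instruction-cycle description concrete, here is a minimal Python sketch of a toy sequential machine with a conditional jump. It is only an illustration under assumed conventions: the instruction names (LOAD, ADD, JUMP_IF_LT) and the memory model are invented for this example and do not correspond to any real instruction set.

```python
# Toy sketch: a program is a list of instructions executed one after the other,
# with a conditional jump that lets execution branch depending on the data.

def run(program, memory):
    pc = 0  # program counter: index of the next instruction to execute
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":          # memory[dst] = constant
            dst, value = args
            memory[dst] = value
        elif op == "ADD":         # memory[dst] = memory[a] + memory[b]
            dst, a, b = args
            memory[dst] = memory[a] + memory[b]
        elif op == "JUMP_IF_LT":  # branch to 'target' when memory[a] < memory[b]
            a, b, target = args
            if memory[a] < memory[b]:
                pc = target
                continue
        pc += 1                   # default: fall through to the next instruction
    return memory

# Example: repeatedly add 1 to cell "x" until it reaches the value in "limit".
program = [
    ("LOAD", "x", 0),
    ("LOAD", "one", 1),
    ("LOAD", "limit", 5),
    ("ADD", "x", "x", "one"),
    ("JUMP_IF_LT", "x", "limit", 3),
]
print(run(program, {}))  # {'x': 5, 'one': 1, 'limit': 5}
```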

The data to be manipulated is obtained either by reading memory or by reading information coming from internal or external devices (the movement of a mouse, a key pressed on a keyboard, a temperature, a speed, a compression…). Once processed, the results are written either to memory or to components that can convert a binary value into a physical action (printing on a printer or displaying on a monitor, accelerating or braking a vehicle, changing the temperature of an oven…). The computer can also respond to interrupts, which allow it to run specific handler programs and then resume the sequential execution of the interrupted program.
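As a rough illustration of the interrupt behavior just described, the execution loop can check for pending interrupts between instructions, run the matching handler, and then carry on where it left off. This is purely a sketch: the device names, the handler table, and the queue are assumptions made for the example, not a real hardware design.

```python
from collections import deque

pending_interrupts = deque()          # filled by (hypothetical) devices
handlers = {
    "keyboard": lambda: print("handle key press"),
    "timer":    lambda: print("update clock"),
}

def step(instruction):
    print(f"executing {instruction}")

def run(program):
    pc = 0
    while pc < len(program):
        # Service any pending interrupts before the next instruction,
        # then fall back to sequential execution of the program.
        while pending_interrupts:
            handlers[pending_interrupts.popleft()]()
        step(program[pc])
        pc += 1

pending_interrupts.append("timer")    # a device signals an event
run(["instr_0", "instr_1", "instr_2"])
```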

From 1834 to 1837, Charles Babbage designed a programmable calculating machine by combining a descendant of the Pascaline (the first mechanical calculating machine, invented by Blaise Pascal) with instructions written on the same type of punched cards that Jacquard had invented for his looms. It was during this period that he imagined most of the characteristics of the modern computer. Babbage spent the rest of his life trying to build his analytical engine, without success. Many people became interested in the machine and tried to develop it, but it was a hundred years later, in 1937, that IBM ushered in the era of computing by beginning the development of the ASCC/Mark I, a machine based on Babbage's architecture that, once built, would be regarded as the realization of his dream.

Computer technology as we know it dates back to the mid-20th century. Computers can be classified according to several criteria, such as application area, size, or architecture.

History

The most famous image from the early history of computing: this portrait of Jacquard, woven in silk on a Jacquard loom, required reading 24,000 cards of more than 1,000 holes each (1839). It was produced only on request. Charles Babbage often used it to explain his ideas about what was the first sketch of a computer, his analytical engine, which used Jacquard cards for its instructions and data.
First invention
Detailed article: History of computers.
According to Bernard Cohen, author of Howard Aiken: Portrait of a Computer Pioneer, "technology historians and computer scientists interested in history have adopted a number of characteristics that define a computer. Thus, the question of whether the Mark I was or was not a computer does not depend on a majority opinion but rather on the definition used." The basic features often required for a machine to be considered a computer are that:

it is electronic;
it is digital (rather than analog);
it is programmable;
it can perform the four basic operations (addition, subtraction, multiplication, division) and, often, extract a square root or address a table that contains one;
it can run programs stored in memory.
A machine is generally not classified as a computer unless it has additional capabilities, such as the ability to perform specific operations automatically, in a controlled manner, and in a predetermined sequence. For other historians and computer scientists, the machine must also have actually been built and been fully operational.

1930s
The late 1930s saw, for the first time in the history of computing, the start of construction of two programmable calculating machines. They used relays and were programmed by reading punched tape, and were therefore, for some, already computers. They did not enter service until the early 1940s, making the 1940s the first decade in which fully functional programmable computers could be found. In 1937, Howard Aiken, who had realized that Babbage's analytical engine was the type of calculating machine he wanted to develop, proposed to IBM that it be built; after a feasibility study, Thomas J. Watson agreed to build it in 1939. It was tested in 1943 on IBM's premises and was donated and moved to Harvard University in 1944, its name changing from ASCC to Harvard Mark I, or simply Mark I.

It was also in 1939 that Konrad Zuse began developing his Z3 in secret, finishing it in 1941. Because the Z3 remained unknown to the general public until after the end of the Second World War (except to the American secret services, which destroyed it in a bombing raid in 1943), its very inventive solutions played no part in the worldwide joint efforts to develop the computer.
