Computer

Personal computer, typical hardware view.
1: Monitor
2: Motherboard
3: Microprocessor or CPU
4: SATA ports
5: RAM
6: Expansion cards
7: Power supply
8: Solid-state drive
9: Hard drive
10: Keyboard
11: Mouse

A computer (in Spanish, computadora, computador, or ordenador) is a programmable digital electronic machine that executes a series of instructions to process input data, producing information that is subsequently sent to the output units. A computer is made up of numerous and diverse integrated circuits and various support elements, extensions, and accessories, which together can perform a wide variety of tasks very quickly and under the control of a program (software).

It is made up of two essential parts: the hardware, which is its physical structure (electronic circuits, cables, cabinet, keyboard, mouse, etc.), and the software, which is its intangible part (programs, data, information, documentation, etc.).

From a functional point of view, it is a machine that has at least a central processing unit (CPU), a memory unit, and an input/output unit (peripherals). The input peripherals allow data to be entered, the CPU is in charge of processing it (arithmetic-logical operations), and the output devices communicate the results to external media. Thus, the computer receives data, processes it, and outputs the resulting information, which can then be interpreted, stored, transmitted to another machine or device, or simply printed; all this at the discretion of an operator or user and under the control of a computer program.

The fact that it is programmable allows it to perform a wide variety of tasks on the basis of input data since it can perform operations and solve problems in various areas of human activity (management, science, design, engineering, medicine, communications, music, etc.).

Basically, the capacity of a computer depends on its hardware components, while the diversity of tasks lies mainly in the software that it can run and has installed.

Although this machine can be of two types, analog or digital, the first type is used for few and very specific purposes; the most widespread, used, and well-known is the digital computer (for general purposes), so much so that in general (even popular) terms, when one speaks of "the computer" one is referring to a digital computer. There are also machines of mixed architecture, called hybrid computers, which are likewise for special purposes.

In World War II, mechanical analog computers oriented toward military applications were used, and during the same period the first digital computer, called ENIAC, was developed; it occupied an enormous space and consumed large amounts of energy, equivalent to the consumption of hundreds of current computers (PCs). Modern computers are based on integrated circuits, are billions of times faster than the first machines, and occupy a small fraction of their space.

Simple computers are small enough to reside on mobile devices. Portable computers, such as tablets, netbooks, notebooks, ultrabooks, can be powered by small batteries. Personal computers in their various forms are icons of the so-called information age and are what most people think of as a "computer." However, embedded systems are also computers, and are found in many modern devices such as MP4 players, smartphones, fighter jets, toys, industrial robots, etc.

Etymology

Computadora

A "human computer" (a euphemism for support staff who performed long calculations) with a microscope and a mechanical calculator.

In the Spanish spoken in the Americas, as well as in Portuguese, German, and Dutch, terms derived from the English computer are used, which in turn comes from the Latin computare, 'to calculate'. From the same Latin root also arose computator (c. 1600), 'one who calculates', and computist (late 14th century), 'expert in calendrical or chronological computation'.

According to the Oxford English Dictionary, the first known use of the word computer in the English language is in the book The Yong Mans Gleanings (1613), by the writer Richard Braithwait, to refer to an arithmetician: «I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduced thy days into a short number». The term referred to a "human computer", a person who performed calculations or computations. Computer kept this meaning until the middle of the 20th century. Toward the end of this period, women were hired as computers because they could be paid less than their male colleagues. In 1943, the majority of human computers were women; they were referred to by the feminine form computress, which was eventually changed to programmer.

The Oxford English Dictionary records that, in the late 19th century, computer began to be used to mean "calculating machine". The modern use of the term for "programmable digital electronic computer" dates from 1945, based on the theoretical concept of a Turing machine published in 1937. ENIAC (1946), an acronym for "Electronic Numerical Integrator And Computer", is generally considered the first of its kind.

Ordenador

In the Spanish spoken in Spain, "ordenador" comes from the French term ordinateur, and this in turn from the Latin ordinator. The word ordinateur was introduced by IBM France in 1955, after François Girard, then head of the company's advertising department, had the idea of consulting his former literature professor in Paris, Jacques Perret. Together with Christian de Waldner, then president of IBM France, they asked Professor Perret to suggest a "French name for their new electronic information-processing machine (IBM 650), avoiding the literal translation of the English word computer ('calculateur' or 'calculatrice'), which at that time was reserved for scientific machines".

In 1911, a description of Babbage's Analytical Engine had used the word ordonnateur to describe its driving mechanism:

«Pour aller prendre et reporter les nombre... et pour les soumettre à l’opération demandée, il faut qu'il y ait dans la machine un organe spécial et variable: c'est l'ordonnateur. Cet ordonnateur est constitué simplement par des feuilles de carton ajourées, analogues à celle des métiers Jacquard...».
«To take and transfer the numbers... and to submit them to the required operation, there must be a special and variable organ in the machine: this is the ordonnateur. This ordonnateur is simply made up of perforated cardboard sheets, similar to those of Jacquard looms...».
Babbage machine manual (in French)

Perret proposed a compound word centered on ordonnateur, 'he who puts in order', which also carried the notion of ecclesiastical order in the Catholic Church (ordinant). He suggested, more precisely, ordinatrice électronique, so that the feminine form, in his view, could better distinguish the religious use from the accounting use of the word. IBM France retained the word ordinateur and initially tried to trademark the name, but since users adopted the word ordinateur quickly and easily, the company decided to leave it in the public domain.

In 1984, French academics, in the debate «Les jeunes, la technique et nous», argued that the use of the noun ordinateur is incorrect, because the function of the apparatus is to process data, not to give orders. Others, like the creator of the term, Jacques Perret, aware of the religious origin of the word, consider it the most correct; it has been argued that ordinateur has more to do with the function of putting in order than with giving orders, which better suits the modern function of these devices. The use of the word ordinateur has been exported to the languages of Spain: Aragonese, Asturian, Galician, Castilian, Catalan, and Basque.

History

Far from being the invention of any one person, the computer is the evolutionary result of ideas by many people related to areas such as electronics, mechanics, semiconductor materials, logic, algebra, and programming.

Timeline

Major milestones in the history of computing, from the first hand tools for calculating to modern pocket computers.

  • 500 B.C.: the abacus is used in ancient civilizations such as the Chinese and Sumerian; it is the first tool for performing additions and subtractions.
  • Around 830: the Persian mathematician and engineer Musa al-Khwarizmi developed the theory of the algorithm, that is, the methodical resolution of problems of algebra and numerical calculation through a well-defined, ordered, and finite list of operations.
  • 1614: the Scotsman John Napier invents the Napierian logarithm, which simplified the calculation of multiplications and divisions by reducing it to a calculation with additions and subtractions.
  • 1620: the Englishman Edmund Gunter invents the slide rule, a manual instrument used from then until the appearance of the electronic calculator to perform arithmetic operations.
  • 1623: German Wilhelm Schickard invents the first calculating machine, whose prototype disappeared shortly afterwards.
  • 1642: the French scientist and philosopher Blaise Pascal invents an adding machine (the Pascaline), which used toothed wheels, and of which some original specimens are still preserved.
  • 1671: German philosopher and mathematician Gottfried Wilhelm Leibniz invents a machine capable of multiplying and dividing.
  • 1801: the Frenchman Joseph Jacquard invents for his loom a punched card that controls the machine's operating pattern, an idea that would later be used by the first computers.
  • 1833: the British mathematician and inventor Charles Babbage designs and attempts to build the first computer, mechanical in operation, which he called the "analytical engine". However, the technology of his time was not advanced enough to make his idea a reality.
  • 1841: Ada Lovelace begins working alongside Charles Babbage on what would be the first algorithm intended to be processed by a machine, for which she is considered the first computer programmer.
  • 1890: American Herman Hollerith invents a tabulator using some of Babbage's ideas, which was used to draw up the U.S. census. Hollerith later founded the company that would later become IBM.
  • 1893: the Swiss scientist Otto Steiger develops the first automatic calculator to be manufactured and used on an industrial scale, known as the Millionaire.
  • 1936: the English mathematician and computer scientist Alan Turing formalizes the concepts of algorithm and Turing machine, which would be key to the development of modern computing.
  • 1938: the German engineer Konrad Zuse completes the Z1, the first machine that can be considered a computer. Electromechanical in operation and using relays, it was programmable (via punched tape) and used the binary system and Boolean logic. It would be followed by the improved models Z2, Z3, and Z4.
  • 1944: in the United States, IBM builds the electromechanical computer Harvard Mark I, designed by a team headed by Howard H. Aiken. It was the first computer created in the United States.
  • 1944: Colossus computers are built in England (Colossus Mark I and Colossus Mark 2), with the aim of deciphering the communications of the Germans during the Second World War.
  • 1946: at the University of Pennsylvania the ENIAC (Electronic Numerical Integrator And Computer) becomes operational; it ran on vacuum tubes and was the first general-purpose electronic computer.
  • 1947: in the Bell Laboratories, John Bardeen, Walter Houser Brattain and William Shockley invent the transistor.
  • 1950: Kathleen Booth creates assembly language, making it possible to perform operations on the computer without changing the connection cables, using punched cards instead (a program or operation stored for use whenever needed); these cards were prone to damage, and because of this, programming languages began to be developed at the end of this year.
  • 1951: the EDVAC, conceived by John von Neumann, begins to operate; unlike ENIAC it was not decimal but binary, and it had the first program designed to be stored.
  • 1953: IBM manufactures its first computer on an industrial scale, the IBM 650. The use of assembly language for computer programming expands. Computers with transistors replace vacuum-tube machines, marking the beginning of the second generation of computers.
  • 1957: Jack S. Kilby builds the first integrated circuit.
  • 1964: the appearance of the IBM 360 marks the beginning of the third generation of computers, in which printed circuit boards with multiple discrete components are replaced by boards based on integrated circuits.
  • 1965: Olivetti launches the Programma 101, the first desktop computer.
  • 1971: Nicolet Instruments Corp. launches the Nicolet 1080, a scientific computer based on 20-bit registers.
  • 1971: Intel presents the first commercial microprocessor on a single chip: the Intel 4004, designed by Federico Faggin and Marcian Hoff.
  • 1975: Bill Gates and Paul Allen found Microsoft.
  • 1976: Steve Jobs, Steve Wozniak, and Mike Markkula found Apple.
  • 1977: Apple presents the first personal computer sold on a large scale, the Apple II, developed by Steve Jobs and Steve Wozniak.
  • 1981: the IBM PC is launched on the market; it would become a commercial success, mark a revolution in the field of personal computing, and define new standards.
  • 1982: Microsoft presents its MS-DOS operating system, commissioned by IBM.
  • 1983: ARPANET is separated from the military network that originated it, moving to civilian use and thus becoming the origin of the Internet.
  • 1983: Richard Stallman publicly announces the GNU project.
  • 1985: Microsoft presents the Windows 1.0 operating system.
  • 1990: Tim Berners-Lee uses hypertext to create the World Wide Web (WWW), a new way to interact with the Internet.
  • 1991: Linus Torvalds began to develop Linux, an operating system compatible with Unix.
  • 2000: pocket computers, the first PDAs, appear at the beginning of the 21st century.
  • 2007: Apple presents the first iPhone, a smartphone.

Components


The technologies used in digital computers have evolved greatly since the appearance of the first models in the 1940s, although most still use the Von Neumann architecture, published by John von Neumann in the 1940s and attributed by other authors to John Presper Eckert and John William Mauchly.

The Von Neumann architecture describes a computer with four main sections: the arithmetic logic unit, the control unit, the primary (main or central) memory, and the input and output (I/O) devices. These parts are interconnected by conductive channels called buses.

Central Processing Unit

The central processing unit (CPU, from the English Central Processing Unit) basically consists of the following three elements:

A typical schematic symbol for an ALU: A and B are the operands; R is the output; F is the input from the control unit; D is an output status.
  • The arithmetic logic unit (ALU: Arithmetic Logic Unit) is the device designed and built to carry out elementary operations such as arithmetic operations (addition, subtraction) and logical operations (AND, OR, XOR, inversions, shifts, and rotations).
  • The control unit (CU) tracks the address of the memory location holding the instruction that the computer will execute at that moment; it retrieves the instruction and places the data in the ALU for the operation it must perform, then transfers the result to the corresponding memory locations. Once this occurs, the control unit moves on to the next instruction, either the one physically following it (via the program counter) or another one (via a jump instruction). A minimal sketch of this fetch-execute cycle appears after this list.
  • The registers: non-accessible (instruction register, data bus and address bus registers) and accessible, either of specific use (program counter, stack pointer, accumulator, flags, etc.) or of general use.
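As an illustration of how these three elements cooperate, the following is a minimal sketch in Python of a toy accumulator machine (a hypothetical instruction set invented for this example, not any real processor): the program counter selects an instruction in memory, the control logic decodes it, the ALU performs the arithmetic, and the result is held in the accumulator register.

# Minimal sketch of the fetch-decode-execute cycle of a toy accumulator machine.
# The instruction set (LOAD, ADD, STORE, JUMP, HALT) is invented for illustration.

def run(memory):
    pc = 0            # program counter: address of the next instruction
    acc = 0           # accumulator register: holds the ALU result
    while True:
        opcode, operand = memory[pc]         # fetch the instruction at address pc
        pc += 1                              # by default, continue with the next address
        if opcode == "LOAD":
            acc = memory[operand][1]         # copy a value from memory into the accumulator
        elif opcode == "ADD":
            acc += memory[operand][1]        # the ALU adds a memory value to the accumulator
        elif opcode == "STORE":
            memory[operand] = ("DATA", acc)  # write the accumulator back into memory
        elif opcode == "JUMP":
            pc = operand                     # a jump instruction overrides the program counter
        elif opcode == "HALT":
            return acc

# Program that computes 2 + 3 and stores the result in cell 7.
program = [
    ("LOAD", 5),   # acc <- memory[5]  (the value 2)
    ("ADD", 6),    # acc <- acc + memory[6]  (the value 3)
    ("STORE", 7),  # memory[7] <- acc
    ("HALT", 0),
    ("DATA", 0),   # cell 4: unused
    ("DATA", 2),   # cell 5
    ("DATA", 3),   # cell 6
    ("DATA", 0),   # cell 7: will hold the result
]
print(run(program))  # prints 5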

Primary memory

Main memory, known as random-access memory (RAM: Random-Access Memory), is a set of storage cells organized so that each memory address can be accessed numerically. Each cell corresponds to a bit, the minimum unit of information, and the cells are accessed in sequences of 8 bits (bytes). An instruction is a specific operational action, a sequence that tells the ALU which operation to perform (addition, subtraction, logical operations, etc.). The bytes of main memory store both the data and the operation codes (opcodes) needed to carry out the instructions. The memory capacity is given by the number of cells it contains, measured in bytes or multiples thereof. The technologies used to manufacture memories have changed greatly: from the electromechanical relays of the first computers, through mercury delay-line tubes in which acoustic pulses were stored, permanent magnet arrays, and individual transistors, to the current integrated circuits with millions of cells on a single chip. They are subdivided into static memories (SRAM), with six integrated transistors per bit, and the much more widely used dynamic memories (DRAM), with one transistor and one capacitor per bit. RAM can be rewritten several million times, unlike ROM, which can be written only once.
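To make the idea of byte addressing concrete, here is a small, deliberately simplified sketch in Python (not a model of any real memory chip): memory is an array of cells of one byte each, a numeric address selects the cell to read or write, and the capacity is simply the number of cells.

# Illustrative model of byte-addressable RAM: each cell holds one byte (0-255)
# and is selected by its numeric address.

class RAM:
    def __init__(self, size_in_bytes):
        self.cells = bytearray(size_in_bytes)   # every cell starts at zero

    def write(self, address, value):
        self.cells[address] = value & 0xFF      # store a single byte at the given address

    def read(self, address):
        return self.cells[address]              # return the byte stored at that address

ram = RAM(64 * 1024)         # 64 KiB of memory = 65,536 addressable cells
ram.write(0x1000, 0xAB)      # write the byte 0xAB at address 0x1000
print(ram.read(0x1000))      # 171 (0xAB in decimal)
print(len(ram.cells))        # 65536: capacity in bytes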

Input, output or I/O peripherals

Input devices allow the entry of data and information, while output devices are responsible for externalizing the information processed by the computer. There are also peripherals that serve as both input and output. For example, a typical input device is the keyboard, a typical output device is the monitor, and a typical input/output device is the hard drive. There is a very wide range of I/O devices, such as the keyboard, monitor, printer, mouse, floppy drive, webcam, etc.

Desktop computer

Buses

The three basic units in a computer (the CPU, the memory, and the I/O element) communicate with each other through buses, or communication channels:

  • Address bus: selects the address of the data or of the peripheral to be accessed.
  • Control bus: controls the internal and external operation of the CPU.
  • Data bus: carries the information (data) that circulates through the system.

Other data and concepts

In modern computers, a user has the impression that a computer can run several programs "at the same time"; this is known as multitasking. In reality, the CPU executes instructions from one program and, after a short time, switches execution to a second program and executes some of its instructions. Since this process is very fast, it creates the illusion that several programs are running simultaneously; in reality, CPU time is divided among the programs, one at a time. The operating system controls this distribution of time. Truly simultaneous processing is done on computers that have more than one CPU, giving rise to multiprocessing.
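A deliberately simplified way to picture this time sharing (a sketch for illustration, not how a real operating system scheduler is implemented) is a round-robin loop in Python that runs one step of each "program" in turn, so that their outputs appear interleaved as if they were running at the same time.

# Simplified round-robin time slicing: each "program" is a generator that yields
# control back to the scheduler after every step, and the scheduler alternates
# between them, creating the illusion of simultaneous execution.

def program(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        yield                       # hand the CPU back to the scheduler

def round_robin(programs):
    ready = list(programs)          # queue of programs waiting for CPU time
    while ready:
        current = ready.pop(0)      # take the next program in the queue
        try:
            next(current)           # run it for one time slice
            ready.append(current)   # not finished: put it back at the end of the queue
        except StopIteration:
            pass                    # finished: remove it from the queue

round_robin([program("A", 3), program("B", 3)])
# Output interleaves A and B: "A: step 0", "B: step 0", "A: step 1", "B: step 1", ...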

The operating system is the program that manages and administers all the resources of the computer: it controls, for example, which programs are executed and when, manages the memory and the accesses to the I/O devices, and provides the interfaces between devices and between the computer and the user.

Currently some widely used programs are usually included in the operating system distributions; such as Internet browsers, word processors, email programs, network interfaces, movie players, and other programs that previously had to be purchased and installed separately.

Early large and expensive digital computers were used primarily for scientific calculations. ENIAC was created with the purpose of solving the ballistics problems of the United States Army. CSIRAC, Australia's first computer, made it possible to assess rainfall patterns for a large hydroelectric generation project.

With the commercial manufacture of computers, governments and companies systematized many of their data collection and processing tasks, which were previously performed manually. In the academic world, scientists from all fields began to use computers to do their analysis and calculations; the continued decline in the prices of these devices allowed their use by increasingly smaller companies. Businesses, organizations, and governments began to use large numbers of small computers to perform tasks that were previously done by large, expensive mainframe computers.

With the invention of the microprocessor in the early 1970s, it became possible to make computers ever cheaper. First the microcomputer was born and then the personal computer appeared; the latter became popular for carrying out routine tasks such as writing and printing documents, calculating probabilities, performing analysis and calculation with spreadsheets, and communicating via email and the Internet. The wide availability of computers and their easy adaptation to the needs of each person have led to their use for a great variety of tasks, covering the most diverse fields of application.

At the same time, small fixed programming computers (embedded systems) began to find their way into applications for the home, automobiles, airplanes, and industrial machinery. These embedded processors made it easier to control the behavior of the devices, allowing the development of more complex control functions, such as anti-lock braking systems (ABS). At the beginning of the 21st century, most electrical appliances, almost all types of electric transportation, and most lines of production of the factories work with a computer.

PC with touch interface.

Around the end of the 20th century and the beginning of the 21st, personal computers came to be used both for research and for entertainment (video games), while mainframes are used for complex mathematical calculations, technology, modeling, astronomy, medicine, etc.

As a result of the crossover between the concept of the personal computer and that of the so-called «supercomputers», the workstation arose. This term, originally used for equipment and machines for digital sound recording and processing, now refers to workstations, which are high-capacity computing systems, normally dedicated to scientific calculation tasks or real-time processes. A workstation is, in essence, a personal work computer with high computing, performance, and storage capacity, superior to that of conventional personal computers.
