Fifth generation of computers


The fifth generation computer, also known by its English acronym FGCS (Fifth Generation Computer Systems), was a project undertaken by Japan that began in 1981. Its goal was to develop a new class of computers that would use artificial intelligence techniques and technologies at both the hardware and software levels, using the PROLOG language at the machine-language level, and that would be able to solve complex problems such as machine translation from one natural language to another (Japanese to English, for example). The performance of these computers was measured in LIPS (Logical Inferences Per Second), the number of logical inferences they could perform while executing the programmed tasks. Their development relied on different types of VLSI (Very Large Scale Integration) architectures.
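
To make the LIPS metric concrete, the following minimal Python sketch (the facts, rule and repetition count are invented for illustration and have nothing to do with the project's actual machines) resolves a query over a few propositional Horn clauses by naive backward chaining, counting each attempt to resolve a goal as one logical inference and dividing by the elapsed time:

    import time

    # Tiny, invented propositional knowledge base: a goal is proved if it is a
    # known fact or if every subgoal of a rule for it can be proved.
    FACTS = {"parent(taro,hanako)", "parent(hanako,ichiro)"}
    RULES = {"grandparent(taro,ichiro)": ["parent(taro,hanako)", "parent(hanako,ichiro)"]}

    inferences = 0  # every attempt to resolve a goal counts as one logical inference

    def prove(goal):
        global inferences
        inferences += 1
        if goal in FACTS:
            return True
        body = RULES.get(goal)
        return body is not None and all(prove(sub) for sub in body)

    start = time.perf_counter()
    for _ in range(100_000):                  # repeat the query to get a measurable duration
        prove("grandparent(taro,ichiro)")
    elapsed = time.perf_counter() - start
    print(f"toy measurement: {inferences / elapsed:,.0f} LIPS")

The PSI-I's 30 KLIPS (see the hardware section below) roughly corresponds to 30,000 such resolution steps per second, over a far richer clause representation with variables and unification.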

The project lasted eleven years but did not achieve the expected results, and conventional computers continued to dominate. There are many cases in which a program either cannot be parallelized at all, or, once parallelized, shows no improvement or even a loss of performance. To write a parallel program one must first identify the parts of it that can be executed separately on different processors, and a program written to run sequentially needs numerous modifications before it can run in parallel; it is therefore worth asking beforehand whether the work this entails is really repaid by the improvement in performance after parallelization.
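
The trade-off described above is commonly summarized by Amdahl's law: if only a fraction p of a program can be parallelized, the speedup on n processors is bounded by 1 / ((1 - p) + p/n). The short Python sketch below (the fractions, processor counts and overhead value are arbitrary illustrative numbers, not figures from the project) shows how small that bound is when the serial part dominates, and how coordination overhead can even make the parallel version slower:

    def amdahl_speedup(p, n):
        """Upper bound on speedup when a fraction p of the work runs in parallel on n processors."""
        return 1.0 / ((1.0 - p) + p / n)

    # Illustrative values: with a 50% serial part, even 256 processors give less than 2x.
    for p in (0.5, 0.9, 0.99):
        for n in (2, 16, 256):
            print(f"parallel fraction {p:.2f}, {n:3d} processors -> speedup <= {amdahl_speedup(p, n):.2f}")

    def speedup_with_overhead(p, n, overhead):
        """Same bound when coordinating the processors adds a fixed fraction of extra work."""
        return 1.0 / ((1.0 - p) + p / n + overhead)

    # A hypothetical 60% coordination overhead makes the 'parallel' program slower than the original.
    print(f"with overhead: speedup = {speedup_with_overhead(0.5, 16, 0.60):.2f} (slower than sequential)")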

History and development of the project

Background and design

Throughout the successive computer generations since the 1950s, Japan had been a follower, advancing and building computers based on models developed in the United States and the United Kingdom. In the mid-1970s, Japan, through its Ministry of International Trade and Industry (MITI), decided to break with this pattern of following the leaders and began to forge its own path in the computer industry. The Japan Information Processing Development Center (JIPDEC) was commissioned to draw up a plan for the project. In 1979 a three-year contract was offered to carry out more in-depth studies, with the joint participation of technology companies and academic institutions, at the request of Hazime Hiroshi. It was during this period that the term "fifth generation computer" began to be used.

Launch

In 1982, at the initiative of MITI, an international conference was held at which Kazuhiro Fuchi announced the research program, and on April 14, 1982, the government decided to officially launch the project, creating the Institute for New Generation Computer Technology (ICOT) under the direction of Fuchi, who would later be succeeded as director of the institute by Tohru Moto-Oka, and with the participation of researchers from various Japanese hardware and software companies, including Fujitsu, NEC, Matsushita, Oki, Hitachi, Toshiba and Sharp.

The main fields for research in this project initially were:

  • Knowledge processing technologies.
  • Technologies for processing large-scale databases and knowledge bases.
  • High-performance workstations.
  • Distributed functional computing.
  • Supercomputers for scientific calculation.

Institutional and international impact

Because of the shock caused by Japanese successes in the electronics sector during the 1970s, and by their doing much the same in the automotive sector during the 1980s, the fifth generation project acquired a great deal of prestige in other countries.

Such was its impact that parallel projects were created. In the United States these were the Microelectronics and Computer Technology Corporation and the Strategic Computing Initiative; in the United Kingdom it was ALVEY, and in the rest of Europe the reaction was known as ESPRIT (European Strategic Programme for Research in Information Technology).

International popularity

Apart from these institutional reactions, at a more popular level the project became known in the West thanks to books that discussed or cited it more or less directly, but mainly through articles in magazines aimed at computer enthusiasts. For example, the August 1984 issue of the American magazine Creative Computing carried an extensive article on the subject, "The fifth generation: Japan's computer challenge to the world". In the Spanish-speaking world, the magazine MicroHobby published in July 1985 an interview with Juan Pazos Sierra, a PhD in Computer Science then linked to the Faculty of Computer Science of the University of Madrid, in which he briefly described the project as:

...a Japanese project with curious and special features; first, the aim is to build a computer based on VLSI technology, with a non-Von Neumann architecture, carrying logic programming, the PROLOG language, as its software core, in order finally to build Expert Systems on top of all this.

And regarding its potential results, he expressed a relatively optimistic opinion, in line with the predictions of the project's own promoters. When asked whether any results had yet been obtained, he replied:

Right now, nothing. Much is going to be developed, new technologies will appear, new systems, and research will be greatly boosted by the tremendous injection of money that the fifth generation project has meant for Artificial Intelligence.

For his part, Román Gubern, in his 1987 essay The Computerized Ape, considered that:

...the fifth generation computer is a real attempt at the technological duplication of the intellect of Homo sapiens.

Main events and completion of the project

  • 1981: the international conference that outlines and defines the objectives and methods of the project is held.
  • 1982: the project begins, funded in equal shares by industry and by the government.
  • 1985: the first hardware developed by the project, known as the Personal Sequential Inference machine (PSI), appears, together with the first version of its operating system, the Sequential Inference Machine Programming Operating System (SIMPOS). SIMPOS was programmed in Kernel Language 0 (KL0), a concurrent Prolog variant with extensions for object-oriented programming, and in the ESP metalanguage. Shortly after the PSI machines, the CHI machines (Co-operative High-performance Inference machine) were developed.
  • 1986: the Delta machine, based on relational databases, is delivered.
  • 1987: a first hardware prototype called the Parallel Inference Machine (PIM) is built using several networked PSI machines. The project receives funding for five more years. A new version of the proposed language, Kernel Language 1 (KL1), is developed; it is very similar to "Flat GDC" (Flat Guarded Definite Clauses), influenced by later developments in Prolog and oriented to parallel computing (a sketch of this guarded-clause, committed-choice style appears after this timeline). The SIMPOS operating system is rewritten in KL1 and renamed the Parallel Inference Machine Operating System, or PIMOS.
  • 1991: work on the PIM machines is completed.
  • 1992: the project is extended one year beyond the original plan, which ended this year.
  • 1993: the fifth generation computer project officially ends, although a new two-year project, the FGCS Follow-on Project, is launched to disseminate its results. The source code of the PIMOS operating system is released under a public domain license, and KL1 is ported to UNIX systems, resulting in KLIC (KL1 to C compiler).
  • 1995: all institutional initiatives linked to the project come to an end.
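
A characteristic feature of KL1 and the flat guarded-clause languages mentioned in the timeline is committed choice: each clause carries a guard, execution commits to the first clause whose guard succeeds, and there is no backtracking, which is what makes clause selection safe to attempt in parallel. The Python sketch below is only a loose analogy of that control regime (the clause encoding and the merge example are invented for illustration; this is not KL1 syntax or semantics):

    # Each "clause" is a (guard, body) pair of functions; guards are side-effect-free tests.
    # A committed-choice resolver tries the guards, commits to the first one that succeeds,
    # and never backtracks into the remaining alternatives.
    def committed_choice(clauses, *args):
        for guard, body in clauses:
            if guard(*args):           # in KL1 the guards could be tried in parallel
                return body(*args)     # commit: the other clauses are discarded for good
        raise ValueError("no clause applies (and failure is not backtracked into)")

    # Invented example: merging two sorted lists, in the spirit of the classic stream
    # 'merge' predicate of concurrent logic programming.
    MERGE = [
        (lambda xs, ys: not xs, lambda xs, ys: list(ys)),                  # merge([], Ys) -> Ys
        (lambda xs, ys: not ys, lambda xs, ys: list(xs)),                  # merge(Xs, []) -> Xs
        (lambda xs, ys: bool(xs and ys) and xs[0] <= ys[0],
         lambda xs, ys: [xs[0]] + committed_choice(MERGE, xs[1:], ys)),
        (lambda xs, ys: bool(xs and ys) and xs[0] > ys[0],
         lambda xs, ys: [ys[0]] + committed_choice(MERGE, xs, ys[1:])),
    ]

    print(committed_choice(MERGE, [1, 4, 6], [2, 3, 5]))                   # [1, 2, 3, 4, 5, 6]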

As one of the final products of the project, five Parallel Inference Machines (PIM) were built, called PIM/m, PIM/p, PIM/i, PIM/k and PIM/c, one of whose main features was 256 network-coupled processing elements. The project also produced tools that could be used with these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, the programming language Quixote (a hybrid of a deductive object-oriented database and a logic programming language), and the automatic theorem prover MGTP.

Eleven years after the start of the project, the large sums of money, infrastructure and resources invested in it had not yielded the expected results, and it was terminated without having met its objectives. William Zachman had criticized the project a year before its completion, arguing:

It hinders the development of AI applications; with AI, the system does not matter as long as there are no powerful inference mechanisms. There are already plenty of AI-type applications, and I am still waiting for the arrival of the powerful inference engine, which is why the fifth generation computer is a mistake.

The proposed hardware and its software developments found no place in the computing market, which had evolved since the project was launched and in which general-purpose systems could now handle most of the tasks set as the initial objectives of the fifth generation machines. Something similar had happened to the potential market for Lisp machines, where tools for building rule-based Expert Systems, such as CLIPS, implemented on ordinary computers, had made those expensive machines unnecessary and obsolete.

On the other hand, within the disputes between the different branches of artificial intelligence, the Japanese project started from the paradigm of logic programming and declarative programming, dominant after the 1969 publication by Marvin Minsky and Seymour Papert of the book Perceptrons, which gradually gave way to artificial neural network (ANN) approaches after the 1986 publication by McClelland and Rumelhart of the book Parallel Distributed Processing. Together with its scant results, this contributed to the fifth generation project falling into oblivion when it came to an end in 1993.

The Institute for New Generation Computer Technology (ICOT) was renamed in 1995 as the Research Institute for Advanced Information Technology (AITEC), a center that was closed in 2003, with all its resources passing to the Advanced IT Research Group (AITRG), under the Research Department of JIPDEC.

Hardware

First stage

Sequential machines PSI (Personal Sequential Inference machine) and CHI (Co-operative High-performance Inference machine):

  • PSI-I: 30 KLIPS (thousands of Logical Inferences Per Second)
  • PSI-II: PSI-I with a VLSI CPU
  • CHI-I: 285 KLIPS

Parallel machine PIM (Parallel Inference Machine):

  • PIM-D
  • PIM-R

Relational database machine:

  • DELTA

Second stage

Sequential machines:

  • PSI-III
  • CHI-II: 490 KLIPS

Parallel machines:

  • Multi-PSI

Third stage

Parallel machines:

  • PIM/p: 512 RISC microprocessors, 256 MB memory
  • PIM/m: 256 CISC microprocessors, 80 MB memory
  • PIM/c: 256 CISC microprocessors, 160 MB memory
  • PIM/k: 16 RISC microprocessors, 1 GB memory
  • PIM/i: 16 RISC microprocessors (LW type), 320 MB memory
