Computer Architecture - COMPUTER GENERATIONS

Foundations

Computer Architecture - the user's view

Computer Organization - the designer's view, with more details



Computer Architecture

- software aspects (instructions, data types, operating systems, translators)

- hardware aspects (signals, chips, functions and timing of elements, registers, memories, peripherals)

Hardware - solid objects: electronic circuits along with the memory and input/output devices, integrated circuits, printed circuit boards, cables, power supplies, and terminals - rather than abstract ideas, algorithms or instructions.

Software - algorithms (detailed instructions telling how to do something) and their computer representations (programs). Programs can be represented on punched cards, magnetic tape, disks, or photographic film, but the essence of software is the set of instructions that make up the programs, not the physical media on which they are recorded.

Firmware - an intermediate form between hardware and software which consists of software embedded in electronic devices during their manufacture (microprograms, or permanent program in ROM for toys or appliances).

HARDWARE AND SOFTWARE ARE LOGICALLY EQUIVALENT !!

Any operation performed by software can also be built directly into the hardware and any instruction executed by the hardware can also be simulated in software. The decision to put certain functions in hardware and others in software is based on cost, speed, reliability, and frequency of expected changes. As the role of software becomes more important, software becomes more complex.
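As a small illustration of this equivalence, here is a minimal Python sketch (an assumed example, not taken from the source): multiplication, which most CPUs provide as a hardware instruction, simulated in software using nothing but addition and a loop.

    def multiply(a: int, b: int) -> int:
        """Simulate a hardware multiply instruction using only addition (non-negative b)."""
        result = 0
        for _ in range(b):          # repeat the addition b times
            result += a
        return result

    print(multiply(6, 7))           # 42, the same result a hardware MUL would give

The same function could equally be wired directly into hardware as a dedicated multiplier; the choice between the two is exactly the cost/speed/reliability trade-off described above.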

However, computers can work without software and have only hardware (a connectionist's view; neural networks). Computers cannot work without hardware, although software seems to become more and more important.

THIS COURSE WILL DEAL BOTH WITH HARDWARE AND SOFTWARE FOR COMPUTERS.

THIS COURSE WILL DEAL WITH ADVANCED COMPUTER ARCHITECTURES (MOSTLY)

Computer Architecture - conventional (up to the 4th generation), advanced (5th and 6th generation).

Conventional - deals with lower-level concepts such as instruction set design, addressing modes, the ALU, the control unit, memories and input/output.

Advanced - deals with higher-level concepts such as programming styles, the type of control (data- or demand-driven), parallelization of compilers and operating systems, and multiprocessor topologies, and only then with low-level details.

COMPUTER GENERATIONS

Zeroth Generation - Mechanical Computers (1642 - 1945)

The first mechanical adder/subtracter was built by Blaise Pascal in 1642 (the programming language Pascal was named in his honor).

The first mechanical calculator with multiplication and division was built by Wilhelm Leibniz (known from calculus and integrals).

Charles Babbage, a mathematician at Cambridge University in the 19th century, invented two computation machines: the difference engine (analog), which could solve equations, and the analytical engine (digital, with the key parts of computers: an input device, a processor, a storage place and an output device).

Ada Lovelace was the first programmer (the programming language Ada was named in her honor).

1930 - 1944: Konrad Zuse built a series of automatic calculating machines using electromagnetic relays.

In the USA, John Atanasoff at Iowa State College (the ABC machine) and George Stibitz at Bell Labs (1942) designed the first electronic calculators.

In 1944 Howard Aiken at Harvard built the electromechanical computers Mark I and Mark II.

First Generation - Vacuum Tubes (1945 - 1955)

In 1943, the first electronic digital computer, COLOSSUS, was built in the U.K. to break the codes produced by the German encrypting device ENIGMA. The British government kept every aspect of the project a military secret for 30 years, so COLOSSUS did not have an influence on other computers. Alan Turing helped design COLOSSUS.

In 1946 the ENIAC (Electronic Numerical Integrator And Calculator) was built by John Mauchly and his graduate student J. Presper Eckert at the University of Pennsylvania to calculate range tables for heavy artillery. Mauchly, a professor of physics, knew Atanasoff's work as well as Stibitz's, hence the later controversy about who was the first to build an electronic computer. ENIAC is important because it started modern computer history.

ENIAC consisted of 18,000 vacuum tubes and 1,500 relays. It weighed 30 tons and consumed 140 kilowatts of power. Architecturally, the machine had 20 registers, each capable of holding a 10-digit decimal number. It was programmed by setting up 6,000 multiposition switches and connecting a multitude of sockets with jumper cables. ENIAC's memory contained only data (no code), and its arithmetic was decimal.

Other early operational computers were EDSAC (built in 1949 at Cambridge University in the UK by Maurice Wilkes; the first stored-program computer), JOHNNIAC at the RAND Corporation, ILLIAC at the University of Illinois, MANIAC at Los Alamos Laboratory, WEIZAC at the Weizmann Institute in Israel, XYZ in Poland, EDVAC built by Eckert and Mauchly, and UNIVAC (also built by Mauchly and Eckert; essentially ENIAC in disguise, and the first computer sold on a commercial basis).

One of the people involved in the ENIAC project was John Von Neumann, who at Princeton's Institute for Advanced Study built his own version of the EDVAC, known as the IAS machine. Von Neumann was a genius in the same league as Leonardo Da Vinci. He spoke many languages, was an expert in the physical sciences and mathematics, and had total recall of everything he ever heard, saw, or read. He was able to quote from memory the verbatim text of books he had read years earlier. At the time he became interested in computers, he was already the most eminent mathematician in the world.

First generation computers required thousands of vacuum tubes and had slow input/output. They were programmed only in machine language and were unreliable. The main form of memory was magnetic core. Magnetic tape was introduced in 1957 to store data compactly.

Von Neumann Machine

John Von Neumann's basic design, now known as the Von Neumann machine, was used in the EDSAC (1949), the first stored-program computer, and in the IAS machine (1952), and is still the basis for nearly all digital computers, even now, almost half a century later. Most current computers still use this design.

[Diagram: Von Neumann Machine - Memory, Control Unit, Arithmetic Logic Unit with Accumulator, Input, Output]

Generally, his contributions were in two areas:

1) Programs (code) should be represented in digital form in the computer's (linear) memory, along with the data, instead of by huge, inflexible configurations of switches and cables; and

2) numbers should be represented using parallel binary arithmetic instead of the clumsy serial decimal arithmetic used by the ENIAC, with each digit represented by 10 pairs of vacuum tubes (1 on and 1 off).
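As a rough illustration (arithmetic added here for reference, not in the source): storing a 10-digit decimal number with one tube pair per digit value takes on the order of 10 x 10 = 100 storage elements, whereas the same numeric range (10^10 is roughly 2^34) fits in about 34 binary storage elements, which is the economy Von Neumann was pointing to.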

The Von Neumann machine had 5 parts:

the memory, the arithmetic-logic unit, the program control unit, the input equipment, and the output equipment.

The memory consisted of 4096 words, a word holding 40 bits (0s and 1s). Each word held either two 20-bit instructions or a 40-bit signed integer. The instructions had 8 bits devoted to telling the instruction type and 12 bits for specifying one of the 4096 memory words. Inside the arithmetic-logic unit, the forerunner of the modern CPU, was a 40-bit accumulator register. A typical instruction added a word of memory to the accumulator or stored the accumulator in memory. The machine did not have floating-point arithmetic.
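To make the word layout concrete, here is a minimal Python sketch (illustrative only; it assumes, beyond what the text states, that the first instruction occupies the upper 20 bits and that within each instruction the 8 opcode bits precede the 12 address bits; the real IAS bit ordering may differ).

    def decode_word(word: int):
        """Split a 40-bit IAS-style word into two (opcode, address) pairs."""
        left = (word >> 20) & 0xFFFFF       # upper 20-bit instruction
        right = word & 0xFFFFF              # lower 20-bit instruction

        def split(instr: int):
            opcode = (instr >> 12) & 0xFF   # 8 bits giving the instruction type
            address = instr & 0xFFF         # 12 bits selecting one of 4096 words
            return opcode, address

        return split(left), split(right)

    # Pack opcode 5 / address 100 and opcode 10 / address 200 into one 40-bit word.
    word = (((5 << 12) | 100) << 20) | ((10 << 12) | 200)
    print(decode_word(word))                # ((5, 100), (10, 200))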

Second Generation - Transistors (1955 - 1965)

The transistor was invented at Bell Labs in 1948 by John Bardeen, Walter Brattain, and William Shockley, for which they were awarded the 1956 Nobel Prize in physics. Second generation computers used transistors. Transistors, compared to vacuum tubes, were small, needed no warm-up, consumed less energy, and were faster and more reliable. During the second generation, assembly languages or symbolic languages were developed. They used abbreviations for instructions rather than numbers. Later, higher-level languages such as FORTRAN (Backus, 1956), LISP (McCarthy, 1955), COBOL (1953), ALGOL (1960), which are more English-like than machine languages, were developed.

In 1962 the first removable disk was marketed.

In 1961 DEC introduced the first minicomputer, the PDP-1, later replaced by the PDP-8 with its Omnibus.

IBM introduced the IBM 7094, which dominated scientific computing in the early 60s, and the business-oriented IBM 1401 (bytes instead of words).

In 1964 CDC introduced the first highly parallel computer, the CDC 6600, designed by Seymour Cray, which was nearly an order of magnitude faster than the IBM 7094.

Third Generation - Integrated Circuits (1965 - 1975)

The third generation emerged with the introduction of the integrated circuit (IC) - a complete electronic circuit on a small chip of silicon. Silicon is a semiconductor, a substance that will conduct electric current when it has been doped with chemical impurities. Some have called the integrated circuit the greatest invention ever. The IC was invented independently by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor in 1958. Kilby went on to develop the first hand-held calculator and Noyce founded Intel Corporation.

In 1964 IBM took a radical step. It introduced a single product line, the System/360, based on integrated circuits (TTL, ECL) that was designed for both scientific and commercial computing. System/360 introduced several innovations:

1) the same assembly language for the whole family of machines (compatibility),

2) multiprogramming,

3) emulation of other computers,

4) a huge (for that time) address space of 2^24 bytes (16 MB; see the note after this list), and

5) universal (business/scientific) design: 16 32-bit registers for binary arithmetic, but its memory was byte oriented.
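As a quick check (arithmetic added here for reference, not in the source): 2^24 = 16,777,216 bytes = 16 x 2^20 bytes, i.e. 16 MB, so the 24-bit addresses of the System/360 correspond exactly to a 16 MB address space.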

[Table referenced in the source (page 11) is not included here]

The 360 series was later followed by the 370 series, the 4300 series, the 3800 series, and the 3090 series, all using exactly the same architecture (assembly language).

In 1970 DEC introduced the PDP-11, a 16-bit successor of the PDP-8. The PDP-11 was enormously successful, especially at universities. UNIX was initially implemented in C by Brian Kernighan and Dennis Ritchie for the DEC PDP-11 in 1972. The first computer networks also appeared in the third generation (ARPANET in 1969).

Fourth Generation - Personal Computers and VLSI (1975 - ???)

In 1975 very large scale integration (VLSI) was achieved. Microprocessors led to the development of microcomputers, expanding computer markets to smaller businesses and to personal use. The first microprocessor, the Intel 4004, was built by Intel in 1971. The first microcomputer, the MITS Altair, was produced in 1975. The first successful personal computer to include an easy-to-use keyboard and screen was the Apple I, built by Steve Jobs and Steve Wozniak in 1977. Soon other companies, such as Tandy and Commodore, entered the microcomputer market.

IBM entered the market in 1981 with the IBM PC and captured the top market share in just 18 months. The IBM PC has since been succeeded by the XT, AT, and PS/2.

Intel microprocessor family: 4004, 8008, 8080, 8085, 8086, 8088, 80186, 80286, 80386, 80486, and 80586 (Pentium).

Motorola microprocessor family: 6800, 6809, 68000, 68008, 68010, 68012, 68020, 68030, 68040.

Type                 MIPS   RAM (MB)   Machine        Application
Personal computer       1          1   IBM PS/2       word processing
Minicomputer            2          4   PDP-11/84      real-time control
Supermini              10         32   SUN-4          network file server
Mainframe              30        128   IBM 3090/300   banking
Supercomputer         125       1024   CRAY-2         weather forecasting

5 common computer types

The IBM 360 and all other mainframes, the DEC VAX, the Intel 80386, and the Motorola 68030 are CISC (Complex Instruction Set Computer) machines. The first modern RISC machine was the IBM 801, built in 1975. In 1980 David Patterson at Berkeley built RISC I, which was the inspiration for Sun Microsystems' SPARC (1987). In 1981 John Hennessy built MIPS at Stanford, which led to the MIPS Computer Systems R2000 and R3000 chips.

Supercomputers are specially designed to maximize the number of FLOPS (Floating Point Operations Per Second). Anything below 1 gigaflops is not considered to be a supercomputer. Supercomputers have unique, highly parallel architectures in order to achieve these speeds, and are only effective on a small range of problems. For years, the name supercomputer and the name Seymour Cray have been almost synonymous. Cray designed the CDC 6600 and CDC 7600. He then formed his own company to build the Cray-1 and Cray-2. In 1989, Cray left to form yet another company to build the Cray-3.

Fifth Generation - AI Knowledge Based Computers (1981- ???)

The concept of the fifth generation computer was announced to the world at the International Conference held in Tokyo in October 1981. In 1982 the Japanese government's Ministry of Trade and Industry began a 10-year project to build the fifth generation computer. ICOT (the Institute for New Generation Computer Technology) was established in 1982 as the central organization responsible for the project. The main participants in ICOT are eight computer manufacturers: Fujitsu, Hitachi, NEC, Toshiba, Mitsubishi, Oki, Matsushita, and Sharp.

The UK, in 1981, launched its national Alvey 5th generation program (John Alvey was a senior executive of British Telecom).

The USA's response to the Japanese plan was that a group of manufacturers centered around Control Data Corporation (CDC), though not including IBM, set up a research consortium called MCC (Microelectronics and Computer Technology Corporation). Moreover, in 1982 DARPA (the Defense Advanced Research Projects Agency) of the US Dept. of Defense started a program of strategic programming research.

West Germany started its program in 1984 under Siemens, and France in 1983 with SICO and INRIA. The European Community successively launched the ESPRIT-1 and ESPRIT-2 projects (European Strategic Programme for Research in Information Technology).

Fifth Generation knowledge-based computers of the 1990s should be able to store the specialized knowledge of a human expert and use this knowledge to operate like a consultant. They are viewed as knowledge-processing systems supporting Artificial Intelligence applications.

Artificial Intelligence is a field of study that explores computer involvement in tasks requiring intelligence, imagination, and intuition. The development of expert systems (software allowing computers to be experts on particular subjects) would enable computers to serve as consultants. Research on computer use of natural language (everyday human language) would lead to easier interaction between people and computer systems.

The major driving force in Fifth Generation computer design is to efficiently support very high-level programming languages. There are at least 7 basic programming styles that could form the basis of the Fifth Generation Computer. Each style consists of a class of programming languages and an associated computer architecture that most efficiently supports the class.

Programming style                        Architecture
procedural (OCCAM, PASCAL)               control flow (Transputer)
object-oriented (Smalltalk)              object-oriented (DOOM)
single-assignment functional (SISAL)     dataflow (Manchester)
applicative functional (PURE LISP)       reduction (GRIP)
predicate logic (Prolog)                 logic (ICOT PIM)
production system (OPS5)                 rule-based (Non-Von)
semantic network (NETL, IXL)             computational arrays (Connection Machine)

Requirements for New Generation of AI Computers

Non-numeric data increasing in importance. Examples are sentences, symbols, speech, graphics, and images,

Processing becoming more intelligent. Scientific --> data processing --> artificial intelligence,



Computing changing from sequential, centralized computation to parallel, decentralized computation, and

Today's computers are based on the 40-year-old Von Neumann model; only the software changes (a gap between hardware and software exists).


Non-Von Neumann Architectures

Although RISC machines hold the promise of winning a factor of 10 to 20 in performance by eliminating the microprogram, they are still traditional Von Neumann machines, with all their inherent limitations.

Chief among these is that circuit speed cannot be increased indefinitely. The speed of light is already a major problem for designers of high-end computers, and heat dissipation is turning supercomputers into state-of-the-art air conditioners. The most ironic aspect of the quest for ever-increasing speed is that most of the transistors are idle nearly all the time! Modern computers are dominated by memory chips. A typical computer has one CPU chip and over 100 memory chips. When the CPU issues a request to read a 32-bit word, either four chips (byte-organized static memory) or 32 chips (1-bit-wide dynamic memory) respond. The rest do nothing.

Von Neumann Bottleneck

The traditional structure of a computer, with a single CPU that issues sequential requests over a bus to a memory that responds to one request at a time, has become known as the Von Neumann bottleneck. A great deal of research is currently taking place on how to eliminate this bottleneck, and a wide variety of radically different architectures have been proposed and built.

Memory Problem

The Von Neumann machine operates on one word of data at a time, even when the same operation must be performed on thousands of words. Finding the location of a piece of data in memory and bringing it to the CPU therefore limits computation speed.



Assignment Problem


The Von Neumann bottleneck blocks parallel operation and the effective use of more VLSI circuits, but, more critically, it is the model behind serious drawbacks in programming languages. Programs in present languages alter the data stored in memory one word at a time. Variables in the programs are used to designate the storage cells, and an entire assignment is needed to alter the data of each variable. Thus programs consist of repetitive sequences of assignment statements, with control statements governing how many times and under what conditions those sequences are to be repeated. Conventional programs (FORTRAN, C, Pascal, etc.) consist on average of 47% assignments! The RISC approach tries to minimize this by having more registers in the CPU, but this does not remove the bottleneck itself.
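A minimal Python sketch of the contrast (an assumed example, not from the source): the first version alters memory one word at a time through repeated assignments, while the second states the whole transformation at once, leaving room for a parallel implementation.

    data = [1, 2, 3, 4, 5]

    # Conventional Von Neumann style: a loop of assignments,
    # each one altering a single storage cell.
    doubled = [0] * len(data)
    for i in range(len(data)):
        doubled[i] = data[i] * 2        # one assignment per word of data

    # Whole-structure style: the same result expressed as one mapping.
    doubled_at_once = [x * 2 for x in data]

    print(doubled == doubled_at_once)   # True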

Object Problem

Conventional programs are more concerned with object building than with program building. They use object-forming operations to build the objects, and they say how to build an object, not how to build the program. Hence the popularity of object-oriented programming: it fits nicely into the old, conservative style of programming! Functional programming, for example, seeks to shift the focus from the combining of objects to the combining of programs (programs are now simply mathematical functions or mappings). The goal of this shift from object-level to function-level description of programs is to emphasize the main issue of programming:

HOW PROGRAMS ARE PUT TOGETHER RATHER THAN HOW OBJECTS ARE PUT TOGETHER
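A minimal sketch (assumed example, not from the source) of what "putting programs together" means in a functional style: a new program is defined purely by combining two existing programs, with no mention of how data objects are laid out in memory.

    def compose(f, g):
        """Return the program that applies g first, then f."""
        return lambda x: f(g(x))

    double = lambda x: x * 2
    increment = lambda x: x + 1

    # A new program built only from existing programs.
    double_then_increment = compose(increment, double)
    print(double_then_increment(5))     # (5 * 2) + 1 = 11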

Program Structure Problem

Present programs contain three kinds of structures: programs (instructions), expressions, and variables or constants. A program is built from subprograms, the simplest of which are the assignment statements (variable = expression); these in turn are built from expressions (for example, 2x+y); and expressions are built from variables and constants. Functional programs, by contrast, are built only from functions, logic programs only from clauses (relations), and rule-based programs only from production rules.

The new styles of programming are very distant from the Von Neumann model of computing, and this means either that many optimization problems must be solved before they can run on Von Neumann computers at acceptable speed (compare the speed of Prolog or Lisp programs), or else that new non-Von Neumann architectures must be designed to execute 5th generation programs efficiently.

All attempts to build non-Von Neumann architectures start with the assumption that a computer should consist of some number of control units, ALUs, and memory modules that can operate in parallel. Where they diverge is in how many of each are present, what they are like, and how they are interconnected. Some designs use a relatively small number of powerful CPUs that are only weakly connected (LANs). Others use large numbers of ALUs that are strongly connected and work in unison, directed by a single control unit (the CDC 6600 and 7600). These include the parallel control-flow, object-oriented, dataflow, reduction, logic, rule-based, and computational-array architectures.

Sixth Generation - Neural Networks Intelligent Computers

The sixth generation marks the end of the eighties. The key words are brain-like computers. Although we cannot expect to emulate the brain, we could emulate many intelligent functions that exist in biological systems. The brain and biological intelligence are the most complex areas in science. Many different disciplines are needed to attack these problems. The sixth generation project establishes cooperation between the following scientific disciplines: logic, linguistics, psychology, physiology, and computer science and engineering.

Sixth Generation intelligent computers of the 2000s should be able to store specialized knowledge and to learn new knowledge from their own operation.

Neurocomputers, spanning both Neural models and Connectionist models, are massively parallel networks where the nodes range from special-purpose analog threshold devices to primitive programmable processors. Neural models are aimed at emulating the way neurons in the brain are interconnected in massively parallel networks. The number of neurons in the brain is approximately 10 billion, with each neuron connected to between 100 and 100,000 other neurons. In simple terms, a neuron can be considered a threshold unit, collecting input signals at its synapses, summing them together, and outputting a signal at its axon. Thus neural computers are massively parallel networks of processors, where a node is a special-purpose analog or digital threshold device. Connectionist models are based on learning algorithms (Boltzmann machines) where computation involves the establishment, strength and feedback of inter-network connections. This leads to connectionist computers that are massively parallel networks of processors, where a node is a primitive, general-purpose, programmable processor.
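A minimal Python sketch of the neuron-as-threshold-unit model described above (the weights and threshold are made-up values for illustration): weighted input signals collected at the "synapses" are summed, and the "axon" outputs 1 only if the sum reaches the threshold.

    def threshold_neuron(inputs, weights, threshold):
        """Return 1 if the weighted sum of the inputs reaches the threshold, else 0."""
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # Three input signals with hand-picked weights and threshold.
    print(threshold_neuron([1, 0, 1], [0.5, 0.8, 0.4], threshold=0.7))   # prints 1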

For Neurocomputers, there are two basic design issues:

The massively parallel communications structure, and

The very primitive processing elements.

Both digital and analog Neurocomputers are under development. Some are wholly electronic, while others are electro-optical, exploiting optical devices for communications.

Three Levels of Man-Made Intelligence

Artificial Intelligence (AI),

Brain-Behavior Intelligence (BBI) or Neural Intelligence (NI), and

Molecular Intelligence (MI).

Artificial Intelligence (AI) -

Basically a branch of applied mathematics (logic) which provides rule-based software packages called expert systems. Computers optimized for expert systems are called fifth-generation computers. The fifth generation project started in the early eighties.

Sixth Generation Project Goals:

Physiology - A brain model which approximates human/animal cognitive processes.

Psychology - Clarification of the nature of understanding.

Linguistics - Understanding of the process of speech, syntax, semantics, and language.

Logic - New logic is needed, suitable for learning and inductive inference.

Technologies - Feature extraction; knowledge representation; learning; intelligent programming; application generations; language processing; image processing.

Brain-Behavior Intelligence (BBI) -

Seemingly mimics intelligent behavior in man and animals and approximates some of the brain's functions. In the brain a signal fired from one neuron can trigger a cascade of thousands of other neurons; consequently BBI hardware has been designed to use a massively parallel structure with high interconnectivity between a large number of simple processors. Programming is frequently replaced by learning and training; presently the hardware solutions are based on large-scale integration (LSI) and wafer-scale integration (WSI) and are called sixth-generation computers. The sixth generation project started in the late eighties. An alternative name for BBI is neural intelligence (NI).

Molecular Intelligence (MI) -

Based on the premise that the cytoskeleton within the living cell represents the molecular level of cognition and information processing. Research in this direction could result in an interface between biological and technological information devices; the ultimate goals would be the design of biosensors, biochips, and biocomputers. Useful results are expected by the early nineties, as a tail of the sixth generation or even as the start of the seventh generation.

Research, development, and marketing are focused on BBI, but there are strong overlaps with AI and MI.

Relationships between AI, BBI, and MI

Artificial Intelligence (AI) - 5th generation

Brain Behavior Intelligence (BBI) - 6th generation

- Computers applied to neurobiology and behavior

- Adaptive and learning systems

- Neural Computers

- Goal-directed genetic systems

- Event-train computers

- Transformation (mapping) systems

- Associative memories and processors

- Fuzzy and pseudoassociative systems

- Hypercubes

- Array processors

- Programmable connection machines

Molecular Intelligence (MI) - 7th generation

Property             5th   6th   7th
Electrical            X     X
Digital               X     X
Deterministic         X     X
Linear                X     X
Parallel              X     X     X
Optical                     X
Analog                      X
Random                      X
Nonlinear                   X
Time-coded                  X
Massively parallel          X
Self-adaptable              X
Learning                    X
Brain-like                  X
Biochip-based                     X

Seventh Generation - Biomolecular Computers (??-??)




