Computer Evolution
The Fifth Generation and the PC
Fifth Generation (Present and Beyond)
Defining the fifth generation of computers is somewhat difficult because the field is still in its infancy. The most famous example of a fifth generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most of the humans on board.)
Though the wayward HAL 9000 may be far from the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers can accept spoken-word instructions (voice recognition) and imitate human reasoning. Translating foreign languages is also moderately possible with fifth generation computers. This feat seemed a simple objective at first, but proved much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.
Many advances in computer design and technology are coming together to enable the creation of fifth generation computers. One such engineering advance is parallel processing, which replaces von Neumann's single central processing unit design with a system that harnesses the power of many CPUs working as one. Another is superconductor technology, which allows electricity to flow with little or no resistance, greatly improving the speed of information flow. Computers today already have some attributes of fifth generation machines. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development, however, before expert systems are in widespread use.
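To make the parallel processing idea concrete, here is a minimal sketch in Python (an editorial illustration with made-up numbers, not code from any system mentioned above) that divides one computation among several CPU cores and then combines the partial results, which is exactly the many-CPUs-working-as-one arrangement described above:

    # Minimal sketch of parallel processing: several CPUs cooperating on one
    # task instead of a single von Neumann-style processor doing all the work.
    from multiprocessing import Pool

    def partial_sum(bounds):
        # Sum the integers in [start, stop) -- the chunk given to one worker.
        start, stop = bounds
        return sum(range(start, stop))

    if __name__ == "__main__":
        n, workers = 10_000_000, 4
        step = n // workers
        chunks = [(i * step, (i + 1) * step) for i in range(workers)]

        with Pool(processes=workers) as pool:
            # Each chunk goes to a separate worker process (ideally a separate
            # CPU core); the partial results are then combined into one answer.
            total = sum(pool.map(partial_sum, chunks))

        print(total)  # same result a single CPU would produce, computed in parallel

The answer is identical to what a lone processor would produce; the difference is that the four chunks can be computed at the same time, which is the whole appeal of the design.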
Personal Computers: History and Development
The personal computer (PC) has revolutionized business and personal activities and even the way people talk and think; however, its development has been less a revolution than an evolution and convergence of three critical elements: thought, hardware, and software. Although the PC traces its lineage to the mainframes and minicomputers of the 1950s and 1960s, conventional thinking during the first thirty years of the computer age saw no value in a small computer that could be used by individuals.
A PC is a microcomputer, so named because it is smaller than a minicomputer, which in turn is smaller than a mainframe computer. While early mainframes and their peripheral devices often took up the floor space of a house, minicomputers are about the size of a refrigerator and stove. The microcomputer, whose modern development traces back to the early 1970s, fits on a desk.
From the start, the creation of the computer was centered on the concept that a single unit would be used to perform complex calculations with greater speed and accuracy than humans could achieve.
The PC is Born
In 1975, Ernő Rubik patented the cube that would prove to many that the human brain was incapable of complex problem solving. But a ray of hope also appeared that year: the first PC was introduced. Micro Instrumentation and Telemetry Systems, Inc. (MITS) sold a kit for the MITS Altair 8800 that enabled computer hobbyists to assemble their own computers. It had no monitor, no keyboard, no printer, and couldn't store data, but the demand for it, like that for Rubik's Cube, was overwhelming.
The Altair proved that a PC was both possible and popular, but only with people who would spend hours in their basements with soldering irons and wire strippers. The Altair, which looked like a control panel for a sprinkler system, didn't last, but it helped launch one of the largest companies in the computer world and gave a couple of young software programmers a start. In 1975, Bill Gates and Paul Allen wrote a version of BASIC for the Altair and started a company called Microsoft Corporation.
In 1976, another computer kit was sold to hobbyists: the Apple I. Stephen Wozniak sold his Volkswagen and Steve Jobs sold his programmable calculator to raise enough money to start Apple. In 1977, they introduced the Apple II, a pre-assembled PC with a color monitor, sound, and graphics. It was popular, but everyone knew that a serious computer didn't need any of this. The kits were just a hobby, and the Apple II was seen as a toy. Even the Apple name wasn't a serious, corporate-sounding name like IBM, Digital Equipment Corporation, or Control Data.
But 1977 also brought competition. The Zilog Z-80 microprocessor, which had been introduced in 1976, was used in the Tandy Radio Shack TRS-80, affectionately called the "Trash 80." Apple, Commodore, and Tandy dominated the PC marketplace. The Apple II had 16K bytes of RAM and 16K bytes of ROM; Commodore Business Machines' Personal Electronic Transactor (PET) included 4K RAM and 14K ROM; and the TRS-80 had 4K RAM and 4K ROM.
Also in 1977, the CP/M (Control Program for Microcomputers) operating system, developed by Gary Kildall of Digital Research, came into wide use. From its introduction until 1980, CP/M ran on most PCs, but even a common operating system did not guarantee that a program or document written on one machine could be read on another, because each manufacturer used a different, incompatible floppy disk format.
Apple introduced its Disk II floppy disk drive in 1978, allowing Apple II users to store data on something other than the cumbersome and unreliable tape cassettes that had been used up to that point. But despite the popularity of the three PCs, non-computer people still saw little reason to buy an expensive calculator when there were other ways to do the same things. In 1979, that all changed.
When VisiCalc was introduced for the Apple II, non-computer people suddenly saw a reason to buy a computer. VisiCalc, a spreadsheet program created by Dan Bricklin and Bob Frankston, allowed people to change one number in a budget and watch the effect it had on the entire budget. It was something new and valuable that could only be done with a computer. For thousands of people, the toy, the computer few could find a use for, had been transformed into a device that could actually do something worthwhile.
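The heart of that experience was automatic recalculation: every formula that depends on a changed number is re-evaluated. A toy sketch of the idea in Python (an editorial illustration with invented budget entries; VisiCalc itself was written in 6502 assembly) might look like this:

    # Toy spreadsheet-style recalculation, illustrating the behavior that made
    # VisiCalc compelling. Each cell is a formula that may read other cells;
    # get() evaluates a cell by recursively evaluating its dependencies.
    cells = {
        "rent":      lambda get: 500,
        "food":      lambda get: 300,
        "utilities": lambda get: 100,
        "total":     lambda get: get("rent") + get("food") + get("utilities"),
    }

    def get(name):
        return cells[name](get)

    print(get("total"))              # 900
    cells["rent"] = lambda get: 550  # change one number in the budget...
    print(get("total"))              # ...and the total updates to 950

A real spreadsheet caches cell values and tracks dependencies so it recomputes only what actually changed, but the user-visible behavior is the same: alter one entry and the totals follow.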
Microprocessors and high-tech gadgets were gradually worming their way into people's lives. Sony had introduced the Betamax video cassette recorder in 1975, JVC answered with the VHS format, and in 1979 Sony released the Walkman. And to remind everyone of how far we had to go, Star Trek: The Motion Picture came to theaters in 1979.
The Sinclair ZX-80 PC, which hit the market in 1980, used the same Z-80 family chip as the Tandy TRS-80. The ZX-80 had 1K RAM and 4K ROM. Developed by British entrepreneur Clive Sinclair, the ZX-80 meant that people could enter the computer revolution for under $200. Its small size and price attracted people who had never thought about owning a PC.
The Commodore VIC-20, also introduced in 1980, had color graphics and would eventually become the first PC to sell more than one million units. Even with all of the success the early PC manufacturers had in the late 1970s and early 1980s, the advances in microprocessor speeds, and the creation of software, the PC was still not seen as a serious business tool. Unknown to everyone in the computer industry, however, a huge oak tree was about to drop an acorn that would fall close to the tree and change everything.
Out of the Box and Obsolete
For consumers, the late 1980s were a time of frustration. No sooner had they learned to run their new PCs and Macs than a new, better, larger, faster model was on the shelf. New versions of software, printers, and modems made it impossible to have the latest of anything.
In 1990, Intel's 386 and Motorola's 68030 microprocessors were at the top; then in 1991 Intel brought out the 20 MHz i486SX chip and Motorola introduced the 68040. Less than a year later, Intel introduced the 50 MHz 486 chip and Tandy brought out its $400 CD-ROM drive for PCs. Then, just to make everyone wonder what was going on, in 1991 Apple and IBM agreed to share technology, integrating the Mac into IBM's systems and using the IBM PowerPC chip.
In 1991, Apple brought out the PowerBook, a laptop that made everyone wonder just how small a full-function computer could get. Two years later everyone knew the answer, when Apple introduced the Newton Personal Digital Assistant (PDA). The Newton was supposed to be able to recognize handwritten notes, and Apple sold 50,000 of them in 10 weeks.
In 1993, Intel introduced the 60 MHz Pentium chip, the next generation of chips. The Pentium, however, had a nasty mathematical bug, and its acceptance was slowed. Apple discontinued the workhorse of its fleet, the Apple II, which, despite the mind-boggling changes in the industry, had lasted 17 years.
Not only were hardware and software becoming obsolete; people were also getting caught up in their own obsolescence. For years, employers had included operating system and software names in their advertising for clerical and secretarial positions. As companies used more temporary workers and included both IBM clones and Macintoshes in their operations, proficiency with only one slammed the door on employment opportunities.
Many people enrolled in classes to learn the latest software or update their computer skills. A good, well-rounded employee needed to know desktop publishing, two or more word processing programs, at least one spreadsheet program, and a graphics package. They had to be able to access the company local area network (LAN), send and receive E-mail using high-speed modems, and solve problems with hardware and software to maximize their output. Microprocessor-driven telephones, cellular phones, and pagers added to the complexity of the job, and repetitive motion syndrome from using keyboards hour after hour created an army of people wearing wrist braces.
Many people left a job where their day was spent working at a computer terminal or PC and went home to enjoy the quiet, relaxing camaraderie they found in Internet chat rooms, on the World Wide Web, or in their favorite newspapers and electronic magazines (e-zines).
From its inception in 1975, the PC has become a focal point of business, education, and home life. The microprocessor, an amazing technology when it carried a few thousand transistors on a single chip, is even more amazing now that it carries over 3 billion transistors on an even smaller chip. In 1982, when Time magazine made the computer its "Machine of the Year," the PC was still in its infancy. "Big Iron" still dominated the high-tech environment, and having a personal computer was a luxury.
The creation and success of the PC would not have been possible without the elimination of the concept of the computer as a large, centralized data processor and number cruncher. Today the PC is a communication channel more than it is a computational tool. Millions of people work in their "electronic cottages," either operating their own businesses from home or telecommuting to work. And fittingly, one of the first Intel 4004 microprocessors ever made is said to still be operating at the outer edges of time and space: in 1972, one of the small chips was installed in the Pioneer 10 spacecraft, which today continues to operate over 5 billion miles from Earth.