HISTORY OF COMPUTERS
Welcome to Science Lovers, a simple blog for sharing knowledge with full sincerity. This time we will share knowledge about the History of Computers from Time to Time. Hopefully this knowledge will be useful.
Since time immemorial, data processing has been performed by humans. Humans have also invented mechanical and electronic equipment to help with calculation and data processing in order to get results faster. The computers we see today are the result of a long evolution of human inventions, from ancient times onward, in the form of mechanical and electronic devices.
The beginnings of modern computer science can be traced back to ancient times. In ancient societies, certain groups were responsible for religious ceremonies. The person in charge was called a shaman. The shaman had to be able to count the days in a year and determine when each season would arrive. This shamanistic tradition gave birth to primitive calculation mechanisms: notches carved into wooden sticks or markings on cave walls. Over time, shamans were able to plan and build stone structures such as Stonehenge (north of Salisbury, England). Stonehenge is believed to be an ancient form of calendar, designed to "capture" the sun as it turned toward summer.
The development of calculation then progressed to the abacus (also known as the sipoa), a primitive calculator. Traders in the past used the abacus to work out trade transactions. With the emergence of pencil and paper, especially in Europe, the abacus lost its popularity. The oldest known calculator of this kind dates from around 460 BC. China still uses this tool today, and in Indonesia the abacus is given to children who are learning to count. The abacus was mankind's first attempt to make calculation practical. It is not a machine that calculates automatically; its function is to help the user remember the current state of a calculation while performing complex arithmetic. Each bead has a value according to its position: beads in the first row are worth units, beads in the second row are worth tens, and so on for each subsequent row. The abacus is really a memory aid for its user, who still does the calculating in their head. For thousands of years after the abacus spread across mainland China, there was no further progress toward automating calculation and mathematics.
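As a small illustration of the positional idea described above (a sketch of my own, not part of the original post), the value shown on an abacus can be computed by weighting each row of beads by a power of ten:

```python
# Illustrative sketch only: each row of beads stands for a power of ten,
# so the rows [units, tens, hundreds, ...] together name a single number.
def abacus_value(rows):
    # rows[0] = beads in the units row, rows[1] = tens row, and so on
    return sum(beads * 10**position for position, beads in enumerate(rows))

print(abacus_value([3, 2, 1]))  # 1 hundred, 2 tens, 3 units -> 123
```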
In brief, in the first century BC the Antikythera mechanism was recorded, an instrument used to track and predict the movement of stars and planets (a kind of calendar). This device was found in Greece in 1901. The Arabic numeral system was introduced to Europe between the 7th and 9th centuries AD, while Roman numerals remained in use there until the 17th century. Arabic numerals introduced the world to the concept of "zero" and defined the concepts of tens, hundreds, thousands, and so on, simplifying mathematical calculations.
In the past, mathematicians often worked through the same problems repeatedly in order to gain assurance that the answers were correct. It could take weeks or even months of manual work by hand to verify a theorem. Most of the tables of integrals, logarithms, and trigonometric values were obtained in this way.
One of the earliest recorded inventions in computing technology is a machine made by a German researcher, Wilhelm Schickard (1623, University of Tübingen, Germany): the first mechanical calculator, which worked with 6 digits and used gears to perform addition, multiplication, and division. He sent his machine design to Kepler, the renowned astronomer of the time. Unfortunately, construction stopped at the prototype stage.
In 1642, Blaise Pascal (1623-1662), who was 18 years old at the time, invented what he called a numerical wheel calculator to help his father with tax calculations.
This brass rectangular box, called the Pascaline, used eight toothed wheels to add numbers of up to eight digits. It was a base-ten calculator. Its weakness was that it was limited to addition. In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, this mechanical device worked using toothed wheels. However, the drawback was that the gears often jammed against each other, and only Pascal himself was able to repair them!
Gottfried Wilhelm von Leibniz (1646-1716) also devised binary numbers, which consist of only the two digits 0 and 1. In 1671 he designed a calculating machine, a stepped-wheel (pinion) machine, that could mechanically perform the four basic arithmetic operations.
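As a quick aside (my own illustration, not from the original post), this small Python snippet shows Leibniz's two-digit idea at work: any whole number can be written using only 0s and 1s, and converted back by treating each binary digit as a power of two.

```python
# Illustrative sketch of binary notation: every whole number can be
# written with only the digits 0 and 1.
for n in range(6):
    print(n, "->", format(n, "b"))   # e.g. 5 -> 101

# Converting back: each binary digit is a power of two.
print(int("101", 2))                 # prints 5
```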
By studying the notes and drawings made by Pascal, Leibniz was able to refine the tool. Then, around 1820, mechanical calculators became popular. Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, presented a more practical approach to calculation because it could perform addition, subtraction, multiplication, and division. Thanks to this capability, the arithmometer was widely used up until World War I. Together with Pascal and Leibniz, Colmar helped usher in the era of mechanical computing.
In 1822, Charles Babbage built a prototype of a machine called the Difference Engine, and with the help of the British government planned its full construction in 1823. The engine was characterized as large, steam-powered, fully automatic, able to print astronomical tables, and controlled by a fixed program of instructions. Unfortunately, once again, the machine was never completed; work on it stopped in 1833.
Charles Babbage's Analytical Engine also included a parallel decimal arithmetic unit that could operate on 50-digit decimal words and was able to store 1,000 decimal numbers. The Analytical Engine supported operations that permitted conditional control, so the engine's instructions could be executed in a specific order rather than simply in numerical sequence. Babbage's system provided for input, conditional statements, and output. Augusta Ada Byron, Countess of Lovelace, met Babbage in 1833. She described the Analytical Engine as weaving "algebraic patterns just as the Jacquard loom weaves flowers and leaves" into cloth. Her published analysis is among the best records of the early history of programming. She described the basics of computer programming, including data analysis, looping, and memory addressing!
Today, computers and their supporting tools have found their way into every aspect of life and work. Computers are now capable of far more than ordinary mathematical calculations. Examples include the checkout systems in supermarkets that can read grocery barcodes, telephone exchanges that handle millions of calls and communications, and the computer networks and Internet that connect different parts of the world.
After Babbage's inventions, Herman Hollerith of the United States census bureau successfully used his Hollerith tabulating machine for the 1890 census. The device read census information recorded as holes punched in cards. Remarkably, he got the idea from watching a train conductor punching tickets. The punched-card system produced remarkable results: data-reading errors decreased dramatically, work flowed more quickly, and, even more importantly, storage was practically unlimited. However, the machine still had its limitations:
- It could only be used for tabulation.
- Punched cards could not be used for more complex calculations.
In 1938, Konrad Zuse (Germany) built a calculating machine and introduced the first programmable calculator. Designed to solve complex engineering equations, it was called the Z1. The machine was controlled by punched strips of used movie film, and its data were represented in the binary system. It was the first machine to use binary, at a time when most machines used the decimal system. It was followed in 1939 by the Z2, which already used a system of electromechanical relays, about 2,600 of them. Next came the Z3, also electromechanical, which was used to aid calculation during World War II. It was able to perform the four basic arithmetic operations plus square-root calculation.
By the late 1930s, punched-card machine techniques were well established and reliable.
Howard Aiken (Harvard University), in collaboration with engineers at IBM, built a large-capacity automatic digital computer based on standard IBM electromechanical components. Aiken's machine, called the Harvard Mark I, had these strengths: it could handle 23-digit decimal numbers; it could perform the four arithmetic operations (addition, subtraction, division, and multiplication); it had special built-in programs, or subroutines, to handle logarithmic and trigonometric functions; it was controlled by perforated paper tape without any provision for reversal, so that "transfer of control" instructions could not be programmed; and its output came in the form of punched cards and electric typewriters.
Although the Mark I used IBM's rotating counter wheels as a key component alongside electromechanical relays, the machine is still classified as a "relay computer". Its characteristics:
- It worked slowly: a multiplication took 3-5 seconds (but it was still faster than the Z3).
- It could work fully automatically.
- It could complete long calculations without human intervention.
- It could perform the 4 arithmetic functions as well as logarithmic, exponential, and trigonometric calculations.
- It had a 23-digit capacity, and an addition took about 0.3 seconds.
The British mathematician Alan Turing wrote a paper, "On Computable Numbers" (1936), which described a hypothetical device. The machine, called the "Turing machine", was the initial idea of a programmable computer. It was designed to carry out logical operations and could read, write, or erase symbols written on a paper tape of infinite length.
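To make the idea more concrete, here is a minimal sketch of how such a machine can be simulated in a few lines of Python. The transition rules, state names, and the bit-flipping example are purely illustrative assumptions, not something described in Turing's paper or in this post.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# The machine repeatedly looks up (state, symbol) in a rule table,
# writes a symbol, moves the head left or right, and changes state.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write                 # write (or erase) a symbol
        head += 1 if move == "R" else -1   # move the head one cell
    return "".join(tape[i] for i in sorted(tape))

# Example rules (an assumption for illustration): flip every bit of a
# binary string, then halt when the blank at the end is reached.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip_rules, "1011"))  # prints 0100_
```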
After this long discussion of the early ideas that eventually reached the computer scientists, let's look at how the computer developed generation by generation. Here is a brief explanation:
A. First Generation Computers (1945-1955)
The first generation marks the beginning of electronic computing systems as a replacement for mechanical ones. The motivation was that human calculating speed is limited, and humans very easily make fatal mistakes.
In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles during the Second World War. The Allies also made advances of their own in developing computing power. In 1943, the British completed a secret code-breaking computer called Colossus to decrypt secret German messages. This increased funding for computer development and accelerated the advancement of computer engineering.
Another computer development in this period was the Electronic Numerical Integrator and Computer (ENIAC), created through cooperation between the US government and the University of Pennsylvania. Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a versatile (general-purpose) computer that worked 1,000 times faster than the Mark I. In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team to develop a computer design concept that is still used in computer engineering 40 years on. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a single memory to hold both programs and data.
Both the US Census Bureau and General Electric owned a UNIVAC. One of the UNIVAC's impressive achievements was successfully predicting Dwight D. Eisenhower's victory in the 1952 presidential election.
First-generation computers were characterized by operating instructions made specifically for one particular task. Each computer had its own different binary-coded program, called a "machine language". This made the computers difficult to program and limited their speed. Other features of first-generation computers were the use of vacuum tubes (which made the computers of the time very large) and magnetic drums for data storage.
B. Second Generation Computers (1955-1965)
In this generation, computer systems were not yet equipped with full operating systems, but some parts of operating-system functionality already existed, for example the FMS (Fortran Monitoring System). In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers. As a result, the size of electronic machines shrank drastically.
Transistors began to be used in computers in 1956. Another development, magnetic-core memory, helped make second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers: IBM made a supercomputer named Stretch, and Sperry-Rand made a computer named LARC. In the early 1960s, successful second-generation computers began to appear in business, in universities, and in government. Second-generation computers were fully transistorized and already had the components we associate with computers today: printers, disk storage, memory, operating systems, and programs.
Programs stored in the computer, and the programming languages that came with them, gave the computer flexibility. This flexibility increased performance at a reasonable price for business use. The software industry also began to appear and grow during this second generation.
C. Third Generation Computers (1965-1980)
Because transistors had the weakness of heating up quickly, Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc made from quartz sand. Scientists later managed to fit more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto the chip. Another third-generation development was the use of the operating system, which allowed a machine to run many different programs at once, with a central program that monitored and coordinated the computer's memory. Such systems could also be multiuser (many users at once) and multiprogramming (many programs at once).
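To give a rough feel for multiprogramming (a sketch of my own, not from the post), the following toy Python scheduler keeps several jobs in a queue and gives each one a turn, so they appear to run at the same time on a single processor. The job names are made up for illustration.

```python
# Illustrative sketch of multiprogramming: a central program holds several
# jobs and lets each run for one step before moving on to the next.
from collections import deque

def job(name, steps):
    for i in range(steps):
        yield f"{name}: step {i + 1}"   # pause here and give up the processor

ready_queue = deque([job("payroll", 2), job("census", 3)])
while ready_queue:
    current = ready_queue.popleft()     # pick the next job waiting its turn
    try:
        print(next(current))            # let it run for one step
        ready_queue.append(current)     # then send it to the back of the queue
    except StopIteration:
        pass                            # this job has finished
```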
D. Fourth Generation Computers (1980)
After the IC, the direction of development became clearer: shrink the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto one chip. In the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip.
Ultra Large Scale Integration (ULSI) increased that number into the millions. The ability to fit so many components onto a chip half the size of a dime helped shrink the size and price of computers. It also increased their power, efficiency, and reliability. In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982. Ten years later, 65 million PCs were in use. Computers continued their trend toward smaller sizes, from computers that sit on a desk (desktop computers) to computers that fit in a bag (laptops), and even computers that can be held in the hand (palmtops).
The IBM PC competed with the Apple Macintosh in the computer market. The Apple Macintosh became famous for popularizing the graphical computer interface, while its rival still used a text-based interface. The Macintosh also popularized the use of the mouse.
E. Fifth Generation Computers (2001-present)
Describing fifth-generation computers is quite difficult because this stage is still under way. An imaginative, fictional example of a fifth-generation computer is the HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the functions desired of a fifth-generation computer. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experience.
Although a real HAL 9000 may still be far from reality, many of its functions have already been achieved. Some computers can accept spoken instructions and imitate human reasoning. The ability to translate foreign languages has also become possible. Such a facility sounds simple, but it turned out to be far more complicated than expected once programmers realized that human understanding relies heavily on context and meaning rather than simply translating words directly.
Many advances in computer design and technology are increasingly making fifth-generation computers possible. Two engineering advances stand out. The first is parallel processing, which will replace the von Neumann model with a system able to coordinate many CPUs working in unison. The other is superconductor technology, which allows electricity to flow without resistance and will thus accelerate the flow of information.
That concludes this discussion of the History of Computers from Time to Time; may this knowledge be beneficial. If anything is unclear, please ask via the comments box below. Thank you for visiting Science Lovers, and do not forget to follow, like, and comment here.