
History of the computer

Computers truly came into their own as great inventions in the last two decades of
the 20th century. But their history stretches back more than 2500 years to the
abacus: a simple calculator made from beads and wires, which is still used in some
parts of the world today. The difference between an ancient abacus and a modern
computer seems vast, but the principle—making repeated calculations more
quickly than the human brain—is exactly the same.
Read on to learn more about the history of computers—or take a look at our article
on how computers work.
Photo: One of the world's most powerful computers: NASA's Pleiades
ICE supercomputer consists of 112,896 processor cores made from 185 racks of
Silicon Graphics (SGI) workstations. Photo by Dominic Hart courtesy of NASA
Ames Research Center.
Cogs and Calculators
It is a measure of the brilliance of the abacus, invented in the Middle East circa 500
BC, that it remained the fastest form of calculator until the middle of the 17th
century. Then, in 1642, aged only 18, French scientist and philosopher Blaise
Pascal (1623–1662) invented the first practical mechanical calculator, the
Pascaline, to help his tax-collector father do his sums. The machine had a series of
interlocking cogs (gear wheels with teeth around their outer edges) that could add
and subtract decimal numbers. Several decades later, in 1671, German
mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up
with a similar but more advanced machine. Instead of using cogs, it had a "stepped
drum" (a cylinder with teeth of increasing length around its edge), an innovation
that survived in mechanical calculators for 300 years. The Leibniz
machine could do much more than Pascal's: as well as adding and subtracting, it
could multiply, divide, and work out square roots. Another pioneering feature was
the first memory store or "register."

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The
"user interface": the part where you dial in numbers you want to calculate. Right: The
internal gear mechanism. Picture courtesy of US Library of Congress.
Apart from developing one of the world's earliest mechanical calculators, Leibniz is
remembered for another important contribution to computing: he was the man who
invented binary code, a way of representing any decimal number using only the two
digits zero and one. Although Leibniz made no use of binary in his own calculator, it
set others thinking. In 1854, a little over a century after Leibniz had died,
Englishman George Boole (1815–1864) used the idea to invent a new branch of
mathematics called Boolean algebra. In modern computers, binary code and Boolean
algebra allow computers to make simple decisions by comparing long strings of zeros
and ones. But, in the 19th century, these ideas were still far ahead of their time. It
would take another 50–100 years for mathematicians and computer scientists to figure
out how to use them (find out more in our articles about calculators and logic gates).
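A modern aside, not part of the original article: the short Python sketch below is one
minimal way to see this idea in action. It writes a decimal number as a string of zeros
and ones, then uses a Boolean comparison of those digits to reach a simple yes-or-no
decision; the particular numbers and the eight-bit width are arbitrary choices made
only for illustration.

# Illustrative sketch: binary code plus Boolean logic making a simple decision.
a, b = 54, 54
bits_a = format(a, "08b")   # "00110110": the decimal number 54 as zeros and ones
bits_b = format(b, "08b")
# Boolean algebra at work: the numbers are equal only if every pair of digits matches.
equal = all(x == y for x, y in zip(bits_a, bits_b))
print(bits_a, bits_b, equal)   # 00110110 00110110 True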
Engines of Calculation
Neither the abacus, nor the mechanical calculators constructed by Pascal and
Leibniz really qualified as computers. A calculator is a device that makes it quicker
and easier for people to do sums—but it needs a human operator. A computer, on
the other hand, is a machine that can operate automatically, without any human
help, by following a series of stored instructions called a program (a kind of
mathematical recipe). Calculators evolved into computers when people devised
ways of making entirely automatic, programmable calculators.

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards
and paper tape to store information and feed it into a machine. Here's a drawing
from his 1889 patent Art of Compiling Statistics (US Patent #395,782), showing how
a strip of paper (yellow) is punched with different patterns of holes (orange) that
correspond to statistics gathered about people in the US census. Picture courtesy
of US Patent and Trademark Office.
The first person to attempt this was a rather obsessive, notoriously grumpy English
mathematician named Charles Babbage (1791–1871). Many regard Babbage as
the "father of the computer" because his machines had an input (a way of feeding
in numbers), a memory (something to store these numbers while complex
calculations were taking place), a processor (the number-cruncher that carried out
the calculations), and an output (a printing mechanism)—the same basic
components shared by all modern computers. During his lifetime, Babbage never
completed a single one of the hugely ambitious machines that he tried to build.
That was no surprise. Each of his programmable "engines" was designed to use
tens of thousands of precision-made gears. It was like a pocket watch scaled up to
the size of a steam engine, a Pascal or Leibniz machine magnified a thousand-fold
in dimensions, ambition, and complexity. For a time, the British government
financed Babbage—to the tune of £17,000, then an enormous sum. But when
Babbage pressed the government for more money to build an even more
advanced machine, they lost patience and pulled out. Babbage was more fortunate
in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace,
daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to
refine Babbage's ideas for making his machine programmable—and this is why
she is still, sometimes, referred to as the world's first computer programmer. Little
of Babbage's work survived after his death. But when, by chance, his notebooks
were rediscovered in the 1930s, computer scientists finally appreciated the
brilliance of his ideas. Unfortunately, by then, most of these ideas had already
been reinvented by others.
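As a purely illustrative aside (nothing Babbage ever wrote, and invented here only to
echo the paragraph above), the tiny Python sketch below mirrors those four basic
components: a stored list of instructions as the input, a variable standing in for the
memory, a loop acting as the processor, and a print statement as the output.

# Hypothetical miniature of the four components named above.
program = [("add", 2), ("add", 3), ("print", None)]   # input: stored instructions
store = 0                                             # memory: holds the running total
for operation, value in program:                      # processor: carries out each step
    if operation == "add":
        store += value
    elif operation == "print":
        print(store)                                  # output: prints 5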
Babbage had intended that his machine would take the drudgery out of repetitive
calculations. Originally, he imagined it would be used by the army to compile the
tables that helped their gunners to fire cannons more accurately. Toward the end
of the 19th century, other inventors were more successful in their effort to construct
"engines" of calculation. American statistician Herman Hollerith (1860–1929) built
one of the world's first practical calculating machines, which he called a tabulator,
to help compile census data. Then, as now, a census was taken each decade but,
by the 1880s, the population of the United States had grown so much through
immigration that a full-scale analysis of the data by hand was taking seven and a
half years. The statisticians soon figured out that, if trends continued, they would
run out of time to compile one census before the next one fell due. Fortunately,
Hollerith's tabulator was an amazing success: it tallied the entire census in only six
weeks and completed the full analysis in just two and a half years. Soon afterward,
Hollerith realized his machine had other applications, so he set up the Tabulating
Machine Company in 1896 to manufacture it commercially. In 1911, it merged with
other firms to become the Computing-Tabulating-Recording (C-T-R) company and
then, in 1924, acquired its present name: International Business Machines (IBM).
Bush and the bomb
The history of computing remembers colorful characters like Babbage, but others
who played important—if supporting—roles are less well known. At the time when
C-T-R was becoming IBM, the world's most powerful calculators were being
developed by US government scientist Vannevar Bush (1890–1974). In 1925,
Bush made the first of a series of unwieldy contraptions with equally cumbersome
names: the New Recording Product Integraph Multiplier. Later, he built a machine
called the Differential Analyzer, which used gears, belts, levers, and shafts to
represent numbers and carry out calculations in a very physical way, like a gigantic
mechanical slide rule. Bush's ultimate calculator was an improved machine named the
Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire
and 150 electric motors. Machines like these were known as analog calculators—analog
because they stored numbers in a physical form (as so many turns on a wheel or twists
of a belt) rather than as digits. Although they could carry out incredibly complex
calculations, it took several days of wheel cranking and belt turning before the
results finally emerged.

Photo: A Differential Analyzer. The black part in the background is the main part of
the machine. The operator sits at a smaller console in the foreground. Picture
courtesy of NASA on the Commons (where you can download a larger version of
this photo).
Impressive machines like the Differential Analyzer were only one of several
outstanding contributions Bush made to 20th-century technology. Another came as
the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured
out how electrical circuits could be linked together to process binary code with
Boolean algebra (a way of comparing binary numbers using logic) and thus make
simple decisions. During World War II, President Franklin D. Roosevelt appointed
Bush chairman first of the US National Defense Research Committee and then
director of the Office of Scientific Research and Development (OSRD). In this
capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative
that led to the creation of the atomic bomb. One of Bush's final wartime
contributions was to sketch out, in 1945, an idea for a memory-storing and sharing
device called Memex that would later inspire Tim Berners-Lee to invent the World
Wide Web. Few outside the world of computing remember Vannevar Bush today—
but what a legacy! As a father of the digital computer, an overseer of the atom
bomb, and an inspiration for the Web, Bush played a pivotal role in three of the
20th century's most far-reaching technologies.
Turing—tested
Many of the pioneers of computing were hands-on experimenters—but by no
means all of them. One of the key figures in the history of 20th-century
computing, Alan Turing (1912–1954) was a brilliant Cambridge mathematician
whose major contributions were to the theory of how computers processed
information. In 1936, at the age of just 23, Turing wrote a groundbreaking
mathematical paper called "On computable numbers, with an application to the
Entscheidungsproblem," in which he described a theoretical computer now known
as a Turing machine (a simple information processor that works through a series of
instructions, reading data, writing results, and then moving on to the next
instruction). Turing's ideas were hugely influential in the years that followed and
many people regard him as the father of modern computing—the 20th century's
equivalent of Babbage.
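To make the idea a little more concrete, here is a minimal sketch in Python of a
machine in that spirit. It is a hypothetical example, not Turing's own formulation: the
rule table, the tape symbols, and the digit-flipping task are invented purely for
illustration.

# A minimal Turing-machine-style sketch (hypothetical example, not Turing's notation).
# Rules map (state, symbol read) to (symbol to write, head movement, next state).
rules = {
    ("invert", "0"): ("1", +1, "invert"),   # read 0: write 1, move right
    ("invert", "1"): ("0", +1, "invert"),   # read 1: write 0, move right
    ("invert", "_"): ("_", 0, "halt"),      # read a blank: stop
}

def run(tape):
    cells, head, state = list(tape) + ["_"], 0, "invert"   # "_" marks the blank end
    while state != "halt":
        write, move, state = rules[(state, cells[head])]   # look up the instruction
        cells[head] = write                                # write the result
        head += move                                       # move the read/write head
    return "".join(cells).rstrip("_")

print(run("10110"))   # prints "01001": every digit flipped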
Although essentially a theoretician, Turing did get involved with real, practical
machinery, unlike many mathematicians of his time. During World War II, he played a
pivotal role in the development of code-breaking machinery that, itself, played a key
part in Britain's wartime victory; later, he played a lesser role in the creation of
several large-scale experimental computers including ACE (Automatic Computing Engine),
Colossus, and the Manchester/Ferranti Mark I (described below). Today, Alan Turing is
best known for conceiving what's become known as the Turing test, a simple way to find
out whether a computer can be considered intelligent by seeing whether it can sustain
a plausible conversation with a real human being.
