Computer science is the study of computation, information, and automation.[1][2][3] Computer science spans from theoretical disciplines (such as algorithms, theory of computation, and information theory) to applied disciplines (including the design and implementation of hardware and software).[4][5][6] Though
more often considered an academic discipline, computer science is closely related to computer
programming.[7]
Algorithms and data structures are central to computer science;[8] a brief illustrative sketch follows this overview. The theory of computation concerns
abstract models of computation and general classes of problems that can be solved using them. The
fields of cryptography and computer security involve studying the means for secure communication and
for preventing security vulnerabilities. Computer graphics and computational geometry address the
generation of images. Programming language theory considers different ways to describe computational
processes, and database theory concerns the management of repositories of data. Human–computer
interaction investigates the interfaces through which humans and computers interact, and software
engineering focuses on the design and principles behind developing software. Areas such as operating
systems, networks and embedded systems investigate the principles and design behind complex
systems. Computer architecture describes the construction of computer components and computer-
operated equipment. Artificial intelligence and machine learning aim to synthesize goal-oriented
processes such as problem-solving, decision-making, environmental adaptation, planning and learning
found in humans and animals. Within artificial intelligence, computer vision aims to understand and
process image and video data, while natural language processing aims to understand and process textual
and linguistic data.
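As a concrete illustration of an algorithm paired with a data structure, the sketch below is a minimal Python example (names and values are illustrative, not drawn from the sources cited here) of binary search over a sorted list, which locates a value in O(log n) comparisons.

def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # probe the middle of the remaining range
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1                  # discard the lower half
        else:
            hi = mid - 1                  # discard the upper half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 11))   # prints 4

The sorted-list precondition is what lets the algorithm halve the search space at every step; the same idea underlies many of the data structures studied in the field.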
The fundamental concern of computer science is determining what can and cannot be automated.[2][9][3][10][11]
The Turing Award is generally recognized as the highest distinction in computer science.[12][13]
History
Main article: History of computer science
Gottfried Wilhelm Leibniz (1646–1716) developed logic in a binary number system and has been called
the "founder of computer science".[14]
Charles Babbage is sometimes referred to as the "father of computing".[15]
Ada Lovelace published the first algorithm intended for processing on a computer.[16]
The earliest foundations of what would become computer science predate the invention of the modern
digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since
antiquity, aiding in computations such as multiplication and division. Algorithms for performing
computations have existed since antiquity, even before the development of sophisticated computing
equipment.[17]
Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623.[18] In
1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner.[19]
Leibniz may be considered the first computer scientist and information theorist, in part because he documented the binary number system. In 1820, Thomas de Colmar
launched the mechanical calculator industry[note 1] when he invented his simplified arithmometer, the
first calculating machine strong enough and reliable enough to be used daily in an office environment.
Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in
1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical
Engine.[20] He started developing this machine in 1834, and "in less than two years, he had sketched out
many of the salient features of the modern computer".[21] "A crucial step was the adoption of a
punched card system derived from the Jacquard loom",[21] making it infinitely programmable.[note 2] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers (a modern sketch of such a computation appears at the end of this section), which is considered to be the first published algorithm ever specifically tailored for implementation on a computer.[22] Around
1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical
information; eventually his company became part of IBM. Following Babbage, although unaware of his
earlier work, Percy Ludgate in 1909 published[23] the second of the only two designs for mechanical
analytical engines in history. In 1913, the Spanish engineer Leonardo Torres y Quevedo wrote his Essays
on Automatics, and designed, inspired by Babbage, a theoretical electromechanical calculating machine
which was to be controlled by a read-only program. The paper also introduced the idea of floating-point
arithmetic. In 1920, to demonstrate his concepts, Torres built an electromechanical calculator that could
automatically perform all four arithmetic operations.[24] In 1937, one hundred years after Babbage's
impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment
and was also in the calculator business,[25] to develop his giant programmable calculator, the
ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central
computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[
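Regarding Lovelace's note mentioned above, the Bernoulli numbers she targeted can be generated from a short recurrence. The sketch below is a minimal modern rendering in Python, assuming exact rational arithmetic; it uses the standard identity sum over k from 0 to m of C(m+1, k)·B_k = 0 for m ≥ 1, and is not a transcription of Lovelace's actual Note G program.

from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions (B_1 = -1/2 convention)."""
    B = [Fraction(1)]                                        # B_0 = 1
    for m in range(1, n + 1):
        # Solve sum_{k=0}^{m} C(m+1, k) * B_k = 0 for B_m, noting C(m+1, m) = m + 1.
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))
    return B

print(bernoulli(8))   # [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30]

Exact fractions are used because the Bernoulli numbers are rational and floating-point rounding would quickly obscure the zero values at odd indices beyond B_1.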