Different types of computing - Grid, Cloud, Utility, Distributed and Cluster computing
vasudevk
5 Feb 2009 3:52 PM
Cloud Computing - http://en.wikipedia.org/wiki/Cloud_computing
Cloud computing is a computing paradigm shift in which computing moves away from personal computers or an individual application server to a “cloud” of computers. Users of the cloud only need to be concerned with the computing service being requested, as the underlying details of how it is delivered are hidden. In this form of distributed computing, all computer resources are pooled together and managed by software rather than by a human.
The services requested of a cloud are not limited to web applications; they can also be IT management tasks such as requesting systems, a software stack or a specific web appliance.
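As a rough illustration of that consumer's-eye view, the sketch below calls a hypothetical cloud-hosted service purely through its HTTP interface. The endpoint URL and the "convert" operation are invented for illustration; the point is that the client never sees which pooled machines actually serve the request.

```python
# A minimal sketch of the cloud consumer's view: the service is just an
# endpoint. The URL below is a placeholder, not a real service; whichever
# pooled servers answer the request stay invisible to this client.
import json
from urllib import request

def convert_document(text):
    """Send work to a (hypothetical) cloud document-conversion service."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = request.Request(
        "https://api.example-cloud.com/v1/convert",  # placeholder endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # The client asks for the service; how and where it runs is hidden.
    print(convert_document("hello, cloud"))
```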
Grid Computing - http://en.wikipedia.org/wiki/Grid_computing
The term “grid computing” can refer to:
1. Multiple independent computing clusters which act like a “grid” because they are composed of resource nodes not located within a single administrative domain (the formal definition).
2. Offering online computation or storage as a metered commercial service, known as utility computing, computing on demand, or cloud computing.
3. The creation of a “virtual supercomputer” by using spare computing resources within an organization (CPU scavenging; a sketch follows this list).
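A minimal sketch of the third sense, CPU scavenging, is shown below. It assumes a hypothetical coordinator reachable over HTTP and uses the third-party psutil package only to estimate whether the desktop is idle; the coordinator URL and the work-unit format are invented for illustration.

```python
# A toy CPU-scavenging worker: when the desktop is mostly idle, pull a work
# unit from a (hypothetical) coordinator, compute, and post the result back.
# The coordinator URL and the work-unit format are placeholders.
import json
import time
from urllib import request

import psutil  # third-party; used only to estimate local idleness

COORDINATOR = "http://grid-coordinator.example.org"  # placeholder

def fetch_work():
    with request.urlopen(COORDINATOR + "/work") as resp:
        return json.loads(resp.read())         # e.g. {"id": 7, "numbers": [...]}

def post_result(work_id, value):
    payload = json.dumps({"id": work_id, "result": value}).encode("utf-8")
    request.urlopen(request.Request(COORDINATOR + "/result", data=payload))

def main():
    while True:
        if psutil.cpu_percent(interval=1.0) < 20:          # machine looks idle
            unit = fetch_work()
            post_result(unit["id"], sum(unit["numbers"]))   # do the spare-cycle work
        else:
            time.sleep(30)                                  # owner is using the PC

if __name__ == "__main__":
    main()
```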
Utility Computing - http://en.wikipedia.org/wiki/Utility_computing
Conventional Internet hosting services have the capability to quickly arrange for the rental of individual
servers, for example to provision a bank of web servers to accommodate a sudden surge in traffic to a web
site.
“Utility computing” usually envisions some form of virtualization so that the amount of storage or computing power available is considerably larger than that of a single time-sharing computer. Multiple servers are used on the “back end” to make this possible. These might consist of a dedicated computer cluster built specifically for the purpose of being rented out, or even an under-utilized supercomputer. The technique of running a single calculation on multiple computers is known as distributed computing.
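The on-demand provisioning described above can be sketched in a few lines. The example assumes AWS EC2 accessed through the boto3 library as one possible utility-computing provider; the AMI ID, instance type and count are placeholders, and a real deployment would also handle load balancing and tear the instances down once the surge passes.

```python
# A minimal sketch of utility computing, assuming AWS EC2 via boto3 as one
# example provider. The AMI ID and instance type are placeholders; a real
# setup would also configure networking, a load balancer and teardown.
import boto3

def provision_web_servers(count):
    """Rent `count` identical web servers to absorb a traffic surge."""
    ec2 = boto3.client("ec2")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder machine image
        InstanceType="t3.micro",           # placeholder size
        MinCount=count,
        MaxCount=count,
    )
    return [i["InstanceId"] for i in response["Instances"]]

if __name__ == "__main__":
    # Pay-per-use: capacity is requested only while the surge lasts.
    print(provision_web_servers(4))
```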
Distributed Computing - http://en.wikipedia.org/wiki/Distributed_computing
A method of computer processing in which different parts of a program are run simultaneously on two or
more computers that are communicating with each other over a network. Distributed computing is a type
of segmented or parallel computing, but the latter term is most commonly used to refer to processing in
which different parts of a program run simultaneously on two or more processors that are part of the same
computer. While both types of processing require that a program be segmented (divided into sections that can run simultaneously), distributed computing also requires that the division of the program take into account the different environments on which the different sections of the program will be running. For example, two computers are likely to have different file systems and different hardware components.
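As a small illustration of that segmentation, here is a minimal sketch using Python's standard xmlrpc modules. For convenience both "workers" run as local threads on different ports; in a real deployment each would be a separate machine, which is exactly where the differing file systems and hardware mentioned above come into play.

```python
# A minimal sketch of distributed computing: parts of one program run on
# separate "workers" that communicate over the network. Here the workers are
# local threads on two ports; on a real grid they would be different machines.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def partial_sum(numbers):
    """Work unit executed on a worker node."""
    return sum(numbers)

def start_worker(port):
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    server.register_function(partial_sum)
    threading.Thread(target=server.serve_forever, daemon=True).start()

if __name__ == "__main__":
    # Pretend these are two machines; here they are two local ports.
    start_worker(9001)
    start_worker(9002)

    data = list(range(1, 101))
    halves = [data[:50], data[50:]]
    workers = [ServerProxy("http://localhost:9001"),
               ServerProxy("http://localhost:9002")]

    # The program is segmented: each worker computes its part, and the
    # coordinator combines the partial results over the network.
    total = sum(w.partial_sum(chunk) for w, chunk in zip(workers, halves))
    print(total)  # 5050
```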
Cluster Computing - http://en.wikipedia.org/wiki/Computer_cluster
A computer cluster is a group of linked computers, working together closely so that in many respects they
form a single computer. The components of a cluster are commonly, but not always, connected to each
other through fast local area networks. Clusters are usually deployed to improve performance and/or
availability over that provided by a single computer, while typically being much more cost-effective than
single computers of comparable speed or availability.
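A common way to program such a cluster is a message-passing library like MPI, where every node runs the same program and partial results are combined as though on one machine. The sketch below assumes the third-party mpi4py package and an MPI runtime, and would be launched across the cluster with something like mpiexec -n 4 python cluster_sum.py.

```python
# A minimal cluster-computing sketch using MPI (via the third-party mpi4py
# package). Every node runs this same script; each computes a partial sum of
# its slice of the data, and the results are combined on rank 0, so the
# cluster behaves like a single computer working on one problem.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's id within the cluster job
size = comm.Get_size()          # total number of processes in the job

data = range(1, 101)            # the whole problem (sum 1..100)
my_slice = [x for x in data if x % size == rank]    # this node's share

partial = sum(my_slice)
total = comm.reduce(partial, op=MPI.SUM, root=0)    # combine over the network

if rank == 0:
    print(f"{size} processes computed a total of {total}")  # 5050
```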
Additional Cloud Topics
Actor model
Cluster manager
Communication as a service
Grid computing
Online Office
Parallel computing
Parallel processing
Redundant Array of Inexpensive Servers
Software as a service
Utility computing
Virtual Private Cloud
Web operating system
Web Services
Grid Computing Concepts and related technology
Distributed computing
List of distributed computing projects
High-performance computing
Network Agility
Render farm
Semantic grid
Supercomputer
Computer cluster
Computon
Grid FileSystem
Edge computing
Metacomputing
Cloud Computing
Space based architecture (SBA)
Farm Computing
Link farm
Blade server
Data center
Render farm
Comparison of wiki farms
Server room
Comparison of Grid Computing vs. Cluster Computing
Grid computing's focus on the ability to support computation across administrative domains sets it apart from traditional computer clusters or traditional distributed computing. Grids offer a way of using information technology resources optimally inside an organization. In short, grid computing involves virtualizing computing resources.
Grid computing is often confused with cluster computing. Functionally, one can classify grids into several types: computational grids (including CPU-scavenging grids), which focus primarily on computationally intensive operations, and data grids, which handle the controlled sharing and management of large amounts of distributed data.
Definitions of Grid Computing
There are many definitions of the term grid computing:
1. A service for sharing computer power and data storage capacity over the Internet
2. An ambitious and exciting global effort to develop an environment in which individual users can
access computers, databases and experimental facilities simply and transparently, without having
to consider where those facilities are located. [RealityGrid, Engineering & Physical Sciences
Research Council, UK 2001] http://www.realitygrid.org/information.html
3. Grid computing is a model for allowing companies to use a large number of computing resources
on demand, no matter where they are located.
www.informatica.com/solutions/resource_center/glossary/default.htm
Difference Between Cluster Computing and Grid Computing
When two or more computers are used together to solve a problem, they form a computer cluster. There are several ways of implementing a cluster; Beowulf is perhaps the best-known, but basically a cluster is just cooperation between computers in order to solve a task or a problem. Cluster computing, then, is simply what you do when you use a computer cluster.
Grid computing is similar to cluster computing in that it makes use of several computers, connected in some way, to solve a large problem. There is often some confusion about the difference between grid and cluster computing. The big difference is that a cluster is homogeneous while a grid is heterogeneous: the computers that are part of a grid can run different operating systems and have different hardware, whereas the cluster computers all have the same hardware and OS. A grid can make use of spare computing power on a desktop computer, while the machines in a cluster are dedicated to working as a single unit and nothing else. Grids are inherently distributed by nature over a LAN, metropolitan area network or WAN. On the other hand, the computers in a cluster are normally contained in a single location or complex.
Another difference lies in the way resources are handled. In the case of a cluster, the whole system (all nodes) behaves as a single system image and resources are managed by a centralized resource manager. In the case of a grid, every node is autonomous, i.e. it has its own resource manager and behaves like an independent entity.
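To make that contrast concrete, here is a toy sketch (not modeled on any real scheduler or grid middleware): the cluster side has one centralized manager that assigns jobs to every node, while each grid node runs its own manager and applies its own local policy before accepting work.

```python
# A toy sketch (not any real scheduler) of the two resource models described
# above: a cluster's single centralized manager owns every node and assigns
# jobs itself, while each grid node runs its own manager and decides
# independently whether to accept work.

class ClusterManager:
    """Centralized resource manager: a single system view of all nodes."""
    def __init__(self, nodes):
        self.nodes = nodes
        self.next_node = 0

    def submit(self, job):
        # The manager decides; the nodes have no say of their own.
        node = self.nodes[self.next_node % len(self.nodes)]
        self.next_node += 1
        print(f"cluster manager assigned {job} to {node}")

class GridNode:
    """Autonomous grid node: it has its own manager and its own policy."""
    def __init__(self, name, busy):
        self.name, self.busy = name, busy

    def offer(self, job):
        # Each node applies its own local policy.
        if self.busy:
            print(f"{self.name} refused {job} (local policy: owner busy)")
            return False
        print(f"{self.name} accepted {job}")
        return True

if __name__ == "__main__":
    ClusterManager(["node1", "node2"]).submit("job-A")

    # On a grid, the submitter negotiates with each autonomous node in turn.
    for node in (GridNode("siteX", busy=True), GridNode("siteY", busy=False)):
        if node.offer("job-B"):
            break
```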
Characteristics of Grid Computing
Loosely coupled (decentralization)
Diversity and dynamism
Distributed job management & scheduling
Characteristics of Cluster Computing
Tightly coupled systems
Single system image
Centralized job management & scheduling