DEPARTMENT OF COMPUTER ENGINEERING
Subject: Cloud Computing
Unit-II
“Data Storage and Cloud Computing”
By: Prof. Nilam R. Thorat
Computer Department
Cloud Computing Unit 2: Data Storage and Cloud Computing
Data Storage: Introduction to Enterprise Data Storage, Direct Attached Storage,
Storage Area Network, Network Attached Storage, Data Storage Management,
File System, Cloud Data Stores, Using Grids for Data Storage. Cloud Storage:
Data Management, Provisioning Cloud storage, Data Intensive Technologies
for Cloud Computing. Cloud Storage from LANs to WANs: Cloud
Characteristics, Distributed Data Storage.
Course Objective for Unit 2:
To learn various data storage methods on cloud
Course Outcome:
CO2: Use appropriate data storage technique on Cloud, based on Cloud application
#Exemplar/Case Studies
Online Book Marketing Service, Online Photo Editing Service
Learning Resources
Text Books:
1. A. Srinivasan, J. Suresh, "Cloud Computing: A Practical Approach for Learning and Implementation", Pearson, ISBN: 978-81-317-7651-3 (Chapter 11, page no. 137)
2. Rajkumar Buyya, Christian Vecchiola, S. Thamarai Selvi, "Mastering Cloud Computing", McGraw Hill Education, ISBN-13: 978-1-25902995-0
Reference Books:
1. James Bond, "The Enterprise Cloud", O'Reilly Media, Inc., ISBN: 9781491907627
2. Dr. Kris Jamsa, "Cloud Computing: SaaS, PaaS, IaaS, Virtualization and more", Wiley Publications, ISBN: 978-0-470-97389-9
3. Anthony T. Velte, Toby J. Velte, Robert Elsenpeter, "Cloud Computing: A Practical Approach", 2010, The McGraw-Hill.
4. Gautam Shroff, "Enterprise Cloud Computing: Technology, Architecture, Applications", Cambridge University Press, ISBN: 9780511778476
5. Tim Mather, Subra K., Shahid L., "Cloud Security and Privacy", O'Reilly, ISBN-13: 978-81-8404-815
e-Books:
• https://sjceodisha.in/wp-content/uploads/2019/09/CLOUD-COMPUTING-Principles-and-Paradigms.pdf
• https://studytm.files.wordpress.com/2014/03/hand-book-of-cloud-computing.pdf
• https://arpitapatel.files.wordpress.com/2014/10/cloud-computing-biblel.pdf
• https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.500-291r2.pdf
NPTEL video lecture links
• Cloud Computing: https://onlinecourses.nptel.ac.in/noc21_cs14/preview
• Cloud Computing and Distributed Systems: https://onlinecourses.nptel.ac.in/noc21_cs15/preview
• https://www.digimat.in/nptel/courses/video/106105167/L01.html
• https://www.digimat.in/nptel/courses/video/106105167/L03.html
• https://www.digimat.in/nptel/courses/video/106105167/L20.html
The CO-PO Mapping Matrix
CO/PO  PO1  PO2  PO3  PO4  PO5  PO6  PO7  PO8  PO9  PO10  PO11  PO12
CO1     1    2    1    1    -    -    -    -    -    -     -     -
CO2     1    2    1    -    -    -    -    -    -    -     -     -
CO3     1    2    1    -    2    -    -    -    -    -     -     -
CO4     1    2    2    1    -    -    -    -    -    -     -     1
CO5     1    2    2    2    -    -    -    -    -    -     -     -
CO6     1    2    2    1    1    -    -    -    -    -     -     1
Data Storage: Introduction
• Technology has changed so quickly in the past few decades that the demand for high quality data storage has exploded.
• The largest hard drives thirty years ago would hardly be able to store more than a couple of current text documents.
• Floppy disks have been replaced by memory cards, USB drives, and CDs. Every single day, the world creates over 2.5 quintillion bytes of data, with most of that data having been created in the past few years.
• Enterprises are collecting all sorts of data: customer data, elevation data, sales data, efficiency data, analytics, calculations, and much more.
Threats related to cloud computing
Disk failure, Disgruntled Employees, Network failure
Top 11 threats in cloud computing
The newest risks involved in cloud computing point to problems connected to configuration and authentication rather than the traditional focus on malware and vulnerabilities, according to a new Cloud Security Alliance report.
1. Data breaches
A data breach is any cyber security incident or attack in which sensitive or confidential information is viewed, stolen, or used by an unauthorized individual.
2. Misconfiguration and inadequate change control
Misconfiguration occurs when computing assets are set up incorrectly, leaving them vulnerable to malicious activity.
3. Lack of cloud security architecture and strategy
As companies migrate parts of their IT infrastructure to the public cloud, one of the largest challenges is implementing the proper security to guard against cyber-attacks.
4. Insufficient identity, credential, access and key management
Security incidents and breaches can occur due to the insufficient protection of credentials, a lack of regular automated rotation of cryptographic keys and passwords, a lack of scalable identity and credential management systems, a failure to use multifactor authentication, and a failure to use strong passwords.
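As a small illustration of the key-management point above, here is a minimal, hedged sketch of automating access key rotation with the AWS SDK for Python (boto3); the IAM user name is hypothetical, and a production script would distribute the new key to its consumers before retiring the old one.

import boto3

iam = boto3.client("iam")
USER = "example-service-account"  # hypothetical IAM user name

def rotate_access_key(user_name: str) -> str:
    # list the user's existing keys, create a fresh one, then retire the old ones
    old_keys = iam.list_access_keys(UserName=user_name)["AccessKeyMetadata"]
    new_key = iam.create_access_key(UserName=user_name)["AccessKey"]
    for key in old_keys:
        iam.update_access_key(UserName=user_name,
                              AccessKeyId=key["AccessKeyId"],
                              Status="Inactive")
        iam.delete_access_key(UserName=user_name,
                              AccessKeyId=key["AccessKeyId"])
    return new_key["AccessKeyId"]

# rotate_access_key(USER)  # requires valid AWS credentials and permissions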
5. Account hijacking
Through account hijacking, attackers gain access to and abuse accounts that are highly privileged or sensitive. In cloud environments, the accounts at highest risk are cloud service accounts or subscriptions.
6. Insider threats
Insiders do not have to break through firewalls, virtual private networks (VPNs), and other security defenses; instead they operate at a trusted level where they can directly access networks, computer systems, and sensitive data.
7. Insecure interfaces and APIs
APIs (Application Programming Interfaces) and UIs (User Interfaces) are normally the most exposed parts of a system, often the only asset with a public IP address available outside the trusted boundary. From authentication and access control to encryption and activity monitoring, these interfaces must be designed to protect against both accidental and malicious attempts to circumvent security.
8. Weak control plane
The control plane enables security and integrity to complement the data plane, which provides the stability of the data. A weak control plane means the person in charge is not in full control of the data infrastructure's logic, security, and verification.
9. Metastructure and applistructure failures
Potential failures exist at multiple levels in the metastructure and applistructure model. For example, poor API implementation by the cloud provider offers attackers an opportunity to disrupt cloud customers by compromising the confidentiality, integrity, or availability of the service.
10. Limited cloud usage visibility
Limited cloud usage visibility occurs when an organization does not have the ability to visualize and analyze whether cloud service use within the organization is safe or malicious.
11. Abuse and nefarious use of cloud services
Malicious actors may leverage cloud computing resources to target users, organizations, or other cloud providers, and can also host malware on cloud services. Some examples of the misuse of cloud resources include: launching DDoS attacks, email spam and phishing campaigns, "mining" for digital currency, large-scale automated click fraud, brute-force attacks on stolen credential databases, and hosting of malicious or pirated content.
Introduction to Enterprise Data Storage
What is NAS Storage?
• NAS (Network Attached Storage) refers to a type of server that makes storage available within a network. Its primary purpose is to provide centralized and shared access to unstructured data, including audio, video, websites, text files, and other important documents.
• Typically, NAS solutions are file-level data storage solutions connected to a specific computer network, offering data access to a heterogeneous group of clients.
• This kind of solution is an absolute necessity for enterprises and organizations that don't want to store their data on a single computer or hard drive.
NAS Storage Benefits
• The major benefit of NAS is easy access to stored data by any networked device. Some vendors also offer remote access.
• NAS offers options for backup and capabilities to address redundancy needs. In addition to supporting multiple mirrored hard drives, NAS can be formatted to support replicated disks, RAID configurations, and erasure coding to ensure data integrity.
• All of this functionality takes place inside a small physical space, compared to the sheer amount of physical space required for SAN solutions.
What is NAS storage used for?
• There are NAS solutions available at every business level, whether you're a sole proprietor or a large enterprise.
• Nevertheless, NAS provides the most value for mid-sized businesses, and is also useful for businesses that employ remote teams and workers in different time zones.
Direct Attached Storage
What is Direct Attached Storage?
• Direct attached storage is data storage that is connected directly to a computer such as a PC or server, as opposed to storage that is connected to a computer over a network. Sometimes known as DAS, direct attached storage has an important role to play in many organizations' storage plans because of the specific benefits that it offers.
How Does DAS Work?
• Almost every PC uses direct attached storage in the form of one or more internal storage drives, which may be traditional hard disk drives or faster Solid State Drives (SSDs), typically connected using a Serial Advanced Technology Attachment (SATA) interface.
• Many servers are also equipped with internal storage drives, which may be connected using SATA, the faster Small Computer System Interface (SCSI), Serial-Attached SCSI (SAS), or other high speed interfaces for better storage performance.
• But direct attached storage does not have to be connected to a computer system internally.
• It also includes external drives or drive enclosures (which may contain multiple drives), typically connected using USB, eSATA, SAS or SCSI to a separate computer system.
• The important feature of all direct attached storage is that it is controlled by a single computer to which it is attached.
• That means that any additional computer that needs to access the data stored on direct attached storage has to communicate with the computer it is attached to, rather than being able to access the data directly.
FIG 2.1 : Direct Attached Storage ( DAS )
As the name suggests, direct attached storage is directly connected to the computing device it serves, rather than using a more indirect network connection.
Benefits of DAS
• High performance: Direct attached storage offers fast access to data because it is attached to the computer that usually needs it.
• Easy to set up and configure: Computer systems are frequently supplied with internal direct attached storage which is ready to use immediately.
• Low cost: Direct attached storage consists only of the storage device itself, plus any drive enclosure.
Drawbacks of DAS:
• Limited scalability: Direct attached storage is difficult to scale because options are limited by the number of internal drive bays, the availability of external ports, and the capacity of external direct attached storage devices.
• Poor performance possible when data needs to be shared: Direct attached storage connected to a PC can be slow to provide data to other computers on a network because performance depends in part on the resources of the host PC.
• No central management and backup: Ensuring the data stored on direct attached storage is available and backed up is much more difficult and generally more costly than arranging redundancy and backups on networked storage devices, which may include their own management, RAID and backup software.
DAS Architecture
Direct attached storage architecture is very simple: PCs may access their own direct attached storage directly, or they can access data stored on direct attached storage connected to storage servers over a network.
Other storage architectures such as those used with Network Attached Storage (NAS) and Storage Area Network (SAN) solutions are more complex, but offer benefits that direct attached storage cannot deliver.
Storage Area Network
What Is a Storage Area Network (SAN)?
• A Storage Area Network (SAN) is a dedicated, high speed network that provides block-level network access to storage.
• SANs are typically composed of hosts, switches, storage elements, and storage devices that are interconnected using a variety of technologies, topologies, and protocols.
• SANs may also span multiple sites.
• A SAN presents storage devices to a host such that the storage appears to be locally attached.
• This simplified presentation of storage to a host is accomplished through the use of different types of virtualization.
FIG. : Storage Area Network ( SAN )
SANs are often used to:
● Improve application availability (e.g., multiple data paths)
● Enhance application performance (e.g., off-load storage functions, segregate networks, etc.)
● Increase storage utilization and effectiveness (e.g., consolidate storage resources, provide tiered storage, etc.), and improve data protection and security.
● SANs also typically play an important role in an organization's Business Continuity Management (BCM) activities.
● SANs are usually based on Fibre Channel (FC) technology that uses the Fibre Channel Protocol (FCP) for open systems and proprietary variants for mainframes.
● In addition, the use of Fibre Channel over Ethernet (FCoE) makes it possible to move FC traffic across existing high speed Ethernet infrastructures and converge storage and IP protocols onto a single cable.
● Other technologies like the Internet Small Computer System Interface (iSCSI), commonly used in small and medium sized organizations as a less expensive alternative to FC, and InfiniBand, commonly used in high performance computing environments, can also be used.
In addition, it is possible to use gateways to move data between different SAN technologies.
SNIA is a worldwide source for vendor-neutral Storage and Information Management training and education. It provides an independent understanding of a broad range of Storage and Information Management technologies, from basic foundations to advanced techniques.
Network Attached Storage
What is NAS (Network Attached Storage) and why is NAS Important?
It is accepted that data is a critical asset for companies. Without access to their corporate data, businesses may not be capable of serving their customers with the expected level of service. Poor customer service, loss of sales, or even business liquidation can be the result of company information not being available.
What is Network Attached Storage (NAS)?
The SNIA Dictionary defines NAS as:
A term used to refer to storage devices that attach to a network and provide file access services to computer systems. These devices normally consist of an engine that implements the file services, and one or more devices on which data is stored. NAS uses file access protocols such as NFS or CIFS.
Some of the benefits of NAS include:
• Simple to operate; a dedicated IT professional is normally not required
• Lower cost; can significantly reduce wasted space compared to other storage technologies like SAN
Data Storage Management
Cloud Data Management
Cloud data management is a way to manage data across cloud platforms, either together with or instead of on-premises storage. The goal is to curb rising cloud storage costs, but it can be a rather complicated pursuit, which is why most businesses employ an external company that provides cloud data management services.
Advantages of Cloud Data Management
Optimal cloud data management provides four key capabilities that help to decrease cloud storage costs:
1. Gain Accurate Visibility Across Cloud Accounts into Actual Usage
2. Forecast Savings and Plan Data Management Policies
3. Archive Based on Actual Data Usage to Avoid Surprises
4. Radically Simplify Migrations
Challenges Faced with Enterprise Cloud Data Management
Cloud data management solutions should provide you with options to remove this disruption by transparently managing data across common formats such as file and object.
Features of a Cloud Data Management Platform
Some common features and capabilities cloud data management solutions should provide:
• Data Analytics: Can you get a view of all your cloud data, how it's being used, and how much it's costing you?
• Planning and Forecasting: Can you set policies for how data should get moved, either from one cloud storage class to another or from on-premises storage to the cloud?
• Policy-based data archiving, data replication, and data management: How much babysitting do you have to do to move and manage data?
• Fast, Reliable Cloud Data Migration: Does the system support migrating on-premises data to the cloud?
• Intelligent Cloud Archiving, Intelligent Tiering and Data Lifecycle Management: Does the solution enable you to manage the ongoing data lifecycle in the cloud? Does it support the different cloud storage classes?
How do Cloud Data Management Tools work?
As more enterprise data runs on public cloud infrastructure, many different types of tools and approaches to cloud data management have emerged. The initial focus has been on migrating and managing structured data in the cloud. Cloud data integration, ETL (extraction, transformation and loading), and iPaaS (integration platform as a service) tools are designed to move and manage enterprise applications and databases in the cloud.
What are the challenges faced with Cloud Data Management security?
Most of the cloud data management security concerns are related to the general cloud computing security questions organizations face.
Is adoption of Cloud Data Management services growing?
As enterprise IT organizations increasingly run hybrid, multicloud, and edge computing infrastructure, cloud data management services have emerged as a critical requirement.
How is Enterprise Cloud Data Management different from Consumer Systems?
While consumers need to manage cloud storage, it is usually a matter of managing capacity across personal storage and devices. Enterprise cloud data management involves IT organizations working closely with departments to build plans and policies that will ensure unstructured data growth is managed and data is accessible and available to the right people at the right time.
Cloud Data Management Services
• Get accurate analytics across clouds with a single view across all your users' cloud accounts and buckets, and save on storage costs with an analytics-driven approach.
• Forecast cloud cost optimization by setting different data lifecycle policies based on your own cloud costs.
• Establish policy-based multi-cloud lifecycle management by continuously and transparently moving objects by policy across storage classes (e.g., Amazon Standard, Standard-IA, Glacier, Glacier Deep Archive); see the sketch after this list.
• Accelerate cloud data migrations with fast, efficient data migrations across clouds (e.g., AWS, Azure, Google and Wasabi) and even on-premises (ECS, IBM COS, Pure FlashBlade).
• Deliver powerful cloud-to-cloud data replication by running, monitoring, and managing hundreds of migrations faster than ever at a fraction of the cost with Elastic Data Migration.
• Keep your users happy with no recovery fee surprises and no disruption to users and applications from making poor data movement decisions based on when the data was created.
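As an illustration of policy-based tiering across storage classes, the sketch below sets a lifecycle rule on an Amazon S3 bucket with boto3 so objects move to Standard-IA, then Glacier, then Glacier Deep Archive as they age; the bucket name and the day thresholds are assumptions, not recommendations.

import boto3

s3 = boto3.client("s3")

lifecycle_rules = {
    "Rules": [{
        "ID": "tier-cold-data",            # hypothetical rule name
        "Filter": {"Prefix": ""},          # apply to every object in the bucket
        "Status": "Enabled",
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},
            {"Days": 90, "StorageClass": "GLACIER"},
            {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
        ],
    }]
}

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",       # hypothetical bucket
    LifecycleConfiguration=lifecycle_rules,
)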
Improve performance in the cloud through load balancing
Improve your cloud application performance using a Load Balancer
Random spikes in online traffic can cause trouble for even the best websites and applications. During online sales events such as Amazon Prime Day, even a brief outage of a few seconds can cost millions in revenue.
FIG.: Firewall and Load Balancer (user requests and directed traffic pass through a firewall, load balancer, and virtual router to multiple cloud instances)
What is a Load Balancer?
A load balancer is a device which distributes network or application traffic across a number of servers. It enables the systems to fulfill requests in a manner that maximizes speed and capacity utilization.
In this manner, a load balancer performs the following functions:
• Distributes client requests or network load efficiently across multiple servers
• Ensures high availability and reliability by sending requests only to servers that are online
• Provides the flexibility to add or subtract servers as demand dictates
Key requirements of Load Balancers
Load balancers are an important component of an Application Delivery Controller (ADC).
Automation
The cloud is very elastic and dynamic in nature, and if your virtual machine (VM) requires manual operation you won't be able to perform operations efficiently. The cloud environment should have automated configuration and provisioning to incorporate automation.
Manageability
Load balancing is pretty much commoditized right now, and the act of load balancing is very straightforward. Because it is an essential part of the cloud architecture, you can get both open source and enterprise offerings.
Flexibility
The enterprise cloud should preferably be multi-cloud for 100% reliability. For example, if you are solely relying on AWS for cloud support and it has an outage, what will you do? To migrate to a different cloud, the development team would have to redesign the architecture.
Load Balancing Algorithms
A load balancing algorithm controls the distribution of incoming requests to your group of servers.
Round Robin
The most basic load distribution technique. In a round robin scenario, the load balancer simply runs down the list of servers, sending one connection to each in turn, and starting at the top of the list when it reaches the end.
FIG 2.4 : Round Robin
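A minimal sketch of the round robin rotation described above, in Python; the server names are illustrative and a real load balancer would also skip servers that fail health checks.

from itertools import cycle

servers = ["server-1", "server-2", "server-3"]
rotation = cycle(servers)        # walks the list top to bottom, then restarts

for request_id in range(7):
    print(f"request {request_id} -> {next(rotation)}")
# requests 0, 1, 2 go to server-1, server-2, server-3; request 3 starts over at server-1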
Weighted Round Robin
It works on the same principle as Round Robin, but the number of connections that each machine receives over time is proportional to a ratio weight predefined for each machine.
FIG 2.5 : Weighted Round Robin
For example, the administrator can define that Server 1 can handle five times the traffic of Server 2, and thus the load balancer should send five requests to Server 1 for each request sent to Server 2.
Least Connections
Assigns the new session to the server with the fewest connections at the time of session initiation. This method is advisable in an environment where server capacity and resources are uniform, to avoid latency. Least Connections is considered problematic, as most implementations struggle to accurately measure actual server workload.
FIG 2.6 : Least Connections
In the above example, Server 1 is handling only one request (5) whereas Server 2 is handling three requests (2, 4, 6). Thus the subsequent requests will be handled by Server 1.
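The same selection rule can be sketched in a few lines of Python, mirroring the example above in which Server 1 holds one active session and Server 2 holds three; the session counts are illustrative only.

active_sessions = {"Server 1": 1, "Server 2": 3}

def pick_least_connections(sessions: dict) -> str:
    # choose the server currently holding the fewest active sessions
    return min(sessions, key=sessions.get)

target = pick_least_connections(active_sessions)
active_sessions[target] += 1     # the new session is assigned to that server
print(target)                    # -> Server 1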
Weighted Least Connections
Similar to Least Connections, except that servers are selected based on capacity, not just availability. For each node, the admin specifies a Connection Limit value, and the system creates a proportional algorithm on which load balancing is based.
Least Pending Requests
The emerging industry standard, Least Pending Requests selects the server with the fewest active sessions based on real-time monitoring. It needs assignment of both a Layer 7 and a TCP profile to the virtual server.
File System
Cloud File Systems with architecture
Every era of IT redesigns how organizations process data, design applications and provision infrastructure. The most significant, transformational technologies often necessitate different application and infrastructure architectures to fully exploit their benefits. Currently, the cloud is triggering a transformation of enterprise systems and software.
Evolution of app architecture
Since the days of the mainframe and minicomputer, enterprise applications have been monolithic -- large, extensive systems with upward of a hundred thousand lines of code. The PC era and client-server computing led to a modicum of functional disaggregation, as application user interfaces were built for the PC OS.
Additionally, back-end software was split into distinct middleware and database systems. After adapting to technology developments, like server virtualization and SAN storage pools, such N-tier separation persists today in most on-premises systems.
FIG 2.7: Evolution of app infrastructure
Cloud services significantly alter application architectures through several innovations:
• On-demand, elastically scalable compute instances and multiple forms of storage
• Event-driven serverless services
• Application container instances and managed container clusters
• Packaged services for many standard functions such as databases, data warehouses, data caching, load balancing, message queuing, virtual network routing and isolation, notifications and content distribution
Cloud File Systems GFS/HDFS
What Does Google File System (GFS) Mean?
Google File System (GFS) is a scalable Distributed File System (DFS) created by Google Inc. and developed to accommodate Google's increasing data processing requirements. GFS provides fault tolerance, reliability, scalability, availability and performance to large networks and connected nodes. The Google File System capitalized on the strength of off-the-shelf servers while minimizing hardware weaknesses.
GFS is also known as Google FS.
Google File System (GFS)
The GFS node cluster consists of a single master and multiple chunk servers that are continuously accessed by different client systems. Chunk servers store data as Linux files on local disks. Stored data is divided into large chunks (64 MB), which are replicated in the network a minimum of three times. The large chunk size reduces network overhead.
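A minimal sketch (not Google's actual code) of how a GFS-style master could split a file into 64 MB chunks and place three replicas of each chunk on chunk servers; the server names and the simple rotation placement policy are assumptions.

import itertools

CHUNK_SIZE = 64 * 1024 * 1024            # 64 MB chunks, as described above
REPLICAS = 3                              # each chunk stored at least three times
chunk_servers = ["cs-1", "cs-2", "cs-3", "cs-4", "cs-5"]

def place_chunks(file_size_bytes: int) -> dict:
    # return a mapping of chunk index -> chunk servers holding its replicas
    num_chunks = -(-file_size_bytes // CHUNK_SIZE)    # ceiling division
    rotation = itertools.cycle(chunk_servers)
    return {c: [next(rotation) for _ in range(REPLICAS)] for c in range(num_chunks)}

print(place_chunks(200 * 1024 * 1024))    # a 200 MB file needs 4 chunks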
GFS features include:
• Fault tolerance
• Critical data replication
• Automatic and efficient data recovery
• High aggregate throughput
• Reduced client and master interaction because of the large chunk size
• Namespace management and locking
• High availability
The largest GFS clusters have more than 1,000 nodes with over 300 TB of disk storage capacity. These can be accessed by hundreds of clients on a continuous basis.
Hadoop Distributed File System (HDFS)
The Hadoop Distributed File System (HDFS) is the primary data storage system
used by Hadoop applications.
The Hadoop File System was developed using a distributed file system design. It runs on commodity hardware. Unlike other distributed systems, HDFS is highly fault tolerant and designed using low-cost hardware.
Features of HDFS
• It is suitable for distributed storage and processing.
• Hadoop provides a command interface to interact with HDFS.
• The built-in servers of the name node and data node help users to easily check the status of the cluster.
• Streaming access to file system data.
• HDFS provides file permissions and authentication.
HDFS Architecture
Given below is the architecture of a Hadoop File System.
FIG 2.8 : HDFS Architecture
HDFS follows the master-slave architecture and it has the following elements.
Name node
The name node is the commodity hardware that contains the GNU/Linux operating system and the name node software. It is software that can be run on commodity hardware. The system having the name node acts as the master server and it performs the following tasks:
• Manages the file system namespace.
• Regulates clients' access to files.
• It also executes file system operations such as renaming, closing, and opening files and directories.
Data node
The data node is commodity hardware having the GNU/Linux operating system and data node software. For every node (commodity hardware/system) in a cluster, there will be a data node. These nodes manage the data storage of their system.
• Data nodes perform read-write operations on the file systems, as per client request.
• They also perform operations such as block creation, deletion, and replication according to the instructions of the name node.
Block
Usually the user data is stored in the files of HDFS. A file in the file system will be divided into one or more segments and/or stored in individual data nodes. These file segments are called blocks. In other words, the minimum amount of data that HDFS can read or write is called a block. The default block size is 64 MB, but it can be increased as needed by changing the HDFS configuration.
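A small worked example of the block arithmetic implied above: with the default 64 MB block size, a 200 MB file occupies three full blocks plus one partial block of 8 MB. The figures are illustrative, not taken from a running HDFS cluster.

BLOCK_SIZE_MB = 64
file_size_mb = 200

full_blocks, remainder_mb = divmod(file_size_mb, BLOCK_SIZE_MB)
total_blocks = full_blocks + (1 if remainder_mb else 0)

print(total_blocks)     # -> 4 blocks
print(remainder_mb)     # -> 8 MB stored in the final, partially filled block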
Goals of HDFS
Fault detection and recovery - Since HDFS includes a large number of commodity hardware components, failure of components is frequent. Consequently, HDFS should have mechanisms for quick and automatic fault detection and recovery.
Huge datasets - HDFS should have hundreds of nodes per cluster to manage applications having huge datasets.
Hardware at data - A requested task can be done efficiently when the computation takes place near the data. Especially where huge datasets are involved, this reduces network traffic and increases throughput.
2.2.1 Cloud Data Stores
Should you store your data in the cloud?
At its most basic level, "the cloud" is just fancy talk for a network of connected servers. (And a server is merely a computer that delivers data or services to other computers.) When you save files to the cloud, they can be opened from any computer connected to that cloud's network. So it's not just some nebulous concept. It's physical, tangible, real.
FIG 2.9 : Cloud Data Stores
When you save files to the cloud, you can access them on any computer, provided it's connected to the Internet and you're signed into your cloud services platform. Take Google Drive. If you use Gmail, you can access Drive wherever you can access your email. Sign in for one service and find your entire library of documents and photos in another.
Google Data Store.
Google Cloud Datastore (Cloud Datastore) is a highly scalable, fully managed NoSQL database service offered by Google on the Google Cloud Platform. Cloud storage is something that allows you to save data and files in an off-site location that you access either through the public internet or a dedicated private network connection.
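A minimal sketch of writing and reading an entity with the google-cloud-datastore client library; the kind name "Book", the entity name, and the project credentials are assumptions for illustration.

from google.cloud import datastore

client = datastore.Client()                       # uses the default project and credentials

# store an entity
key = client.key("Book", "cloud-computing-101")   # kind and name are hypothetical
entity = datastore.Entity(key=key)
entity.update({"title": "Cloud Computing", "copies": 3})
client.put(entity)

# read it back
fetched = client.get(key)
print(fetched["title"], fetched["copies"])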
Google Cloud Platform
Google Cloud Platform (GCP), offered by Google, is a suite of cloud computing services that runs on the same infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail, Google Drive, and YouTube.
Google Cloud Platform is a part of Google Cloud, which comprises the Google
Cloud Platform public cloud infrastructure, as well as Google Workspace (G
Suite), enterprise versions of Android and Chrome OS, and Application
Programming Interfaces (APIs) for machine learning and enterprise mapping
services.
Information security concerns associated with data stored in the cloud
Top 5 Private Cloud Security Issues
1. Lack of consistent security controls spanning traditional server and virtualized private cloud infrastructures
2. Increasing complexity of infrastructure resulting in more time/effort for implementation and maintenance
3. Lack of staff with skills to manage security for a software defined data center (e.g., virtual compute, network, storage)
4. Limited visibility over security for a software defined data center (e.g., virtual compute, network, storage)
5. Advanced threats and attacks
Using Grids for Data Storage
Grid Computing
Grid computing is the practice of leveraging multiple computers, often geographically distributed but connected by networks, to work together to accomplish joint tasks. It is typically run on a "data grid," a set of computers that directly interact with each other to coordinate jobs.
How Does Grid Computing Work?
• Grid computing works by running dedicated software on every computer that participates in the data grid. The software acts as the manager of the entire system and coordinates various tasks across the grid.
• Specifically, the software assigns subtasks to each computer so they can work concurrently on their respective subtasks.
• After the completion of the subtasks, the outputs are gathered and combined to complete the larger-scale task.
• The software lets each computer communicate over the network with the other computers, so they can share information on what portion of the subtasks each computer is running, and how to consolidate and deliver the outputs.
FIG 2.10 : Grid Computing
With grid computing, specialized software runs on every computer that participates in the data grid. This controller software acts as the manager of the entire system and coordinates various tasks across the grid.
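A minimal sketch of the coordinator/worker pattern just described, simulated on one machine with a process pool; the subtask and the worker count stand in for real grid nodes and middleware.

from concurrent.futures import ProcessPoolExecutor

def run_subtask(subtask: range) -> int:
    # each participating computer works on its own portion of the data
    return sum(x * x for x in subtask)

def main() -> None:
    subtasks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor(max_workers=4) as grid:   # stands in for 4 grid nodes
        partial_results = list(grid.map(run_subtask, subtasks))
    # the coordinating software gathers and combines the outputs
    print(sum(partial_results))

if __name__ == "__main__":
    main()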
How is Grid Computing Used?
• Grid computing is especially useful when different subject matter experts need to collaborate on a project but do not necessarily have the means to directly share data and computing resources at a single site.
• By joining forces despite the geographical distance, the distributed teams are able to leverage their own resources that contribute to a bigger effort.
FIG 2.11 : In-Memory Data Grid
• While frequently seen as a large-scale distributed computing endeavor, grid computing can also be leveraged at a local level.
• For instance, a corporation that allocates a set of computer nodes running in a cluster to collectively perform a given task is a simple example of grid computing in action.
Cloud Storage:
What is Cloud Storage?
Cloud storage is a flexible and convenient new-age solution to store data. In the past, data was stored on hard drives and external storage devices such as floppy disks, thumb drives, and compact discs.
Different cloud storage providers
How Does Cloud Storage Work?
Cloud storage lets you store data on hosted servers. The remote servers are managed and owned by hosting companies. You can access your data via the Internet.
With so many cloud storage providers flooding the market today, the size and maintenance of cloud storage systems can differ quite a bit based on the provider.
Data Limit for Free Version | Premium Option           | App Store Rating | Google Play Store Rating
2 GB                        | $11.99/month for 2 TB    | 4.5              | 4.3
15 GB                       | $1.99/month for 100 GB   | 4.6              | 4.4
10 GB                       | $3.99/month for 500 GB   | 3.5              | 4.4
5 GB                        | $1.99/month for 100 GB   | 4.7              | 4.6
10 GB                       | $2.49/month for 1 TB     | 3.9              | 4.1
5 GB                        | $5/month for 500 GB      | 5                | 3.2
10 GB                       | $10/month for 100 GB     | 4.8              | 4.7
5 GB                        | $4.34/month for 83 GB    | 4.5              | 3.7
5 GB                        | $0.99/month for 50 GB    | 4.9              | 3.5
The smallest cloud storage system may consist of a single data server that connects to the Internet, while some cloud storage systems are so enormous that the equipment can fill entire warehouses. These warehouses are called "server farms."
Leading cloud storage providers like Microsoft Azure, Amazon Web Services (AWS), and Google Cloud maintain such gigantic data centers that store data from all over the world.
Data for Cloud Storage Providers
What Can Cloud Storage Do for You?
Cloud storage offers many benefits which can potentially improve efficiency and productivity in terms of backing up and safeguarding data. Here are a few of them:
• Accessibility: Data stored on the cloud can be accessed on the go, anytime, and anywhere. All you need is an internet connection.
• Mobility: Cloud storage providers even offer applications that work with various devices such as mobile phones and tablets.
• Synchronization: You have the option to sync all your files across all devices so that you have the most current version available to you all the time, creating a single source of truth.
• Collaboration: Cloud storage services come with features that allow multiple people to collaborate on a single file even if they are spread across numerous locations around the world.
• Cost-Saving: Cloud storage providers generally require you to pay only for the amount of storage you use, which keeps businesses from over-investing in their storage needs.
• Scalable: Cloud storage providers offer various plans that can quickly scale your data storage capacity to meet the increasing needs of your business.
• Low Maintenance: The responsibility for upkeep of the storage lies with the cloud storage provider.
• Space-Saving: Servers and even other forms of physical storage devices such as hard disks and USB drives require space.
• Reduced Carbon Footprint: Even a small data center requires servers, networks, power, cooling, space, and ventilation, all of which can contribute significantly to energy consumption and CO2 emissions.
• Security: Cloud storage solutions are designed to be very resilient. They work as redundant backups, as most cloud storage providers have about two to three backup servers located in different places globally.
How to Use Cloud Storage?
While individuals use it for personal storage to store email backups, pictures, videos, and other such personal files, enterprises use cloud storage as a commercially maintained remote backup solution, where they can securely transfer and store data files and even share them among various locations.
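A minimal sketch of that backup workflow against Amazon S3 using boto3; the bucket and file names are hypothetical and credentials are assumed to be configured.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-company-backups"     # hypothetical bucket

# transfer a local file to cloud storage
s3.upload_file("reports/q1-sales.csv", BUCKET, "backups/q1-sales.csv")

# later, retrieve it from any location with a connection and credentials
s3.download_file(BUCKET, "backups/q1-sales.csv", "restored-q1-sales.csv")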
What Are the Different Types of Cloud Storage?
There are primarily three types of cloud storage solutions:
1. Public Cloud Storage
Suitable for unstructured data, public cloud storage is offered by third-party cloud storage providers over the open Internet. It may be available for free or on a paid basis, and users are usually required to pay for only what they use.
2. Private Cloud Storage
A private cloud allows organizations to store data in their own environment. The infrastructure is hosted on-premises. It offers many of the benefits that come with a public cloud service, such as self-service and scalability, while the dedicated in-house resources increase the scope for customization and control. Internal hosting and company firewalls also make this the more secure option.
3. Hybrid Cloud Storage
As the name suggests, a hybrid cloud allows data and applications to be shared between a public and a private cloud. Businesses that have a private, on-premises solution can effortlessly scale up to the public cloud to handle any short-term spikes or overflow.
Data Management
With company databases increasing in size, and multiple people trying to access the data from numerous locations, managing that data has become an increasingly challenging task. Today's data managers need a system that is versatile enough to meet all of their employees' access needs, while still ensuring data security. Many companies are finding solutions to these data challenges using the cloud.
What is cloud data management?
Cloud data management is the practice of storing a company's data on an offsite server that is typically owned and overseen by a vendor who specializes in cloud data hosting. Managing data in the cloud offers an automatic backup strategy, professional support, and ease of access from any location.
7 benefits of cloud data management
The benefits of a cloud data management system are consistent with the overall benefits the cloud has to offer, and they are substantial.
1. Security: Modern cloud data management is often more secure than on-premises solutions. In fact, 94% of cloud adopters report security improvements.
2. Scalability and savings: Cloud data management lets users scale services up or down as needed.
3. Governed access: With enhanced security comes greater peace of mind regarding governed data access. Cloud storage means team members can access the data they need from anywhere they are.
4. Automated backups and disaster recovery: The cloud storage vendor can manage and automate data backups so that the company can focus its attention on other things, and can rest assured that its data is safe.
5. Improved data quality: An integrated, well-governed cloud data management solution helps companies tear down data silos and create a single source of truth for every data point. Data remains clean, consistent, and up-to-date.
6. Automated updates: Cloud data management providers are committed to providing the best services and capabilities. When applications need updating, cloud providers run these updates automatically. That means your team doesn't need to pause work while they wait for IT to update everyone's system.
7. Sustainability: For companies and brands committed to decreasing their environmental impact, cloud data management is a key step in the process.
Provisioning Cloud Storage
Cloud provisioning
Cloud provisioning is the allocation of a cloud provider's resources and services to a customer.
Cloud provisioning is a key feature of the cloud computing model, relating to how a customer acquires cloud services and resources from a cloud provider. The growing catalog of cloud services that customers can provision includes Infrastructure as a Service (IaaS), Software as a Service (SaaS) and Platform as a Service (PaaS) in public or private cloud environments.
Types of cloud provisioning
The cloud provisioning process can be carried out using one of three delivery models. Each delivery model differs depending on the kinds of resources or services an organization purchases, how and when the cloud provider delivers those resources or services, and how the customer pays for them.
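As a concrete, hedged example of self-service provisioning, the sketch below provisions a new storage resource (an Amazon S3 bucket) through a single API call with boto3; the bucket name and region are assumptions.

import boto3

s3 = boto3.client("s3", region_name="ap-south-1")

s3.create_bucket(
    Bucket="example-provisioned-bucket",                           # hypothetical name
    CreateBucketConfiguration={"LocationConstraint": "ap-south-1"},
)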
Data Intensive Technologies for Cloud Computing
Data-Intensive Computing
Characterizing data-intensive computations
Data-intensive applications not only deal with huge volumes of data but, very often, also exhibit compute-intensive properties. The following figure identifies the domain of data-intensive computing in the two upper quadrants of the graph.
FIG 2.12 : Data intensive research issues
• Data-intensive applications handle datasets on the scale of multiple terabytes and petabytes. Datasets are commonly persisted in several formats and distributed across different locations.
• Such applications process data in multistep analytical pipelines, including transformation and fusion stages (see the sketch after this list).
• The processing requirements scale almost linearly with the data size, and they can be easily processed in parallel.
• They also need efficient mechanisms for data management, filtering and fusion, and effective querying and distribution.
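A minimal sketch of such a pipeline: records are filtered, transformed in parallel, and fused into a single result. The dataset and the stages are illustrative only, not drawn from a specific system.

from multiprocessing import Pool

def transform(value: int) -> int:
    return value * value                             # transformation stage

def main() -> None:
    records = range(2_000_000)                       # stands in for one data partition
    filtered = [r for r in records if r % 2 == 0]    # filtering stage
    with Pool() as pool:                             # parallel processing scales with data size
        transformed = pool.map(transform, filtered, chunksize=10_000)
    print(sum(transformed))                          # fusion stage combining partial results

if __name__ == "__main__":
    main()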
Cloud Storage from LANs to WANs
What Is Cloud Storage?
What is cloud storage? Let’s start with a definition.
Definition: Cloud storage is the process of storing digital data in an online space that spans multiple servers and locations, and it is usually maintained by a hosting company.
Here’s a diagram representing the process:
Some big questions remain, even when you understand the process. If you are asking 'what is cloud storage?', you probably also want to know:
• How does cloud storage work?
• Where is your data?
• What are the benefits?
• What are the challenges?
FIG : Cloud Service Provider
How Cloud Storage Works
In the past, organizations relied on storing data on large hard drives or external storage devices, like thumb drives, compact discs, or - yes, we'll say it - floppy disks. Over time, organizations found ways to consolidate data onto a local on-premises storage device or devices. Today, individuals and enterprises alike use cloud storage to store data on a remote database, using the internet to connect the computer and the off-site storage system.
Where Is Your Data?
The cloud can feel abstract to some users. Many wonder not only what cloud storage is, but also where all your data goes after you click "save" or "send".
Let's use webmail (Gmail, Yahoo, Hotmail, etc.) as an example. You can log on and view your email on any computer or device - your laptop, a friend's computer, a library computer, your phone, a tablet while on vacation - it doesn't matter. You'll see all the same emails and folders.
That's because those emails aren't stored on the hard drive of your computer. They are stored on the email providers' servers.
Cloud Characteristics
Here’s a list of the top 10 major characteristics of Cloud Computing:
1. Resources Pooling
2. On-Demand Self-Service
3. Easy Maintenance
4. Scalability And Rapid Elasticity
5. Economical
6. Measured And Reporting Service
7. Security
8. Automation
9. Resiliency And Availability
10. Large Network Access
Distributed Data Storage
Distributed cloud, cloud computing, edge computing — the different forms of computing can get confusing and it can be hard to understand what the unique differences are.
So, if you've ever asked yourself what the distributed cloud is, what the difference between cloud computing and distributed computing is, what the benefits of the distributed cloud are, or even how the distributed cloud differs from the blockchain — you'll find the answers in this article.
1. What is distributed cloud storage?
Let's start from the basics — what precisely is distributed cloud storage? It's just as it sounds: cloud storage with a geographically distributed infrastructure. By spreading people's data across a network, it permits data to be located closer to the end-user and in turn speeds up transfers. The distributed cloud, in turn, is computation, storage and networking in micro-clouds located across the network.
2. Cloud computing VS distributed cloud
So, how are cloud computing and the distributed cloud different? In truth, they share the same idea, but they use different systems to achieve it. Cloud computing requires a data center with numerous servers to work across multiple tasks for users, such as storing, processing and managing data, whereas distributed computing allocates tasks across its network to individual computers.
3. Distributed cloud VS edge computing
Edge computing is an instance of a distributed cloud. Edge computing literally means computing which happens at the edge of a network, on or near the source of the data. Edge computing offloads data to the cloud during peaks in computing traffic to guarantee speed and reliability.
4. The Technical Challenges of a distributed cloud
As you might imagine, these numerous benefits do not come for free. To make a distributed cloud storage infrastructure work in the real world, one has to solve several technical issues that have held the industry back for years.
5. The benefits of a distributed cloud
Now that it's clear why distributed cloud storage is different, it's time to lay out why these differences matter. In short, they mean a cloud that is more private (every file is split, encrypted end-to-end and then spread across the network), more secure (by relying on multiple systems it is much less vulnerable) and greener — no central data center means that for each 4 TB you save on the network, you are reducing your carbon footprint by the equivalent of an extra fridge in your home!
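A minimal sketch of the "split, encrypt, and spread" idea mentioned above, using the third-party cryptography package; the node names are hypothetical, and real systems add erasure coding and key management on top of this.

from cryptography.fernet import Fernet

nodes = ["node-a", "node-b", "node-c"]
key = Fernet.generate_key()
cipher = Fernet(key)

data = b"example file contents spread across a distributed cloud"
shard_size = 16

# split the file into shards, encrypt each shard, and assign it to a node
placement = []
for i in range(0, len(data), shard_size):
    shard = cipher.encrypt(data[i:i + shard_size])
    placement.append((nodes[(i // shard_size) % len(nodes)], shard))

for node, shard in placement:
    print(node, len(shard), "bytes")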
Case Studies
1] Online Book Marketing Service
Social bookmarking is a method in which clients use bookmarks to organize the pages they wish to recall or share with their friends. These shared bookmarks are generally community-based and can be kept private, shared only with specific persons or groups, distributed only to internal trusted systems, or some other combination of public and private domains. Only authorized persons can view these shared bookmarks in sequence, by category or tags, or via a search engine.
Advantages of Social Bookmarking
• Users can benefit from Google-supported links for WWW sites.
• Useful links can be provided to the visitors of libraries through community bookmarking.
• It can drive heavy traffic to a web site.
• It presents good prospects for Internet-aided marketing.
• Social bookmarking sites accumulate millions of page views on a monthly basis. They attract visitors from all over the world, and therefore Internet marketers use this WWW traffic to attract targeted customers.
• It assists in the consolidation of bookmarks from numerous computers, organization of bookmarks, sharing of bookmarks with associates and so on.
• This scheme is capable of ranking a specific resource based on the number of times it has been bookmarked by users.
Microsoft Labs lately launched Thumbtack, a new bookmarking application. It offers an intriguing take on bookmarking and keeping online data, though it often falls short of delivering some of the basics of online bookmarking services.
According to Microsoft, Thumbtack evolved from customer feedback the company obtained after releasing Listas. Unlike Listas, Thumbtack does not focus on community bookmarking but rather on creating online research libraries. The initial Thumbtack site is rather well-conceived and permits users to drag and drop items into distinct collections, edit and tag bookmarks, and share the bookmarks by e-mail and through a public web page.
Qitera's features look similar to Thumbtack's; although Thumbtack has a more appealing user interface, the bookmarking and data retrieval through Qitera are far better than Microsoft's product. Thumbtack also lacks the community bookmarking facets that make Twine, Delicious or Qitera interesting. Not everyone, of course, is interested in sharing bookmarks, and for those users Thumbtack is certainly worth pursuing; for the rest, we would suggest Qitera, Delicious, Magnolia, or Google Notebook over Thumbtack.
2 ] Online Photo Editing Service
Cloud computing is a term that encompasses a wide range of services and application programs that share one common fact: they all run on the Internet and not on a user's PC. Cloud computing has been around for years: instant messaging and webmail are only two examples. New applications and services emerge almost every day in the form of new social networking sites, SaaS (Software as a Service), webmail and many others.
Cloud computing provides numerous valuable and helpful services and applications, and for many of these, it is a flawless venue. For instance, the cloud is perfect for sharing images, social networking, instant messaging, online storage of non-sensitive data, online photograph editing and many other applications and services which require no uploading of personal or sensitive data. Cloud computing presents a very good environment and infrastructure that boosts collaboration between parties and makes it convenient.
Online Photo Editors
Cloud computing and SaaS are both among the most hyped terms in the IT sphere right now, and numerous professionals accept as fact that they mark a trend that is about to change the way we use and access programs forever. In its simplest form, cloud computing is just a flexible computing application accessible as a service on a pay-per-usage basis, similar to electrical power from the power socket. SaaS is software and applications provided by the cloud rather than being installed locally on a PC. Usually designed to serve multiple users, it ensures that many users get access to the identical applications. This model cuts costs, eliminates the need for in-house maintenance and ensures that users can start using it much faster.
While the cloud and SaaS trend is growing, numerous user-friendly photo editors are accessible on a SaaS basis absolutely free. Generally, they are neither as unique nor as fast as Photoshop, but they are developing, and in most cases they have the usual features and much more. Another good thing about online photo editors is that they can be accessed from any location and any computer with an Internet connection. Let us take a close look at some of the most useful online photo editors accessible right now.
Photoshop Express Editor
It has been built on the conventions of Photoshop minus the technicalities. It is ideal for amateur photographers who don't wish to get involved in the complicated features of Photoshop. In spite of all this, it has its own limitations. Publishing options are absent. Also, it does not support photographs from high megapixel cameras.
Picnik: It has a range of special effects and a variety of fascinating fonts and shapes. It enables red-eye reduction and also edits the exposure, which is most challenging for photographers. It is very speedy and works well on distinct platforms such as Mac, Windows and Linux. One of the most compelling characteristics of Picnik is its support for photo sharing sites and social networking sites.
Splashup: A free online tool for editing photos. It is browser friendly and supports various photo sharing services such as Picasa, Flickr and Facebook. Splashup comprises numerous photo editing tools such as lasso, distort, brush fill, crop, etc. Multiple windows are permitted, which enhances its presentation compared to the other tools. Besides, it also offers Splashup Light, a free offline photo editor, which works flawlessly on our desktops as well as on wireless PCs.
FotoFlexer: It is one of the best choices for photo editing. It has all the rudimentary features and supplements them with numerous sophisticated tools. It offers upfront animations that most of the online tools don't offer. Another characteristic that makes FotoFlexer stand out is that it has 25 filters and it can make flat images.
Pixer.us: Pixer.us is a straightforward and direct tool for quick edits. It does not need signup. It has all the rudimentary tools such as crop, rotate, flip, and resize, along with hue correction. A positive feature is that it permits unlimited undo.
Thank You