Full Name
Microsoft Certified Azure Data Engineer
Email:
Ph No/WhatsApp: +91-
Location:
LinkedIn:
CAREER OBJECTIVE
Ambitious, self-assertive, and goal-oriented Azure Data Engineer seeking to hone my
competency and knowledge in an organization that provides structured and comprehensive
career growth, and to obtain the position of Microsoft Azure Data Specialist where my
professional experience contributes to the growth and development of the organization
while accomplishing company goals.
PROFESSIONAL EXPERIENCE SUMMARY
⮚ Self-motivated and result-oriented Microsoft Azure Data Engineer with a proven track
record in the design, implementation, maintenance, and management of Microsoft cloud
data solutions for enterprise application data.
⮚ Highly skilled in managing solutions, procedures, and service standards for projects
migrated to the Azure cloud.
⮚ Experienced in designing effective cloud data solutions for cloud deployment,
transitioning multiple projects, and ensuring seamless end-to-end delivery of services.
⮚ Adaptable individual adept at learning and implementing new technologies swiftly.
⮚ Drives excellence in every project to deliver outstanding results.
⮚ Highly skilled in developing secure, cost-efficient data solutions; successfully
designed, implemented, and managed projects through the entire development cycle,
driving down storage costs by 25%, increasing customer satisfaction by 20%, and
streamlining integration and profiling processes by 40%.
⮚ Proven ability to architect automated environments for optimal use of data assets and
resources, leveraging tools such as Azure Storage Accounts, Azure Data Lake Storage
Gen2, Azure Data Factory (ADF), Azure Storage Explorer, AzCopy, Azure Databricks, the
Azure portal, GitHub, ADF pipelines, ADF data flows, SQL Server, SSMS, Microsoft Data
Migration Assistant, and Python.
Technical Skills:
Cloud Data Storage Services: Standard and premium storage accounts, ADLS Gen2 storage
accounts, storage account replication, access tiers, soft delete for containers, blob
storage and file share storage services, custom encryption, and tags for storage
account services.
Cloud Storage Offerings: Analysis, design, and implementation of different cloud
storage offerings such as page blobs, append blobs, block blobs, queue storage, and
Azure Data Lake Storage Gen2 to store data in a variety of formats. Movement of storage
account content from one resource group to another resource group or region.
Storage Content Management: Restricting access to storage account content using Azure
Storage Explorer, implementing IAM for different users, and provisioning storage
accounts with private networking and storage encryption; a shared-access-signature
sketch follows below.
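
A common way to grant scoped, time-limited access to storage content is a shared access
signature (SAS). Below is a minimal, illustrative sketch using the azure-storage-blob
Python SDK; the account, key, container, and blob names are placeholders, not values
from any real project:

from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values -- substitute real storage account details.
ACCOUNT_NAME = "mystorageaccount"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER = "reports"
BLOB = "quality/summary.csv"

# Generate a read-only SAS token that expires after one hour.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The SAS is appended to the blob URL as a query string.
url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
print(url)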
Backup & Restore: Enabling Soft deletes for different retention periods as per business
needs for SA services. Defining the backups within RSV with customized policies for File
share storage offering. MS SQL DB’s…etc.
SQL DB’s: Designing and implementation of MS-SQL DB is with various purchasing
(DTU’s & Vcores) models. DB deployments with Single, Elastic pool & Manage Instances.
Retention policies of DB’s.
Azure Data Factory (ADF): Linked services, datasets, integration runtime and
self-hosted integration runtime, control-flow activities, execution of single and
multiple pipelines, triggers (schedule, tumbling window, and event), data movement
activities, data transformation activities, data control activities, conversion of data
formats at targets per business needs, and automation of ADF pipelines for business
data; a pipeline-run sketch follows below.
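
ADF pipelines are normally authored in the portal, but runs can also be started
programmatically. The sketch below uses the azure-mgmt-datafactory Python SDK; the
subscription, resource group, factory, pipeline, and parameter names are illustrative
placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers -- substitute your own.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-demo"
PIPELINE_NAME = "copy_csv_to_adls"

# DefaultAzureCredential resolves az-cli login, managed identity, etc.
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Start a pipeline run, passing runtime parameters to the pipeline.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"sourceFolder": "landing/2023-06"},
)

# Poll the run status until it reports Succeeded or Failed.
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(status.status)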
Azure Databricks:
Designing Apache Spark clusters in Azure Databricks with master and worker nodes, using
both single-node and multi-node cluster concepts.
Managing multiple Python notebooks for mounting source data and executing Python
scripts to fulfill business requirements.
Fetching and reading data from .csv sources in an Azure Databricks Apache Spark cluster
using the WASBS protocol.
Connecting to Azure Blob storage accounts from an Azure Databricks cluster to mount
directories.
Visualizing data in different charts and metrics in Azure Databricks to add value for
the business and senior management.
Developing MS SQL queries in Python notebooks to analyze, visualize, and derive
business statistics from source streams.
Importing Apache Spark libraries and writing code in the Databricks cluster in %scala
cells; a mount-and-read sketch follows below.
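
As referenced above, here is a minimal sketch of mounting a blob container over WASBS
and reading a .csv into a Spark DataFrame in a Databricks notebook; spark and dbutils
are supplied by the Databricks runtime, while the account, container, secret scope, and
column names are assumptions for illustration:

# Runs inside a Databricks notebook, where `spark` and `dbutils` already exist.
ACCOUNT = "mystorageaccount"  # placeholder storage account
CONTAINER = "raw-data"        # placeholder container
MOUNT_POINT = "/mnt/raw-data"

# Mount the blob container via the WASBS driver; the account key is read
# from a secret scope rather than hard-coded in the notebook.
dbutils.fs.mount(
    source=f"wasbs://{CONTAINER}@{ACCOUNT}.blob.core.windows.net",
    mount_point=MOUNT_POINT,
    extra_configs={
        f"fs.azure.account.key.{ACCOUNT}.blob.core.windows.net":
            dbutils.secrets.get(scope="storage", key="account-key")
    },
)

# Read a CSV from the mounted path into a Spark DataFrame.
df = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(f"{MOUNT_POINT}/quality/functional_tests.csv")
)

# Register a temp view so SQL-style queries can run over the data.
df.createOrReplaceTempView("functional_tests")
spark.sql("SELECT site, COUNT(*) AS tests FROM functional_tests GROUP BY site").show()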
Blob data & Database Migrations:
Migration of data from Private cloud to Public cloud(Forward migration), Public cloud to
Public cloud(Platform Migration) and Public cloud to Private cloud(Reverse Migration)
Performing the Fail Over (RPO & RTO) for data replication of Azure Blob Storage Services.
Management and maintenance of MS-SQL Database backups, on demand backups,
schedule backups, restoring of DB’s on demands, Importing DB into Azure Blb SA.
Performing Full migration or Partial migration of SQL DB’s using Azure Database Migration
Assistant.
Importing/Migrating Full DB directly from On-prem(local server) to Cloud DB Server using
Azure console or Azure Resource Manager.
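
For blob-level moves between accounts (for example, during a platform migration), one
option is a server-side copy with the azure-storage-blob SDK. A minimal sketch, where
the URLs, SAS token, and container/blob names are placeholders:

from azure.storage.blob import BlobClient

# Placeholder endpoints -- the source must be readable (public or SAS-signed).
SOURCE_URL = (
    "https://sourceaccount.blob.core.windows.net/data/export.parquet"
    "?<source-sas-token>"
)
DEST_CONN_STR = "<destination-connection-string>"

# Client for the destination blob; the copy runs server-side, so the
# data never flows through the machine executing this script.
dest = BlobClient.from_connection_string(
    DEST_CONN_STR, container_name="migrated", blob_name="export.parquet"
)
dest.start_copy_from_url(SOURCE_URL)

# Poll until the asynchronous copy completes.
props = dest.get_blob_properties()
print(props.copy.status)  # "pending" -> "success"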
Achievements:
⮚ Holding multiple certifications in Azure cloud computing (DP-900, DP-203, etc.)
⮚ Also mention other achievements from your college and school activities.
⮚ If you have done something notable in sports or arts, mention it here.
WORK EXPERIENCE:
❖ Currently working at Cognizant Technology Solutions, Feb 2022 to present.
EDUCATION/QUALIFICATIONS:
DEGREE | YEAR | PERCENTAGE | COLLEGE | BOARD/UNIVERSITY
PROJECTS
Project #1:
Project : IOWA
Domain : Retail
Technologies : Storage services, replications, ADF, file share servers, Apache Spark
Team Size : 8
Work Duration : March 2022 to June 2023
Payroll Company : NA
Description:
In the IOWA project, we collect manufacturing quality data (e.g., functional test,
measurement, and repair data) from different source systems across Ericsson hardware
manufacturing sites and make that quality information available together at a global
level. The objective of this project is to provide continuous support to the customer
in displaying the qualitative data in various reports by solving and handling issues
faced in the application, along with the continuous monitoring, administrative, and
operations activities needed to ensure the agreed service level.
Responsibilities:
⮚ Design and implementation of different blob types (page, append, and block blobs) to
store and manage data in a cloud computing platform.
⮚ Utilization of Azure Storage Explorer for storage content management across blob,
file share, and ADLS Gen2 storage services, using different approaches such as storage
connection strings, storage keys, and shared access signatures.
⮚ Managing soft delete for blob and file share storage services for different
enterprise applications.
⮚ Copying data from a variety of sources (such as .csv, .zip, and .xls files) to
destination storage accounts (standard, premium, and ADLS Gen2) using Azure Data
Factory (ADF).
⮚ Provisioning of SQL Database as a service with the different purchasing models
(vCores and DTUs), and maintaining SQL database retention periods.
⮚ Copying data from multiple data files of different formats into ADLS Gen2 storage
accounts with ADF pipelines.
⮚ Designing and implementing different activities/controls (Get Metadata, Validation,
If Condition, nested activities, Copy Data, Filter, ForEach, Set Variable, Lookup,
etc.) in pipelines and data flows in Azure Data Factory.
⮚ Copying data from GitHub to ADLS Gen2 via ADF; implementation of dynamic and
parameterized pipelines for various ADF controls/activities.
⮚ Copying data from one source type and loading it into a destination of another type
using ADF pipelines.
⮚ Deployment of standalone databases and elastic pools as per business requirements.
⮚ Designing data flows that fetch data from a SQL database source alongside an inline
dataset transformation (see the JDBC read sketch after this list).
⮚ Designing Apache Spark clusters in Azure Databricks with master and worker nodes,
using both single-node and multi-node cluster concepts.
⮚ Managing multiple Python notebooks for mounting source data and executing Python
scripts to fulfill business requirements.
⮚ Fetching and reading data from .csv sources in an Azure Databricks Apache Spark
cluster using the WASBS protocol.
⮚ Connecting to Azure Blob storage accounts from the Azure Databricks cluster to mount
directories.
⮚ Visualizing data in different charts and metrics in Azure Databricks to add value for
the business and senior management.
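
As noted in the data-flow bullet above, one way to pull from an Azure SQL source inside
a notebook is Spark's JDBC reader. A minimal sketch; the server, database, table,
login, secret scope, and column names are illustrative assumptions:

# Runs in a Databricks notebook where `spark` and `dbutils` are available.
JDBC_URL = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=retail_dw;encrypt=true"
)

# Read a SQL table into a Spark DataFrame over JDBC; the password is
# fetched from a secret scope rather than hard-coded.
orders = (
    spark.read.format("jdbc")
    .option("url", JDBC_URL)
    .option("dbtable", "dbo.Orders")  # placeholder table
    .option("user", "etl_user")       # placeholder login
    .option("password", dbutils.secrets.get(scope="sql", key="pwd"))
    .load()
)

# Apply a simple transformation before landing the result in ADLS Gen2.
daily = orders.groupBy("order_date").count()
daily.write.mode("overwrite").parquet("/mnt/adls/curated/daily_orders")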
Project #2:
Project : AWAMO
Domain : Insurance and Finance
Technologies : Blob and file share storage services, ADF, Apache Spark, GitHub
Team Size : 10
Work Duration : NA
Payroll Company : NA
Description:
The AWAMO project is a business intelligence system that enables end users to analyze
data in MOLAP cubes and reports. In this project, we pull data from multiple sources
and load it into data warehouse databases using Azure cloud services; on top of this,
MOLAP cubes are created through which end users and senior management view the data.
The project follows the Agile methodology.
Responsibilities:
● Provisioning standard, premium, and Azure Data Lake Storage Gen2 storage accounts to
fetch and store data from a variety of sources and destinations, in different formats,
sizes, and shapes.
● Design and implementation of different blob types (page, append, and block blobs) to
store and manage data in a cloud computing platform.
● Copying data from a variety of sources (such as .csv, .zip, and .xls files) to
destination storage accounts (standard, premium, and ADLS Gen2) using Azure Data
Factory (ADF).
● Designing and implementing different activities/controls (Get Metadata, Validation,
If Condition, nested activities, Copy Data, Filter, ForEach, Set Variable, Lookup,
etc.) in pipelines and data flows in Azure Data Factory.
● Performing failover (RPO & RTO) for data replication of Azure Blob storage services.
● Copying data from GitHub to ADLS Gen2 via ADF; implementation of dynamic and
parameterized pipelines for various ADF controls/activities.
● Copying data from multiple data files of different formats into ADLS Gen2 storage
accounts with ADF pipelines.
● Copying data from one source type and loading it into a destination of another type
using ADF pipelines.
● Developing MS SQL queries in Python notebooks to analyze, visualize, and derive
business statistics from source streams.
● Migration of data from private cloud to public cloud (forward migration), public
cloud to public cloud (platform migration), and public cloud to private cloud (reverse
migration).
● Management and maintenance of MS SQL database backups: on-demand backups, scheduled
backups, on-demand restores, and importing databases into Azure Blob storage accounts.
● Implementation of storage-based, event-based, schedule-based, and tumbling-window
triggers for different sources and destinations using ADF.
● Copying data from Azure SQL Database to ADLS Gen2 storage accounts, and loading data
from Blob storage accounts into single or multiple SQL Database tables from different
source files.
● Implementation of Azure Key Vault and integration with Azure Data Factory, storing
secrets such as SQL database connection strings in Key Vault (see the Key Vault sketch
after this list).
● Managing multiple Python notebooks for mounting source data and executing Python
scripts to fulfill business requirements.
● Fetching and reading data from .csv sources in an Azure Databricks Apache Spark
cluster using the WASBS protocol.
● Connecting to Azure Blob storage accounts from the Azure Databricks cluster to mount
directories.
● Visualizing data in different charts and metrics in Azure Databricks to add value for
the business and senior management.
● Importing Apache Spark libraries and writing code in the Databricks cluster in %scala
cells as per business needs.
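
As referenced in the Key Vault bullet above, a minimal sketch of fetching a stored SQL
connection string from Azure Key Vault with the azure-keyvault-secrets SDK; the vault
URL and secret name are placeholders:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault -- substitute your own Key Vault URL.
VAULT_URL = "https://kv-awamo-demo.vault.azure.net"

# Authenticates via az-cli login, managed identity, etc.
client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

# Fetch the stored SQL connection string by its secret name.
secret = client.get_secret("sql-connection-string")
conn_str = secret.value

# In ADF the same secret is referenced through a Key Vault linked service,
# so the connection string never appears in pipeline definitions.
print("Retrieved secret:", secret.name)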
PERSONAL DETAILS:
FATHER'S NAME :
DATE OF BIRTH :
AGE :
NATIONALITY :
MARITAL STATUS :
LANGUAGES :
PASSPORT :
DECLARATION
I hereby declare that the above particulars are true to the best of my knowledge and
belief.
Place:
Date: ( )