Snowflake Resume

Venata Reddy has over 6 years of experience as a Snowflake Developer. She has extensive experience designing and developing scalable data solutions using Snowflake and AWS. Her skills include ETL pipeline development, data warehousing, loading data from various sources into Snowflake, and writing complex SnowSQL scripts for analytics and reporting. Her goal is to advance to a lead developer position to create innovative solutions for business challenges.


VENATA REDDY M

+91- 7989890832
[email protected]
Career Objective
I am seeking a challenging role as a Snowflake Developer in a fast-paced company, where I can
apply my technical expertise and experience to create novel solutions that drive business growth.
My professional goal is to advance to the position of lead Snowflake Developer and help build
ground-breaking solutions to challenging business problems. I am committed to lifelong learning
and to staying current with the latest technology and industry developments, so that I can offer my
clients the best available solutions.

PROFESSIONAL SUMMARY
Snowflake Developer with 6+ years of overall experience, including 4+ years designing and
developing scalable data solutions. Strong experience in building ETL pipelines and data
warehousing. Determined and enthusiastic, with good planning skills; seeking a role that makes
the best use of my skills and allows me to develop them further.

• Proficient in designing and developing processes to Extract, Transform and Load data into
the data warehouse using the Snowflake cloud database and AWS S3.
• Expertise in building and migrating data warehouses on the Snowflake cloud database.
• Played a key role in migrating Oracle objects into the Snowflake environment using
AWS services.
• Expertise in working with Snowflake Snowpipes, internal/external stages, Clone, Tasks and
Streams.
• Involved in Zero-Copy Cloning – cloning databases for Dev and QA environments.
• Storage considerations for staging and permanent databases/tables.
• Experience in Agile methodology.
• Created internal and external stages and transformed data during load.
• Created integration objects, file formats and stages, and used COPY INTO and Snowpipe to
ingest CSV/TXT/JSON data continuously from an AWS S3 bucket.
• Experienced in handling structured and semi-structured data loads and unloads into the
Snowflake DW.
• Experienced in the Continuous Data Protection lifecycle: Time Travel and Fail-safe.
• Experienced in data migration from traditional on-premises systems to the cloud.
• Queried historical results/data based on Timestamp, Offset and Query ID.
• Worked with Streams, Secure Views and Materialized Views.
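A continuous-ingestion setup of the kind described above might be sketched in SnowSQL as
follows (the file format, stage, integration and table names are illustrative placeholders, not
objects from any actual project):

```sql
-- Illustrative sketch only: object names, S3 path and integration are placeholders.
CREATE OR REPLACE FILE FORMAT csv_ff
  TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE sales_ext_stage
  URL = 's3://example-bucket/sales/'
  STORAGE_INTEGRATION = s3_int          -- assumes a pre-created storage integration
  FILE_FORMAT = csv_ff;

-- One-off bulk load from the external stage
COPY INTO sales_raw FROM @sales_ext_stage;

-- Continuous ingestion: Snowpipe auto-loads new files as they land in S3
CREATE OR REPLACE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO sales_raw FROM @sales_ext_stage;
```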

EDUCATION
• B.Tech (ECE) from JNTU Ananthapur

TECHNICAL SKILLS
Cloud Technologies : Snowflake, AWS.
Programming Languages : PL/SQL, SnowSQL
Data Warehousing : Snowflake
DBMS : Oracle 12c/11g/10g
Operating System : Windows XP

WORK EXPERIENCE
• Working at Wipro Technologies from June 2023 till date
• Worked at Publicis Sapient from Jan 2022 to April 2023
• Worked at Appzwork Software Solutions from Feb 2018 to Jan 2022
• Worked at Wipro Technologies from Jan 2017 to Jan 2018

Project Details:

Project: EDW
Role: Snowflake Developer

Responsibilities:

• Bulk loaded data from the external stage (AWS S3) and the internal stage into Snowflake
using the COPY command.
• Loaded data into Snowflake tables from the internal stage using SnowSQL.
• Played a key role in migrating Oracle database objects into the Snowflake environment using
AWS services.
• Used COPY, LIST, PUT and GET commands for validating internal stage files.
• Imported and exported data between the internal stage (Snowflake) and the external stage
(AWS S3).
• Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis
and reporting.
• Used Snowpipe for continuous data ingestion from the S3 bucket.
• Created clone objects to maintain zero-copy cloning.
• Performed data validations through the INFORMATION_SCHEMA.
• Performed data quality issue analysis using SnowSQL by building analytical warehouses on
Snowflake.
• Experience with AWS cloud services: S3, IAM, Roles, SQS.
• Cloned production data for code modifications and testing.
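Zero-copy cloning and Time Travel of the kind used here can be sketched as follows (the
database, table and query-ID values are illustrative placeholders):

```sql
-- Illustrative sketch only: database/table names and the query ID are placeholders.
-- Zero-copy clone of production for a QA environment (metadata-only, no data copied)
CREATE DATABASE qa_db CLONE prod_db;

-- Time Travel: query historical data by offset, timestamp or query ID
SELECT * FROM orders AT (OFFSET => -60*5);                               -- 5 minutes ago
SELECT * FROM orders AT (TIMESTAMP => '2024-01-01 00:00:00'::TIMESTAMP_TZ);
SELECT * FROM orders BEFORE (STATEMENT => '<query-id>');                 -- state before a given query
```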

Project#2:

Project: Suncorp
Role: Snowflake Developer

Responsibilities:

• Created various internal/external stages to load data from sources such as S3 and
on-premises systems.
• Responsible for all activities related to the development, implementation and support of ETL
processes for a large-scale Snowflake cloud data warehouse.
• Bulk loaded data from the external stage (AWS S3) to the internal stage (Snowflake) using
the COPY command.
• Loaded data into Snowflake tables from the internal stage and from the local machine.
• Imported and exported data between the internal stage (Snowflake) and the external stage
(S3 bucket).
• Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis
and reporting.
• Created Tasks to automate data loading/unloading to and from the Snowflake data warehouse
and AWS S3.
• Wrote complex SQL scripts in the Snowflake data warehouse to support business reporting.
• Created automated scripts to load data from Snowflake to the ThoughtSpot server.
• Performed troubleshooting, analysis and resolution of critical issues.
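Task-based automation of loads/unloads like those above can be sketched as follows (the task,
warehouse, stage and table names, and the schedule, are illustrative placeholders):

```sql
-- Illustrative sketch only: names and schedule are placeholders.
CREATE OR REPLACE TASK unload_daily
  WAREHOUSE = etl_wh
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'   -- every day at 02:00 UTC
AS
  COPY INTO @sales_ext_stage/daily/        -- unload to the external (S3) stage
  FROM sales_summary
  OVERWRITE = TRUE;

ALTER TASK unload_daily RESUME;            -- tasks are created in a suspended state
```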
Project#3:
Project: Health Plan Services

Responsibilities:

• Created database objects like tables, simple views and sequences as per business
requirements.
• Hands-on experience with DDL, DML, TCL and DRL statements.
• Worked with aggregate functions and analytical functions.
• Used Global Temporary Tables (GTT) and external tables (to load data into the stage table).
• Created backup tables regularly using CTAS.
• Involved in all phases of the Software Development Life Cycle.
• Analyzed existing code and performed impact analysis with the help of seniors.
• Handled NULLs using NVL, NVL2 and COALESCE.
• Created indexes on tables to improve performance by eliminating full table scans.
• Worked on complex queries using CASE statements and string and date functions.
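CTAS backups and NULL handling of the kind listed above can be sketched in Oracle SQL (the
table and column names are illustrative placeholders):

```sql
-- Illustrative sketch only: table/column names are placeholders.
-- Backup a table before modification using CTAS
CREATE TABLE employees_bkp AS SELECT * FROM employees;

-- NULL handling: NVL substitutes a default when the first argument is NULL;
-- NVL2 chooses between two values; COALESCE returns the first non-NULL argument.
SELECT NVL(commission_pct, 0)                   AS commission,
       NVL2(manager_id, 'HAS_MGR', 'TOP')       AS mgr_flag,
       COALESCE(mobile_no, office_no, 'N/A')    AS contact
FROM   employees;
```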
