Sai Gowtham
Senior QA Engineer
Email: [email protected]
Mobile: +91-7893345464
PROFESSIONAL SUMMARY:
5+ years of experience in software testing, including ETL/DWH testing,
Tableau dashboard testing, Hadoop data testing, and AWS testing.
Experienced in ETL testing and black-box testing (system, integration, and
regression testing).
Used the Testing Ape framework for regression testing, changing the
required parameters to execute stored procedures. Updated the framework
metadata to start execution for multiple batches and validated the
Testing Ape results once execution completed.
Experienced in migration projects, migrating data from an on-premises DWH to the AWS cloud.
Executed ETL workflows for initial and incremental loads using Informatica utilities.
Experienced in creating tasks/sub-tasks under backlog items in Jira.
Good working experience in writing complex SQL queries and testing facts and
dimensions based on mapping logic.
Hands-on experience in Agile testing: preparing test cases and test scripts and
executing them.
Hands-on testing experience with the Informatica PowerCenter 9.1 frontend tool.
Highly motivated professional with integrity and commitment to quality and
perfection.
Knowledge of test management tools such as HP ALM/QC.
Used Unix commands for data validation between files and tables.
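The file-to-table validation mentioned above can be sketched as follows. This is a minimal illustration, not the actual project setup: the table name, file name, and sample data are hypothetical, and SQLite stands in for the real databases.

```python
import csv
import os
import sqlite3
import tempfile

# Hypothetical sample data standing in for a real source extract.
rows = [("1", "Alice"), ("2", "Bob"), ("3", "Carol")]

# Write the "source" extract file (illustrative name).
path = os.path.join(tempfile.mkdtemp(), "customer_extract.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Load the "target" table (SQLite stands in for Oracle/Redshift here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_dim (id TEXT, name TEXT)")
conn.executemany("INSERT INTO customer_dim VALUES (?, ?)", rows)

# Count validation: file record count must match the table row count,
# the same check `wc -l` plus a SELECT COUNT(*) would perform on Unix.
with open(path) as f:
    file_count = sum(1 for _ in f)
table_count = conn.execute("SELECT COUNT(*) FROM customer_dim").fetchone()[0]
assert file_count == table_count, f"mismatch: {file_count} vs {table_count}"
```

In practice the same comparison is done with `wc -l` on the Unix side and a `COUNT(*)` query on the database side.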
EXPERIENCE SUMMARY:
Working as a Senior QA Engineer at Carelon Global Solutions LLP,
Bangalore, from May 2022 to date.
Worked as a QA Engineer, Bangalore, from September 2021 to April 2022.
EDUCATION:
Bachelor of Computers (M.S.C discipline) from Sri Krishnadevaraya
University, Andhra Pradesh, in 2016.
TECHNICAL SKILLS:
Testing : ETL/DWH testing, manual testing & DB testing
Operating Systems : Windows, UNIX
Languages : SQL, HQL
Databases : Oracle, MS SQL Server, AWS Redshift
ETL Tools : Informatica 9.1
Test Management Tool : HP Quality Centre, Jira
PROJECT DETAILS:
Project # 1
Project : Milliman/Integra
Client : Elevance Health, USA
Environments : AWS S3, Hive, Snowflake, Oracle, Redshift
Role : Senior QA Engineer
Description:
Milliman/Integra is serving as a vendor for Elevance, supporting the design and
implementation of Elevance’s Episode-Based Payment (EBP) programs across lines of
business. Our support includes the following broad areas of assistance: (a) defining
episodes and developing and running a customized episode grouper; (b) developing and
maintaining financial methodologies to support episode implementation; (c) performing
financial reconciliation; and (d) developing, maintaining, and updating a provider-facing
reporting solution.
Starting in April 2022, Elevance and Milliman agreed to begin monthly data
transmissions, transitioning to processing incremental file extracts for the
EBP program.
The DX EBP Milliman project creates Episode Bundle Payment data from MBR, CLM,
PROVIDER, and other sources. We process both outbound and inbound data.
For outbound files, we read input data from Snowflake tables, write the
transformation queries in Snowflake SQL, execute them using EMR/Glue services,
and load the target tables while generating outbound files in S3.
These outbound files are sent to the Milliman and Integra vendors, who prepare
episode bundles from this data and make payments to providers.
We created automated frameworks using EMR and Glue that handle loading the
Snowflake target tables and generating the outbound S3 files.
Inbound data is received based on the outbound files we sent; we process that
data and load it into Snowflake tables for further analysis by the business.
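A completeness check on the monthly outbound extracts described above can be sketched as follows. This is an assumption-laden illustration: the file-name pattern `ebp_outbound_<YYYY><MM>.txt`, the prefix, and the sample keys are all hypothetical, and the key list stands in for a real S3 listing.

```python
import re

def missing_monthly_extracts(s3_keys, prefix, year, months):
    """Return the months for which no extract file is found under the prefix.

    Assumes a hypothetical naming convention of
    <prefix>/ebp_outbound_<YYYY><MM>.txt; the real convention may differ.
    """
    missing = []
    for m in months:
        pattern = re.compile(
            rf"{re.escape(prefix)}/ebp_outbound_{year}{m:02d}\.txt$"
        )
        if not any(pattern.search(key) for key in s3_keys):
            missing.append(m)
    return missing

# Illustrative keys, as a real S3 listing might return them.
keys = [
    "outbound/ebp/ebp_outbound_202204.txt",
    "outbound/ebp/ebp_outbound_202205.txt",
]
print(missing_monthly_extracts(keys, "outbound/ebp", 2022, [4, 5, 6]))  # [6]
```

In the actual validation the key list would come from an S3 listing call after the EMR/Glue jobs finish, with the real naming convention substituted in.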
Responsibilities:
Involved in the functional study of the application by referencing the BTRD,
TDD, and mapping documents.
Interacted with clients and BAs to gather requirement details.
Verified that files were loaded into the expected S3 bucket locations after
ETL job execution completed.
Prepared and executed test cases for all requirements.
Maintained test cases and test artifacts in a shared repository, organized by
iteration, for future reference.
Validated ETL test strategies to ensure data loaded into target DWH tables as
per the transformation logic.
Validated ETL runs for initial and incremental loads.
Updated status in Jira with appropriate comments and documents, tagging team
members to communicate current task status.
Validated Tableau dashboards, from top-level reports down to detailed local
dashboards, across different time frequencies.
Joined daily scrum calls to provide current testing status.
Joined retrospective calls at the end of each sprint.
Project # 2
Project : Smart Metering BSS (Supply chain Management)
Client : Iqvia
Environments : AWS S3, Hive, Snowflake, Oracle, Redshift
Role : Senior QA Engineer
Description:
The primary goal of this project is to manage electricity and gas
consumption in households in Southern Britain. This is achieved by
measuring consumption with smart meters installed in the homes.
The whole process is divided into below three operational segments:
Integration
ETL
BI Reporting
Data comes into the system through the spooling and Quartz mechanisms.
The processed data is stored on the data warehouse server from where the ETL
picks up the respective file. ETL validates the data integrity and mapping between
the different tables. Lastly, with the help of data stored in the data mart,
reporting is done.
ETL – The output of integration works as the input for ETL. From a testing
point of view, the following items are tested: schema validation of the newly
added tables, count validation of the records inserted in the tables after
running the workflows, and finally validation of the data inserted in the
tables. Validation is done across layers such as staging, ODS, and the data
mart.
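The count and data validation between layers can be sketched with a set-difference query of the kind used in ETL testing. This is a minimal illustration under assumed names: the tables, columns, and sample rows are hypothetical, and SQLite's `EXCEPT` stands in for Oracle's `MINUS`.

```python
import sqlite3

# Illustrative staging and data-mart tables with one deliberate mismatch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_meter_read (meter_id INT, read_dt TEXT, kwh REAL);
CREATE TABLE dm_meter_read  (meter_id INT, read_dt TEXT, kwh REAL);
INSERT INTO stg_meter_read VALUES (1,'2022-01-01',10.5),(2,'2022-01-01',7.2);
INSERT INTO dm_meter_read  VALUES (1,'2022-01-01',10.5);
""")

# Count validation between layers.
stg = conn.execute("SELECT COUNT(*) FROM stg_meter_read").fetchone()[0]
dm = conn.execute("SELECT COUNT(*) FROM dm_meter_read").fetchone()[0]

# Data validation: rows present in staging but missing downstream.
diff = conn.execute("""
    SELECT meter_id, read_dt, kwh FROM stg_meter_read
    EXCEPT
    SELECT meter_id, read_dt, kwh FROM dm_meter_read
""").fetchall()
print(stg, dm, diff)  # 2 1 [(2, '2022-01-01', 7.2)]
```

Running the same difference in both directions flags both missing and extra rows between any two layers.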
Reporting – There are several techniques for consuming data from the
warehouse or, in this case, the mart, such as data visualization, data
mining, and reporting. Here, reporting is used to give a pictorial
representation of the data, which is finally used by the FAT team to analyze
the system.
Was part of end-to-end testing, covering integration, ETL, and OBIEE
reporting.
Responsibilities:
Involved in the functional study of the application by referencing the BTRD,
TDD, and mapping documents.
Interacted with clients and BAs to gather requirement details.
Verified that files were loaded into the Landing Zone, Stage, and Core DWH
after ETL job execution completed.
Prepared and executed test cases for all requirements.
Maintained test cases and test artifacts in a shared repository, organized by
iteration, for future reference.
Validated ETL test strategies to ensure data loaded into target DWH tables as
per the transformation logic.
Validated ETL runs for initial and incremental loads.
Updated status in Jira with appropriate comments and documents, tagging team
members to communicate current task status.
Joined daily scrum calls to provide current testing status.
Joined retrospective calls at the end of each sprint.