
Ravi Kumar Kodimyala

Name: Ravi Kumar K
Phone: +91 9032773880
Email: [email protected]

Professional Summary: -
 6.5+ years of experience in Data Warehousing and Analytics with the BI tools Oracle Data Integrator (ODI) and Oracle Business Intelligence (OBIEE).
 Experience in cloud integration tools such as Apache NiFi and Microsoft Azure Databricks.
 Hands-on experience in all development and production-support activities involved in implementing large-scale data warehouses using ODI and OBIEE.
 Worked extensively on Azure Databricks and NiFi to integrate data and load it into the Anaplan application, sourcing from systems such as ERP, flat files, DRM and SAP.
 Trained and certified in Oracle Analytics Cloud (OAC).
 Worked on complex NiFi flows to integrate data between multiple servers.
 Created Azure Databricks notebooks using PySpark to extract, load and transform data (see the sketch after this list).
 Worked extensively on Oracle Data Integrator (ODI) Topology Manager, Designer and Operator, and on troubleshooting packages.
 Worked with different sources such as Oracle, Hive, SAP HANA and flat files.
 Good experience working directly with business users.
 Experience in data warehouse design and implementation: gathering business requirements, system analysis, design and data modelling.
 Good experience with ODI components: creating master and work repositories, procedures, packages, scenarios and load plans.
 Expertise in scheduling loads in ODI and debugging load issues.
 Experienced in developing SCDs using built-in Knowledge Modules.
 Specialized in design and development of OBIEE repositories, dashboards, reports, security and migration.
 Experience in creating hierarchies with drill-down capabilities using global and local filters, and in creating static, dynamic and session variables.
 Experience in design and development of the three layers (Physical, Business Model and Presentation) of an OBIEE metadata repository (.rpd) using the Oracle BI Administration Tool.
 Involved in unit testing, peer review and the migration process across different environments.
 Complete knowledge of the project life cycle.
 Good basic knowledge of UNIX and HDFS commands.
 Knowledge of Control-M, Oracle DRM and EPM.
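
A minimal sketch of the Databricks notebook pattern mentioned above; every table and column name here is an illustrative assumption, not taken from the actual projects:

    # Minimal Extract-Load-Transform cell in PySpark; all names are
    # illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    # In a Databricks notebook `spark` is predefined; it is created here so
    # the sketch is self-contained.
    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read the raw source table.
    orders = spark.table("staging.orders_raw")

    # Transform: fix types, derive a net amount, drop cancelled rows.
    clean = (
        orders.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
              .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
              .filter(F.col("order_status") != "CANCELLED")
    )

    # Load: overwrite the curated target table.
    clean.write.mode("overwrite").saveAsTable("curated.orders")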

General Technical Skills: -

Cloud Integration Tools : Microsoft Azure Databricks, Apache NiFi
BI Tools                : OBIEE 11.1.x, BICS, OAC
Integration Tools       : ODI 11g/12c
Database                : Oracle
Database Tools          : Toad, SQL Developer
Big Data Tools          : HDFS, Hive
Client Tools            : WinSCP, PuTTY
Scheduling Tools        : Control-M
Others                  : PL/SQL, Oracle DRM, Oracle EPM

Employer Details: -

 Working at Infosys Ltd. as Consultant, Aug 2022 to date.
 Worked at Infosys Ltd. as Software Associate Consultant, Mar 2021 to Aug 2022.
 Worked as Software Consultant at Deloitte Consulting (USI) Pvt Ltd., Aug 2019 to Mar 2021.
 Worked as ODI/OBIEE Developer at Forsys Software India Pvt. Ltd., Hyderabad, Oct 2016 to Aug 2019.

Project Details: -

PROJECT #6: Starbucks (SBFPTHZ5)

Company: - Infosys Ltd, Hyderabad
Role: - Consultant (Azure & ODI)
Client: - Starbucks
Duration: - Aug 2022 – Till date

Environment: Microsoft Azure Databricks, ODI 12c, Oracle DRM, Oracle EPM, Apache NiFi, WinSCP, PuTTY, SQL Developer

Client and Project Description:

Starbucks Corporation is an American multinational chain of coffeehouses and roastery reserves headquartered in Seattle, Washington. It is the world's largest coffeehouse chain. As of November 2021, the company had 33,833 stores in 80 countries, 15,444 of which were located in the United States. Of Starbucks' U.S.-based stores, over 8,900 are company-operated, while the remainder are licensed.

Earlier, multiple source systems at Starbucks loaded data into multiple repositories using Oracle Data Integrator. As part of this project, the ODI repository objects are converted into Azure Databricks notebooks; data is then pushed into the Anaplan application using NiFi, and users perform their reporting on the Anaplan data.

Roles and Responsibilities:

 Involved in design discussions with the client on end-to-end project implementation.
 Working on converting ODI and Hyperion integrations into Azure integrations using Databricks and NiFi.
 Worked on design documentation based on client requirements.
 Worked extensively with the onsite team to gather requirements and to get clarifications whenever inputs were needed on tasks.
 After gathering requirements, documented the details in a Confluence page for development reference.
 Involved in creating complex workflows in NiFi and Azure Databricks.
 Created process groups and controller services in NiFi to connect to DB servers.
 Used multiple NiFi processors such as UpdateAttribute, GenerateFlowFile, ListSFTP and ExecuteSQLRecord to meet business requirements.
 Integrated NiFi and Azure Databricks to load source data into the Anaplan model.
 Developed Azure notebooks to load data as per user/business requirements.
 Created NiFi flows to load data into the Anaplan application based on views created in Azure Databricks (see the sketch after this list).
 Migrated Azure notebooks and NiFi objects to higher environments.
 Involved in NiFi flow design to capture data from ODI and send it in file format to another server.
 Created integration specs and technical specs to bridge the gap between the business and development teams.
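
A rough sketch of the Databricks half of that Databricks-to-Anaplan handoff; all table and column names are assumptions for illustration. The notebook builds the curated table, and the NiFi flow (for example an ExecuteSQLRecord processor) selects from it before pushing the records to Anaplan:

    # Hypothetical notebook cell: build the curated table the NiFi flow reads.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("anaplan-feed").getOrCreate()

    erp = spark.table("staging.erp_gl_balances")   # ERP extract
    fx = spark.table("reference.fx_rates")         # currency conversion rates

    feed = (
        erp.join(fx, on="currency_code", how="left")
           .withColumn("amount_usd", F.col("amount") * F.col("usd_rate"))
           .select("cost_center", "account", "period", "amount_usd")
    )

    # NiFi's ExecuteSQLRecord processor can then SELECT from this table and
    # stream the records into the Anaplan import.
    feed.write.mode("overwrite").saveAsTable("curated.anaplan_gl_feed")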

PROJECT #5: Starbucks (SBXODIHZ)

Company: - Infosys Ltd, Hyderabad
Role: - ODI Developer
Client: - Starbucks
Duration: - Mar 2021 – Aug 2022

Environment: ODI 12c, Oracle DRM, Oracle EPM, Apache NiFi, WinSCP, PuTTY, SQL Developer, Control-M

Client and Project Description:

Starbucks Corporation is an American multinational chain of coffeehouses and roastery reserves headquartered in Seattle, Washington. It is the world's largest coffeehouse chain. As of November 2021, the company had 33,833 stores in 80 countries, 15,444 of which were located in the United States. Of Starbucks' U.S.-based stores, over 8,900 are company-operated, while the remainder are licensed.

Earlier, multiple source systems at Starbucks loaded data into multiple repositories using Oracle Data Integrator. As part of this project, all repository objects were consolidated into a single repository (main repo), along with an upgrade of the application version.

Roles and Responsibilities:

 Involved in a Hyperion integration upgrade using the ODI integration tool.
 Involved in design discussions with the client on end-to-end project implementation.
 Worked on design documentation based on client requirements.
 Created ODI packages calling the EPM data load rules to load data into the target model as per business requirements.
 Fetched data using ODI procedures by calling calc scripts and MaxL scripts (see the sketch after this list).
 Good knowledge of creating and debugging MaxL script issues.
 Expertise in understanding DRM hierarchies and EPM data load rules.
 Worked extensively with the onsite team to gather requirements and get clarifications whenever inputs were needed on tasks.
 Involved in NiFi flow design to capture data from ODI and send it in file format to another server.
 Loaded data from various sources, including flat files and Hive, using Oracle Data Integrator.
 Designed and conceptualized the extraction strategies and methodologies for data integrity.
 Designed and developed interfaces, packages, procedures, variables, scenarios and load plans.
 Provided quick turnaround on PROD and UAT issues by analyzing logs and troubleshooting data issues.
 Worked with users to provide quick fixes for UAT and PROD issues.
 Involved in migration of NiFi flows and ODI objects across all environments.
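
As a rough illustration of that pattern: ODI procedures can run Jython-technology tasks, so a step along the following lines would shell out to the Essbase MaxL shell. The script path and error handling are assumptions, not the project's actual code:

    # Hypothetical Jython task in an ODI procedure: run a MaxL script via the
    # Essbase MaxL shell (essmsh); the script path is an illustrative assumption.
    import subprocess

    exit_code = subprocess.call(["essmsh", "/u01/scripts/load_actuals.mxl"])
    if exit_code != 0:
        # Raising makes the ODI step, and hence the package, fail visibly
        # in the Operator navigator.
        raise Exception("MaxL script failed with exit code %d" % exit_code)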

PROJECT #4: Western Digital Corporation

Company: - Deloitte Consulting (USI) Pvt Ltd, Hyderabad
Role: - ODI Developer
Client: - Western Digital Corporation
Duration: - Jul 2020 – Mar 2021

Environment: OBIA, ODI 12c, Oracle, WinSCP, PuTTY, SQL Developer, SAP HANA

Project Description:

Western Digital Corporation (abbreviated WDC, commonly known as simply Western Digital and WD) is an American computer hard disk drive manufacturer and data storage company. It designs, manufactures and sells data technology products, including storage devices, data center systems and cloud storage services. Western Digital has a long history in the electronics industry as an integrated circuit maker and a storage products company. It is one of the largest computer hard disk drive manufacturers, along with producing SSDs and flash memory devices. Its competitors include the data management and storage companies Seagate Technology and Micron Technology.

Roles and Responsibilities:

 Worked closely with users to gather requirements and participated in client meetings.
 Worked extensively on ODI mapping development based on the design document and requirements, sourcing from a SAP legacy system, flat files, Hive DB (HQL), HDFS and Oracle ERP.
 Involved in end-to-end implementation of a data warehouse built for Oracle EBS and SAP.
 Followed an approach similar to BI Apps to load the facts and dimensions.
 Worked extensively on creating dimension, fact-stage and fact mappings.
 Increased load performance by applying proper hints and indexes, analyzing master tables, and checking cost via explain plans (see the sketch after this list).
 Worked closely with clients to clarify blockers in development.
 Coordinated with the onsite and offshore teams and worked on service requests as a team.
 Built the data warehouse and worked on incremental and full data loads.
 Worked on preparing the data model, data lineage and data flow diagrams.
 Worked on creating and running ODI scenarios, packages and procedures to load the foundation tables.
 Created and imported/exported various sources, targets and mappings in interfaces using ODI Designer; generated scenarios and scheduled them as per client requirements.
 Prepared the necessary technical documents and created or updated the existing data model.
 Designed and conceptualized the extraction strategies and methodologies for data integrity.
 Designed and developed interfaces, packages, procedures, variables, scenarios and load plans.
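
A minimal sketch of that kind of cost check, written against the python-oracledb driver rather than the project's actual tooling; the connection details, hint and table names are assumptions:

    # Run EXPLAIN PLAN for a load query, then print the optimizer's plan
    # (including cost) via DBMS_XPLAN. All names are illustrative assumptions.
    import os
    import oracledb

    conn = oracledb.connect(user="dw_etl",
                            password=os.environ["DW_PWD"],
                            dsn="dwhost/dwpdb")
    cur = conn.cursor()

    cur.execute("""
        EXPLAIN PLAN FOR
        SELECT /*+ USE_HASH(f d) */ f.order_id, d.customer_name
        FROM   w_sales_f f
        JOIN   w_customer_d d ON f.customer_wid = d.row_wid
    """)

    cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
    for (line,) in cur:
        print(line)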

PROJECT #3: Exelon Corporation

Company: - Deloitte Consulting (USI) Pvt Ltd, Hyderabad
Role: - ODI Developer
Client: - Exelon Corporation
Duration: - Aug 2019 – Jul 2020

Environment: ODI 12c, Hive, Oracle, HDFS, Spark, UC4, WinSCP, PuTTY, SQL Developer

Project Description:

Exelon Corporation is an American Fortune 100 energy company headquartered in Chicago, Illinois and incorporated in Pennsylvania. It generates revenues of approximately $33.5 billion and employs approximately 33,400 people. Exelon is the largest electric parent company in the United States by revenue, the largest regulated electric utility in the United States with approximately 10 million customers, and the largest operator of nuclear power plants in the United States as well as the largest non-governmental operator of nuclear power plants in the world.

Roles and Responsibilities:

 Handled development activities, including creation of the Hive layer and development of ETL jobs for the data pipeline between the Hadoop and Oracle layers.
 Worked extensively on ODI mapping development based on the design document and requirements, sourcing from Hive.
 Worked closely with clients to clarify blockers in development.
 Worked on creating and running ODI scenarios, packages and procedures to load the foundation tables.
 Worked with HDFS commands to perform the required operations on the Hadoop file system (see the sketch after this list).
 Involved in data quality analysis to determine cleansing requirements; designed and developed ODI interfaces for data loads.
 Involved in debugging mappings developed in the Spark unit within the YARN client.
 Worked on SIT, UAT and PROD bug fixes to deliver objects on time.
 Loaded data from various sources, including flat files, Oracle tables and Hive, using Oracle Data Integrator.
 Prepared the necessary technical documents and created or updated the existing data model.
 Designed and conceptualized the extraction strategies and methodologies for data integrity.
 Designed and developed interfaces, packages, procedures, variables, scenarios and load plans.
 Worked extensively with reusable mappings to handle complex requirements.
 Good experience using ODI tools such as OdiSleep, OdiFileMove, OdiFileAppend and OdiFileCopy.
 Debugged master jobs and packages using the Operator navigator.
 Worked on migration of objects using GitHub.
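
A few of the routine HDFS operations that bullet refers to, wrapped in Python for illustration; every path and file name is an assumption:

    # Sketch of everyday HDFS file-system operations driven via subprocess.
    import subprocess

    def hdfs(*args):
        """Run an 'hdfs dfs' command and raise if it fails."""
        subprocess.run(["hdfs", "dfs", *args], check=True)

    hdfs("-mkdir", "-p", "/data/landing/source_a")                # landing dir
    hdfs("-put", "-f", "extract.csv", "/data/landing/source_a/")  # upload file
    hdfs("-ls", "/data/landing/source_a")                         # verify load
    hdfs("-rm", "-r", "-skipTrash", "/data/landing/old_run")      # clean up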


PROJECT #2: Igloo Inventory and Shipment Management


Company: - Forsys Software India Pvt. Ltd, Hyderabad
Role: - ODI Developer
Client: - Igloo Products Corporation
Duration: - Oct 2017 – Aug 2019

Environment: OBIEE 11g, Oracle 11g, ODI 11g, PL/SQL, Solaris, Windows 2008 Server

Project Description:
For over 60 years, Igloo® has remained the #1 brand in the coolers industry worldwide. Since originating the cooler category in 1947, the Igloo brand has been synonymous with quality and durability. Research shows that three in every four U.S. households own at least one Igloo cooler, making Igloo the most popular cooler brand in the U.S.

Roles and Responsibilities:

 Involved in modification of the local object mask to simplify the creation of a large number of interfaces.
 Involved in data quality analysis to determine cleansing requirements; designed and developed ODI interfaces for data loads.
 Loaded data from various sources, including flat files and Oracle tables, using Oracle Data Integrator.
 Reverse-engineered the required tables for development.
 Designed and conceptualized the extraction strategies and methodologies for data integrity.
 Designed and developed interfaces, packages, procedures, variables and scenarios.
 Created master and work repositories and configured ODI connections for source and target using Topology.
 Worked extensively with yellow interfaces to handle complex requirements.
 Involved in production support activities.
 Implemented an email alert to the DBA on interface execution failure (see the sketch after this list).
 Carried out development activities in ODI such as building interfaces and using Knowledge Modules.
 Extensive knowledge of using Knowledge Modules.
 Good experience using ODI tools such as OdiFileMove, OdiFileAppend and OdiFileCopy.
 Debugged master jobs and packages using the Operator navigator.
 Worked extensively on OBIEE reports and dashboards.
 Understood requirements and customized the repository to meet them at the Physical Layer, Business Model & Mapping Layer and Presentation Layer levels.
 Created drill-down hierarchies and implemented business logic and facts.
 Identified the facts, dimensions, grain of the dimensional model, and referential joins.
 Designed various reports (drill-down, pivot) in Answers and configured Intelligent Dashboards for different groups of users to view data across regions and customer demography, with the facility to drill down to detail-level information.
 Security: overall security, object-level security, OBIEE content and data-level security.
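
A minimal sketch of such a failure alert in plain Python (the project itself would more likely use an ODI mechanism such as the OdiSendMail tool); the mail host and addresses are assumptions:

    # Hypothetical failure alert to the DBA team via the standard smtplib.
    import smtplib
    from email.message import EmailMessage

    def alert_dba(interface_name, error_text):
        """Email the DBA team when an interface execution fails."""
        msg = EmailMessage()
        msg["From"] = "etl-monitor@example.com"     # assumed sender
        msg["To"] = "dba-team@example.com"          # assumed DBA list
        msg["Subject"] = "ODI interface %s failed" % interface_name
        msg.set_content("Interface %s failed with error:\n%s"
                        % (interface_name, error_text))
        with smtplib.SMTP("mailhost.example.com") as smtp:
            smtp.send_message(msg)

    # Example call from an error-handling step:
    # alert_dba("INT_LOAD_SHIPMENTS", "ORA-01400: cannot insert NULL ...")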


PROJECT #1: Calix

Company: - Forsys Software India Pvt. Ltd, Hyderabad
Role: - OBIEE/ODI Developer
Client: - Calix, USA
Duration: - Oct 2016 to Sep 2017

Operating Systems : MS-DOS, Red Hat Linux 5.0, Windows 95, 98, 2000 Server and NT
ETL/ELT Tools     : Informatica 8.x/9.x (basic knowledge), ODI 11g
OLAP Tools        : OBIEE 11g
Back-end          : Oracle 11g
Languages         : SQL

Project Description:
Calix is a leading global provider of broadband communications access systems and software. The Calix
Unified Access portfolio allows service providers to connect to their residential and business subscribers
and deploy virtually any service over fiber- and copper-based network architectures. With more than
1000 customers – whose networks serve over 50 million subscriber lines in total – Calix is at the
forefront of enabling the innovative ways that communications service providers deliver advanced
broadband services and value to their customers.

Roles & Responsibilities:
 Worked closely with users to gather requirements and participated in client meetings.
 Trained business users in ad hoc reporting.
 Customized and developed the RPD; configured Analytics metadata objects (subject areas, tables, columns) and web catalog objects (dashboards, pages, folders, reports).
 Created reports using different Analytics views (pivot table, chart, column selector, table).
 Created new users/groups and provided access to the respective reports/subject areas.
 Monitored the OBIEE services (Dev, QA, UAT and Production), agents, the help desk ticket system and emails for user tickets.
 Migrated the RPD and web catalog across all environments (Dev, QA, UAT and Production) based on business requirements.
 Migrated user accounts among multiple environments (Dev to QA, QA to UAT and UAT to Production).
 Bounced the servers whenever required.
 Cleared log/tmp files and took backups of the RPD and catalog as part of scheduled server maintenance (see the sketch after this list).
 Changed the OBIEE customer logo and tab names as per client requirements.
 Enhanced some existing reports and dashboards and fixed issues related to performance, data quality etc. during production support.
 Used the ODI (Oracle Data Integrator) ELT tool extensively to create a data warehousing OLAP model.
 Created master and work repositories and configured ODI connections for source and target systems in Topology.
 Extracted data from the Oracle (EBS) source and loaded it into staging-area and data warehouse tables using ODI.
 Customized interfaces, packages, scenarios, variables, sequences, procedures and load plans as per the ELT specifications (BRS) using ODI.
 Worked with multiple sources such as relational databases and flat files for extraction.
 Analyzed logs and troubleshot issues.
 Carried out development activities in ODI such as building interfaces and using Knowledge Modules when required.
 Migrated ODI objects from one environment to other environments (DEV, QA, UAT and PROD).
 Working knowledge of Knowledge Modules (LKM, IKM, JKM and CKM).
 Debugged master jobs and packages using the Operator navigator.
 Performed unit testing at various levels of ELT flows.
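
A minimal sketch of that scheduled maintenance task, with every path an illustrative assumption: back up the RPD and web catalog, then clear logs older than a week:

    # Hypothetical maintenance job: back up RPD and catalog, clean old logs.
    import shutil
    import time
    from pathlib import Path

    stamp = time.strftime("%Y%m%d")
    backup_dir = Path("/backup/obiee") / stamp
    backup_dir.mkdir(parents=True, exist_ok=True)

    # Back up the repository file and zip the presentation catalog.
    shutil.copy2("/obiee/repository/live.rpd", backup_dir / "live.rpd")
    shutil.make_archive(str(backup_dir / "webcatalog"), "zip",
                        "/obiee/catalog/analytics")

    # Remove log files older than seven days.
    cutoff = time.time() - 7 * 86400
    for log_file in Path("/obiee/logs").glob("*.log"):
        if log_file.stat().st_mtime < cutoff:
            log_file.unlink()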

Education: -

Qualification   University / Board                Name of the Institute                       Percentage   Year of Passing
MCA             Osmania University                DVR PG Institute of Computer Application    71.22        2011
BSc (MPCs)      Kakatiya University               Sree Chaitanya Degree College, Karimnagar   70.70        2008
Intermediate    Board of Intermediate Education   Master Junior College, Karimnagar           68.3         2005
SSC             Board of Secondary Education      VVNHS, Karimnagar                           83.3         2003
