Akanksha Ashok Shetty
PROFESSIONAL SUMMARY
Motivated and detail-oriented professional with experience in data analysis, backend development, and data
engineering. Proven ability to design automated workflows, streamline reporting processes, and support business
decision-making through data-driven insights. Comfortable working across cross-functional teams in fast-paced, Agile
environments.
EDUCATION:
UNIVERSITY OF BRIDGEPORT May 2024
Master of Science (Analytics & Systems) Bridgeport, CT
MUMBAI UNIVERSITY May 2020
Bachelor's (Accounting & Finance) Mumbai, India
TECHNICAL SUMMARY:
● Programming Languages: Python, PySpark.
● Web Frameworks: Flask, Django.
● Database Technologies: MySQL, MS SQL Server, MongoDB, PostgreSQL.
● Data Visualization Tools: Tableau, Power BI.
● Cloud and Dev Tools: AWS (S3, CloudWatch, CloudTrail, RDS, Glue, SageMaker).
● API Documentation and Testing: Swagger, Postman.
● Version Control: GitHub.
● Microsoft Office Suite: MS Excel (VLOOKUP, HLOOKUP, pivot tables), PowerPoint, and Outlook.
● Testing Frameworks: Pytest, unittest.
● Collaboration and Project Management: JIRA, Confluence, Agile methodologies.
PROFESSIONAL EXPERIENCE:
Python Developer July 2024 – Present
Qdata, TX, United States
• Building and maintaining PySpark-based ETL pipelines on AWS Glue to process data from Zendesk
(support), Stripe (billing), and product usage logs (JSON) stored in S3.
• Writing Python ingestion scripts to pull data from REST APIs and store structured and semi-structured data in the
S3 raw zone with metadata tagging for downstream processing.
• Developing lightweight Flask microservices to expose curated metrics via REST endpoints, used for
automated reporting, alerting, and dashboard integrations (see the Flask sketch below).
• Implementing data transformations to flatten nested fields, normalize time zones, and join cross-system data
using customer and subscription keys (see the PySpark sketch below).
• Managing final data storage in AWS RDS (PostgreSQL), enabling fast SQL queries and Tableau dashboard
integration for end users.
• Creating SQL views and materialized tables for tracking metrics such as ticket resolution time, CSAT trends,
usage-to-renewal correlation, and subscription health signals.
• Designing interactive Tableau dashboards for support leads and CSMs to visualize agent performance,
engagement drop-offs, and billing risk indicators.
• Maintaining API documentation in Swagger UI, committing updates via Git, and collaborating in Agile
sprints using JIRA.
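
Below is a minimal, illustrative PySpark sketch of the transformation step described in this role: flattening nested JSON fields, normalizing timestamps to UTC, joining on a shared customer key, and loading the curated result into PostgreSQL on RDS. The bucket paths, column names, connection details, and table names are hypothetical placeholders, not the production schema.

# Illustrative PySpark transformation sketch; all paths, columns, and credentials are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("support_billing_etl").getOrCreate()

# Read raw JSON exports landed in the S3 raw zone by the ingestion scripts.
tickets = spark.read.json("s3://example-raw-zone/zendesk/tickets/")
invoices = spark.read.json("s3://example-raw-zone/stripe/invoices/")

# Flatten nested fields and normalize timestamps to UTC.
tickets_flat = tickets.select(
    F.col("id").alias("ticket_id"),
    F.col("via.channel").alias("channel"),  # nested struct field flattened
    F.to_utc_timestamp(F.to_timestamp("created_at"), "America/Chicago").alias("created_utc"),
    F.col("organization_id").alias("customer_id"),
)

invoices_flat = invoices.select(
    F.col("customer_id"),
    F.col("subscription_id"),
    (F.col("amount_due") / 100).alias("amount_due_usd"),
    F.to_utc_timestamp(F.to_timestamp("invoiced_at"), "UTC").alias("invoiced_utc"),
)

# Join cross-system data on the shared customer key.
curated = tickets_flat.join(invoices_flat, on="customer_id", how="left")

# Load the curated result into AWS RDS (PostgreSQL) for SQL queries and Tableau.
curated.write.jdbc(
    url="jdbc:postgresql://example-host:5432/analytics",
    table="curated.support_billing_metrics",
    mode="overwrite",
    properties={"user": "etl_user", "password": "***", "driver": "org.postgresql.Driver"},
)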
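
And a minimal Flask sketch of the kind of metrics microservice mentioned above: a single read-only endpoint that queries the curated RDS table and returns JSON for reporting and alerting consumers. The route, SQL, metric definition, and connection string are illustrative assumptions.

# Minimal Flask metrics endpoint sketch; database URL, table, and metric are placeholders.
from flask import Flask, jsonify
import psycopg2

app = Flask(__name__)
DB_URL = "postgresql://readonly_user:***@example-host:5432/analytics"  # placeholder

@app.route("/metrics/ticket-resolution", methods=["GET"])
def ticket_resolution():
    """Return average ticket resolution time (hours) from the curated RDS table."""
    with psycopg2.connect(DB_URL) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT AVG(EXTRACT(EPOCH FROM (resolved_utc - created_utc)) / 3600) "
                "FROM curated.support_billing_metrics WHERE resolved_utc IS NOT NULL;"
            )
            avg_hours = cur.fetchone()[0]
    return jsonify({"metric": "avg_ticket_resolution_hours", "value": float(avg_hours or 0)})

if __name__ == "__main__":
    app.run(port=5000)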
Data Engineer Nov 2019 – Sep 2022
Fin Solution, India
• Automated the collection of high-volume structured and semi-structured data from internal systems in formats
like CSV, JSON, and XML; built ingestion pipelines using Python and PySpark to cleanse, transform, and
load the data into AWS RDS (PostgreSQL) for downstream analytics and regulatory reporting.
• Built modular SQL queries to extract, aggregate, and monitor performance metrics such as NPA ratios, DPD
bands, collection efficiency, and delinquency rates, supporting regulatory and executive reporting.
• Designed and deployed cloud-native ETL pipelines using PySpark and AWS Glue to ingest, cleanse, and
transform large-scale datasets including loan disbursal records, delinquency data, credit scoring inputs, and
repayment behavior.
• Automated financial reconciliation workflows in Python, enabling cross-validation between repayment
schedules and actual collections, significantly reducing manual intervention and ensuring audit compliance (see the pandas sketch below).
• Developed internal tools and modular microservices using Django REST Framework, including borrower
search, repayment status, and credit scoring APIs, secured with JWT authentication and exposed via API
Gateway for controlled access to borrower insights and risk-tier dashboards.
• Created and maintained interactive Tableau dashboards for executives and risk teams to monitor repayment
performance, regional NPA distribution, collection trends, and customer segmentation analytics.
• Applied machine learning models to aggregated historical datasets to identify consumer behavior trends,
enhancing lead targeting and improving forecast precision for loan recovery and renewals.
• Supported Agile-based development cycles using JIRA for sprint tracking, Git for version control, and
Confluence for documenting data pipelines, service architecture, and analytical models.
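
A hedged pandas sketch of the reconciliation workflow referenced above: cross-checking scheduled repayments against aggregated collections and flagging exceptions for review. The file layout, column names, and matching tolerance are assumptions for illustration, not the production rules.

# Illustrative reconciliation sketch; file names, columns, and tolerance are placeholders.
import pandas as pd

def reconcile(schedule_path: str, collections_path: str, tolerance: float = 1.0) -> pd.DataFrame:
    """Flag scheduled repayments with no matching collection within a tolerance."""
    schedule = pd.read_csv(schedule_path, parse_dates=["due_date"])
    collections = pd.read_csv(collections_path, parse_dates=["paid_date"])

    # Aggregate collections per loan and due month before comparing.
    schedule["due_month"] = schedule["due_date"].dt.to_period("M")
    collections["due_month"] = collections["paid_date"].dt.to_period("M")
    collected = collections.groupby(["loan_id", "due_month"], as_index=False)["amount_paid"].sum()

    merged = schedule.merge(collected, on=["loan_id", "due_month"], how="left")
    merged["amount_paid"] = merged["amount_paid"].fillna(0.0)
    merged["variance"] = merged["amount_due"] - merged["amount_paid"]
    merged["status"] = merged["variance"].abs().le(tolerance).map(
        {True: "MATCHED", False: "EXCEPTION"}
    )
    return merged[merged["status"] == "EXCEPTION"]

# Example usage (paths are placeholders):
# exceptions = reconcile("repayment_schedule.csv", "collections.csv")
# exceptions.to_csv("reconciliation_exceptions.csv", index=False)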
Data Analyst Intern
• Automated monthly reporting workflows using pivot tables and VLOOKUPs, reducing manual reporting time
by 60% and improving data accuracy for compliance reviews.
• Collaborated with credit risk analysts to track delinquency rates, collection efficiency, and NPA ratios,
helping detect early-stage loan default signals.
• Worked with business stakeholders to define and monitor KPI benchmarks, including customer acquisition
cost, repayment ratio, and active loan portfolio health.
• Provided ad-hoc analytical support for underwriting teams using Excel-based profiling of borrower repayment
histories and region-wise loan performance.
RESEARCH / ACADEMIC PROJECTS:
E-Library Management System Jan 2024 – May 2024
University of Bridgeport
• Developed backend services using Python and Django REST Framework, enabling efficient e-book retrieval
and management (see the DRF sketch below).
• Utilized PostgreSQL for structured storage of user data and digital library content, with secure authentication
via Django’s built-in auth system.
• Integrated key features such as reading, annotation, and user-friendly navigation across e-book collections.
• Created Power BI reports to visualize user activity and system usage for academic presentation and evaluation.
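
A simplified Django REST Framework sketch in the spirit of the backend described above: an e-book model, serializer, and read-only viewset restricted to authenticated users via Django's built-in auth. The model fields, app label, and routing are assumptions; the actual project schema may differ.

# Illustrative DRF sketch; model fields, app label, and routes are placeholders.
from django.db import models
from rest_framework import serializers, viewsets, permissions, routers

class EBook(models.Model):
    title = models.CharField(max_length=255)
    author = models.CharField(max_length=255)
    file_url = models.URLField()
    added_on = models.DateTimeField(auto_now_add=True)

    class Meta:
        app_label = "library"  # placeholder app label

class EBookSerializer(serializers.ModelSerializer):
    class Meta:
        model = EBook
        fields = ["id", "title", "author", "file_url", "added_on"]

class EBookViewSet(viewsets.ReadOnlyModelViewSet):
    """List and retrieve e-books; access limited to authenticated users."""
    queryset = EBook.objects.all().order_by("title")
    serializer_class = EBookSerializer
    permission_classes = [permissions.IsAuthenticated]

# urls.py (sketch): register the viewset with a DRF router.
router = routers.DefaultRouter()
router.register(r"ebooks", EBookViewSet, basename="ebook")
urlpatterns = router.urls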
Predicting Employee Turnover Aug 2023 – Dec 2023
University of Bridgeport
• Built a logistic regression model using Python and Scikit-learn to predict employee turnover, leveraging
structured data from Kaggle and storing it in MySQL for efficient querying (see the scikit-learn sketch below).
• Applied feature engineering and statistical analysis to identify key drivers of attrition such as job satisfaction
and engagement levels, improving model accuracy.
• Visualized insights and model results in Tableau, presenting retention recommendations aligned with business
objectives.
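
A brief scikit-learn sketch of the modeling approach described above. The CSV path and feature names are placeholders standing in for the Kaggle dataset that was staged in MySQL; the pipeline simply scales the features and fits a logistic regression classifier, then reports held-out performance.

# Illustrative logistic regression sketch; file path and column names are placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

df = pd.read_csv("hr_attrition.csv")  # placeholder; the project queried this data from MySQL

features = ["satisfaction_score", "engagement_score", "monthly_hours", "tenure_years"]
X = df[features]
y = df["left_company"]  # 1 = employee left, 0 = employee stayed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale numeric features, then fit the logistic regression classifier.
model = Pipeline([
    ("scale", StandardScaler()),
    ("logreg", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Evaluate on the held-out set.
print(classification_report(y_test, model.predict(X_test)))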