- 🎓 Undergraduate at IIT BHU, pursuing Engineering Physics
- 🤖 Passionate about Representation Learning, Natural Language Processing, Probabilistic ML, and Reinforcement Learning
- 👯 Open to research collaborations in LLMs, Generative Models, and Quantum Machine Learning
- 📫 Reach me at:
Oct 2024 – Present
Exposure: LLM Fine-Tuning, PEFT, LoRA, LLaMA Adapters, Emotion Extraction, Prompt Tuning
• Researched low-resource LLM fine-tuning methods and emotion extraction from text.
• Working on synthetic interview generation for depression detection with the DAIC-WOZ dataset, using models fine-tuned on Reddit data and integrated with emotion detection.
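The LoRA fine-tuning mentioned above can be illustrated with a minimal NumPy sketch of the low-rank update idea (the shapes and names here are hypothetical, not the project's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes for one frozen projection matrix (d_out x d_in).
d_out, d_in, r, alpha = 8, 8, 2, 4
W = rng.normal(size=(d_out, d_in))   # frozen pretrained weight

# LoRA trains only two small matrices; B starts at zero, so the adapted
# layer initially computes exactly the same output as the frozen model.
A = rng.normal(scale=0.01, size=(r, d_in))
B = np.zeros((d_out, r))

def lora_forward(x):
    # y = W x + (alpha / r) * B (A x); only A and B would receive gradients.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
y = lora_forward(x)
```

The appeal for low-resource settings is the parameter count: training `A` and `B` touches `r * (d_in + d_out)` values instead of the full `d_in * d_out` weight matrix.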
Aug 2023 – Apr 2024
Exposure: GANs, Graph Neural Networks, Fuzzy Logic, DeepWalk, Node2Vec, Struc2Vec
• Redesigned GraphGAN with Wasserstein Loss, improving accuracy from 84.7% to 88.57%.
• Implemented DeepWalk, Node2Vec, and Struc2Vec for node embeddings.
• Developed a Fuzzy Pre-processing Layer using a modified K-Means algorithm to boost accuracy to 88.95%.
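As a rough illustration of the fuzzy pre-processing idea, here is a standard fuzzy c-means membership update in NumPy; this is a generic sketch of soft cluster assignment, not the project's modified K-Means:

```python
import numpy as np

def fuzzy_memberships(X, centers, m=2.0):
    """Soft memberships u[i, k] in [0, 1] via the standard fuzzy c-means rule."""
    # Pairwise distances d[i, k] = ||x_i - c_k|| (epsilon avoids divide-by-zero).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    # u[i, k] = 1 / sum_j (d[i, k] / d[i, j]) ** (2 / (m - 1))
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
centers = np.array([[-1.0, 0.0], [1.0, 0.0]])
U = fuzzy_memberships(X, centers)  # each row sums to 1
```

Unlike hard K-Means, each point keeps a graded degree of membership in every cluster, which is what a downstream fuzzy layer can exploit.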
Sep 2024 – Present
Exposure: Super Resolution, Quantization, Knowledge Distillation, LipSync, Mixed Precision Training
• Worked on inference optimization with Quantization, Knowledge Distillation, and Batch Inference.
• Applied Mixed Precision Training and Post-training Quantization on CodeFormer.
• Reduced inference time by 25% and total forward-pass time by 50%.
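The post-training quantization step can be sketched in a few lines; this is a generic symmetric int8 scheme in NumPy, not the actual CodeFormer pipeline:

```python
import numpy as np

def quantize_int8(w):
    # Symmetric per-tensor PTQ: a single float scale maps weights to int8,
    # shrinking storage 4x and enabling integer matmul kernels at inference.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights for accuracy checks.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)   # round-trip error is at most scale / 2
```

Real toolchains (e.g. PyTorch's quantization utilities) add per-channel scales and activation calibration on top of this basic scheme.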
May 2025 – Present
- Contributing to AI for Code tooling and research problems.
- Exploring techniques for intelligent code understanding and generation.
Enriching Pre-Training Using Fuzzy Logic - Vansh Gupta, Vandana Bharti, Abhinav Kumar, Anshul Sharma, Sanjay Kumar Singh. IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2025)
- Focuses on enhancing language representations by integrating fuzzy logic into the pre-training phase.