GRAPH THEORY
IMPLEMENTATION IN NLP
GROUP MEMBERS
TALHA ISLAM 21688
HAMID ZAIB 21684
INTRODUCTION
In this presentation,
we'll uncover the
exciting combination of
computer science and
language understanding
by using graphs in
Natural Language
Processing (NLP).
CHALLENGES
NLP is tough because it involves
understanding context, meaning, and even the
emotions in human language.
Why It Matters: We'll see why these
challenges matter and how graph theory comes
to the rescue with innovative solutions.
GRAPH THEORY IN NLP
Graph Theory Defined: Graph Theory is a
mathematical tool for studying relationships
using nodes and edges.
Why It Matters in NLP: It's crucial in NLP
because it helps us uncover connections in
language, making sense of the complex
relationships between words and sentences.
Graph Power: See how graph-based models
tackle NLP challenges effectively.
WHAT CAN GNNS DO?
GNNs apply the predictive power of deep learning to rich data structures that depict objects and their relationships as
points connected by lines in a graph. Many branches of science and industry already store valuable data in graph databases.
With deep learning, they can train predictive models that unearth fresh insights from their graphs.
HOW DO GNNS WORK?
Deep learning traditionally targets structured data like text and images, which are sequences of words or grids of pixels.
Graphs, however, are unstructured and can vary in shape and content.
Graph Neural Networks (GNNs) use message passing to make graphs tractable for machine learning. Message passing propagates information about each node's neighbors through the graph, enabling AI models to detect patterns and make predictions.
• Recommendation systems match
customers with products using node
embedding in GNNs.
• Fraud detection systems find suspicious
transactions using edge embeddings.
• Drug discovery models compare entire
molecule graphs to understand their
reactions.
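The message-passing idea above can be sketched in a few lines. The example below is a minimal illustration, not a full GNN: it runs one round of neighbor aggregation on a toy 3-node graph, where each node averages its neighbors' feature vectors and combines the result with its own state. All names and the graph itself are illustrative.

```python
# Minimal sketch of one message-passing step on a toy graph
# (illustrative only; real GNNs add learned weights and nonlinearities).
import numpy as np

# Toy path graph: edges 0-1 and 1-2, stored as an adjacency list
neighbors = {0: [1], 1: [0, 2], 2: [1]}

# Initial node features (one row per node)
h = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

def message_passing_step(h, neighbors):
    """One round: each node averages its neighbors' features (the
    'messages') and adds them to its own state (the 'update')."""
    h_new = np.zeros_like(h)
    for node, nbrs in neighbors.items():
        msg = h[nbrs].mean(axis=0)   # aggregate messages from neighbors
        h_new[node] = h[node] + msg  # combine with the node's own state
    return h_new

h1 = message_passing_step(h, neighbors)
print(h1)
```

After one step, every node's embedding already encodes information about its immediate neighborhood; stacking a second step would let information travel two hops, which is why a few layers often suffice.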
HOW DO GNNS WORK?
GNNs are unique in two other ways: They
use sparse math, and the models typically
only have two or three layers. Other AI
models generally use dense math and have
hundreds of neural-network layers.
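The "sparse math" point can be made concrete: a graph's adjacency matrix is mostly zeros, so neighbor aggregation is naturally a sparse matrix product. The sketch below (illustrative, using SciPy's `csr_matrix`) sums neighbor features for a 3-node path graph in a single sparse multiplication.

```python
# Sketch: neighbor aggregation as a sparse matrix product.
# A 3-node path graph (edges 0-1 and 1-2); most adjacency entries are 0.
import numpy as np
from scipy.sparse import csr_matrix

rows = [0, 1, 1, 2]
cols = [1, 0, 2, 1]
adj = csr_matrix((np.ones(4), (rows, cols)), shape=(3, 3))

# Node features (one row per node)
h = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# One sparse product sums each node's neighbor features;
# only the nonzero adjacency entries are touched.
agg = adj @ h
print(agg)
```

On large graphs with millions of nodes but relatively few edges per node, this sparsity is what keeps aggregation affordable, in contrast to the dense matrix math of typical deep networks.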
ENHANCING CONVERSATIONAL
UNDERSTANDING AND CONTEXTUAL
RESPONSE GENERATION
In our project, we implemented Graph Theory in the context of Natural Language Processing (NLP). We started by fine-tuning a transformer-based language model for mental health assistance on a conversational dataset, configuring quantization and training the model for the task with the LoRA framework, creating a conversational chatbot.

We then used this chatbot to demonstrate the application of Graph Theory in NLP. The chatbot maintains a conversation history and generates responses based on user input, effectively modeling the conversation as a graph: each user input is a node, and the chatbot's responses connect these nodes, forming a structured dialogue graph.

This approach enables more context-aware and coherent responses by treating the entire conversation history as a connected graph. Overall, our project showcases how Graph Theory enhances the understanding and generation of natural language in the context of mental health assistance.
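The dialogue-graph idea described above can be sketched as follows. This is an illustrative toy, not the project's actual code: each turn becomes a node, each response is linked by an edge to the turn it answers, and walking the graph reconstructs the full context fed to the model. All class and method names are hypothetical.

```python
# Illustrative sketch of a dialogue graph (not the project's real code):
# turns are nodes, edges link each turn to the one it responds to.
class DialogueGraph:
    def __init__(self):
        self.nodes = []   # (speaker, text) for each turn
        self.edges = []   # (parent_index, child_index) pairs

    def add_turn(self, speaker, text):
        idx = len(self.nodes)
        self.nodes.append((speaker, text))
        if idx > 0:                      # connect to the previous turn
            self.edges.append((idx - 1, idx))
        return idx

    def context(self):
        """Walk the turns in order to rebuild the conversation history."""
        return [f"{s}: {t}" for s, t in self.nodes]

g = DialogueGraph()
g.add_turn("user", "I've been feeling anxious lately.")
g.add_turn("bot", "I'm sorry to hear that. What's been on your mind?")
g.add_turn("user", "Mostly work stress.")
print(g.context())
```

Because every turn stays connected to its predecessor, the chatbot can condition each new response on the whole graph rather than on the last message alone.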
THANK YOU