CSR304: INTRODUCTION TO AI AND ML
L:3 T:0 P:2 Credits:4
Course Outcomes: Through this course, students should be able to:
CO1 :: understand the foundational principles of AI and ML, including their definitions,
applications, ethical considerations, and fundamental terminologies.
CO2 :: understand the fundamentals of Machine Learning, focusing on types of ML, data
preprocessing, feature engineering, and basic statistical analysis required for ML models.
CO3 :: develop competencies in building and evaluating ML models, specifically through
techniques in linear and logistic regression, model training, and cross-validation methods.
CO4 :: gain proficiency in advanced ML techniques such as decision trees, random forests,
support vector machines, and clustering.
CO5 :: explore basic AI concepts, including neural networks and deep learning frameworks,
while understanding strategic AI problem-solving methods.
CO6 :: explore advanced AI topics like Natural Language Processing (NLP) and Computer Vision,
learning about their core technologies and applications.
Unit I
Introduction to Artificial Intelligence and Machine Learning: History and Evolution: Tracing
the milestones from early concepts to modern algorithms; Defining AI and ML: Distinctions between
artificial intelligence, machine learning, and deep learning; Applications and Case Studies: Examples
from healthcare, finance, automotive, and more; Ethical Considerations: Issues of bias, fairness, and
transparency in AI/ML applications; Fundamental Terminologies: Key terms such as algorithm, neural
network, training data, etc.
Unit II
Fundamentals of Machine Learning: Types of ML Systems: Differences between supervised,
unsupervised, semi-supervised, and reinforcement learning; Data Handling: Techniques for gathering,
cleaning, and organizing data; Feature Engineering: Importance of feature selection, extraction, and
dimensionality reduction; Statistical Foundations: Descriptive statistics, inferential statistics,
probability distributions
Unit III
Building Machine Learning Models: Linear Regression and Logistic Regression: Theories,
applications, and assumptions; Model Evaluation Techniques: Understanding different metrics to
assess model performance; Overfitting and Underfitting: Techniques to balance the bias-variance
trade-off, such as regularization; Cross-validation Methods: Implementing k-fold and leave-one-out
cross-validation for model reliability
Unit IV
Advanced Machine Learning Techniques: Tree-based Models: Detailed exploration of decision
trees, bagging, boosting, and random forests; Support Vector Machines: Kernel tricks,
hyperparameter tuning, and their impact on model performance; Clustering and Dimensionality
Reduction: Techniques including k-means, DBSCAN, and PCA applications
Unit V
Foundations of Artificial Intelligence: Introduction to Neural Networks: Architecture, activation
functions, and forward/backward propagation; Deep Learning Architectures: Introduction to CNNs,
RNNs, and their applications; AI Strategies: Algorithms for pathfinding, minimax for game playing,
and heuristic search techniques
Unit VI
Advanced Topics in Artificial Intelligence: Natural Language Processing: Tokenization,
stemming, lemmatization, and sentiment analysis; Computer Vision: Fundamentals of image
recognition, object detection, and applications in real-world scenarios; Robotics and AI: Sensors,
actuators, and the integration of AI for autonomous decision-making
List of Practicals / Experiments:
• Implement an interactive timeline using Plotly in Python that visualizes the major milestones in the
development of artificial intelligence and machine learning.
• Create a Python notebook that allows users to perform principal component analysis (PCA); see Sketch 1 below.
• Write a Python simulation that models an AI's decision-making process in predicting patient
treatment outcomes based on their medical history.
• Develop a Python notebook that guides users through the process of cleaning a real-world dataset,
including handling missing values, filtering outliers, and normalizing data (see Sketch 2 below).
• Develop a Python script that evaluates a logistic regression model using various metrics and
cross-validation techniques (see Sketch 3 below).
• Write detailed Python code to perform linear regression on a real-world dataset, such as
predicting housing prices from various features (see Sketch 4 below).
• Build a decision tree classifier to predict wine quality. Evaluate the model's performance
using accuracy and visualize the tree (see Sketch 5 below).
• Implement a Bagging classifier using decision trees as the base estimator. Compare its performance
with the standalone decision tree.
• Apply a Gradient Boosting classifier to the dataset. Evaluate its performance and compare it with the
previous models.
• Construct a Random Forest classifier and evaluate its performance. Discuss the importance of
different features as indicated by the model.
• Implement a Support Vector Machine classifier with a linear kernel to predict the quality of the wine.
Evaluate its performance.
• Experiment with different kernel tricks (e.g., polynomial, RBF) and hyperparameter tuning to improve
the model's performance. Compare the results.
• Use k-means clustering to group the wines into clusters based on their features. Visualize the clusters
and interpret the results (see Sketch 6 below).
• Apply the DBSCAN algorithm to the dataset and compare its clustering results with those of k-means.
• Perform Principal Component Analysis (PCA) to reduce the dimensionality of the dataset. Visualize
the first two principal components and discuss the variance explained by them.
• Use the dimensionality-reduced data from PCA as input to the decision tree and SVM models.
Evaluate and compare the performance with the original data.
• Design a simple neural network (Multi-Layer Perceptron) with one hidden layer to classify the digits (see Sketch 7 below).
• Experiment with different activation functions (e.g., ReLU, sigmoid, tanh) for the hidden layer and
observe their impact on the model’s performance.
• Implement the forward and backward propagation steps for training the neural network. Use an
appropriate loss function and optimization algorithm to train the model. Evaluate its performance on
a test set.
• Build a simple CNN to classify the handwritten digits. Use convolutional layers, pooling layers, and
fully connected layers in your architecture (see Sketch 8 below).
• Experiment with different configurations of CNN layers (e.g., number of filters, kernel size) and
compare the performance with the neural network.
• Although RNNs are typically applied to sequential data rather than images, implement a simple RNN
to understand its architecture. Use a simple sequential dataset, such as a short time series
(e.g., predicting the next number in a sequence).
• Evaluate the RNN’s performance and discuss its suitability for this type of task compared to CNNs.
• Load a text dataset (e.g., a collection of news articles) and perform tokenization. Experiment with
different tokenization techniques (e.g., word-level, sentence-level); see Sketch 9 below, which also covers the stemming and lemmatization step.
• Apply stemming and lemmatization on the tokenized text. Compare the results and discuss the
differences.
• Use a pre-built sentiment analysis tool (e.g., VADER, TextBlob) to perform sentiment analysis on the text dataset.
Evaluate its performance and visualize the sentiment distribution (see Sketch 10 below).
• Load a dataset of images (e.g., CIFAR-10) and build a simple image classification model using a
Convolutional Neural Network (CNN). Evaluate the model's performance.
• Use a pre-trained object detection model (e.g., YOLO, SSD) to detect objects in images. Visualize the
detected objects with bounding boxes.
• Choose a real-world application (e.g., facial recognition, automated attendance system) and
demonstrate how computer vision techniques can be applied. Implement a basic version of the
chosen application and evaluate its performance.
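Illustrative code sketches (indicative starting points only; the datasets, file names, and libraries named below are assumptions, not requirements of the syllabus):
Sketch 1 (PCA notebook): a minimal outline of the PCA practical, assuming scikit-learn's built-in iris data as a stand-in for whatever dataset the notebook actually loads; the interactive notebook wrapper is omitted.
    # Sketch 1: PCA on a standardized dataset (iris used here as a stand-in).
    from sklearn.datasets import load_iris
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    X, y = load_iris(return_X_y=True)
    X_std = StandardScaler().fit_transform(X)      # PCA is scale-sensitive
    pca = PCA(n_components=2)
    X_2d = pca.fit_transform(X_std)
    print("Explained variance ratio:", pca.explained_variance_ratio_)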
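Sketch 2 (data cleaning): one possible cleaning pipeline in pandas; the file name data.csv is a placeholder for the real-world dataset chosen in class, and the sketch assumes the cleaned columns are numeric.
    # Sketch 2: basic cleaning - impute missing values, IQR outlier filter, min-max scaling.
    import pandas as pd

    df = pd.read_csv("data.csv")                                 # placeholder file name
    num_cols = df.select_dtypes("number").columns
    df[num_cols] = df[num_cols].fillna(df[num_cols].median())    # fill missing values with medians

    q1, q3 = df[num_cols].quantile(0.25), df[num_cols].quantile(0.75)
    iqr = q3 - q1
    keep = ((df[num_cols] >= q1 - 1.5 * iqr) & (df[num_cols] <= q3 + 1.5 * iqr)).all(axis=1)
    df = df[keep]                                                # drop rows containing outliers

    df[num_cols] = (df[num_cols] - df[num_cols].min()) / (df[num_cols].max() - df[num_cols].min())
    print(df.describe())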
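Sketch 3 (logistic regression evaluation): scores a logistic regression model with several metrics via k-fold cross-validation, assuming scikit-learn's built-in breast-cancer data as a stand-in binary classification dataset.
    # Sketch 3: logistic regression scored with several metrics via 5-fold cross-validation.
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_validate

    X, y = load_breast_cancer(return_X_y=True)
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    metrics = ["accuracy", "precision", "recall", "f1", "roc_auc"]
    scores = cross_validate(model, X, y, cv=5, scoring=metrics)
    for name in metrics:
        print(name, scores["test_" + name].mean())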
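Sketch 4 (linear regression): house-price regression, assuming scikit-learn's California housing data as the "real-world" dataset.
    # Sketch 4: linear regression on the California housing data, evaluated with MSE and R^2.
    from sklearn.datasets import fetch_california_housing
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error, r2_score

    X, y = fetch_california_housing(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    reg = LinearRegression().fit(X_train, y_train)
    pred = reg.predict(X_test)
    print("MSE:", mean_squared_error(y_test, pred), "R^2:", r2_score(y_test, pred))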
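Sketch 5 (decision tree on wine quality): assumes the UCI red wine-quality file is available locally as winequality-red.csv (semicolon-separated, with a "quality" column); adjust the path and separator to your copy.
    # Sketch 5: decision tree classifier for wine quality, plus a tree plot.
    import pandas as pd
    import matplotlib.pyplot as plt
    from sklearn.tree import DecisionTreeClassifier, plot_tree
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    wine = pd.read_csv("winequality-red.csv", sep=";")   # assumed local copy of the UCI file
    X, y = wine.drop(columns="quality"), wine["quality"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    tree = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_train, y_train)
    print("Accuracy:", accuracy_score(y_test, tree.predict(X_test)))
    plot_tree(tree, feature_names=list(X.columns), filled=True)
    plt.show()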
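Sketch 6 (k-means clustering): clusters scikit-learn's built-in wine data (a stand-in for the class dataset) and visualizes the clusters in the first two principal components.
    # Sketch 6: k-means on standardized wine features, visualized via PCA.
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_wine
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    X, _ = load_wine(return_X_y=True)
    X_std = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X_std)
    X_2d = PCA(n_components=2).fit_transform(X_std)
    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=labels)
    plt.xlabel("PC1"); plt.ylabel("PC2"); plt.title("k-means clusters")
    plt.show()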
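Sketch 7 (one-hidden-layer neural network): uses scikit-learn's MLPClassifier on the built-in digits data; a Keras model would work equally well if that is the framework used in class.
    # Sketch 7: multi-layer perceptron with a single hidden layer on the digits data.
    from sklearn.datasets import load_digits
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    mlp = MLPClassifier(hidden_layer_sizes=(64,), activation="relu", max_iter=500, random_state=42)
    mlp.fit(X_train, y_train)
    print("Test accuracy:", mlp.score(X_test, y_test))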
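Sketch 8 (simple CNN): a small Keras/TensorFlow network for handwritten digits, assuming the MNIST data shipped with Keras; the layer counts and filter sizes are arbitrary starting points to be varied in the follow-up practical.
    # Sketch 8: small CNN (conv -> pool -> conv -> pool -> dense) on MNIST.
    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0   # add channel axis and scale to [0, 1]
    x_test = x_test[..., None] / 255.0

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.1)
    print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])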
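Sketch 9 (tokenization, stemming, lemmatization): NLTK applied to a short hard-coded sample; in the practical this would be replaced by the news-article corpus, and the nltk.download calls need a one-time internet connection.
    # Sketch 9: tokenize a sample text, then compare stemming and lemmatization.
    import nltk
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    nltk.download("punkt"); nltk.download("wordnet")      # one-time resource downloads
    text = "The striped bats are hanging on their feet. They flew away quickly."
    sentences = nltk.sent_tokenize(text)                  # sentence-level tokenization
    words = nltk.word_tokenize(text)                      # word-level tokenization

    stemmer, lemmatizer = PorterStemmer(), WordNetLemmatizer()
    for w in words:
        print(w, "->", stemmer.stem(w), "/", lemmatizer.lemmatize(w))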
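Sketch 10 (sentiment analysis with VADER): applies NLTK's VADER analyzer to a few sample sentences; swap in the tokenized article text from Sketch 9 and add a histogram of the compound scores to visualize the sentiment distribution.
    # Sketch 10: rule-based sentiment scoring with VADER.
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon")                        # one-time resource download
    sia = SentimentIntensityAnalyzer()
    docs = ["The product is excellent and arrived early.",
            "This is the worst service I have ever had.",
            "It was okay, nothing special."]
    for doc in docs:
        print(sia.polarity_scores(doc)["compound"], doc)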
Text Books:
1. ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING by VINOD CHANDRA S. S. AND ANAND
HAREENDRAN S., PRENTICE HALL
References:
1. ARTIFICIAL INTELLIGENCE AND INTELLIGENT SYSTEMS by N. P. PADHY, OXFORD
UNIVERSITY PRESS
2. THE ELEMENTS OF STATISTICAL LEARNING by T. HASTIE, R. TIBSHIRANI AND J.
FRIEDMAN, SPRINGER