CO328 Deep Learning (Final, 23.12.23)

This document describes the Deep Learning course CO328, including: the course code, title, contact hours, examination duration, weightage, and credits; the prerequisites (linear algebra, probability, calculus, and machine learning); the objective, which is to introduce concepts of deep learning for images and sequential data; the six course units, covering topics such as neural networks, convolutional networks, recurrent neural networks, transformers, and generative models; and three suggested textbooks.

CO328 Deep Learning

NAME OF DEPARTMENT: Computer Science and Engineering


1. Subject Code: CO328 Course Title: Deep Learning
2. Contact Hours: L: 3 T: 0 P: 2
3. Examination Duration (ETE): Theory 3 Hrs, Practical 2 Hrs
4. Relative Weightage: CWS 15 PRS 25 MTE 20 ETE 40
5. Credits: 4
6. Semester: VI
7. Subject Area: DEC
8. Pre-requisite: Linear algebra, Theory of Probability, Calculus, Machine Learning
9. Objective: To introduce the concepts of deep learning for images and sequential data.
10. Details of Course

Unit I (8 contact hours): Introduction, intuition, and applications of deep learning; mathematical preliminaries for deep learning; architecture of a neural network, hidden units; loss functions: empirical loss, binary cross-entropy loss, MSE loss; gradient-based learning; back-propagation; optimization for training deep models; mini-batch gradient descent; exponentially weighted averages; bias correction in exponentially weighted averages; gradient descent with momentum.

Unit II (6 contact hours): Local/global optima; learning-rate decay; train/dev/test sets; bias/variance; underfitting and overfitting; regularization, dropout regularization, understanding dropout, other regularization methods; normalizing inputs; activation functions; vanishing/exploding gradients; weight initialization for deep networks; types of errors; bias-variance trade-off.

Unit III (8 contact hours): Images as numbers; problems in manual feature extraction; convolutional networks; the convolution operation; padding; strided convolutions; convolutions over volume; one layer of a convolutional network; a simple convolutional network; pooling layers; CNN architectures; image classification with CNNs; batch normalization; data augmentation.

Unit IV (8 contact hours): Sequence models: notation; neurons with recurrence; recurrent neural networks; encoding language for neural networks; backpropagation through time; different types of RNN; vanishing gradients with RNNs; gated recurrent unit (GRU); long short-term memory (LSTM); optimization for long-term dependencies; bidirectional RNNs; deep RNNs.

Unit V (6 contact hours): Transformers: self-attention; positional encoding; multi-head attention; the need for multiple heads; NLP applications; vision transformers.

Unit VI (6 contact hours): Generative modeling: latent space; autoencoders; variational autoencoders; generative adversarial networks; loss function and training of GANs; CycleGAN; applications of GANs.

TOTAL: 42 contact hours
11. Suggested Books

Text Books (Name of Book / Authors / Year of Publication or Reprint):
1. Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016
2. Deep Learning with Python by Francois Chollet, 2017
3. Neural Networks and Deep Learning by Michael Nielsen, 2019
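As a study aid for the optimization topics in Unit I (exponentially weighted averages, bias correction, and gradient descent with momentum), the following is a minimal sketch in plain Python. The one-parameter quadratic objective, learning rate, and decay factor are arbitrary illustrative choices, not part of the syllabus.

```python
def momentum_step(w, grad, v, t, lr=0.1, beta=0.9):
    """One gradient-descent-with-momentum update.

    v is the exponentially weighted average of past gradients;
    dividing by (1 - beta**t) is the bias correction that keeps
    early-step averages from being biased toward zero."""
    v = beta * v + (1 - beta) * grad
    v_corrected = v / (1 - beta ** t)
    w = w - lr * v_corrected
    return w, v

# Minimize the toy objective f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, v = 0.0, 0.0
for t in range(1, 201):
    grad = 2 * (w - 3)
    w, v = momentum_step(w, grad, v, t)
# After 200 steps, w is close to the minimizer 3.0.
```

In a real training loop the gradient would come from a mini-batch of data rather than a closed-form derivative, but the update rule is the same.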
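The self-attention mechanism listed under Unit V can likewise be sketched in a few lines of NumPy. This shows single-head scaled dot-product attention only; the projection matrices and the toy 4-token input are random illustrative values, and a full transformer would add multiple heads, masking, positional encoding, and learned parameters.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) input sequence.
    Wq, Wk, Wv: (d_model, d_k) projection matrices for queries,
    keys, and values."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V, weights                   # outputs, attention map

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                       # 4 tokens, model dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
```

Each output row is a convex combination of the value vectors of all positions, which is what lets every token attend to every other token in a single layer.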
