Python Syllabus

The Deep Learning Nanodegree program offers you a solid introduction to the world of artificial intelligence. In this program, you’ll master fundamentals that will enable you to go further in the field or launch a brand new career. You will study cutting-edge topics such as neural networks, convolutional neural networks, recurrent neural networks, and generative adversarial networks. Plus, you’ll build projects in PyTorch. Join the next generation of deep learning talent that will help define a highly beneficial AI-powered future for our world.

Navratri Special Discount Offer: 50%

Available Seats: 30

Schedule


Weekdays: 5:00 pm - 7:00 pm

Weekend: Saturday - Sunday, 11:00 am - 2:00 pm

1 Deep Learning

• Explain the difference between artificial intelligence, machine learning, and deep learning.
• Recognize the power of deep learning by reviewing popular examples of deep learning applications.

2 Minimizing the Error Function with Gradient Descent

• Use PyTorch to preprocess data.
• Use maximum likelihood, cross-entropy, and probability to measure model performance.
• Apply gradient descent to minimize error.
• Implement a backpropagation algorithm.
• Identify key components of perceptrons.
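
As a reference for this module, here is a minimal sketch of gradient descent minimizing a cross-entropy error with PyTorch autograd; the toy data, learning rate, and number of steps are made up for illustration.

```python
import torch

# Hypothetical toy data: 100 points with 2 features and binary labels.
torch.manual_seed(0)
X = torch.randn(100, 2)
y = (X[:, 0] + X[:, 1] > 0).float()

# Parameters of a single perceptron-style unit.
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.1
for step in range(200):
    # Forward pass: compute logits, then the binary cross-entropy
    # (the negative log-likelihood of the labels).
    logits = X @ w + b
    loss = torch.nn.functional.binary_cross_entropy_with_logits(logits, y)

    # Backward pass: autograd computes dloss/dw and dloss/db.
    loss.backward()

    # Gradient descent update, done outside the autograd graph.
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(f"final loss: {loss.item():.4f}")
```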

3 Introduction to Neural Networks

• Explain essential concepts in neural networks.
• Design neural network architectures.
• Distinguish between problems based on the objective of the model.
• Implement appropriate architectures for model objectives.

4 Training Neural Networks

• Define a loss function and optimization method to train a neural network.
• Distinguish between overfitting and underfitting, and identify the causes of each.
• Optimize the training process using early stopping, regularization, dropout, learning rate decay, and momentum.
• Distinguish between batch and stochastic gradient descent.
• Build a neural network with PyTorch and run data through it.
• Test and validate a trained network to ensure it generalizes.
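
A minimal training-loop sketch for this module, assuming random tensors stand in for a real train/validation split; it shows a loss function, an SGD optimizer with momentum, dropout for regularization, and a validation pass to check generalization.

```python
import torch
from torch import nn

# Hypothetical data standing in for a real train/validation split.
torch.manual_seed(0)
X_train, y_train = torch.randn(512, 20), torch.randint(0, 3, (512,))
X_val, y_val = torch.randn(128, 20), torch.randint(0, 3, (128,))

# A small feedforward network with dropout for regularization.
model = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 3),
)
criterion = nn.CrossEntropyLoss()  # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.05, momentum=0.9)

for epoch in range(20):
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(X_train), y_train)  # full-batch gradient descent
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():  # validation pass
        val_loss = criterion(model(X_val), y_val)
    print(f"epoch {epoch:2d}  train {loss.item():.3f}  val {val_loss.item():.3f}")
```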

5 Introduction to CNNs

• List main applications of CNNs.
• Understand professional roles involved in the development of a CNN-based application.
• Understand the main events in the history of CNNs.

6 CNN Concepts

• Recap training networks in PyTorch.
• Use a multi-layer perceptron (MLP) for image classification.
• Understand the limitations of MLPs when applied to images.
• Learn the basic concepts of CNNs that make them great at tasks involving images.

7 CNNs in Depth

• Learn how to use the basic layers of CNNs.
• Put all layers together to build a CNN from scratch.
• Classify images using a CNN built from scratch.
• Improve the performance of your CNN.
• Export a model for production.
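
A sketch of a CNN built from scratch in PyTorch, assuming CIFAR-sized 3x32x32 inputs; the TorchScript export at the end is one possible way to prepare a model for production.

```python
import torch
from torch import nn

class SmallCNN(nn.Module):
    """A minimal CNN for 3x32x32 images (e.g. CIFAR-10-sized inputs)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 3x32x32 -> 16x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x16x16
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCNN()
dummy = torch.randn(4, 3, 32, 32)  # a fake batch of 4 images
print(model(dummy).shape)          # torch.Size([4, 10])

# One way to export for production: serialize the model as TorchScript.
scripted = torch.jit.script(model)
scripted.save("small_cnn.pt")
```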

8 Transfer Learning

• Understand key CNN architectures and their innovations.
• Apply multiple ways of adapting pre-trained networks using transfer learning.
• Fine-tune a pre-trained network on a new dataset.
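
A sketch of transfer learning with a pre-trained network, assuming a recent torchvision installation (the weights are downloaded on first use); resnet18 and the 5-class head are only illustrative choices.

```python
import torch
from torch import nn
from torchvision import models

# Load a network pre-trained on ImageNet.
model = models.resnet18(weights="DEFAULT")

# Feature extraction: freeze all pre-trained parameters...
for param in model.parameters():
    param.requires_grad = False

# ...and replace the final classification layer for a new 5-class dataset.
model.fc = nn.Linear(model.fc.in_features, 5)

# Only the new layer is optimized here; for full fine-tuning, skip the
# freezing loop and pass model.parameters() to the optimizer instead.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable, "trainable parameters")
```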

9 Autoencoders

• Understand linear and CNN-based autoencoders.
• Design and train a linear autoencoder for anomaly detection.
• Design and train a CNN autoencoder for anomaly detection and image denoising.
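
A sketch of a linear autoencoder and how its reconstruction error can flag anomalies; the 784-dimensional inputs (flattened 28x28 images) and the random batch are assumptions for illustration.

```python
import torch
from torch import nn

class LinearAutoencoder(nn.Module):
    """Compress inputs to a small bottleneck and reconstruct them."""

    def __init__(self, in_dim: int = 784, bottleneck: int = 32):
        super().__init__()
        self.encoder = nn.Linear(in_dim, bottleneck)
        self.decoder = nn.Linear(bottleneck, in_dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = LinearAutoencoder()
criterion = nn.MSELoss()  # reconstruction error
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.rand(64, 784)   # hypothetical batch of flattened images
for step in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), x)  # learn to reconstruct the input itself
    loss.backward()
    optimizer.step()

# For anomaly detection, samples with unusually high reconstruction
# error are flagged as anomalies.
errors = ((model(x) - x) ** 2).mean(dim=1)
print(errors.topk(3).indices)  # the 3 worst-reconstructed samples
```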

10 Object Detection and Segmentation

• Understand the architecture of an object detection model.
• Train and evaluate an object detection model.
• Understand the architecture of a semantic segmentation model.
• Train and evaluate a semantic segmentation model.
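
A sketch of the object detection side of this module, assuming a recent torchvision; Faster R-CNN pre-trained on COCO is one example architecture, and the weights are downloaded on first use.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Load a detection model pre-trained on COCO.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# The model takes a list of 3xHxW image tensors with values in [0, 1].
images = [torch.rand(3, 300, 400)]
with torch.no_grad():
    predictions = model(images)

# Each prediction is a dict of bounding boxes, class labels, and scores.
print(predictions[0]["boxes"].shape)
print(predictions[0]["labels"][:5], predictions[0]["scores"][:5])
```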

11 Recurrent Neural Networks

• Explain how RNNs evolved from feedforward neural networks.
• Recognize the benefit of RNNs by reviewing the applications of RNNs in areas like machine translation.
• Perform backpropagation on an RNN.
• Apply the SkipGram Word2Vec technique to implement custom word embeddings.
• Explain the limitations of simple RNNs and how they can be overcome by using long short-term memory networks (LSTMs).

12 Long Short-Term Memory Networks (LSTMs)

• Understand the functioning of the LSTM via the four LSTM gates: the learning gate, the forget gate, the remember gate, and the use gate.
• Compare the LSTM with architectures such as the GRU, which can reveal new modeling techniques when combined with the LSTM.

13 Implementation of RNN and LSTMs

• Train a simple RNN in PyTorch to do time series prediction.
• Implement a character level sequence RNN.
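
A sketch of a simple RNN trained for time series prediction in PyTorch; the sine-wave data, window length, and hidden size are made up for illustration.

```python
import torch
from torch import nn

# Predict the next value of a sine wave from a short history of past values.
t = torch.linspace(0, 20, 400)
series = torch.sin(t)

seq_len = 20
# Build (input window, next value) pairs from the series.
X = torch.stack([series[i:i + seq_len] for i in range(len(series) - seq_len)])
y = series[seq_len:]
X = X.unsqueeze(-1)  # shape: (batch, seq_len, 1 feature)

class SimpleRNN(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.rnn(x)           # out: (batch, seq_len, hidden)
        return self.fc(out[:, -1, :])  # predict from the last hidden state

model = SimpleRNN()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(X).squeeze(-1), y)
    loss.backward()
    optimizer.step()
print(f"final training loss: {loss.item():.5f}")
```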

14 Fine Tuning RNN Models

• Fine-tune RNN models by adjusting their hyperparameters.
• Apply key hyperparameters such as learning rate, minibatch size, number of epochs, and number of layers.
• Identify possible starting values and intuitions for the hyperparameters used in RNNs.

15 Seq2Seq Architecture

• Implement the components of a Seq2Seq architecture to produce a sequence of words in response to input prompts.
• Implement the key components of a Seq2Seq architecture and understand the way they interact.
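
A bare-bones Seq2Seq sketch: a GRU encoder compresses the source sequence into a context vector, and a GRU decoder generates output logits from it with teacher forcing; the vocabulary sizes, dimensions, and random token batches are assumptions for illustration.

```python
import torch
from torch import nn

class Encoder(nn.Module):
    def __init__(self, vocab_size=100, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.gru = nn.GRU(emb, hidden, batch_first=True)

    def forward(self, src):
        _, hidden = self.gru(self.embed(src))
        return hidden                          # the context vector

class Decoder(nn.Module):
    def __init__(self, vocab_size=100, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.gru = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tgt, hidden):
        output, hidden = self.gru(self.embed(tgt), hidden)
        return self.out(output), hidden        # logits over the vocabulary

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, 100, (8, 12))           # batch of 8 source sequences
tgt = torch.randint(0, 100, (8, 10))           # batch of 8 target sequences
context = encoder(src)
logits, _ = decoder(tgt, context)              # teacher forcing on the target
print(logits.shape)                            # torch.Size([8, 10, 100])
```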

16 The Limitations of RNNs

• Use recent architectures such as Transformers and BERT to address the limitations of RNNs at solving NLP problems.
• Identify the changes in architecture that occurred during the transition from recurrent networks to Transformer networks.
• Use newer Transformer architectures such as BERT and GPT-3.
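
A short sketch of using pre-trained Transformer models, assuming the Hugging Face transformers library is installed (pip install transformers); the pipelines download pre-trained weights on first use.

```python
from transformers import pipeline

# A BERT-style model fine-tuned for classification, in a few lines.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers handle long-range context better than RNNs."))

# Masked-token filling, the pre-training task behind BERT.
fill = pipeline("fill-mask", model="bert-base-uncased")
print(fill("Deep learning is a branch of [MASK] intelligence.")[0]["token_str"])
```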

17 Generative Adversarial Networks

• Build a generator and a discriminator using fully connected layers.
• Implement loss functions.
• Train a custom GAN on the MNIST dataset.
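
A sketch of a fully connected generator and discriminator with their losses; a random batch in [-1, 1] stands in for real MNIST images, and the layer sizes are illustrative.

```python
import torch
from torch import nn

# Generator: maps a latent noise vector to a flattened 28x28 "image".
generator = nn.Sequential(
    nn.Linear(64, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 784), nn.Tanh(),
)

# Discriminator: maps a flattened image to a real/fake logit.
discriminator = nn.Sequential(
    nn.Linear(784, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),
)

bce = nn.BCEWithLogitsLoss()
real_images = torch.rand(32, 784) * 2 - 1  # stand-in for an MNIST batch in [-1, 1]
z = torch.randn(32, 64)
fake_images = generator(z)

# Discriminator loss: real images labeled 1, generated images labeled 0.
d_loss = (bce(discriminator(real_images), torch.ones(32, 1))
          + bce(discriminator(fake_images.detach()), torch.zeros(32, 1)))

# Generator loss: fool the discriminator into labeling fakes as real.
g_loss = bce(discriminator(fake_images), torch.ones(32, 1))
print(d_loss.item(), g_loss.item())
```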

18 Training Deep Convolutional GANs

• Build generator and discriminator using convolutional, batch normalization, and fully connected layers.
• Train a DCGAN model on the CIFAR-10 dataset.
• Implement evaluation metrics and evaluate generated samples.

19 Image-to-Image Translation

• Implement an unpaired data loader.
• Build the CycleGAN generator using residual connections and an encoder-decoder structure.
• Train a CycleGAN model on the summer2winter Yosemite dataset.

20 Modern GANs

• Implement Wasserstein loss and gradient penalties.
• Build the ProGAN generator.
• Implement StyleGAN components (adaptive instance normalization).
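
A sketch of the Wasserstein loss with a gradient penalty (WGAN-GP style); the linear critic, the random batches, and the penalty weight of 10 are assumptions made only to show how the loss terms fit together.

```python
import torch

def gradient_penalty(critic, real, fake):
    """Push the critic's gradient norm toward 1 on random
    interpolations between real and generated samples."""
    alpha = torch.rand(real.size(0), 1)
    mixed = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = critic(mixed)
    grads, = torch.autograd.grad(outputs=scores.sum(), inputs=mixed,
                                 create_graph=True)
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()

# Hypothetical critic and batches, just to show the loss terms.
critic = torch.nn.Linear(784, 1)
real = torch.randn(16, 784)
fake = torch.randn(16, 784)

# Wasserstein critic loss (score fakes low, reals high) plus the penalty.
critic_loss = (critic(fake).mean() - critic(real).mean()
               + 10.0 * gradient_penalty(critic, real, fake))
generator_loss = -critic(fake).mean()
print(critic_loss.item(), generator_loss.item())
```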

Unsupervised Learning

a. Learn the k-means ML algorithm.
b. Examples and case studies
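
As a small example for this section, a k-means sketch using scikit-learn on synthetic 2-D data with two obvious groups (the data and the choice of k = 2 are made up for illustration).

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical 2-D data drawn from two well-separated blobs.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])

# Fit k-means with k = 2 clusters.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
print(kmeans.cluster_centers_)  # learned cluster centers
print(kmeans.labels_[:5])       # cluster assignment of the first 5 points
```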

Dimension Reduction

a. Data dimensionality reduction concepts using PCA
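
A short PCA sketch using scikit-learn; the synthetic 5-dimensional data and the choice of 2 components are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical 5-dimensional data reduced to 2 principal components.
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 5))

pca = PCA(n_components=2)
reduced = pca.fit_transform(data)
print(reduced.shape)                  # (200, 2)
print(pca.explained_variance_ratio_)  # variance captured by each component
```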

Advanced Machine Learning Algorithms

a. Optimal Solution
b. Regularization
c. Ridge and Lasso
d. Model Selection
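
A sketch tying regularization, ridge/lasso, and model selection together with scikit-learn; the synthetic regression data and the alpha values are illustrative.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import cross_val_score

# Hypothetical regression data: 5 informative features out of 20.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=200)

# Ridge (L2) shrinks coefficients; Lasso (L1) drives many to exactly zero.
for model in (Ridge(alpha=1.0), Lasso(alpha=0.1)):
    scores = cross_val_score(model, X, y, cv=5)  # model selection via CV
    model.fit(X, y)
    nonzero = int(np.sum(model.coef_ != 0))
    print(type(model).__name__,
          f"CV R^2 = {scores.mean():.3f}",
          f"non-zero coefficients = {nonzero}")
```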

Deep Learning

a. Learn the neural network ML algorithm.
b. Examples and case studies