
Course 6 : Deep Learning – Primer (TF/Keras)

About Course

This Deep Learning primer begins with the perceptron and the Multi-Layer Perceptron (MLP) in scikit-learn (sklearn). Sklearn's Perceptron serves as an entry point, offering a fundamental understanding of single-layer neural networks for binary classification tasks.
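A minimal sketch of what the sklearn entry point looks like in practice; the synthetic dataset and parameter choices here are illustrative, not taken from the course materials.

```python
# Train sklearn's Perceptron on a simple binary classification task.
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

# Synthetic two-class dataset (illustrative only).
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = Perceptron(max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # mean accuracy on held-out data
```

Because the perceptron is a single linear layer, it can only separate classes with a linear decision boundary, which is exactly the limitation the MLP addresses next.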

Moving beyond that, sklearn's MLP expands the scope, enabling exploration of multi-layer architectures capable of modelling more intricate, non-linear relationships in data. Transitioning to TensorFlow (TF) and Keras, the primer then delves into deeper concepts of deep learning.
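To illustrate the non-linear capability the paragraph above describes, here is a hedged sketch using sklearn's `MLPClassifier` on the classic two-moons dataset, which a single-layer perceptron cannot separate; the hidden-layer sizes are an arbitrary choice for the example.

```python
# Fit a small multi-layer perceptron on non-linearly separable data.
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

# Two interleaving half-moons: a non-linear decision boundary is required.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16, 16),  # two hidden layers (illustrative)
                    activation="relu",
                    max_iter=2000,
                    random_state=0)
mlp.fit(X, y)
print(mlp.score(X, y))  # training accuracy
```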

TF/Keras provides a robust platform, empowering users with the flexibility and scalability required for constructing complex neural networks.

Within this framework, participants are introduced to crucial components such as layers, optimizers, and loss functions.
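These three components come together in a few lines of Keras. The following is a minimal sketch, not a model from the course: the layer sizes, input shape, and optimizer settings are placeholder assumptions.

```python
# A minimal Keras model showing layers, an optimizer, and a loss function.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(10,)),                        # 10 input features (assumed)
    keras.layers.Dense(32, activation="relu"),       # hidden layer
    keras.layers.Dense(1, activation="sigmoid"),     # binary output
])

# The optimizer and loss function are supplied at compile time.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Swapping in a different architecture, optimizer, or loss is a matter of changing the corresponding argument, which is the flexibility the course builds on.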

Through practical exercises, learners gain hands-on experience in constructing and training deep neural networks for a diverse array of tasks, spanning from image classification to natural language processing.

Mastering TF/Keras equips individuals with the foundational knowledge necessary for delving into advanced deep learning methodologies and real-world applications.


What Will You Learn?

  • The basics of neural networks starting from perceptron and extending to Multi-Layer Perceptron (MLP) using scikit-learn.
  • Transitioning to TensorFlow (TF) and Keras, gaining a deeper understanding of deep learning concepts and methodologies.
  • Constructing and training neural networks with TensorFlow and Keras, including customization of architectures, activation functions, and optimization algorithms.
  • Exploring various deep learning applications such as image classification, natural language processing, and more.
  • Hands-on experience through practical exercises, enabling you to build and train deep neural networks for real-world tasks.
  • Understanding key components of TF/Keras including layers, optimizers, and loss functions, and how they contribute to model performance.
  • Gaining insights into advanced deep learning techniques and best practices for improving model accuracy and efficiency.

Course Content

What are activation functions?
Activation functions are crucial components of neural networks that introduce non-linearity, allowing neural networks to learn complex patterns in data. They operate on the weighted sum of inputs to a neuron and produce an output, which is then passed to the next layer of the network.

  • Heaviside Step function
  • Sigmoid
  • Hyperbolic Tangent (Tanh)
  • Rectified Linear Unit (ReLU)
  • Leaky ReLU
  • Parametric ReLU (PReLU)
  • Exponential Linear Unit (ELU)
  • Softmax
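A few of the functions listed above can be sketched directly in NumPy to make the non-linearity concrete; these are standard textbook definitions, written out here for illustration.

```python
import numpy as np

def sigmoid(x):
    """Squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Zero for negative inputs, identity for positive inputs."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU, but with a small slope for negative inputs."""
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    """Converts a vector of scores into a probability distribution."""
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))
print(softmax(x))  # entries sum to 1
```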

Perceptrons & Multi-Layer Perceptrons

TF/Keras – layers

Loss functions

Data loading

Metrics

Model building using Keras

Hyperparameter Tuning

Applying Keras

Keras utilities

Student Ratings & Reviews

No Review Yet