
Deep Learning

To understand Deep Learning, you first need to understand the nature of data and how it behaves. The goal here is to make the data work for us.

Deep Learning largely eliminates manual Feature Extraction, a step that earlier machine learning pipelines required before training.

Blogs

  1. Raúl Gómez blog

Optimizers

  1. Adam

  2. RMSProp — from the TensorFlow perspective, explore the RMSProp optimizer function
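As a minimal sketch of the idea behind RMSProp (this is the standard update rule in scalar form, not TensorFlow's implementation; the learning rate, decay, and toy objective are illustrative):

```python
def rmsprop_step(w, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSProp update for a single scalar parameter.

    `cache` keeps a moving average of squared gradients, so steps are
    scaled down along directions with consistently large gradients.
    """
    cache = decay * cache + (1 - decay) * grad ** 2
    w = w - lr * grad / (cache ** 0.5 + eps)
    return w, cache

# Minimise f(w) = w^2 (gradient 2w), starting from w = 5.0
w, cache = 5.0, 0.0
for _ in range(500):
    grad = 2 * w
    w, cache = rmsprop_step(w, grad, cache)
```

After 500 steps, `w` has been driven close to the minimum at 0. Adam extends this scheme by additionally keeping a moving average of the gradients themselves.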

Learning Algorithms

  1. Mini-Batch Gradient Descent
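A hedged sketch of Mini-Batch Gradient Descent: instead of computing the gradient on the full dataset or a single example, average it over small shuffled batches. The toy problem (fitting y = 2x with one weight), batch size, and learning rate are illustrative:

```python
import random

def minibatch_gd(data, w=0.0, lr=0.01, batch_size=4, epochs=50):
    """Fit y = w * x by averaging the MSE gradient over small batches."""
    for _ in range(epochs):
        random.shuffle(data)  # re-shuffle so batches differ each epoch
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Gradient of 0.5 * (w*x - y)^2 with respect to w, batch-averaged
            grad = sum((w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad
    return w

data = [(x, 2.0 * x) for x in range(1, 9)]
w = minibatch_gd(data)  # converges near the true weight 2.0
```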

Loss Functions

  1. Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names
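The two core formulas the blog post above disentangles can be sketched in a few lines; these are the standard textbook definitions, and the variable names are my own:

```python
import math

def binary_cross_entropy(y_true, p):
    """BCE for one example: -[y*log(p) + (1-y)*log(1-p)]."""
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

def categorical_cross_entropy(y_true, probs):
    """CCE with a one-hot target: -sum_i y_i * log(p_i)."""
    return -sum(y * math.log(p) for y, p in zip(y_true, probs))

bce = binary_cross_entropy(1, 0.9)                        # confident and correct: small loss
cce = categorical_cross_entropy([0, 1, 0], [0.1, 0.8, 0.1])
```

With a one-hot target, CCE reduces to the negative log-probability of the true class, which is why it is often called "negative log-likelihood" as well.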

Distances

  1. L0, L1, L2 Norm
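A small sketch of the three norms listed above; the example vector is illustrative:

```python
def l0_norm(v):
    """Count of non-zero entries (not a true norm, but standard usage)."""
    return sum(1 for x in v if x != 0)

def l1_norm(v):
    """Sum of absolute values (Manhattan distance from the origin)."""
    return sum(abs(x) for x in v)

def l2_norm(v):
    """Euclidean length: square root of the sum of squares."""
    return sum(x * x for x in v) ** 0.5

v = [3, 0, -4]  # l0 = 2, l1 = 7, l2 = 5
```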

Activation Functions

What are Activation Functions? An Activation Function is one component of a Neuron, and Layers are composed of Neurons.

  1. ReLU (Rectified Linear Unit)
    1. A Practical Guide to ReLU

12 Different Types of Activation Functions from V7 Labs
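ReLU itself is a one-liner: it passes positive inputs through unchanged and clamps negative inputs to zero, which is what makes it so cheap to compute.

```python
def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

outputs = [relu(x) for x in [-2.0, -0.5, 0.0, 1.5]]  # negatives clamp to 0
```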

CNNs

  1. An Intuitive Explanation of Convolutional Neural Networks
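The core operation that article builds intuition for can be sketched directly: slide a small kernel over the image and sum the element-wise products at each position. (Strictly this is cross-correlation, which is what most deep learning frameworks implement under the name "convolution"; the image and kernel below are illustrative.)

```python
def conv2d(image, kernel):
    """Valid (no-padding) 2-D convolution of a list-of-lists image."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

image = [[1, 1, 0],
         [0, 1, 1],
         [0, 0, 1]]
edge = [[1, -1],
        [1, -1]]  # a simple vertical-edge detector
feature_map = conv2d(image, edge)
```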

Debugging

  1. ipdb (the IPython debugger)
  2. Unittest
  3. Disabling Regularization
  4. No Visualization
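The "Unittest" item above amounts to pinning down small pieces of the model with tests before training at scale. A hedged sketch using Python's built-in `unittest` (the `relu` function under test is illustrative):

```python
import unittest

def relu(x):
    """The function under test: max(0, x)."""
    return max(0.0, x)

class TestRelu(unittest.TestCase):
    def test_negative_is_clamped(self):
        self.assertEqual(relu(-3.0), 0.0)

    def test_positive_passes_through(self):
        self.assertEqual(relu(2.5), 2.5)

# Run the suite programmatically rather than via unittest.main()
suite = unittest.TestLoader().loadTestsFromTestCase(TestRelu)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Catching a broken building block here is far cheaper than discovering it through a model that silently fails to learn.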

Appendix

Neuron

The artificial Neuron is inspired by biological neurons, which communicate through electrical impulses.
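A minimal sketch of the artificial analogue: a weighted sum of inputs plus a bias, passed through an activation function (sigmoid here; the weights and inputs are illustrative):

```python
import math

def sigmoid(z):
    """Squashes any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    """Weighted sum plus bias, then a non-linear activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

out = neuron([1.0, 2.0], [0.5, -0.25], bias=0.0)  # z = 0, so output = 0.5
```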