Important Deep Learning Concepts Explained – Part 1

Neuron

A node in a NN, typically taking in multiple input values and generating one output value by applying an activation function (a non-linear transformation) to a weighted sum of the input values.
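As a minimal sketch of that idea (the feature values, weights, and bias below are made-up numbers, and ReLU is used as the activation):

```python
import numpy as np

def neuron(inputs, weights, bias):
    """A single neuron: weighted sum of inputs plus a bias, passed through ReLU."""
    z = np.dot(weights, inputs) + bias   # weighted sum of the input values
    return max(0.0, z)                   # non-linear activation (ReLU)

# Illustrative values: three input features, three weights, one bias
print(neuron(np.array([1.0, 2.0, 3.0]), np.array([0.5, -0.2, 0.1]), bias=0.3))  # ~0.7
```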

Weights

Edges in a NN; the goal of training is to determine the optimal weight for each feature. If a weight is 0, the corresponding feature does not contribute to the prediction.
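A quick illustration of the zero-weight case, with made-up numbers:

```python
import numpy as np

features = np.array([4.0, 7.0, 2.0])
weights  = np.array([0.8, 0.0, -1.5])   # second weight is 0, so the second feature is ignored

prediction = np.dot(weights, features)  # 0.8*4.0 + 0.0*7.0 + (-1.5)*2.0 ≈ 0.2
print(prediction)
```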

Neural Network

Composed of neurons (the simple building blocks that actually do the learning) and contains activation functions that make it possible to predict non-linear outputs.
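A rough NumPy sketch of a tiny feed-forward network, just neurons stacked in layers (the layer sizes and random values are purely illustrative):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    """Two-layer network: a ReLU hidden layer followed by a linear output layer."""
    hidden = relu(W1 @ x + b1)   # each hidden neuron: weighted sum + activation
    return W2 @ hidden + b2      # output neuron: weighted sum of hidden values

rng = np.random.default_rng(0)
x  = rng.normal(size=3)          # 3 input features
W1 = rng.normal(size=(4, 3))     # hidden layer with 4 neurons
b1 = np.zeros(4)
W2 = rng.normal(size=(1, 4))     # single output value
b2 = np.zeros(1)
print(forward(x, W1, b1, W2, b2))
```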

Activation functions

A mathematical function that introduces non-linearity into a network, e.g., ReLU or tanh.
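For example, here is how ReLU and tanh transform a few sample values (the inputs are made up):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)    # ReLU: 0 for negative inputs, identity for positive ones

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))      # [0.  0.  0.  0.5 2. ]
print(np.tanh(x))   # roughly [-0.96 -0.46  0.    0.46  0.96]
```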

Sigmoid function

A function that maps very negative numbers to values close to 0, very large numbers to values close to 1, and 0 to 0.5. Useful for predicting probabilities.
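The standard sigmoid is 1 / (1 + e^(-x)); a quick check of the behaviour described above:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(-10.0))   # ~0.000045 (very negative -> close to 0)
print(sigmoid(0.0))     # 0.5
print(sigmoid(10.0))    # ~0.99995  (large positive -> close to 1)
```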

Gradient Descent/Backpropagation

The fundamental loss-optimization algorithms on which most other optimizers are based. Backpropagation applies gradient descent to neural nets: it computes the gradient of the loss with respect to each weight, layer by layer, so the weights can be nudged in the direction that reduces the loss.
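A minimal sketch of gradient descent on a one-weight linear model (the toy data roughly follows y = 3x and the learning rate is an arbitrary choice):

```python
import numpy as np

# Made-up toy data roughly following y = 3x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 5.9, 9.2, 11.8])

w, lr = 0.0, 0.01                        # initial weight and learning rate
for _ in range(200):
    pred = w * x
    grad = np.mean(2 * (pred - y) * x)   # gradient of the mean squared error w.r.t. w
    w -= lr * grad                       # step in the direction that reduces the loss
print(w)                                 # converges to roughly 3
```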

Optimizer

An operation that adjusts the weights and biases to reduce the loss, e.g., Adagrad or Adam.
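A minimal PyTorch sketch showing where the optimizer fits into a training loop (the tiny model and random data are made up for illustration):

```python
import torch

# Made-up tiny model and random data
model = torch.nn.Linear(3, 1)
x = torch.randn(8, 3)
y = torch.randn(8, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for _ in range(100):
    optimizer.zero_grad()           # clear gradients from the previous step
    loss = loss_fn(model(x), y)     # measure how far predictions are from targets
    loss.backward()                 # backpropagation: compute gradients
    optimizer.step()                # Adam update: change weights and biases to reduce loss
```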

Weights/Biases

Weights are the values that the input features are multiplied by to predict an output value. The bias is the value of the output when every weight is 0 (i.e., the intercept term).
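A small numeric illustration (all values are made up):

```python
import numpy as np

features = np.array([2.0, 5.0])
weights  = np.array([0.4, -0.1])
bias     = 1.5

print(np.dot(weights, features) + bias)      # 0.8 - 0.5 + 1.5 = 1.8
print(np.dot(np.zeros(2), features) + bias)  # with all weights 0, the output is just the bias: 1.5
```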

Important Deep Learning Concepts Explained – Part 2
