1.1 Fully connected structure

image.png

1.2 Neural structure

image.png

a = h(w * x + b), where w is the weight, b the bias, and h the activation function

Multiple neurons form a neural network; the input layer x is mapped to the output layer y according to the network's structure.
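A minimal sketch of a single neuron computing a = h(w * x + b), in plain Python (the function names `sigmoid` and `neuron` are illustrative, not from the notes):

```python
import math

def sigmoid(z):
    # an example activation h; any nonlinearity could be substituted
    return 1.0 / (1.0 + math.exp(-z))

def neuron(w, x, b, h=sigmoid):
    # a = h(w . x + b): weighted sum of the inputs plus a bias, passed through h
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return h(z)

# here w . x + b = 0.5*1.0 + (-0.3)*2.0 + 0.1, which is approximately 0
a = neuron(w=[0.5, -0.3], x=[1.0, 2.0], b=0.1)
```

A layer is just many such neurons sharing the same input x; a network stacks layers so that one layer's outputs become the next layer's inputs.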

1.3 Activation function

Q: Why do we need an activation function? A: Suppose there are 3 layers and each layer just multiplies its input by n. Then y = n^3 * x; letting m = n^3, this is y = m * x — a single linear map. Without an activation function, adding depth is useless, because any stack of linear layers collapses into one linear layer.
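The collapse argument above can be checked directly: applying the multiply-by-n "layer" three times gives exactly the same result as one multiply by m = n^3.

```python
# Three purely linear "layers", each just multiplying by n
n, x = 2.0, 3.0

y_stacked = x
for _ in range(3):          # pass x through three linear layers in sequence
    y_stacked = n * y_stacked

m = n ** 3                  # the whole stack collapses to one multiply
y_single = m * x

assert y_stacked == y_single   # depth added nothing without a nonlinearity
```

Inserting a nonlinear h between the layers breaks this collapse, which is what lets deeper networks represent functions a single linear layer cannot.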

1.3.1 Sigmoid

image.png

Advantages: smooth and differentiable everywhere; output lies in (0, 1), so it can be read as a probability.

Disadvantages: saturates for large |x|, so gradients vanish in deep networks; output is not zero-centered; exp is relatively expensive to compute.
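The vanishing-gradient disadvantage can be seen from sigmoid's derivative, sigma'(z) = sigma(z)(1 - sigma(z)), which peaks at 0.25 at z = 0 and is nearly zero in the saturated tails (helper names here are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    # sigma'(z) = sigma(z) * (1 - sigma(z)); maximum value 0.25 at z = 0
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid_grad(0.0))    # 0.25 at the center
print(sigmoid_grad(10.0))   # tiny (~5e-5) deep in the saturated tail
```

Backpropagation multiplies such factors layer by layer, so a chain of saturated sigmoids shrinks the gradient toward zero.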

1.3.2 tanh

image.png
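Tanh is a rescaled sigmoid: tanh(x) = 2 * sigma(2x) - 1, so its output lies in (-1, 1) and is zero-centered, which is its main advantage over sigmoid (it still saturates at the tails). A small check of this identity:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# tanh(x) = 2 * sigmoid(2x) - 1: same S-shape, shifted and scaled to (-1, 1)
for x in (-2.0, 0.0, 1.5):
    assert abs(math.tanh(x) - (2.0 * sigmoid(2.0 * x) - 1.0)) < 1e-12
```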

1.3.3 ReLU

image.png

ReLU sets negative inputs directly to 0, so a neuron whose input stays negative outputs 0 (and gets zero gradient) forever — the "dead neuron" problem.

1.3.4 Leaky ReLU
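Leaky ReLU addresses the dead-neuron problem by giving the negative side a small slope alpha (commonly 0.01) instead of a hard zero, so some gradient always flows. A minimal comparison:

```python
def relu(z):
    # hard zero on the negative side: zero output AND zero gradient there
    return max(0.0, z)

def leaky_relu(z, alpha=0.01):
    # small slope alpha keeps the negative side alive
    return z if z > 0 else alpha * z

print(relu(-3.0))        # 0.0 — the unit can "die" here
print(leaky_relu(-3.0))  # small negative value (about -0.03), gradient is alpha, not 0
```

Both behave identically for positive inputs; the only change is the negative branch.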