Linear (identity)
Equation:
$f(x)=x$
Derivative:
$\frac{\partial f(x)}{\partial x}=1$
Properties:
Unbounded, and the gradient is a constant 1; stacking purely linear layers therefore collapses into a single linear map.
> import tensorflow as tf
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.linear(X)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([-10.,  -9.,  -8.,  -7.,  -6.,  -5.,  -4.,  -3.,  -2.,  -1.,   0.,
         1.,   2.,   3.,   4.,   5.,   6.,   7.,   8.,   9.,  10.],
      dtype=float32)>
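
As a quick sanity check (not part of the original listing), TensorFlow's autodiff recovers the closed-form derivative; a minimal sketch reusing the X defined above:

> with tf.GradientTape() as tape:
      tape.watch(X)
      Y = tf.keras.activations.linear(X)
> tape.gradient(Y, X)  # a tensor of 21 ones: df/dx = 1 everywhere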

Sigmoid
Equation:
$\sigma(x)=\frac{1}{1+\exp(-x)}$
Derivative:
$\frac{\partial\sigma(x)}{\partial x}=\sigma(x)(1-\sigma(x))$
Properties:
Output lies in (0, 1) and saturates for large |x|; the gradient peaks at 1/4 at x = 0, which contributes to vanishing gradients in deep networks.
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.sigmoid(X)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([4.5397868e-05, 1.2339458e-04, 3.3535014e-04, 9.1105117e-04,
       2.4726230e-03, 6.6928510e-03, 1.7986210e-02, 4.7425874e-02,
       1.1920292e-01, 2.6894143e-01, 5.0000000e-01, 7.3105860e-01,
       8.8079703e-01, 9.5257413e-01, 9.8201376e-01, 9.9330717e-01,
       9.9752742e-01, 9.9908900e-01, 9.9966466e-01, 9.9987662e-01,
       9.9995458e-01], dtype=float32)>
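
The derivative formula above can be checked against autodiff in the same way; a small sketch, again reusing X:

> with tf.GradientTape() as tape:
      tape.watch(X)
      Y = tf.keras.activations.sigmoid(X)
> grads = tape.gradient(Y, X)
> tf.reduce_max(tf.abs(grads - Y * (1. - Y)))  # ~0 up to float32 rounding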

Tanh (hyperbolic tangent)
Equation:
$\tanh(x)=\frac{\exp(x)-\exp(-x)}{\exp(x)+\exp(-x)}$
Derivative:
$\frac{\partial\tanh(x)}{\partial x}=\frac{1}{\cosh^2(x)}$
Properties:
Output lies in (-1, 1) and is zero-centered, which often helps optimization, but it still saturates at both tails.
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.tanh(X)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([-1.        , -0.99999994, -0.99999976, -0.99999833, -0.9999877 ,
       -0.9999092 , -0.9993293 , -0.9950548 , -0.9640276 , -0.7615942 ,
        0.        ,  0.7615942 ,  0.9640276 ,  0.9950548 ,  0.9993293 ,
        0.9999092 ,  0.9999877 ,  0.99999833,  0.99999976,  0.99999994,
        1.        ], dtype=float32)>
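
tanh is just a shifted, rescaled sigmoid, $\tanh(x)=2\sigma(2x)-1$, which is easy to confirm numerically; a short sketch with the same X:

> Y = tf.keras.activations.tanh(X)
> tf.reduce_max(tf.abs(Y - (2. * tf.keras.activations.sigmoid(2. * X) - 1.)))
# a value on the order of 1e-7, i.e. equal up to float32 rounding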

ReLU (rectified linear unit)
Equation:
$\text{ReLU}(x)=\max(0, x)$
Derivative:
$\frac{\partial\,\text{ReLU}(x)}{\partial x}=\begin{cases}1&\text{if }x > 0\\0&\text{otherwise}\end{cases}$
Properties:
Identity for positive inputs and zero otherwise; cheap to compute and non-saturating for x > 0, though units can "die" if they stay in the negative region.
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.relu(X)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  1.,  2.,
        3.,  4.,  5.,  6.,  7.,  8.,  9., 10.], dtype=float32)>
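
Autodiff likewise recovers the step-shaped derivative; note that TensorFlow returns 0 at the non-differentiable point x = 0, matching the convention above. A minimal sketch:

> with tf.GradientTape() as tape:
      tape.watch(X)
      Y = tf.keras.activations.relu(X)
> tape.gradient(Y, X)  # 0. where x <= 0, 1. where x > 0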

Leaky ReLU
Equation:
$f(x)=\begin{cases}x&\text{if }x > 0\\ax&\text{otherwise}\end{cases}$
Derivative:
$\frac{\partial f(x)}{\partial x}=\begin{cases}1&\text{if }x > 0\\a&\text{otherwise}\end{cases}$
Parameters:
$a$: slope of the negative branch, passed as the `alpha` argument of `tf.keras.activations.relu` (0.2 in the example below).
Properties:
Behaves like ReLU for positive inputs, but the small slope $a$ keeps a nonzero gradient on the negative branch, avoiding dead units.
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.relu(X, alpha=0.2)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([-2.       , -1.8000001, -1.6      , -1.4      , -1.2      ,
       -1.       , -0.8      , -0.6      , -0.4      , -0.2      ,
        0.       ,  1.       ,  2.       ,  3.       ,  4.       ,
        5.       ,  6.       ,  7.       ,  8.       ,  9.       ,
       10.       ], dtype=float32)>
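
The same autodiff check confirms the negative branch really has slope a = 0.2; a sketch reusing X:

> with tf.GradientTape() as tape:
      tape.watch(X)
      Y = tf.keras.activations.relu(X, alpha=0.2)
> tape.gradient(Y, X)  # 0.2 on the negative branch, 1. where x > 0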

ELU (exponential linear unit)
Equation:
$f(x)=\begin{cases}x&\text{if }x > 0\\a(\exp(x)-1)&\text{otherwise}\end{cases}$
Derivative:
$\frac{\partial f(x)}{\partial x}=\begin{cases}1&\text{if }x > 0\\a\exp(x)&\text{otherwise}\end{cases}$
Parameters:
$a$: scale of the negative branch, passed as the `alpha` argument of `tf.keras.activations.elu` (default 1.0, used in the example below).
Properties:
Identity for positive inputs; for negative inputs it saturates smoothly toward $-a$, which pushes mean activations closer to zero than ReLU does.
> X = tf.linspace(-10., 10., 21)
> tf.keras.activations.elu(X)
<tf.Tensor: shape=(21,), dtype=float32, numpy=
array([-0.9999546 , -0.9998766 , -0.99966455, -0.9990881 , -0.9975212 ,
       -0.99326205, -0.9816844 , -0.95021296, -0.86466473, -0.63212055,
        0.        ,  1.        ,  2.        ,  3.        ,  4.        ,
        5.        ,  6.        ,  7.        ,  8.        ,  9.        ,
       10.        ], dtype=float32)>
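
On the negative branch the derivative can be rewritten as $a\exp(x)=f(x)+a$, so the autodiff gradient can be checked without recomputing the exponential; a sketch with the default a = 1 (the first 10 entries of X are the negative ones):

> with tf.GradientTape() as tape:
      tape.watch(X)
      Y = tf.keras.activations.elu(X)
> grads = tape.gradient(Y, X)
> tf.reduce_max(tf.abs(grads[:10] - (Y[:10] + 1.)))  # ~0: matches a*exp(x)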