(3rd-June-2020)
• Logistic functions are often used in neural networks to introduce nonlinearity in the model and/or to clamp signals to within a specified range. A popular neural net element computes a linear combination of its input signals, and applies a bounded logistic function to the result; this model can be seen as a "smoothed" variant of the classical threshold neuron.
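The "smoothed threshold" idea above can be sketched in a few lines. This is an illustrative sketch, not any particular library's API: the names `smooth_neuron` and `threshold_neuron` are made up for the example, which contrasts the hard 0/1 step output with the bounded logistic output of the same linear combination.

```python
import math

def logistic(x):
    # Bounded "squashing" function: maps any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def threshold_neuron(weights, inputs, bias):
    # Classical threshold neuron: hard 0/1 output
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 if s >= 0 else 0.0

def smooth_neuron(weights, inputs, bias):
    # Same linear combination, but passed through the logistic
    # function: a "smoothed" variant of the threshold neuron
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return logistic(s)
```

Unlike the step function, the logistic output varies continuously with the inputs, which is what makes gradient-based training possible.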
• A common choice for the activation or "squashing" function, used to clip large-magnitude signals and keep the response of the neural network bounded, is f(a) = 1/(1 + e^(−a)),
• which is a logistic function. This choice simplifies the implementation of artificial neurons and of the networks built from them. Practitioners note that sigmoidal functions which are antisymmetric about the origin (e.g. the hyperbolic tangent) often lead to faster convergence when training networks with backpropagation.
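The hyperbolic tangent mentioned above is itself just a rescaled, recentered logistic function, tanh(x) = 2·σ(2x) − 1, which is what makes it antisymmetric (odd) about the origin and bounded in (−1, 1) rather than (0, 1). A small numerical check of this identity:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# tanh is an antisymmetric rescaling of the logistic function:
# tanh(x) = 2*logistic(2x) - 1, odd about the origin, range (-1, 1)
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert abs(math.tanh(x) - (2.0 * logistic(2.0 * x) - 1.0)) < 1e-12
```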
• The logistic function is itself the derivative of another proposed activation function, the softplus, f(x) = ln(1 + e^x).
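The softplus relationship can be verified numerically: differentiating ln(1 + e^x) gives e^x/(1 + e^x) = 1/(1 + e^(−x)), the logistic function. A sketch comparing a central finite difference of softplus against the logistic function directly:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    # softplus(x) = ln(1 + e^x), a smooth approximation to max(0, x)
    return math.log1p(math.exp(x))

# The derivative of softplus is the logistic function:
# d/dx ln(1 + e^x) = e^x / (1 + e^x) = 1 / (1 + e^(-x))
h = 1e-6
for x in [-3.0, -1.0, 0.0, 1.0, 3.0]:
    numeric = (softplus(x + h) - softplus(x - h)) / (2.0 * h)
    assert abs(numeric - logistic(x)) < 1e-6
```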