Tuesday, April 25, 2023

ReLU functions

Despite their practical success, ReLU activations are theoretically problematic. The loss landscape of a ReLU network contains numerous trivial local optima where the gradient is exactly zero, for instance a "dead" unit whose pre-activation is negative on every input and therefore receives no gradient signal at all, yet we optimize these networks with approximate, gradient-based local optimizers.
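
As a concrete illustration, here is a minimal sketch of that zero-gradient effect. It assumes PyTorch, which the post itself does not mention, and the shapes and the bias value are hypothetical choices made purely for the demonstration.

```python
import torch

# ReLU has zero gradient on its negative side: relu'(z) = 0 for z < 0.
x = torch.tensor([-2.0, -0.5, 1.0], requires_grad=True)
torch.relu(x).sum().backward()
print(x.grad)  # tensor([0., 0., 1.]) -- negative inputs get no gradient

# A "dead" unit: with a sufficiently negative bias (hypothetical value),
# the pre-activation is negative for every input, so the gradients of the
# unit's parameters are exactly zero and gradient descent can never
# revive it -- a trivial stationary point of the kind described above.
w = torch.randn(3, requires_grad=True)
b = torch.tensor(-100.0, requires_grad=True)
data = torch.randn(8, 3)  # pre-activations data @ w + b are all < 0 here
torch.relu(data @ w + b).sum().backward()
print(w.grad, b.grad)  # both all zeros: the optimizer receives no signal
```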


AI

Despite the benefits of AI, we are starving for humanity.