Tuesday, April 25, 2023
ReLU functions
Despite their practical success, ReLU activations are theoretically problematic: the loss surfaces they produce contain numerous trivial local optima where the gradient is exactly zero, yet we optimize these networks with approximate local optimizers.
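To make the zero-gradient point concrete, here is a minimal PyTorch sketch (my own illustration, not from the post) of a single "dead" ReLU unit: its pre-activation is negative, so the gradient of the loss with respect to its weights is exactly zero and a first-order optimizer cannot move it.

```python
import torch

# A single ReLU unit: y = relu(w * x + b)  (toy values chosen so the unit is "dead")
w = torch.tensor(-1.0, requires_grad=True)
b = torch.tensor(-1.0, requires_grad=True)
x = torch.tensor(2.0)
target = torch.tensor(1.0)

y = torch.relu(w * x + b)   # pre-activation is -3, so y = 0
loss = (y - target) ** 2
loss.backward()

print(w.grad, b.grad)       # both gradients are exactly 0: gradient descent is stuck here
```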
Saturday, April 15, 2023
Optimization
#Optimization is the core of #machinelearning. All GPU acceleration, cloud systems, and transfer learning in ML ultimately serve better optimization. There have been many attempts to make it faster. However, there are no widely adopted second-order methods in the field, even though some momentum methods approximate them. I believe there is still potential in this area: a robust algorithm that is 10% better could have billions of dollars of impact on the industry. Of course, it is very hard to achieve.
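As an illustration of what second-order information buys (my own sketch, not something from the post), here is gradient descent versus Newton's method on an ill-conditioned quadratic: the first-order method needs many small steps because its step size is limited by the largest curvature, while a single Newton step, which uses the Hessian, solves the problem exactly.

```python
import numpy as np

# Minimize f(x) = 0.5 * x^T A x for an ill-conditioned A (condition number 100).
A = np.diag([1.0, 100.0])
x_gd = np.array([1.0, 1.0])
x_newton = np.array([1.0, 1.0])

# First-order: 50 gradient steps, learning rate capped by the stiff direction.
for _ in range(50):
    x_gd = x_gd - 0.009 * (A @ x_gd)

# Second-order: one Newton step x <- x - H^{-1} grad lands at the minimum.
x_newton = x_newton - np.linalg.solve(A, A @ x_newton)

print("gradient descent after 50 steps:", x_gd)      # still far from 0 in the flat direction
print("Newton after 1 step:           ", x_newton)   # exactly [0, 0]
```

In practice the difficulty is that forming or inverting the Hessian is far too expensive for large models, which is part of why no second-order method has become standard.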