Perturbed gradient descent with backprop is what we were doing in the 90s. It feels like there are some new tricks but mostly what I see is the result of GPUs and cheap memory.
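For anyone who hasn't seen it, "perturbed gradient descent" just means ordinary gradient steps with a bit of random noise injected, which helps escape saddle points. Here's a toy sketch (not the commenter's setup; the objective, learning rate, and noise scale are all made up for illustration):

```python
import random

def grad(x):
    # gradient of f(x, y) = x^2 - y^2, which has a saddle at the origin
    return [2 * x[0], -2 * x[1]]

def perturbed_gd(x, lr=0.1, noise=1e-3, steps=200):
    # plain gradient descent, but each gradient component is perturbed
    # with small Gaussian noise so the iterate can slide off the saddle
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * (gi + random.gauss(0, noise))
             for xi, gi in zip(x, g)]
    return x

random.seed(0)
x = perturbed_gd([1e-9, 1e-9])  # start essentially on the saddle point
```

Started that close to the saddle, unperturbed descent would barely move, but the noise kicks the iterate onto the descending y-direction.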
I volunteer at a summer science camp, and 90% of the projects are "I pointed AI at this problem and…"; nobody seems to even be trying analytical approaches anymore. I'm ready for a new fad.
Anybody else remember when it was all wavelets all the time? That was kinda fun.
You can still say that about AI. What people are calling “AI” now is closer to Cleverbot than true AGI.
Also, it's not even close to being everywhere yet, as it should be.