The Orion's Arm Universe Project Forums

A Cognitive Neural Architecture Able to Learn and Communicate through Natural Language
#4
Oh, what the hell. Here's another freebie - and while this one *IS* from my mad science, the results are very repeatable by sane scientists. I call it the Magic Sigmoid.

The activation function for the magic sigmoid is x/(1 + abs(x + cuberoot(x)))
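If anyone wants to poke at it, here's a quick NumPy sketch of that formula exactly as written above (nothing assumed beyond the formula itself):

```python
import numpy as np

def magic_sigmoid(x):
    """Magic sigmoid: x / (1 + |x + cbrt(x)|)."""
    x = np.asarray(x, dtype=float)
    return x / (1.0 + np.abs(x + np.cbrt(x)))

# Odd, bounded in (-1, 1), and it creeps toward its asymptotes very slowly:
print(magic_sigmoid([-10.0, -1.0, 0.0, 1.0, 10.0, 1000.0]))
```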

Hardly anybody uses this one, because training is very slow. But I adore it because its tail convergence is not just subexponential, it's subhyperbolic. The reason it's slow is that it trains the upper layers of weights at only about the same rate as the deepest layers of weights. Whereas G-B initialization is balanced on a razor's edge of stability with x/(1 + abs(x)), it is firmly within a broad range of stability for the magic sigmoid, meaning that if you have the time you can train networks to ANY depth. But, um, yeah, slow steady progress or not, that is a lot of time.
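To get a feel for what the tails are doing, you can compare numerical derivatives of the magic sigmoid and the plain x/(1 + abs(x)) sigmoid at large inputs. This is just a finite-difference sketch with NumPy, assuming the two formulas above:

```python
import numpy as np

def softsign(x):
    # the plain x / (1 + |x|) sigmoid mentioned above
    return x / (1.0 + np.abs(x))

def magic_sigmoid(x):
    # the magic sigmoid: x / (1 + |x + cbrt(x)|)
    return x / (1.0 + np.abs(x + np.cbrt(x)))

def num_grad(f, x, h=1e-5):
    # central finite-difference derivative
    return (f(x + h) - f(x - h)) / (2.0 * h)

for x in (1.0, 10.0, 100.0, 1000.0):
    print(f"x={x:7.1f}  softsign'={num_grad(softsign, x):.2e}  "
          f"magic'={num_grad(magic_sigmoid, x):.2e}")
```

If I've done the algebra right, the plain one's gradient dies off like 1/x^2 in the tail while the magic one's only dies off like x^(-5/3), so far out on the tails it still passes back noticeably more gradient - which is the slow-but-steady behavior I'm talking about.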


Messages In This Thread
RE: A Cognitive Neural Architecture Able to Learn and Communicate through Natural Language - by Bear - 11-14-2015, 03:08 PM
