Factor·arcadia
Leaky ReLU activation and batch normalization
The network's layers use leaky ReLU activations (negative slope 0.01) together with batch normalization (momentum 0.8); see the sketch below.
Confidence
100%
Status
active
Source
G–P Atlas architecture and training description
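A minimal sketch of the layer pattern this factor describes, assuming PyTorch (the source does not name a framework here). The layer sizes are illustrative placeholders, and the activation-before-normalization ordering simply follows the factor's wording. One caution: frameworks disagree on the batch-norm momentum convention. PyTorch's `momentum` weights the new batch statistic, while Keras's weights the running average, so the stated 0.8 is carried over verbatim without asserting which convention the original used.

```python
# Sketch only: PyTorch assumed; sizes and ordering are illustrative, not from the source.
import torch
import torch.nn as nn

class Block(nn.Module):
    """Fully connected layer -> leaky ReLU (slope 0.01) -> batch normalization."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # negative_slope=0.01 matches the factor (and is PyTorch's default).
        self.act = nn.LeakyReLU(negative_slope=0.01)
        # momentum=0.8 is the factor's stated value; note PyTorch's momentum
        # convention (weight on the new batch statistic) differs from Keras's.
        self.bn = nn.BatchNorm1d(out_features, momentum=0.8)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.bn(self.act(self.linear(x)))

x = torch.randn(16, 32)        # batch of 16 samples, 32 features (illustrative)
print(Block(32, 64)(x).shape)  # torch.Size([16, 64])
```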