This template implements the ReLU (Rectified Linear Unit) activation function used in neural networks.
See the comments below for details. Overridden methods are documented in the corresponding base class.
template<class FLOAT>
class TRelu : public IActivator<FLOAT>
{
public:
    FLOAT Calculate(const FLOAT value) override;
    FLOAT Derivative(const FLOAT value) override;
};
Namespace: nitisa::ai::activators
Include: Nitisa/Modules/AI/Activators.h