In the context of artificial neural networks, the rectified linear unit (ReLU) is an activation function defined as

$$f(x) = \max(0, x)$$

where $x$ is the input to a neuron.
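As a minimal sketch of this definition (the function name `relu` and the use of NumPy are illustrative choices, not part of the original text):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs are clamped to zero,
    # positive inputs pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```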

It is a piecewise variant of the linear activation function, and both can be used for regression. Compared to the sigmoid function, ReLU supports faster learning in artificial neural networks, which makes it one of the most common choices of activation function.

The derivative of this function is conventionally set to

$$f'(x) = \begin{cases} 1 & \text{if } x > 0, \\ 0 & \text{otherwise,} \end{cases}$$

since the true derivative is undefined at $x = 0$ and a fixed value is chosen there by convention.
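A matching sketch of the derivative, under the same convention (again with an illustrative function name):

```python
import numpy as np

def relu_derivative(x):
    # 1 where x > 0, 0 elsewhere; the value at exactly x == 0 is a
    # convention, since the derivative is undefined at that point.
    return (x > 0).astype(float)

print(relu_derivative(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 1.]
```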