The rectified linear unit (ReLU) is a simple function of $x$ that returns $0$ if $x < 0$ and $x$ otherwise:
$$
\mathrm{ReLU}(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{otherwise} \end{cases}
$$
ReLU is a popular activation function in deep neural networks because it is fast to compute and helps avoid the vanishing gradients problem.
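As a minimal sketch of how this looks in practice (assuming NumPy array inputs; the function and variable names here are illustrative), ReLU and its derivative can be written as:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: 0 for negative inputs, identity otherwise.
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative of ReLU: 0 where x < 0, 1 where x > 0
    # (the subgradient at x = 0 is taken as 0 here).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Because the derivative is exactly 1 for positive inputs, gradients pass through active units unscaled, which is one reason ReLU helps mitigate vanishing gradients compared with saturating activations.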