The output of the convolutional layer is normally passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces every negative value with zero. In the convolution layer, we slide the filter/kernel over every possible position of the input to produce that feature map.
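To make this concrete, here is a minimal NumPy sketch of the two steps just described: a stride-1, no-padding convolution that slides a kernel over every valid position, followed by ReLU. The function names, array shapes, and kernel values are illustrative choices, not taken from any particular library.

```python
import numpy as np

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide the kernel over every valid position of the image
    (no padding, stride 1) and compute the feature map."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    feature_map = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Element-wise multiply the patch by the kernel and sum.
            feature_map[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return feature_map

def relu(feature_map: np.ndarray) -> np.ndarray:
    """Replace every negative value in the feature map with zero."""
    return np.maximum(feature_map, 0)

# Illustrative example: a random 5x5 input and a vertical-edge kernel.
rng = np.random.default_rng(0)
image = rng.standard_normal((5, 5))
kernel = np.array([[1., 0., -1.],
                   [1., 0., -1.],
                   [1., 0., -1.]])

activated = relu(conv2d_valid(image, kernel))
print(activated)  # negative responses in the feature map are zeroed out
```

In a real network the convolution is vectorized and the kernel weights are learned, but the mechanics are the same: one multiply-and-sum per kernel position, then ReLU zeroing the negative responses.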