#data-science
Articles with this tag
- In backpropagation, we use the ReLU (Rectified Linear Unit) activation function more than the Sigmoid function, but why? Before I explain, I hope you know... (see the sketch after this list)
- Pros: The theorem works well for computationally intensive problems and for handling missing data or irrelevant features. Cons: As the algorithm assumes...
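
The first teaser above turns on how the two activations behave during backpropagation. As a minimal illustration (not taken from the article itself, and assuming NumPy is available), the sketch below compares the derivatives: sigmoid's gradient never exceeds 0.25 and shrinks toward zero for large inputs, so it vanishes when multiplied across many layers, whereas ReLU's gradient is exactly 1 for any active unit.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative s * (1 - s); peaks at 0.25 and decays toward 0 for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative is 1 for positive inputs, 0 otherwise
    return (x > 0).astype(float)

x = np.array([-6.0, -2.0, 0.5, 2.0, 6.0])
print("sigmoid grad:", sigmoid_grad(x))  # small values -> gradients shrink when chained
print("relu grad:   ", relu_grad(x))     # 0 or 1 -> no shrinkage for active units
```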