Creating your own style transfer mirror with Gradient° and ml5.js
In this post, we will learn how to train a style transfer network with Paperspace's Gradient° and use the model in ml5.js to create an interactive style transfer mirror.
A look into how activation functions such as ReLU, PReLU, RReLU, and ELU are used to address the vanishing gradient problem, and how to choose among them for your network.
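As a quick illustration (a minimal sketch assuming PyTorch rather than code from the post itself; the sample values are made up), the snippet below shows how these four activations treat negative inputs, which is where plain ReLU's gradient vanishes:

```python
# Minimal sketch (assumes PyTorch): compare how ReLU, PReLU, RReLU, and ELU
# handle negative inputs, the region where ReLU outputs zero and passes no
# gradient back, which the other three activations are designed to soften.
import torch
import torch.nn as nn

x = torch.linspace(-3, 3, steps=7)  # illustrative sample inputs

activations = {
    "ReLU": nn.ReLU(),    # hard zero for x < 0
    "PReLU": nn.PReLU(),  # learnable negative slope
    "RReLU": nn.RReLU(),  # randomized negative slope during training
    "ELU": nn.ELU(),      # smooth exponential curve for x < 0
}

for name, act in activations.items():
    y = act(x)
    print(f"{name:>6}: {y.detach().numpy().round(3)}")
```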
In this post, we will learn how to train a language model on your own custom dataset using an LSTM neural network, and use the resulting model so you can sample from it directly in the browser!
In this post, we take a look at a problem that plagues the training of neural networks: pathological curvature.
An in-depth explanation of Gradient Descent, and how to avoid the problems of local minima and saddle points.
Learn more about what we've been working on to make Gradient better than ever.
Next time you're wondering why your machine learning code is running slowly, even on a GPU, consider vectorizing any loopy code!
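To make that tip concrete, here is a minimal sketch (assuming NumPy; the array sizes and names are purely illustrative, not taken from the post) of replacing a Python-level loop with a single vectorized operation:

```python
# Minimal sketch (assumes NumPy): the same element-wise product written as a
# Python loop and as a single vectorized expression that runs in optimized C.
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

# Loopy version: one Python-level iteration per element.
out_loop = np.empty_like(a)
for i in range(len(a)):
    out_loop[i] = a[i] * b[i]

# Vectorized version: one call, same result, dramatically faster.
out_vec = a * b

assert np.allclose(out_loop, out_vec)
```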
Generative Adversarial Networks, or GANs, are one of the most active areas of deep learning research and development thanks to their remarkable ability to generate synthetic results. In this post, we will build up the basic intuition behind GANs through a concrete example.