Reinforcement Learning The Machine Learning Practitioner's Guide To Reinforcement Learning: All About Markov Decision Processes Deep Reinforcement Learning is one of the most rapidly progressing sub-disciplines of Deep Learning right now. In less than a decade, researchers have used Deep RL to train agents that...
Reinforcement Learning Getting Started With OpenAI Gym: The Basic Building Blocks In this article, we'll cover the basic building blocks of OpenAI Gym. This includes environments, spaces, wrappers, and vectorized environments.
Computer Vision How To Speed Up Object Detection Using NumPy Reshape and Transpose This is Part 4 of our ongoing series on NumPy optimization. In Parts 1 and 2 we covered the concepts of vectorization and broadcasting, and how they can be applied...
NumPy Nuts and Bolts of NumPy Optimization Part 3: Understanding NumPy Internals, Strides, Reshape and Transpose We cover basic mistakes that can lead to unnecessary copying of data and memory allocation in NumPy. We further cover NumPy internals, strides, reshaping, and transpose in detail.
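As a taste of what the article covers, here is a minimal sketch (our own, not code from the article) of how strides determine when NumPy can avoid copying data:

```python
# A minimal sketch of how strides explain when NumPy can reinterpret
# a buffer as a view versus when it must copy the data.
import numpy as np

a = np.arange(12, dtype=np.int64).reshape(3, 4)

# Strides: bytes to step per axis. For a C-contiguous (3, 4) int64
# array that's (32, 8): 32 bytes to the next row, 8 to the next column.
print(a.strides)  # (32, 8)

# reshape on a contiguous array just reinterprets the same buffer: a view.
b = a.reshape(4, 3)
print(np.shares_memory(a, b))  # True

# transpose also only swaps the strides, so no copy is made...
t = a.T
print(t.strides)  # (8, 32)
print(np.shares_memory(a, t))  # True

# ...but reshaping the transposed array can no longer be expressed by
# strides alone, so NumPy has to copy the data into a new buffer.
c = a.T.reshape(3, 4)
print(np.shares_memory(a, c))  # False
```

The stride values above assume a standard 8-byte int64; on any platform where that holds, the output is as commented.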
NumPy Nuts and Bolts of NumPy Optimization Part 2: Speed Up K-Means Clustering by 70x In this part we'll see how to speed up an implementation of the k-means clustering algorithm by 70x using NumPy. We cover how to use cProfile to find bottlenecks in the code, and how to address them using vectorization.
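To give a flavour of the approach (this is a hedged sketch of the general technique, not the article's own code), the core k-means speedup comes from computing every point-to-centroid distance in one broadcasted operation instead of a double Python loop:

```python
# Vectorized distance step of k-means: all point-to-centroid distances
# at once via broadcasting, instead of nested Python loops.
import numpy as np

rng = np.random.default_rng(0)
points = rng.random((500, 2))    # 500 points in 2-D
centroids = rng.random((3, 2))   # 3 cluster centres

# Broadcasting (500, 1, 2) against (1, 3, 2) gives a (500, 3, 2) array
# of differences; summing squares over the last axis gives distances.
diffs = points[:, None, :] - centroids[None, :, :]
dists = (diffs ** 2).sum(axis=-1)

# Each point's cluster label is the index of its nearest centroid.
labels = dists.argmin(axis=1)
print(dists.shape, labels.shape)  # (500, 3) (500,)
```

A single compiled NumPy pass replaces 1,500 Python-level distance computations here, which is exactly the kind of change that profiling with cProfile tends to surface.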
NumPy Nuts and Bolts of NumPy Optimization Part 1: Understanding Vectorization and Broadcasting In Part 1 of our series on writing efficient code with NumPy we cover why loops are slow in Python, and how to replace them with vectorized code. We also dig deep into how broadcasting works, along with a few practical examples.
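A minimal illustration of both ideas (our own sketch, not taken from the article):

```python
# Replacing a Python loop with a vectorized NumPy operation,
# plus a small broadcasting example.
import numpy as np

x = np.arange(1000, dtype=np.float64)

# Loop version: one Python-level multiply per element.
def scale_loop(arr, k):
    out = np.empty_like(arr)
    for i in range(arr.size):
        out[i] = arr[i] * k
    return out

# Vectorized version: a single call into NumPy's compiled C loop.
def scale_vec(arr, k):
    return arr * k

assert np.array_equal(scale_loop(x, 2.0), scale_vec(x, 2.0))

# Broadcasting: a (3, 1) column combines with a (4,) row to give (3, 4).
col = np.arange(3).reshape(3, 1)
row = np.arange(4)
print((col + row).shape)  # (3, 4)
```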
Coronavirus Fighting Coronavirus with AI, Part 2: Building a CT Scan COVID-19 Classifier Using PyTorch Using PyTorch, we create a COVID-19 classifier that predicts whether or not a patient is suffering from coronavirus, using chest CT scans.
Coronavirus Fighting Coronavirus With AI, Part 1: Improving Testing with Deep Learning and Computer Vision This post will cover how testing is done for the coronavirus, why it's important in battling the pandemic, and how deep learning tools for medical imaging can help us improve the quality of COVID-19 testing.
Series: GauGAN Understanding GauGAN Part 4: Debugging Training & Deciding If GauGAN Is Right For You In this post we cover how to tackle common training issues that may arise with GauGAN. We conclude with advice on whether GauGAN will fit your business needs or not.
Series: GauGAN Understanding GauGAN Part 3: Model Evaluation Techniques In Part 3 of the GauGAN series we cover how to evaluate model performance, and how GauGAN compares to models like Pix2PixHD, SIMS, and CRN.
Series: GauGAN Understanding GauGAN Part 2: Training on Custom Datasets In this article we cover how to train GauGAN on your own custom dataset. This is part of a series on Nvidia GauGANs.
Series: GauGAN Understanding GauGAN Part 1: Unraveling Nvidia's Landscape Painting GANs In this article we explain what GauGANs are, and how their architecture and objective functions work. This is part of a series on Nvidia GauGANs.
PyTorch PyTorch 101, Part 5: Understanding Hooks In this post, we cover debugging and visualisation in PyTorch. We go over PyTorch hooks and how to use them to debug the backward pass, visualise activations and modify gradients.
Tutorial PyTorch 101, Part 4: Memory Management and Using Multiple GPUs This article covers PyTorch's advanced GPU management features, including how to use multiple GPUs for your network, whether with data or model parallelism. We conclude with best practices for debugging memory errors.
Tutorial PyTorch 101, Part 3: Going Deep with PyTorch In this tutorial, we dig deep into PyTorch's functionality and cover advanced tasks such as using different learning rates, learning rate policies and different weight initialisations.
PyTorch PyTorch 101, Part 2: Building Your First Neural Network In this part, we will implement a neural network to classify CIFAR-10 images. We cover implementing the neural network, data loading pipeline and a decaying learning rate schedule.
Deep Learning PyTorch 101, Part 1: Understanding Graphs, Automatic Differentiation and Autograd In this article, we dive into how PyTorch's Autograd engine performs automatic differentiation.
Series: Data Augmentation Data Augmentation for Bounding Boxes: Rethinking Image Transforms for Object Detection How to adapt major image augmentation techniques for object detection purposes. We also cover the implementation of horizontal flip augmentation.
Series: Data Augmentation Data Augmentation for Bounding Boxes: Scaling and Translation We implement the scale and translate augmentation techniques, and cover what to do if a portion of your bounding box lies outside the image after the augmentation.
Computer Vision Data Augmentation for Bounding Boxes: Rotation and Shearing This is part 3 of the series where we are looking at ways to adapt image augmentation techniques to object detection tasks. In this part, we cover how to rotate and shear images, as well as their bounding boxes, using OpenCV's affine transformation features.
Series: Data Augmentation Data Augmentation For Bounding Boxes: Building Input Pipelines for Your Detector Previously, we covered a variety of image augmentation techniques such as flipping, rotation, shearing, scaling and translation. This part is about how to bring it all together and bake it into the input pipeline for your deep network.
Series: Optimization Intro to Optimization in Deep Learning: Busting the Myth About Batch Normalization Batch normalisation does NOT reduce internal covariate shift. This post looks into why internal covariate shift was thought to be a problem, and what batch normalisation actually achieves.
Series: Optimization Intro to Optimization in Deep Learning: Vanishing Gradients and Choosing the Right Activation Function A look into how activation functions like ReLU, PReLU, RReLU and ELU are used to address the vanishing gradient problem, and how to choose among them for your network.
Series: Optimization Intro to optimization in deep learning: Momentum, RMSProp and Adam In this post, we take a look at a problem that plagues the training of neural networks, pathological curvature, and at how momentum, RMSProp and Adam address it.
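As a rough sketch of the idea (our own toy example, not the article's code), momentum keeps an exponentially decaying average of past gradients, which damps oscillation across the steep direction of a pathologically curved loss surface while accelerating along the shallow one:

```python
# SGD with momentum on an elongated quadratic f(w) = 0.5*(10*w0^2 + w1^2),
# whose mismatched curvatures (10 vs 1) mimic pathological curvature.
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    """One momentum update: v accumulates a decaying sum of gradients."""
    v = beta * v + grad
    w = w - lr * v
    return w, v

grad_f = lambda w: np.array([10.0 * w[0], w[1]])  # gradient of f

w, v = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(300):
    w, v = momentum_step(w, v, grad_f(w))
print(np.allclose(w, 0.0, atol=1e-3))  # True: both coordinates converge
```

The learning rate and momentum coefficient here (0.01 and 0.9) are illustrative choices, not values prescribed by the article.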
Series: Optimization Intro to optimization in deep learning: Gradient Descent An in-depth explanation of Gradient Descent, and how to avoid the problems of local minima and saddle points.
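In its simplest form (a minimal sketch of the technique, not code from the article), gradient descent repeatedly steps against the gradient of the loss:

```python
# Plain gradient descent on a 1-D convex loss f(w) = (w - 3)^2.
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Iterate w <- w - lr * grad(w) for a fixed number of steps."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# df/dw = 2 * (w - 3); the unique minimum is at w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_star, 4))  # 3.0
```

On a convex bowl like this there are no local minima or saddle points; the article's interest is in the non-convex surfaces of deep networks, where those pathologies appear.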