Research | Neural Architecture Search Part 1: An Overview
The hyperparameter optimization problem has been solved in many different ways for classical machine learning algorithms. Some examples include the use of grid search, random search, Bayesian optimization, meta-learning, and …
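The search strategies named in this teaser can be sketched briefly. The snippet below is a minimal illustration, not any article's actual implementation: the `objective` function is a hypothetical stand-in for training and scoring a model, and the search space is invented for the example.

```python
import itertools
import random

def objective(params):
    # Hypothetical validation score; in practice this would train
    # and evaluate a model with the given hyperparameters.
    return -(params["lr"] - 0.01) ** 2 - (params["depth"] - 4) ** 2

space = {"lr": [0.001, 0.01, 0.1], "depth": [2, 4, 8]}

# Grid search: exhaustively evaluate every combination.
best_grid = max(
    (dict(zip(space, combo)) for combo in itertools.product(*space.values())),
    key=objective,
)

# Random search: evaluate a fixed budget of randomly sampled combinations.
random.seed(0)
best_rand = max(
    ({k: random.choice(v) for k, v in space.items()} for _ in range(5)),
    key=objective,
)

print(best_grid)  # {'lr': 0.01, 'depth': 4}
```

Grid search scales exponentially with the number of hyperparameters, which is one reason random search and Bayesian optimization are often preferred for larger spaces.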

Data Science | Measuring Text Similarity Using the Levenshtein Distance
In word processing and text chat applications, it's common for users to make unintended spelling mistakes. It could be as simple as writing "helo" (single "l") rather than "hello". Luckily, …
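The Levenshtein distance this teaser refers to counts the minimum number of single-character edits (insertions, deletions, substitutions) needed to turn one string into another. A minimal dynamic-programming sketch, independent of any particular article's code:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    # prev[j] holds the edit distance between the processed prefix of a
    # and b[:j]; we keep only one previous row to save memory.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion from a
                            curr[j - 1] + 1,      # insertion into a
                            prev[j - 1] + cost))  # substitution (or match)
        prev = curr
    return prev[-1]

print(levenshtein("helo", "hello"))  # 1: a single "l" must be inserted
```

A distance of 1 between "helo" and "hello" matches the teaser's example: one inserted "l" repairs the typo.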

Series: GauGAN | Understanding GauGAN Part 1: Unraveling Nvidia's Landscape Painting GANs
One of the most interesting papers presented at CVPR in 2019 was Nvidia's Semantic Image Synthesis with Spatially-Adaptive Normalization. It introduces their new algorithm, GauGAN, which can effectively turn doodles …

Series: Optimization | Intro to Optimization in Deep Learning: Gradient Descent
An in-depth explanation of gradient descent, and how to avoid the problems of local minima and saddle points.
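Gradient descent itself fits in a few lines: repeatedly step opposite the gradient until the iterate settles near a minimum. This is a generic sketch under assumed names (`grad`, `x0`, `lr`), not the article's own code, and the quadratic objective is an invented example:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # update rule: x <- x - lr * f'(x)
    return x

# Example: f(x) = (x - 3)^2 has gradient 2 * (x - 3) and minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 3))  # converges toward 3.0
```

On a convex objective like this one the iterate converges to the global minimum; the local-minima and saddle-point problems the article discusses arise only for non-convex functions such as deep-network losses.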