Research | Neural Architecture Search, Part 1: An Overview
The hyperparameter optimization problem has been tackled in many different ways for classical machine learning algorithms, including grid search, random search, Bayesian optimization, and meta-learning, among others.

Python | NumPy Array Processing With Cython: 1250x Faster
This tutorial shows how to speed up the processing of NumPy arrays using Cython. By explicitly declaring the data types of variables, Cython can deliver drastic speed improvements.

Python | Boosting Python Scripts With Cython
Python might be one of today's most popular programming languages, but it is definitely not the most efficient. In the machine learning world in particular, practitioners often sacrifice efficiency for ease of use.

Series: Gradient Descent with Python | Implementing Gradient Descent in Python, Part 2: Extending for Any Number of Inputs
Welcome back to this series of tutorials on implementing a generic gradient descent (GD) algorithm in Python for optimizing the parameters of an artificial neural network (ANN) in the backpropagation phase.

Series: Gradient Descent with Python | Implementing Gradient Descent in Python, Part 1: The Forward and Backward Pass
Through a series of tutorials, the gradient descent (GD) algorithm is implemented from scratch in Python for optimizing the parameters of an artificial neural network (ANN) in the backpropagation phase.
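The forward and backward passes described in Part 1 can be sketched in miniature. The following is an illustrative example only, not the article's actual code: it assumes a single-input neuron with a sigmoid activation and a squared-error loss, with an arbitrary initial weight and learning rate.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy setup: one input, one weight, sigmoid activation, squared error.
# All values here are illustrative assumptions.
x, target = 0.1, 0.3
w = 0.5        # arbitrary initial weight
lr = 5.0       # learning rate

for _ in range(10000):
    # Forward pass: compute the prediction and the error.
    pred = sigmoid(w * x)
    error = (pred - target) ** 2
    # Backward pass: chain rule, d(error)/dw.
    grad = 2 * (pred - target) * pred * (1 - pred) * x
    # Gradient descent update.
    w -= lr * grad
```

After the loop, the prediction has moved from roughly 0.51 toward the target of 0.3; the same forward/backward pattern generalizes to many inputs and layers.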

Series: Optimization | Intro to optimization in deep learning: Momentum, RMSProp and Adam
In this post, we take a look at a problem that plagues the training of neural networks: pathological curvature.
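The three optimizers named in this post's title share a common shape: each transforms the raw gradient before applying the update. As a hedged sketch (not the post's own code), here are the standard update rules applied to a toy quadratic loss f(w) = w²/2; the hyperparameter values are illustrative assumptions.

```python
import numpy as np

def grad(w):
    # Gradient of the toy quadratic loss f(w) = 0.5 * w**2.
    return w

def momentum(w, steps=500, lr=0.1, beta=0.9):
    # Accumulate a velocity term that smooths successive gradients.
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)
        w -= lr * v
    return w

def rmsprop(w, steps=500, lr=0.01, beta=0.9, eps=1e-8):
    # Scale each step by a running average of squared gradients.
    s = 0.0
    for _ in range(steps):
        g = grad(w)
        s = beta * s + (1 - beta) * g ** 2
        w -= lr * g / (np.sqrt(s) + eps)
    return w

def adam(w, steps=500, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    # Combine momentum (first moment) with RMSProp-style scaling
    # (second moment), plus bias correction for the early steps.
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w
```

On this one-dimensional loss all three drive the parameter toward the minimum at 0; the differences between them only become decisive on curved, multi-dimensional loss surfaces like the pathological curvature the post discusses.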

Series: Optimization | Intro to optimization in deep learning: Gradient Descent
An in-depth explanation of gradient descent, and how to avoid the problems of local minima and saddle points.