NumPy Nuts and Bolts of NumPy Optimization Part 3: Understanding NumPy Internals, Strides, Reshape and Transpose We cover basic mistakes that can lead to unnecessary copying of data and memory allocation in NumPy. We further cover NumPy internals, strides, reshaping, and transpose in detail.
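The teaser above mentions strides, reshape, and transpose; a minimal sketch of the idea (not code from the article itself) is that NumPy describes an array's layout with strides, so transposing only swaps strides while some reshapes force a copy:

```python
import numpy as np

# A 3x4 array of 8-byte integers: stepping one row ahead skips
# 4 elements (32 bytes), stepping one column ahead skips 8 bytes.
a = np.arange(12, dtype=np.int64).reshape(3, 4)
print(a.strides)                 # (32, 8)

# Transposing swaps the strides instead of moving any data,
# so the result is a view over the same buffer.
t = a.T
print(t.strides)                 # (8, 32)
print(np.shares_memory(a, t))    # True

# Reshaping a contiguous array is also just a metadata change...
v = a.reshape(4, 3)
print(np.shares_memory(a, v))    # True

# ...but reshaping a non-contiguous transpose forces a copy.
c = a.T.reshape(3, 4)
print(np.shares_memory(a, c))    # False
```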

NumPy Nuts and Bolts of NumPy Optimization Part 2: Speed Up K-Means Clustering by 70x In this part we'll see how to speed up an implementation of the k-means clustering algorithm by 70x using NumPy. We cover how to use cProfile to find bottlenecks in the code, and how to address them using vectorization.
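As a rough illustration of the kind of vectorization the article applies (the function names and data here are hypothetical, not taken from the tutorial), the core k-means assignment step can be written with broadcasting instead of a Python loop:

```python
import numpy as np

def assign_loop(points, centroids):
    # Loop version: one squared distance computation per point.
    labels = np.empty(len(points), dtype=np.int64)
    for i, p in enumerate(points):
        labels[i] = np.argmin(((centroids - p) ** 2).sum(axis=1))
    return labels

def assign_vectorized(points, centroids):
    # (N, 1, D) - (1, K, D) broadcasts to an (N, K, D) difference
    # array, so all point-to-centroid distances come from one expression.
    diff = points[:, None, :] - centroids[None, :, :]
    return np.argmin((diff ** 2).sum(axis=2), axis=1)

rng = np.random.default_rng(0)
pts = rng.standard_normal((1000, 2))
cts = rng.standard_normal((5, 2))
labels = assign_vectorized(pts, cts)
```

Profiling either version takes one line: `python -m cProfile -s cumtime script.py` sorts functions by cumulative time so the loop shows up immediately.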

NumPy Nuts and Bolts of NumPy Optimization Part 1: Understanding Vectorization and Broadcasting In Part 1 of our series on writing efficient code with NumPy we cover why loops are slow in Python, and how to replace them with vectorized code. We also dig deep into how broadcasting works, along with a few practical examples.
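A minimal sketch of the broadcasting idea described above (illustrative only, not code from the article): adding a row vector to a matrix virtually repeats the vector across rows, replacing an explicit Python loop.

```python
import numpy as np

# NumPy aligns trailing dimensions: (3, 4) + (4,) broadcasts the
# vector across all three rows with no loop and no copied data.
m = np.zeros((3, 4))
row = np.array([1.0, 2.0, 3.0, 4.0])
print((m + row).shape)    # (3, 4)

# The loop equivalent, for comparison:
out = np.empty_like(m)
for i in range(m.shape[0]):
    out[i] = m[i] + row
```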

Research Neural Architecture Search Part 1: An Overview The hyperparameter optimization problem has been tackled in many different ways for classical machine learning algorithms. Some examples include grid search, random search, Bayesian optimization, and meta-learning.
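Of the methods the teaser lists, random search is the simplest to sketch. The objective and search space below are hypothetical toys, just to show the shape of the loop:

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    # Sample hyperparameters uniformly from given ranges and
    # keep the best-scoring (lowest-objective) trial.
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in space.items()}
        score = objective(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective, minimized at lr=0.1, momentum=0.9.
obj = lambda p: (p["lr"] - 0.1) ** 2 + (p["momentum"] - 0.9) ** 2
params, score = random_search(obj, {"lr": (0.0, 1.0), "momentum": (0.0, 1.0)})
```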

Python NumPy Array Processing With Cython: 1250x Faster This tutorial will show you how to speed up the processing of NumPy arrays using Cython. By explicitly declaring the data types of variables, Cython can deliver drastic speed-ups.

Python Boosting Python Scripts With Cython Python might be one of today's most popular programming languages, but it's definitely not the most efficient. In the machine learning world in particular, practitioners often sacrifice efficiency for ease of use.

Series: Gradient Descent with Python Implementing Gradient Descent in Python, Part 2: Extending for Any Number of Inputs Welcome back to this series of tutorials on implementing a generic gradient descent (GD) algorithm in Python for optimizing the parameters of an artificial neural network (ANN) in the backpropagation phase.

Series: Gradient Descent with Python Implementing Gradient Descent in Python, Part 1: The Forward and Backward Pass Through a series of tutorials, the gradient descent (GD) algorithm will be implemented from scratch in Python for optimizing the parameters of an artificial neural network (ANN) in the backpropagation phase.

Series: Optimization Intro to optimization in deep learning: Momentum, RMSProp and Adam In this post, we take a look at a problem that plagues the training of neural networks: pathological curvature.
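For a flavor of two of the update rules the post covers, here is a minimal sketch on a single parameter vector; the hyperparameter values are illustrative defaults, not values from the article:

```python
import numpy as np

def momentum_step(w, grad, v, lr=0.01, beta=0.9):
    # Accumulate a velocity that smooths the gradient direction.
    v = beta * v + grad
    return w - lr * v, v

def rmsprop_step(w, grad, s, lr=0.01, beta=0.9, eps=1e-8):
    # Scale each step by a running mean of squared gradients.
    s = beta * s + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(s) + eps), s

# Minimize f(w) = w^2 with momentum; the gradient is 2w.
w, v = np.array([1.0]), np.array([0.0])
for _ in range(200):
    w, v = momentum_step(w, 2 * w, v)
```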

Series: Optimization Intro to optimization in deep learning: Gradient Descent An in-depth explanation of Gradient Descent, and how to avoid the problems of local minima and saddle points.
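The basic update the post explains can be sketched in a few lines (a toy one-dimensional example, not code from the article): repeatedly step against the gradient until the iterate settles at a minimum.

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    # Repeatedly move opposite the gradient direction.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# f(w) = (w - 3)^2 has gradient 2 * (w - 3), so the iterates
# converge to the minimum at w = 3.
w = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```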