Theory
xUnit Spatial Activation Function for Image Denoising
This article provides an in-depth look at the CVPR 2018 paper titled "xUnit: Learning a Spatial Activation Function for Efficient Image Restoration" and its importance in the domain of image restoration.
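To make the idea concrete, here is a minimal PyTorch sketch of an xUnit-style spatial gating activation, assuming the batch-norm → ReLU → depthwise convolution → batch-norm → Gaussian ordering described in the paper; the class name, the 9×9 kernel default, and the exact layer order are illustrative rather than the authors' reference implementation.

```python
import torch
import torch.nn as nn

class XUnit(nn.Module):
    """Sketch of an xUnit-style spatial gating activation.

    A small depthwise-convolutional branch produces a per-pixel, per-channel
    weight map via a Gaussian nonlinearity; the input feature map is
    multiplied element-wise by this gate.
    """
    def __init__(self, channels: int, kernel_size: int = 9):
        super().__init__()
        self.branch = nn.Sequential(
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            # depthwise convolution: one spatial filter per channel
            nn.Conv2d(channels, channels, kernel_size,
                      padding=kernel_size // 2, groups=channels),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d = self.branch(x)
        g = torch.exp(-d * d)   # Gaussian gate, values in (0, 1]
        return x * g
```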
Concurrent Spatial and Channel Squeeze & Excitation (scSE) Nets
In this article, we will take a look at the paper "Concurrent Spatial and Channel Squeeze & Excitation in Fully Convolutional Networks," which extends the popular Squeeze-and-Excitation (SE) attention module with a complementary spatial excitation branch.
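As a quick preview, below is a hedged PyTorch sketch of an scSE block: the familiar channel gate (cSE) alongside a 1×1-convolution spatial gate (sSE), fused here by an element-wise max. The fusion choice and the reduction ratio of 16 are assumptions; the paper also evaluates other fusion strategies.

```python
import torch
import torch.nn as nn

class SCSEBlock(nn.Module):
    """Sketch of concurrent spatial and channel squeeze & excitation (scSE)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # cSE: global average pool -> bottleneck MLP -> per-channel sigmoid gate
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        # sSE: 1x1 convolution -> per-pixel sigmoid gate
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # element-wise max of the two recalibrated feature maps
        return torch.max(x * self.cse(x), x * self.sse(x))
```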
Shuffle Attention for Deep Convolutional Neural Networks (SA-Net)
This article gives an in-depth summary of the ICASSP 2021 paper titled "SA-Net: Shuffle Attention for Deep Convolutional Neural Networks."
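The sketch below illustrates shuffle attention as I read the paper: channels are grouped, each group is split between a channel-attention branch and a spatial-attention branch, and a channel shuffle mixes the groups afterwards. Parameter shapes and defaults (e.g. `groups=8`) are illustrative assumptions, not the official code.

```python
import torch
import torch.nn as nn

class ShuffleAttention(nn.Module):
    """Simplified sketch of shuffle attention."""
    def __init__(self, channels: int, groups: int = 8):
        super().__init__()
        self.groups = groups
        c = channels // (2 * groups)          # channels per branch inside a group
        self.cweight = nn.Parameter(torch.zeros(1, c, 1, 1))
        self.cbias = nn.Parameter(torch.ones(1, c, 1, 1))
        self.sweight = nn.Parameter(torch.zeros(1, c, 1, 1))
        self.sbias = nn.Parameter(torch.ones(1, c, 1, 1))
        self.gn = nn.GroupNorm(c, c)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x = x.reshape(b * self.groups, c // self.groups, h, w)
        xc, xs = x.chunk(2, dim=1)
        # channel branch: global pooling + learnable affine + sigmoid
        xc = xc * torch.sigmoid(self.cweight * self.pool(xc) + self.cbias)
        # spatial branch: GroupNorm + learnable affine + sigmoid
        xs = xs * torch.sigmoid(self.sweight * self.gn(xs) + self.sbias)
        out = torch.cat([xc, xs], dim=1).reshape(b, c, h, w)
        # channel shuffle: interleave the two branch halves across groups
        out = out.reshape(b, 2, c // 2, h, w).transpose(1, 2).reshape(b, c, h, w)
        return out
```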
Style-based Recalibration Module (SRM) Channel Attention
In this post, we will cover a novel form of channel attention called the Style-based Recalibration Module (SRM), which builds on the popular TPAMI paper "Squeeze-and-Excitation Networks."
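For a flavour of the mechanism, here is a minimal PyTorch sketch of an SRM-style channel gate, assuming style pooling by per-channel mean and standard deviation followed by a channel-wise fully connected layer (a grouped 1-D convolution), batch norm, and a sigmoid; details such as the grouped-conv implementation are my reading of the paper.

```python
import torch
import torch.nn as nn

class SRM(nn.Module):
    """Sketch of a Style-based Recalibration Module (SRM) channel gate."""
    def __init__(self, channels: int):
        super().__init__()
        # channel-wise fully connected: one 2-tap filter per channel, no mixing
        self.cfc = nn.Conv1d(channels, channels, kernel_size=2,
                             groups=channels, bias=False)
        self.bn = nn.BatchNorm1d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        flat = x.flatten(2)                                        # (B, C, H*W)
        # style pooling: per-channel mean and standard deviation
        style = torch.stack([flat.mean(dim=2), flat.std(dim=2)], dim=2)  # (B, C, 2)
        gate = torch.sigmoid(self.bn(self.cfc(style)))             # (B, C, 1)
        return x * gate.unsqueeze(-1)                              # broadcast over H, W
```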
Attention Mechanisms in Recurrent Neural Networks (RNNs) With Keras
This series gives an advanced guide to different recurrent neural networks (RNNs). You will gain an understanding of the networks themselves, their architectures, their applications, and how to bring the models to life using Keras.
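As a taste of what the series covers, the sketch below wires the built-in `tf.keras.layers.AdditiveAttention` (Bahdanau-style) layer onto LSTM outputs; the toy shapes and the two-LSTM setup are purely illustrative and are not the exact models built in the series.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Toy shapes: batches of 20-step sequences with 32 features per step.
inputs = layers.Input(shape=(20, 32))
states = layers.LSTM(64, return_sequences=True)(inputs)   # one hidden state per step
query = layers.LSTM(64)(inputs)                           # a single summary state
query = layers.Reshape((1, 64))(query)                    # AdditiveAttention expects 3-D queries
context = layers.AdditiveAttention()([query, states])     # Bahdanau-style weighting of the states
outputs = layers.Dense(1)(layers.Flatten()(context))
model = tf.keras.Model(inputs, outputs)
model.summary()
```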
Accelerating Inference: Neural Network Pruning Explained
In this article, we'll discuss pruning neural networks: what it is, how it works, different pruning methods, and how to evaluate them.
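As a minimal, concrete example, the snippet below applies magnitude-based unstructured pruning with PyTorch's built-in `torch.nn.utils.prune` utilities; the toy model and the 50% sparsity target are illustrative choices, not a recommendation from the article.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model: two linear layers, illustrative only.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 50% of weights with the smallest L1 magnitude in each linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Check the resulting sparsity of the first layer, then make the mask permanent.
w = model[0].weight
print(f"sparsity of first layer: {float((w == 0).sum()) / w.numel():.2f}")
prune.remove(model[0], "weight")
```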
Global Context Networks (GCNet) Explained
In this post, we will discuss Global Context Networks (GCNet), an attention mechanism for computer vision first published at the ICCV 2019 Workshops.
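To ground the idea, here is a hedged PyTorch sketch of a global context block: global attention pooling over all spatial positions, a bottleneck transform, and fusion back into the input by addition. The reduction ratio and layer details are assumptions based on my reading of the paper.

```python
import torch
import torch.nn as nn

class GlobalContextBlock(nn.Module):
    """Sketch of a Global Context (GC) block."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.attn = nn.Conv2d(channels, 1, kernel_size=1)   # context-modelling logits
        hidden = channels // reduction
        self.transform = nn.Sequential(
            nn.Conv2d(channels, hidden, 1),
            nn.LayerNorm([hidden, 1, 1]),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # softmax over all spatial positions gives one global attention map
        weights = torch.softmax(self.attn(x).view(b, 1, h * w), dim=-1)     # (B, 1, HW)
        context = torch.bmm(weights, x.view(b, c, h * w).transpose(1, 2))   # (B, 1, C)
        context = context.transpose(1, 2).unsqueeze(-1)                     # (B, C, 1, 1)
        # bottleneck transform, fused back into the input by addition
        return x + self.transform(context)
```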