Concurrent Spatial and Channel Squeeze & Excitation (scSE) Nets
In this article, we will take a look at the paper "Concurrent Spatial and Channel Squeeze & Excitation in Fully Convolutional Networks" which serves as a design update for the popular Squeeze-and-Excitation attention module.
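To give a feel for what the module does before diving into the article: scSE combines a channel squeeze-and-excitation branch (cSE) with a spatial one (sSE) and merges the two recalibrated maps. Below is a minimal NumPy sketch for a single feature map; the function and weight names (`scse`, `w1`, `w2`, `w_s`) are illustrative, not from the paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scse(x, w1, w2, w_s):
    """Concurrent spatial and channel SE on one feature map x of shape (C, H, W).

    w1: (C//r, C), w2: (C, C//r) -- bottleneck weights for the channel branch (cSE)
    w_s: (C,)                    -- 1x1-conv weights for the spatial branch (sSE)
    """
    # cSE: global average pool -> bottleneck MLP -> per-channel sigmoid gates
    z = x.mean(axis=(1, 2))                      # (C,)
    g = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))    # (C,)
    x_cse = x * g[:, None, None]
    # sSE: 1x1 conv across channels -> per-pixel sigmoid gates
    s = sigmoid(np.tensordot(w_s, x, axes=([0], [0])))  # (H, W)
    x_sse = x * s[None, :, :]
    # scSE merges the two recalibrated maps (element-wise max variant)
    return np.maximum(x_cse, x_sse)
```

Since both branches only scale activations by gates in (0, 1), the output never exceeds the input in magnitude; a real implementation would learn `w1`, `w2`, and `w_s` as convolution/linear layers.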
Shuffle Attention for Deep Convolutional Neural Networks (SA-Net)
This article gives an in-depth summary of the ICASSP paper titled "SA-Net: Shuffle Attention for Deep Convolutional Neural Networks."
Style-based Recalibration Module (SRM) Channel Attention
In this post, we will cover a novel form of channel attention called the Style-based Recalibration Module (SRM), an extension of the popular TPAMI paper: Squeeze-and-Excitation Networks.
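As a quick preview of the idea: instead of average pooling alone, SRM pools per-channel "style" statistics (mean and standard deviation) and combines them with a channel-wise fully connected layer to produce gates. The NumPy sketch below is a simplification that operates on one feature map and omits the batch normalization SRM applies before the sigmoid; the names `srm`, `w_mean`, and `w_std` are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def srm(x, w_mean, w_std):
    """Style-based recalibration on one feature map x of shape (C, H, W).

    Style pooling extracts a per-channel mean and std; a channel-wise
    fully connected layer (one weight pair per channel, w_mean and
    w_std each of shape (C,)) turns them into per-channel gates.
    """
    mu = x.mean(axis=(1, 2))                 # (C,) style statistic 1
    sd = x.std(axis=(1, 2))                  # (C,) style statistic 2
    g = sigmoid(w_mean * mu + w_std * sd)    # (C,) channel gates
    return x * g[:, None, None]
```

The channel-wise layer keeps the parameter count at 2C, far smaller than the bottleneck MLP used by standard SE blocks.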
Attention Mechanisms in Recurrent Neural Networks (RNNs) With Keras
This series gives an advanced guide to different recurrent neural networks (RNNs). You will gain an understanding of the networks themselves, their architectures, their applications, and how to bring the models to life using Keras.
Accelerating Inference: Neural Network Pruning Explained
In this article, we'll discuss pruning neural networks: what it is, how it works, different pruning methods, and how to evaluate them.
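The simplest of the pruning methods discussed is unstructured magnitude pruning: remove the weights with the smallest absolute values, since they contribute least to the output. A minimal NumPy sketch (the function name `magnitude_prune` is illustrative, not from the article):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Unstructured magnitude pruning: zero out the fraction `sparsity`
    of entries with the smallest absolute value. Returns the pruned
    weights and the boolean mask marking surviving entries."""
    k = int(sparsity * weights.size)             # number of entries to remove
    order = np.argsort(np.abs(weights).ravel())  # indices, smallest magnitude first
    mask = np.ones(weights.size, dtype=bool)
    mask[order[:k]] = False                      # drop the k smallest magnitudes
    mask = mask.reshape(weights.shape)
    return weights * mask, mask
```

In practice the mask is kept and reapplied after each fine-tuning step so pruned weights stay at zero, which is where the evaluation questions (accuracy vs. sparsity, actual speedup) come in.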