SCNet (CVPR 2020)
This blogpost offers an in-depth look at the CVPR 2020 paper titled "Improving Convolutional Networks with Self-Calibrated Convolutions".
This article is an in-depth look at the paper by Dai et al. titled "Attention as Activation".
This blogpost is an in-depth discussion of the Google Brain paper titled "Searching for Activation Functions", which has since revived research into activation functions.
This article provides an in-depth look at the CVPR 2018 paper titled "xUnit: Learning a Spatial Activation Function for Efficient Image Restoration" and its importance in the domain of image restoration.
In this article, we will take a look at the paper "Concurrent Spatial and Channel Squeeze & Excitation in Fully Convolutional Networks", which serves as a design update for the popular Squeeze-and-Excitation attention module.
This article gives an in-depth summary of the ICASSP paper titled "SA-Net: Shuffle Attention for Deep Convolutional Neural Networks."
In this post, we will cover a novel form of channel attention called the Style Recalibration Module (SRM), an extension of the popular TPAMI paper "Squeeze-and-Excitation Networks".
In this post, we will discuss a form of attention mechanism in computer vision known as Global Context Networks, first published at ICCV Workshops 2019.