Shuffle Attention for Deep Convolutional Neural Networks (SA-Net)
This article gives an in-depth summary of the ICASSP 2021 paper "SA-Net: Shuffle Attention for Deep Convolutional Neural Networks."
Related posts cover:
- The Style-based Recalibration Module (SRM), a form of channel attention that extends the popular TPAMI paper "Squeeze-and-Excitation Networks."
- Global Context Networks (GCNet), an attention mechanism first published at the ICCV Workshops in 2019.
- Triplet Attention, accepted to WACV 2021.
- MobileNeXt, a revamping of the popular MobileNet architecture, published at ECCV 2020.
- Training and generalizing Scaled-YOLOv4 on a custom dataset to detect custom objects.
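As a quick preview of the mechanism the paper proposes, below is a minimal PyTorch sketch of a shuffle attention unit, written from the paper's description rather than the authors' reference code: the input is split into channel groups, each group is halved into a channel-attention branch (global average pooling plus a lightweight gate) and a spatial-attention branch (group normalization plus a lightweight gate), and a channel shuffle mixes information across the groups at the end. The class name `ShuffleAttention` and the `groups` default are illustrative assumptions.

```python
# Minimal sketch of a shuffle attention (SA) unit, based on the paper's
# description. Names and defaults are illustrative, not the reference code.
import torch
import torch.nn as nn


class ShuffleAttention(nn.Module):
    def __init__(self, channels: int, groups: int = 32):
        super().__init__()
        assert channels % (2 * groups) == 0, "channels must be divisible by 2 * groups"
        self.groups = groups
        sub = channels // (2 * groups)  # channels per branch inside one group
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # lightweight per-channel scale/shift for the two gating branches
        self.cweight = nn.Parameter(torch.zeros(1, sub, 1, 1))
        self.cbias = nn.Parameter(torch.ones(1, sub, 1, 1))
        self.sweight = nn.Parameter(torch.zeros(1, sub, 1, 1))
        self.sbias = nn.Parameter(torch.ones(1, sub, 1, 1))
        self.gn = nn.GroupNorm(sub, sub)  # statistics for the spatial branch
        self.sigmoid = nn.Sigmoid()

    @staticmethod
    def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
        # standard channel shuffle: interleave channels across groups
        b, c, h, w = x.shape
        x = x.reshape(b, groups, c // groups, h, w)
        x = x.transpose(1, 2).contiguous()
        return x.reshape(b, c, h, w)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # fold the groups into the batch dimension, then split into two branches
        x = x.reshape(b * self.groups, -1, h, w)
        x_chan, x_spat = x.chunk(2, dim=1)

        # channel attention: global average pooling + per-channel gating
        s = self.avg_pool(x_chan)
        x_chan = x_chan * self.sigmoid(self.cweight * s + self.cbias)

        # spatial attention: group-norm statistics + per-channel gating
        t = self.gn(x_spat)
        x_spat = x_spat * self.sigmoid(self.sweight * t + self.sbias)

        # recombine the branches and shuffle channels across groups
        out = torch.cat([x_chan, x_spat], dim=1).reshape(b, c, h, w)
        return self.channel_shuffle(out, groups=2)


if __name__ == "__main__":
    sa = ShuffleAttention(channels=64, groups=8)
    y = sa(torch.randn(2, 64, 32, 32))
    print(y.shape)  # torch.Size([2, 64, 32, 32])
```

Note that in this sketch the gating in each branch is just a per-channel scale and shift followed by a sigmoid, which is what keeps the added parameter count negligible relative to the backbone.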