
Releases: juancolonna/SelfAttention_Conv2D

SelfAttention_Conv2D

28 Aug 21:52
57092d0

Custom Self-Attention Layer with 1x1 2D Convolutions

A custom layer that adds self-attention capabilities to your models using 1x1 convolutions, while maintaining the ease of integration that Keras offers.

Overview: This implementation draws its design from the SAGAN paper (Self-Attention Generative Adversarial Networks), which introduced self-attention into GAN architectures. The original mechanism was implemented in PyTorch; we have translated and adapted it for Keras (and TensorFlow), with some modifications of our own.
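
To make the mechanism concrete, below is a minimal sketch of a SAGAN-style self-attention layer in Keras. The class name `SelfAttentionConv2D`, the `C // 8` projection width, and the zero-initialized `gamma` blending scalar follow the general SAGAN recipe, but this is an illustrative approximation of the idea, not the repository's exact code.

```python
import tensorflow as tf
from tensorflow import keras


class SelfAttentionConv2D(keras.layers.Layer):
    """SAGAN-style self-attention built from 1x1 convolutions (illustrative sketch)."""

    def build(self, input_shape):
        channels = int(input_shape[-1])
        # 1x1 convolutions project the input into query, key, and value spaces.
        self.query_conv = keras.layers.Conv2D(max(channels // 8, 1), 1)
        self.key_conv = keras.layers.Conv2D(max(channels // 8, 1), 1)
        self.value_conv = keras.layers.Conv2D(channels, 1)
        # Learnable mixing scalar, initialized to zero so the layer starts
        # as an identity mapping and learns how much attention to apply.
        self.gamma = self.add_weight(
            name="gamma", shape=(), initializer="zeros", trainable=True
        )

    def call(self, x):
        shape = tf.shape(x)
        batch, n = shape[0], shape[1] * shape[2]  # N = H * W spatial positions
        q = tf.reshape(self.query_conv(x), (batch, n, -1))  # (B, N, C//8)
        k = tf.reshape(self.key_conv(x), (batch, n, -1))    # (B, N, C//8)
        v = tf.reshape(self.value_conv(x), (batch, n, -1))  # (B, N, C)
        # Attention weights over every pair of spatial positions.
        attn = tf.nn.softmax(tf.matmul(q, k, transpose_b=True), axis=-1)  # (B, N, N)
        out = tf.reshape(tf.matmul(attn, v), tf.shape(x))   # back to (B, H, W, C)
        return self.gamma * out + x  # residual connection


# Example usage: drop the layer into a functional Keras model.
inputs = keras.Input(shape=(32, 32, 64))
outputs = SelfAttentionConv2D()(inputs)
model = keras.Model(inputs, outputs)
```

Starting `gamma` at zero lets the network rely on its local convolutional features first and gradually learn how much non-local, attention-weighted evidence to mix in.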