
Attention

tags: Neural Networks

Attention is a family of techniques that allow a neural network to focus on a subset of its inputs.

It uses an attention network that applies a mask function to the inputs. This function either maps each input to 0 or 1 (hard attention) or to a value between 0 and 1 (soft attention).
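As a concrete illustration, here is a minimal sketch of both variants in Python with NumPy. The function names, the scaled dot-product scoring, and the shapes of `query`, `keys`, and `values` are assumptions for illustration, not something the note specifies.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def soft_attention(query, keys, values):
    """Soft attention: each input gets a weight strictly between 0 and 1.

    query:  (d,)   what the network is looking for
    keys:   (n, d) one key per input element
    values: (n, m) one value per input element
    """
    scores = keys @ query / np.sqrt(query.shape[0])  # (n,) similarity scores
    weights = softmax(scores)   # soft mask: weights in (0, 1) that sum to 1
    return weights @ values     # weighted average over all inputs

def hard_attention(query, keys, values):
    # Hard attention: the mask is binary, here selecting one input element.
    # (This argmax is a deterministic stand-in; hard attention is often
    # implemented by sampling instead.)
    scores = keys @ query
    return values[np.argmax(scores)]

# Example usage with random data.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 3))
print(soft_attention(q, K, V))
print(hard_attention(q, K, V))
```

Because the soft mask is differentiable, soft attention can be trained with plain backpropagation, whereas the binary mask of hard attention typically requires sampling-based gradient estimators.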

Figure: soft attention as used in the paper Show, Attend and Tell.
