Focus Your Attention (with Adaptive IIR Filters)

Shahar Lutati, Itamar Zimerman, Lior Wolf


Abstract
We present a new layer in which dynamic (i.e., input-dependent) Infinite Impulse Response (IIR) filters of order two are used to process the input sequence prior to applying conventional attention. The input is split into chunks, and the coefficients of these filters are determined based on previous chunks to maintain causality. Despite their relatively low order, the causal adaptive filters are shown to focus attention on the relevant sequence elements. The new layer is grounded in control theory and is shown to generalize diagonal state-space layers. It performs on par with state-of-the-art networks, with a fraction of their parameters and a time complexity that is sub-quadratic in the input size. The layer compares favorably to layers such as Hyena, GPT-2, and Mega, both in the number of parameters and in the level of performance obtained on multiple long-range sequence problems.
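To make the mechanism concrete, below is a minimal NumPy sketch of the chunk-wise causal adaptation the abstract describes: a second-order IIR filter whose coefficients are derived only from already-seen chunks. The `predict_coeffs` function, its fixed toy coefficient mapping, and the `chunk_len` value are illustrative assumptions, not the authors' implementation; in the paper the coefficients come from a learned, input-dependent network, and the filtered sequence is then fed to a conventional attention layer.

```python
import numpy as np

def predict_coeffs(prev_chunk):
    """Hypothetical stand-in for the paper's learned coefficient predictor:
    maps a summary of the previous chunk to second-order IIR coefficients
    (b0, b1, b2) and (a1, a2)."""
    s = np.tanh(prev_chunk.mean())           # bounded summary of the chunk
    b = np.array([1.0, 0.5 * s, 0.25 * s])   # feed-forward taps
    a = np.array([0.5 * s, 0.1 * s])         # feedback taps, kept small for stability
    return b, a

def adaptive_iir(x, chunk_len=64):
    """Filter x chunk by chunk with a second-order IIR filter whose coefficients
    depend only on previous chunks, so the operation stays causal:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    x = np.asarray(x, dtype=float)
    y = np.zeros_like(x)
    xh = np.zeros(2)                          # x[n-1], x[n-2]
    yh = np.zeros(2)                          # y[n-1], y[n-2]
    b, a = np.array([1.0, 0.0, 0.0]), np.zeros(2)  # identity filter for the first chunk
    for start in range(0, len(x), chunk_len):
        chunk = x[start:start + chunk_len]
        for n, xn in enumerate(chunk):
            yn = b[0]*xn + b[1]*xh[0] + b[2]*xh[1] - a[0]*yh[0] - a[1]*yh[1]
            y[start + n] = yn
            xh = np.array([xn, xh[0]])
            yh = np.array([yn, yh[0]])
        b, a = predict_coeffs(chunk)          # coefficients for the *next* chunk only
    return y

# Example: causally filter a random sequence before attention would be applied.
y = adaptive_iir(np.random.randn(256), chunk_len=64)
```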
Anthology ID: 2023.emnlp-main.772
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 12538–12549
URL: https://aclanthology.org/2023.emnlp-main.772
DOI: 10.18653/v1/2023.emnlp-main.772
Cite (ACL): Shahar Lutati, Itamar Zimerman, and Lior Wolf. 2023. Focus Your Attention (with Adaptive IIR Filters). In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 12538–12549, Singapore. Association for Computational Linguistics.
Cite (Informal): Focus Your Attention (with Adaptive IIR Filters) (Lutati et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.772.pdf
Video: https://aclanthology.org/2023.emnlp-main.772.mp4