Association·arcadia

Transformers are special cases of GATs

Transformers can be represented as graph attention networks (GATs) whose attention graph is fully connected: each token is a node, and because every token attends to every other token, standard self-attention is exactly graph attention computed over the complete graph on the tokens.
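To make the claim concrete, here is a minimal sketch in NumPy (single head, no learned projections; the function names graph_attention and self_attention are illustrative, not from any library). Graph attention masks out non-edges before the softmax; when the adjacency matrix is the complete graph, no score is masked and the result matches standard scaled dot-product self-attention exactly.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # stabilize before exponentiating
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(Q, K, V, adj):
    """Attention where node i may only attend to nodes j with adj[i, j] == 1."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(adj.astype(bool), scores, -np.inf)  # mask non-edges
    return softmax(scores) @ V

def self_attention(Q, K, V):
    """Standard Transformer scaled dot-product attention (no mask)."""
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

rng = np.random.default_rng(0)
n, d = 5, 8                        # 5 tokens / nodes, feature width 8
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))

complete_graph = np.ones((n, n))   # fully connected: every token attends to every token

assert np.allclose(self_attention(Q, K, V),
                   graph_attention(Q, K, V, complete_graph))
print("Self-attention == graph attention on the complete graph")
```

Passing a sparser adjacency matrix to the same function restricts attention to graph neighborhoods, which is the GAT setting; the Transformer is recovered as the special case adj = complete graph.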

Confidence
90% (active)

Evidence Quote

Transformers are a special case of GATs with fully connected attention graphs.

Relationship

Transformers are a subtype of Graph Attention Networks (GATs)

Evidence

Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I. (2017). Attention Is All You Need. doi:10.48550/ARXIV.1706.03762