Association
Transformers are special cases of GATs
Transformers can be represented as graph attention networks (GATs) whose attention graph is fully connected, so that every token attends to every other token (see the sketch below).
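The equivalence can be checked mechanically. Below is a minimal sketch (NumPy; all names are illustrative, not from the source) that computes scaled dot-product self-attention twice: once in the dense Transformer matrix form, and once as GAT-style message passing over an explicit, fully connected edge list. Note this follows the quoted claim's framing by using dot-product scoring in the message-passing view, rather than the additive scoring of the original GAT paper; the point is that full connectivity makes the two computations identical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 8                       # n tokens (graph nodes), model width d
X = rng.normal(size=(n, d))       # node features = token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# --- Transformer view: dense matrix form ---------------------------------
Q, K, V = X @ Wq, X @ Wk, X @ Wv
dense = softmax(Q @ K.T / np.sqrt(d)) @ V

# --- GAT view: message passing over an explicit edge list ----------------
# Fully connected attention graph: an edge (i, j) for every node pair,
# including self-loops, mirroring self-attention's all-to-all pattern.
edges = [(i, j) for i in range(n) for j in range(n)]
gat = np.zeros_like(dense)
for i in range(n):
    nbrs = [j for (src, j) in edges if src == i]        # here: all nodes
    scores = np.array([Q[i] @ K[j] / np.sqrt(d) for j in nbrs])
    alpha = softmax(scores)                             # per-node softmax
    gat[i] = sum(a * V[j] for a, j in zip(alpha, nbrs)) # weighted message sum

# Identical outputs: the Transformer is the fully connected special case.
assert np.allclose(dense, gat)
```

With a sparser edge list, the same loop computes ordinary masked graph attention; restoring all edges recovers the Transformer exactly.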
Confidence
90%
Status
active
Evidence Quote
“Transformers are a special case of GATs with fully connected attention graphs.”
Relationship
Transformers → subtype of → Graph Attention Networks (GATs)
Connections (3)
Evidence
“Reference to Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I. (2017)”, i.e., the Transformer paper “Attention Is All You Need”.