My implementation of the gMLP model from the paper "Pay Attention to MLPs".

antonyvigouret/Pay-Attention-to-MLPs

Pay-Attention-to-MLPs

Implementation of the gMLP model introduced in Pay Attention to MLPs.

The authors propose gMLP, a simple attention-free architecture built entirely from MLPs with gating, and show that it can match Transformers in key language and vision applications.
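The core of the architecture is the Spatial Gating Unit (SGU): the expanded hidden representation is split channel-wise into two halves, one half is mixed across token positions by a learned spatial projection, and the result gates the other half elementwise. Below is a minimal NumPy sketch of one gMLP block under that description; all names (`spatial_gating_unit`, `gmlp_block`, the `params` layout) are illustrative assumptions, not the API of this repository.

```python
import numpy as np

def gelu(x):
    """Tanh approximation of the GELU activation."""
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def layer_norm(x, eps=1e-5):
    """Normalize over the channel (last) dimension."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def spatial_gating_unit(h, W_s, b_s):
    """Split channels into (u, v); mix v across token positions; gate u with v.

    h:   (n_tokens, d_ffn) hidden activations
    W_s: (n_tokens, n_tokens) spatial projection over token positions
    b_s: (n_tokens, 1) spatial bias
    """
    u, v = np.split(h, 2, axis=-1)
    v = layer_norm(v)
    v = W_s @ v + b_s        # linear mixing along the sequence axis
    return u * v             # elementwise gating

def gmlp_block(x, params):
    """One gMLP block: norm -> expand -> GELU -> SGU -> project -> residual."""
    shortcut = x
    h = layer_norm(x)
    h = gelu(h @ params["U"])                                   # d_model -> d_ffn
    h = spatial_gating_unit(h, params["W_s"], params["b_s"])    # d_ffn -> d_ffn/2
    h = h @ params["V"]                                         # d_ffn/2 -> d_model
    return h + shortcut

# Usage sketch: the paper initializes W_s near zero and b_s at one,
# so the SGU initially passes u through like a plain FFN.
rng = np.random.default_rng(0)
n_tokens, d_model, d_ffn = 8, 16, 32
params = {
    "U": rng.normal(size=(d_model, d_ffn)) * 0.02,
    "V": rng.normal(size=(d_ffn // 2, d_model)) * 0.02,
    "W_s": np.zeros((n_tokens, n_tokens)),
    "b_s": np.ones((n_tokens, 1)),
}
x = rng.normal(size=(n_tokens, d_model))
y = gmlp_block(x, params)
assert y.shape == (n_tokens, d_model)
```

With that initialization, the spatial gate evaluates to one everywhere, so the block starts out equivalent to a gated-off feed-forward layer and learns cross-token interactions only as `W_s` moves away from zero.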
