!Note
|-
|'''[[Attention Is All You Need]]''' || 2017/06/12 || [[arxiv:1706.03762]] || influential paper that introduced [[Transformer]]
|-
|'''[[An Image is Worth 16x16 Words]]''' || 2020/10/22 || [[arxiv:2010.11929]] || Transformers for Image Recognition at Scale - [[Vision Transformer]] ([[ViT]])
|-
|'''[[Block-Recurrent Transformers]]''' || 2022/03/11 || [[arxiv:2203.07852]] ||
|-
|'''[[Language Models are Few-Shot Learners]]''' || 2020/05/28 || [[arxiv:2005.14165]] || [[GPT]]
|-
|'''[[Memorizing Transformers]]''' || 2022/03/16 || [[arxiv:2203.08913]] ||
|-
|'''[[MobileViT]]''' || 2021/10/05 || [[arxiv:2110.02178]] || Light-weight, General-purpose, and Mobile-friendly Vision Transformer
|-
|'''[[OpenAI CLIP]]''' || 2021/02/26 || [[arxiv:2103.00020]]<br>[https://openai.com/blog/clip/ OpenAI Blog] || Learning Transferable Visual Models From Natural Language Supervision
|-
|'''[[STaR]]''' || 2022/03/28 || [[arxiv:2203.14465]] || Bootstrapping Reasoning With Reasoning
|-
|'''[[Transformer-XL]]''' || 2019/01/09 || [[arxiv:1901.02860]] || Attentive Language Models Beyond a Fixed-Length Context
|}
===Others===
https://arxiv.org/abs/2301.13779 ([[FLAME: A small language model for spreadsheet formulas]]) - Small model specifically for spreadsheets by [[Microsoft]]