All public logs
Combined display of all available logs of AI Wiki. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).
- 13:27, 18 March 2023 Walle created page Self-attention (also called self-attention layer) (Created page with "{{see also|Machine learning terms}} ==Introduction== Self-attention, also known as the self-attention layer, is a mechanism used in machine learning models, particularly in deep learning architectures such as Transformers. It enables the models to weigh and prioritize different input elements based on their relationships and relevance to one another. Self-attention has been widely adopted in various applications, including nat...")
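
To make the weighting mechanism the entry describes concrete, below is a minimal NumPy sketch of scaled dot-product self-attention, where each output is a relevance-weighted mix of all inputs. The projection matrices `Wq`, `Wk`, `Wv` and the toy dimensions are illustrative assumptions, not taken from the created page.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q = X @ Wq  # queries: what each token is looking for
    K = X @ Wk  # keys: what each token offers
    V = X @ Wv  # values: the content to be mixed
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise relevance of every token to every other
    # Softmax over each row turns scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted combination of all value rows

# Toy usage (hypothetical sizes): 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per input token
```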