# AI Reading Club - Where models store factual and linguistic knowledge
**Event date:** 14/05/2026
* Published on 14/05/2026


### Image gallery
![1.png](https://firebasestorage.googleapis.com/v0/b/memory-ai.appspot.com/o/prod%2FrKxsdSTpqCfzIFY8Y2hg%2FprojectsMedias%2FY73LtMJp0LdXzipDaFOI%2Fthumbs%2F1_1600x900.png?alt=media&token=c2bbb009-b794-4794-bafb-6f38b7754e8b) 

### City
`#Paris` 

## Description
**Transformer Feed-Forward Layers Are Key-Value Memories** (2020)
Paper: https://arxiv.org/abs/2012.14913

After several sessions focused on attention, this paper shifts the focus to another major part of the Transformer: the feed-forward layers. The authors show that these layers can be interpreted as key-value memories, where one projection detects meaningful input patterns and the other contributes the associated output information. We will discuss what this means for understanding where models store factual and linguistic knowledge, why attention alone is not enough to explain Transformer behaviour, and how this paper changed the direction of interpretability work.

**Session format**
- 10-15 minute overview by the discussion lead
- About 45 minutes of group discussion
- Discussion lead: TBD

**Discussion prompts**
- If feed-forward layers store knowledge, what changes about how we think models remember facts?
- How convincing is the key-value memory interpretation, and where might it break down?
- What does this paper add after reading BERT attention and "Attention is not Explanation"?

Join Discord: https://discord.gg/5rAMsuVXXp
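To make the key-value reading concrete: in the paper's framing, the rows of the first feed-forward projection act as "keys" that score how strongly the input matches stored patterns, and the rows of the second projection are the "values" mixed in proportion to those scores. A minimal NumPy sketch of this view (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def ffn_as_kv_memory(x, keys, values):
    """Feed-forward layer written as a key-value memory lookup.

    x:      (d_model,)       hidden state for one token
    keys:   (d_ff, d_model)  rows of the first projection (pattern detectors)
    values: (d_ff, d_model)  rows of the second projection (stored outputs)
    """
    # Each memory coefficient is the (ReLU-activated) match between
    # the input and one key; biases are omitted for clarity.
    memory_coeffs = np.maximum(keys @ x, 0.0)   # (d_ff,)
    # The output is a weighted sum of value vectors.
    return memory_coeffs @ values               # (d_model,)

# Tiny demo with random weights.
rng = np.random.default_rng(0)
d_model, d_ff = 4, 8
x = rng.standard_normal(d_model)
K = rng.standard_normal((d_ff, d_model))
V = rng.standard_normal((d_ff, d_model))
out = ffn_as_kv_memory(x, K, V)
print(out.shape)  # (4,) — same dimensionality as the input
```

This is algebraically the standard FFN sublayer, `f(x W1^T) W2`; the paper's contribution is interpreting the rows of `W1` and `W2` as keys and values, not changing the computation.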

**Event link:** [https://luma.com/event/evt-0zYwvAZhZVTcyTw](https://luma.com/event/evt-0zYwvAZhZVTcyTw)

### Country
`#France` 

### Continent
`#Europe` 

**Associated media:**
[Media 1](https://80954c1d.sibforms.com/serve/MUIFABojU8UBbDiX_TdcGa7Wv5VMoVB_nBZ92mkLkGlS1pJLpP7s-pVJusyN-7cG9KPrSuv3fv7TmXwuw_AoyNUShR8jZhmNDgUbZPJO2V5xYXlNz4YXOTjSb8X7Lj7PRIPzgzEWlLbA4f4uw_F8RM51EUsjSfQQko0qaby98GHMdYJVWLIXd5JzzaXBGmqN2CcYOFuqnbnaYEnw) 

## event_id
evt-0zYwvAZhZVTcyTw@events.lu.ma

### Tools
`#Transformer` `#BERT` 



---
### Navigation for AI
- [Index of all content](https://ai-memory.io/llms.txt)
- [Sitemap](https://ai-memory.io/sitemap.xml)
- [Back to home](https://ai-memory.io/)
