Explain Like I'm 5
Teaching a computer to recognize handwritten numbers is like teaching a friend to spot your drawing! 🎨
- 📷 Each number is a picture made of tiny squares (pixels)
- 🧠 The computer learns a "perfect example" of each digit 0-9
- ✨ NMN neurons are special: they remember the BEST version of each number!
Try drawing below, and the computer will guess what number you wrote! ✏️
✏️ Draw a Digit (0-9)
🔬 Prototype Learning Analysis
When training on MNIST, NMN neurons learn class prototypes: idealized representations of each digit. The learned prototypes are:
- ✅ Maximally parallel to their class's data distribution
- ✅ As close as possible to the class centroid
- ✅ In superposition states when intra-class variance is high
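These properties can be sketched numerically. The toy example below is an assumption-laden stand-in, not the actual NMN training rule: it models the neuron's response as the squared dot product $(w \cdot x)^2$ (as described later on this page) and takes the prototype to be the unit direction maximizing the mean squared response over its class, which is the top eigenvector of the class's second-moment matrix. The data is synthetic rather than MNIST.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one digit class: 100 samples, 64 "pixels",
# clustered around a shared mean pattern (hypothetical data).
X = rng.normal(0.0, 0.1, size=(100, 64)) + rng.normal(size=64)

# Prototype = unit direction maximizing the mean squared response
# E[(w . x)^2], i.e. the top eigenvector of the second-moment matrix.
M = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(M)
w = eigvecs[:, -1]  # eigenvector for the largest eigenvalue

def nmn_response(w, x):
    """Squared-dot-product response of an NMN-style neuron (assumed form)."""
    return float(np.dot(w, x) ** 2)

# The prototype responds far more strongly to in-class samples
# than to random inputs: it is "maximally parallel" to the class.
in_class = np.mean([nmn_response(w, x) for x in X])
off_class = np.mean([nmn_response(w, x) for x in rng.normal(size=(100, 64))])
print(in_class > off_class)  # True: the prototype is tuned to its class
```

The eigenvector formulation is one concrete reading of "maximally parallel to the data distribution"; the real layer's learning dynamics may differ.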
Superposition States
When minimizing distance becomes difficult (high intra-class variance), an NMN prototype can settle into a superposition state, blending several modes of the class into a single template.
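As a hedged illustration of what such a state can look like, consider a toy class with two "handwriting styles". Assuming the prototype is the unit direction maximizing the mean squared response $E[(w \cdot x)^2]$ over the class (the top eigenvector of its second-moment matrix, an assumption rather than the documented NMN rule), the single learned prototype ends up correlated with both modes at once:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical bimodal digit class: two nearby "style" modes a and b.
a = rng.normal(size=64); a /= np.linalg.norm(a)
u = rng.normal(size=64); u /= np.linalg.norm(u)
b = a + 0.5 * u; b /= np.linalg.norm(b)

X = np.concatenate([
    a + 0.05 * rng.normal(size=(50, 64)),  # style-1 samples
    b + 0.05 * rng.normal(size=(50, 64)),  # style-2 samples
])

# Prototype: unit direction maximizing E[(w . x)^2] over the class,
# i.e. top eigenvector of the second-moment matrix X^T X / n.
M = X.T @ X / len(X)
w = np.linalg.eigh(M)[1][:, -1]

# One prototype in "superposition": strongly aligned with BOTH modes
# (chance-level |cosine| in 64 dimensions is only about 0.1).
cos_a, cos_b = abs(np.dot(w, a)), abs(np.dot(w, b))
print(cos_a > 0.8 and cos_b > 0.8)  # True: one template covers both styles
```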
Robustness to Inversion
An interesting property: the squared dot product $(w \cdot x)^2$ gives inverted inputs the same response as the originals, since $(w \cdot (-x))^2 = (-(w \cdot x))^2 = (w \cdot x)^2$.
This provides natural robustness to certain image transformations!
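This symmetry is easy to verify directly. In the quick check below, the prototype and input are random stand-ins (not trained weights), and "inversion" is taken as sign negation of the input, which is what the formula implies; for pixels stored in $[0, 1]$, brightness inversion only matches this after mean-centering.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=784)  # stand-in for a learned 28x28 prototype
x = rng.normal(size=784)  # stand-in for a flattened input image

# Negating every pixel flips the sign of w . x, and squaring
# discards that sign, so the two responses match exactly.
original = np.dot(w, x) ** 2
inverted = np.dot(w, -x) ** 2
print(original == inverted)  # True
```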
Representation Quality
| Metric | Linear + ReLU | NMN Layer |
|---|---|---|
| Class Separability | Hyperplane boundaries | Curved vortex boundaries |
| Prototype Interpretability | Abstract directions | Visible digit templates |
| Inversion Robustness | Not inherent | Built-in via $(\cdot)^2$ |
| Decision Geometry | Linear | Non-linear, localized |
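To make the inversion-robustness and decision-geometry rows concrete, here is a minimal side-by-side of a linear + ReLU unit and a squared-dot-product unit, using toy numbers rather than trained weights:

```python
import numpy as np

w = np.array([1.0, -2.0, 0.5])  # toy weights
x = np.array([0.3, 0.1, -0.7])  # toy input; here w . x is negative

relu_out = max(np.dot(w, x), 0.0)  # linear + ReLU: one-sided
nmn_out = np.dot(w, x) ** 2        # NMN-style: sign-symmetric

# The ReLU unit silences this input entirely yet fires for its negation;
# the squared unit gives the same positive response to both.
print(relu_out)                       # 0.0
print(nmn_out == np.dot(w, -x) ** 2)  # True
```

The one-sided versus sign-symmetric behavior is what the table's "hyperplane boundaries" versus "non-linear" geometry contrast refers to: a ReLU unit splits input space with a single hyperplane, while a squared unit responds along a direction regardless of sign.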