Ask HN: Analog Model of Transformers

(Sorry if this is a stupid question)

Roughly 80 years ago, many computers used analog computation: for example, an amplifier with variable resistors could multiply and divide.

Matrix multiplication seems to be central to the Transformer architecture.

I wonder whether it would make sense to build a sort of Transformer on a vaguely similar concept?

If you wonder why I'm asking: after all, there are still people who design digital computers with tubes or even mechanical relays, so why not analog Transformers?

https://hackaday.com/2023/12/05/a-single-board-computer-with-vacuum-tubes/

https://hackaday.io/project/189725-homebrew-16-bit-relay-computer
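For what it's worth, the analog idea maps naturally onto a resistive crossbar: store the weights as conductances, apply the inputs as voltages, and Kirchhoff's current law sums the products on each output wire for free. A minimal sketch of that idealized model (all numbers here are illustrative, not from any real hardware):

```python
import numpy as np

# Idealized resistive crossbar: weights stored as conductances (siemens),
# inputs applied as voltages. Ohm's law gives per-cell currents G*V, and
# Kirchhoff's current law sums each output wire: I = G.T @ V -- one
# matrix-vector multiply done "for free" by the physics.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # conductances, 4 input rows x 3 output columns
V = rng.uniform(0.0, 0.5, size=4)         # input voltages

I_analog = G.T @ V            # currents an ideal crossbar would produce
I_digital = np.dot(G.T, V)    # the same multiply done digitally

print(np.allclose(I_analog, I_digital))  # True: the physics is linear algebra
```

This is exactly why analog matrix multipliers are attractive: the whole multiply-accumulate happens in one step of circuit physics rather than many digital operations.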

7 points | by JPLeRouzic 2 hours ago

1 comment

  • SaberTail 6 minutes ago
    The term I've heard for this sort of thing is "physical neural networks" (PNNs). My impression is that one of the big things holding them back is that, because we can't manufacture components to perfect tolerances, you can't train a single model once and reuse it the way you can with digital logic. Even if you get close, every individual circuit needs some amount of tuning. And we haven't worked out great ways to train them.

    There's a lot of research going on in this space though, because yeah, nature can solve certain mathematical problems more efficiently than digital systems.

    There's a decent review article that came out recently: https://www.nature.com/articles/s41586-025-09384-2 or https://arxiv.org/html/2406.03372v1
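The mismatch problem described above can be made concrete with a toy simulation: perturb each stored conductance by a few percent, as imperfect fabrication would, and two "copies" of the same model deviate from the ideal in different ways, so each physical chip would need its own calibration (all numbers here are illustrative):

```python
import numpy as np

# Toy illustration of device mismatch: the same trained "model"
# (a conductance matrix) fabricated twice, each copy perturbed
# by roughly 5% multiplicative random error.
rng = np.random.default_rng(1)
G_ideal = rng.uniform(1e-6, 1e-4, size=(4, 3))  # trained weights as conductances
V = rng.uniform(0.0, 0.5, size=4)               # input voltages

copy_a = G_ideal * (1 + 0.05 * rng.standard_normal(G_ideal.shape))
copy_b = G_ideal * (1 + 0.05 * rng.standard_normal(G_ideal.shape))

I_ideal = G_ideal.T @ V
err_a = np.max(np.abs(copy_a.T @ V - I_ideal) / np.abs(I_ideal))
err_b = np.max(np.abs(copy_b.T @ V - I_ideal) / np.abs(I_ideal))

# Each physical copy deviates differently, so one digital weight file
# can't simply be "flashed" onto many analog chips without per-chip tuning.
print(f"copy A max relative error: {err_a:.1%}")
print(f"copy B max relative error: {err_b:.1%}")
```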