Alejandro Crosa

Remixing is all you need

Sometimes breakthroughs emerge by reconfiguring existing ideas rather than inventing entirely new ones.

Bitcoin

Bitcoin, introduced in the 2008 white paper by Satoshi Nakamoto [1], has been hailed as revolutionary. Yet the paper didn’t introduce novel components. Cryptographic primitives like hash functions, proof-of-work mechanisms (inspired by Hashcash [2]), and distributed consensus protocols already existed. Nakamoto’s contribution was an innovative integration of these elements into a cohesive system that solved the double-spending problem without requiring a trusted intermediary.
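
To make the proof-of-work piece concrete, here is a minimal sketch of the Hashcash-style puzzle Bitcoin reuses: keep incrementing a nonce until the hash of the block data meets a difficulty target. The function names and the leading-zero difficulty are illustrative assumptions, not Bitcoin’s actual target encoding or consensus code.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Find a nonce so that SHA-256(block_data + nonce) starts with
    `difficulty` leading zero hex digits (a toy stand-in for Bitcoin's target)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    """Verification is cheap: one hash, checked against the same target."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, digest = mine("prev_hash|transactions")
assert verify("prev_hash|transactions", nonce)
print(nonce, digest)
```

The asymmetry is the point: finding a valid nonce is expensive, but any node can check it with a single hash, which is what lets mutually untrusting participants converge on one transaction history.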

Transformers

The 2017 introduction of Transformers in Attention Is All You Need [3] transformed artificial intelligence, yet its cornerstone, the attention mechanism, wasn’t new. Bahdanau et al. [4] first used attention in conjunction with Recurrent Neural Networks (RNNs) for machine translation in 2014.

What made Transformers different? They discarded RNNs entirely, opting for self-attention and parallel computation. This reconfiguration dramatically improved scalability and efficiency. The result was a model architecture that has since powered groundbreaking systems like GPT. The key wasn’t invention but reimagining and optimizing a known mechanism in a new context.
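
For a sense of what that reconfiguration looks like, below is a rough sketch of scaled dot-product self-attention (single head, no masking or learned biases), written in plain NumPy rather than any particular framework. The shapes and random weights are illustrative assumptions, not the paper’s full multi-head architecture.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model).

    Every position attends to every other position in one matrix product,
    so there is no step-by-step recurrence to wait on.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv             # project inputs to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])      # pairwise similarity, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                           # weighted sum of values

# Toy usage: 5 tokens, model dimension 8 (random weights stand in for learned ones).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (5, 8)
```

Because the attention weights for every pair of positions come out of a single matrix product, the whole sequence is processed at once; an RNN would have to step through the tokens sequentially.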

Reinvention Over Invention

Both Bitcoin and Transformers illustrate that groundbreaking advances don’t necessarily demand entirely new inventions. Instead, they emerge when existing ideas are remixed, reconfigured, or applied in novel ways. This pattern suggests that technological leaps are often about recognizing the latent potential in what already exists.

Revolution isn’t about inventing a new wheel—it’s about learning to roll it differently.

Footnotes

  1. Satoshi Nakamoto, “Bitcoin: A Peer-to-Peer Electronic Cash System,” 2008.

  2. Adam Back, “Hashcash: A Denial of Service Counter-Measure,” 2002.

  3. Ashish Vaswani et al., “Attention Is All You Need,” 2017.

  4. Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio, “Neural Machine Translation by Jointly Learning to Align and Translate,” 2014.