A “doubly-even self-dual linear binary error-correcting block code,” a construction from the theory of error correction that grew out of Claude Shannon’s work in the 1940s, has been discovered embedded WITHIN the equations of superstring theory!
Why would nature have this? What errors does it need to correct? What counts as an ‘error’ for nature? More importantly, what explains this freakish discovery? Your guess is as good as mine.
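If the jargon is opaque: “doubly even” means every codeword’s Hamming weight is a multiple of 4, and “self-dual” means the code equals its own dual (every pair of codewords is orthogonal mod 2, and the dimension is exactly half the block length). The smallest example is the [8,4] extended Hamming code, and a few lines of Python can check both properties directly (the generator matrix below is one standard choice):

```python
from itertools import product

# Generator matrix of the [8,4] extended Hamming code,
# the smallest doubly-even self-dual binary code.
G = [
    [1, 0, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

def codewords(G):
    """Enumerate all 2^k codewords spanned by the rows of G (mod 2)."""
    for msg in product([0, 1], repeat=len(G)):
        yield tuple(sum(m * g for m, g in zip(msg, col)) % 2
                    for col in zip(*G))

words = list(codewords(G))

# Doubly even: every codeword's weight is divisible by 4.
assert all(sum(w) % 4 == 0 for w in words)

# Self-dual: all rows of G are pairwise orthogonal mod 2 (so the code
# is self-orthogonal), and dimension k = n/2 then forces C = C-perp.
assert all(sum(a * b for a, b in zip(r1, r2)) % 2 == 0
           for r1 in G for r2 in G)

print(len(words), "codewords; weights:", sorted(set(map(sum, words))))
```

Running it confirms 16 codewords whose weights are all 0, 4, or 8, i.e. every weight divisible by 4. The codes Gates reports finding in the supersymmetry algebra are of this same family.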
1.) Recent NPR interview with Professor Gates: http://being.publicradio.org/programs/2012/codes-for-reality/gates-symbolsofp…
2.) Gates’s original paper: http://arxiv.org/abs/0806.0051
3.) A potential explanation, Bostrom’s Simulation Hypothesis: http://www.simulation-argument.com/simulation.html