Welcome to Barstool Bits, a weekly short column meant to supplement the long-form essays that appear only two or three times a month from analogy magazine proper. You can opt out of Barstool Bits by clicking on Unsubscribe at the bottom of your email and toggling off this series. If, on the other hand, you’d like to read past Bits, click here.
The main theme of analogy magazine is an ongoing consideration of the role of metaphor in scientific thinking. I mentioned some time back that I’d been reading Stephen C. Meyer’s book Signature in the Cell: DNA and the Evidence of Intelligent Design, and that I wished at some point to consider the central analogy in play—information. According to Meyer, DNA is a trove of information directing protein production. DNA is a code—very much like a computer code—that specifies the materials to be gathered and their arrangement into useful products like enzymes. Meyer is hardly alone in settling on the information metaphor when it comes to DNA. It goes largely unquestioned that DNA is a code. But Meyer is careful enough to spend some time considering whether it’s just a metaphor; his whole thesis depends on the computing analogy. I should mention from the start (as usual) that I have no skin in this game: I’m not arguing the secondary issue here, which is whether the DNA-RNA-ribosome apparatus is evidence of God’s handiwork. I’m concerned here solely with the information analogy—which, apparently, isn’t just a metaphor.
The first problem, as Jeffery Donaldson has pointed out, is the view of metaphor as a mere accessory or, at best, a thought pump. The role of associative language is far more affective and inescapable. One doesn’t get to use metaphors and then denigrate and discard them without losing the thread completely. Reasoning that engages in this sort of process is delusional. It’s like building one’s house on clouds, then removing the clouds supporting the house and delighting that it still stands... like something out of a Road Runner cartoon. The clouds just helped me put it all together, sort of like a foundation (but not really!), but they were just clouds and could be erased later without consequence. Surely, there’s something pathological here.
But to the point: metaphors both shape and drive scientific paradigms. Note, for instance, how we’ve been seeing everything as a machine since the Industrial Revolution. And science has never lost the habit of wanting to understand the “mechanism” behind phenomena. Now we’re seeing everything as information in hardware and software. Suddenly, science has decided that the whole cosmos is information. Black holes are information. (Oddly, we’ve so far skipped the electrical analogy—at least officially.) So it strikes me as naive that Meyer should settle on the idea that in the case of DNA, it’s not a metaphor: it’s an objective fact.
How does he establish this fact? Probability... probably because, in our sciency world, everyone wants to see your math, even if the math is essentially cosmetic to your purposes. What fascinates me is the willy-nilly application of probability to all problems. Once you notice the ubiquity of the practice and begin to question its relevance, the whole game starts to look like modern divination. What on earth could probability have to do with information? In the computing world: everything!
Information theory begins with the mathematician Claude Shannon, whose 1937 master’s thesis has been called “possibly the most famous master’s thesis of the century”; Meyer tells us “it eventually became the foundation for digital-circuit and digital-computer theory” (87). It was in his 1948 paper, “A Mathematical Theory of Communication,” that Shannon formulated a way to break information down and measure it in “bits,” and information theory has ever since treated information as the reduction of uncertainty. The reasoning goes that “the amount of information conveyed by an event is inversely proportional to the probability of its occurrence” (89). Meyer breaks it down as follows:
The probability of getting heads in a single flip of a fair coin is 1 in 2. The probability of getting four heads in a row is 1/2 x 1/2 x 1/2 x 1/2, that is (1/2)⁴ or 1/16. Thus the probability of attaining a specific sequence of heads and tails decreases as the number of trials increases. And the quantity of information increases correspondingly. This makes sense. A paragraph contains more information than the individual sentences of which it is composed; a sentence contains more information than the individual words in the sentence. All other things being equal, short sequences have less information than long sequences. (89)
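For the curious, the arithmetic is easy to check. Here is a minimal sketch in Python (my illustration, not Meyer’s) of the measure just described: an event of probability p carries log2(1/p) bits, so four heads in a row, at probability 1/16, comes to exactly 4 bits.

```python
import math

def surprisal_bits(probability: float) -> float:
    """Shannon's measure: an event of probability p carries log2(1/p) bits."""
    return math.log2(1 / probability)

# A single flip of a fair coin: p = 1/2, so 1 bit.
print(surprisal_bits(1 / 2))          # 1.0

# Four heads in a row: p = (1/2)**4 = 1/16, so 4 bits.
print(surprisal_bits((1 / 2) ** 4))   # 4.0

# Less probable (longer) specific sequences score more bits.
print(surprisal_bits((1 / 2) ** 10))  # 10.0
```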
It really doesn’t make sense. But failure of sense never stopped a good theory any more than the truth ever got in the way of a good story. I’m not saying, mind you, that this isn’t a useful approach to quantifying “information” in computing. Surely, it has proven productive in the field. But now we are asked to transpose unconscious CPU processing of binary language onto our own human, quality-based, conscious thinking. A well-crafted line of poetry, or any good prose, contains more than the sum of its letters, words, phrases, sentences and paragraphs. Why does the sciency mind stop before considering things like ambiguity and irony? (I’ve addressed this elsewhere.) So information, “all other things being equal” (whatever that means), is not at all about reducing uncertainty by stringing letters or numbers together. There may in fact be a kind of information that does this, and it might have something to do with software coding or instructions in industrial manufacture, but at best we’re talking about a very narrow subset of information.
Meyer is no slouch, and he realises that Shannon information is incomplete because it lacks specificity. Shannon information applies to any sequence of letters or numbers, not necessarily to meaningful or useful information. Applied to DNA, Shannon’s equations can quantify its information-carrying capacity (see 108). But what’s the point? No doubt, we might extract some nerdy comparisons, like how that stacks up against the information in our galaxy or in the whole universe. But if you’re like me, you might wonder whether this isn’t a little too reminiscent of numerology. In any case, we need what Meyer calls “specified complexity” (106). It’s not enough to have 10 random digits to make a phone call: we need a specific sequence that works (see 105). According to Shannon’s formula, any random 10-digit sequence carries as much information as any other, 33.2 bits in all. But this calculation will not help you organise 10 digits into a working phone number.
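The 33.2-bit figure is easy to verify, and so is the measure’s blindness to which sequence actually works. A minimal sketch (both phone numbers below are made up for illustration):

```python
import math

def carrying_capacity_bits(sequence: str, alphabet_size: int) -> float:
    """Information-carrying capacity of a sequence drawn uniformly from an
    alphabet: log2(alphabet_size) bits per character, times the length."""
    return len(sequence) * math.log2(alphabet_size)

working_number = "5145550123"  # hypothetical number that "works"
random_digits  = "9072183465"  # arbitrary digits that reach no one

# Shannon's measure scores both identically: 10 * log2(10) = 33.219... bits.
print(carrying_capacity_bits(working_number, 10))
print(carrying_capacity_bits(random_digits, 10))
```

Both strings score identically; nothing in the calculation distinguishes the one that connects a call. That gap is precisely what “specified complexity” is meant to name.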
Meyer concludes:
human artifacts and technology—paintings, signs, written text, spoken language, ancient hieroglyphics, integrated circuits, machine codes, computer hardware and software—exhibit specified complexity; among those, software and its encoded sequences of digital characters function in a way that most closely parallels the base sequences in DNA.
Thus, oddly, at nearly the same time that computer scientists were beginning to develop machine languages, molecular biologists were discovering that living cells had been using something akin to machine code or software all along. To quote the information scientist Hubert Yockey again, “The genetic code is constructed to confront and solve the problems of communication and recording by the same principles found . . . in modern communication and computer codes.” Like software, the coding regions of DNA direct operations within a complex material system via highly variable and improbable, yet also precisely specified, sequences of chemical characters. (110)
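To see why the software parallel tempts computer-minded readers, it helps to note that the standard genetic code really can be written down as a lookup table, the sort of structure any programmer would recognise. The fragment below is my own illustration, not Meyer’s; the real table maps all 64 three-letter codons, of which I show only a handful.

```python
# A fragment of the standard genetic code: DNA codons -> amino acids.
# (The full table has 64 codons; this excerpt is for illustration only.)
CODON_TABLE = {
    "ATG": "Met",                 # methionine; also the usual "start" signal
    "TGG": "Trp",                 # tryptophan
    "GCT": "Ala", "GCC": "Ala",   # alanine (the code is redundant)
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",  # termination signals
}

def translate(dna: str) -> list[str]:
    """Read a coding sequence three letters at a time until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "?")  # "?": not in this excerpt
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

print(translate("ATGGCTTGGTAA"))  # ['Met', 'Ala', 'Trp']
```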
“Oddly” indeed, and strange how this pattern of finding our technology in ourselves and in the heavens seems to follow science through history. Argue all he likes, Meyer is on shaky ground here. That said, his book is wonderful and worth reading. Meyer faces down science on its own terms, pointing out that if you follow its string of analogies to the end, you wind up with organic machines driven by computer systems. He plays by the rules and makes it clear that he is doing so. What I’m after is the fact that the game itself rests on a bunch of hokey ideas derived from unacknowledged analogies, which can’t just be evaporated once the paradigm is afoot.
Consider, for instance, an ecological analogy. Why not view the internal processes of a cell as a kind of ecosystem? Is a bee that pollinates a flower carrying information from stamen to pistil? Sure, it might be framed that way. But is that an interpretation or a fact? When a river moves a broken tree branch from one place to another and a beaver uses it to build a dam, is that to be read as information delivery? And let’s say we do decide to read the phenomena in play in that manner—is that the end of it? Is it a fact that it must be read that way, or that this is the only correct reading? But back to the ecosystem analogy on its own terms. Surely, such a perspective would provide a productive view of the interdependent functioning of the various elements of a cell. I’m not ready to explore what sort of productivity might emerge just now, but I place the notion at your feet for further consideration.
In a recent article by the astrophysicist Adam Frank, we learn the following concerning an interpretation of quantum mechanics called QBism:
A state is not something a particle “has” as a property, like the way a house has the property of being painted blue. Instead, quantum states are about our knowledge of the world. They are descriptions encoding our interactions with particles. QBism would say it’s not the particle’s state — it’s your state about the particle.
This statement—“it’s your state about the particle”—almost gets us where analogy magazine keeps pointing. Slowly, scientists are beginning to realise that our perspective, the operational paradigm, colours our findings: hence, “quantum states are about our knowledge of the world.” As you can see, however, the language leaning in that direction is still opaque, still couched in an idiom that has no vocabulary for the metaphors and analogies underlying its biased formulations. The phrase “it’s your state about the particle” hardly makes sense. In fact, I’d call it “bad poetry.” But the connection between bad poetry and bad science is its own beehive, best left for another Barstool Bit.
Asa Boxer’s poetry has garnered several prizes and is included in various anthologies around the world. His books are The Mechanical Bird (Signal, 2007), Skullduggery (Signal, 2011), Friar Biard’s Primer to the New World (Frog Hollow Press, 2013), Etymologies (Anstruther Press, 2016), Field Notes from the Undead (Interludes Press, 2018), and The Narrow Cabinet: A Zombie Chronicle (Guernica, 2022). Boxer is also a founder of and editor at analogy magazine.
'Information' is a prime example of a science metaphor whose senses often get mixed up, especially when it is used by people trying to sound scientific to lend their argument authority. They think we live in a 'scientific age', after all, and that we're supposed to be 'scientific' in the way we think! Whatever that means, it's fascinating to hear the word 'information' used so often by those who don't necessarily mean it in the narrow 'information theory' sense, but more generally. The notion that everything is 'information' nowadays permeates our mental world, and I wonder to what extent this way of thinking resonates with, or is in some way a (secular) reversion to, more primitive, animistic ways of thinking. I understand that 'information' at its root means teaching, instruction, guidance. So popular discourse seems to imply that communicating knowledge is a basic function of matter, as if echoing a fragment of some lost intuition.