One of the weirdest things I learned as a Boy Scout is that concealment in the dark is relative.
That is, if you’re trying to hide in a shadow, it doesn’t actually matter how dark your own spot is, so long as the spot where the person trying to see you is standing is comparatively brighter. It’s why you keep guard towers dark and light the area around them.
It’s always nice when the laws of optics make a neat metaphor.
The problem with neat metaphors is that a writer, once handed one, will seek to make use of it, regardless of whether it’s particularly appropriate to the situation at hand. Everything looks like a nail, and all that.
There I go again.
It’s actually quite difficult to avoid speaking in metaphor. If nothing else, the rhetorical shorthand it provides is highly useful. (Does “rhetorical shorthand” count?) It’s an easy way to convey the approximate knowledge of many things with brevity and wit. Allusion is another such tool, though one which tends to contain a great deal of social information alongside the approximate knowledge it conveys. For example, the ability to drop in the occasional reference to a so-called “Great Book” suggests a certain kind of education and a certain type of person—in my experience, British, public (i.e. private) school educated, and born before 1960.
Medieval European writers, to no one’s surprise, favored the Biblical. If one wishes to tabulate, for example, the scriptural allusions in just one sermon by St. Bernard of Clairvaux (the “Honey-Tongued Doctor” of the Church), one had best stick to noting those which are at least four words long, if one doesn’t want simply to mark up the entire text.
These days, we mostly have memes. These are visual or, thanks largely to TikTok, auditory. Here, proper deployment signals how in touch one is with the current culture of the internet, or at least certain parts of it, and for many this in turn signals whether one’s ideas merit due consideration or an “OK, Boomer.”
(NB: the author understands that this particular meme is quite dead and buried, but has chosen to speak to their audience even if it requires “embracing the cringe.”)
What’s left, if you remove metaphor and allusion? Not much. Few legal contracts would even manage to escape unscathed, I think. But these are imprecise, rough tools; we as writers are bound by a language and convention which make up in obscurity that which they lack in style or precision, to borrow a phrase. Writers, historians or otherwise, thrive in the probability clouds generated by language. We use them to amuse, to inform, and to lie.
Often, these lies are vital mechanisms in communicating truth.
The importance of ambiguity to how we use language is one of the many reasons “Artificial Intelligence” projects (really, just probabilistic pattern calculators, or even, secretly, mechanical Turks) are fundamentally flawed endeavors. Show me an “AI” that correctly pairs a TikTok sound to a video, and I’ll start acting concerned. For similar reasons, linguistic ambiguity is an obstacle for the digital humanities.
It’s not one of which we are ignorant—or at least we shouldn’t be. Some of the greatest microhistorians of the late 1970s and 80s found their methodological approach by trying to clarify the lexical ambiguities of their sources for the great quantitative computing efforts of the 1960s. And yet here we are again with a superfluity of quantitative projects that rest on the assumption that you can pin down language, reduce it to the binary which is machine-intelligible. As traditional technical competencies have waned in graduate education, we’re also trying to do it blind. As one of my DH friends used to say, “I’m not going to try to do anything Google hasn’t managed to figure out.”
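To make the point concrete, here is a deliberately naive sketch (mine, invented for illustration, not any real project’s code) of what that reduction looks like in practice: a plain word-frequency count over a few made-up sentences, in which the single token “court” quietly swallows three different meanings.

```python
# A toy sketch of the reduction to "machine-intelligible" units:
# tokenize, normalize, count. The sentences below are invented.
from collections import Counter

sentences = [
    "The bishop was summoned to the royal court.",       # court = a ruler's household
    "The abbey pressed its claim in the county court.",  # court = a legal tribunal
    "They paid court to the duchess all winter.",         # court = wooing, flattery
]

# Split on whitespace, strip punctuation, lowercase.
tokens = [word.strip(".,").lower() for s in sentences for word in s.split()]
counts = Counter(tokens)

# The machine-readable result: "court" occurs 3 times. Which "court"?
# The count cannot say; the ambiguity a human reader resolves from
# context has already been discarded in the reduction.
print(counts["court"])  # -> 3
```

The number it prints is perfectly precise and almost perfectly uninformative, which is rather the point.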
The humanities, with history as their Queen, have long tried to run away from that which makes them unique, towards the mathematical precision and sure knowledge claimed (falsely, of course) by the sciences. This has set us chasing an ever-moving set of goalposts because we’re playing the wrong game. The scientific method is not the only way to generate knowledge, nor is there any sort of objective classification of knowledge production mechanisms which could somehow mark the knowledge gleaned from science as superior to any other. Sure, science has split the atom; has it offered us true insight into what to do next?
Ambiguity is a—perhaps the—most important feature of the human experience. Humanistic learning is the way we understand it. And that is not ground humanists should ever concede.
