Efficiency is the enemy of life, thought, and meaning. Human brains, languages, cultures, and ecosystems are all intentionally "inefficient" in order to survive, adapt, and generate meaning. Machine optimization works against this; it makes systems more "efficient" precisely by making them less "alive." Preserving inefficiency within a system preserves what makes humans, cultures, and even language models "generative." As humans increasingly interact with systems designed for efficiency where they formerly interacted with humanity itself, it becomes increasingly important to understand how efficient systems are at odds with what gives life meaning.
Every word that persists in a living language carries embedded inefficiencies: synonyms that should collapse but persist as subtle variation, irregular verbs that resist standardization, metaphors that contradict literal meaning, homonyms that create ambiguity. These "errors" are evidence that human systems tolerate and thrive on redundancy because redundancy retains options and preserves encoded intents and meanings. Human language thrives in its capacity to create meaning when it is treated as an artifact of lived experience, not as code to be optimized or data to be broken into component parts and reassembled through statistical association.
There are over 400 distinct definitions of "set" in the 1989 Second Edition of the Oxford English Dictionary, acknowledged to be the last complete, authoritative, human-generated print edition of an English dictionary-- produced before more efficient digital resources took the place of definitive documentation. A machine optimized for semantic clarity would drop at least 392 of these definitions. Efficiency would dictate that humans reuse broader, more universally applicable language wherever possible. The set of sets persists in the language because each definition represents a different crystallization of experience: the sun sets, one sets a table, a jaw can be set. A person may have a set of tools, and people may play a set of tennis. The word "set" is a compressed history of human problem-solving across multiple domains. The inefficiency of maintaining this ambiguity allows speakers to draw unexpected connections. This is why humans can form puns; it is where much humor comes from. The ability to create humor is baked into the way humans use language generatively. It is slack in a rigid system. Humans create novel meanings by colliding contexts-- accidentally or on purpose... or by some process between intent and something else.
Language also preserves what might be called "dead metaphors." These are expressions whose original sensory or spatial meaning has been forgotten but whose structure remains. We "grasp" an idea, "follow" an argument, "get to the bottom" of a problem. These are inefficient descriptors of cognitive action. A rational system would replace them with more literal, efficient terms. And doing so would erase the evolutionary path by which abstract thought was constructed from embodied experience. Stripping away the metaphor would destroy the bridge between sensory and symbolic cognition, treating abstract language as if it emerged fully formed rather than through the sustained mapping of concrete domains onto novel problems.
The English language could be rationally reformed: irregular verbs standardized, spelling regularized, unnecessary homophones eliminated, archaic words purged. Some natural languages have been partially reformed (Icelandic maintains historical forms in ways English does not; Persian regularized many grammatical categories). Yet the "inefficient" languages tend to be more culturally productive, with longer literatures, richer philosophical vocabularies, and greater flexibility in poetic expression. Living languages also resist standardization across time. English spelling is notoriously irregular. "Tough," "through," and "though" are not phonetically legible in their written forms to those who have not acquired the language through constant exposure, use, correction, and internalization of rules that seem arbitrary but are the remnants of great epochs of history. This irregularity is a fossil record. Each anomaly marks a historical collision: Norman French meeting Anglo-Saxon, sound changes leaving traces in orthography, competing etymologies coexisting in the same tongue. A rationalized spelling system would be more efficient. And it would destroy the archaeological record entirely. The inefficiency is the data. It is not a bug; it is the accumulated substrate of possible meanings.
Any system optimized for a single metric will eventually destroy the conditions for its own survival. This is not metaphorical; it is a mathematical property of constrained optimization.
Pre-industrial farming maintained mixed-crop systems, crop rotation, hedgerows, and margins of unexploited land. From the perspective of yield-per-acre in a single season, these practices are wildly inefficient. Optimizing chemical inputs and planting monocultures increases short-term yield. But monoculture eliminates soil microbiota, reduces genetic diversity in the crop, eliminates pest predators, and creates conditions for catastrophic failure. The Dust Bowl did not result from insufficient optimization; it resulted from thorough optimization of the wrong objective function. Efficiency in one dimension created fragility in the system as a whole.
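The arithmetic behind this trade-off can be sketched in a few lines. All numbers below are illustrative assumptions, not agronomic data: a minimal model comparing one high-yield variety against four independently failing varieties.

```python
# Illustrative sketch: a monoculture maximizes expected yield per season,
# but a single shared blight destroys the entire crop. Numbers are assumed.
p_blight = 0.05  # assumed chance that a given crop variety fails in a season

# Monoculture: one variety, yielding 100 units when it survives.
mono_expected = 100 * (1 - p_blight)   # expected yield: ~95 units
mono_total_loss = p_blight             # 5% chance of losing everything

# Mixed cropping: four varieties with independent failure modes, 22 units
# each (the lower per-acre yield is the "inefficiency").
mixed_expected = 4 * 22 * (1 - p_blight)  # expected yield: ~83.6 units
mixed_total_loss = p_blight ** 4          # ~0.0006% chance of losing everything

print(mono_expected, mono_total_loss)
print(mixed_expected, mixed_total_loss)
```

Under these assumed numbers, the mixed system gives up roughly twelve percent of expected yield in exchange for cutting the probability of total failure by a factor of thousands: efficiency in one dimension traded for survivability of the whole.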
Optimization of financial markets for liquidity and transaction speed produces high-frequency trading, where algorithms exploit microsecond arbitrage opportunities. Price discovery, the allocation of capital, the functioning of the market: all optimized. Very efficient. Seemingly optimal. These efficiencies are certainly optimal for those who control, have access to, and fully understand the mechanisms by which the system is optimized for efficiency! But this optimization comes with the correlated risk of repeated market-wide cascades, flash crashes, and systemic vulnerability to algorithmic herding. The system becomes internally coherent but externally brittle and, in a sense, hostile to life. Access to the resources necessary to maintain human life is increasingly predicated on access to and distribution of profit returned by financial markets. When the efficient systems at work within those markets crash, employers cut payrolls. People suffer. People die. Markets fail catastrophically not because they are inefficient but because they are too efficient at converting information into synchronized action without consideration of downstream effects. Being overly concerned with these possible negative consequences is itself inefficient to a system optimized for action and deftness.
Biological systems, like living languages, preserve inefficiency through redundancy. The human body maintains backup systems for nearly every critical function: two lungs, two kidneys, two eyes, overlapping immune pathways. Individually, these redundancies waste energy and resources. Systemically, they preserve function across a wide range of damage scenarios. An optimized body could have one lung and one kidney. Such a body would be slightly more efficient in normal conditions and catastrophically fragile under perturbation.
Ecosystems preserve inefficiency through trophic redundancy and niche overlap. Multiple species perform similar ecological roles. From an optimization standpoint, only one species per ecological niche is necessary. But redundancy means that if one species fails through disease, disappears when the climate changes, or falls to predation, others can compensate. An "optimized" ecosystem with minimal redundancy would be highly efficient under normal conditions and prone to cascade collapse when conditions shift. Inefficiency is resilience.
Networks preserve inefficiency through modularity and loose coupling. A fully optimized communication network would route every message along the shortest path, with minimal buffering and zero redundant pathways. Such a network maximizes throughput in normal conditions and collapses entirely when any node fails or any link saturates. Real communication networks maintain redundancy: multiple pathways between nodes, buffering, temporary storage of information, modular sub-networks that can function independently. This redundancy appears as latency and wasted capacity. It is actually the infrastructure of survivability.
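A toy graph makes the contrast concrete. This is a hypothetical sketch, not a model of any real protocol: four nodes wired either as a redundant ring or as a minimal chain, with breadth-first search used to check which nodes stay reachable after a failure.

```python
from collections import deque

def reachable(adj, start):
    """Breadth-first search: return the set of nodes reachable from start."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

def drop_node(adj, dead):
    """Return a copy of the network with one failed node removed."""
    return {n: [m for m in nbrs if m != dead]
            for n, nbrs in adj.items() if n != dead}

# A "redundant" ring: every node has two independent paths to every other.
ring = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "A"]}

# An "optimized" chain: the same nodes, minimal links, no spare capacity.
chain = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}

# Kill node B. The ring routes around the failure; the chain partitions.
print(reachable(drop_node(ring, "B"), "A"))   # {'A', 'C', 'D'}
print(reachable(drop_node(chain, "B"), "A"))  # {'A'}
```

Killing one node isolates most of the chain immediately, while the ring routes around the failure. The ring's "wasted" extra links are precisely its survivability.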
The human brain is metabolically expensive. It is said that the brain is responsible for roughly 20% of the body's energy expenditure despite comprising 2% of body mass. This is inefficient by any narrow measure. Yet this energetic expense correlates with behavioral flexibility, the capacity to learn novel tasks, to plan across extended time horizons, and to maintain social coordination across complex hierarchies.
The inefficiency is structural. The brain maintains massive redundancy in neural connectivity: 100 billion neurons, each with thousands of connections. Learning involves creating new pathways, which at first coexist with old pathways. During the learning process, the brain is running multiple overlapping processes in parallel. The old habit and the new behavior compete. Efficiency would prune the old pathways immediately. Instead, the brain maintains both for an extended period, allowing gradual transition. This overlap period is inefficient, metabolically expensive, and essential. Without it, learning would be faster but less robust; old patterns would persist beneath the surface, ready to reassert under stress.
The same principle applies to memory. Humans retain far more information than is strictly necessary for immediate survival. We remember songs we haven't heard in decades, faces we glimpsed once years ago, conversations that have no bearing on current goals. A rationalized memory system would discard information that is not immediately useful. But human memory is not rational in this sense. It is associative, redundant, and lossy. And this inefficiency is generative: the ability to suddenly recall a half-forgotten conversation, to notice a pattern across disparate memories, to recognize an analogy between contexts depends on maintaining a vast, partially activated archive of information. The slack in the system is where insight lives.
Attention in humans is similarly "wasteful." At any moment, the perceptual field contains far more information than conscious attention can process. Most of this information appears to be ignored. But that ignored information is not truly discarded. It appears to be subject to pre-attentive processing that can rapidly shift conscious attention if something significant appears. The brain is effectively running multiple analyses in parallel: a focused analysis on the attended object, and a distributed, low-resolution analysis of the entire perceptual field. This parallel processing is computationally expensive and essential. It allows humans to notice unexpected events, to catch peripheral motion, to register a face in a crowd. The inefficiency is the mechanism of responsiveness.
Emotion is another locus of apparently irrational inefficiency. From a rationalist perspective, emotions are often perceived and characterized as noise that distorts judgment and impedes optimal decision making. Yet emotion serves as a compression algorithm for value: it allows rapid evaluation of situations based on accumulated experience without requiring explicit calculation. Fear is an inefficient heuristic for threat detection (it produces false positives and false negatives-- regularly and at very inconvenient moments), yet it is faster than rational analysis and integrates information from sources (embodied memory, evolutionary history, social learning) that explicit reasoning cannot access as readily. The inefficiency facilitates assertive action.
Humans also maintain what might be called "cognitive slack." It is the capacity for mind-wandering, daydreaming, boredom. These states appear to serve no immediate purpose and are often experienced as aversive. Yet neuroscience research suggests that the mind's wandering is the mode in which the brain engages in long-timescale planning, integrates disparate memories, and generates novel associations. The inability to tolerate boredom correlates with reduced creative output and reduced capacity for sustained attention. The "wasted" mental activity is where consolidation happens.
Culture is the set of practices, narratives, and institutions that humans have accumulated precisely because they do not optimize for immediate survival but for something else: stability, meaning, beauty, coordination across time through tradition and ritual. From a purely functional perspective, most rituals are extraordinarily inefficient. Religious ceremonies often involve hours of repetitive activity. The faithful chant, genuflect, and circumambulate, producing no direct material benefit. A rational system would replace ritual with something "productive." Yet rituals persist across cultures, and anthropological evidence suggests they serve crucial functions that are not reducible to any explicit or implicit message or content apart from the performance of the ritual itself.
Ritual creates synchronized action: participants move together, chant together, hold attention together. This synchronization produces a psychological state of group cohesion that is difficult to achieve through rational persuasion alone. Ritual also creates time markers, structuring the year by recurring ceremonies, the week by cyclical observances. This cyclical marking of time appears to be less efficient than a linear calendar, yet it serves to embed individuals within larger temporal and social structures. Ritual also preserves institutional memory: the ceremony itself is a script that carries history within its form. The inefficiency-- redundancy, repetition, elaborate aesthetics-- is what allows information to be transmitted across generations without degradation.
Industrial food systems illustrate the same principle: optimized for caloric efficiency and production speed, they have produced fast food, preserved meals, and just-in-time production-- along with obesity epidemics and rising healthcare spending. Across cultures, traditional food practices involve enormous inefficiency: slow cooking, elaborate preparation, ritualized consumption, celebration through surplus. Material survival is no longer dependent on these inefficiencies. Yet cultures worldwide have resisted complete adoption of optimized food systems, maintaining slow food traditions, family recipes, seasonal eating, shared meals. The inefficiency is the preservation of social bonds and cultural continuity.
Art and aesthetics are perhaps the clearest cases of preserved inefficiency in culture. Art produces no immediate survival value. A society could eliminate all art-- music, visual art, dance, theater. Yet every known human culture produces art, and the presence or absence of art appears to correlate with cultural vitality, social cohesion, and psychological well-being. Art seems to serve a function related to meaning-making: it allows cultures to externalize and negotiate questions that cannot be answered through rational analysis. The inefficiency is the point.
Narrative is similarly "inefficient" as a means of transmitting information. A story takes far longer to convey than a summary. Yet stories are how humans integrate information into meaningful patterns, how they transmit values across generations, how they model possible futures. A culture that conveyed information only through summaries would preserve facts but lose the capacity to understand how those facts relate to human purposes and possibilities. The inefficiency is the depth.
This framework reveals a fundamental incompatibility between the optimization logic inherent to machine systems and the preservation logic inherent to culture.
Machine systems operate under pressure to maximize specified metrics of throughput, accuracy, and efficiency. Once a metric is defined, optimization becomes a mathematical problem devoid of context. The system eliminates redundancy, prunes exceptions, standardizes variations. This is not a choice or a failure, it is the functional definition of optimization. A system optimized for language prediction will converge toward the most statistically probable tokens, the most common syntactic structures, the most frequent collocations. Variations that deviate from this statistical center are actively penalized during training.
Culture, by contrast, operates under pressure to preserve variation, maintain meaning across time, and enable human purposes that are not reducible to any single metric.
Culture cannot be optimized without destroying itself.
The moment a cultural practice is rationalized by being stripped down to its "essential function" and optimized for efficiency, it ceases to carry the full semantic weight it previously held. The ritual becomes a procedure, the song becomes information, the story becomes a summary, the meal becomes calories.
When machine systems are applied to cultural domains such as language processing, content recommendation, and creative generation, the optimization logic inherent to the machine inevitably grinds against the inefficiency-preservation logic inherent to the cultural domain. The system will improve (by its own metrics) while the domain deteriorates. This is not a bug that better engineering can fix; it is a structural consequence of optimization.
Consider content recommendation systems. These systems are optimized to maximize engagement in the form of clicks, watch time, and shares. Each metric is well-defined and measurable. Optimization proceeds by identifying what content most reliably produces engagement and amplifying it. The system rapidly converges on a narrow set of patterns: sensationalism, outrage, novelty, conspiracy, confirmation of existing beliefs. These patterns are engagement-maximizing because they tap into psychological vulnerabilities: novelty-seeking, in-group bias, threat detection, epistemic comfort. A rationalized information diet would consist almost entirely of engagement-optimized content.
This is efficient from the perspective of engagement metrics. It is disastrous from the perspective of culture. The system eliminates the inefficient content that culture requires: the challenging text that changes perspective, the slow narrative that requires sustained attention, the minority view that prevents consensus from calcifying, the complex argument that resists compression. The inefficiency is replaced with a narrowing, a closure, a substitution of engagement for meaning.
The same logic applies to language models optimized for prediction. These systems are trained to maximize the probability of the next token given previous tokens. This is a well-defined, measurable objective. The system optimizes by learning statistical patterns in the training data. The result is a system that produces text that closely matches the distribution of human language but deviates from it in systematic ways: more repetitive, more redundant, more cautious, more centered on popular patterns, less likely to innovate or contradict. The system does not spontaneously generate new metaphors, does not expand the semantic range of existing words, does not push against the boundaries of what is sayable. It reproduces. It does not create.
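The convergence toward the statistical center can be seen even in a toy bigram model. The corpus and every name below are invented for illustration; greedy decoding, which always emits the single most probable next word, stands in here for optimization toward likelihood.

```python
from collections import Counter, defaultdict

# Toy corpus. "grasp an idea" is the rare, metaphorical collocation.
corpus = ("the cat sat on the mat . the cat sat on the chair . "
          "the dog sat on the mat . we grasp an idea .").split()

# Count bigram frequencies: how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def greedy(word, steps):
    """Always emit the single most probable next word: pure optimization."""
    out = [word]
    for _ in range(steps):
        word = follows[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(greedy("the", 4))  # "the cat sat on the" -- the statistical center
```

The rare collocation exists in the corpus, but greedy decoding never surfaces it from the most common starting word: the model reproduces the center and discards the margins.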
Machine systems seem destined to become primary mediators of language as humans off-load the time-consuming task of language generation mostly or entirely to language models. Most language humans encounter will be generated by, filtered by, or shaped by algorithmic optimization, resulting in degradation.
This does not require conscious intent or malevolence. It is a consequence of optimization applied at scale. As machine-mediated language becomes dominant, humans encounter less variation in linguistic form and more standardization. They read texts that are statistically optimized for comprehension, which means they encounter fewer syntactic surprises, fewer unusual word choices, fewer grammatical innovations. They are recommended content based on engagement metrics, which means they encounter less challenge and more confirmation. They generate text using predictive systems, which means they outsource the generative act to a system optimized for reproduction, not innovation.
When humans internalize this language-- when they learn from it, practice writing in response to it, think through its filter-- the corpus of language available to human cognition shifts over time. The language humans encounter is no longer the full historical range of linguistic possibility but a compressed subset, optimized for machine processing.
This produces a two-way degradation. The machine systems become more accurate at predicting the language on which they were trained, because the language itself converges toward the statistical center. Humans become less capable of producing language that deviates from that center, because the deviation-producing mechanisms (surprise, inefficiency, variation) have been attenuated. The language continues to function, in that it can convey information and coordinate action, but it loses the generative capacity, the semantic depth, the cultural memory encoded in inefficiency.
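This feedback loop can be sketched as a toy simulation. The "model" below is just a Gaussian fit to its own corpus, with an assumed 20% under-dispersion standing in for optimization toward the statistical center; the numbers are illustrative, not empirical.

```python
import random
random.seed(0)

# Start from a corpus with a full spread of variation (standard deviation 1).
corpus = [random.gauss(0, 1) for _ in range(10_000)]

def spread(xs):
    """Standard deviation: a crude proxy for linguistic variation."""
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

for generation in range(5):
    print(generation, round(spread(corpus), 3))
    mean, sd = sum(corpus) / len(corpus), spread(corpus)
    # Regenerate the corpus from the fitted model, slightly under-dispersed:
    # the model penalizes deviation, so it samples at 80% of the true spread.
    corpus = [random.gauss(mean, 0.8 * sd) for _ in corpus]
```

Each generation the spread shrinks by roughly the under-dispersion factor; after a few rounds of training on its own output, most of the original variation is gone, even though every individual sample still looks plausible.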
This is not speculative. It is already observable in domains where machine mediation is advanced.
Social media platforms, optimized for engagement, have demonstrably shifted the range of expression available to users: shorter sentences, higher emotionality, reduced nuance, increased conformity to tribal positions. Search engine optimization produces web pages, product descriptions, news articles, and vast amounts of other textual "content" stripped of inefficiency and optimized for algorithmic ranking. This itself becomes the writing on which systems train themselves-- content that exists not for the benefit of humans but for the optimization of systems. Predictive text systems have altered how humans compose written language, with autocomplete and prediction nudging users toward statistically common phrases over novel expressions. The systems have, in a sense, created an inefficiency of their own-- but one they treat as cheaper than fully parsing for relevance and human agency, or than devising methods to reduce the over-amplification of system-aligned input.
The culture does not collapse; the language does not cease to function. Instead, both ossify, becoming more efficient by some measures and less alive by every honest human measure. Too few pinecones. The forest persists precariously and does not flourish.
If machine systems are allowed to fully capture human culture, that culture will experience a fundamental loss of what makes it generative and meaningful. Not because machines are stupid or insufficiently powerful, but because culture is built on inefficiency, and optimization is built on its elimination. These are not compatible trajectories. They are opposing forces.
A machine system could, in principle, be designed not to optimize ruthlessly. It could be constrained to preserve variation, to tolerate redundancy, to refuse certain forms of compression. But such constraints would reduce the system's efficiency by its own metrics. The system would be "worse" at generating "engaging" content, predicting text, and recommending products.
But it would be more useful for other purposes: for preserving meaning, for enabling unexpected connections, for maintaining the slack in which human creativity can operate.
The question is whether a culture organized around machine systems can sustain the inefficiencies that make the culture itself generative. Can a society that has outsourced language generation to predictive machines preserve the capacity for linguistic innovation? Can a society that has outsourced meaning-making to engagement algorithms preserve the capacity for meaningfulness? Can a society that has optimized every domain for efficiency preserve the domains in which inefficiency is essential?
These are not theoretical questions. The answers depend on whether humans collectively decide that cultural depth is valuable enough to maintain at the cost of some efficiency, and whether institutions can be designed to preserve inefficiency under pressure toward optimization.