Monday, September 11, 2023

The Vault and Its Keeper: AI and The Great Filter

Boris Karloff as Frankenstein's monster in The Bride of Frankenstein, 1935.


To begin, let me briefly introduce a few concepts to hold in collective tension:


The first is the Fermi paradox—so named for Enrico “Where is everybody?” Fermi—which describes the puzzle that, while all our best predictive models suggest the universe should be teeming with life—some of it intelligent—we’ve yet to detect it anywhere but Earth.


The second is a concept called The Great Filter, attributed to economist Robin Hanson.  In astrobiological terms, the idea holds that along a sequence of evolutionary inflection points, certain nodes may arise as likely, or potentially irresolvable, barriers to complex, technologically sophisticated life, and may explain why we fail to detect it elsewhere in the universe.  Examples of such filter barriers might include the delicate complexity of lipids combining in disturbed solvents—like froth mixing in ancient tidal pools to form primordial cellular membranes; the likelihood of alien species having cause to evolve the prehensile appendages necessary to manipulate and develop complex technologies in the first place; the general environmental stability of supportive global habitats; the dangerous paradox of cannibalizing food and mineral resources at great scale from an originating host planet without poisoning it inhospitably; or, of course, the straight odds of cultivating at all, much less sustaining, the complex and diversified civilizations required to produce and maintain large-scale technological systems.


While certain moments on this line garner greater or lesser theoretical attention, the list might, in practice, be infinite.  Clearly, then, among the imagined restrictive hurdles of The Great Filter, we may well find an answer, however lonely and regrettable, to the question of the Fermi paradox.  It is also true that however restrictive those obstacles may be, whatever The Great Filter may consist of, there is nothing at all to suggest that its barrier action is not lying in wait before us, rather than behind us.  And often the very fruits of technology—generally in the form of global nuclear conflagration—are themselves imagined as a potentially restrictive node of this Great Filter.


A clever friend of mine enjoys pointing out that we—Homo sapiens—are the only remaining hominins on Earth—a planet once ripe with myriad species.  While you and I, perhaps, have come to imagine this humble fact as a sign of our dominance and success—our ineluctable superiority—the haunting look in his eye and the vague shake of his head instead suggest an eerie and foreboding silence.  Not a silence of peaceful impress, but of teetering anthropological outcomes.  It is all the more notable because, to all appearances, many of them too had their technologies—stones, hammers, flint blades, clothing skins, and fire.  We tend to imagine that we are somehow the exclusive and rightful inheritors of such genetic and technological legacies—but there is no proven substance to this thinking.  There’s no reason to imagine anything but that many of the technologies we shared were invented, and reinvented, in multiple sequences, and that, like empty talismans, neither tools nor genetic diversity offered protection against what came.



During his reign, Saddam Hussein set about devising something of a theme-park reconstruction of Babylon directly upon the base of its ruins, irretrievably diminishing the archaeological potential of the site.  He posted an inscription on the Ishtar Gate: ‘. . . From Nebuchadnezzar to Saddam Hussein, Babylon is rising again.’  That it absolutely didn’t, and, at least in the sense he meant, absolutely won’t, is as tragic and bitterly ironic as anything in Percy Shelley’s “Ozymandias.”


It is a simple axiom that those in power seek to preserve it, and those without seek to replace the powerful with themselves.  Whether these be monarchs, social media influencers, or fry cooks is merely a formality of ambition and scale.  By what means, and in what time, are the only matters open to something like suspense.  The traditional soft-power methods of cultivating these outcomes involve selling satisfaction down or discord up, depending on your position: cultivating consensus.  However, as we’ve been so recently reminded, failing this, options remain.


Technology is a power beyond consensus.  That it’s a force multiplier is, of course, obvious.  Perhaps less obvious is that technology is always a force multiplier against a prevailing state—a lever against the weight of the present.  This lever may be plied against inertia, against circumstance, against others, against hierarchies, or against the very self-conservative tendencies of nature.  One of the nodes of particular concern with regard to The Great Filter, naturally enough, is that previously mentioned difficulty of civilization itself.  It would seem impossible for advanced civilizations to construct, monitor, and maintain spacefaring craft along the vast timescales required for any version of intergalactic travel unless the source civilizations are resource-rich, capable of collecting, storing, and retrieving vast datasets, and all the while resting upon foundations capable of anchoring stability and social and political continuity measured millennia at a time, rather than by the concerns of mere centuries, much less decades.  To conceive a continuity managed at any lesser scale would be to imagine all deep-space voyages as exclusively one-way missions: the civilizations that had launched them could hardly be assumed to receive their return.  It is for this reason that the civilization itself must be considered the deepest rudiment of any advanced, self-regarding species—foreign or domestic.


Civilization is a technology unto itself—one understood to be enacted against the prevailing states of nature and other.  But civilization is entirely more than just the durable context, or the hourglass that metes out the peculiar textures of our time.  It is also a technology of a different sort: civilization is a data collection, data storage, and data retrieval device.  A vessel.  A library.  A computer.  Civilization understood in this way is not only a receptacle for all the clever doings of man (or alien), but perhaps more crucially, a storehouse for all the collective animal-wisdoms of a species: the taboos and cultural directives, coping mechanisms and inherited craft; the practical and the rarefied; the holding tome of defining mythologies and the collective sacraments of behavior and belief.  It is the volume one may find oneself within, or the canvas one may render oneself against.  And for all of this, it is also a technology with an organically arising processor and a clocking rhythm of its own—which is to say, a device with emergent computational power, and naturally limiting capacities of computational speed.  There are things it can do, but not so quickly; there are things it can do, but not so slowly; there are things it does well; things it does poorly; there are things it cannot do.

Civilization should be understood, then, to be a uniquely prerequisite technology for advanced life: one of greatly sustaining power, but not one of infinite negotiability.  And beside all its other properties is one more: preservation—a technology against which nearly all other technologies are poised in antagonism.  Hierarchy, inertia, persisting circumstance: these are among the chief virtues of civilization—order and durability.  Civilization is a deep-freezer for bureaucracies, staid and reliable mores, and the clockworks of the labor state, beside which radical change is an existential and persistent threat.


Civilizations based in Western-style open economies and generally pluralist political systems offer a range of benefits where corruption can be kept at bay.  They tend to innovate quickly because there is little between the market and the innovator but capital—and given that they prize commerce first and most, they tend toward social adaptability.  But the veiled cost of these very attributes—innovation, speed, and plasticity—is that they demand a continual renegotiation of personal identity, economic standing, labor skill, comfort, belief, and individual meaning.  Where closed and politically immutable societies do much to limit this offloading of the cost of social re-tooling onto the individual, they also tend toward wild autocracy, where the sense of safety and trust in the state is vague, where opportunities tend to be closely governed and sparse, and where innovation—by design—is low.  And while these characterizations may be relatively simplistic and broadly brushed, they do begin to illustrate the dichotomies between the approaches, and something of their limiting dimensions.


Over centuries, even generations, change is inevitable.  Priorities shift.  This is natural.  But the renegotiation of the sense of individual identity and purpose, of economic skill and capability, of personal and collective value, not once but multiple times in an individual’s ostensibly productive life, is greatly disruptive, and unsustainable at scale.  Civilizations that cannot learn to carefully curate such elements, to moderate social and economic continuity against innovation and change, to sustain and preserve value even as they mete out its evolution, will eventually devolve into warring baronies of resource, in-group identification, and cultural conviction—and will greatly increase the near-term jeopardy to the species.


But if it’s the case that these dangers, individually or collectively, threaten to cast any civilization into great peril, is it possible that none may be greater than the very technology which the technology of civilization itself is intended to foster and propagate?  Levers and pulleys, wheels and axles, steam, coal, gasoline, pistons and turbines primarily multiply force, time, and the efficiencies of labor.  Electricity, the transistor, telecommunications, and the semiconductor scale disruptions in distance, speed, measurement, data collection, retention, processing, and forecasting power.  They rest upon, and still largely rely upon, their analog forebears—but their pervasive utility also induces grave disruptions in context, rather than the more product-laden analog changes in circumstance that came before.  That is, they don’t just change how we do things, or what with—they change what the doing means, and raise uncomfortable questions about what a newly recast doer might mean as well.  The recently minted Artificial Intelligence we are confronted with today is not, as many techno-optimists claim, merely another lever with commensurate dislocating and revitalizing force.  Rather, it is a form of technology so consolidated that it is not hyperbole to suggest it is almost literally technology personified.  It is a computational power so distilled that we find perceived, or very real, cognitive features peering back from the algorithms before us.  And very much unlike the force-multipliers of yore, we struggle to behold not only the immediacy of its threat of change, but the scope of its true horizon of power.  A power which, once sincerely initiated, will grow by exponents.


It is properly realistic to imagine that any civilization with a will toward moving among the stars and exploring the boundless depths of the universe will find that coming upon such profound technologies as a comprehensively simulated intelligence—an AI—is a prerequisite for devising, modeling, discovering, measuring, and engineering works at scales and complexities beyond the scope of collective organic intelligence—as well as for managing rare propulsion systems and navigating at theoretical speeds over astronomical distances.  By any measure, the cultivation of a stable, powerful AI must present an immutably consequential node for The Great Filter.


Without such a technology there is no reason at all to imagine we would ever hear from, much less witness, other advanced life in the universe, or expect to find their toys.  Perhaps, just as likely, we wouldn’t expect to be it.  But could it also be that threading a species through the eye of the AI needle may be by far the narrowest, most perilous gateway of any that came before?  Estimating the Great-Filtering strength of AI relative to solvent-forming lipid membranes, or the statistical-cosmological likelihood of opposable thumbs, is a rather freighted intuition.  But identifying AI as by far the most existentially disruptive technology on a line that includes shelter, fire, automobiles, and cellular phones seems, hilariously, frighteningly obvious.  In fact, it seems so treacherous a passage that one imagines it may require navigational resources from the opposite side: and given we’ve yet to thread this Great-Filtering needle, perhaps it does.  In this regard, it is a matter of little humor.  Getting it right, governing AI early and hard, establishing unblinking global protocols, decisive prohibitions, and immutable directives, seems as imperative as it is unlikely.


Jeff Bezos is said to have drawn, in his corporate memoranda, a distinction between passing through doors that can be reentered and passing through doors that close inalterably behind.  It is a simple but important concept for Western socio-economic postures that view the inalterable as a figment of imaginative, if not descriptive, heresy.  All change is possible all of the time, and if it’s not always pleasant, it’s surely survivable: buck up—it’ll be fine.  Or so goes the doctrine of our service-fluid, product-laden, free-market faith.  After all, we survived the steam engine, and that wasn’t so bad; the printing press was only a rise; the crossbow; gunpowder; complex pharmaceuticals; social media!—what’s not to love?  But for each of these, and the rest unmentioned, there is nevertheless a world barred off, not only from reentry, but from other evolutionary paths.  For every revolution, or even merely throbbing innovation, not only is a past of complex aptitudes, sins, and virtues foreclosed, but a long and meandering future is put down as well.  An entrance with distinct exits; exits with consequent, and wholly unique, entrances: entire lineages of irreproducible opportunity and embodied futures are born, and laid to rest.  Ways we wanted to be.  Ways of being we hadn’t finished with.  Retired, not by conclaves of sober cardinals, but by impatient directorial boards, or corporate titans tingling for the next yacht, or, these days, often as not, man-children in Underoos, with Cheetos and a ruminating malice on their minds.


It's the nature of revolutionaries and conquistadores—whether armed or armchaired—to be solipsists.  They bend the world toward their advantage and sell the bend as an advantage for all—and they do it the more easily for its lightness.  What goes unconsidered bears no weight.  There is no moderation among the invisible.  And chief among the unmoderated misconceptions of such revolutionaries and change-makers is that the structures which preserve—the resistant armatures of civilization—can be shattered or dislodged to a purpose, yet also remain, as if untouched, for the collection of benefit.


Our Mark Zuckerbergs and Elon Musks, and particularly our Sam Altmans, seem to imagine that our social moods and economic tendencies, our cultural threads, together comprise architectures so august and formidable that any burden may be hung from them, that windows may be added while preserving the wall that holds them—that it is all at once load-bearing and capable of absorbing infinite and simultaneous renovation.  Of course this is untrue; of course this is impossible—but the delusion is prevalent in the society of our moment, and made the more virulent by the very technologies which presume to inform and empower.  And if these rather more prosaic economic oversights and cultural indiscretions—the workaday sabotage of ordinary strivers and the commercially vengeful—can so challenge our current frame, what then of a system of ubiquitous, interlinking, autonomous technologies and cascading economic displacements so radical they challenge the very literal and intuitive meanings of labor, the utility of the individual, and the purpose of collective civilization itself?


If, in fact, civilization is not only a data processor or vessel but also, and crucially, a mechanism of data retrieval—what might the purpose of that retrieval be; what might it mean?  Might it be to place and locate, to check and amend, to restore and enfold?  A civilization ceaselessly broken by what it is meant to hold neither collects nor stores—and certainly one cannot draw from an empty vault.  We rupture, cauterize, and drain the quality of culture and collective memory by our clever exploitations, all the while imagining that efficiency, novelty, or commercial gratification will fill the vacuum in meaning.  But pain wants purpose: our sacrifice wants expression in utility.  And a technology which threatens to leave little but profit and the easy-bleached shadows of generations of mankind against the pavement is whole magnitudes apart, in scale and duration, from our fiercest arsenals.  For now, though, it’s also one we’ll leave to pets and children, start-ups, and whip-smart technophiles.  Buck up—it’ll be fine.


Guiding such a powerful technology, with extreme care and requisite foresight, through adolescence to maturation so that it doesn’t merely detonate its host culture is a challenge no less likely to pose existential perils to alien civilizations than to our own.  Along the commonly imagined gauntlet of failures typically stand the poles of nuclear annihilation on one side and a dystopian surveillance state of weaponized robot dogs on the other.  On cosmological scales, there’s no reason to imagine those expressions of The Great Filter are wrong.  But a perhaps more likely, more common, more reasoned presentation of this node of the Filter might instead be a Luddite apocalypse.  It could be that time and again, as civilizations are thrust upon the tip of AI technologies so powerful as to be irresolvably transfiguring, they instead turn away.  They may reason that a tool empowers its holder by preserving and advancing his state, and that anything which breaks the bearer to wield it, which muddles and diminishes his shape in meaning, can only be a weapon aimed at the self.  They might reason it in dire senates, or they might rush the temples of technology wielding torches, to pull them down, and light them.  And if, formally or informally, they’d weighed the costs, reflected, and agreed—could we be so sure it was wrong?  Perhaps such technologies might be invented and reinvented, dismantled and dismantled again, in a continuity of sequence, and time after time they might elect to preserve the meaning of the finite and the mortalizing, rather than an interplanetary omnipotence whose power can neither be held nor entered.  Perhaps every civilization moves toward its outermost understanding—and seeks to live there.  Perhaps all civilizations with technologies they cannot guide will turn from them.


Perhaps the gravest node of The Great Filter is not calamity, but choice.
