Truth as Fiction: The Dangers of Hubris in the Information Environment

As Felipe Fernández-Armesto points out in his 2019 Out of Our Minds: What We Think and How We Came to Think It, ideas about how humans tell truth from falsity are among the oldest and most important we have ever had. The digital information age presents a range of new and old challenges to the conundrum, and in national security they become particularly acute. In the digital age, we're drowning in information — and most projections suggest it will only get worse. A tendency, however, to overreact to these new iterations of old problems will quickly forfeit any strategic gains the digital information age once seemed to offer.

Another of the oldest and most important human concepts is narrative. The stories we tell about ourselves, both individually and collectively, are the substrates of identity. Narrative is a different species from truth and falsity; it spans the gap between factual truth and truth as meaning — a delicate, dynamic and complex assemblage of information, knowledge, understanding and illusion. Humans participate in constructing their own narratives, but are not their sole architects. The world outside of human control has the final say.

In Stalin's Soviet Union as well as Nazi Germany, 'truth' was considered a theoretical construct under the control of human architects. As Eugene Lyons noted of life in the USSR in his 1938 Assignment in Utopia, under certain conditions utter nonsense can seem true — nonsense with a special type of momentum whereby each accumulated failure only stiffens collective commitment to the lie. When centralised regimes control much of the information infrastructure, falsehoods can seem plausible for a period of time, but not because logic bends to human will — it doesn't. By controlling information infrastructure and by implementing a climate not only of fear but of vanguardism, Soviet and Nazi propagandists created 'truth' out of thin air. The climates of terror and the infrastructures of control these regimes produced eventually fell — and so did the nonsense they held aloft for a brief period of time. The 20th century stands as the greatest warning against hubris and the human authorship of truth.

Despite those lessons, the temptation to treat narrative truth as a fiction of our own making — unburdened by fact — is again on the agenda. Three factors have put it there: our self-inflicted glut of data; the conviction that adversaries are conducting narrative warfare against us and that we're losing; and highly speculative theories about what technology can do to combat the problem.

The belief, driven primarily by the big data commerce industry (as opposed to the big data science industry), that data must contain a type of magic dust — discernible patterns and regularities in human behaviour that offer significant insights — is losing its lustre. Much of what has passed, and been bought and sold, as behavioural analytics derived from data has little or no legitimacy. Some analytical tools perform no better than human intuition at predicting social outcomes, yet the idea that complex human behaviour can be steered by social engineers is part of the zeitgeist. That the age of big data might yield diminishing returns is information-age heresy.

Two decades of conflict against an adversary unbound by ethical and moral constraints in communication have left the West's security agencies and their personnel pushing boundaries to win the contest of narratives by countering the content, the flow or, in many cases, the communicators. The overarching concern, characterised as a national security risk, has been amplified in recent years as revisionist states use information as a tool or weapon. Again, the desire to counter (to be seen to be doing something, usually offensive, against the identified threat) has led that same security apparatus to consider forgoing the very values, ethics and morals that make Western democracy worth preserving.

This isn't mere speculation. Decisions within the US security apparatus to reframe long-held prohibitions against torture in order to gain a supporting legal opinion justifying 'enhanced' interrogation methods are a sobering case study. Arguably, a significant erosion of trust in US decision-making was cemented following the release of reports on previously covert actions related to enhanced interrogation — rendition to black sites, waterboarding and so on. The negative impact was greater because those acts were fully thought through and authorised in order to 'win', not the actions of poorly led individuals as occurred at Abu Ghraib. Strategic consequences flow from serious breaches of trust.

The fight can't be solely about countering the adversary. We need to protect what we have and ensure that we don't throw it away through frustration and the desire to report a success. Freedom is not free and, once squandered, can't be regained without tumultuous change. For democratic states, people's and partners' trust in national institutions is a central requirement. 'Trust' may be an increasingly flexible concept, but short-term wins bought at its expense carry negative long-term strategic consequences. Bellingcat's work to separate fact from fiction and correctly attribute online influence efforts provides a great longitudinal view of the increasing number of states or strategic agencies attempting to 'win'. It's also a sobering insight into the fragility of the concepts behind engaging in clandestine or covert actions while outsourcing delivery and hiding in the sea of big data. Here, hubris is increasingly obvious.

The third factor — statistical inference software (unhelpfully known as artificial intelligence, or AI) — offers high-speed information sorting. The technology is a crucial part of the response to the data deluge in the national security sphere. But AI might never accomplish the simulated assembly of complex information, knowledge, understanding and illusion that gives rise to narrative identity. We accrue intellectual debt when we offload cognitive tasks. It would take historically epic hubris to contend that statistical inference can replace or even augment this process. It stretches credulity even further to suggest that states should attempt such interventions as they grapple with the uncertainty of the digital age. The price of hubris could be a type of hidden defeat.

Intervening in human complexity is permanently fraught. If we believe an adversary is conducting such activities, why not let them fail? No current technology offers a way out that doesn't include significant societal costs. Statistical inference software helps in sorting and analysing information that's bounded and discrete — the identification of threat signatures in cyberattacks is one example. Expecting this technology to play a meaningful role in the complex assembly of narrative and meaning, however, is to make a scientifically unsupported leap of faith.
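The contrast drawn above, between bounded, discrete problems and the open-ended assembly of meaning, is worth making concrete. Signature matching of the kind used in cyberattack detection reduces, at its simplest, to set membership over fixed digests. A minimal illustrative sketch in Python (the sample data and function name are hypothetical, not real threat indicators or any agency's tooling):

```python
import hashlib

# Hypothetical set of known-bad SHA-256 digests. Real signature databases
# work on the same principle: fixed, discrete identifiers of known threats.
KNOWN_BAD_DIGESTS = {
    hashlib.sha256(b"malicious payload").hexdigest(),
}

def matches_known_signature(data: bytes) -> bool:
    """Return True if the data's SHA-256 digest is in the known-bad set."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_DIGESTS

print(matches_known_signature(b"malicious payload"))   # True
print(matches_known_signature(b"ordinary document"))   # False
```

The problem is tractable precisely because the inputs and the answer are discrete: a digest either is or is not in the set. Nothing analogous exists for deciding whether a story is true, meaningful or identity-forming, which is the article's point.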

Expecting humans and the machines they make to become the authors and architects of narrative meaning, without exacerbating the very problem they're supposedly attempting to mitigate, is an old fantasy using some new gadgets. It never ends well.

The astute approach to what humans and the machines they make can do to help tell truth from falsity — this oldest of human conundrums — requires a large dose of sceptical conservatism from the national security community. Australia has an open, democratic social fabric to protect and strengthen. Truth may be a partly human fiction, but it's one of the most important fictions a nation can conceive. Turning it over to algorithmic alchemy and a handful of central controllers in a frenzy of presentism is the fastest way to unravel the whole tapestry.

Source: Australian Strategic Policy Institute
