Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Thursday, May 16, 2013

Googlenature

Also published at the World Future Society.

In a recent conference promoting not only their latest gizmos but their company's animating vision as well, Google executives declared they were working toward a future in which technology "disappears," "fades into the background," becomes more "intuitive and anticipatory." Commenting on this apparently "bizarre mission for a tech company," Bianca Bosker warns that their genial and enthusiastic promotional language masks Google's aspiration to omnipresence via invisibility, an effort to render us dependent on and uncritical of their prevalence by marketing it as easy, intuitive, companionable. I agree that there is something to this worry, but it is important to be very clear about it.

There is, paradoxically, nothing more "natural" than for our artifacts and techniques to vanish as "technologies" from our view as they grow familiar. It is a commonplace to point out that most of the time we do not attend to the feeling of our clothes against our skin -- and that we might go a bit mad were we to notice this sort of thing all the time -- but it is also true that through our utter habituation to seeing and wearing clothes we no longer think of them as the "technologies" they happen to be. Technique, artifice, and ritual suffuse our lives and worlds; all culture is prosthetic, just as all prostheses are culture. That we think of only a fraction of culture as "technology" when all of it can be so thought indicates that the discourse of "technology" as such is to an important extent a register of familiarization and defamiliarization, naturalization and denaturalization, attention and inattention.

To the extent that "technology" is a conceptual site marking our ongoing elaboration of collective agency -- our effort to do things that matter together and to say what we are doing in a way that makes sense to each other -- it is not so surprising to find that those techniques and artifacts among so many that we explicitly think of as "technological" tend to be those that resonate with fears and fantasies of agency in particular: devices to amplify our strengths, to deliver our deepest desires, to disrupt the assumptions on which we imagine we depend, to threaten catastrophes out of our control. Daydreams of wish-fulfillment and nightmares of apocalypse utterly prevail over the technological imaginary, in everyday talk of technical anxieties and consumer desires, in the popular tech press, in advertising imagery, in science fiction entertainments, in Very Serious think-tank position papers on global investment and development, and so on.

This insight about "the technological" points a definite political moral. Since nearly everything about our made world has been different than it is now, could be different than it is now, and surely will be different than it is now, whenever we treat the furniture of this contingent and open now as natural, as inevitable, as necessary, as logical, as the best of all possible worlds, as the best that can be expected, as normal, we invest the status quo with an irresistibility and force that it could never accomplish or maintain on its own. And when we invest the status quo with this force we do so at the cost of our own power (to change together the terms on which we live in the made world with one another). It goes without saying, but I'll say it anyway, that those who benefit from the naturalization of the status quo are always those who preferentially benefit from its customary arrangements, whatever their inequities or irrationalities may be.

There is, of course, a special force in those ritual and material artifacts that would function as a fundamental interface through which we explore the made world and so set the terms on the basis of which we form our sense of what is natural and what is artifactual in the first place. Bosker's particular worry is that Google's product is just such an interface, even a kind of ultimate interface, a framing of experience through a selective annotation and curation of our exploration of the world as such, through the satisfaction on their terms of our "search." Thought of this way, the unnamed ambition in Google's vision to "disappear" is that it would naturalize through prevalence the very terms on which nature and non-nature are produced as such, and on terms that preferentially benefit Google's interests. Put this way, as I say, Bosker's point is an important one.

But there is not, nor could there be, one interface imposing the will of any singular constituency unilaterally upon the made world, whatever Google's competitive ambitions may be, whatever any fundamentalist's moralizing conviction may demand. Indeed, the very language of competitive prevalence that drives Google's discourse attests to their own naturalization of social conventions that are contestable and actually under contest in ways that are as likely to bedevil their vision as implement it. For one thing, you cannot slap a Google logo on that which is invisible, and it is hard not to notice that Google's endless crowing about their ambition to ubiquity is somewhat at odds with the silence of realized ubiquity. Considered on such terms, Google's behavior is indeed rather "bizarre… for a tech company." But that hardly means this behavior is not also fairly typical. The prevalence through which Google would presumably disappear into nature attests paradoxically both to the wishful but usually disavowed tendency toward monopoly in market orders and to the competition in which "all that is solid melts into air" (and hence is de-naturalized). More particularly, Google's repeated testament to their aspiration to prevalence through "intuitive" and "person[able]" interfaces signals that, like so many tech companies, they are themselves uncritically invested in the serially failed and utterly facile ideology of artificial intelligence, with what consequences to their ambitions nobody can finally say.

All this is just to say that Google did not code and does not own the interface through which we interface with their interface. It is not just what we think of as our language, but also our laws, our pricing conventions, our ways of signaling subcultural identifications and dis-identifications through sartorial and other lifeway choices, our architectural environment and infrastructural affordances that all encode and enforce moral, esthetic, and political judgments. Understanding this is key to grasping the force of Bosker's point, but it also reminds us of the ineradicable plurality of these frames, their irreducibility to one another, and hence the final impossibility of a foreclosure of the open futurity inhering in the present. Just as it is important critically to interrogate the specific values encoded in our laws and affordances, with what specific impacts on which specific stakeholders, it is important to interrogate the values, impacts, and stakes in criticizing them. Criticality, like science more generally, depends on an acceptance that any belief can be up for grabs, but equally that not all beliefs can be up for grabs at once, and certainly not belief as such. There is political force both in the ways material and ritual norms and forms settle into "nature" and in the ways they can be unsettled into "artifice."

It is not an accident that Bosker turns in her article to the expertise of a representative of the transhumanist think-tank IEET for guidance in thinking through the ultimate significance of the Google interface. Transhumanism assumes an essentially theological narrative vantage over the vicissitudes of technoscientific change, whereas technoscientific progress toward sustainable equity-in-diversity insists on the diverse determination and equitable distribution of the costs, risks, and benefits of such change to its plural stakeholders in an ongoing democratic process of technodevelopmental social struggle. There is undeniably a reactionary politics in our uncritical acceptance of the status quo of the owned interface (be it of faith, or of legislation, or of browsers or search engines) or indeed of any plutocratic prevalence over the made and shared world, but there is a reactionary politics as well in our uncritical acceptance of an alien author of disruption, transcendence, apocalypse. To acquiesce to the fantasy that Google -- or whatever passes for the avatar of a monolithicized "technology" of the moment, Ford, IBM, Microsoft, the Pentagon -- is authorized to deliver totalizing techno-transcendence or techno-apocalypse is to divest ourselves of our authority to contest and produce the uses and meanings investing that made, shared world. Fantasies of total techno-transformation by alien powers (the history-shattering Robot God of the singularitarian transhumanists is merely the most obvious variation on the theme) function as a techno-supernaturalization of human history no less reactionary than the more customary naturalization of the plutocratic status quo in which tech companies also have their hand.
