Continued elsewhere

I've decided to abandon this blog in favor of a newer, more experimental hypertext form of writing. Come over and see the new place.

Wednesday, December 22, 2010

"A power so great, it can only be used for Good or Evil"

As I knew would happen, my earlier unfair review of Kevin Kelly's What Technology Wants obligated me to read the thing and give it a real one. I have to say I wanted to pick a fight with this book, for reasons that are unclear to me. Partly it's because it is in some ways very close to my own point of view, yet so different in important respects, and I'm a People's-Front-Of-Judea type of guy. Partly it's that Kelly is selling grand visions, but he seems to be unable or unwilling to also adopt the equally important critical stance. But I was disappointed in my search for an argument, because this turns out to be an essentially religious book, and in my experience there's no point arguing with religion. Either you buy into the vision or you don't. So the subject of this book is not actually technology, but the nature of existence itself and where it is supposed to be going.

You'd think the story of technology would be a large enough subject in its own right, but Kelly's scope is so all-encompassingly vast that he has to devote a good chunk of the book to a discussion of biological life. This is necessary because a key part of his argument is that life and technology are just different aspects of the same process. Evolution (or whatever you want to call it) may have started with chemistry and nucleic acids but is now working with electronics and silicon. In support of this argument, he compares the evolution of various technological forms with biological evolution (e.g., the way variant forms of military helmets or trumpets can be arranged in phylogenetic trees), and then elides the differences between them, in order to paint them as just different phases of some grander tendency of the universe to evolve towards complexity/goodness/whatever. Progress, in other words. The technium (Kelly's term for the whole sphere of technological development) is simply an extension of the biosphere, or both are manifestations of some underlying, more abstract tendency.

That's the first part of Kelly's thesis. The second is that the evolution of both life and technology is a strongly convergent process, meaning roughly that while evolution obviously involves large amounts of randomness and contingency, its general tendencies and ultimate destination are in some sense foreordained. It was inevitable that we'd evolve multicellularity, muscles, eyes, and computation, although the exact form may vary across the possible universes. Kelly labels this "ordained becoming" and I believe most of these ideas have their roots in the work of Simon Conway Morris, an evolutionist who (not coincidentally) also has a "theology of evolution" at the core of his thought.

Inventions and discoveries are crystals inherent in the technium, waiting to be manifested. There is nothing magical about these patterns, nothing mystical about technology having a direction. All complex, adaptive systems...will exhibit emergent forms and inherent directions. (p 186)
Given the above points, the story becomes not so much "what technology wants" as where technology (and everything else) is inevitably heading. This may be a difference that makes no difference, just as it doesn't matter in some sense whether a sunflower follows the sun because it "wants" to or because it has an innate tropism-generating mechanism. Calling the trajectory of technology a "want" suggests, though, that it might want something different, or that it can be induced to go somewhere else. The arguments Kelly advances suggest otherwise.

The third part of the book is devoted to those who think they can escape from technology (such as the Amish) or put a halt to its onslaught (such as the Unabomber). It's a bold choice to make Ted Kaczynski the focus of a chapter, I thought. Kelly does a credible and fair job of presenting his viewpoint. But the (perhaps unintended) thrust of this is to paint anybody who hopes to argue with or resist the advancement of an autonomous, self-willed technium as a madman. There are many level-headed and sane critics of technology that could have been used as foils. In some ways Kaczynski is Kelly's mirror-image: both are equally eager to totalize technology, to paint it as a unified and nearly unstoppable force.

The urge for self-preservation, self-extension, and self-growth is the state of any living thing...there comes a moment in the childhood of our biological offspring when their childish selfish nature confronts us, and we have to acknowledge that they have their own agenda (sic)... Collectively we are at one of these moments with the technium...At a macroscale, the technium is following its inevitable progression. Yet at the microscale, volition rules. Our choice is to align ourselves with this direction, to expand choice and possibilities for everyone and everything, and to play out the details with grace and beauty. Or we can choose (unwisely, I believe) to resist (p187)

It's odd that more reasonable efforts to redirect technology are given short shrift, given that Kelly was an editor of the Whole Earth Catalog, which advanced the idea that you could repurpose and redirect technology for alternative purposes. I guess his later career at Wired has overridden that...but he does run a site called Cool Tools which is very close in spirit to the old WEC, so he hasn't abandoned that ethos (in fairness, Kelly does discuss his personal transition from low to high tech).

The fourth part of the book is an effort to characterize the nature of the inevitable progress of technology. "Technology wants what life wants" which is "increasing efficiency, increasing opportunity, increasing emergence, increasing complexity...increasing freedom, increasing mutualism, increasing beauty... " (p 270). This is the point where things skidded off the road and into the gauzy, light-filled realm of heaven for me. I've been a technologist all my life, and while I certainly believe in the wonderful things it can do and the beauty it embodies, I can't take such a rosy view. Technology is not just your iPhone and Facebook, it's hydrogen bombs, ecological disaster, and the constant radical undermining of human values. It's not just those either, of course, but to consider one side without the other can't be done if you are trying to get an honest picture of the technosphere. It's the same as with biological life, which for all its beauty has no particular interest in your personal well-being and contains many "wanters" that treat humans as so much raw material, from mountain lions to malaria parasites.

Kelly is not oblivious to the possible downsides of technology, of course. But when it comes time to tot up the good vs the bad, it comes down to this:

The message of the technium is that any choice is way better than no choice. That's why technology tends to tip the scales slightly toward the good, even though it produces so many problems...it compounds the good in the world because in addition to the direct good it brings, the arc of the technium keeps increasing choices, possibilities, freedom, and free will in the world, and that is an even greater good.
Argh, this is libertarian rot. A greater availability of choice is not always "good". Would we be better off if anyone could have the choice of obtaining RPGs or nuclear weapons at the local 7-11? More choices can make people overwhelmed and unhappy. And if we are being propelled irresistibly forward into some foreordained attractor, do we really have any choice at all?

Kelly is constantly revisiting the issue of whether technology makes us better people or not. I find this a ridiculous question, and it undercuts his own premise. We do not have the option of doing without technology, with all due respect to Amish refuseniks. As a civilization, a species, we've built ourselves a technological layer that we now live in and can't get rid of (unless we are prepared for an order-of-magnitude dieoff). The question of "is technology good or evil" is a stupid question, frankly. You can talk about a particular bit of technology and what its effects are and what human interests it serves or subverts, but to try to put a moral valence on technology as a whole is like a fish giving a lecture on "water: threat or savior"?

As before, I can't help but compare Kelly to Latour. Both start with what should be a fairly straightforward task of describing the processes of science and technology, but end up going off on wild metaphysical joyrides. The difference is that Latour has a political/sociological view, while Kelly's is primarily religious (not that Latour doesn't get into that now and then). Both are trying to locate agency somewhere other than in its traditional home of individual humans, but while Latour distributes it throughout the material world, Kelly seems to locate it in some transcendent heavenly omega point. That's why ultimately Latour seems to be more of a humanist -- the desires he talks about are human-scaled, even if they inhabit odd objects.

To summarize: this book is the product of a particular kind of vision, of a world that is hurtling despite itself towards a transcendently positive future of increasing complexity and capability. "Technology" is not the real subject, that just happens to be the current edge of the curve. It's an attractive vision, and certainly it's possible to see some of this in the world when approached from the right angle. Evolution and related processes do have a ratchet effect; the world is learning to do what it does better. That's great, but either this happens with human guidance or without it. If technology in truth can't be managed, then we don't really need to think about it, we can just play with our gadgets. If, on the other hand, technology can be shaped and guided by humans, then we need better ways to do just that. Dealing with climate change is the most obvious area where we need more control, not less. Getting transported into ecstasies by the technical sublime doesn't help. The reality of our technological world -- its glories and its disasters, its potentials and pitfalls -- has to be faced squarely. Trying to paint a moral valence on technology as a whole is a mistake; like the humans who propel it forward, it contains multitudes.


[[title courtesy of The Giant Rat of Sumatra by The Firesign Theater]]

Sunday, January 25, 2015

What's on my mind

Messing around with some computational language tools, I generated this list of words which are more frequent on this blog relative to a standard corpus (some misspellings removed), ordered from most overused. Many of these are unsurprising, but I had no idea I used "cannot" more than is normal. Or "parasitical", which is more worrying.

cannot simpleminded parasitical excoriate delegitimize kvetching temperamentally treacly politcs cosmopolitans authoritarians twitter rightwingers inexpert constructivists constructionists entertainingly clathrate undesireable frenzies mystifies wastefulness repurpose gintis wobblies kunstler turmoils bukovsky bankrolls laitin smidgeon sociopaths scienceblogs cleavon oddsmaker vegetating reifying situationists doper yecs popularizer nobels cultish solidary arduino militarist prolixity congealing proft larded atran nixonian seatmate appeaser rationalists leftish libertarianism literalist materialist vitalism rejoinders schuon fusty facebook torahs arduously hugeness universalizing tinkerers factuality autoworkers parasitize rationalist dominionism physicalist incarnating idiocies axiomatically ferreted gourevitch glaringly symbiote averagely incisively shitheads skimped netzach appall metonymic onrush chokehold halldor churchy scampers starkest agentive dalliances emet mistimed ceasefires hallucinated reimagined overplaying bioethicist copleston disempower flippancy oversimplifies outrageousness indvidual ginned douchebags explicates plumbs mencius metaphysically schelling foregrounding polarizes outlives subtexts acquiesces nostrums undescribable malkuth marketeer analagous preeminently remediable flamers slipperiness bunraku proles burkean peaceniks materialists unaccountably athwart mcworld petraeus romanticizing unnamable huffpo ineffectually commonsensical interoperating empathizing wingnut supplicants hypostasis inchoate obama transhumanists fulminate affordance nonviolently geneological gashed mussed chuppah charnel felin reconstructionism verbalizing tegmark crabbed armys shalizi dehumanization hoohah vannevar copyable bungler unlikeliest preindustrial legitimated downscale fugs bilin slavering egomania naveh determinedly oligarchies chasten reappropriated bekki taleb bioethicists valdis ultraconservative wahabi straussian rewatch anthropomorphism ecstasies 
libertarians ruination exceptionalism vacillate overreach forthrightness informationally bushites rottenness biomorphic parceled twittering sorley parapsychological irreligious statists maddeningly selfing militarists bushite infuriates deconstructionist dallying harrows glutted worths misplacement engross jewishness hearkens girdled zombified prohibitionist braf sniggering positivists prostrating doomy schmaltzy yesod hewing philosophize doomsayers unconcern conflate jibes misappropriate convulse constructionist relabeled cavalierly mesmeric phantasms atrophied nattering reductionist personhood asocial placating incuding amorality incontestable weida greybeard inescapably scrabbling foreordained puthoff antiabortion commandeering iphone reinterpreting fudges minsky spluttering obsessional explicating rovian subdues ascription graeber counterargument plops

Now I'm playing the Burroughs-ish game of trying to find meaning in this shredded language. "physicalist incarnating idiocies axiomatically" sounds applicable to a number of discussions I've been having lately.
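For the curious, a list like this can be produced with very little code. This is not the tool I used, just a minimal sketch of the underlying idea: compare each word's relative frequency in the blog against its relative frequency in a reference corpus, with a little smoothing so words absent from the reference don't blow up, and sort by the ratio. The function name and the smoothing scheme are my own illustrative choices.

```python
from collections import Counter

def overused_words(blog_text, reference_text, min_count=2):
    """Rank words by how much more frequent they are in blog_text
    than in reference_text, using smoothed relative frequencies."""
    blog = Counter(blog_text.lower().split())
    ref = Counter(reference_text.lower().split())
    blog_total = sum(blog.values())
    ref_total = sum(ref.values())
    scores = {}
    for word, count in blog.items():
        if count < min_count:
            continue  # ignore one-off words; they are mostly noise
        blog_freq = count / blog_total
        # add-one smoothing: a word missing from the reference corpus
        # gets a small nonzero frequency instead of dividing by zero
        ref_freq = (ref[word] + 1) / (ref_total + 1)
        scores[word] = blog_freq / ref_freq
    return sorted(scores, key=scores.get, reverse=True)
```

In practice you'd feed it the full blog archive and something like the Brown corpus, and a fancier score (log-odds with a Dirichlet prior, say) gives less jumpy rankings, but the ratio-of-frequencies version captures the spirit.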

Tuesday, February 01, 2011

Academic units with mildly amusing/intriguing names, #4 and #5.

A couple of recent discoveries: First, near my current stomping grounds, the Program on Liberation Technology at Stanford, which has gotten very vocal lately on Twitter as it tracks what's going on in Egypt. From there I learned that George Clooney (!) is launching (so to speak) a project to repurpose satellite imagery to support human rights enforcement. Whoah.
Lying at the intersection of social science, computer science, and engineering, the Program on Liberation Technology seeks to understand how information technology can be used to defend human rights, improve governance, empower the poor, promote economic development, and pursue a variety of other social goods.
And at MIT, the home of various younger versions of myself, there is The Dalai Lama Center for Ethics and Transformative Values:
This nonpartisan center is a collaborative think tank focused on the development of interdisciplinary research and programs in various fields of knowledge from science and technology, to education and international relations.

The Center is founded to honor the vision of the Fourteenth Dalai Lama and his call for a holistic education that includes the development of human and global ethics. It will emphasize responsibility as well as examine meaningfulness and moral purpose between individuals, organizations, and societies.
I'm trying to figure out what these have in common with the rest of this series, why they caught my eye. "Amusing" isn't quite right, although these are sufficiently on the fringe of respectable academia as to invite possible ridicule. But that's just it -- they are all efforts to slightly expand the kinds of discourse allowed within a university -- by bringing in religion, ethics, or explicit political agendas, or just by combining discordant elements. I'm all for this kind of thing, and have engaged in similar practices myself, so I don't want to mock, but any particular instance is going to be a somewhat chancy thing in which to invest your attention. But anything that promises to actually expand the space of possible discourses is something that I can't ignore.

I was a math major back in the day, and mathematics is probably the furthest away from interdisciplinary stuff like this, because in math there is roughly no politics, and very clear standards for what constitutes worthwhile work (that is not entirely true). In places like The Center for Transcultural Vegetarianism, nobody is quite sure what good work looks like, which opens up a space of freedom that typically produces much crap and much worthwhile work too, and gives misfit intellectuals a temporary home.

Hm, and it occurs to me that "Artificial Intelligence Laboratory" may have had the same kind of ring to it back when the first ones were established. Now I'm afraid that field is somewhere between respectable and completely played out.

Previous entries in this series:
  1. Metaphysics Research Lab, Stanford

  2. The Institute for Research on Unlimited Love, Case Western

  3. The Greater Good Science Center, UC Berkeley

Monday, October 17, 2005

Doctors Atomic

I went to see the new John Adams/Peter Sellars opera Doctor Atomic about Robert Oppenheimer, Los Alamos, and the Trinity test. Review to follow. At the same time (more or less) I was reading this depressing article about A. Q. Khan, the father of Pakistan's nuclear weapons and purveyor to the trade.

Put these two figures side-by-side and what does it reveal? Well, Oppenheimer is by far the more attractive figure, an incredibly intelligent, dynamic, and cultured guy, as opposed to the egotistical and sleazy Khan (or maybe he just had better press agents). Both served their country as they felt called to do, both ended up more-or-less disgraced. Oppenheimer's bomb was used on population centers; Khan's hasn't yet, but it may be only a matter of time.

Aside from the character of the chief scientists, what's interesting to me is the networks of people, technology, money, and resources that went into the respective bomb projects. The Manhattan Project was an incredible mobilization of resources, involving some $25 billion (in today's money) and tens of thousands of people, including many of the best scientists of the time. The Pakistani bomb was assembled by a fourth-rate power through an international network of shady deals and borrowed expertise, although it certainly wasn't trivial to do (a good thing).

I've noticed for a while that it takes big concentrations of power to achieve serious technological breakthroughs (in practice, this means governments or monopolistic corporations like the pre-breakup AT&T). The Internet, that paradigm of decentralized power, was only built the way it was due to the resources and oversight of the military. Once the big breakthrough is achieved, though, smaller actors and networks can take over the results and repurpose them. That's happened with the Internet, and it happened with nuclear weapons technology.

Until just now, I always thought that this was an argument for concentration of power, which I grudgingly accepted despite my leftist leanings. But maybe it's an argument against. A decentralized society might not have made the Internet, but it wouldn't have made thermonuclear weapons either.