Thursday, December 30, 2010

Blogyear in review

Continuing the tradition, here's a collection of some of the more substantial posts of the past year, awkwardly clustered into categories. Focusing on just one of those categories might be a good idea...but unlikely to happen. So I expect that my readers and I can look forward to yet another year of miscellaneous.

Hell in a Handbasket

(aka doom, doom, and more doom)
Collapse the Movie
The Ship is Sinking
Cheap Shit Means Dead Pigs
Dancing on the Edge
Bombs Bursting in Air
I'm Not Saying We Wouldn't Get Our Hair Mussed

Wednesday, December 22, 2010

"A power so great, it can only be used for Good or Evil"

As I knew would happen, my earlier unfair review of Kevin Kelly's What Technology Wants obligated me to read the thing and give it a real one. I have to say I wanted to pick a fight with this book, for reasons that are unclear to me. Partly it's because it is in some ways very close to my own point of view, yet so different in important respects, and I'm a People's-Front-Of-Judea type of guy. Partly it's that Kelly is selling grand visions, but he seems to be unable or unwilling to also adopt the equally important critical stance. But I was disappointed in my search for an argument, because this turns out to be an essentially religious book, and in my experience there's no point arguing with religion. Either you buy into the vision or you don't. So the subject of this book is not actually technology, but the nature of existence itself and where it is supposed to be going.

You'd think the story of technology would be a large enough subject in its own right, but Kelly's scope is so all-encompassingly vast that he has to devote a good chunk of the book to a discussion of biological life. This is necessary because a key part of his argument is that life and technology are just different aspects of the same process. Evolution (or whatever you want to call it) may have started with chemistry and nucleic acids but is now working with electronics and silicon. In support of this argument, he compares the evolution of various technological forms with biological evolution (e.g., the way variant forms of military helmets or trumpets can be arranged in phylogenetic trees), and then elides the differences between them, in order to paint them as just different phases of some grander tendency of the universe to evolve towards complexity/goodness/whatever. Progress, in other words. The technium (Kelly's term for the whole sphere of technological development) is simply an extension of the biosphere, or both are manifestations of some underlying, more abstract tendency.

That's the first part of Kelly's thesis. The second is that the evolution of both life and technology is a strongly convergent process, meaning roughly that while evolution obviously involves large amounts of randomness and contingency, its general tendencies and ultimate destination are in some sense foreordained. It was inevitable that we'd evolve multicellularity, muscles, eyes, and computation, although the exact form may vary across the possible universes. Kelly labels this "ordained becoming" and I believe most of these ideas have their roots in the work of Simon Conway Morris, an evolutionist who (not coincidentally) also has a "theology of evolution" at the core of his thought.

Inventions and discoveries are crystals inherent in the technium, waiting to be manifested. There is nothing magical about these patterns, nothing mystical about technology having a direction. All complex, adaptive systems...will exhibit emergent forms and inherent directions. (p 186)
Given the above points, the story becomes not so much "what technology wants" as where technology (and everything else) is inevitably heading. This may be a difference that makes no difference, just as it doesn't matter in some sense whether a sunflower follows the sun because it "wants" to or it has an innate tropism-generating mechanism. Calling the trajectory of technology a "want" suggests, though, that it might want something different, or that it can be induced to go somewhere else. The arguments Kelly advances suggest otherwise.

The third part of the book is devoted to those who think they can escape from technology (such as the Amish) or put a halt to its onslaught (such as the Unabomber). It's a bold choice to make Ted Kaczynski the focus of a chapter, I thought. Kelly does a credible and fair job of presenting his viewpoint. But the (perhaps unintended) thrust of this is to paint anybody who hopes to argue with or resist the advancement of an autonomous, self-willed technium as a madman. There are many level-headed and sane critics of technology that could have been used as foils. In some ways Kaczynski is Kelly's mirror-image: both are equally eager to totalize technology, to paint it as a unified and nearly unstoppable force.

The urge for self-preservation, self-extension, and self-growth is the state of any living thing...there comes a moment in the childhood of our biological offspring when their childish selfish nature confronts us, and we have to acknowledge that they have their own agenda (sic)... Collectively we are at one of these moments with the technium...At a macroscale, the technium is following its inevitable progression. Yet at the microscale, volition rules. Our choice is to align ourselves with this direction, to expand choice and possibilities for everyone and everything, and to play out the details with grace and beauty. Or we can choose (unwisely, I believe) to resist (p187)

It's odd that more reasonable efforts to redirect technology are given short shrift, given that Kelly was an editor of the Whole Earth Catalog, which advanced the idea that you could repurpose and redirect technology for alternative purposes. I guess his later career at Wired has overridden that...but he does run a site called Cool Tools which is very close in spirit to the old WEC, so he hasn't abandoned that ethos (in fairness, Kelly does discuss his personal transition from low to high tech).

The fourth part of the book is an effort to characterize the nature of the inevitable progress of technology. "Technology wants what life wants" which is "increasing efficiency, increasing opportunity, increasing emergence, increasing complexity...increasing freedom, increasing mutualism, increasing beauty... " (p 270). This is the point where things skidded off the road and into the gauzy, light-filled realm of heaven for me. I've been a technologist all my life, and while I certainly believe in the wonderful things it can do and the beauty it can embody, I can't take such a rosy view. Technology is not just your iPhone and Facebook, it's hydrogen bombs, ecological disaster, and the constant radical undermining of human values. It's not just those either, of course, but you can't consider one side without the other if you are trying to get an honest picture of the technosphere. It's the same as with biological life, which for all its beauty has no particular interest in your personal well-being and contains many "wanters" that treat humans as so much raw material, from mountain lions to malaria parasites.

Kelly is not oblivious to the possible downsides of technology, of course. But when it comes time to tot up the good vs the bad, it comes down to this:

The message of the technium is that any choice is way better than no choice. That's why technology tends to tip the scales slightly toward the good, even though it produces so many problems... [It] compounds the good in the world because in addition to the direct good it brings, the arc of the technium keeps increasing choices, possibilities, freedom, and free will in the world, and that is an even greater good.
Argh, this is libertarian rot. A greater availability of choice is not always "good". Would we be better off if anyone could have the choice of obtaining RPGs or nuclear weapons at the local 7-11? More choices can make people overwhelmed and unhappy. And if we are being propelled irresistibly forward into some foreordained attractor, do we really have any choice at all?

Kelly is constantly revisiting the issue of whether technology makes us better people or not. I find this a ridiculous question, and it undercuts his own premise. We do not have the option of doing without technology, with all due respect to Amish refuseniks. As a civilization, a species, we've built ourselves a technological layer that we now live in and can't get rid of (unless we are prepared for an order-of-magnitude dieoff). The question "is technology good or evil?" is, frankly, a stupid one. You can talk about a particular bit of technology and what its effects are and what human interests it serves or subverts, but to try to put a moral valence on technology as a whole is like a fish giving a lecture on "water: threat or savior"?

As before, I can't help but compare Kelly to Latour. Both start with what should be a fairly straightforward task of describing the processes of science and technology, but end up going off on wild metaphysical joyrides. The difference is that Latour has a political/sociological view, while Kelly's is primarily religious (not that Latour doesn't get into that now and then). Both are trying to locate agency somewhere other than in its traditional home of individual humans, but while Latour distributes it throughout the material world, Kelly seems to locate it in some transcendent heavenly omega point. That's why ultimately Latour seems to be more of a humanist -- the desires he talks about are human-scaled, even if they inhabit odd objects.

To summarize: this book is the product of a particular kind of vision, of a world that is hurtling despite itself towards a transcendently positive future of increasing complexity and capability. "Technology" is not the real subject, that just happens to be the current edge of the curve. It's an attractive vision, and certainly it's possible to see some of this in the world when approached from the right angle. Evolution and related processes do have a ratchet effect; the world is learning to do what it does better. That's great, but either this happens with human guidance or without it. If technology in truth can't be managed, then we don't really need to think about it, we can just play with our gadgets. If, on the other hand, technology can be shaped and guided by humans, then we need better ways to do just that. Dealing with climate change is the most obvious area where we need more control, not less. Getting transported into ecstasies by the technical sublime doesn't help. The reality of our technological world -- its glories and its disasters, its potentials and pitfalls -- has to be faced squarely. Trying to paint a moral valence on technology as a whole is a mistake; like the humans who propel it forward, it contains multitudes.

[[title courtesy of The Giant Rat of Sumatra by The Firesign Theater]]

Go tell the Spartans

In honor of the repeal of DADT, here's Bill Hicks:

Yes, it's a great thing that homosexuals now have the same opportunities to become cogs in a relentless machine of imperial conquest as the rest of us. A real step forward.

I am straight and so not really entitled to an opinion, but I think I preferred it when gays were transgressive rather than determined to be normal middle-class people with marriages and jobs, including jobs in the bureaucracy of violence and death. But I suppose nobody really wants to live as an outsider if they can avoid it; it is difficult and risky and you can't get health insurance. So gay people have fought for and largely won the right to be normal, which is good for them perhaps, but leaves society short of strangeness.

[[Update: Here is IOZ on the idea that gays and women in the military will somehow make that institution more warm and fuzzy:

The [military is] a vast metaphoric rape machine, a big hard thing shoving itself in where it isn't wanted. To waste time pondering how "feminine traits" like "intercultural dialogue" ... can be further incorporated to help "stabilize" the world's Afghanistans, so that we can teach their backward cultures what it would be like if they "privileged, remunerated and valorized the care and feeding of functional future citizens in the same way that [they] valorize soldiering," is to avoid the rather more pertinent question: what are we doing there in the first place?]]

Sunday, December 19, 2010

Distracted from distraction by distraction

The weird thing is that this struck me as an excellent image of social media even before I read to the last line:
Neither plenitude nor vacancy. Only a flicker
Over the strained time-ridden faces
Distracted from distraction by distraction
Filled with fancies and empty of meaning
Tumid apathy with no concentration
Men and bits of paper, whirled by the cold wind
That blows before and after time,
Wind in and out of unwholesome lungs
Time before and time after.
Eructation of unhealthy souls
Into the faded air, the torpid
Driven on the wind that sweeps the gloomy hills of London,
Hampstead and Clerkenwell, Campden and Putney,
Highgate, Primrose and Ludgate. Not here
Not here the darkness, in this twittering world.
T.S. Eliot, Burnt Norton

Tuesday, December 14, 2010

Nightmares of Reason

Over at tggp's blog I was sucked into a fairly pointless discussion about the meaning of "technocrat". That led me to read this article about Robert McNamara, which portrayed his career as somehow paradigmatic of a certain generation of managers, and did so in a much more sympathetic way than I am used to. (via)

I have a certain idea of McNamara in my head as some kind of monster of rationalism, a bloodless bureaucrat presiding over horrific violence and death without the slightest bit of human compassion softening his considerations. Sort of Eichmann-lite. From this article (and also from Errol Morris's film The Fog of War) he appears to be an altogether more appealing person, a tragic figure who simply was led astray in his efforts to put his strengths into service. Those strengths were rationality, measurement, and goal-directed action. These talents worked pretty well for him in his career prior to the Kennedy administration, but utterly failed in government, once politics and conflict entered the picture.

So was Vietnam "blundering efforts to do good", as McNamara would have it, or just another in a long line of evil imperialist actions, as the Chomskyite left would have it? I find myself caught between these two irreconcilable views. Can't it be both? Can't McNamara be a good man who found himself unknowingly caught up in a bad system? Someone whose worldview left him blind to the effects of his own actions? Thinking along these lines leads to wondering about the nature of evil and if even Hitler was doing good by his own lights.

If McNamara's story is a tragedy of reason, the story of the left since the Vietnam era is a tragedy in the opposite direction. The war and the failure to put a stop to it led large segments of the cultural and political left to be suspicious of reason as such and to abandon it, for new age nostrums or smug deconstructionist pseudo-critique. Essentially, it prompted a new round of romantic reaction to the failures of the modern world, in this case represented by the buttoned-down rational managers of the postwar military-industrial complex.

In my own career I've been on the fringes of the artificial intelligence field, which had its origins in the same cold war rationalism that McNamara exemplified. The field has also suffered from the failings of narrow instrumental rationalism, which constricted the set of allowable models of intelligence to a very small and boring set. When I was in grad school I was loosely connected to a set of people trying to reform and break away from those limitations. Most of those people, myself included, instead drifted away from AI to pursue other areas (biology, sociology, user interface research, Buddhism...). I now find myself in closer contact with the old-fashioned kind of AI than I have been in years, and remembering why I never could be as enthusiastic about the field as I needed to be to work in it. It's not just the explicit military applications; it's an entire concept of what it means to be intelligent that is just so overwhelmingly wrong that it makes me want to scream. Yet the field chugs on, possibly even making some advances, although it's hard to see what they are. The "peripheral" areas of AI, like robotics and vision, tend to make steady visible progress, but the more central areas like planning, reasoning, and representation seem to be stuck, working on the same problems they were 20 or 30 years ago.

Monday, December 06, 2010

WikiLeaks and Open Government

I argue with my friend Amy Bruckman about the goodness of WikiLeaks on her blog. Basically she's taking the not unreasonable position that institutions need to be able to have a degree of privacy if they are to operate, and I'm saying, well, if your institution is up to no good, then it deserves to have its veil torn away. And the function of WikiLeaks, and the press more generally, is to provide that kind of a check to power.

I don't think there is an objective ethical solution to this (which she seems to be trying to provide). It's a battle of interests between the institutional insiders and outsiders.

On NPR I heard a guy from the Government Accountability Project criticize WikiLeaks for being irresponsible, and damaging their efforts to get better legal protections for responsible whistleblowers. That was a very good argument, I thought, although it still seems to highlight a division between "insiders" (in this case, lawyers who want to get good laws passed) and "outsiders" (the hacker/anarchist/whatevers of wikileaks). I am temperamentally sympathetic to outsiders, but I suppose more real change happens due to the boring activities of the more adult insiders.

Tom Slee is a guy who writes critically of libertarianism and starry-eyed technology visionaries, and usually I agree with him, but I think his take on Open Government is overly negative. Here he's talking about the relationship between WikiLeaks and Open Gov, because apparently a lot of other people are, but this seems very confused. Open Gov is about making very ordinary government data and services public, like crime statistics or health code violations, with the idea that developers and others will create apps that help connect government to the citizenry in useful ways. I think this is a great idea, although with an associated hype bubble. But it's basically an apolitical idea, a technocratic vision that thinks the government should do basically what it does now, but more efficiently and with sexier interfaces than you typically associate with the DMV. Nobody in the Open Gov movement, as far as I know, expects the CIA or DoD to open up their operations to anybody with an iPhone. WikiLeaks is just operating at a very different level of government with very different issues, and I don't see the two as having a whole lot to do with each other.

Wednesday, December 01, 2010

Book Review -- Were You Born on the Wrong Continent?: How the European Model Can Help You Get A Life

This chatty and engaging book by Thomas Geoghegan explores the little-known world of the present-day German industrial economy -- a form of quasi-socialism which involves large measures of worker control, through unions, "works councils", and other mechanisms. Contrary to what you would expect if you read only the financial press, this model appears to work quite well -- Germany is the world's biggest exporter (outdoing China), and they've managed to retain and make use of a skilled manufacturing workforce. According to Geoghegan, the lifestyle of a typical middle-class German is vastly better than that of a similar USian, at least along some obvious dimensions (vacation time, guaranteed health care, job security and hence no need to work like a dog to keep your job) and some non-obvious ones (a strong feeling of social solidarity, the way job stability helps to build a high-value workforce).

This is a book of personal impressions rather than something systematically researched and thought out. Geoghegan wanders in and out of the country in a daze, not quite believing this can work. Most of the people he talks to in Germany believe their system is doomed and the Anglo-American model of capitalism will triumph there as it has everywhere, and the German workers will join the race to the bottom along with the rest of the world. Yet even the center-right parties (the Christian Democrats) have robust support for the existing system.

There is some wistful but unconvincing speculation about how the US could somehow someday adopt a model like this. I'm dubious. The social and economic fabric of the US has frayed so far that it is hard to imagine it coming together again to make a society like that of Germany, where people feel responsibility for each other and enact that responsibility through social institutions with enough actual power to restrain economic rapaciousness. OTOH, think about the hellish world that present-day Germany grew out of. So dramatic changes for the good can happen. I hope it doesn't require passing through utter catastrophe.

Geoghegan's earlier book, Which Side Are You On: Trying to be for Labor When It's Flat on Its Back is also very good.

Speaking of solidarity, another interesting-looking book on my queue is Yellow Blue Tibia, a novel in which Stalin after the end of WWII enlists a troop of Soviet science fiction writers to create a new threat to bind the country together -- sort of Watchmen meets Gary Shteyngart. Uniting-against-a-fake-common-enemy is a hoary idea, but presumably the setting puts a new spin on it.