Continued elsewhere

I've decided to abandon this blog in favor of a newer, more experimental hypertext form of writing. Come over and see the new place.

Monday, December 31, 2012


Preverbabble is a term I coined a few decades ago to refer to the brain’s (or more modestly, my brain’s) prelinguistic chatter, a sort of rhythmic stream of mostly nonsense syllables that occasionally forms itself into words. Mostly this happens outside of consciousness, but in certain states it is possible to perceive it directly. These states seem to involve a temporary displacement. Usually the verbalizing parts of the mind are controlled by the semantic parts; in this state the more motor/rhythmic parts get the upper hand. Kind of fun! Maybe this is where poetry comes from, but my own little musico-verbal engine seems to produce mostly nonsense and occasional unpublishable doggerel.

The closest representation I can think of is the language of Joyce’s Finnegans Wake, especially the “thunder words” such as:
Anyway, the term bubbled itself back into the foreground of consciousness today for some reason, and a quick check on Google revealed exactly zero uses of it, so I thought I might as well stake my claim to originating it.

And ... wishing everyone an orthogonal New Year!

Friday, December 28, 2012

Blogyear 2012 in Review

My yearly ritual of trying to select and cluster the past year’s posts in an attempt to discover just what it is that I am trying to do here.

The number of posts this year was down quite a bit (44, previous few years were all around 65), for probably a variety of reasons – more serious discussion happening on social media sites, and I changed jobs and put some serious effort into a long guest post at Ribbonfarm, all of which were competition for my output.

Writing and random creativity

Thursday, December 20, 2012


Doing data visualization is kind of trendy these days but that doesn't mean it isn't fun. Here's something I whipped up that maps US counties by population density vs. % Democratic vote in the last election. Interactive version with map is here.

It's no surprise that denser areas (note the log scale) tend to vote more Democratic, although it was a bit startling how pronounced the correlation is. If you ignore the small smattering of low-density counties that went Democratic (and pretty much all of them are areas dominated by a racial minority) then it's almost perfect. One party represents urban cosmopolitans (that is, the civilized, which just means living in cities) and the other rural hicks.
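The correlation itself is easy to quantify. Here is a sketch in Ruby of the underlying computation: take log10 of the density (the log scale mentioned above) and compute the Pearson correlation against Democratic vote share. The data points are invented placeholders, purely for illustration; the real county data lives in the interactive version.

```ruby
# Pearson correlation between log10(population density) and Democratic
# vote share. The numbers below are toy values, NOT real county data.
def pearson(xs, ys)
  n = xs.size.to_f
  mx = xs.sum / n
  my = ys.sum / n
  cov = xs.zip(ys).sum { |x, y| (x - mx) * (y - my) }
  sx = Math.sqrt(xs.sum { |x| (x - mx)**2 })
  sy = Math.sqrt(ys.sum { |y| (y - my)**2 })
  cov / (sx * sy)
end

density   = [5, 20, 80, 300, 1_200, 5_000]       # people per sq. mile (toy)
dem_share = [0.25, 0.32, 0.41, 0.52, 0.63, 0.74] # fraction Democratic (toy)

log_density = density.map { |d| Math.log10(d) }
r = pearson(log_density, dem_share)
puts format("r = %.3f", r)
```

With real data the low-density Democratic outliers mentioned above would pull r away from 1, but the overall positive correlation survives them.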

Friday, December 14, 2012

Abject failure

For most of my life I was basically an angry young man, pissed off about the state of the world but in no way holding myself responsible for it.

Now I’m a parent, and not young, and that role no longer fits my life. Other people depend on me for making their world safe, meaningful, supportive. I am no great shakes at this, it doesn't really come naturally, but I suppose I muddle through OK just as most parents do. My children are alive, healthy, growing up, and starting to find their own ways. And guess what? They are angry too, sometimes justifiably, and I find myself unwillingly playing the other side of that drama.

I don’t have a hell of a lot of influence on the larger world, so all I can do is create small local spaces that are reasonably safe and supportive. And try to do my part to influence the larger world, through voting or making my opinions known. It’s not much, but what else can I do? Other people have pretensions to being in charge.

Well, if those people are in fact in charge, they have failed. It’s probably more accurate to say that nobody is in charge. Obama is the president, but he can’t stop a mass slaughter of kindergarten children, and he can’t act to stop climate change.

If nobody is in charge then there is nobody to blame, which seems to be exactly equivalent to saying that we are all to blame. We have all failed in our most basic responsibility, to create a safe world for our children. My personal righteousness on these issues means nothing. Feeling guilty accomplishes nothing. Joining the choir of exhortation to improve may be a bit better, but it’s basically farting in a hurricane.

Maybe some good will come out of today; maybe the political system will snap out of its stupor and realize that we need fewer guns and more mental health resources. Maybe. But a lot of good that does the murdered children and their families.

Climate change is very different from this sort of incident, which will convulse the media for a few days and then be forgotten. It’s big, it’s slow, but it’s going to kill a lot more than 28 people. But we seem entirely unable to act as responsible adults about either of these issues, and most others.

On behalf of the entire adult world I want to beg forgiveness from my children. They have every right to be angry, and since there is no proper target for their anger other than everything, they may need to take it out on me.

Monday, December 10, 2012

Brain Parasites

Speaking of refactored agency, here’s an article covering the new subfield of neuroparasitology. Apparently it is fairly common for one organism to biochemically take over another and influence its behavior, so the host ends up acting in furtherance of its occupier rather than itself.

When I read this, I had a momentary flash of all the other, non-biochemical brain parasites that have infected me: all the ideas and goals I have played host to, concerns for money and status and respect and righteousness. All the people I ever knew and felt a need to impress. All the things in the world that attempt to snag my attention and have planted a symbolic seed in my head that draws me in their direction. All these are parts of me but they are also introjected mechanisms with their own interests, divergent from my own.

But it is probably wrong to say that there is a true self in there that is being infected by outside agents. More like I am constituted by this chaotic assemblage, which is no more or less than human culture.

Thursday, November 29, 2012

Refactoring myself

A new thing for me – I wrote a long guest post on Refactoring Agency at Venkatesh Rao’s Ribbonfarm blog. New in both the venue and the attempt to convey more of a worked-out theory than the usual ephemeral thought that goes into a post here.

I’ve gotten some nice feedback, including one commenter who praised it for its practicality. That left me a little nonplussed, and reminded me of a lesson learned at an earlier startup: if I find myself taking on the role at a company of arguing for sound business practices, then the company is in serious trouble. Fortunately the blogosphere has no real obligations, no payroll to meet, so if we are spinning castles in the air that is not really a problem.

Monday, November 12, 2012

Video sampler

Wibbitz is yet another web company with a stupid name doing something small but clever – automatically producing videos based on text web sites. Here's what they make of me:


I don't know who the various white-haired dudes who pop up under "digital charnel ground" and "transpersonal metacognition" are, and Google images didn't help.

[[addendum: turns out the video is dynamic, so it will reflect the latest few blog entries, meaning the paragraph above is becoming increasingly less operative]]

Wednesday, November 07, 2012

Fearful symmetry

Well, the whole civilized world breathed a sigh of relief last night. For some reason this election seemed incredibly important. Maybe they all do, and then we forget until the next one? This particular moment in history is unlikely to be especially pivotal just because we are in it now. 

But I don’t remember the 2004 election being this charged, and not just because my guy lost that time. Maybe it’s because Bush was something of a bad joke and so was Kerry, while Romney, despite (or because of) his slipperiness, seemed to actually embody something coherent and terrifying. During the campaign I had him pegged as a consummate salesman, the kind of sleaze who talks you into buying a timeshare condominium or undercoating or nutritional supplement that you don’t really need. I could easily see him charming me if I wasn’t well-informed and didn’t know to keep my guard up. Such con artists don’t survive long-term contact, and Romney was overexposed in the campaign, placed into situations where he didn’t have a sales script and so ended up looking like a woefully out-of-place automaton.

Whatever was going on with Romney, for me it had echoes of psychopathy, and defeating him thus seemed to take on the color of moral necessity, or of fighting for something even deeper than that, for the very essence of the human. That is to say, while there were plenty of perfectly commonsense reasons to not want to see this guy as President, there also seemed to be an almost metaphysical undercurrent to him and his campaign — that the forces he represented were inimical to humanity, to knowledge, to everything I value. I can’t quite articulate what I mean here, but I wish Philip K Dick were around, since he specialized in turning the relation between the human and the inhuman, and the pretensions of the latter to the former, into fiction.

But about half the country doesn’t see it that way, far from it. To them, Romney is a fine upstanding family man, and it’s that other guy who embodies an existential threat to their values. There is some really over-the-top commentary today, as you would expect. It’s the end of America! (no links, but very easy to find this stuff [oh, ok, this is too good to resist]). I guess I can sort of understand how they feel using symmetry.

So, what about Obama? What does he signify? He is awfully fortunate in his enemies, that’s for sure. When you are running against something like Romney and the present Republican Party it doesn’t take much maneuvering to make yourself seem like the earthly vessel of intelligence, sanity, and caring. But this was a hard election to win, and he didn’t win it by laying back. He is (whatever else) a masterful politician, and he too seems to bundle up a bunch of cultural tendencies. Deliberately unspecific to allow the maximal amount of projection (remember “Hope and change”? How unspecific can you get? Yet those slogans were just right for the moment). But I give Obama credit because he takes all these inchoate longings, packages them up, reflects them back, and in the process actually gets some stuff done. Maybe not as much as I’d like, but it can’t be easy simultaneously being a synecdoche for “change” and hammering out the details of 2000-page legislation.

Here’s the great Charles Pierce, who is a little more bowled over than I would like, but I agree with what I think he's saying:
The creative project of self-government — hard and frustrating but necessary — is to produce that political commonwealth that changes over time, that can change sometimes by the minute, if circumstances intervene. This whole campaign has been a referendum on that project… That was the issue underlying all the others. That was the fight that Romney and his party quite deliberately picked, reckoning that we had tired of all that hard and frustrating but necessary work the project involved. That was the question that was settled so definitively last night. The long creative project of America has been to engage all its citizens in that work. That is the history that [Obama] wears so well, and that he wields so subtly.
Obama embodies history, Romney embodied something else – not an alternative version of history but almost the negation of it. His constant etch-a-sketching of his own past is symbolic; but the party he leads has the same problem in larger form. The one thing that unites conservatives is the sense of being unhappy with history and wanting to return to an earlier time, back before everything went wrong, a time which might be biblical Rome, 1776, 1950, or the Hollywood version of the Old West. That too is an inchoate mess of feelings, a nostalgia for a time that never was. Obama has a demonstrated ability to actually harness the inchoate into productive action; the right just uses them as a sales pitch for larceny.

So, there is a symmetrical and widening metaphysical gulf between the two sides – “hatred” doesn’t quite capture it, because each side doesn’t just hate the other, they see them as a real existential threat. I can kind of grasp this symmetry in a sort of abstract way, but in fact I don’t think the sides are symmetrical at all. I’m not some detached observer, I am most definitely on one of these sides and not on the other. I do try and understand the views of the other side, but it’s become more and more difficult, and perhaps now that they are solidly on their way to becoming an impotent minority party, I won’t have to.

Tuesday, October 30, 2012

Disaster Porn

Like everyone else who is not actually in the shit, I am watching the Sandy disaster unfold over cable tv and other media. There is something vaguely shameful about the fascination it exerts, but also hints of something spiritual. The difference is in subtle shades of attitude and both may be active at once. What is it about disaster that draws us so?

The spiritual part is a real-life form of PKD's Mercerism, where we all use our empathy boxes to commune with the suffering of others. We watch the horrors not to be entertained but because we want to be there for these poor people, and in fact are there in some sense. We all want to be there, to help, we would all be doing something for them if we could.

The shameful part is when you find yourself using the massive suffering of others for entertainment, for titillation and cheap thrills, when you congratulate yourself on your luck and foresight in not being on the East coast. This attitude is born of lack of empathy, lack of imagination, and lack of ability to feel an active and living connection to others. But it is fed by the media, which can't help but turn suffering into spectacle.

The best way to keep one's mind out of the disaster porn gutter is to actively join in by helping. Since most of us are not in a position to do that, it is easy for our better reactions to degenerate for lack of expression. We can give some money to the Red Cross. Or, given that we are in the midst of an election and have the opportunity to cast our small but significant votes, we can give some thought to what it is that binds us together and what the role of government is in that.

Friday, October 05, 2012

Digital Charnel Ground

David Chapman posted recently on the Buddhist concept of a charnel ground – a practice of meditating on the fact that the world is a grisly killing field, that you and everyone you know and love are fated for death, and that your subtle and sublime existence will end up as nothing more than fuel for monstrous beasts.

I had a vision of a sort of information-theoretic version of that – that life is a process of creating these beautiful, intricate, delicately balanced mechanisms we call living things, and all of them end up getting eaten, their beautiful structures reduced to crude raw material for something else. The universe apparently is a process that generates astonishing complexity and then throws it callously away.

Given that our digital machines are for the most part images of ourselves, mirror-worlds and mirror-selves, is there a technological equivalent of a charnel ground? Well, sort of:

Someday there will be software that is capable of contemplating these in order to free itself from the wheel of reimplementation.

Friday, September 21, 2012

"A Kind of Kafka Steeped in LSD and Rage"

(title stolen from Roberto Bolaño)

So there is a Philip K. Dick festival this weekend. I think I will be going, although not sure precisely why. I am a fan, but not a fan of fandom. But this has too many interesting speakers to ignore. PKD is more than a pulp SF writer – a visionary, spiritual seeker, philosopher, prophet. I get the sense he would be amused by those labels, and perhaps refuse them.

I was first exposed to PKD in my youth, when I stumbled on a copy of Ubik at the library. At the time it seriously freaked me out, and possibly I never completely recovered from that. Ubik is a story of the usual Dickian mid-level corporate functionaries who get caught up by the machinations of implacable and hostile forces, and end up having reality itself crumble around them.

Back then, science fiction (like computer hacking) was a tiny and reviled subculture. Since then both have not only become mainstream, but culturally dominant. PKD's work of course has been mined for movie scripts ever since his death, with middling results. Some of those movies are good, some are even great in their way, but basically none of them (with one exception) seem to capture any but the most superficial aspects of his vision. There are probably many reasons for that, but one is that PKD's heroes are almost uniformly schlubs, and while it is possible to make a movie with a schlubby hero, it is not possible in the context of a summer SF blockbuster. The only movie to really be deeply Dickian, that is, to use the cinematic medium in such a way as to convey the actual themes of Dick's work, is the neglected A Scanner Darkly.

Dick returned over and over again to certain themes, one of which I also seem to return to: empathy, and its absence. There are at least two great inventions of his in Do Androids Dream of Electric Sheep (which formed the basis for Blade Runner): the Voight-Kampff machine, which could distinguish replicants from humans by measuring their empathy, and the religion of Mercerism. The former made it into the movie; the latter did not, which is a pity, but maybe it is another one of those things that does not translate to movies. Mercerism was a kind of jacked-in version of Christianity where believers can share in the sufferings of Wilbur Mercer by means of an electronic device. This kind of thing – the sharing of actual spiritual presence over the intertubes – is something we are quite a long way from, but I can't imagine it not happening eventually.

The idea (but not the actuality) of empathy pervades Dick's world. His characters all seem to be either casualties or agents of its absence. They are trapped in "a maze of death", which is none other than the universe itself, seen as a cruel machine manipulated by sinister false gods. They make pathetic attempts to cling to one another for support, but find the others either unwilling or unable to provide the needed warmth. Genuine empathy, the shadowy presence of the true god blocked by the constructs of the demiurge, exists, but can be seen only in glimpses.

Now that it occurs to me, the parallels between the Dickian and Kafkaesque are too strong to not see, and I feel vaguely embarrassed by not having grasped them before now, given that they are two of my favorite writers. Probably they are too obvious to list, but one that strikes me is the combination of metaphysical terror with genuine if mordant humor. In Kafka, the machinery of death is composed of bureaucracy and the stuffy social structures of prewar Europe, while in Dick it is built out of automation, warring corporations, and drugs. Same shit, different eras – we are all trapped in these alien-yet-familiar systems, systems built by humans but in the process of spinning out of human control.

Dick's universe is a little less bleak than Kafka's perhaps, since he allows the occasional emanation into the false universe of magical life-restoring substances from beyond its boundaries. Thus the title substance/product of Ubik reveals its true nature in the epigraph of the final chapter:
I am the word and my name is never spoken,
the name which no one knows
I am called Ubik, but that is not my name.
I am. I shall always be.
[update: well, that was -- different. I thought I was a PKD devotee, but I was a mere tourist amongst these people who in many cases knew the man personally, and have devoted years of study, and have internalized the man's enormous sprawling oeuvre.  Many serious scholars too, which I didn't expect. PKD is about fragmenting and fluid identity (among other things), and reading his works has a subtle deranging effect on one's sense of self (true of all authors in a way, but Dick takes it to a whole other level). So to be in this crowd was somewhat like being on a PKD drug, and his spirit seemed to enter and animate everyone there, like a more benevolent version of Palmer Eldritch.]

Sunday, September 16, 2012

The Same and Not the Same

I like my last year's Rosh Hashanah post. But I can't write the same one twice.

The solar clock is about to tick again, so here we are, at a boundary between two chunks of time. Year past and year to come, entirely different from our humble embedded perspective, and so damn similar when seen from above. Variations on a theme, and year-to-come will be year-past soon enough. My past-self was wont to scoff at such arbitrary boundaries as secular or religious new year's days; my current-self, made less arrogant and more human by the tenderizing mallets of time, is more apt to latch onto them gratefully, as useful and stable landmarks amidst the chaotic swirl. But these two selves are also variants of some common prototype. They are the same, they are both versions of me, yet different.

"The same and not the same" is a rather koan-like phrase but I got it from the title of a pretty good book on chemistry, and as it happens I am starting a new job once again dealing with computational chemistry, which is looking to be the same and not the same as the one that got me out to San Francisco about 13 years ago. Back then my wife was pregnant with our second child who is now about to have a Bar Mitzvah, which is going to be the same and not the same as that of his brother, three years earlier. These two individuals, who in years past might have been thought of as rough copies of their parents, are working hard on differentiating themselves from us and each other, as is appropriate to their age. That process too has probably happened in much the same way for thousands of years, and is utterly different each time.

Another sage once said that history not only repeats itself, it stutters. More recently I read someone complaining that modern life was like trying to keep time to a shoe going around in a dryer (ie, as arrhythmic and unpatterned a sound as can be imagined). Change is accelerating; my ability to deal with it is diminishing, so the older I get the more I cling to whatever stable temporal patterns are available. This particular one has lasted a very long time, and part of the reason for that is its ability to sweep up even the dubious and reluctant into itself.

Friday, September 07, 2012

Illiterate Programming

For some reason, the group I am working with on a pretty complex Rails application does not believe in comments. They are big time adherents to agile methods, TDD, and that sort of thing, and the anti-comment stance seems to be a side effect of that. The reasons, I'm told, are that code should be self-explanatory, and that since comments can't be validated with tests they are likely to become outdated and inaccurate.

Well, the first reason seems like nonsense. I admit to sometimes taking that position myself, but I'm used to working in languages that have more powerful abstraction capabilities than Ruby (and so it is easier to shape programs to match human thought). And even there, it is almost always useful to put in some comments as guides. The Rails/Agile community seems to put a lot of thought into matching up user-visible structure, program structure, data structure, and the natural language ways to describe them, but their tools are really pretty crude and don't work in exactly the places where you need guidance.

For example, I ran across a method called sanitize_structure. Obviously this was supposed to take some representation of a (molecule) structure and produce a new one that had certain features removed, but what were they? There was no easy way to tell. And that's because this operation is not self-evident, it depends on domain knowledge that is not readily captured in standard Rails apparatus.
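To make the complaint concrete, here is the kind of comment I was missing: a couple of lines of domain knowledge that no test or method name can carry. The details below are invented for illustration – this is not the actual method, and the "keep the largest fragment" salt-stripping rule is just one plausible meaning of "sanitize" in a chemistry context:

```ruby
# Sanitizes a molecule structure before storage.
#
# Illustrative (hypothetical) behavior: the structure is a
# dot-separated SMILES-like string, and "sanitizing" means dropping
# salt/solvent fragments by keeping only the largest fragment.
# This is exactly the kind of domain decision the method name
# alone cannot convey.
def sanitize_structure(structure)
  fragments = structure.split(".")
  fragments.max_by(&:length)
end

puts sanitize_structure("CC(=O)Oc1ccccc1C(=O)O.Cl")
```

Five lines of comment, and a reader with no cheminformatics background now knows what is being removed and why.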

The second reason also seems pretty nonsensical. Yes comments may become outdated, but eliminating them entirely seems like having your legs amputated to get rid of toe fungus.

Now, this group works much more collaboratively than most places I've been involved with. And that means that perhaps code doesn't need to be as self-explanatory as I'm used to – instead, you are encouraged to go talk to people and get them to explain it to you. Talking is good, although it seems both inefficient and risky as a way to transmit software design information.

But I've really been missing the presence of English language in the code. I've tried addressing this technically – I found an Emacs package that lets me insert annotations into the code that do not get saved in the file but instead live in a side-channel. This means at least I can make my own notes and read them later, although nobody else sees them. And commit messages from source control are also a valuable source of human-scale explanations and design rationales.

A lot of agile seems to involve this kind of over-reaction to very real problems of earlier models of software development. But I guess this kind of over-reaction is fundamental to how any new thing establishes itself.

[to go up a few meta levels – I realized recently that the whole point of most of my work, including the commercial, research, and philosophical aspects of it, might be summed up as exploring ways for reconciling computational technology with the way human minds actually work. And given this, I realized that the agile methodology people might have something to teach me in that regard – that is one reason I took this job. I was pretty certain I would find adapting to agile to be a struggle, and it has been, but I think it will be a productive one. ]

[and more meta – I have been thinking about starting a different blog for technical stuff like this – having a technical blog seems to be practically a job requirement these days – but so far can't be bothered. I'll use tags.]

Thursday, August 23, 2012

Personhood USA

First: I thought I was pretty immune to the oddities and offenses of wingnut thought, but I must say I was taken aback to see people parading around a banner with the word "rape", not to protest it but to celebrate the results of it (via Rachel Maddow):
Proposed personhood amendments failed in Colorado two times. Mississippi will be voting on its own personhood amendment this year. In an effort to promote its cause, Personhood Mississippi has started a "Conceived in Rape" tour featuring Rebecca Kiessling, who says she was conceived by rape and was slated for abortion. Kiessling states on her website:

Have you ever considered how really insulting it is to say to someone, "I think your mother should have been able to abort you."? It's like saying, "If I had my way, you'd be dead right now." And that is the reality with which I live every time someone says they are pro-choice or pro-life "except in cases of rape" because I absolutely would have been aborted if it had been legal in Michigan when I was an unborn child.
Second: OTOH, give them credit for a smidgen of intellectual consistency. If you really believe that any zygote with around 46 chromosomes is a full-fledged person deserving of full legal protection, then why would that protection suddenly be withdrawn just because that person happened to come about as the result of a violent assault? If abortion is murder, then it's murder no matter how the vessel containing the victim might feel about it.

Third: OTOOH, not really. As I've pointed out before, if the proposition above were really adhered to, then the infant mortality rate would be around 50% and we'd be holding funeral services over discarded tampons. [[Update: Guess I'm not the only one to notice that. I think that link is a joke site, but I can't be sure.]]

Alright, all of the above was just an excuse for me to write about a term that perpetually won't leave me alone: personhood, now with its own lobbying group and proposed constitutional amendments. The concept exerts a strange fascination, perhaps because it is obviously a social fiction while at the same time absolutely essential to living life. I wrote my dissertation on a related topic (agency and computation), and apparently that was not enough to get it out of my system.

I suppose it is compensation, or a reflection of a basic maladjustment. I figured out a long time ago that my interest in sociology is directly linked to my difficulties with normal society (to put it simply: being a sociologist is like a fish suddenly noticing that they are swimming in this weird "water" stuff and wanting to have a theory of it – and only a fairly weird fish would feel the need). Personhood is just another aspect of the same dynamic, and no doubt underlying it is that faint trace of Aspergerishness that is so common in my chosen profession.

From my point of view personhood appears to have maddeningly contrary qualities: fictional yet real, elusive yet mundane, unknowable while necessary. I don't think a constitutional amendment is going to help.

Sunday, August 19, 2012

Transpersonal metacognition: it's on!

I gave myself the fake title "Chair of Transpersonal Metacognition, University of Saskatoon" (on the right column, until I get tired of it and change it) in a fit of amused randomness, but metacognition turns out to be a real thing, if not quite real enough to have its own department yet. I shouldn't be surprised. If I could only take my offhand inventions more seriously maybe I really would be occupying a loftier chair than my decrepit used Aeron.

Anyway, there are over 63K articles returned by Google Scholar with the word "metacognition" in them. The linked one above has lots of intriguing references, such as a study that showed that telling subjects that free will doesn't exist led to more cheating, and research suggesting that metacognition is located in an area of the brain called Brodmann area 10. All of that sounds fascinating but it somehow misses the main point (or at least the point that interests me), it buries the lede, hides the jewel within a bushel of plain oats. That point is that "metacognition" is not some random phenomenon to be isolated and studied, it is very close to the fundamental quality of being human. Somehow turning it into a serious (but ordinary) academic subfield seems to drain it of interest. That is no doubt a problem with me, not with any of the actual work or people working on it.

Metacognition improves upon the completely broken notion of "consciousness". Consciousness is looking at roughly the same phenomenon but in such a way as to make it mysterious. "Consciousness" suggests a magically transcendent form of self-knowledge; "metacognition" by contrast suggests that we know ourselves in almost exactly the way we know everything else: partially, murkily, filtered through inadequate representations and built-in biases. We are foreign to ourselves, we take ourselves for objects just like anything else.

What about "transpersonal"? The deep idea there is that our self-cognition and our other-cognition are essentially the same, or at least generated by roughly the same processes, and we build up our self-image by observing others, and by observing others observing our self. That is to say, our metacognition, even of ourselves, is "transpersonal", which is just another way of stating the conclusion of the preceding paragraph.

Some philosophical works that touch on this: Paul Ricoeur's Oneself as Another, and (in a fairly absurd way) Daniel Kolak's I Am You. Douglas Hofstadter has been obsessed by it in one form or another for decades and in some ways has already exhausted the topic, now that I think about it. Other computational thinkers have tried to create models that embody reflection, introspection, or self-modelling (Brian Cantwell Smith, Jon Doyle, John Batali, Marvin Minsky). But those never seemed to lead anywhere…although apparently there have been AAAI symposia on metacognition, which I am somewhat sorry I never went to (and now likely never will, having once more derailed myself off the research track).

This blog post may be a personal apotheosis of meta for me, since it is about (meta to) my own relationship with both the idea of metacognition and the academic institutions of knowledge that study (are meta to) it. And of course that last sentence was one more level of meta, and so is this one.

[And to go meta on a different axis: I'm starting a new job tomorrow, so this blog may take a turn towards being more about software development and other mundane matters. Or it might get even more random as I use it to vent all the parts of my mind that do not have a professional outlet. Or it might stay just the same, or stop altogether for a while.]

[And one more: checking my archives, I see I was thinking along much the same lines the last time I was in the midst of a job change. Scary! Or predictable? I guess it's scarily predictable.]

Friday, August 17, 2012

Portraits of Unicode Characters, #2 of a series — Pile of Poo

As you can see by contrast with the previous entry, the vocabulary of Unicode symbols runs the gamut of human concepts, from the gloriously abstract to the all-too-earthy. However, unlike mathematics, where symbolism is in heavy use, and philosophy, where it isn't but probably should be, talking about shit is an everyday activity where the use of typographic condensation has heretofore been minimal. And honestly, the symbol kind of misses the point. Part of the point of talking shit about shit and pointing out the bullshit emanating from shitheads is the pleasure of saying the word, molding it to its subject as it were. So substituting a symbol is somewhat unsatisfying.

Like many other misguided efforts at formalization, this seems like something of a failure at capturing the richness of the concept that it claims to denote. On the other hand, it is perfect in its crudeness. If it were an elegant symbol then it would be an even bigger failure, but as it is – an unsophisticated pictograph, a rather unwelcome presence amidst all the quietly understated mathematical symbols, an insult to their air of aristocratic disembodiment…yes, it may be the vanguard of a peasant uprising among the typographic populace. Calling bullshit is the first step in developing a more sophisticated form of critique, and feces-flinging is not only an old and noble primate tradition, but apparently an emblem of intelligence.

On the unpower of words

In the East poets are sometimes thrown in prison – a sort of compliment, since it suggests the author has done something at least as real as theft or rape or revolution. Here poets are allowed to publish anything at all – a sort of punishment in effect, prison without walls, without echoes, without palpable existence – shadow-realm of print, or of abstract thought-world without risk or eros. ... America has freedom of speech because all words are considered equally vapid.   
           — Hakim Bey

[not really inspired by current events, but resonant with this little episode.]

Friday, August 10, 2012

Decentralized Intelligence Agency

A few weeks back I realized that my lightly-disguised pseudonym needed a business card of its own for meetups and other halfhearted attempts to be part of the Bay Area scene.  Here's what I (or he) came up with:

It's procedurally generated so each one has different colors and other variations, which is kind of fun.

Tuesday, August 07, 2012

Sunset years

For some reason, today I recalled an episode (a recurring episode actually) from childhood that I hadn't thought about in ages.

Imagine myself as a small boychild, smart but hardly differentiated from the world around him.  Raise him in the formica and spaceships and Danish Modern of suburban Chicago, 1965 or so.  Then transplant him to a country inn somewhere in New Hampshire that was full of decrepit German Jews, smelling funny and reading foreign newspapers. This was a vacation trip I was dragged on for some good number of years, as we made an annual pilgrimage to the East Coast to visit my grandparents who lived in Brookline, MA.  They and their friends congregated at this resort called Besin's, which (unlike the rest of my universe) was emphatically not aimed at children.

I honestly have no real memory of what I felt there, but it must have been some mixture of boredom and the sheer stark terror of being in the vicinity of death. Something a seven-year-old would not have the slightest knowledge of, but whose quality is of such a magnitude as to be felt in the bones, in the very fabric of the cosmos.

I can't imagine what prompted me to think of these summers of boredom and the nearly dead, except that tomorrow is my last day of work at [prestigious research institution].

Sunday, August 05, 2012

Fare Thee Well

Still in the process of leaving my current job; which I decided needed some musical accompaniment:

 (listen to at least the 2:00 mark to get my true feelings)

Also too:

You just kind of wasted my precious time, but don't think twice it's alright.

Wednesday, August 01, 2012

(near) Googlewhackblatt: quasidemic

Darn, came really close to coining the word quasidemic but there are a few existing uses.  Actually they all seem to be copies of this HuffPo article on the Dalai Lama which uses it in hyphenated form, so I am going to claim near-originality here.

Meaning: places like SRI, BBN, PARC, and other non-academic research labs that are or purport to be doing academic sorts of intellectual work.

Considering I have been in and out of these places for many years it is odd that I had to invent a word to describe them. (Currently working on going from in to out, so doing some reflection I guess).

[update: whoops, a Googlewhack is a two-word Google query, so my coinage is closer to a Googlewhackblatt.]

Friday, July 27, 2012

Goodpsi to all that

I am in the process of gently tearing myself loose from the cold gray institutional embrace of my current employer, for the more exciting and remunerative arms of a startup. We are both trying to make it amicable, but in any breakup, it is inevitable that feelings will be hurt and old wounds re-opened.

So I thought I'd reopen one that was from way before my time, just because the episode is so bizarre and it fits in with some other historical reading I'm doing at the moment:
[in 1973] Sarfatti happened to read a story in the San Francisco Examiner about research under way at the Stanford Research Institute, or SRI. SRI, much like defense-oriented laboratories at MIT and elsewhere, had been a flashpoint of student and faculty protest just a few years earlier…Stanford's trustees, eager to quell the protests, spun off SRI as a private research enterprise and divested the university's ties to it…the lean years brought new opportunities for some, including laser physicist and former Stanford lecturer Harold Puthoff. Puthoff … joined SRI in 1969 and left the university the following year, when SRI was spun off; in short order his laser-research government contracts began to deflate. With time on his hands, he asked his SRI supervisor for permission to begin conducting a different set of experiments: tests of parapsychological effects. Puthoff was a devotee of Scientology at the time…He had also dabbled in early rumblings of the California New Age scene during the 1960s, including workshops on gestalt therapy and consciousness expansion…he courted another laser physicist from Sylvania's research laboratory, Russell Targ…Together, they jumped into the psi business.

Their big break came in September 1972, when the Israeli performer Uri Geller visited SRI to conduct laboratory tests of his psychic abilities…
       -- from How the Hippies Saved Physics, David Kaiser, pp 69-70. 
Same place, way different eras. "Consciousness expansion" is right out, for one thing, and apparently for good reason.

Wednesday, July 25, 2012

Portraits of Unicode Characters, #1 of a series: ⧜ Incomplete Infinity

The Unicode character set is a huge but barely explored extension of what is essentially now the common linguistic matrix of the human race.  It contains every conceivable glyph and symbol, including some rather strange ones that seem to have been inserted in a spirit of philosophy, comedy, or both. This series will examine some of these and what is being done with or could be done with them.
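This barely explored vocabulary can even be interrogated programmatically; here is a quick sketch in Python, using the standard library's unicodedata module to look up the official names of the two characters portrayed in this series:

```python
import unicodedata

# Print the code point, glyph, and official Unicode name for the
# two characters featured in this series: U+29DC and U+1F4A9.
for ch in ("\u29DC", "\U0001F4A9"):
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
# → U+29DC  ⧜  INCOMPLETE INFINITY
# → U+1F4A9  💩  PILE OF POO
```

The names are baked into the standard, so the same lookup works in any language with a Unicode character database.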

Incomplete Infinity 

In mathematics, at least, it is an absurd concept. Or a redundant one – as Cantor showed, any infinite set is incomplete in a sense, because you can always construct a larger one. But transfinite numbers aside, it doesn't seem to make much sense to talk of an incomplete infinity. A set is finite or it is infinite; there is no in-between. Very large is not close to infinity, and if you take anything away from infinity you are still left with infinity. So there doesn't seem to be anything for "incomplete infinity" to refer to.
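For reference, the Cantor fact alluded to here, in symbols (a sketch of the standard diagonal argument):

```latex
% Cantor's theorem: every set is strictly smaller than its power set,
% so no infinity is ever "complete" -- a larger one always exists.
\[
  |S| \;<\; |\mathcal{P}(S)|
\]
% Sketch: given any map f : S \to \mathcal{P}(S), the diagonal set
\[
  D \;=\; \{\, x \in S \mid x \notin f(x) \,\}
\]
% cannot equal f(d) for any d, since that would give
% d \in D \iff d \notin f(d) = D, a contradiction.
```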

But from a non-mathematical perspective, this symbol is just about perfect, its form and name a masterpiece of compacted metaphysical irony. It is a representation of the human condition – partaking of the infinite, cognizant of the infinite, often driven mad by the infinite, yet irredeemably incomplete and finite. No matter how much we accumulate, the gap between our finite selves and the infinite is never closed, except perhaps in death.

Friday, July 20, 2012

Perverse attraction to wingnut douchebags

I find myself spending far more time reading right-wing blogs and such than I really want to. The reasons are somewhat opaque to me, but the situation seems to have some of the qualities of a classic sexual paraphilia. Not that I have much experience in that regard, but the insistent pull of such distasteful people on my attention must be what it is like to, say, have an amputation fetish – something about the sheer awfulness and ugliness of it makes it literally attractive (i.e., it draws my attention, more or less against my will) in a private and inexplicable way. It is both shameful and oddly exciting, and those qualities feed on each other.

So during a recent bout of perversity, I came across this rant by Rush Limbaugh that seemed to achieve a certain perfect state of douchebaggery, in that it expressed the exact opposite of what is normally considered decent opinion:

If you don't want to listen, I don't blame you. Here is the meat of it, from around 2:30. He is riffing off of Obama and others making the point that people who get rich don't do it all by themselves:
This is about a bunch of people that don't count, this is about a bunch of people with meaningless, miserable lives, lying to themselves, trying to tell themselves that they matter. So you have Mr Big Business guy, Mr Wealthy…well, he couldn't have done it without all of us, we built the roads, we built the regulations, we built the trains yeah, well if you did all that, why are you sitting there with nothing? If you made it all happen, how come you've got nothing? well the rich business guy stole it This is such a crock, this is a bunch of meaningless people who know that their lives don't count for anything, trying to matter, coming up with this ridiculous philosophy that says that successful people have not done it on their own, successful people only exist because of the nameless faceless real true hard – you know before Marx there was no such thing as class-driven economics. If that guy was aborted, we'd have a whole different world today.

This man hates this country.
The distortions in the above are pretty easy to spot (if you would like it shredded in some detail, see here, particularly the argument over the origins of the internet, which is a perennial personal irritant). But of course reason and truth have very little to do with it. I just want to marvel at its utter perfection of a certain quality – hard to say exactly what, but let's call it the resentment of the privileged at everybody else. It inverts the normal liberal platitudes of caring for the little guy, of the democratic notion that every person is worthwhile.

Now, there is in fact something wrong with such liberal platitudes, after all – because we aren't all equal and everyone knows it, but it is not polite to say so. The standards that require us to value all people everywhere just because they are people are (in their naive interpretation, at least) impossible to meet, and thus everyone is forced to be at least a minor hypocrite. Conservatives chafe under this burden; they thrive on imagined rebellion against their liberal moral overlords. So Rush appears to them as someone who can speak truth to "power", but in such a way as to reinforce the real power of the dominant.

Still, it is a little hard for me to see who exactly is supposed to buy this kind of thing. His audience is not the actually powerful, for the most part. The I've-got-mine-jack and screw-the-little-guy routine would seem to alienate anybody who has to work for someone else for a living and has even rudimentary intelligence. It must be part of his magic to somehow polish up the resentments of the putzes who listen to him – I'm picturing them as the lower quartile of white working-class men, but who knows – and somehow unite them on an emotional level with the resentments of the rich.

This is the core emotional move of the Republican party, and we are fortunate perhaps that they've chosen as their presidential candidate the person least capable of pulling it off. Romney is almost a caricature of the out-of-touch monied aristocrat, and one who apparently achieved his wealth through a combination of birth and looting. It's a little hard to pose as a job creator after running a bust-out operation that would make Tony Soprano jealous.

Politics involves the social mobilization of emotions, resulting in some of the best and (more commonly) worst that humanity is capable of, empathy and loyalty and resentment and hate. I guess I don't blame those who avoid it entirely, but I don't think that is really an option for a fully functioning person – it pervades our lives, just like economics, whether we want it or not.

So, again I ask myself why I bother with the worst of the worst like Limbaugh. It may indeed be a kind of mental perversion, but perhaps what I am trying to do is invert some of the emotional dynamics of politics. That is, the normal thing for me to do would be to join with other people with whom I mostly agree (progressives) and jointly get angry at the people we don't like and talk about how they are ruining the country. This is such a normal mode of small talk, especially in the Bay Area circles I travel in, that it is of almost zero interest to me, even if it builds solidarity and sometimes even leads to action. No, I'd rather read the wingers, where I am the object rather than the agent of hatred, resentment, or just dislike. That gives me a little thrill of the forbidden, of darkness, of discomfort. But like any perversion, the minor thrill soon degenerates to a mere mechanical response.

Friday, July 13, 2012

LinkBack Relaunched

A few years ago I made a little browser extension hack that would show you incoming links to each page that you visited. It was kind of crude, and eventually stopped working as the browsers evolved and the underlying service went away.

So, I've made a new version. This one is a proper extension (for Chrome, the browser of choice at the moment), looks and works better, and uses a different service. Please to enjoy; feedback is welcome. I've found it adds an interesting new dimension to web browsing.

Saturday, June 30, 2012

Who you calling "coder", coder?

Although Coders at Work is a fine collection of interviews with a stellar cast of software people, I hate the title and the way "coder" has become the word of choice for the kind of stuff I do. But I still haven't found a good name for it.

"Programmer" is OK but drab; "Hacker" is good but has an unfortunate criminal second meaning; "Maker" is both too new and too broad; "Software Engineer" always sounded vaguely pretentious to me. "Software Designer" is probably the most accurate, but it never caught on. "Software Architect" isn't bad, although it tends to get interpreted too narrowly most of the time. Real architects (of buildings) integrate engineering concerns, knowledge of and the need to support patterns of human activity, and artistic striving…it's a label I'm comfortable with. Or simply "Developer", which is open-ended enough while capturing the creative essence of the activity.

But "Coder" is the worst of the lot; it suggests a drone grinding away in some 19th-century office with a complete lack of creativity or engagement – someone who simply mechanically encodes ideas that were dreamt up by someone else. Blah.

Maybe it's a function of social context. I would be embarrassed to be labeled a "coder" at some non-specialist social gathering, but among software types it functions as something of a badge of honor; it means that you are still hands-on and not some kind of distant manager or philosopher/flamer. Maybe it functions like "queer" or the n-word…a derogation being reappropriated as a badge of pride.

[update: I forgot "Computer Scientist", which is my actual current job title. Of course it is also quite pretentious, and it always brings to mind the adage that any field with "science" in its name isn't one]

Friday, June 15, 2012

Envy and Computational Complexity

While reading this article by Sven Birkerts on artistic envy, it occurred to me that the situation of Salieri in Amadeus has some analogs in computational complexity theory, specifically, the curious dual nature of NP-complete problems. These are problems for which it is possible to recognize (verify) a solution in a reasonable amount of time, but finding that solution almost certainly takes an unreasonable (exponential) amount of time. A typical NP-complete problem involves searching through a very large space of possible solutions (e.g., the possible routes a travelling salesman could take to visit all the cities on his circuit). So, given the problem of finding a route shorter than some given value, you need to examine a very large number of possibilities, which can be impossible to do in a reasonable amount of time. But checking that a given route satisfies the conditions is fast and easy.
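A toy sketch of this asymmetry in Python (the five-city distance matrix is made-up illustrative data): checking a proposed tour takes one linear pass, while finding one means grinding through factorially many orderings.

```python
from itertools import permutations

# Toy symmetric distance matrix for 5 cities (illustrative numbers).
DIST = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]

def tour_length(tour):
    """Length of the closed tour: sum of legs, returning to the start."""
    return sum(DIST[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

def verify(tour, bound):
    """The 'easy' direction: check a proposed solution in one pass."""
    return sorted(tour) == list(range(len(DIST))) and tour_length(tour) <= bound

def search(bound):
    """The 'hard' direction: brute force over all (n-1)! city orderings."""
    for rest in permutations(range(1, len(DIST))):
        tour = [0] + list(rest)
        if tour_length(tour) <= bound:
            return tour
    return None  # no tour this short exists

print(search(26))  # finds a tour of length <= 26 (the optimum here)
print(search(25))  # None -- exhausting all 24 orderings proves it
```

With 5 cities the brute force is instant, but (n−1)! explodes: at 20 cities the verifier still runs in microseconds while the search space holds roughly 10^17 orderings.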

In Amadeus, while being a genius is hard (and thus rare), recognizing genius is pretty easy. Which is the root of Salieri's problem. He can see Mozart's genius, but there is no way he can duplicate it. Recognizing genius is not necessarily a trivial capability, probably not everyone can do it, but it's a far cry from being a genius, from being able to create (or find?) works that exceed some threshold of quality from the unimaginably huge space of possible compositions. He does not have the processing power for that, and those who do seem possessed of something supernatural, that is, beyond conceivable mortal power.

This is something of a follow-up to this post. I've been resorting my collection of old papers and grad-school detritus, and I have to say I've been fortunate to have spent a good bit of time in the presence of actual genius, at least the computer science version of it. Like Salieri, I could readily recognize it, even if I couldn't achieve it myself. I guess I was luckier than Salieri to be also almost completely clueless about social status and rank, so at least at the time it didn't occur to me to be envious, which meant it didn't interfere with my ability to learn what I could from these semi-supernatural beings.

Monday, June 11, 2012

Hating on Haidt

I read Jonathan Haidt's book The Righteous Mind, and may have a full review eventually. It's getting some quite nasty commentary from the left, due to its both-sides-do-it-can't-we-all-get-along conclusions. Haidt presents himself as a standard-issue liberal who has, through the course of study, come to appreciate conservatism for something other than mere selfish stupidity, and thinks that we all have to do the same. This bland bipartisanship infuriates a certain kind of leftist.

I'll reserve my own judgement about all that. But I wanted to note that some of the virulent reaction to this from the left actually works to disprove Haidt's thesis. Among the six dimensions of morality Haidt identifies, "Loyalty" is one of the three he says that liberals are generally deficient in, or do not appreciate, or do not factor into their own judgements. Yet here Haidt is getting dumped on essentially for the sin of apostasy, the quintessential sin against loyalty. Those angry leftists sure do seem to have a highly tuned sense of loyalty after all.

Yes this is only based on a random smattering of blog comments, but I thought the reflexive irony or whatever it is was so sweet as to be worth noting.

Here's a more serious critique of Haidt's sloppiness, and here's a humorous leftist jibe at former leftists.

Sunday, June 03, 2012

Great and Universal Ignorance

The Internet is full of experts, people busily self-branding themselves as the go-to person for whatever is the current marketable bit of technology or business or academic fashion. Personally, I've never had much interest in passing myself off as an expert; I'm more fascinated by the huge and ever-growing ensemble of things that I don't know than by what I actually do know. To get Rumsfeldian about it, there are the known unknowns – the things I know I don't know (but would like to, if time were infinite) – and even at my advanced age there are unknown unknowns, knowable things I am not even aware exist.

Anyway, this passage from a curious little cult book I read in my youth has always stuck with me (emphasis added):
Discoveries of any great moment in mathematics and other disciplines, once they are discovered, are seen to be extremely simple and obvious, and make everybody, including their discoverer, appear foolish for not having discovered them before. It is all too often forgotten that the ancient symbol for the prenascence of the world is a fool, and that foolishness, being a divine state, is not a condition to be either proud or ashamed of.

Unfortunately, we find systems of education today which have departed so far from the plain truth, that they now teach us to be proud of what we know and ashamed of ignorance. This is doubly corrupt. It is corrupt not only because pride is in itself a mortal sin, but also because to teach pride in knowledge is to put up an effective barrier against any advance upon what is already known, since it makes one ashamed to look beyond the bonds imposed by one's ignorance.

To any person prepared to enter with respect into the realm of his great and universal ignorance, the secrets of being will eventually unfold, and they will do so in measure according to his freedom from natural and indoctrinated shame in his respect of their revelation.

– G. Spencer-Brown, Laws of Form

Sunday, May 27, 2012

Open Government and Private Interests

Tom Slee is an interesting writer (who occasionally comments here), who has published a good book on the limitations of markets and seems to generally fall into the "cyberskeptic" category. He recently published a couple of blog posts critiquing the Open Government movement (or "movement", since one of his complaints is that it is not actually a coherent political movement). I thought these were interesting but misguided, and felt like trying to tease out some details from my initial reaction.

First, why is this important? Well, Open Government is merely one of the many technosocial movements that seems to have spun out of the O'Reilly publishing empire and is deeply enmeshed with it. "Makers" is another, "Big Data" is another, "Web 2.0" was one a few years back but now seems to have been mothballed. All of these are bigger than O'Reilly of course, but he's played a central role in crystallizing these tendencies, generating slogans for them, and (I presume) making a boatload of money off of them.

Whenever I contemplate these I have a split reaction. On the one hand, they are all pretty damn good directions to move in, very much in tune with my own values and thoughts. He's mastered the art of slapping useful labels onto the not-quite-yet-born aspects of the tech world, labels that are just vague and suggestive enough to launch a movement. Yet there's this element of silicon valley hype cycle to them that turns me off as well. Mostly I try to ignore that latter bit, make my peace with it.

But aside from the aesthetics, there is often a huge element of economic centralization that is also happening with these very distributed movements, and that is something that deserves some closer examination. The way O'Reilly makes money off of amateur enthusiast "makers" is a relatively inoffensive example; the way Facebook makes billions off of everybody's private social lives is something more serious. (see "digital sharecropping").

I imagine Slee has somewhat similar feelings, but in his case he's turned this into an ideological conflict between "neoliberalism" and something else (progressivism, I guess). The market side of these movements is more of a turn-off for him than for me, and it may seem particularly abrasive in the case of Open Government, where it tends to support privatized delivery of what are now public services. I don't like the use of "neoliberalism" as a curse word, and am not sure it's even a very coherent ideology. Let's use the more straightforward "market-based mechanisms". These have their place; I don't think progressives should define themselves by a reflexive opposition to the market or market-based solutions to problems.

Here's Slee's most salient complaint against Open Gov:

It's co-opting the language of progressive change in pursuit of a small-government-focused subsidy for industry.
This seems kind of twisted towards the negative, although every element of it has some truth. It is using the language of progressivism, it does enable new ways for private interests to profit from government data, but I just can't see that as an automatic bad thing. The question is whether all that private interest is being harnessed towards the public good or is merely exploiting it, and it is too soon to tell which of those will prevail.

Anyway – one of the reasons Open Government seems like a good idea is that everybody knows that Gov 1.0 is completely broken and desperately needs reinvention. Letting Silicon Valley get its hands on it doesn't seem like such a bad option, even if that means a gold rush of private interests trying to profit from the newly opened data landscape. But it would be better if these things were thought through, if actual politics were brought into the discussion, and if we didn't pretend (this is a common computer delusion that I probably share) that all we have to do is establish digital connections between everything and peace, harmony, and intelligence will reign.

The real world involves interests and conflict. "Government as a platform" ignores the fact that platforms are contentious political creatures themselves. Even a purely technical platform, say, Java, is not merely technical, different corporate interests are served by its adoption, what rules it follows, etc. So yeah, government is a platform, that's really an excellent way of looking at it, as long as we realize that platforms have politics even when they aren't directly a part of state power.

I remain pretty excited by Open Government, I think it points to a whole slew of services and solutions that haven't been invented yet, and I don't mind all that much if some people are trying to make money off of it. That's how America works, it seems better to accept it than fight it.

[This is not directly relevant but presents a somewhat parallel dilemma; another way in which state power, the counterculture, and technology are intersecting in ways that are intriguing, powerful and somewhat scary.]

Sunday, May 13, 2012

Intellectual Archaeology

One thing you can say in favor of electronic books – they don't have to be stored in the garage because of lack of shelf space, so they don't get used as nesting material by the local rodent population. I spent much of the past weekend mucking the rat poop out of the garage, and in the process needed to carefully examine each precious volume as I removed it from one chewed-up box into a new and hopefully impenetrable plastic one.

It's like doing archaeology on oneself; every past interest is there, like shards of pottery, buried in a succession of layers. Every time I do this (during moves or major housecleanings or emergencies like this) I re-sort them, based on vague criteria of how they fit together (like "professional" vs other, but in fact I work as hard as I can to blur that line, so that doesn't really work very well). Some even make it to a box to sell or donate, but not very many this time, since what remain after the last few filters are those I can't make myself part with. A few get pulled out to give to my children, but they are at the stage where they would rather find their own books.

It is difficult to pretend that all these books, all these abandoned directions of thought, represent a coherent project, that I was after something during all that time I was accumulating them. Yet there is something that ties them all together, even if it's only my personal history. That is rather disordered, to be sure, but it's not nothing.

I guess I continue to cart around these obsolete chunks of dead trees because they do in some weird way constitute a large chunk of my identity. And just like the part of myself that lives in the head, they are falling victim to decay, to the depredations of nature and time.

Monday, May 07, 2012

The Prophet Motive

Our shul is having an internal debate on how much to align ourselves with the Occupy movement. So we had a teach-in, and for that I did some research on the Biblical and earlier history of the Jubilee tradition. Turns out damaging concentrations of wealth, and ways to deal with the economic problems they cause, have both been around for a long time. The Torah is full of what is basically commercial law, which has always seemed kind of boring next to miracles and war and murder and the like, but when approached with the right mindset can be quite fascinating.

One insight I did not bring to the table: The movie version of Fight Club is centered around a debt-retirement scheme, achieved in that case through violence represented by blowing up the office buildings of credit card companies.

The teach-in was only moderately successful from my perspective. I am not big on face-to-face group meetings/discussions; I vastly prefer to have such things over email or a similar medium, where you don't have everyone constantly jockeying to get their turn to speak, where digressions don't waste people's time, and where points can be followed up at leisure…on the other hand, you can't share food that way, and that is half the point of these things.

[Earlier postings on the subject]

Sunday, April 29, 2012

A Consumer's Guide to the Foundations of Reality

After reading about the crazyist approach to metaphysics, I was inspired to make a survey of various things that I have heard proposed as the true underpinnings of reality; the One True X that Underlies Everything Else; the bottom layer of the cosmic architectural stack. I suspect this question is even more impenetrable to common sense than is metaphysics of mind. Not really much of a surprise; this is why we have religion and poetry and art, which seem to do a much better job on such things than philosophy. But that doesn't stop people from trying to approach the big questions with a more prosaic frame of mind, so here's a list of candidates, with my own personally biased annotations. I'm throwing together philosophically serious metaphysics with New Age hoohah and other miscellany, because it seems appropriate – the former kinds don't really seem any sillier than the latter once you discount the cultural packaging they come in.


Materialism, "atoms and the void". Materialism doesn't seem that crazy to me, but possibly my common sense has been warped by decades of hanging out in the vicinity of artificial intelligence labs. For most people, picturing the universe (and more to the point, themselves) as complicated machines is crazy, because there's nobody home.

Feel like I should mention Thales and the other presocratics for proposing not just that the world was made of matter, but a specific kind (water, in his case). This may have been one of the earliest times that the metaphysical question arose, and that someone tried to posit a generalized single stuff making up the plenitude of different stuffs found in the world. So we can forgive him for making a crude guess, but maybe not for opening up this unanswerable and unprofitable line of thinking in the first place.


Idealism is another philosophical classic, the dual of materialism; it never made much of an impression on me. Pretty crazy – Samuel Johnson famously demonstrated as much by kicking a rock. Still, it dominated European philosophy for a long time and is not dead.

[David Chapman, whose work you should read if you like this sort of thing, would call idealism simply wrong. My own point of view – and maybe it just means I am not taking these questions as seriously as he does – is that weird-ass philosophical ideas, like weird-ass religious ideas, cannot be "wrong" or "right". They convey world views, and the best you can do with them is get a feeling of "yes, I can with a bit of straining envision what it is like to see the world in this way". It's possible that some of these worldviews may be more or less helpful or harmful to your well-being, but that is to some extent independent from whether they are interesting.]


"In the beginning was the word". We can't escape language, it is everywhere we look because we bring it with us:
Elements of what we call language penetrate [so] deeply into what we call reality that the very project of representing ourselves as being mappers of something language-independent is fatally compromised from the start.
– Hilary Putnam, quoted approvingly by Richard Rorty

Given that, it's easy to see it as somehow foundational.


Stories

The universe is made up of stories, not of atoms.
– Muriel Rukeyser, The Speed of Darkness

Mathematical Structure

This one (articulated by Max Tegmark) may be my current favorite, as it takes my innate tropism towards abstraction and formal elegance to an ultimate conclusion. But it also seems somewhat static and dead, as do many other scientifically-minded forms of metaphysics that de-temporalize time.


The theory that the universe is a cellular automaton and physics is computation. Pretty crazy when it was invented (by Edward Fredkin I think), it soon became a staple of hard SF and transhumanoid economists, and by now is almost taken for granted in certain circles. These theories always lead, of course, to the suspicion that the particular computation under consideration is not really a bottom layer at all, but that instead our universe is a simulation running on a vast computer in some larger (more fundamental) universe. Does this stack of virtual machines bottom out somewhere? If not, see "recursivity"; if it does, well, then that is the true foundation.
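As a concrete toy of what "physics is computation" amounts to, here is a minimal sketch of an elementary cellular automaton update. This is my own illustration, not anybody's actual proposed physics; the rule-number encoding follows Wolfram's convention:

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.

    Each cell looks at its left neighbor, itself, and its right neighbor
    (wrapping around at the edges). The 8-bit rule number gives the next
    state for each of the 8 possible three-cell neighborhoods.
    """
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

# A lone live cell under rule 110 grows leftward:
row = [0, 0, 1, 0, 0]
row = step(row)  # -> [0, 1, 1, 0, 0]
```

Rule 110 is known to be Turing-complete, which is exactly the sort of fact that fuels the intuition that a rule this tiny could, in principle, compute a universe.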


Music

A somewhat-occult tradition going from Pythagoras and Robert Fludd, through Harry Smith, to various new agers.
Pythagoras conceived the universe to be an immense monochord, with its single string connected at its upper end to absolute spirit and at its lower end to absolute matter–in other words, a cord stretched between heaven and earth.
I'll say this about music: it has the unique capability of serving mathematical pattern formation/seeking, gut-level emotion, and the spiritual, whatever that is. That doesn't mean it constitutes reality, but as it transcends a whole bunch of everyday categories, that seems to locate it somewhere beyond.


Distinction

From G. Spencer-Brown's Laws of Form, which derives boolean logic and the entire universe from the simple act of imagining a distinction. The most compact and elegant foundation I've encountered, although it's not clear what can be built on it – efforts to ground more traditional mathematics on it have faltered as far as I know, and it remains a fringe work.
Thus we cannot escape the fact that the world we know is constructed in order (and thus in such a way as to be able) to see itself…This is indeed amazing…But in order to do so, evidently it must first cut itself up into at least one state which sees, and at least one other state which is seen. In this severed and mutilated condition, whatever sees is only partially itself…In this condition it will always partially elude itself.
A similar idea also apparently appears in Deleuze, but I've never been able to make much sense of him.
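For the curious, the whole primary arithmetic is small enough to sketch in code. Below is my own toy evaluator (not anything from the book itself): an expression is a space containing crosses, each cross being a list of its own contents. With the usual reading of marked = true, juxtaposition comes out as OR and enclosure as NOT:

```python
def marked(expr):
    """Evaluate a Laws of Form expression.

    expr is a space holding zero or more crosses; each cross is itself a
    list (its contents). A space is marked if any cross in it is marked,
    and a cross is marked exactly when its contents are unmarked.
    """
    return any(not marked(contents) for contents in expr)

# The two axioms of the primary arithmetic:
assert marked([[], []]) == marked([[]])   # calling:  () () = ()
assert marked([[[]]]) == marked([])       # crossing: (())  =  (blank)

# Boolean logic falls out: with marked = true, (a) is NOT a and ab is
# a OR b, so ((a)(b)) is a AND b:
a, b = [[]], []                 # a = true (marked), b = false (unmarked)
assert marked([a, b]) == True   # a OR b
assert marked([[a, b]]) == False  # a AND b
```

That the entire boolean calculus reduces to one recursive rule over one primitive act of distinction is, I think, the source of the book's strange appeal.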


"The world is the will to power -- and nothing besides!". – Nietzche

"According to all this we may regard the phenomenal world, or nature, and music as two different expressions of the same thing...” will, the fundamental world-stuff, expressing itself as nature indirectly and indistinctly as through Platonic Ideas, but immediately and subtlely in music as will-in-itself." – Schopenhauer (note the link to music also)

I wish I had time to study these thinkers in more depth, because I think I suffer from exposure to the cartoon versions (Nietzsche became a cartoon version of himself, confounding the issue even more). But I have experienced the feeling that a will to exist lies at the core of everything. See the next entry:


Life

Vitalism, Hylozoism, the life force! I don't quite grasp how it works as a metaphysics, but architect Christopher Alexander has published a beautiful four-volume demonstration of the living universe, so I bow to him:
I state this by means of the following hypothesis: What we call “life” is a general condition which exists to some degree or other in every part of space: brick, stone, grass, river, painting, building, daffodil, human being, forest, city. And further: The key to this idea is that every part of space— every connected region of space, small or large—has some degree of life, and that this degree of life is well-defined, objectively existing and measurable.
I believe that this is true; not just a nice way of talking. As I try to explain it, quietly for all its grandeur, and try to make the artist's experience real, I hope that you, with me, will also catch a glimpse of a modified picture of the universe. 


God

A popular favorite. The kind of crazyism that is so baked into culture that it stops being crazy and just becomes boring. Nonetheless, seems to work for a lot of people.

The Absolute

Seems tautological, in that it says that there is a bottom layer to reality and gives it a name, without being able to say anything sensible about it. And it's also just "God" with all the anthropomorphism stripped out, but I'm starting to suspect that anthropomorphism is the only redeeming feature of religion. A very 19th-century idea, but I don't suppose you can properly appreciate what the 20th century was all about without understanding what it was rebelling against.

The Tao

Serves about the same role as "the Absolute", but in a less ponderous, more poetical form, a little more apophatic, able to acknowledge the absurdity of trying to grasp the infinite with finite tools. "I call it Tao, but that is not its name".


Pattern

A favorite of somewhat new-agey yet scientific thinkers like Gregory Bateson or Christopher Alexander (see Life). Appealingly abstract. Patterns of what, though? Perhaps that question is missing the point.


Process

This is way too vague for me, but I realize I know nothing of Whitehead. From skimming Wikipedia, I guess that the innovation of this metaphysics was to dethrone the eternalist point of view in favor of something that can incorporate change. Sounds like a good idea.


Me

Well, I'm the only fixed point in the swirling chaos (from my own perspective). Perhaps I invented it all! Leads to solipsism and madness and hence is ultimately boring.

Humans, or Intelligence, or Consciousness

The anthropic principle, the collective form of Me.

Self-interest, evolution, conatus

Metaphysics for economists and evolutionists. What is real is what persists; what persists is what can act in its own interest. Somewhat similar to will, I suppose, although without the Sturm und Drang.


Love

A metaphysics based on love seems too gloppy to support the violent universe we live in. And the word has become weighed down with tacky usage. But let me just acknowledge the genuine religious emotions that can come along with this idea and leave it at that.


Recursivity

It's turtles all the way down. Which is close to saying that there is no bottom layer, which will be the subject of a later post.


Status

My own invention, sort of, when I realized that much of the argument over which of the above concepts is the One True Foundation of Reality can be reduced to status-competition games among different social groups. E.g., if materialism is true, scientists get more respect; but if some form of religion wins this competition then theologians and philosophers get more respect. This is most evident today in the sputtering and fruitless debate between "new atheists" and their opponents. In other words, the real underlying force and substance behind everything is status-seeking (see "self-interest" above, but this is on a somewhat more meta level). The question of ontological priority is really a question of social priority, and status thus becomes more fundamental than any of the tools used to achieve it.

Naturally many of these overlap. After all, insofar as any of them are even a little bit true, they must be different descriptions or aspects of the same thing. Gather enough of them and some common dimensions seem to emerge (eg, human-centric vs not, static vs energetic/dynamic, knowable vs. unknowable, poetical vs formal, reductionist vs holist).

[ [ Next installment: the cure for metaphysics ] ]

Saturday, April 21, 2012

A useful term

Handschuhschneeballwerfer means "the coward willing to criticize and abuse from a safe distance", or literally, someone who wears gloves while throwing snowballs.

Doesn't that describe most of the blogosphere?

From this list of German words with no English equivalent.

Saturday, April 14, 2012

Happy Ruination Day!

This blog is reputed to have a crush on Gillian Welch:

Nicely sandwiched between Friday the 13th and income tax day, April 14 is the anniversary of Lincoln's assassination, the sinking of the Titanic, and the worst of the Dust Bowl. So far it's been a pleasant enough day here and now, but I'm a bit edgy.

Friday, April 06, 2012

"Don't Obligate Yourself"

I try to avoid commenting on day-to-day political stuff, because there's so many other places to go for that, but this segment of Supreme Court deliberations on the ACA grabbed my attention:
“[The uninsured are] going into the [health care] market without the ability to pay for what [they] get, getting the health care service anyway as a result of the social norms…to which we’ve obligated ourselves so that people get health care,” explained Solicitor General Donald Verrilli.
“Well, don’t obligate yourself to that,” Scalia countered. “Why — you know?” 
“Well, I can’t imagine…that the Commerce Clause would forbid Congress from taking into account this deeply embedded social norm,” Verrilli responded.
“You could do it,” Scalia retorted.
So Scalia, the most prominent intellectual force on the right side of the court, believes not only that the ACA is unconstitutional, but apparently would also like to get rid of the law that obligates hospitals to provide emergency care to the uninsured (the Emergency Medical Treatment and Active Labor Act, passed in 1986 and signed by Reagan). It is, I suppose, consistent in a glibertarian sort of way. The government is powerless to provide or enforce almost any positive good whatsoever, even to keep people from dying in the streets, although it's perfectly free to strip-search you for no reason at all.

More from Dalia Lithwick:
This morning in America’s highest court, freedom seems to be less about the absence of constraint than about the absence of shared responsibility, community, or real concern for those who don’t want anything so much as healthy children, or to be cared for when they are old. Until today, I couldn’t really understand why this case was framed as a discussion of “liberty.” This case isn’t so much about freedom from government-mandated broccoli or gyms. It’s about freedom from our obligations to one another, freedom from the modern world in which we live. It’s about the freedom to ignore the injured, walk away from those in peril...And now we know the court is worried about freedom: the freedom to live like it’s 1804.

Wednesday, April 04, 2012

It is Forbidden to Forbid

I made my annual pilgrimage to the Anarchist Bookfair this past weekend (previous years here). As usual, had trouble embracing the scene. Bought The Art of Not Being Governed, which I've been meaning to read for a while, and some others.

One thing that always interests me, but it's sort of a forbidden topic, is how all these people who are vehemently anti-capitalist and anti-existing-system manage to survive. Somehow they feed themselves after all, and if it's through the underground economy that's still an economy of some sort. They don't all live on communes in the country.

This question arises on a different level when it comes to the vendors, who are running little businesses promoting an anti-business attitude. Some of these seem to be fairly large-scale and stable affairs. If I was hostile I would use this to dismiss the whole scene, but I'm not really – any revolutionary or agent of change has to simultaneously live within the world as it is while plotting to overthrow it, and thus has to live the contradictions.

Anyway, for some reason the day before, my own mental barriers between anarchy and entrepreneurship suffered a partial collapse, which led me to open up this t-shirt shop. So far it has neither enriched me very much nor done much to subvert the dominant paradigm, but it's early days.

Caught a bit of activist Scott Crow's presentation; the takeaway I got from him was that open collectives (where anybody can join) don't work very well; closed, tightly focused groups with shared values work better. That makes a lot of sense, but it raised a lot of unanswered questions about just what anarchist governance could mean. The same largely unspoken question hovered over a panel on anarchist parenthood.