Sunday, December 25, 2005
Without him, apples wouldn't know how to fall and orbits would be the result of some sort of Platonistic metaphysics.
The poles are melting and it's pretty much too late to do anything about it. Some are predicting sea-level rise of eighty feet.
The NYTimes does a piece on information overload, dead-trees-bound-in-cardboard holiday version. BTW I've been playing with both Delicious Library and the Web 2.0 competitor, LibraryThing, and am thinking the solution may be in compiling databases of books rather than actually reading them.
If you were wondering where the title of Syriana came from, here's an explanation, sort of.
Wednesday, December 21, 2005
WELLINGTON, N.Z. (AP) - A group of 40 people dressed in Santa Claus outfits, many of them drunk, went on a rampage through Auckland, New Zealand's largest city, robbing stores, assaulting security guards and urinating from highway overpasses, police said Sunday.
Hat tip to Orac.
Monday, December 19, 2005
Here's the essence, in my view:
- They extract value from massive amounts of freely available stuff. Doing anything massively requires a big computation infrastructure, so there's a barrier to entry.
- They provide some added value for free to the community (search, tools for making more stuff). This brings them attention, the important currency of this economy.
- They figure out a way to extract some money on the side (literally in their case, that's where the ads are).
Of course, that's the user's view. In reality the tail wags the dog, and the advertising part of Google is actually much bigger and more important than the first two points. Google puts much more effort into the advertising side than into the search side, which makes sense -- all those free meals and fancy buildings and jet planes have to be paid for somehow.
In the meantime here I am creating content for free using Google-owned tools.
Recently bhyde and others have commented that long-tail economics brings benefit only to the hubs, the filtering and aggregating services. I don't think that's quite right. Assuming Google as a paradigm, it actually does bring some benefit to small content creators who can run advertising and take a share of the money stream. The people who are going to get killed are the middle-sized middlemen. Economies of scale will produce a few massive hubs like Google or EBay, consumers and small producers on the edges will get some benefits, and the smaller aggregators (like the huge publishing house that I work for, or the small publishing house that my friend runs) will suffer.
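The hubs-and-edges claim can be put in toy-model terms. A minimal sketch, assuming (purely for illustration -- the exponent and player count are my made-up parameters, not data) that the money in an aggregation market follows a power law over the ranked players:

```python
# Toy model: player at rank r earns in proportion to r**-1.5.
# With a steep enough power law, the top few hubs capture most of the
# money and the middle-sized middlemen get squeezed, while the long
# tail of small players splits what little is left.

N = 10_000
weights = [r ** -1.5 for r in range(1, N + 1)]
total = sum(weights)

def share(lo, hi):
    """Fraction of total revenue going to players ranked lo+1 .. hi."""
    return sum(weights[lo:hi]) / total

print(f"top 3 hubs:        {share(0, 3):.0%}")
print(f"next 100 (middle): {share(3, 103):.0%}")
print(f"long tail:         {share(103, N):.0%}")
```

The exact numbers depend entirely on the assumed exponent, of course; the point is just that concentration at the head falls out of the power-law shape itself.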
Note: I do not actually have an Internet pundit license and may have no idea what I'm talking about.
Sunday, December 18, 2005
That doesn't mean the lefties are wrong to act indignant and press the issue, now that it's in the open and acknowledged, but there seems to be an element of theater to the whole thing.
Max Sawicky at least seems to share my attitude:
For a red diaper doper baby (o.k., not really red, at least a light pink) like myself, it's possible to get very cynical about these revelations that the USG is flagrantly violating the law in its surveillance of U.S. citizens and treatment of foreigners. To me this is not news, it's always been done, it's the way this unAmerican American government works.
There is an important difference, however, in what you or I think and what everybody thinks. (More than one, actually.) Even when it's the same thing. The difference is timing, which hinges on what is commonly regarded as reliable reportage. The masses may decide what is true much later than you, and that point in time is an important political milestone. So cynicism (or as I prefer to say, realism) aside, now is the time to pump up the volume.
Saturday, December 10, 2005
"Q: Why did the chicken cross the road?"
"A: To cause a global pandemic."
Tuesday, December 06, 2005
So yeah, a standard visual notation for expressing software designs is a good thing. Nonetheless UML appears supremely braindamaged to me, for many reasons:
- the presentation of UML seems inextricably linked to a particular notion of software development process. Thus, you have diagram types that are sort of similar (concept and class diagrams) but whose main difference is that they are intended to be used at different phases of the process (analysis and design). Unnecessarily redundant, and they are close enough to each other to cause cognitive confusion.
- UML concepts are loose, vague, sloppy, and weirdly named. For instance, "attributes" and "associations". Both of these are (ultimately) links between an object and something else, but the former is used for primitive types and the latter for object-object relationships. Argh. But actually, when I quizzed the presenter about this, he said that associations turn into "association attributes" or maybe it was "attribute associations". Double argh.
- UML is a visual language, but the tools used to create diagrams all suck like a black hole. OK, maybe they don't all suck, I haven't tried them all, but Visio and ArgoUML and Rational Rose do. They all have one or more of: lousy UIs, missing features, idiosyncratic notation (piled on top of the built-in idiosyncrasies of UML).
- Since the tools suck in different ways (for instance, ArgoUML is actually sort of OK but is missing some of the diagram types), it might be possible to compensate by using more than one and passing the UML between them. Ha. UML is a well-established industry standard, right? You'd think that as a standard, it might be possible to write out a UML diagram from one UML editor and read it in with another. You'd be wrong. There is NO established serialization format for UML. This I find amazing and hard to understand even taking UML on its own terms.
- Related, but getting more philosophical, UML constantly fudges the extent to which it is trying to be a formal language. On the one hand, we are just drawing pretty pictures so the marketing guys can see what we are doing. On the other hand, we introduce all sorts of details that suggest we are really writing code, like public/private indicators on attributes. And then everybody tries to make tools that can do "round trip engineering" and go back and forth from UML to code (I've never seen one of these work).
- There's some undefinable grating quality about all discourse surrounding UML. I think it's related to the above: its attempt to straddle the gap between actual code and something else. This something else might be "business", but if so it's expressed in some way I don't understand.
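To make the attribute/association complaint concrete, here's roughly what the distinction amounts to in plain code (my own example, not from the presentation): both are just slots on an object, differing only in whether the slot holds a primitive value or another object.

```python
# In UML terms (as I understand them), Book.title would be an
# "attribute" (a link from an object to a primitive type) while
# Book.publisher would be an "association" (a link from an object to
# another object). In code the two are the same mechanism, which is
# why the terminological split feels gratuitous.

class Publisher:
    def __init__(self, name):
        self.name = name                # "attribute": object -> primitive

class Book:
    def __init__(self, title, publisher):
        self.title = title              # "attribute": object -> primitive
        self.publisher = publisher      # "association": object -> object

b = Book("The Theory of Moral Sentiments", Publisher("Example Press"))
print(b.publisher.name)  # -> Example Press
```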
I guess I'm extra miffed because I actually like diagrams and visual programming, but it seems to me that we have a case of a poor standard sucking all the air out of the space of visual object-oriented programming. Also, I completely agree with what I see as the main goal of UML, which is enhanced communication between engineers and customers/domain experts. Again, I just think this is an abysmal way to go about it. Object-oriented programming is supposed to do this in itself, without an extra layer of sloppily-defined cruft.
Maybe I'm just missing some sort of business-oriented DNA. I went through an experience some years ago, where I (and my cow-orkers) first encountered relational databases. Everybody there was from sheltered academic environments where we had never seen such a thing before, and we regarded it mostly with hostility and didn't make very good use of it. Eventually I had an epiphany where I finally got the value of the relational model, and understood how real-world applications might actually be interested in data. Perhaps UML similarly has value that is just hidden from me due to my lack of appropriate background, although I doubt it.
Google fodder: UML sucks
Wednesday, November 30, 2005
Somebody with more time and energy than I have really ought to put up a website for the Secular Conspiracy to Ban Christmas. Trolling the right is fun; back in my youth when The Last Temptation of Christ came out and was being picketed by religious wackjobs, me and some subgenii joined in with a protest of our own that was subtly designed to be ambiguously read (Satan wants you to see this movie! Independent thought is from Satan!). So what's with the youth of today with their web design skills and all that? America needs you!
This latest wingnut meme is so ridiculous one might be tempted to ignore the blatant antisemitic undertones, but they are pretty obvious.
And it has become pretty general. Last Christmas most people had a hard time finding Christmas cards that indicated in any way that Christmas commemorated Someone's Birth. Easter they will have the same difficulty in finding Easter cards that contain any suggestion that Easter commemorates a certain event. There will be rabbits and eggs and spring flowers, but a hint of the Resurrection will be hard to find. Now, all this begins with the designers of the cards.
Then there is the anarcho-theocratic christmas conspiracy, which I don't think is a put-on, but honestly I can't tell for sure, so it's a damn good job if I'm wrong.
The International Jew: The World's Foremost Problem
Update: OK, I knew you folks wouldn't let me down.
Here's the official proclamation.
A more succinct statement.
Saturday, November 26, 2005
This last point, which is not really an argument, is probably at the root of it. Publishers are scared, and rightly so, of Google and the entire Internet, which is a threat to their business model. Publishers are middlemen, and the net generally serves to drive them out of business with much cheaper and often better alternatives (like Craigslist is doing to newspaper classifieds). I happen to work for a company that is owned by one of the largest publishers in the world, and they are scared, so I guess it's reasonable for a one-person company that is run out of a living room to be scared too.
- It's a copyright violation to scan the whole book (arguably true, but the argument applies equally to scanning web pages, so if this were to hold it would put search engines out of business, not good for anybody).
- Serving up excerpts is not fair use (seems false to me, though I suppose there's some legal case to be made).
- Once they serve up excerpts they will then go ahead to serve up entire copyrighted works without paying the copyright holders (certainly a false argument; they could obviously be sued if they started doing that).
- It's rude of them to mess with copyrighted works without asking permission first (not a legal argument, but true as far as it goes. Google seems to be pissing people off unnecessarily).
- Google is a huge behemoth with a $400 stock price, whereas small-press publishers are tiny, marginal, we-do-it-for-love operations, and they are afraid of getting crushed under the wheels.
Publishers are not mere middlemen, they can add a lot of value by finding, nurturing, and promoting authors. A lot of the infrastructure of the counterculture is associated with threatened old-media microinstitutions like independent bookstores and small publishers and magazines. These marginal economic activities provide a living to a multitude of authors and middlemen. If all this is replaced with a structure that consists of unpaid content creators (bloggers) and huge technocorporate behemoths (Google, telecoms) that is not necessarily an improvement. Content may be more diverse, but all the money flows to the big entities rather than the creators.
Thursday, November 24, 2005
He doesn't speculate as to why, then, the Bush administration and its backers insist on cutting taxes while debts are mounting. Is it possible they aren't even good at being selfish? I don't think it's any sort of actual anti-government ideology, or they'd be cutting spending as well as taxes. Stupidity then, or more specifically blind animal greed that can't even recognize its own self-interests.
Tuesday, November 22, 2005
Does "Web 2.0" mean anything more than the name of a conference yet? I don't like to admit it, but it's starting to. When people say "Web 2.0" now, I have some idea what they mean. And the fact that I both despise the phrase and understand it is the surest proof that it has started to mean something.
Yeah, I had the same sort of epiphany he describes. Something is happening here/what it is ain't exactly clear.
But there is a common thread. Web 2.0 means using the web the way it's meant to be used. The "trends" we're seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the Bubble.
That's an interesting thought; Web 2.0 is really a return to Web 0.1 -- more of the intelligence augmentation, social intelligence, peer-to-peer, conversational stuff that was part of the original hypertext vision through Bush, Engelbart, Nelson, and Berners-Lee. A return to traditional values, as it were.
Monday, November 21, 2005
Ominous signs: first, various neterati or whatever they are called are pointing out how Google is growing a global brain and we should panic and/or worship it.
Then, Amazon comes up with a scheme to turn humans into subroutines for their version of the GB, giving it a mildly creepy name to ensure unease.
Hm, it occurs to me that this service is sort of the converse to CAPTCHA systems. So if CAPTCHA is supposed to prevent various kinds of web-spam, turning human intelligence into a web service can defeat it, with some small cost increment. It's a whole new twist on the Turing test (or the more modern Voight-Kampff Empathy Test) -- make your mechanical system more human-like by incorporating actual bits of humanity in it.
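A toy sketch of what "human intelligence as a web service" looks like from the caller's side (all names here are hypothetical and the stub just returns a canned answer; a real service would post the task and route it to a paid human worker -- this is not the actual Amazon API):

```python
# Hypothetical human-task client. The CAPTCHA-defeating point is just
# that the caller's code doesn't care whether a machine or a human
# produces the answer; the human is a subroutine with a per-call cost.

def ask_human(task, payload, reward_cents=2):
    """Stub standing in for a blocking call to a human-task service."""
    # In reality: POST the payload, wait for a worker to answer,
    # pay out reward_cents on acceptance.
    canned_answers = {"transcribe-captcha": "xk7q2"}
    return canned_answers.get(task, "no worker available")

answer = ask_human("transcribe-captcha", b"<captcha image bytes>")
print(answer)  # -> xk7q2
```

The economics are the interesting part: a CAPTCHA only works as long as a human answer costs the attacker more than the spam is worth.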
Saturday, November 19, 2005
I realize I have similar relationships to economics and theology: I'm not a true believer in either, but I'm fascinated by the theories of both, and the little self-consistent model worlds they create. So I was doubly fascinated to see this debate on the nature of the relationship between economics and religion: Is Religion Rational? The Economics of Faith, a debate between Larry Iannaconne and Bryan Caplan.
Now, I thought that "Bob" Dobbs and L. Ron Hubbard had perfected the union of religion and moneymaking some time ago, but apparently there is more to it than that. The core of the theory seems to be investigating reasons why people might rationally choose to believe irrational things. I guess it goes back to Pascal's wager.
Here's a sample of the flavor:
The combined actions of religious consumers and religious producers form a religious market which, like other markets, tends toward a steady-state equilibrium. As in other markets, the consumers' freedom to choose constrains the producers of religion. A "seller" (whether of automobiles or absolution) cannot long survive without the steady support of "buyers" (whether money-paying customers, dues-paying members, contributors and coworkers, or governmental subsidizers). Consumer preferences thus shape the content of religious commodities and the structure of the institutions that provide them. These effects are felt more strongly where religion is less regulated and, as a consequence, competition among religious firms is more pronounced.
Apparently it's a whole subfield (but what isn't). Seems pretty dry, and I'm dubious whether any meaningful concepts can bridge the gap between the radically individualist models of economics and religion, which, if it's anything other than a scam, is about getting outside of the egotistical, individualist frame of mind.
However one defines religion and religious goods, it is clear that religious activities involve a large amount of risk. The promised rewards may never materialize, the beliefs may prove false, the sacrifices may be for naught. In this respect, religion is the ultimate "credence good", a fact noted by several authors... Expected utility models might seem like the natural first step, but as Montgomery (1996b) has emphasized, objective religious "information" may simply not exist, leaving no rational way to assign probabilities to most religious claims.
Uh, yeah, that could be a problem.
File under: amusing academics
Wednesday, November 16, 2005
Perhaps the most significant challenge to the traditional peer-review practices comes from open-source projects like the Public Library of Science, which, though their journals are peer-reviewed, are available to all readers. Michael B. Eisen, an assistant biology professor at Berkeley and one of the co-founders (with Harold Varmus) of PloS, believes that academic bloggers face similar challenges to those of scientists who publish in open-source journals like his.
"One of the main issues we face in trying to convince junior academics to publish in PLoS instead of more established journals is their concern about how such publications will look at tenure time. I keep trying to convince people that, in an ideal world, tenure decisions should be made on the quality of one's work, not the venue of its publication. And there's no reason this shouldn't apply to things like blogs as well," he says.
Um, but the whole point of PLoS is (in their words):
The Public Library of Science (PLoS) was formed in 2000 by scientists and physicians to make peer-reviewed research freely accessible online to the world.
So what does this have to do with blogging? PLoS is a peer-reviewed journal that happens to have a different economic model behind it, and I presume it should be treated like any other academic journal when it comes to assigning publishing credit. Blogging is not (in general) peer-reviewed, or reviewed at all, which is why it's hard to account for it in tenure decisions. As far as I can see they have nothing to do with one another, other than both involving this new-fangled web thingy.
In a way, we should be grateful for O'Reilly and Robertson and Limbaugh and Ann Coulter and their slime-slinging ilk. They live in those black and nasty psycho-emotional places, so we don't have to. They show us how ugly we can be, how poisonous and ill, so we may recoil and say, "Whoa, you know what? I think I need to be more gentle and less judgmental and kinder to those I love." O'Reilly has an inverse effect on anyone with a vibrant and active soul -- he makes us better by sucking all the grossness into himself and blowing it out via a TV channel that no one of any spiritual acumen really respects anyway.
It's a very San Francisco kind of rant. Anyone who can turn Fox News into a tool for enlightenment ought to set up shop as a guru or something.
Tuesday, November 15, 2005
Actually, "unpleasant" doesn't cut it; it's a creepy journey, and it threatens one's own self, as if spending too much time in the company of such stuff could be contaminating.
Anyway, we are not faced with the ultimate evil, but the political situation is evil enough...Digby has an excellent article that starts out "deconstructing" Jane Fonda but ends up doing it to Richard Nixon and the Republican Party he spawned. Actually it's based on an even more excellent article by Rick Perlstein (in turn a review of a book on Jane Fonda -- but enough climbing the citation tree).
It's remarkable how many things that we think of as permanent features of American culture can be traced back to specific political operations by the Nixon White House. We now take it as given, for example, that blue-collar voters have always been easy pickings for conservatives appealing to their cultural grievances. But Jefferson Cowie, among others, has shown the extent to which this was the result of a specific political strategy, worked out in response to a specific political problem. Without taking workers’ votes from the Democrats, Nixon would never have been able to achieve the "New Majority" he dreamed of. But to do so by means of economic concessions -- previously the only way politicians imagined working-class voters might be wooed -- would threaten his business constituency. So Nixon "stood the problem on its head", as Cowie says in Nixon's Class Struggle (2002), "by making workers' economic interests secondary to an appeal to their allegedly superior moral backbone and patriotic rectitude". It's not that the potential for that sort of behaviour wasn't always there. But Nixon had a gift for looking beneath social surfaces to see and exploit subterranean anxieties.
That is the nub of Republican success, whether it was exploiting the sexual anxieties of displaced insecure males in a newly feminized workplace, or convincing conservative evangelical voters that "liberals" were trying to repress their religion and force them to adopt lifestyles they found repugnant. Nixon wasn't the first dirty politician in American history, but he was the most successful at discerning the churning undercurrent of fear and anger in a rapidly changing society and using his personal brand of dark political arts to exploit it. The conservative movement of Barry Goldwater made a Faustian bargain with the Nixonian black operatives more than 35 years ago. The natural result of that soul selling deal is George W. Bush and Karl Rove.
Until we recognize that the modern Republican Party is the party of Richard Nixon and that the allegedly masterful Rovian vision of a permanent political majority is a rather simple outgrowth of Nixon's uncanny understanding of how to exploit the dark side of populist fear and loathing, we will continue to be stymied.
This is a nice glimpse of the dark underbelly of the current political situation, a start at answering the question, "why do people keep voting for these clowns?", or "what's the matter with Kansas and the other red states?".
We see here a kind of protofascism -- a mobilization of individual fear and resentment into political power. In this country it hasn't grown into real fascism. Could that happen? I'm guessing not, at least not at a national level, because we have a refreshingly diverse and skeptical population, compared to (say) Weimar Germany. If it happens, it will happen due to economic instability and hardship, as it did back then.
[[repeatedly edited to put back parts that blogger dropped on the floor]]
Sunday, November 13, 2005
Friday, November 11, 2005
My ability to be outraged has more or less atrophied after five years of the Bush regime. But this Veteran's Day provided a cocktail of news stories that had a synergistic effect:
Bill O'Reilly invites Al Qaeda to blow up San Francisco.
Senate votes to repeal habeas corpus.
The LA Times gives Jonah "doughy pantload" Goldberg an op-ed column.
Gah. Just another day in the imperium, I suppose.
On the other hand, Bush's approval is the 30s and the Republicans are showing signs of coming apart at the seams, so it's not all bad news today.
Wednesday, November 09, 2005
Still, I persist in holding to a constructionist ideal of education. The goal should be not to pour knowledge into kids' heads, but to train them in thinking skills. From this standpoint, a controversy is an entry into the field, a chance to explore a variety of points of view, a "teachable moment".
Of course, it's possible to take this idea too far.
Actually I have to give Giblets the last word on this issue.
Tuesday, November 08, 2005
"What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."
-- Herbert Simon
Let me change the problem a little bit. Forget "attention", which is fleeting at best. I can give my short-term attention to CNN or a blog, but whether or not I absorb any long-term knowledge or value from the act of attending is a different matter. Let's think about knowledge, whatever that is.
There are two kinds of knowledge: the stuff in my head, and the stuff outside. The latter category is growing at an absurd pace, while the stuff inside is growing at a snail's pace (or possibly shrinking). There is an impedance mismatch between these two worlds; transferring stuff in or out of the head is laborious and there is no technological fix in sight.
The network and the web browser and the search engine gives you "access" to all the world's knowledge, but that's not the same thing as knowing everything. What would that even mean? Too much knowledge is bad for you -- some of Borges' stories address this: The Aleph and Funes the Memorious, and of course The Library of Babel are all about the dangers of having too much information too close to hand. Unless you have an exceptional mind, you always have to trade off depth and breadth of knowledge. Knowing everything means knowing nothing very well.
Managing attention is only part of the general area of managing the relationship between what you know and what you know how to find out. The trick of the googlectual of the future is to have the skills for gathering just-in-time knowledge in a flexible yet rigorous way, and applying it when needed, and then, presumably, forgetting it but keeping a pointer to where to find it again.
Sunday, November 06, 2005
One of the people I used to argue with was Eric Raymond, who later went on to be a famous author and proponent of the naive communitarianism known as Open Source. I don't know if arguing with me had any influence on him, but I like to think so. He's still ostensibly a libertarian despite being one of the leading proselytizers for a very successful form of socialism.
I long ago tired of arguing with ideologues, but every so often the impulse surfaces again. Nowadays it's different though -- I can bash on famous economists with bestselling books, rather than unwashed basement-dwelling geeks. Twice now (that makes a trend) I've had to chasten Steven Levitt, the Freakonomicist, for applying his déformation professionnelle to get obviously absurd results. The latest was on the occasion of today's column in the New York Times magazine, where he (and partner Stephen Dubner) presents his mystification as to why people should vote. I and many others take him to task for taking an inappropriate stance to this issue, and I point to Valdis Krebs' paper on social networks and voting as a more interesting and meaningful way to look at voter turnout issues. But maybe the best comment was from GamblingEconomist:
If millions of people vote and the standard economic model predicts that people should not vote, the problem is probably with the model, not with the people.
An earlier posting of Levitt's dealt with Peak Oil and energy issues (inexplicably vanished from his site; link is to Google cache), which he dismissed with the wave of the magic wand of the markets -- prices will rise, demand will drop, equilibrium will be achieved, and all will be well. I pointed out that the speed at which this happens matters a lot -- slow changes in prices allow time to adjust, while rapid price increases will cause a lot of pain. Actually it was my attempt to respond to this that ended up in the creation of this blog, so I guess I owe them.
OK, my critiques are not exactly flashes of genius, more like flashes of common sense. Still, it would be nice to know what Levitt's response to them would be. I haven't seen that, because blogs+comments are really not the kind of conversational mechanism that mailing lists and usenet are/were. This is a peeve of mine about the blogging medium: it doesn't really support conversations that well. Comments generally suck -- there are too many (or in some cases not enough), no structure, no quality control, no conversational threading.
Partly it's the power-law effect -- the network in the old days was a flatter space, where anybody could wade into the conversation and get an argument going. Now that the web is a reflection of the outer social world, we are back to having celebrities and nonentities, although the barrier between them is perhaps more permeable. There is no way that Levitt could respond to all the people who would like to argue with him even if he wanted to.
Well, maybe the answer for me is to stop trying to debate people who are both a) too high up the curve and b) not really saying stuff that's new/relevant. For an amateur political economist, the folks over at Crooked Timber, e.g., are both more accessible and more interesting.
Sunday, October 30, 2005
The US has a chance to do some good and advance its own interests at the same time.
Thrasymachus, Socrates, Aristotle, Alcibiades are...Republic Dogs.
A public figure who is competent, disinterested, intelligent, and well-spoken sure stands out these days.
Can extreme right-wing creationists save rationality from the depredations of the left?
Self-help for infectious disease comix!
Friday, October 28, 2005
Unfortunately I haven't gotten around to reading much of it. One bit I did read is delightfully metacircular:
People behave sometimes as if they had two selves, one who wants clean lungs and long life and another who adores tobacco, or one who wants a lean body and another who wants dessert, or one who yearns to improve himself by reading Adam Smith on self-command (in The Theory of Moral Sentiments) and another who would rather watch an old movie on television. The two are in continual contest for control.
-- "The Intimate Contest for Self-Command"
Yeah. Well, this feeds into one of my old interests, the multiple-centers-of-control problem for minds, as described by Minsky for humans and Tinbergen for animals. It's interesting to get the economist's view. If you stare at this stuff long enough you have to take up Buddhism though, and I'm not sure I'm ready for that.
Schelling gets extra points for going against the prime directive of economics, which is a picture of a person as a rational agent with a unified self, well-ordered goals and desires, and a perfectly rational approach to achieving them. And for coining a good word to describe what he's talking about, egonomics.
I'm going to try to keep my self-improving self in charge long enough to at least finish reading that essay before the library sends their goons around to grab it out of my hands. Have to make the blogreading self go to sleep somehow...
Wednesday, October 26, 2005
Web 2.0 is going through a typical technology hype cycle, cresting with the recent trendola conference and now with some backlash starting to set in. As usual, I'm way out of sync -- I just got started using del.icio.us a few weeks ago and it was only a few days ago that I really had the epiphany, felt the power of it, got absorbed in the growing collectivity.
Blogging is part of it too...so, I'm trying to get with the program despite being an old isolationist fart whose idea of computational nirvana is to immerse myself in a Lisp environment, just me and the symbolic structures, no other people necessary. Well, I'm trying to get past that. I'm actually working on a project that (crazily) has multiple people living in the same Lisp environment.
Summary: Web 2.0 is obviously a hype bubble, but like Web 1.0 the hype bubble is just an overamplification of some real advances. Interaction on the network is starting to get more conversational, collective, and participatory, while interesting new forms of media are being invented on a very rapid basis. Somewhere I called this a "Cambrian explosion of web software".
To take some of the critics seriously:
This essay on The Amorality of Web 2.0 came out awhile back and is being widely read and commented on. I like how it punctures some of the more rapturous rhetoric that pours out of some of our more, ah, effusive technopundits. Most of the substance of this post (and later followups) is based on a critique of Wikipedia's quality. I have to partly agree with this, and to tell the truth I've never quite seen the point of the Wikipedia. But the same critique does not apply to other Web 2.0 media, like tagging (which has a more workable model for joint authorship) and blogs (which are a bazaar of conversations rather than a jointly-built textual cathedral). The end of the essay is worth paying attention to as well, as he gets back to the title and points out that there's no guarantee that a world taken over by Web 2.0 is better.
In particular, if low-quality, free, amateur productions (Wikipedia) kill high-quality but expensive productions (Encyclopedia Britannica), has the world gotten better? I used to have this worry about open source and sometimes still do.
Joel Spolsky, on the other hand, is just being irritable and reactionary. Many other developers have echoed him. Can't say I blame them -- the antithesis of hype is anti-hype. What we need is a theory of hype -- some way to analyze technology trends that can filter out the hype and distinguish the reality underneath. Remember "Push"? "Kiss your browser goodbye: The radical future of media beyond the Web." Uh huh. Well, push was hyped to the max (defined as being the subject of a Wired magazine story), generated some companies which either faded or found something else to do, and now, eight years later, is back in the form of RSS feeds, is real and useful, and is being swept up in a larger hype bubble. It would be nice to understand the dynamics of such things, but I guess that's what VCs do.
Monday, October 24, 2005
Saturday, October 22, 2005
This offhand comment bothered me though:
But I have a suspicion that they corrupt the consciousnesses that they raise, because they confirm them in their belief in the moral authority of fame. With the exception of the cognitive habits of a Googling nation, nothing more disfigures personal authenticity in America than the veneration of celebrities. This is America's polytheism.
So "the cognitive habits of a Googling nation" are the thing that most "disfigures personal authenticity"? What does that even mean? I started off writing this post because, as the coiner of the term "googlectual", I wanted to defend a style of thought and writing that relies heavily on just-in-time searching and reading of texts on the Internet...or something like that. But I realize that I have no idea what Wieseltier is talking about, so I can't really form a reply to it. Maybe my cognitive habits have become permanently disfigured.
There's a serious issue in here somewhere -- what does the Internet, Google, etc. do to serious thought? Are academic blogs a great way for ideas to circulate both within and without the intellectual communities they serve, or just a distraction from serious work? Are we all becoming ADD-addled from information overload? Well, no time to answer that now, gotta go catch up with my inbox.
Cosma Shalizi has a nice takedown of Stephen Wolfram's A New Kind of Science, somewhat late for a book that came out in 2002. Back then, I leafed through it casually and my impression was that it was a mixture of self-aggrandizement and old ideas. Now Cosma knows much more about this stuff than I and has actually read the book, and come to much the same conclusion, so two points for my intuition.
Since everything I think these days gets mapped to network theory, the major lesson I draw from this (not very original) is that science is a network of people and ideas, and if you operate outside of that network you are very likely a crank. Wolfram's egomania led him to pull out of the standard networks of scientific exchange and build his own little cult empire. By draining energy from real science, his book is an anti-contribution.
Speaking of science as a network, I recently stumbled upon an issue of PNAS devoted to Mapping Knowledge Domains. Cool stuff, but why isn't there a tool to do this as part of Google Scholar or something? Ok, here is something called HistCite, developed by Eugene Garfield who has been doing this sort of stuff for decades, but it is proprietary and looks kind of ugly to use. Good data though. Really, it's time for Google to take over and conquer this lucrative market. [[update here]]
Hm, Cosma also proposes applying social network analysis to the study of government cronyism, something I was vaguely thinking about myself. As he says, a difficult project to fund.
Wednesday, October 19, 2005
Monday, October 17, 2005
Put these two figures side-by-side and what does it reveal? Well, Oppenheimer is by far the more attractive figure, an incredibly intelligent, dynamic, and cultured guy, as opposed to the egotistical and sleazy Khan (or maybe he just had better press agents). Both served their country as they felt called to do, both ended up more-or-less disgraced. Oppenheimer's bomb was used on population centers; Khan's hasn't yet, but it may be only a matter of time.
Aside from the character of the chief scientists, what's interesting to me is the networks of people, technology, money, and resources that went into the respective bomb projects. The Manhattan Project was an incredible mobilization of resources, involving some $25 billion (in today's money) and tens of thousands of people including many of the best scientists of the time. The Pakistani bomb was assembled by a fourth-rate power through an international network of shady deals and borrowed expertise, although it certainly wasn't trivial to do (a good thing).
I've noticed for a while that it takes big concentrations of power to achieve serious technological breakthroughs (in practice, this means governments or monopolistic corporations like the pre-breakup AT&T). The Internet, that paradigm of decentralized power, was only built the way it was due to the resources and oversight of the military. Once the big breakthrough is achieved though, smaller actors and networks can take over the results and repurpose them. That's happened with the Internet, and it happened with nuclear weapons technology.
Until just now, I always thought that this was an argument for concentration of power, which I grudgingly accepted despite my leftist leanings. But maybe it's an argument against. A decentralized society might not have made the Internet, but it wouldn't have made thermonuclear weapons either.
Friday, October 14, 2005
Thursday, October 13, 2005
Via Bruce Hoppe, who muses on whether they will uncover the deal-making networks of which Tom Delay is the hub. This seems to point to a general problem in both social network analysis and software: the really interesting networks probably don't want to be uncovered and made public in a neat little graph. Of course, the criminal investigation branch of this field has already thought about that.
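For what it's worth, the simplest version of that "neat little graph" analysis takes only a few lines: given a list of who deals with whom, degree centrality picks out the hub. A toy sketch (all the node names below are made up for illustration, not actual data):

```python
# Finding the "hub" of a deal-making network by degree centrality,
# the most basic social-network-analysis measure.
from collections import defaultdict

# Illustrative, invented edges: who deals with whom.
edges = [
    ("delay", "lobbyist_a"), ("delay", "lobbyist_b"),
    ("delay", "donor_c"), ("lobbyist_a", "donor_c"),
    ("delay", "pac_d"),
]

degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

hub = max(degree, key=degree.get)
print(hub, degree[hub])  # the most-connected node and its degree
```

Real investigative tools go well beyond raw degree (betweenness, clique detection, and so on), but the principle is the same: the structure of the graph, not any single transaction, is what gives the hub away.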
Turns out these people have a history. And the scam is common enough to have its own name, "cramming". Apparently the complex multi-company billing infrastructure of the phone industry is being hijacked to allow all manner of bottom-feeders to add any charge they feel like to your bill. Welcome to the world of the future. I imagine there is some sort of differential pricing going on here, and they try to keep the charges low enough so you won't take the time to complain and try to get the charge dropped, or go to the FCC.
Any ecology (or economy) has its parasites. Networks make new forms of parasitism possible but thankfully they also strengthen the immune system, like that consumer watchdog site linked to above.
Tuesday, October 11, 2005
Monday, October 10, 2005
Well, before I get around to helping increase the guy's fame, he goes ahead and wins the Nobel Prize in Economics. I sort of wish I had bought stock in him. Anyway, he's a very readable and interesting writer and apparently a very nice guy, so congratulations to Thomas Schelling.
Update: Good summary of Schelling's major contributions from one of his mentees at Marginal Revolution.
Sunday, October 09, 2005
Watching "This Week in God" on Jon Stewart's Daily Show, we are, it might seem, witnessing the culmination of a historical progression, from Robert Ingersoll, the great nineteenth-century public unbeliever, to Clarence Darrow, who in the 1920s and '30s would debate a rabbi, priest, and minister during a single evening.
I am occasionally against the sort of "village atheism" promulgated by people I mostly admire and agree with, such as Richard Dawkins and PZ Myers. It seems too simpleminded for me, and also strategically unwise, since religion won't be argued away by any amount of scientific evidence. This article puts the struggles between these scientists and creationists in historical context.
Atheism tries to take the place of faith but it just doesn't work, for most people. And it has some unpleasant characteristics of its own, which I think stem from its unavoidable tendency to assume the shape of a religion, despite its best efforts. Think of atheism as a sort of drug that is trying to block the religion neuroreceptor in the brain. It has to take the shape of a religion without having the effects. This is only partially successful, and atheism despite its efforts tends to assume some of the negative attributes of a religion, namely fundamentalism and zealotry.
Atheism has a complex relationship to optimism:
Classical atheists tended to be optimistic about the world's future, and their imaginations were indeed stirred by science and technology and the potential for human progress. Rejecting religion often coincided with placing hope in reason, education, democracy, and/or socialism, and those who did so were stirred by visions of a more humane, happier world organized according to human needs. Looking expectantly to the secular and social future meant rejecting the religious counsel of pessimism about our lot on earth.
It's interesting that the secular materialist optimists have more or less given up on the problems of the world and instead invented their own kind of nerdvana-based religion, in which the singularity will confer godhood on us and/or our machines. A rather literal interpretation of Voltaire's "If God does not exist it would be necessary to invent him". But
It's safe to say that the future didn't turn out as anyone expected. Scientific and technological progress has been relentless, but its promises of liberation have gone flat. Few still believe that their children's world will be better than theirs. We live after Marxism, after progress, after the Holocaust—and few imaginations are stirred, few hopes raised by our world's long-range tendencies. Indeed, the opposite is happening as terrorism becomes the West's main preoccupation. In countries like the United States, Britain, and France, there has been a turning away from improving societies and toward improving the self.On this terrain, it is no surprise that belief in God has been revived, ... At stake, then, is far more than a conflict between belief and disbelief, but the kind of world in which a religious or a secular worldview flourishes. Where secular hope is in the ascendancy, as during most of the nineteenth and twentieth centuries, it seems as if the belief in human capacity and the here and now will be strong; where fear and pessimism increase, as they have so far in the twenty-first century, humans may increasingly look to God, to their souls, and to a future beyond this life.
Of the books reviewed, the only one I've read is Sam Harris' The End of Faith, of which I had a similar opinion as Aronson:
What is most striking after reading Baggini is Harris's own zealotry. Harris makes no effort to understand believers, be they moderate or fundamentalist; most serious in a book claiming a practical political mission of uniting "us" against "them" is his total lack of interest in any historical understanding.
On the other hand:
Harris, for all his negative energy, provides a potentially rich idea about mysticism, as cultivated in Eastern religions, as a "rational enterprise." In Buddhism, he argues, reaching beyond the self has been carefully and closely described and need not be left to faith but may be empirically studied.
I tend to agree. Buddhism is definitely the religious belief system most compatible with a scientific materialist worldview. But I'm dubious that it will have much market impact on theistic religions, which seem to give the people what they want.
Personally I'm leaning towards Yoism, the first open-source religion, which I learned about 10 minutes ago. Or maybe the Flying Spaghetti Monster.
Saturday, October 08, 2005
Marx and Engels look blockily down on the world they helped make.
After the quick visit to Sculpture Park, I managed to squeeze in another side-trip to Gödöllö, a three-umlaut town with a restored Baroque castle, a summer retreat for the Emperor Franz Joseph and the royal family. The hugeness and luxury of such places is supposed to fill one with awe at the power and sumptuousness of the art, but for me they mostly inspire rage at the wastefulness and inequality that had to go into the creation of such a lifestyle for the few. In other words, it gave me a bit of sympathy for the commies, despite their bad taste and their own forms of oppression and murder. It put it in context, at least.
In between these two trips I had to hurry to catch the train out, so succumbed to McDonald's, which I guess represents the end of history, no more class warfare, nobody suffers but the cows, the environment, and health.
Update: more here. The pictures of East European stamps whisked me back to my childhood philatelist days, when I was puzzling over all those blocky hammer designs and why they seemed to be associated with "democratic republics".
Friday, October 07, 2005
Actually, it looks like many of the older posts are missing or have been trampled upon when I let the website registration lapse. Foo. Well, I have them all on disk actually, including my proposal for "The Association of Anarchist Parents", which I will have to repost.
Thursday, October 06, 2005
James Clark McReynolds is widely considered one of the most unpleasant men to ever sit on the Court, being labeled "Scrooge" by Drew Pearson. He would not accept "Jews, drinkers, blacks, women, smokers, married or engaged individuals as law clerks." He was a blatant anti-semite and refused to sit near Louis Brandeis (the first Jew to sit on the Court) where he belonged on the basis of seniority for the Court's annual picture to be taken in 1924; ... During Benjamin Cardozo's swearing in ceremony he pointedly read a newspaper muttering "another one", and did not attend Felix Frankfurter's, exclaiming "My God, another Jew on the Court!" ... He was also a confirmed misogynist. Taft said McReynolds "seems to delight in making others uncomfortable."
This is in support of the position that Harriet Miers is not really all that bad, considering.
Tuesday, October 04, 2005
3quarksdaily: People reject chance and uncertainty. I don't buy this one, and neither does PZ Myers. People love chance and pay money to experience it, and they recognize uncertainty. If you read further in this post it touches on "purposelessness", which I think is closer to the mark.
Majikthise posits "disenchantment", also touches on "purposelessness", and at the end says:
Evolution will continue to be controversial as long as people believe that naturalism threatens meaning. I don't know how proponents of evolution can begin to make people feel more comfortable with the naturalistic worldview.
This is getting closer...people really do feel threatened by the naturalistic view, but there are good reasons for this. See below.
PZ Myers focuses on purposelessness, fear, and emptiness. Quoting Eric Hoffer:
Faith in a holy cause is to a considerable extent a substitute for the lost faith in ourselves.
Which leads to the group psychology of religious crusades, yet another factor.
My own take on this is from a cognitive point of view.
First, it's a mistake to try to explain religion in terms of stupidity, fear, or group identity. No doubt these play a role, but aside from being snobby and dismissive, they don't explain religion specifically. People do lots of things out of stupidity and fear. People do lots of things out of group identity, like cheer for the Red Sox, but that doesn't make the Red Sox a religion.
So, let's start with purposelessness, and let's take the standpoint that people are right, in some sense, to be wary and reject the purposelessness of a naturalistic worldview. Look at it in terms of practical reason.
To make a long story short, let's assume a drastically simplified model of people's mental models, but not completely simplified. People have two different frameworks or stances (in Daniel Dennett's terms) for explaining phenomena, the mechanical or naturalistic stance, and the intentional or animate stance. You can look at, say, a person as an intentional agent, with desires, feelings, ideas, etc, or you can look on him as a sort of chemical machine, with various mechanisms that work according to the causal laws of physics. In fact, of course, he's both, but the two frames of reference are rather disjoint and stitching them back together is the difficult work of cybernetics, psychology and other somewhat mushy fields of science.
In everyday thought, people can apply these frames of reference as needed and switch between them. If their car breaks, they might curse it out (as if it were an agent) and then proceed to open the hood and repair it (as if it were a machine). Doctors have an elaborate methodology of ministering to a patient's humanity while treating their body as if it were a defective machine.
So what does this have to do with religion and evolution? Science has a professional bias towards mechanical explanations, either eliminating agency altogether or finding ways to reduce it to naturalistic explanations. That's fine, that's what science does. But ordinary people with lives find this disturbing, for good reasons. They know that it's improper or immoral to apply the mechanical stance to people except in special circumstances (this I think is at the root of the Frankenstein mythos: those angry peasants have a point). Real scientists are not (usually) monsters, they have ways to reconcile their humanity with their dedication to the amoral and merciless mechanical viewpoint, but the peasants don't know that.
Religion, on the other hand, has a bias towards intentional explanations, seeing agency everywhere, in people (souls), in nature (animism) and the universe as a whole (God). Science's challenges to the intentionality of the universe are one thing, but evolution and brain sciences threaten the very soul. This scares people, and goes against their moral and practical intuitions.
What I'm trying to get at is that ordinary people, who may be ignorant of science and not inclined to philosophy, still have ideas about the consequences of different styles of thought. They know that the mechanistic worldview is in some sense incompatible with their intuitions. What they don't have is the sophistication to try to reconcile these worldviews. Scientists and intellectuals keep working at this problem, but nobody can pretend it's solved.
In short, fear is at the root of people's rejection of evolution, but it's not a groundless fear. People understand, albeit vaguely, the different systems of thought involved and the conflicts between them. They feel that much is at stake, and they are right about that. It's not that surprising if the issues are played out at Kansas school board meetings, in unenlightening ways. This is hard stuff.
There's a whole spate of books that purport to have some psychological or evolutionary explanation of religion. I own a bunch and have even read a couple. The one that clicked most for me, in that it concentrated on ascription of animacy, was Religion Explained by Pascal Boyer.
I addressed some of these issues in my dissertation, which was ostensibly about programming environments but wandered off quite a bit.
Monday, October 03, 2005
I already know how to program, so I don't really want to spend time reading this classic. How can I dismiss it without having to read through the whole thing? Ah, here we go, look up recursion in the index and get this:
Recursion isn't useful often...For most problems, it produces massively complicated solutions -- in those cases, simple iteration is usually more understandable. Use recursion selectively.
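For the record, a quick counterexample of my own (not from the book): on nested structures, recursion is the simple solution and iteration is the complicated one. Summing a nested list takes three lines recursively, while the "more understandable" iterative version needs an explicit stack:

```python
# Recursive version: the structure of the code mirrors the structure
# of the data.
def tree_sum(node):
    if isinstance(node, list):
        return sum(tree_sum(child) for child in node)
    return node

# Iterative version: we have to manage our own stack by hand.
def tree_sum_iter(root):
    total, stack = 0, [root]
    while stack:
        node = stack.pop()
        if isinstance(node, list):
            stack.extend(node)
        else:
            total += node
    return total

tree = [1, [2, [3, 4]], 5]
print(tree_sum(tree))       # 15
print(tree_sum_iter(tree))  # 15
```

Any Lisp programmer could multiply examples like this indefinitely, which is rather the point.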
So what the hell is netarchy? Rule by social networks, I guess. It could encompass a wide swath of social network theory, but it's emphasizing that these networks aren't just for chatting, but for coordinating goals, actions, and power. This is the interesting part of SNA (to me) but often seems lost with the emphasis on dating services like friendster or job-finding services like linkedin or marketing tools like visible path. The important social networks are outside of such simple-minded mapping schemes, I'm pretty sure. Something like theyrule.net is a little closer to the mark, but too conspiracy-minded, as is the ridiculous Discover the Network site, beautifully parodied here. Netarchy acknowledges such traditional power-networks, but instead of reacting in horror accepts them as a given and tries to work with them. One way to do this is simply by mapping and talking about the power networks, as the above sites do. Another is by encouraging the formation of counter-networks.
There's a paper by Valdis Krebs called "It's the Conversations, Stupid!" where he points out that ordinary social networks are a strong determinant of voting behavior. A somewhat obvious point, except that most analyses of voting behavior are in terms of individuals, demographic groups, or interest groups. These are not the same as networks, and the network approach provides some interesting strategic insight. Networks already are a big, unacknowledged part of democracy, even if they are not yet acknowledged by changing the name to netarchy. Netarchy encompasses not only the crony networks that are the obsession of conspiracy theorists, but the more bottom-up networks of a functioning democracy.
Netarchy can be used to describe existing political phenomena but may also be used as a guide, or at least a slogan, for generating some new ideas about how to run a society. Maybe it's time to revisit the lovely but rather old-fashioned models of democracy we have inherited. Direct democracy doesn't scale and lacks the damping factors of a separate political class; representative democracy has some of the opposite problems -- elites lead to self-serving corruption. The old science-fiction model of an electronic plebiscite voting on issues directly might not be so great, if the model of California's ridiculous ballot propositions is anything to go by. But why not some sort of more flexible network based representation system? How about a website where I can pick new political representatives for myself at any time, rather than at artificial 2, 4, or 6 year intervals? How about if I can pick specific ones for particular issues? I don't have time to study every environmental question in detail, but I'd be happy to be able to designate the Sierra Club or the Viridian Underground as my representative on such questions, with an option to override on specific bills. When it's time to draw up the new constitutions for the Devolved Former States of America, let's try and be open to these and other netarchy-inspired ideas.
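The core mechanism of such a scheme is small enough to sketch in code. A minimal, hypothetical version (all names and bill identifiers invented for illustration): each voter may name a delegate per topic, change delegates at any time, and override on a specific bill.

```python
# Sketch of per-issue delegable voting. Ignores real-world issues
# like delegation cycles, ties, and authentication.
class Voter:
    def __init__(self, name):
        self.name = name
        self.delegates = {}   # topic -> another Voter
        self.overrides = {}   # bill id -> "yes" / "no"

    def delegate(self, topic, rep):
        self.delegates[topic] = rep   # re-assignable at any time

    def vote(self, bill, topic, default=None):
        if bill in self.overrides:        # an explicit override wins
            return self.overrides[bill]
        rep = self.delegates.get(topic)
        if rep is not None:
            return rep.vote(bill, topic)  # follow the delegation chain
        return default

sierra_club = Voter("Sierra Club")
sierra_club.overrides["HR-100"] = "yes"

me = Voter("me")
me.delegate("environment", sierra_club)
print(me.vote("HR-100", "environment"))  # yes (via delegate)

me.overrides["HR-100"] = "no"            # override on this one bill
print(me.vote("HR-100", "environment"))  # no
```

The interesting design questions are all in what this sketch leaves out: how chains of delegation terminate, how to prevent a few super-delegates from becoming exactly the kind of hub the crony networks already are, and so on.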
Saturday, October 01, 2005
And as if to start the proceedings off, today's paper reveals that
Pacific Gas and Electric Co. warned Friday that Northern California home heating bills would leap 70.8 percent in October as hurricanes Katrina and Rita drive up natural gas prices nationwide.
...and electricity prices along with it.
But take heart, all the news is not bad (if you read to the end)
The weather may provide Californians some consolation. Long-range forecasts from the National Weather Service call for higher than normal temperatures through December, said David Reynolds, meteorologist in charge of the service's Monterey office.
See? What global warming taketh, global warming restoreth. Sort of-eth.
Update: All Gulf oil output shut down
The Long Emergency: Surviving the Converging Catastrophes of the Twenty-first Century
James Howard Kunstler
A book in the we're-all-doomed genre. Kunstler focuses on the coming peak oil crisis and our incapacity to deal with it. A long-time critic of suburbia, he is convinced that our investment in car-dependent infrastructure is really what's going to doom us -- we will be completely unable to adapt to a world of scarce and expensive energy. While this is the central crisis, other related ominous trends feed into the general sense of doom -- global warming, emerging diseases, and an economy based on no-longer-supportable fictions. Globalism and the Wal-Mart economy come in for some whacks, based on their cheap-energy-dependent business model.
None of this is exactly news, but Kunstler does the service of wrapping all these alarming trends into a readable package designed to wake people up. The future looks pretty bleak, with the degree of bleakness dependent on the degree of energy dependence vs. the degree of sustainable communities. The sunbelt states (Arizona, Southern California) will suffer the most as the cheap air conditioning goes bye-bye. Suburbanized areas will be doomed. The Northeast will recover its old structure of small industrial towns which can support local manufacturing and local farming.
This vision suffers from being completely backward-looking, and appears to be fueled by nostalgia. Essentially, the idea is that the entire 20th century will be undone, and we will go back to the way we lived before abundant petroleum-based energy -- we'll be mostly farmers. This seems extremely unlikely to me, and undesirable as well. Modern technology will not disappear, although it may change radically as its components become more expensive. Telecommuting has the capability to replace physical commuting, for instance -- this is not an option Kunstler considers, since he apparently believes that all modern technology is going to disappear along with the oil.
This is a form of flawed argument that pops up frequently in the book. There is an implicit assumption that oil is going to completely run out (rather than just becoming expensive) and that anything that is even a little bit oil based now cannot work in the future. He dismisses solar energy, for instance, because it takes non-renewable resources to manufacture and transport the panels. Whether solar is workable under conditions of oil scarcity requires a quantitative economic argument -- it's not going to be an all-or-nothing thing.
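To see why the all-or-nothing framing fails, consider a back-of-envelope version of the kind of quantitative argument that's actually needed (every number here is an illustrative placeholder, not real data):

```python
# Energy-return-on-investment sketch for a solar panel.
# All figures are invented for illustration only.
embodied_energy_kwh = 2000    # hypothetical energy to make and ship a panel
output_kwh_per_year = 400     # hypothetical annual energy output
lifetime_years = 25

eroi = (output_kwh_per_year * lifetime_years) / embodied_energy_kwh
print(eroi)  # lifetime energy returned per unit of energy invested

# The panel remains a net energy source as long as eroi > 1. Scarce
# oil makes the embodied energy more *expensive*, but expensive input
# energy is not the same as a negative energy balance.
```

Whether the real numbers come out favorable is an empirical question, which is exactly the point: it has to be calculated, not dismissed a priori.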
The book is also marred by a misunderstanding of scientific concepts like entropy and closed system. In general Kunstler does a reasonable job of presenting the technical side of his argument, but occasionally distorts things to support his argument.
In short, this book performs the valuable service of waking people up, but is not a reliable guide to the future.
Friday, September 30, 2005
Seriously, why is someone like Bennett, who if you made him up in a novel would be too ridiculous a hypocrite to be believable, still spilling out nonsense on the news shows? OK, that was a rhetorical question.
He hits on Weyrich's version of the Prime Pundit Fallacy- since he's in it for the attention and media adulation everybody who ever puts themself out there is in it for the same reason. ... what should be obvious is that people who blog behind pseudonyms aren't, in fact, doing it for fame and fortune (why anyone thinks one who desires fame generally would turn to blogging I do not know). The main reason people start blogging is that they want to, in some small way, occasionally have an impact on the public discourse.
So here's the dilemma, for an obscure blogger like myself -- to have an impact on discourse or anything else, you have to get some attention. Not necessarily from the MSM, but at least from people on the nets. So blogging becomes a mix of high and low motives, or idealistic and cynical -- there's the social goal, and then there's the selfish goal.
I am not naturally an attention-seeker, in fact I seem to have a talent for not getting attention and credit for things I do, but every so often I find myself forced into scrabbling for it. In the blog world, that means shameless linkwhoring and generally advertising yourself. Definitely goes against my nature, but sometimes it's good to go against your nature.
Just about any action is going to have this mix of motives, and navigating the mix is one of the jobs of moral philosophy and political philosophy. Some people who can't handle complexity become Randroids and loudly proclaim their devotion to selfishness exclusively. Most of us muddle through, using our embeddedness in real-world social networks to guide our actions so they aren't exclusively self-seeking.
I am suspicious of my own motives in writing this blog. Am I trying to change the world, or just barking at it? I don't have a social agenda, although I'm as much in favor of truth, justice, and the real American Way as anybody, not to mention turning the Republicans out of power as soon as possible. Mostly what I'm trying to do is unclog my brain, by forcing me to organize and externalize the cloud of random ideas that flit through it. Whether this does me or anybody else any good remains to be seen.
BTW Atrios and Cranium are clearly right in this particular case -- a blogger had a clear social goal, went ahead and did something about it, achieved some success (and attention), absolutely nothing wrong with that, Weyrich is indeed a wanker.
Wednesday, September 28, 2005
This is hardly a new idea -- if you watch any long-running police show like The Wire (best thing on television BTW) you will see the cops making social network graphs of criminal organizations, usually on a funky pinboard, and the better-equipped cops in real life have software to do the same thing. And the concept of netwar has been around for a long time, although apparently the lesson is just now penetrating the heads of our leaders, who, when faced with a distributed asymmetrical enemy, decide to invade a nation because that's what they know how to do.
So, that brings me to the inspiration for this post: the sudden outburst of cronyism, or more accurately, of awareness of cronyism. This started building with various Jack Abramoff-related news, peaked with Michael Brown, and is now climaxing as Tom Delay gets indicted, Bill Frist gets investigated, and other exciting news.
Cronyism is the soft form of criminal conspiracy, but it's also not that far removed from the groovier, more celebrated forms of social networking. The much-vaunted networks of Silicon Valley are a crucial form of capital, but they are also crony networks with the attendant downsides of insider deals and quid pro quos for the connected.
Oh hell, I have nothing original to say about this topic except that I coined a lovely phrase to describe what's going on here -- social crapital -- the tool by which the kakistocracy retains its death-like grip on the nation's mechanisms of power. Does not appear in Google so you heard it here first.
Singularity! A Tough Guide to the Rapture of the Nerds
Keywords: singularity, hypertext, writingsystems