Saturday, May 15, 2010

Ubuntu Lucid update

Following my disappointment with Kubuntu Lucid, I just got around to replacing it with the standard Ubuntu Lucid desktop. It's possible to switch desktop environments using a couple of package manager commands, but I decided to do a from-scratch reinstall.
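
For the record, the package-manager route goes roughly like this (I'm reconstructing the package names from memory, so double-check them before running anything):

    # pull in the GNOME-based Ubuntu desktop alongside the existing KDE install
    sudo apt-get install ubuntu-desktop
    # then drop the Kubuntu metapackage
    sudo apt-get remove kubuntu-desktop

Removing the metapackage doesn't actually remove all the KDE packages underneath it, which is part of why I went with the clean install.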

With a little effort, I was able to make most of the Ubuntu desktop behave OK. Window management is still not up to par with KDE 3.5 + KStep window decorations, but it's enough for now. I'll probably switch my window manager to WindowMaker at some point. (NeXTSTEP-style window decorations with X11 window management gestures are the apex of desktop window management, for reasons that I could go into at length but won't today.) Visually, the new Ubuntu theme looks nice; in fact it looks and feels much better when you're using it than it does in screen shots.

However, there's one fly in the ointment. Sound didn't work. At all. Note that for all its flaws, Kubuntu, which is derived from the same base distribution, had no such problem, so it isn't simply a driver issue. I could bore you with all the details of my debugging adventure, but at the end of the day I blame PulseAudio, and Ubuntu's decision to make PulseAudio central to their desktop sound system. After a couple of hours of unproductive web searching and config file wrangling, removing the PulseAudio packages in Synaptic made sound work, sort of.
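
If you'd rather do this from a terminal than from Synaptic, the equivalent is roughly the following (roughly, because I did it through the GUI and am going from memory on the exact package names):

    # remove PulseAudio and its configuration; applications fall back to plain ALSA
    sudo apt-get purge pulseaudio
    # clean up packages that were only installed to support it
    sudo apt-get autoremove

Logging out and back in afterwards is probably a good idea.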

Yes, sort of. It still doesn't work quite right. When I open the System > Preferences > Sound menu, I get a dialog box saying "Waiting for sound system to respond" and nothing else. (This behavior occurred before I uninstalled PulseAudio, so that's not the cause.) Apparently a whole lot of people have run into variations of this problem since at least Ubuntu 9.10, and nobody seems to have a definitive answer on how to solve it. I'd report it as a bug, but I suspect it's one of those opaque symptoms with dozens of possible underlying causes, so filing a report would probably be futile.

I want to emphasize that I haven't had a problem like this with a Linux distribution in years. This is literally a regression in behavior to Linux ca. 2005. Poking around by hand with .conf files in /etc just to get something working on my desktop is something I used to do. It's not something I expect to be doing in the year 2010.

So, anyway, I can't set my sound preferences. I guess I'll just have to cross my fingers and hope Ubuntu didn't assign any really annoying sounds to desktop events.

Tuesday, May 11, 2010

How to design a popular programming language

This has been kicking around in my brain for at least half a decade, and if you know me well then I've probably spoken it aloud in your presence; so it's high time to get it down in writing. Here is my Grand Unified Theory of Programming Language Adoption. There are three steps:

  1. Find a new platform that will be a huge success in a few years.
  2. Make your language the default way to program on that platform.
  3. Wait.

That is all. Note that none of the above steps has anything to do with the language design itself. In fact, nearly all popular languages are terribly designed. Languages become popular by being the "native" way to program a certain kind of system. All of history's most widely used programming languages fit this model — Fortran (scientific programming), C (Unix), C++ (MS Windows), JavaScript (web pages), Objective-C (Mac OS X), . . .

Or, in fewer words: Languages ride platforms to popularity.

Why is this so? Well, to a first approximation, no piece of software ever gets rewritten in another language; and once a critical mass of software for a platform has been written in one language, nearly all the rest will follow, for two reasons:

  • Nobody has figured out how to make cross-language interoperability work well.
  • The network effects from language adoption are immense. Programming is, despite appearances, a deeply social profession. To write successful software quickly, you must exploit the skills of other programmers — either directly, by hiring them, or indirectly, by using library software they've written. And once a language becomes the most popular in a niche, the supply of both programmers and libraries for that language rapidly accumulates to the point where it becomes economically irrational to use any other language.

In fact, I claim that in all the history of programming languages, no language has ever successfully unseated the dominant language for programming on any platform. Instead, a new platform gets invented and a new language becomes the "founding language" for that platform.

Well, OK, there are exactly two exceptions: Java and Python. It took me a while to figure out what happened in those cases, and the answers I came up with were surprising (to me).

Java is anomalous because although it is widely used in its primary domain (Internet application servers), it is not predominant, the way that e.g. C++ is predominant in writing native Windows GUIs. My explanation is that the web architecture has a uniquely high-quality interoperability protocol in the form of HTTP and HTML(/XML/JSON/...). Hey, stop laughing. HTTP and HTML fail all kinds of subjective measures of elegance, but they succeed in isolating clients and servers so well that it is economically viable to write the server in any language. In other words, as unbelievable as it sounds, HTTP and HTML are the only example in history of cross-language interoperability working really well.
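
To make the point concrete, here is the entire "interface" a web client needs to care about. Nothing in the exchange reveals, or depends on, the language the server is written in; it could be Java, Perl, C, or anything else, and the client neither knows nor cares. (The URL is just a stand-in.)

    # fetch a page and show the response headers; the request and response
    # look the same no matter what language the server happens to be written in
    curl -i http://www.example.com/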

I'll abandon this explanation if I can find, in all the annals of computing, another protocol that connected diverse software components as successfully as HTTP and HTML. The only things I can think of that come close are (a) ASCII text over Unix pipes and (b) ODBC, and neither of these provides nearly the same richness or connects components of similar diversity.
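
To be fair to option (a), it does work remarkably well within its narrow scope. A pipeline like the one below glues together programs that share nothing but the convention of reading and writing lines of text; the pieces here are standard Unix tools, but anything that speaks plain text on stdin and stdout could slot in:

    # count logins per user: four independent programs cooperating
    # purely through lines of ASCII text
    who | awk '{print $1}' | sort | uniq -c

Still, a stream of untyped text lines is a far cry from the richness of structured HTML over HTTP.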

Python is anomalous because rather than riding a new platform to success, it simply seems to be displacing Perl, PHP, etc. in the existing domains of shell scripting, text processing, and light web application servers. My explanation is that Python appears to be the only language in history whose design was so dramatically better than its competitors' that programmers willingly switched, en masse, primarily because of the language design itself. This says something, I think, both about Python and about its competitors.

Incidentally, this theory predicts that all the new(ish) programming languages attracting buzz these days — whether Ruby, or Scala, or Clojure, or Go, or whatever — will fail to attract large numbers of programmers.* (Unless, of course, those languages attach themselves to a popular new platform.)


UPDATE 2010-05-15: Reddit and HN weigh in.


*Which is fine. Very few languages become hugely popular, and in fact nearly all languages die without ever seeing more than a handful of users. Being influential (so that later languages pick up your ideas), or even merely useful to a significant user population, is a fine accomplishment.

Thursday, May 06, 2010

On internationalized TLDs (a contrarian opinion)

The Timberites rejoice. I'm obviously revealing my North America-centric roots, but I think this is a huge amount of cost for insufficient benefit.

Great civilizations leave their stamp on the conventions of world culture. The Romans gave us their calendar; the Indians gave us the modern numbering system; the Italians gave us terms and symbols used throughout the Western world in musical notation (piano, fortissimo, crescendo, ...). The global recognizability of these signifiers is part of what makes them useful.

In the modern era, American culture predominates in computing. In practically every programming language, English words like begin, define and integer (or abbreviations thereof) have special meanings understood by every programmer in the world.

With respect to TLDs, there are two alternatives before us. Alternative one is to make everyone in the world simply learn to use ASCII TLDs. Alternative two is to make everyone in the world learn to use, or at least recognize, TLDs in every Unicode script. Alternative one is actually the simpler of the two, even for non-English speakers.

Imagine if numbers were subject to the politics of modern i18n. We would have the modern positional decimal numeric system, but also the Roman numeral system, and the Babylonian numeral system, and so on, and nobody would ever have asked anyone to standardize on any of them. After all, we have to be sensitive to the local numeric culture of the Romans!

It's not like I'm saying people should communicate in English all the time. I'm only saying that people should learn to type and to recognize ASCII TLDs. This is a relatively limited set of special-purpose identifiers; there are only about an order of magnitude more ccTLDs than months in the year or decimal digits. And I would claim that it's useful for everyone in the world to recognize that, say, .uk and .com look like the end of a domain name, whereas .foobar123 does not. Pop quiz: which one of the following is a new Arabic ccTLD, مص or مصام? The reason you can't recognize it is not just that you're an English speaker — people who only speak Mandarin or Spanish or Russian are in exactly the same boat as you. And when ICANN unveils TLDs in Simplified Chinese or Cyrillic or Bengali scripts, Arabic speakers in turn won't be able to make heads or tails of those.

But, whatever, my opinion's on the losing side of history, so it's almost pointless to express it. I just thought I'd get it out there that there was a real benefit to, and precedents for, the status quo where a convention originating in one culture diffuses and becomes universal.

Saturday, May 01, 2010

Kubuntu Lucid and KDE 4 reactions

Speaking of how software makes you dependent on other people, the newest Ubuntu Long Term Support (LTS) release just came out. This means that in a year, support for the previous LTS release will wind down, which in turn means that Ubuntu users must upgrade sooner or later, unless they want to sacrifice security updates and compatibility with new releases of third-party software.

So, I took the plunge: yesterday I downloaded and installed Kubuntu Lucid.

This is the first Ubuntu LTS release that runs KDE 4, the latest major revision of KDE. I've been using KDE for about 11 years, ever since version 1.1. My immediate reaction was simply that KDE 4 is a mess. And after playing around for a few hours, tweaking settings, and trying to settle in, I still think KDE 4 is a mess. As I use it more, I'm not settling into it; I'm simply accumulating more irritations.

Without exhaustively listing all the details, my complaints basically break down into three categories.

First, there are pervasive performance problems. In every corner of the UI, "shiny" effects have been prioritized over responsive, performant interactivity. To take just one example, under KDE 3.5, the Amarok media player used to be super snappy and responsive; it left iTunes and Windows Media Player in the dust. In KDE 4, Amarok takes a couple of seconds to expand one album or to queue up songs, and resizing UI panels is painfully slow and janky. (My workstation has a 2.13GHz Core 2 Duo and a good graphics card. This should not be happening.) Similar problems can be observed in the desktop panels, file manager, etc.

Second, in general, the UI changes seem designed to push KDE's new technology into your attention space, rather than getting out of the way so you can accomplish tasks. Again, here's just one example: in the upper right corner of the desktop, there's a little unremovable widget that opens the "activities" menu.

The upper right corner of the desktop is a hugely valuable piece of screen real estate. By placing this widget in the upper right corner, the developers are signaling that this menu contains operations which will be frequently accessed. Do they really think users will add new panels to the desktop frequently? (For non-KDE users, a "panel" is KDE's equivalent of the Mac OS X dock or the Windows taskbar.) So far, almost every time I've clicked this widget, it has been by accident while trying to close or resize a window.

If you're a desktop developer who wants to show off your technology, this design may sound good: you put this menu there to make sure users discover your desktop widget and "activities" technology*. However, if you're a user, then this menu mostly gets in your way, and you wish it were tucked away somewhere more discreet.

Third, the KDE 4 version of every application has fewer features and more bugs than the KDE 3 version. The "Desktop" activity no longer has a way to "clean up" icons without repositioning all of them into the upper-left corner. The Konsole terminal application's tab bar no longer has a button from which you can launch different session types. The list goes on.

Anyway, of course, I don't pay for KDE, and so in some sense this is all bitching about free beer. However, suppose I did pay for KDE. Would I have any more input into the process? Windows users pay for Windows; if you don't like the direction Vista and Windows 7 are taking the UI, do you think you personally have any chance of influencing Microsoft's behavior? Mac users pay for Mac OS X; if you disagree with Steve Jobs, do you have any chance of influencing Apple's behavior? In fact, you do not, and both user populations have experienced this reality multiple times in the past decade. Mac users loved the Mac OS 9 UI but they had to give it up when Apple stopped supporting it on new Macs. Microsoft users who are attached to the Windows XP UI will likewise be forced to give it up eventually, when Microsoft stops sending security patches.

The KDE 3 to KDE 4 transition is simply KDE's version of the OS 9 to OS X transition, or the XP to Vista/7 transition. Except that those seem to have worked out OK in the end, whereas KDE 4, which was released over two years ago, seems to have lost its way permanently.

I'm writing this post not just to point out KDE 4's defects — I mean, it feels good to vent, but who really cares — but also to marshal further evidence in support of my contention that owning software doesn't mean much anymore.

Even the fact that KDE is Free Software means little in this case. I mean, what am I supposed to do now? I can't stay with the previous Ubuntu LTS release forever, unless I want to expose myself to security risks and be unable to run or compile new software, both of which are deadly for a software developer. Conversely, I can't singlehandedly maintain a fork of the KDE 3 environment forever; this guy's trying, but without a large and active community behind the project, it's doubtful that it will remain current for long. And frankly, I'm getting older, and I don't have enough time to invest in both hacking around with my desktop environment and also accomplishing the other things I want to accomplish in my life.

So, I can either (1) suck it up and live with KDE 4, or (2) abandon the desktop environment I've grown to love over the past 11 years, and jump ship to GNOME or something. (Right now I'm leaning towards (2).) Adopting software means making a calculated bet on the behavior of other people. And sometimes you lose.


*BTW "activities" are 80% redundant with virtual desktops and therefore hugely problematic and confusing as UI design, but I won't get into that.