Thursday, May 06, 2010

On internationalized TLDs (a contrarian opinion)

The Timberites rejoice. I'm obviously revealing my North America-centric roots, but I think this is a huge amount of cost for insufficient benefit.

Great civilizations leave their stamp on the conventions of world culture. The Romans gave us their calendar; the Indians gave us the modern numbering system; the Italians gave us terms and symbols used throughout the Western world in musical notation (piano, fortissimo, crescendo, ...). The global recognizability of these signifiers is part of what makes them useful.

In the modern era, American culture predominates in computing. In practically every programming language, English words like begin, define and integer (or abbreviations thereof) have special meanings understood by every programmer in the world.

With respect to TLDs, there are two alternatives before us. Alternative one is to make everyone in the world simply learn to use ASCII TLDs. Alternative two is to make everyone in the world learn to use, or at least recognize, TLDs in every Unicode script. Alternative one is actually the simpler alternative, even for non-English speakers.

Imagine if numbers were subject to the politics of modern i18n. We would have the modern positional decimal numeric system, but also the Roman numeral system, and the Babylonian numeral system, and so on, and nobody would ever have asked anyone to standardize on any of them. After all, we have to be sensitive to the local numeric culture of the Romans!
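To make the analogy concrete, here's a purely hypothetical sketch (the numerals and the dispatch scheme are my invention, not anything proposed by ICANN) of what "i18n for numbers" would cost: every consumer of a document would need a parser for each numeral system in use, plus logic to guess which system it's looking at.

```python
# A caricature of the i18n-for-numbers world: every reader needs one
# parser per numeral culture, plus dispatch logic to pick between them.

ROMAN = {"M": 1000, "D": 500, "C": 100, "L": 50, "X": 10, "V": 5, "I": 1}

def parse_roman(s):
    """Parse a Roman numeral (subtractive notation) into an int."""
    total = 0
    for ch, nxt in zip(s, s[1:] + " "):
        value = ROMAN[ch]
        # A smaller value before a larger one is subtracted (IV = 4).
        total += -value if ROMAN.get(nxt, 0) > value else value
    return total

def parse_number(s):
    # Dispatch on "script": one branch per numeral system supported.
    return parse_roman(s) if s[0] in ROMAN else int(s)

print(parse_number("MMX"), parse_number("2010"))  # 2010 2010
```

Standardizing on positional decimal numerals deleted all of that complexity from the world; standardizing on ASCII TLDs did something analogous for domain names.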

It's not like I'm saying people should communicate in English all the time. I'm only saying that people should learn to type and to recognize ASCII TLDs. This is a relatively limited set of special-purpose identifiers: there are only about an order of magnitude more ccTLDs than months in the year or decimal digits. And I would claim that it's useful for everyone in the world to recognize that, say, .uk and .com look like the end of a domain name, whereas .foobar123 does not. Pop quiz: which one of the following is a new Arabic ccTLD, مصر or مصام? The reason you can't recognize it is not just that you're an English speaker; people who only speak Mandarin or Spanish or Russian are in exactly the same boat as you. And when ICANN unveils Simplified Chinese or Cyrillic or Bengali TLDs, Arabic speakers in turn won't be able to make heads or tails of those.
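There's an irony lurking under the hood here: internationalized domain labels are not even transmitted in their native scripts. The DNS carries them as ASCII "punycode" labels (RFC 3490/3492) prefixed with "xn--", so the ASCII identifier never really goes away; it's merely hidden from the user. A minimal sketch, using Python's built-in "idna" codec:

```python
# IDN labels travel through the DNS as ASCII-compatible "punycode"
# (prefixed "xn--"); Python's built-in "idna" codec does the conversion.

label = "مصر"  # Egypt's new Arabic-script ccTLD
ace = label.encode("idna")  # ASCII-compatible encoding
print(ace)                  # b'xn--wgbh1c', the form delegated in the DNS root
assert ace.startswith(b"xn--")
assert ace.decode("idna") == label  # round-trips back to the Unicode form
```

So the choice isn't really "ASCII vs. native script"; it's whether the universal ASCII form is something humans see and learn, or something hidden behind per-script presentation.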

But, whatever, my opinion's on the losing side of history, so it's almost pointless to express it. I just thought I'd get it out there that there was a real benefit to, and precedents for, the status quo where a convention originating in one culture diffuses and becomes universal.

Saturday, May 01, 2010

Kubuntu Lucid and KDE 4 reactions

Speaking of how software makes you dependent on other people, the newest Ubuntu Long Term Support (LTS) release just came out. This means that in a year, support for the previous LTS release will wind down, which in turn means that Ubuntu users must upgrade sooner or later, unless they want to sacrifice security updates and compatibility with new releases of third-party software.

So, I took the plunge: yesterday I downloaded and installed Kubuntu Lucid.

This is the first Ubuntu LTS release that runs KDE 4, the latest major revision of KDE. I've been using KDE for about 11 years, ever since version 1.1. My immediate reaction was simply that KDE 4 is a mess. And after playing around for a few hours, tweaking settings, and trying to settle in, I still think KDE 4 is a mess. As I use it more, I'm not settling into it; I'm simply accumulating more irritations.

Without exhaustively listing all the details, my complaints basically break down into three categories.

First, there are pervasive performance problems. In every corner of the UI, "shiny" effects have been prioritized over responsive, performant interactivity. To take just one example, under KDE 3.5, the Amarok media player used to be super snappy and responsive; it left iTunes or Windows Media Player in the dust. In KDE 4, Amarok takes a couple of seconds to expand one album or to queue up songs, and resizing UI panels is painfully slow and janky. (My workstation has a 2.13GHz Core 2 Duo and a good graphics card. This should not be happening.) Similar problems can be observed in the desktop panels, file manager, etc.

Second, in general, the UI changes seem designed to push KDE's new technology into your attention space, rather than getting out of the way so you can accomplish tasks. Again, here's just one example: in the upper right corner of the desktop, there's a little unremovable widget that opens the "activities" menu.

The upper right corner of the desktop is a hugely valuable piece of screen real estate. By placing this widget in the upper right corner, the developers are signaling that this menu contains operations which will be frequently accessed. Do they really think users will add new panels to the desktop frequently? (For non-KDE users, a "panel" is KDE's equivalent of the Mac OS X dock or the Windows taskbar.) So far, almost every time I've clicked this widget, it has been by accident while trying to close or resize a window.

If you're a desktop developer who wants to show off your technology, this design may sound good: you put this menu there to make sure users discover your desktop widget and "activities" technology*. However, if you're a user, then this menu mostly gets in your way, and you wish it were tucked away somewhere more discreet.

Third, the KDE 4 version of every application has fewer features and more bugs than the KDE 3 version. The "Desktop" activity no longer has a way to "clean up" icons without repositioning all of them in the upper-left-hand corner. The Konsole terminal application's tab bar no longer has a button from which you can launch different session types. The list goes on.

Anyway, of course, I don't pay for KDE, and so in some sense this is all bitching about free beer. However, suppose I did pay for KDE. Would I have any more input into the process? Windows users pay for Windows; if you don't like the direction Vista and Windows 7 are taking the UI, do you think you personally have any chance of influencing Microsoft's behavior? Mac users pay for Mac OS X; if you disagree with Steve Jobs, do you have any chance of influencing Apple's behavior? In fact, you do not, and both user populations have experienced this reality multiple times in the past decade. Mac users loved the Mac OS 9 UI but they had to give it up when Apple stopped supporting it on new Macs. Microsoft users who are attached to the Windows XP UI will likewise be forced to give it up eventually, when Microsoft stops sending security patches.

The KDE 3 to KDE 4 transition is simply KDE's version of the OS 9 to OS X transition, or the XP to Vista/7 transition. Except that those seem to have worked out OK in the end, whereas KDE 4, which was released over two years ago, seems to have lost its way permanently.

I'm writing this post not just to point out KDE 4's defects — I mean, it feels good to vent, but who really cares — but also to marshal further evidence in support of my contention that owning software doesn't mean much anymore.

Even the fact that KDE is Free Software means little in this case. I mean, what am I supposed to do now? I can't stay with the previous Ubuntu LTS release forever, unless I want to expose myself to security risks, and also be unable to run or to compile new software, both of which are deadly for a software developer. Conversely, I can't singlehandedly maintain a fork of the KDE 3 environment forever; this guy's trying but without a large and active community behind the project, it's doubtful that it will remain current for long. And frankly, I'm getting older, and I don't have enough time to invest in both hacking around with my desktop environment and also accomplishing the other things I want to accomplish in my life.

So, I can either (1) suck it up and live with KDE 4, or (2) abandon the desktop environment I've grown to love over the past 11 years, and jump ship to GNOME or something. (Right now I'm leaning towards (2).) Adopting software means making a calculated bet on the behavior of other people. And sometimes you lose.


*BTW "activities" are 80% redundant with virtual desktops and therefore hugely problematic and confusing as UI design, but I won't get into that.

Sunday, April 18, 2010

In which an Icelandic volcano prompts the funniest paragraph on the Internet today

Yglesias writes:

Ever since the eruption, I know I can’t be the only person who’s been wondering how to say “Eyjafjallajökull.” In principle, the Internet and its multimedia cornucopia ought to shed a lot of light on this issue. In practice, no matter how many times I click over here and hear it pronounced, I can’t come any closer to saying it myself.

I thought oh come on how bad can it be. And then I clicked through to Wikimedia and I burst out laughing.

Cynicism and libertarian ends

So when I wrote that cynicism about government does not help the libertarian cause, a libertarian might regard the suggestion with suspicion, given that I'm just another big government liberal. Well, at least one libertarian agrees with me. And his NYTimes column actually contains a lot of stuff that I find pretty risible.

Sunday, April 11, 2010

Computer science and the iPhone developer agreement

Full disclosure: I work for Google. However, this blog reflects my personal opinions only.

Programming and computer science are not synonymous, but obviously the two are deeply intertwined. The fundamental activity of programming is the construction of abstractions. Programming language design and implementation is one of the fundamental forms of abstraction building. It is central to the field, and has been so nearly since its inception. One of the oldest and most important research conferences in computer science is named Programming Language Design and Implementation.

This suggests a particular understanding of what Section 3.3.1 means. Section 3.3.1 says: "Thou shalt not build abstractions other than those we prescribe." It bans one of the fundamental activities of programming.
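To see how little it takes to cross that line, consider a toy example (mine, not anything from Apple's agreement): defining and implementing even a tiny language is exactly the kind of abstraction-building the post describes, and an interpreter this small already counts as "interpreting code" outside the prescribed languages.

```python
# A toy "programming language design and implementation" exercise:
# an evaluator for a tiny prefix-notation language, e.g. "(+ 1 (* 2 3))".
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def tokenize(src):
    # Pad parentheses with spaces so split() yields one token per symbol.
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expr
    return OPS.get(tok) or int(tok)

def evaluate(expr):
    if isinstance(expr, list):
        op, *args = expr
        return op(*(evaluate(a) for a in args))
    return expr

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7
```

Twenty-odd lines, and you have a new language: a new abstraction layered over the machine. That this is trivial to do is precisely why a blanket prohibition on it is so striking.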

This would be a mere curiosity, except for Apple's unusually influential position in the computing industry. All trends point towards mobile devices* becoming much more pervasive than all other general-purpose computing devices. Indeed, the combination of mobile and cloud computing may someday replace all other user-visible hardware except what's needed to support input and output (screens, cameras, etc.). And Apple has the credible goal of becoming the preeminent mobile device provider, setting standards for the industry and defining the entire computing experience for a huge swath of future computer users.

Section 3.3.1 therefore constitutes a direct attack on computer science, delivered by a powerful and well-funded organization that aims to transform laypeople's interface to the field. As long as 3.3.1 stands, for a computer scientist to purchase an iPhone or iPad is akin to a biologist purchasing a textbook that advocates against teaching evolution. Full stop. Go ahead and do it if you can't resist the shiny, but understand the moral weight of the decision you're making.

I can already hear people ready to trot out the standard roster of excuses. Hit the comment box if you want, but realize that I've anticipated the common objections and the only thing stopping me from preemptively rebutting them all is the fact that I'm moving soon and I have a huge number of boxes to pack. To pick just three examples:

Q: "The iPad isn't for people like you. Why do you care?"
A: "This post isn't for people who don't care. Why are you reading it?"

Q: "Apple has a right to do whatever it wants with its platform. If you don't like it, you shouldn't use it."
A: "Thank you for agreeing with me."

Q: "You can program whatever you want in HTML5 and access it through Safari."
A: "Yes, the web is an open platform, which Apple fortunately does not control.** I'm talking about Apple's rules for programming on the platform that it does control."


*A.k.a. "phones". Incidentally, I think the British slang "mobile" is more elegant and generalizes far better than any of the {cell,smart,super,...}phone terms that are used on this side of the pond.

**Although I will remark that it's naive to imagine that a platform can be preeminent for very long without influencing the market of content and applications to which non-participants in the platform regime have access. There's a reason Hulu used to work on Flash only. But that's a post for another day.