Sunday, April 15, 2012

Stross on ebooks

Disclaimer: Since 2006 I have been employed by Google, which sells ebooks and related technology, and therefore competes with several of the companies involved in the subject matter. My opinions are my own and not those of my employer.

Charlie Stross has a compelling analysis of the book publishers' position, in light of the recently filed Department of Justice lawsuit against Apple for alleged price fixing. (Incidentally, the comment threads on Stross's blog are usually decent, and he engages with his readers there, so they're worth a skim.)

Stross's post reminds me a lot of something I wrote about 2 years ago. Rereading that post today, I find that I have very little to add to it.

There is one additional thing from Stross's post, though, that I find particularly galling:

. . . if your boss is a 70 year old billionaire who also owns a movie studio and listens to the MPAA, you don't get a vote. Speaking out against DRM was, as more than one editor told me over the past decade, potentially a career-limiting move.

Publishing companies like to portray themselves as scrappy underdogs locked in heroic battle on the side of knowledge against the forces of ignorance. In fact, they are merely subordinate tentacles of large, stupid media conglomerates; they aid the forces of knowledge when it is convenient for business, and do whatever they can to muzzle open discourse when that is convenient. (Fortunately their power to silence critics is fairly limited in a free society.) Critics have been repeating for years that ebooks should be convenient and DRM-free. Publishers never listened; instead they threatened the careers of people like Stross's editors for even bringing it up.

I can't comment on the substance of the DoJ lawsuit (I know nothing about the facts of the case, so my comments would be pointless anyway), but I find it hard to muster much sympathy for these people. The solution to their problems has been staring them in the face for as long as ebooks have existed. Unfortunately, I'm somewhat less sanguine than Stross that publishers are going to learn the lesson. They seem pretty impervious to persuasion.


p.s. See also Tim O'Reilly's Plus post.

Friday, April 13, 2012

The ideology of Dilbert

So apparently people are still leaving comments on that post about Dilbert from half a decade ago. The latest calls me a pointy-haired boss (this is a little funny because I've never been a manager; on the other hand, you could make the case that PHB-ness is a state of mind rather than a job title). I've been arguing with people on the Internet for so long, and am consequently so thick-skinned, that insults as mild as that one barely even register. Still, the ping prompted me to think some more about how Dilbert sucks.

Aside from failures in basic craftsmanship, which I discussed in my previous post, Dilbert cultivates a poisonous worldview. Here are a few lessons that you will learn by reading Dilbert strips:

  • Your boss, co-workers, and clients are not human beings, but objects to be ridiculed.
  • You are never at fault for anything; it is always your stupid boss, stupid co-workers, and stupid clients.
  • There is no possibility of change. Every attempt at change will be thwarted by the system.
  • There is no possibility of escape. Everywhere else you go will be equally bad.

Would you want to be friends with someone who believes these things? Do you want to become a person who believes these things? Then why would you read Dilbert?

As a corollary of the above lessons, here are a couple of plots that you will never see on Dilbert, even though I think they make sense in the context of a dysfunctional workplace, and could be funny.

Plot: Dilbert runs a job interview with a smart, capable young college graduate, who passes all his questions with flying colors. Dilbert takes him aside and whispers in his ear, "Run. Run now, while you still can."

Reasons you will never see this: (1) Dilbert never shows any compassion for another human being. (2) Nobody on Dilbert is ever competent at anything except Dilbert himself. (3) It would raise the uncomfortable question of why Dilbert himself does not leave.

Plot: In a long-running arc, Dilbert leaves his office to found his own startup, small business, or consultancy. He makes all kinds of hilarious mistakes, of the type which founders inevitably make, thus bringing his business to the brink of failure. He finds that he can only be rescued by dogged persistence and the help of mentors and allies.

Reasons you will never see this: (1) Dilbert can never be shown to be at fault for anything. (2) Other people can never be seen as a force for good. (3) The root causes of problems in the workplace can never be shown to arise from the inherent difficulty of making anything work well; it must always be rank stupidity, arrogance, or some other venial human flaw. (4) Scott Adams has no idea how to draw any setting besides a cube farm.

The last of these is particularly telling, because Adams obviously left his job to found his own business as a cartoonist (and as a result became quite wealthy). But the fantasy he sells to his audience is not one of struggling to change the objectively terrible conditions of their lives, as he did; it is one of complacently remaining in place while cultivating a smug sense of superiority and alienation. The relationship between Adams and his audience is therefore one of condescension and contempt.

Long ago, Paul Graham ran an ad for his seed capital fund titled "Larry and Sergey Won't Respect You In The Morning"; Graham was articulating a disdain for large corporate workplaces that is a distant cousin of Adams's, and yet there is a crucial difference. Graham is (admittedly for self-serving reasons) trying to channel his audience's discontent into an urge to follow their dreams and change the world. What would a similar ad for Adams's work look like? Oddly enough, it would be something like this: "I, Scott Adams, won't respect you in the morning. I left my job to build something of my own. Now sit there and read my comics like the powerless peon you are, and were always meant to be. Ha ha!"

Wednesday, February 15, 2012

In which Kevin Drum walks in wearing clown shoes, a clown nose, and a floppy wig, and is taken seriously

So Kevin Drum wrote something ridiculous about copy protection, and Tim Lee and others (see the comment thread) have tried to write thoughtful responses. But this is superfluous. Drum's original post is ignorant, arrogant, and insulting. Drum lazily handwaves away the decades-long failed boondoggle of technological copyright enforcement, and the combined opinions of practically all subject-matter experts who are not directly employed by the publishing oligopolies. He does not attempt to refute the evidence; in fact he does not even engage with it. Along the way he manages to sneak in some snide insults for the people who made general-purpose computers (like the one he is typing on) the incredible instruments of human creativity that they are today. What makes anyone think that a mere presentation of further evidence and argument is going to sway him?

Scientifically, the proposition that technological copyright enforcement can dramatically reduce infringement without severe and costly restrictions on liberty is in the same ballpark as climate change denial and cures for homosexuality. I could explain the implications of the Church-Turing thesis very patiently, in very small words, but frankly it strikes me as rather like reading aloud to a student who's not only too lazy to read the book, but too lazy to crack open the Cliffs Notes.

And I would add that this whole business looks very different when (as is the case for many in Silicon Valley) people in your extended social network have had startups or products crushed by errant IP law. Furthermore, consider that countless engineer-years have been wasted dreaming up and implementing fruitless schemes like DVD CSS; however well those engineers were compensated, those are real, concrete slices of wasted human life. (Counting that production as economic output is rather like the broken-window fallacy, except that the window never gets fixed and the guy who broke it gets paid.) The publishing oligopolies demand the satisfaction of their fantasies, and engineers pay the price in sweat and tears. Against all this, weigh the extremely weak empirical evidence for large-scale harms from digital copyright infringement.

Drum strikes me as the moral equivalent of a priest reassuring his lord that the farmers ought to be all too happy to be taxed a few more bushels of grain to burn to the sky gods. Of course it costs him nothing to utter those words, so he can afford to be unbelievably cavalier. But this type of behavior should not command our respect.

Sunday, February 05, 2012

Desktop environments: get out of my way

Increasingly the applications you use — yes, you, sitting there, not some rhetorical "you" — can be broken into two categories:

  1. Lightweight stuff where you just want to minimize your maintenance costs. You don't particularly care whether you learn every keyboard shortcut; you just want it to be simple and cheap and available wherever you go.
  2. Heavyweight applications where you do hardcore work. You are an expert operator of these applications, and any loss of functionality, however minimal, annoys you.

Which category a given application falls into differs from person to person. For a programmer, category (1) might include a photo editor and category (2) might include a text editor. For a graphic artist, these might be reversed.

Desktop environment developers have convinced themselves that they can impose "human interface guidelines" on the user to make the user experience simpler, more consistent, etc. This is an illusion, because category (1) applications will eventually all be on the web, while category (2) applications have never obeyed any platform's human interface guidelines (HIG), and they never will.

Maya 3D laughs at your pathetic HIG, Microsoft. Emacs laughs at your pathetic HIG, Ubuntu. Photoshop laughs at your pathetic HIG, Apple. (For that matter, your own software usually laughs at your HIG, Apple.) When you spend all day doing hardcore work inside an application, you don't give a fuck if it's inconsistent with everything else you do, because the gains from having that app work exactly the way you want far outweigh the loss from having it be slightly inconsistent with your MP3 player and your PDF viewer. For category (2) applications, the homogenizing influence of the HIG makes about as much sense as having a carpenter install exactly the same grip on a hammer, a power drill, and a jigsaw.

As for web applications, the web laughs at all platforms' HIGs. And anyway it's more important that they be consistent among web browsers than that they be consistent with the conventions of the host platform. Unless you're one of those douchey people who carry their iPad everywhere in a little murse non-murse device that makes you feel acceptably masculine, you're going to check your email using some other device sometimes.

So what we basically need from desktop developers is to get the fuck out of the way. Integrate really well with the web browser, and give the user the most efficient, unobtrusive way possible to switch between category (2) applications.

Thursday, November 17, 2011

The social graph is...

"...neither social nor a graph..."

...provided you redefine the words "social" and "graph" to mean something other than what they mean to everyone else.

M. Ceglowski is just being deliberately obtuse, or more precisely he is taking a wild excess of rhetorical license in order to make his statements seem more profound and unconventional. For example, he writes:

We nerds love graphs because they are easy to represent in a computer and there is a vast literature on how to do useful things with them. . . . In order to model something as a graph, you have to have a clear definition of what its nodes and edges represent.

Well, that's actually bullshit. In a dynamic Bayesian network, you don't have a complete a priori definition of what the nodes and edges represent. Strictly speaking you do, in that the nodes represent variables and the edges represent relationships between those variables, but the weights on the edges are learned statistically from data. An edge may represent a meaningful connection, or it may mean nothing at all. The graph precedes the semantics, not vice versa. Likewise with the social graph: people are connected, and you don't necessarily know what each connection means. But it's still a graph.
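The point that the graph can precede its semantics is easy to make concrete. Here's a minimal sketch in plain Python (the people and the interaction log are made up for illustration): edges are recorded first as bare connections, and only afterward is a label estimated from data, here a crude interaction-frequency weight standing in for whatever the edge "means".

```python
from collections import defaultdict

# Hypothetical interaction log: pairs of people observed interacting.
interactions = [
    ("alice", "bob"), ("alice", "bob"), ("alice", "carol"),
    ("bob", "carol"), ("alice", "bob"),
]

# Step 1: record the bare graph: who is connected to whom.
# At this point an edge means nothing beyond "a connection exists".
edges = {frozenset(pair) for pair in interactions}

# Step 2: only afterward, estimate a label for each edge from the data;
# here, a normalized interaction frequency stands in for edge "meaning".
counts = defaultdict(int)
for pair in interactions:
    counts[frozenset(pair)] += 1
weights = {edge: counts[edge] / len(interactions) for edge in edges}

for edge, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(sorted(edge), w)
```

In a real system the learned labels would be richer and higher-dimensional than a single frequency, but the shape of the computation is the same: connections first, meaning later.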

The labels on the social graph's edges may be subtler and more multidimensional than the simple weights you put on Bayesian network edges. And we don't have a good handle on how to learn those labels, or even what the labels should be. However, calling for the abandonment of a useful mathematical construction in an emerging field of science because it's incomplete is something that you do when you want to convince people that you're smarter than the people working in that field. It's not something you do when you want people to become better-informed.

Ceglowski also writes that the social graph is "not social" because... well, actually, I have trouble even locating a coherent argument in that part of the essay. He seems to be confusing "social" with "sociable". The social graph is social, since it describes relationships between people. Perhaps some activity involved in digitally reifying the social graph is anti-social (Note that anti-social is not the opposite of social — anti-social behaviors are social behaviors!). But that doesn't make the social graph "not social". By that standard, sociology is not a social science because sociologists spend a lot of time by themselves in libraries.

Incidentally social scientists have been modeling social connections as graphs for decades.

Here is a short list of the valid points Ceglowski makes:

  1. FOAF relationship labels are kind of dumb and embarrassing.
  2. Manually maintaining anything other than a very coarse-grained digital reification of a social graph is a tedious chore.
  3. Making your social network and behavior the property of a company whose revenue model is not aligned with your long-term interests is a bad idea.

And here is a short list of other, non-terminological points that Ceglowski just gets wrong:

  1. Social networks do "[g]ive people something cool to do and a way to talk to each other". It turns out that sharing photos, videos, and links is one of the most broadly appealing online activities, and social networking sites seem to do this better (along some dimensions) than dedicated photo-, video-, and link-sharing sites.
  2. Judging communities by the outward-facing cultural artifacts they produce is a radically inadequate measure of value. The vast majority of communication is point-to-point, not broadcast, and the vast majority of interpersonal interactions are social grooming. Social grooming is a deep-seated primate instinct which nerds devalue at their peril. Social networks have made online social grooming far easier than their predecessors did.
  3. People on WoW, Eve Online, and 4chan have healthier social lives than people on Facebook? Really?

Note that I write all the above as someone who dislikes Facebook and is skeptical of reductive approaches to modeling social relationships. And I've been advocating* an end to proprietary social networks for years — long before I started working at Google, and in fact before Facebook was even the predominant social network. So I'm broadly sympathetic to Ceglowski's aims. But I don't like the way he goes about arguing for them.


*Incidentally, rereading this old post, I realize that I completely missed the possibility that the dominant social network site would simply become a huge platform for third-party applications. I guess it never occurred to me that serious companies would bet their livelihoods on being sharecroppers in the walled garden. Go figure. I could speculate that this willingness can be traced directly to the Valley vogue for building companies to flip rather than to create sustainable, decades-long sources of enduring value — if you're just holding on until your "liquidity event" then it doesn't matter that your business is built on the fickle forbearance of your platform landlord — but I'm not sure how right that is.