user experience

I’m at an industry meeting today about AJAX — basically the collection of technologies on the client and the server (but generally thought of mostly as DHTML & JavaScript) that most web companies are using to build out richer applications, mostly delivered via the web browser. Google’s GMail started this trend in earnest (you can quibble here), but today there’s a ton of interactivity in web applications that just wasn’t there a year or two ago. Mozilla generally and Firefox specifically have had a big impact here, as the development (and especially debugging) experience on Firefox for AJAX-style applications is significantly better than with any other browser, at least for the moment.
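
(For anyone who hasn’t poked at this stuff: the heart of the trick is just the browser’s XMLHttpRequest object, which lets a page ask the server for data and patch itself without a full reload. Here’s a minimal sketch of that pattern — the endpoint URL and element id are made up for illustration, not anyone’s real API:)

```typescript
// Minimal AJAX sketch: fetch a value from the server and update the page
// in place, no reload. "/inbox/unread-count" and "unread-count" are
// hypothetical names used purely for illustration.
function refreshUnreadCount(): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/inbox/unread-count", true); // true = asynchronous

  xhr.onreadystatechange = () => {
    // readyState 4 means the request is finished and the response is ready
    if (xhr.readyState === 4 && xhr.status === 200) {
      const el = document.getElementById("unread-count");
      if (el) {
        el.textContent = xhr.responseText; // update the page in place
      }
    }
  };

  xhr.send();
}

// Poll every 30 seconds, GMail-style, plus one immediate refresh.
setInterval(refreshUnreadCount, 30000);
refreshUnreadCount();
```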

Anyway. The first thing I’ll say is that we’re in this awesome building pretty close to where the Giants play in San Francisco — used to be the Macromedia building but is now, of course, the Adobe building. It’s really a terrific building: old brick, exposed timbers & beams — pretty much vintage SF Web 1.0 Bubble. 🙂 ***But*** there’s no friggin’ open wireless network. What the heck? Are we back in 2004 or something? Ugh. So.

But what I really wanted to post about in this particular missive is how much things have changed, truly, and how much more user-centered technology is, for all its foibles, than it was when I was at Stanford, or Trilogy, or even Apple, really. Back in 1990, Mitch Kapor wrote something called A Software Design Manifesto (I’ll link to the article sometime when I can connect). What he basically asserted was that the software industry (which had just witnessed the release of Windows 3.0 — craptastic!! Now, improved with LanMan!) needed a new role analogous to a building architect (or urban planner, some others posited) — someone to think about human use of software products. This was an important thought, and his essay was the first one that I read when I embarked on my work in HCI.

[Hmm. Gartner guy just noted an issue with Firefox 1.5.0.2 displaying GMail. Uh oh. Everyone in the room swiveled their heads to look at me. Hi, I’m the Firefox guy. In the orange shirt. Hard to miss.]

[Aside: this talk is veering pretty wildly back & forth between enterprise & consumer applications, and can I just say this: Oh my god. Enterprise is sleep-inducing. I can’t believe I’ve spent half my career to date working on enterprise. Thank goodness for the other half — the consumer half. I’m glad that I understand how to operate in both worlds, but man. Yawn.]

Anyway, even 5 years after Mitch wrote that essay, when I went to Trilogy, HCI was considered a slightly wacky discipline. I managed to convince Joe (our CEO) that HCI mattered for Trilogy. (Probably true. Hard to say. Long but boring story, as the aside above suggests.) I spent time working on it at Apple. And occasionally companies would hire us to help them with it at Reactivity. But it was always a bit of a hard sell, and I think most people viewed it as a discipline that was somehow separate from the technology or business activities that companies were working on.

Now, though, it’s different. With both consumer and enterprise software, it seems to me that everyone talks about the user experience, about how getting it right is crucial to application adoption, and application adoption is the thing that matters, either in getting a return on your investment in the enterprise or in getting consumers to use your service. It’s an interesting and fundamental shift, it seems to me. We’re getting to the point now where technology is still a little wonky, but at least the conversations mostly include putative normal human beings and ideas about how they’re going to use our work. That’s a great step forward.

But I think it’s more fundamental than that. In my admittedly mostly homogenous circle of friends (nerds, all of us), the things that people get most excited about these days are things like videos of Stephen Colbert online or experiences like World of Warcraft or open media like political bloggers or just places to interact like MySpace or Facebook. The technology matters — is critical, in fact — but it’s less often the point of what people are working on. And HCI seems to me to be blending into the fabric of the development & business teams instead of standing as a separate discipline (this is a super-long conversation and it’s more complicated than what I just wrote, but the gist is true.)

[Another side note: most startup guys that I talk with lately have MacBooks — it’s sort of the uniform of the moment here. But in this room, it’s probably more like 60% Windows machines and only 40% MacBooks. I think it’s maybe because a lot of these folks are in enterprise environments.]
