Last week, I wrote about John Underkoffler, designer of the interface technology featured in the 2002 movie “Minority Report,” who shared his insights on what he saw as the core significance of that technology. As cool as the Tom Cruise character’s gestural manipulation of data across multiple displays in a spatial operating environment was, Underkoffler said, “that stuff is important, but it’s hardly the most important thing. The collaborative element is way more important.”
In any case, it’s hard to watch “Minority Report” — or “Iron Man,” the other movie that featured Underkoffler’s UI design work — and not be blown away by the sheer coolness of the visual experience. So I took advantage of my recent opportunity to speak with Underkoffler to get a sense of what the impact of his film experience has been from his perspective. I opened this portion of the interview by asking him what in the movie would be different if he had gotten the call to develop the interfaces for “Minority Report” today rather than 15 years ago. He said it was a difficult question to answer, because the movie has had such a huge impact on how people have developed technology in the intervening 15 years. But he gave it a shot:
I was trying to show the world a bunch of stuff when I designed the “Minority Report” interface, what we ended up seeing in 2002. And the thing that people really latched onto was the gestural part. It was the most obvious and the most extroverted part, because you’ve got Tom Cruise up there waving his arms around, and it looks really cool. And it is actually very powerful. But it wasn’t the only thing that I was trying to show, and I think what I would do today is really foreground some of the other pieces. The gestural parts of what’s going on in “Minority Report” I would still show, but I’d really accentuate these other pieces. One is that this is actually a collaborative user interface. If you go back and watch those scenes, of course the camera lingers on Mr. Cruise more than the others — he’s the main character, he’s the highest-paid actor. But what you’ll also see is a bunch of people. There’s a small team of really capable people; there’s a SWAT team that’s come in to solve this time-critical problem. And they’re able to do it because of the UI. The UI connects them, it allows them to work together, and that’s something that modern-day UIs still don’t do. And this is Oblong’s big mission right now: to produce user interfaces that are collaborative, that literally invite people to work at the same time, instead of being isolated behind their glowing rectangles. So the collaborative piece I would definitely foreground.
What that would feel like, watching this new, parallel universe version of “Minority Report,” is that you would marvel at how this team was able to function almost like a hive mind, to have their ideas and their intent and their will and their analytic capabilities joined together as if there were cables going from one human head to another. Except it’s not cables, it’s a user interface.
As I pointed out in my previous post, Underkoffler went on to co-found and head up Los Angeles-based Oblong Industries, which has commercialized this technology in the form of its flagship product, Mezzanine. He continued his explanation of the movie experience in that context:
The other piece that I was keen to show is an idea that we also pursue at Oblong, which has to do with scale. Which is, how much stuff do you need to see in order to get your work done? If the answer is, everyone is only going to have a smartphone from here into infinity, that’s not a very good answer, because that says any kind of work you need to do can be done, or has to be done, on this little three-by-five card. But we know that’s not true — you’re not going to solve a really time-critical, tricky traffic-redirect problem, when the president visits your city, on a three-by-five card or on a smartphone. What you need is a big space. What you need is what you would have done before the world had computers. You’d come into a room, you’d unroll a map on a table, you might even push little bottles around, three of you would stand at the whiteboard or the blackboard, you’d sketch solutions together, you would use the space — and a lot of space — to contain the problem. And that’s what “Minority Report” really showed: The problem of figuring out all the particulars of the future murder was really complicated. There were tens of thousands of images that needed to be correlated, and the characters had to spread them out. You can’t do that on a little screen, so what you need is a UI that gives you access to lots and lots and lots of pixels, lots of visual space. Because visual space is where the hardest problems get solved, and you need to be able to spread out.
I asked Underkoffler which he found more challenging: his work for “Minority Report,” or his work for “Iron Man.” He said “Minority Report” was more challenging, because he was starting from scratch:
It’s interesting — in developing the stuff for “Minority Report,” I was developing to a set of ideas. The payoff, the goal, was to be able to depict these ideas, to articulate these ideas, in such a way that people would see them and instantly understand them onscreen. In “Iron Man,” you’re really developing for a person, for a character. You’re developing literally for Tony Stark, and the truth is you’re also developing for Robert Downey, Jr., who in the real world is pretty much as smart as Tony Stark is in the fictional world, and just as demanding. So they’re very different kinds of design problems. If you think about it, the “Minority Report” interface, again, is collaborative. The idea is that a lot of people use this interface. In the story world of “Iron Man,” where did the UI come from? In the story, Tony built it for himself. It’s a product with exactly one customer. Tony doesn’t care if no one else in the world can use it; he alone needs to be able to use it. So it’s a lonelier kind of interface, in a way. But at the end of the day I think it was more challenging to do the “Minority Report” thing, because we were starting in the movie world from scratch. When it came time to do “Iron Man,” we already had “Minority Report” and a bunch of other stuff to reference. Of course at the same time, in “Minority Report” I was making very liberal use of all of the work that I had done in the ’80s and ’90s at the MIT Media Lab, so it’s not like there was no precedent.
To get a sense of how this interface technology is being used in real life, I asked Underkoffler to provide an overview of how one of Oblong’s key customers, PwC, is using it. He explained it this way:
Mezzanine is Oblong’s flagship product — it’s really the “Minority Report” operating system in a way, a real-world instantiation of the “Minority Report” scenes. PwC built it into a facility they call the Delta Room. And I guess I should start with Mezzanine itself, which is general-purpose, and which not only PwC but a hundred other customers on six continents are using. Mezzanine, in addition to being literally “Minority Report,” is a new kind of computer that lets you arrange visual information all over the room on lots and lots of screens. It’s a computer that lets you control it from anywhere in the room, or even around the world. It’s a computer that, for the first time, many people can use at the same time. So PwC uses Mezzanine in all those modes. They bring in clients — some of their most important clients — and they spend half a day or a day in an intensive session where they’re just working together with the client. And this is very different from a typical engagement, where you might have a several-hour-long pitch to a client. What they understand deeply at PwC, and what Mezzanine has been able to let them go even deeper with, is that the best kind of client engagement is actual productive engagement — not a one-directional pitch, but a bi-directional working session. So in their Delta Room, they’ve got clients, they’ve got PwC consultants, and in a kind of structured way, everyone in the room is tossing visual ideas up on the screens, which is what Mezzanine lets you do, moving them around, juxtaposing them, seeing what kind of meaning comes out, trying out different ideas, parking results over on the side screens, and iterating. So the room itself becomes a canvas. The architecture itself is a computer, which is another of Mezzanine’s ideas.
Underkoffler wrapped up the conversation by explaining how Accenture is using Mezzanine, as well:
Accenture has a somewhat analogous use case. They’ve actually built Mezzanine into a new offering called the Accenture Connected Analytics Experience. The activities that Accenture undertakes using Mezzanine are much more focused than the PwC deployment that I was describing, and in particular, as the name suggests, around Accenture’s analytics offering. They actually use Mezzanine as a medium to deploy their various powerful analytics software elements. So instead of just looking at one bit of analytics on a small screen at a time, you fill the room visually with results, with the analytics engine kind of cranking away. And again, you can see lots of stuff all at the same time, which is something that humans do well, and to date, most computers haven’t done very well.
A contributing writer on IT management and career topics with IT Business Edge since 2009, Don Tennant began his technology journalism career in 1990 in Hong Kong, where he served as editor of the Hong Kong edition of Computerworld. After returning to the U.S. in 2000, he became Editor in Chief of the U.S. edition of Computerworld, and later assumed the editorial directorship of Computerworld and InfoWorld. Don was presented with the 2007 Timothy White Award for Editorial Integrity by American Business Media, and he is a recipient of the Jesse H. Neal National Business Journalism Award for editorial excellence in news coverage. Follow him on Twitter @dontennant.