UI Design as Normal Science

The desktop GUI has shaped the way we think about human-computer interactions for the better part of 40 years. That may be changing.

jjosephmiller · Nov 2, 2022
[Image: A young Steve Jobs, in an uncharacteristic blazer and bow tie, introduces the original Macintosh in 1984.]
I don’t really know how to cope with Steve Jobs in a bow tie and blazer.

So I’ve been re-reading Thomas Kuhn’s The Structure of Scientific Revolutions. It’s my first full read-through since my undergraduate days at Hampden-Sydney College. I’m pleasantly surprised by how much of it I remember.

I’m also struck by the parallels between Kuhn’s framework for understanding science and the history of human-computer interface design—the discipline more commonly known today as UI/UX design.

Kuhn on Science and Paradigm Shifts

It’s probably useful to begin with a brief recap of Kuhn’s basic argument. If you’re already up on 20th-century philosophy of science, then you can skip down to the next section.

Still here? Cool. Let’s start with a couple of Kuhn’s technical terms.

Normal science: “Research firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundations for further research.”

Paradigm: A scientific achievement that is “sufficiently unprecedented to attract an enduring group of adherents away from competing modes of scientific activity” and that also is “sufficiently open-ended to leave all sorts of problems for the redefined group of practitioners to resolve.”

The basic story here is that the early days of any new discipline are characterized by lots of competing theoretical frameworks. For example, in the early days of the study of optics, there were three major theories of how light worked.

  1. Light is made up of particles coming from certain types of physical bodies.
  2. Light is a change in the medium between the body and the eye.
  3. Light is a change in the medium between the eye and certain types of physical bodies.

Each of these theories was supported by some empirical data—collected via formal experiments and observation. But no theory could explain all the observed data. Not until Isaac Newton’s Opticks came along.

Adherents of earlier conceptions of light converted to Newton’s understanding. Within a handful of years following the publication of Opticks, to study optics just was to accept and work within Newton’s description of the nature of light.

In Kuhn’s terms, the Newtonian conception of light became a paradigm for the study of optics. Scientists working to extend, refine or apply the Newtonian paradigm were engaged in normal science.

Of course, paradigms don’t hold forever. As normal science proceeds, it invariably uncovers anomalies—discoveries that can’t quite be explained by the existing paradigm. Each new anomaly requires an ad hoc extension to fit it to the prevailing framework. The paradigm starts to creak under the weight of all the exceptions and amendments. Eventually someone will propose a brand new framework that explains both the older data and the newer anomalies.

Kuhn calls this work “revolutionary science.” Scientific revolutions bring about a paradigm shift, after which normal science resumes.

UI Paradigms

While we normally think of UX/UI work as design rather than science, there are some interesting parallels nonetheless. Howard Rheingold’s Tools for Thought outlines three distinct computer interface paradigms.

Punch Cards

Early computers were programmed by way of punch cards: small sheets of thin cardboard punched with rectangular holes. A program consisted of a series of cards, each with its own unique pattern of holes. This particular UI paradigm was borrowed from the textile industry's Jacquard loom, in which a bank of small rods controlled how threads were woven together. A punch card blocked some rods and let others through, changing the pattern woven into the cloth. Swapping cards thus allowed a single machine to produce many different patterns.

Early computers applied this concept, only with circuits. The holes in the cards allowed certain circuits to close while preventing others. By opening or closing different circuits, programmers could construct elaborate Boolean logical operations.
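
To make that concrete, here's a minimal sketch in Python. The card format and encoding are invented for illustration (no historical machine worked exactly this way): a punched hole closes a circuit, which we can read as a 1, and combining two cards' circuits yields Boolean operations.

```python
# Toy model of punch-card logic. Illustrative only: the '*'/'.' card
# format is invented here, not any real machine's encoding.
def read_card(columns: str) -> list[int]:
    """A punched column ('*') closes a circuit (1); a blank ('.') leaves it open (0)."""
    return [1 if c == "*" else 0 for c in columns]

card_a = read_card("*.*.")  # [1, 0, 1, 0]
card_b = read_card("**..")  # [1, 1, 0, 0]

# Two circuits wired in series behave like AND; wired in parallel, like OR.
print([a & b for a, b in zip(card_a, card_b)])  # AND: [1, 0, 0, 0]
print([a | b for a, b in zip(card_a, card_b)])  # OR:  [1, 1, 1, 0]
```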

Time Sharing

Punch card programming was slow. Producing the cards was itself laborious, and the cards then had to be fed to the computer one at a time. As computers grew faster and more powerful, inputting programs became the limiting factor; indeed, inputting and debugging programs began to account for more of the hours spent with the computer than actual processing did.

Time sharing was the first stage in interactive computing. In this new paradigm, many programmers shared a single machine at once, each creating programs directly on the computer rather than handing it a deck of cards. Inputs shifted from punch cards to keyboards (a repurposing of typewriter hardware), and outputs moved from paper printouts to CRT monitors.

Desktop GUI

The end goal of the interactive computing movement was a computer on every desk. But computers with only a keyboard for input had steep learning curves; true personal computing needed to be more accessible. Enter the graphical user interface (or GUI) and its desktop metaphor: little icons representing files, which could be tucked into folders with a mouse that let you click on an icon and drag it around the screen.

The desktop GUI was pioneered at Xerox PARC in the 1970s, but was introduced to the wider public with the release of Apple’s Macintosh in 1984.

UX/UI Design as Normal Science

Pretty much all UI design since 1984 has been the work of normal science. Yes, there have been some extensions of the desktop GUI model—most notably the extension of the paradigm to the web and to mobile. But the basic paradigm—files, folders, desktops, pointing and clicking—still holds.

On the web, we've replaced nested folders with hierarchical menus. Pages with unique URLs stand in for files. Content is arranged in lists, with metadata shown on cards. Most of our websites, particularly in the research sector, are faster versions of the venerable library card catalog.

We’ve done a lot of good work during this period of normal science. Card sorts. Tree tests. Usability testing. A/B tests. We’ve got data and analytics for days.

But as Kuhn reminds us, paradigms work by setting boundaries on what counts as science. A paradigm shapes the questions we think to ask, and so it also shapes the very experiments we run in the first place.

Consider how much of usability or tree testing is aimed at ensuring that users can find things. But the entire notion of finding things on a website presupposes graphical representations of things that get accessed by traversing a set of hierarchies. We conduct usability tests because we have graphical user interfaces grounded in a desktop metaphor.
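
To see how tightly our testing methods are coupled to the metaphor, here's a toy sketch (the labels and structure are invented; no real site or testing tool is assumed). A tree test reduces to checking whether the path a user chooses actually exists in the hierarchy:

```python
# A toy site hierarchy of the kind a tree test probes (labels invented for illustration).
site_tree = {
    "Research": {"Publications": {}, "Datasets": {}},
    "About": {"Staff": {}, "Contact": {}},
}

def find(tree: dict, path: list[str]) -> bool:
    """A tree-test 'success' is a chosen path that exists in the hierarchy."""
    node = tree
    for label in path:
        if label not in node:
            return False
        node = node[label]
    return True

print(find(site_tree, ["Research", "Datasets"]))  # True:  task success
print(find(site_tree, ["About", "Datasets"]))     # False: the user got lost
```

The question "can users find it?" only makes sense because the paradigm stores things in a tree to begin with.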

The Next UI Paradigm Shift

No paradigm holds forever. And the anomalies of the current GUI desktop metaphor are starting to pile up.

An extremely unscientific search (as in, I Googled and looked at the first result) suggests that we create 2.5 quintillion bytes of data each day. That's 2.5 × 10¹⁸. For comparison, the entirety of Wikipedia is about 2.1 × 10¹⁰ bytes. Interfaces based on physical objects can't cope with that sort of volume.
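
As a back-of-the-envelope check, taking both figures at face value:

```python
daily_bytes = 2.5e18       # ~2.5 quintillion bytes created per day (figure cited above)
wikipedia_bytes = 2.1e10   # ~21 GB, the rough size of Wikipedia cited above

print(daily_bytes / wikipedia_bytes)  # ~1.2e8: over 100 million Wikipedias per day
```

Call it a hundred million Wikipedias' worth of new data every single day.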

We’re starting to see some entirely new metaphors emerge.

A few examples:

  • The infinite canvas popularized by Notion
  • Bidirectional links and node-link diagrams, seen in Roam Research, Obsidian and Logseq (a minimal sketch of the backlink idea follows this list)
  • Active reading apps like LiquidText
  • AI-assisted searching in tools like DEVONthink
  • Live app embedding from desktop GUI stalwarts like Microsoft (Loop) and Google (smart chips)
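
For the bidirectional-links item, here's a minimal sketch of the underlying idea. It's tool-agnostic (this is not how Roam or Obsidian actually implement it): if every note records its outgoing links, backlinks fall out of a single inversion pass over the link graph.

```python
from collections import defaultdict

# Each note lists the notes it links to (titles invented for illustration).
outgoing = {
    "Kuhn": ["Paradigm", "Normal Science"],
    "GUI": ["Paradigm", "Xerox PARC"],
    "Paradigm": ["Normal Science"],
}

# Invert the link graph: every forward link implies a backlink.
backlinks = defaultdict(list)
for source, targets in outgoing.items():
    for target in targets:
        backlinks[target].append(source)

print(backlinks["Paradigm"])  # ['Kuhn', 'GUI']: the "what links here?" view
```

Conceptually, the "linked references" panel in these tools is just this inverted index, kept up to date as you write.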

These are metaphors that, as Raghav Agrawal notes, “basically invent new social practices” that are inherently digital; they have no real analogue in the desktop GUI paradigm.

We’re entering a period of revolutionary science in the world of UX/UI design. It’s still unclear what the new paradigm might be. Or whether it’ll be one that old fossils like me can adapt to.
