Victoria Lacroix



The Terminal is an Evolutionary Dead-End

Is change possible when something is so deeply ossified into the software ecosystem?

January 12, 2026


A massive subset of software developers, programmers, IT professionals, and computer enthusiasts swear by the terminal. Its popularity even seems to have grown over time. I myself have not been immune to its siren song, having spent many years with a workflow built primarily around the terminal. Development of software targeting the terminal is as popular as ever, because so many users simply prefer it. Still, I think that the rich ecosystem of new terminal software is a symptom of the terminal's stagnation, not a sign of its health.

The terminal as it exists today on most computer systems is a program that emulates a device called the VT100, made by Digital Equipment Corporation. For a decade, this device was the standard for accessing remote UNIX systems. Because UNIX users were expected to be sitting at a VT100, UNIX software such as Vi and Emacs was written specifically against the VT100's feature set to present sophisticated terminal user interfaces (TUIs).

Prior to the introduction of terminals like the VT100, computer operators entered commands line by line on physical typewriters connected to the computer, and those same typewriters printed the commands' output. These remote typewriters, called teletypes, were originally used for a form of telegraphy in which written messages could be exchanged instantly, without an operator having to translate Morse code back into letters. By hooking teletypes up to a computer, any number of users could interact with the machine at once with near-instant feedback, giving rise to the command-line interface. Because video terminals like the VT100 were seen as a more efficient replacement for the teletype, needing neither paper to print output nor tape to record terminal sessions, they were initially referred to as glass teletypes.

When the only mechanism for output was a line printer handling ASCII-encoded text, programs were written simply to emit plain text to the operator. Teletypes did not meaningfully change this. The chief innovation of UNIX as an operating system was that it leaned heavily into this limitation in its application design by introducing a concept known as pipelines. Because a program's input and output are both simple text, one program's output can be fed to another program as its input. A simple chain of commands could take input data, format it into a report, tabulate it, send the result off as an email or to a typesetter, and much more. In this way, simple tools can be composed into complex text-processing pipelines. Programming knowledge wasn't even necessary to write sophisticated pipelines; all that was needed was an understanding of what each individual program did. The composability afforded by pipelines allowed every single UNIX program to interface directly with any other program, making UNIX the first truly integrated development environment (IDE). Even today, users comfortable with the command line reach for the classic UNIX programs to process text for all sorts of purposes.
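
For instance, a pipeline built entirely from classic UNIX tools can count the ten most common words in a file (say, a hypothetical report.txt), even though no single program in the chain knows anything about word frequency:

    # Split the file into one word per line, lowercase it,
    # then count, rank, and keep the ten most frequent words.
    tr -cs '[:alpha:]' '\n' < report.txt |
        tr '[:upper:]' '[:lower:]' |
        sort | uniq -c | sort -rn | head -n 10

Each program does one small text-processing job and hands its output down the pipe to the next.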

The introduction of video terminals, the glass teletypes, broke this composability. Now UNIX had tools whose use could not be automated in a pipeline. Some text editors, like Vi and Emacs, could still apply pipelines to the text being edited, but at the cost of the user no longer typing commands as though at a command line. Were one to avoid TUI applications altogether, the glass teletype added little to the experience beyond a reprieve from refilling paper and replacing ink ribbons. Today, TUIs have little advantage over graphical user interfaces (GUIs). Because contemporary terminal emulators target the features of the VT100, support for mouse input, modern text-editing conveniences, and even simple things like wrapping words onto the next line when space runs out all had to be bolted onto the VT100's design in haphazard ways. The command line suffers for this as well. Even today, you can't use the mouse to edit a command before sending it, and it is impossible to drag and drop highlighted text back into the prompt for use in further commands.
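
Mouse support is a good illustration of the bolting-on. In an xterm-compatible emulator, a program has to opt in by sending an escape sequence the real VT100 never understood, and the clicks then come back interleaved with keystrokes as yet more escape sequences for the program to parse itself:

    # Ask the terminal to start reporting mouse clicks (an xterm extension).
    printf '\033[?1000h'
    # Click events now arrive on standard input as byte sequences of the form
    # ESC [ M <button> <column> <row>, mixed in with ordinary keyboard input.
    # Turn the reporting back off before exiting:
    printf '\033[?1000l'
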

The terminal's feature set has led it to an evolutionary dead end. If the terminal had no problems, no meaningful limitations, then a dead end would be a poetic fate. I believe, however, that the terminal is actively holding the command line back. Modern UX conventions would be invaluable on the command line, but they will remain unthinkable and impractical so long as we limit ourselves to an emulated VT100.

In biological evolution, species hit dead ends all the time. The solution to evolutionary dead ends is the very thing that enables evolution in the first place: divergence. Offspring pick up mutations, and the genes of separate populations slowly drift apart. After many generations, a species' descendant populations begin to look and behave very differently from one another. They come to occupy different ecological niches. Where one lineage may run into an evolutionary dead end, others will not.

When iterating to improve on an existing design, it is sometimes necessary to look into the past to understand the context that gave rise to the status quo. When it is no longer possible to iterate on a design, this hindsight allows one to ask whether alternatives are possible. In this sense, designers can do what evolution can't: turn back the clock and take another path.

If the terminal is a dead end as a way of presenting a command-line interface, that raises the question: what would happen if we could emulate a paper teletype instead of a glass teletype? What would a command-line interface look like, and how could it work differently, without the terminal?