Imagining the future of web design

A 7-minute read, written by Frederick, December 14, 2015

In a world where technology evolves at warp speed, it's nearly impossible to keep pace with the state of the art, much less get ahead of the j-curve of technological advancement. To forge a clear path forward into the exponential expanse requires a vision that sees beyond the specifics of the moment to the larger trend behind them. What are some fundamental observations we can make about the state and direction of technology — specifically in the context of the work we do?

Technology is abstraction

One such observation is that technologies tend to add a layer of abstraction. Examples are ubiquitous: hammers, levers, and knives are all abstractions of human capacities to pound, push, or bite. Horseback riding is an abstraction of walking. Horse-drawn carriages are an abstraction of horses. Horseless carriages (cars) are an abstraction of the horsed variety.

This means that code will likely continue to move further into the background. Everyone knows that computers run on ones and zeroes, but few people ever write binary machine code directly; it is all but unreadable by humans. Generally speaking, machine code is generated from assembly language, which is in turn generated from higher-level languages (e.g. C, C++), and so on. Most of the web is written in high-level scripting or interpreted languages that sit atop this stack of abstractions. Since the tendency has been for more human-friendly layers of abstraction to emerge as interfaces for progressively lower levels of code, it seems safe to predict that this trend will continue.

How far could this trend go? The dream of artificial intelligence has been that a human might just dictate thoughts aloud in natural language and a computer would be able to interpret them into lower-level codifications as necessary. But is natural language the best option? Describing every detail of a website in natural language might turn out to be just as tedious as writing code, and the point of new technologies should be to reduce work. A high-level abstraction of web design tools and technologies should enable a fluid flow from identifying system requirements to interface design, to the generation of production-ready code, without requiring any specific knowledge of lower-level programming languages.

Levels of abstraction in computer languages.
Computer coding languages stack on top of one another in a system where each layer increases in abstraction.

Music as a metaphor

Another observation that can be made about the rise of technology is that the more abstract a system becomes, the more important the role of metaphor: models of understanding matter more than the details of the technology itself. What metaphors might exist to make web design and development a simpler, more human task?

Pondering this question, musical instruments came to mind as an example. Classically, musical instruments were designed to be played by the human body. Their exact dimensions, functions, and methods of use were all dictated by what people could do with their hands, feet, and mouths.

When methods of digital music production started to emerge, a lot of this design heritage was temporarily forgotten. Early digital music interfaces were horribly difficult to use. Whereas acoustic and analogue electronic instruments were built such that their entire range of output could be immediately accessed by a skilled player, digital instruments had most of their functionality hidden away in virtual menus that could only be accessed via tiny screens. While digital technologies vastly expanded the capacity for musicians to make and control hitherto unknown realms of sonic possibility, they lacked strong interfaces to enable that capacity.

Using MIDI, a single performer could conceivably play any number of instruments, but in practice this meant long hours of painstaking coding and programming of machines. For musicians who were used to the immediacy and expressiveness of more classical forms of instrumentation, producing music in this way was painful and tedious. Thankfully, in recent years there has been a growing awareness of this problem, and a new wave of musical interfaces has emerged to pick up the slack.

Most of these new interfaces implement a system called MIDI over USB, which makes it easy to connect a “controller” directly to a computer as a means of inputting and controlling musical information. Many of these USB controllers are modelled after the keyboard metaphor — with white and black keys like a piano — along with a range of knobs and dials for controlling various aspects of a given sound. Other interfaces have more task-specific designs, with grids of buttons for triggering and sequencing sounds, or rows of sliders for mixing sounds together.

Regardless of their specific form, these new controllers are another example of a technology that leverages metaphor and abstraction to make a tedious digital production task fluid and expressive. Could something like this be done for web design?

Flowchart: The current design and development process
The typical system for moving from design to production-ready code is quite complex.
Flowchart: The streamlined process
As design tools evolve, we should be able to cut out a lot of the middle steps involved in producing code.

Prototyping

Thinking about the future is fun, but building the future is even more fun. The above observations got me wondering if I might be able to build a prototype of such a system — a tactile interface for web design.

Prototypes are tools for understanding a problem. In a sense, a prototype is built to fail, so that you can find all the points where the system fails before you build the end product. As such, early prototypes should always be made at a low cost and low fidelity. Whenever possible, they should be built with tools and materials already on hand. Furthermore, they should be made in such a way that they can be rapidly iterated upon, so that as you learn from the failures you can adjust and rebuild.

I started wondering what I might have on hand that could facilitate prototyping this system and remembered that I have a little USB MIDI controller that might just do the trick. Another thing to remember when making prototypes is that you should keep them simple; it’s easier to learn from them if you test a small set of ideas at a time. These are the questions I hoped to answer by building a prototype:

  1. Is it even possible?
  2. Is the idea actually useful?
  3. Is the idea attractive to other people working in the field?

Answering the first question was going to be simple: if it works, it works.

This is the list of system requirements I thought would be necessary and feasible for the first prototype:

  1. Capture MIDI input over USB.
  2. Output CSS values to a .css file in response to MIDI input.
  3. View output of CSS in web browser(s).
  4. Automatically refresh the browser when the CSS file changes.

I had to do a bit of learning to meet some of those requirements. The first was not something I’d done before, but I was pretty sure it could be done using Processing, a programming language and environment designed for artists making interactive work. I knew the second could also be done in Processing. The third is basic, fundamental to web design and development — a no-brainer. The final requirement was pretty easy too: I found some JavaScript online that automatically checks for changes to the CSS file and refreshes the page.
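
To give a sense of how these pieces fit together, here is a minimal sketch of the core of such a Processing program. This is an illustration rather than my actual source files: it assumes the MidiBus library for MIDI input, and the controller numbers, CSS values, and output file name are arbitrary placeholders.

  import themidibus.*;  // The MidiBus library handles MIDI input over USB (assumed dependency)

  MidiBus midi;
  float fontSize = 16;   // values driven by the controller
  float hue = 210;

  void setup() {
    MidiBus.list();                  // print the available MIDI devices to the console
    midi = new MidiBus(this, 0, 1);  // input device 0, output device 1 (adjust to your setup)
    writeCss();
  }

  void draw() {
    // Nothing to render on screen; the sketch just listens for MIDI and rewrites the stylesheet.
  }

  // Called by MidiBus whenever a knob or slider sends a control change (values 0-127)
  void controllerChange(int channel, int number, int value) {
    if (number == 1) fontSize = map(value, 0, 127, 10, 48);   // knob 1: font size in px
    if (number == 2) hue      = map(value, 0, 127, 0, 360);   // knob 2: hue in degrees
    writeCss();
  }

  void writeCss() {
    String[] css = {
      "body {",
      "  font-size: " + nf(fontSize, 1, 1) + "px;",
      "  color: hsl(" + int(hue) + ", 70%, 40%);",
      "}"
    };
    saveStrings("output.css", css);  // overwrite the stylesheet on every change
  }

Point the test page's stylesheet link at output.css (or serve the sketch folder over a local web server), pair it with the auto-refresh script, and the loop is closed: turn a knob, and the page updates.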

A couple of quick tests later, I had confirmed that Processing would be able to both read MIDI input and write values out to a CSS file, so I knew I was ready to start prototyping.

In order to test whether the system might be useful, I started by focusing on some basic typographic styling. To make it a bit more interesting, I also added some controls for colour.
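
To make that concrete, the stylesheet the prototype writes out looks something like this (the values and properties here are illustrative; the exact mapping is whatever the sketch assigns to each knob and slider):

  /* output.css, regenerated every time a knob or slider moves (illustrative) */
  body {
    font-family: Georgia, serif;
    font-size: 18.5px;
    line-height: 1.4;
    letter-spacing: 0.5px;
    color: hsl(210, 65%, 35%);
    background-color: hsl(40, 30%, 96%);
  }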

The final test, regarding whether the idea is interesting to others in the field, is up to you, dear reader. If you have any thoughts or feedback please leave a comment below!

The prototype in action
An image of the prototype in action. In the foreground you can see the MIDI controller keyboard, which is used to control the design of the webpage shown on the screens behind.
NOTE: Real-time updates to the CSS on the server enable live feedback on multiple devices, allowing you to test your output on multiple browsers, devices, and viewports simultaneously.

Results and next steps

Eureka! The prototype works as planned and has given me answers to the questions I set out to explore.

I’ve confirmed that the basic concepts all work, and I think the prototype is worth further development. However, what I’ve built so far is of limited practical use. I do have some ideas for how it might be made more useful by expanding the system into a tactile interface capable of controlling the full range of possibilities afforded by CSS.

Here are some ideas for how I’d like to take this project further:

  • Start an open source project on GitHub.
  • Interface with a Sass (SCSS) preprocessor so that MIDI input can adjust its variables (e.g. Bootstrap’s customization variables: http://getbootstrap.com/customize/).
  • Develop a robust conceptual framework for the styling of elements and patterns that can be decoupled from the code interface. Ideally, a designer shouldn’t have to think about code.
  • Experiment with Google Chrome’s Web MIDI support to directly control parameters in the element inspector.

What do you think?

Am I crazy? Or could this work? If you have any thoughts or questions please leave them in the comments below. If you would like a copy of my source files, please send me an email.