In the future, web sites will design themselves

Web designers: I'm writing a book! Sign up to be notified here.
It's about web design in the age of A/B testing, web analytics and more!

I want to outline three key ideas based on web analytics that will underpin the future of design on the web, both in outcome and in practice, particularly for bigger, content-driven sites. These ideas are (i) the need for designers to understand and advocate for web analytics as we do web standards, (ii) the practice of designing for performance using web analytics, and (iii) the automation of web analytics in the design process.

Some of these ideas are already being used and some are yet to develop, but they are all certainly a long way from the mainstream, which is where I firmly believe they should be.

This essay could also be called “Why multivariate testing is the future of content and commerce driven design on the web”, but that is a far less interesting title!

Before I get to the first point, let me tackle one obvious objection, stemming from the title of this piece - that web sites don’t generate design ideas, so they aren’t going to ‘design’ themselves anytime soon. True, ideas and aesthetics need to come from humans—designers, ideally, in the professional context—but what I envisage is a system where those ideas are essentially fed into the ‘machine’, tested, evaluated, and either incorporated into the site or dispensed with. The system is, of course, only as good as the ideas fed into it, but if the software behind the web site is responsible for the testing, evaluating and implementing these ideas, the web site could be thought of as ‘designing itself’.

Allow me to elaborate on the three ideas that I see taking the web design profession another big step forward:

1. Understanding web analytics: Designers, meet data

The web design community has, by and large, successfully adopted a standard approach to building web sites over the past 5+ years.

Styles continue to come in and out of fashion at breakneck speed, and aesthetics and functionality continue to evolve and improve - design on the web as a visual and functional pursuit seems quite healthy.

We can build sites on solid foundations that look good. But we are largely ignorant of how they perform.

For some sites, it may not matter. They may perform well enough in spite of their design, they may perform well enough by sticking closely to a largely pre-determined, familiar structure (weblogs, for instance), or the web site may be a creative or personal expression of some kind where quantifiable performance is irrelevant.

For many businesses and organisations however, from publishers to retailers to service providers, performance matters. Major or minor changes to their web sites may have a dramatic effect on their bottom line. Changes may be in structure, copy or design, and there is a strong desire to measure and understand the effect of these changes on the performance of the site. Thus the mini-industry of web analytics was born to measure, report and analyse the performance of these web sites based on certain metrics.

Hopefully it won't be long before designers are equipped to play a similar role for the clients and organisations we serve.

Marketers have known about measuring performance for a long, long time - they talk about conversions, segmentation, cost per action, and that's just on the web. Measuring the performance of mail-outs, for instance, long pre-dates the web. Many designers no doubt have a passing familiarity with some of these concepts, but that passing familiarity needs to - and I think will - evolve into an intimate understanding of their role in the performance of the sites they design.

Take e-commerce, for instance. In e-commerce, it's easy to see macro effects on site performance - performance is defined by how much money the site makes. Sales go up, sales go down, or sales stay the same. Profitability likewise. But what designs perform best in this environment? What sort of pages, what sort of elements? Do we not have an obligation to know?

This is what web analytics can tell us, once we have the data, and that is why designers of most persuasions must embrace web analytics.

Some tools have emerged to aid us. We have seen heat maps and click maps, which are mildly interesting when analysing a page, and those silly session-recording tools that have popped up recently, which are, in my opinion, of near-zero value. However, we need much, much more.

The current state of web analytics

...is not a topic I’m really qualified to give a detailed analysis of, but as an outsider I’ll happily give my current impression of where things stand!

The web analytics industry is, to my mind (with some exceptions), roughly where the content management industry was before the (formerly) little players showed up and started eating away at the market from the bottom up.

Once you needed $100k and a team of engineers to publish your content online successfully; now you can get an enormous amount done with some extraordinary, low-cost tools that designers and other users wield with excellent results.

Currently, the web analytics market is dominated by the big players. Hopefully, in the coming years we will see similar bottom-up innovation in the web analytics sphere, making friendly tools available to the legions of designers and web professionals out there in the trenches, toiling away each day at the coal face of the web. (Pushing pixels - it's tough work!)

In fact, we may not have to wait years at all, thanks to the data nerds at Google. 

It’s not about statistics

Google Analytics is called Google Analytics for a reason.

Web stats packages are a dime a dozen - notice how many come free with your web hosting account? Mere global counts of users, pages or, dare I say it, 'hits' give you a rough idea of the popularity of your web site and overall trends, but how do they help you make better design decisions?

This is where web analytics comes in. For the sake of this essay I'll define web analytics fairly broadly as the analysis of raw data that helps stakeholders make informed decisions about their web sites (including the measurable marketing campaigns driving traffic to those sites), but the subset I'm particularly concerned with is the resulting metrics that help designers make better day-to-day design decisions.

The key thing to note is that it's not the mere reporting of numbers that counts; it's the analysis of that data and the resulting performance metrics (your KPIs and whatnot) that matter.

Currently, the analysis of those numbers is largely left up to marketing teams who understand user segmentation, landing page conversion rates and so on, or to the number crunchers who are tasked with providing reports to management.

In many cases this is entirely appropriate - marketing teams need to be able to monitor the performance of their campaigns, and sometimes deep, detailed analysis requires a real head for numbers (and Excel functions).

However, very relevant metrics, from bounce rates to click-through rates, are often simply far too removed from the hands of the designer.
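Part of the problem is that these numbers feel like a black box, when in fact they are simple ratios over raw visit data. Here's a minimal sketch of the idea - the data shape and sample figures are invented purely for illustration, not taken from any analytics product:

```typescript
// A minimal sketch of how common web analytics metrics fall out of raw visit data.
// The Visit shape and the sample figures below are invented for illustration only.

interface Visit {
  pagesViewed: number;   // pages seen during this visit
  reachedGoal: boolean;  // did the visit reach the goal page (e.g. a thank-you page)?
  clickedPromo: boolean; // did the visitor click the element being tracked?
}

// A "bounce" is a single-page visit.
const bounceRate = (visits: Visit[]): number =>
  visits.filter(v => v.pagesViewed === 1).length / visits.length;

const conversionRate = (visits: Visit[]): number =>
  visits.filter(v => v.reachedGoal).length / visits.length;

const clickThroughRate = (visits: Visit[]): number =>
  visits.filter(v => v.clickedPromo).length / visits.length;

// Made-up sample data:
const sample: Visit[] = [
  { pagesViewed: 1, reachedGoal: false, clickedPromo: false },
  { pagesViewed: 4, reachedGoal: true,  clickedPromo: true  },
  { pagesViewed: 2, reachedGoal: false, clickedPromo: true  },
];

console.log(bounceRate(sample));       // ~0.33
console.log(conversionRate(sample));   // ~0.33
console.log(clickThroughRate(sample)); // ~0.67
```

The point isn't the arithmetic - it's that once a designer knows what goes into a bounce rate or a click-through rate, those numbers stop being someone else's report and start being design feedback.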

Google Analytics has done a lot to change this, especially with its fantastic mid-2007 update, which made tracking bounce rates, conversion goals and, if you're really tricky, click-through rates (tag your links, people!) readily available. However, there is still a long way to go, for several reasons.

Firstly, not enough designers have experience monitoring the performance of their designs in hard numbers to accurately predict whether changes will positively or negatively affect an existing site - that is, whether what they are being paid to do will actually be a step forwards or a step backwards. That's a worry, don't you think?

Perhaps it is unfair to ask designers to gaze into a crystal ball of web site performance because, secondly, there aren't enough readily available tools to help them with the task of designing to improve performance - I'll discuss this further in the third section. There is one obvious exception, however, again freely available from our data overlords at GOOG, which I will get to shortly.

Thirdly, web analytics is generally not a part of the designer-client relationship. A design is finished, a site is built or redesigned, a final amount of money changes hands, the end.

2. Designing for performance: Measure, test, improve

The second idea I want to look at is how one goes about designing for performance.

As a designer, there comes a point where you have what seem to be equally good design ideas. Or ideas that you think are good but that turn out to be ultimately bad - sometimes catastrophically so. Conversely, some ideas that are dismissed for whatever reason - by management, by the design team, by your own skepticism - may turn out to be quite beneficial.

Wouldn’t it be good if there was a way you could suspend final judgement and test these ideas out?

You could 'experiment' by making the changes live and site-wide, crossing your fingers, hoping for the best and seeing what happens, but this is both risky and ineffective. Bad Things could happen, and if you make multiple changes you have no idea exactly what caused problems (or improved performance), and thus no easy way of undoing those changes without reverting entirely to what you had before - hoping that the damage wasn't permanent, that performance recovers to where it was, and that you don't lose too much face in the process!

Perhaps that is not the ideal way to go.

We could experiment in a more meaningful way by implementing one change, showing it only to a discrete group of our audience, and measuring what effect, if any, it has on the performance of the site for those users versus our unchanged 'control' group. This is called A/B testing, and it is a slow and steady way of making improvements. It requires patience to record results over time and discipline to keep the experiment clean - if more variables enter the equation, the results from your testing can quickly become meaningless.
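Mechanically, an A/B test is nothing exotic: assign each visitor consistently to one of two groups, show each group its version, and compare a single metric. A rough sketch of the idea - the function names, hashing scheme and figures are mine, not any particular tool's API:

```typescript
// Rough sketch of A/B test mechanics; names, hashing and figures are illustrative only.

// Assign each visitor to the same group every time they return.
function assignGroup(visitorId: string): "control" | "variant" {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "control" : "variant";
}

interface GroupResult {
  visitors: number;
  conversions: number;
}

const rate = (r: GroupResult): number => r.conversions / r.visitors;

// Made-up results after letting the experiment run:
const control: GroupResult = { visitors: 5000, conversions: 150 };
const variant: GroupResult = { visitors: 5000, conversions: 180 };

console.log("control:", rate(control)); // 0.03
console.log("variant:", rate(variant)); // 0.036
// A real analysis would also check statistical significance before calling a winner.
```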

Better, but still not great.

What if we had multiple elements we wanted to test, and we wanted to see how they worked in combination, rather than in isolation as in A/B testing? Enter multivariate testing (MVT). With MVT you can test combinations of multiple elements all at once, letting them fight it out until a winning combination emerges. It does require a certain amount of traffic to derive meaningful results, but it is an exceptionally powerful way to test design changes, and combinations of changes, in order to optimize the performance of a design.

It also fundamentally changes the designer’s relationship to the finished product.

No longer is the design something set in stone until a redesign occurs somewhere down the track, nor is it tweaked in an ad-hoc way as needs change. It is an organic, ongoing experiment that is constantly being optimized for the performance of a given metric (or metrics), and is dependent only on the supply of ideas to improve the performance of the design.
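Under the hood, MVT is conceptually simple even if the statistics get involved: enumerate every combination of the variations being tested, serve each visitor one combination, and tally the chosen metric per combination. A hypothetical sketch - the element names, variations and tallying are all invented for illustration:

```typescript
// Hypothetical sketch of the core of a multivariate test; everything here is illustrative.

// The elements and variations being tested.
const elements: Record<string, string[]> = {
  headline:   ["Buy now", "Start your free trial"],
  heroImage:  ["product-shot.jpg", "happy-customer.jpg"],
  buttonText: ["Add to cart", "Get started"],
};

type Combination = Record<string, string>;

// Build every combination (2 x 2 x 2 = 8 here).
function allCombinations(spec: Record<string, string[]>): Combination[] {
  return Object.entries(spec).reduce<Combination[]>(
    (acc, [name, options]) =>
      acc.flatMap(combo => options.map(option => ({ ...combo, [name]: option }))),
    [{}]
  );
}

// Tally how each combination performs as traffic comes in.
const tally = new Map<string, { shown: number; converted: number }>();

function record(combo: Combination, converted: boolean): void {
  const key = JSON.stringify(combo);
  const entry = tally.get(key) ?? { shown: 0, converted: 0 };
  entry.shown += 1;
  if (converted) entry.converted += 1;
  tally.set(key, entry);
}

// The current leader is simply the combination with the best conversion rate so far.
function currentLeader(): string | undefined {
  let best: string | undefined;
  let bestRate = -1;
  for (const [key, { shown, converted }] of tally) {
    const r = shown > 0 ? converted / shown : 0;
    if (r > bestRate) {
      bestRate = r;
      best = key;
    }
  }
  return best;
}

console.log(allCombinations(elements).length); // 8 combinations competing with each other
```

The combinatorial explosion is exactly why MVT needs decent traffic - even this small example splits your audience eight ways before any single combination has seen enough visitors to be trusted.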

What does it look like in practice?

In some cases, Google's free Website Optimizer (alluded to earlier) will fit the bill perfectly for MVT. Google says its “free multivariate testing application helps online marketers increase visitor conversion rates and overall visitor satisfaction by continually testing different combinations of site content (text and images).”

Note that they are pitching to the marketing folks, and this is reflected in the way Google Website Optimizer works - it's very much oriented around a given test-page-to-goal-page conversion, which is fine in that context, but designers really need something broader.

3. Automating performance-based design - a necessary feature in the future content management toolkit

Here’s where the content management guys come in.

When it comes to the web, designers have become quite technically skilled over the years, and technical tools such as Content Management Systems (CMSs) have made it possible to achieve individually what previously took a dedicated team of specialists.

As the CMS market continues to mature, I hope that designers will start asking for ways to integrate web analytics into their sites.

Why? If we want to do the most effective form of testing - multivariate testing - we need to set up experiments within our own templating system, and the CMS is the logical place to do that. The backend software is the perfect environment to serve up these multiple variations and track what happens. What's the click-through rate like for front page news stories? What's the most effective form of navigation for a given situation? How many article clicks come from that fancy footer you designed? Do images - and what kind of images - increase or decrease conversion rates?
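To make that concrete, here is a hypothetical sketch of what a CMS-level experiment helper might look like: a template asks for an element by experiment name, gets back one variation to render, and the system records impressions and clicks per variation. None of this is any real CMS's API - it's just the shape of the idea:

```typescript
// Hypothetical sketch of a CMS templating hook for serving and tracking variations.
// This is not a real CMS API - just the shape of the idea.

interface Experiment {
  name: string;
  variations: string[]; // template fragments, partial names, etc.
}

const experiments: Record<string, Experiment> = {
  frontPageFooter: {
    name: "frontPageFooter",
    variations: ["footer-plain-links", "footer-with-thumbnails"],
  },
};

// In-memory tallies; a real system would persist these alongside its analytics data.
const impressions = new Map<string, number>();
const clicks = new Map<string, number>();

// Stable assignment: the same visitor always sees the same variation of an experiment.
function variationIndex(exp: Experiment, visitorId: string): number {
  let hash = 0;
  for (const ch of visitorId + exp.name) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % exp.variations.length;
}

// Called from the template: returns which variation to render and logs the impression.
function renderExperiment(name: string, visitorId: string): string {
  const exp = experiments[name];
  const index = variationIndex(exp, visitorId);
  const key = `${name}:${index}`;
  impressions.set(key, (impressions.get(key) ?? 0) + 1);
  return exp.variations[index];
}

// Called when a tracked click comes in for that element.
function recordClick(name: string, visitorId: string): void {
  const exp = experiments[name];
  const key = `${name}:${variationIndex(exp, visitorId)}`;
  clicks.set(key, (clicks.get(key) ?? 0) + 1);
}

// Click-through rate per variation = clicks / impressions for that variation's key.
```

From there, the click-through rate of each footer variation is just clicks divided by impressions - exactly the kind of number the designer of that fancy footer should be looking at.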

Remember, it's not just stats we are after - it's key performance metrics and the ability to test our designs that we want.

I would love to see all kinds of content management systems in the future come packaged with the ability to create the testing environment required for ongoing MVT, along with a dashboard that tracks the 'vitals' of a web site's health in terms of key metrics, much as doctors and nursing staff monitor a patient's vitals in a hospital setting. Maybe without the sporadic beeping, though.

It's certainly not just about design, either. The effectiveness of web site copywriting, from navigation terms to article content, also plays a crucial (though sadly often overlooked) role. Are you using the most effective headings, leads or trigger words, for example? Throw them into the MVT mixer and see what comes out!

In the future, rather than rushing to implement a design idea that looked good on paper, or conversely agonising over a controversial option, I envisage design changes being fed into the backend software of a site, where they are added to the ongoing optimization experiment that is the design of the site, and the software will tell us what is most effective. It will do so because it can deal with far more variables, far more accurately, than our spongy grey matter ever could. It will of course be set up to serve us: the most effective combinations will perhaps be automatically added to the site for all users, new ideas will continue to be fed in as they arrive, and the experiment and optimization will continue in an ongoing fashion.
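The "automatically added to the site" part could be as simple as the sketch below: once a challenger combination has been shown enough times and clearly beats the current default, promote it. The thresholds and names are invented, and a real system would use a proper significance test rather than a crude lift cut-off:

```typescript
// Sketch of automatically promoting a winning combination; thresholds are arbitrary
// and a real system would use a proper statistical test, not a crude lift cut-off.

interface ComboResult {
  combo: string;    // e.g. a serialised combination of element variations
  shown: number;
  converted: number;
}

function maybePromote(
  current: ComboResult,
  challengers: ComboResult[],
  minSample = 1000, // don't trust a combination until it has been shown this many times
  minLift = 0.1     // require at least a 10% relative improvement over the incumbent
): string {
  const rate = (r: ComboResult) => (r.shown > 0 ? r.converted / r.shown : 0);
  let best = current;
  for (const c of challengers) {
    if (c.shown >= minSample && rate(c) > rate(best) * (1 + minLift)) {
      best = c;
    }
  }
  return best.combo; // becomes the new default served to everyone
}

// Made-up results:
const newDefault = maybePromote(
  { combo: "headline-A/image-A", shown: 8000, converted: 240 }, // current default, 3%
  [
    { combo: "headline-B/image-A", shown: 1200, converted: 48 }, // 4% - enough data, clear lift
    { combo: "headline-B/image-B", shown: 900,  converted: 40 }, // not enough data yet
  ]
);

console.log(newDefault); // "headline-B/image-A"
```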

Some experiments may be so common that the software could almost perform them itself, particularly little things like the number of content items or links in any given element (ten news items or five?), and then work out which variations produced positive performance outcomes in combination with the other elements being tested live on the site.

It's really the ability of software to determine which combinations of elements perform best that holds so much potential for designers, as it would otherwise be nigh-on impossible to do manually - what text variation with what picture variation, for instance? Why debate it - throw it into the system and have the users tell you what they prefer.

It is worth keeping the obvious in mind, however - if you start with a dud idea, no amount of optimization is going to turn it into a good one.

Nevertheless, when you consider the power of this evidence-based approach to design founded on multivariate testing, and compare it to the 'cross your fingers and hope for the best' method that is so often employed, it should send a shiver down your spine.

We just need the tools!

Finally...

Maybe the marketing guys (or other designers!) will laugh and say, “You are just discovering this now?” But better late than never, eh?

I’m also sure there are a number of big web sites that have been doing this for some time, but there were (and still are) big publishing systems before MovableType came along and endeared itself to designers everywhere back in the day.

The point is that understanding web analytics - not just what we can do now, but what is possible with further software development in the area - is crucial to the increasing professionalism of the modern web designer, and indeed the industry as a whole. As far as a client is concerned, a deep understanding of web analytics can act as a far greater competitive advantage than arguing for, say, the vagaries of accessibility ever could. It also paves the way for sharing the results of experiments, allowing best practices to emerge not just from anecdotal evidence but from the careful observation and reporting of findings across the globe.

Consider online news design, for instance, a topic that has been covered on this site before, and imagine all news sites as giant, ongoing science experiments that published their results - it might be wishful thinking, but it illustrates the incredible value that could be unlocked and unleashed, fed back into the machine so to speak, and further improved upon.

If the ideas I outlined at the start of this essay are adopted - they are (i) the need for designers to understand web analytics, (ii) the practice of designing for performance using web analytics and (iii) the automation of web analytics in the design process - then it should be a win for all involved.

The users win, as they are the ones ultimately determining the best combination of elements through ongoing multivariate testing. Businesses and clients win because they can steadily improve the effectiveness of their web sites in measurable ways with the guidance and skill of a savvy designer. And designers win because, when every site is an ongoing experiment, we can build up over time a mental (and testable!) library of winning combinations - a design sense that combines creative intuition with empirical data - and we can take the lessons of each of these experiments to our next project.


Thanks for clearing up the idea of “sites designing themselves”. I see Web Analytics and Design as being able to greatly aid each other in accomplishing a common goal.

I am actually a designer who is enrolled in an “Award of Achievement in Web Analytics” program starting in the fall. I just want to get an understanding of analytics (I love stats), keep designing sites and continue to learn the art of optimization. Excellent read!

Luc

- Luc Arnold on 06 July 2007

Interesting, and indeed ongoing. I’m working as a designer in a company where we have the CMS in a sister company, and also another sister company doing analytics, and we all try to work together. What I do find is that the task of managing a web case is getting ever more complex, and people are not up to it. Either they drop out and follow the old ways, or we end up with sites that are made for Google, and not for the users - which is my ultimate fear. Treating all web projects as one, with one common goal will lead to the same degradation we see in general. Less to choose from, less to experience, and uniform life everywhere…

- BK on 06 July 2007

I must say it's a pretty long read, but I enjoyed it, and I will recommend this post to some friends who are clearly at the beginning of the CMS rage. Nice work!

- Victor on 06 July 2007

Great post.

- benry on 06 July 2007

I think it just seems like a long read because the line height is too much. Having it spread out like this causes the brain to fatigue faster, as you have to scroll more and less info can be processed in one page scroll.

- david on 09 July 2007

Maybe his analytics package will be able to magically detect that the line height is too tall and redesign the page to provide a better reading experience. I jest, as I know that's not the point of this. But it does point out that there are some things an analytics package cannot analyze. In this case, with a bit of eyestrain from a line-height issue on an informational page, what is the metric that would be analyzed? Return visitor rate? If the content page is the end destination, with no real need to seek further information on other pages in the site or take further action that would generate a metric, there might not be much for an analytics package to pick up on to indicate there was a problem… and even if it did, it would take a designer to look at the page and say, ah!, maybe I should play with the line height. One could pursue a multivariate guess-and-check approach to line height, I suppose. In any case, just saying this might be a situation this theory doesn't completely have an answer for. Interesting thoughts, though.

- Marc on 10 July 2007

Thanks for the comments all, and David/Marc, that’s actually quite an interesting point.

There actually are relevant metrics you could use to test if design changes influenced user engagement - time on page would be the obvious one!

It's a good case in point - you could theoretically test the difference between different line-height/font-size combos and, with enough traffic and time, measure whether users seeing one combo or another spent more or less time on the page (presumably reading).

Of course the analytics software wouldn't pre-empt this, but if users complained about it and a designer was curious, you could run an experiment to find out, rather than just making an arbitrary call one way or the other.

This is what takes us beyond very general 'best practices' to specific, evidence-based design gleaned from experiments performed on your own audience, so you can determine exactly what they prefer (on the whole). If there's no difference in the numbers, it's up to the designer to use their discretion, but if there is a difference, then there's your answer - case closed! :)

- Luke Stevens (author) on 10 July 2007

"general ‘best practices’”

My assumption would be that these general best practices have their root in well researched evidence, or they would not necessarily be ‘best practices’ they would rather be, ‘best guessed practices’ =)

- david on 10 July 2007

Hey there,
I read through most of your article above, and a company that I was just introduced to today popped into my head - they simulate new sites, almost like a videogame… check out spigit.com.

I’m looking forward to knowing more about them soon, especially the metrics side of things as you’ve mentioned here.

- Tyler on 14 July 2007
