Continuous improvement for long-term success
Life is motion; nothing is ever truly still.
Even the greatest mountain is in the process of either being extruded from the earth, or being worn down by the elements.
With everything we do we make this choice: to create growth, or to allow decay.
Neither option is better than the other.
But we do make the choice, whether we notice it or not, about everything.
Including websites.
Three metaphors for a website
When you create a website or app or any other digital product, you can think of it in one of three ways.
A toaster
When you get a toaster, you put it on the bench, you use it every day, you expect it to work, and when it breaks you feel annoyed. Toasters are simple! They shouldn’t break!
A car
On the other hand, when you buy a car, you resign yourself to the fact that every six months it’ll need a service. Fair enough: it has lots of moving parts, there’s a lot of wear and tear. But if you keep your part of the maintenance bargain, you feel like you should be able to cruise around town for as long as you like.
Tom Cruise in Edge of Tomorrow
If you haven’t seen this movie, Tom Cruise plays a future soldier battling an alien army. He discovers that every time he is killed in battle he travels back in time and gets another chance at the fight — meaning that every day he gets the opportunity to improve. He keeps trying different strategies, dying over and over again, but steadily refining his approach, until finally he is a kick-ass super-soldier who turns the tide against the overwhelming alien army.
Those are the three ways you can see a web project.
If it’s a toaster, you’re going to be disappointed when it breaks, because it will.
If it’s a car, you’ll maintain it, but the world will keep evolving around you.
But if you see a website as Tom Cruise in Edge of Tomorrow, then you have a chance to build an alien-slaughtering powerhouse through a process of continuous improvement.
What does continuous improvement look like in this context?
The process is simple.
On all our web projects, we collect user behaviour data. This data is measured against site-specific goals and continuously analysed for insights and opportunities.
As often as possible — ideally each week — we discuss work in progress with our clients, review the data and talk through opportunities. Any improvement ideas we add to an ongoing list of possibilities.
About once a month we have a longer meeting where we make a more detailed study of the data, prioritise the ideas and choose the most promising candidate for an experiment.
The experiments are very goal-oriented. From our interpretation of the data we create a hypothesis (an “If this… then that…” statement) and design an appropriate test.
Usually we will test variations against a control using Google Analytics’ A/B testing feature, and if we can find a clear winner then we deploy the successful idea and repeat the cycle.
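For a sense of what a “clear winner” means in practice, here’s a minimal sketch of the kind of comparison involved: a two-proportion z-test between a variant and the control. It’s illustrative only — Google Analytics did this analysis for us, and the visitor and conversion counts below are hypothetical.

```typescript
// Minimal sketch of a "clear winner" check: a two-proportion z-test comparing a
// variant against the control. Visitor and conversion counts are hypothetical.

interface VariantResult {
  name: string;
  visitors: number;    // users who saw this variant
  conversions: number; // goal completions recorded for this variant
}

function zScore(control: VariantResult, variant: VariantResult): number {
  const p1 = control.conversions / control.visitors;
  const p2 = variant.conversions / variant.visitors;
  // Pooled conversion rate under the null hypothesis that both perform the same
  const pooled =
    (control.conversions + variant.conversions) /
    (control.visitors + variant.visitors);
  const standardError = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors)
  );
  return (p2 - p1) / standardError;
}

const control: VariantResult = { name: "control", visitors: 5000, conversions: 150 };
const variant: VariantResult = { name: "variation-a", visitors: 5000, conversions: 210 };

const z = zScore(control, variant);
// |z| > 1.96 is roughly a 95% confidence threshold for a two-tailed test
console.log(`z = ${z.toFixed(2)}, clear winner: ${Math.abs(z) > 1.96}`);
```

The same comparison works whether the goal metric is a conversion, a bounce, or any other per-visitor yes/no outcome, as long as both variants see comparable traffic.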
While this process is consistent, specific improvement projects vary widely, so below are three examples.
Reducing bounce rate and increasing conversions on IUA’s Deals page
Context: Isuzu UTE has a page devoted to the latest deals on particular models of vehicle.
Data: The Deals page was getting more traffic than any other part of the site on mobile, but it also had the highest bounce rate.
Interpretation: A lot of the behavior could be explained by the fact that many people simply want to check the current deals and then leave; they have no intention of exploring the site further. But the conversions were low enough that we thought something else was at work, and we suspected it was the page length: on desktop the page seemed to have the right information density, but on mobile the content was spread over one seemingly endless page. It felt like it was just taking too long to scroll from deal to deal.
Hypothesis: If we shorten and compress the Deals page on mobile, we will reduce the bounce rate and increase conversions.
Test: We created three variations of the Deals page and had them randomly served up through Google Analytics, with bounce rate as the key metric.
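As a rough illustration of what “randomly served up” involves, here’s a sketch of sticky variant assignment. Google Analytics’ experiment tooling handled the actual serving for us, so the variant names, cookie name and expiry below are invented for the example.

```typescript
// Illustrative sketch of sticky random assignment to one of three page variations.
// Variant names, cookie name and the thirty-day window are hypothetical.

const VARIANTS = ["control", "compact-deals", "tabbed-deals"] as const;
type Variant = (typeof VARIANTS)[number];
const COOKIE_NAME = "deals_page_variant";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function assignVariant(): Variant {
  // Returning visitors keep the variant they were first assigned, so each person
  // sees a consistent experience for the life of the experiment.
  const existing = readCookie(COOKIE_NAME);
  if (existing && (VARIANTS as readonly string[]).includes(existing)) {
    return existing as Variant;
  }
  const chosen = VARIANTS[Math.floor(Math.random() * VARIANTS.length)];
  const thirtyDays = 60 * 60 * 24 * 30;
  document.cookie = `${COOKIE_NAME}=${encodeURIComponent(chosen)}; path=/; max-age=${thirtyDays}`;
  return chosen;
}

// The chosen variant drives which layout is rendered and is reported to analytics
// alongside bounce and engagement events.
console.log(`Serving Deals page variant: ${assignVariant()}`);
```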
Results: We had two clear winners in a month. We took what we regarded as the best elements of each winning variation to create a hybrid design, and deployed it to the live site.
The new design achieved a 20% reduction in bounce rate, which is huge given the traffic on IUA’s website. This experiment also provided insights into another aspect of user behavior and calls to action across devices, which formed the basis of another optimization experiment.
Creating a conversion path for Mr Rental
Context: Mr Rental had a referral program where customers could get a $50 voucher for referring friends, but conversions were really low and they asked us if we had any ideas.
Data: The referral program relied on SMS and we didn’t have much data about customer behavior, but we did have lots of Mr Rental campaign-based data that clearly showed smaller “everyone wins” incentives were more effective than larger “you have one chance” incentives.
Interpretation: We combined the campaign data’s suggestion that small guaranteed incentives were effective with the fact that we could do precise targeting of customers on Facebook.
Hypothesis: If we target people carefully on Facebook and offer the same modest incentive, we will get higher conversion rates than the referral program.
Test: We adapted and streamlined elements from the referral program, then began a careful segmentation of our Facebook audience. We targeted likely prospects on their birthdays, essentially sending them a personalized gift voucher redeemable during that month.
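Purely to illustrate the segmentation logic, here’s a hypothetical sketch of picking the birthday audience and generating month-long vouchers. The data shape, field names and voucher value are invented; the real targeting ran through Facebook’s advertising tools.

```typescript
// Illustrative sketch of birthday-based segmentation. Customer shape, field
// names and voucher value are invented for the example.

interface Customer {
  id: string;
  birthday: string; // ISO date of birth, e.g. "1985-06-12"
}

interface BirthdayVoucher {
  customerId: string;
  amountDollars: number;
  redeemableMonth: string; // e.g. "2024-06"
}

// "1985-06-12" -> 6; slicing the string avoids timezone surprises with Date parsing
function birthMonth(isoDate: string): number {
  return Number(isoDate.slice(5, 7));
}

function birthdayVouchersForMonth(
  customers: Customer[],
  year: number,
  month: number, // 1-12
  amountDollars = 50 // voucher value is illustrative only
): BirthdayVoucher[] {
  const redeemableMonth = `${year}-${String(month).padStart(2, "0")}`;
  return customers
    .filter((c) => birthMonth(c.birthday) === month)
    .map((c) => ({ customerId: c.id, amountDollars, redeemableMonth }));
}

// Everyone with a birthday in June 2024 gets a voucher redeemable during that month.
const audience: Customer[] = [
  { id: "c1", birthday: "1985-06-12" },
  { id: "c2", birthday: "1990-11-03" },
];
console.log(birthdayVouchersForMonth(audience, 2024, 6));
```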
Results: The conversion rate for these birthday vouchers has been 80%, opening the way for a range of new Mr Rental Facebook advertising ideas.
Increasing conversions by streamlining Devine’s corporate banners
Context: Devine are a bit different to the rest of our clients. While they do have a corporate website, a lot of marketing work is done by an extended ecosystem of landing pages and microsites, each related to a specific geographic location.
Ordinarily customers might Google their way to the central corporate site, but during specific marketing campaigns they are often directed to a relevant microsite. However, if a customer lands on a microsite and they aren’t interested in that particular offer, a prominent link directs them to the corporate site where they can explore other options.
Data: The weird thing we saw was that during campaigns, plenty of people were using the link from the campaign landing page to get to the corporate site, but overall conversions on the corporate site went down substantially at the same time.
Interpretation: We had a few theories, but one of them hinged on the design of banners on the corporate site. What we thought was this: all the microsites had a consistent banner design, with a big simple image and a contact form integrated into the right-hand side of the banner. Meanwhile the corporate banners all looked wildly different from each other, had very different offers, and the enquiry form was located elsewhere on the page. We wondered if the issue was that visitors arriving from the landing page were disoriented by the differences between the designs.
Hypothesis: If we make the design of all corporate banners consistent with the design of the microsite banners, we will increase conversions during campaign periods.
Test: This change sounds simple, but it wasn’t. First, we needed to define a design standard that was consistent with the microsites but also allowed for the needs of the corporate site. We needed to produce a style guide, because banners for the corporate website were sometimes produced by another agency. Then we needed to modify the code on the corporate site to both integrate the enquiry form into the banner and move other navigation elements out of the way. Only after all that could we actually run a test.
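To give a sense of what “integrate the enquiry form into the banner” can look like, here’s a hypothetical sketch of a single banner component shared across the corporate site and the microsites. The markup, class names and endpoints are invented for the example; the real implementation also leaned on the style guide mentioned above.

```typescript
// Hypothetical sketch of one shared banner component: a big image with the
// enquiry form built into the banner itself, styled by a shared stylesheet.
// Class names, fields and endpoints are invented for the example.

interface BannerConfig {
  heading: string;
  imageUrl: string;
  formAction: string; // endpoint the enquiry form posts to
}

function renderBanner({ heading, imageUrl, formAction }: BannerConfig): HTMLElement {
  const banner = document.createElement("section");
  banner.className = "campaign-banner"; // one shared class, one shared stylesheet

  const image = document.createElement("img");
  image.src = imageUrl;
  image.alt = heading;

  const form = document.createElement("form");
  form.action = formAction;
  form.method = "post";
  form.className = "campaign-banner__enquiry";

  const title = document.createElement("h2");
  title.textContent = heading;

  const name = document.createElement("input");
  name.name = "name";
  name.placeholder = "Name";
  name.required = true;

  const email = document.createElement("input");
  email.type = "email";
  email.name = "email";
  email.placeholder = "Email";
  email.required = true;

  const submit = document.createElement("button");
  submit.type = "submit";
  submit.textContent = "Enquire now";

  form.append(title, name, email, submit);
  banner.append(image, form); // image and form side by side, positioned by the shared CSS
  return banner;
}

// Usage: the corporate site and a microsite both call the same component.
document.body.prepend(
  renderBanner({
    heading: "Find your new home",
    imageUrl: "/images/banner.jpg", // hypothetical asset path
    formAction: "/enquiries",       // hypothetical endpoint
  })
);
```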
Results: At first we thought the test had failed: during campaign periods, conversions didn’t increase at all. We were disappointed. But then, after a little more analysis, we saw that while conversions weren’t increasing, they weren’t decreasing either. The big “campaign dip” had disappeared. As far as we could tell, we now had certain customers being routed to the microsites via campaign marketing, deciding the campaign offer wasn’t for them, clicking through to the corporate site, and once there making an enquiry where before they’d dropped off the radar. Success!
It needs a close relationship
These are three different examples, but they are unified by a strong relationship with the client. For this continuous improvement process to work we have to routinely interrogate data for opportunities — and our ability to interrogate that data is based on our history with the client, our familiarity with all aspects of the client’s business, and the knowledge gained from past experiments.
Scientific method, and acceptance of failure
When we do these optimization activities, we’re essentially using the scientific method: hypothesis, test, result. But, just like with real science, we’re searching in the unknown and there is always a risk that an experiment will produce a null result. This is disappointing when it happens, but we have to see the null results as informative in their own way — they help us map the landscape, and find our way eventually to truly effective solutions.
An obsession with impact
In Edge of Tomorrow, Tom Cruise becomes an ass-kicking super-soldier one step at a time: learning to turn left here, run for cover there, steal this helicopter in between. Similarly, most of the continuous improvement actions with our clients are quite small—a button here, a form change there—but the cumulative impact is enormous (and every now and then you have a jackpot win from a small but critical change).
And that’s what we want, because at the end of the day we are simple folk. We like nice, clear, obvious results so we can feel good about ourselves.
And for our clients, every improvement represents life and growth, day after day, year after year.