At Thumbtack we use A/B testing on just about everything we do. Integrating A/B testing into your product workflow lets you iterate quickly and get feedback to create an awesome product. With each added feature, you know what effect it had on conversion, time on site, click-through, or whatever other metric you care about, which helps you focus on the important factors and leads to great results. Over the past couple of months, we’ve used A/B testing on a prominent form on our site to increase conversion by more than 50%. Through this process we’ve learned a lot about what makes a good form and what to avoid.
The form in question is what every user has to get through to place a request on our site. It asks several category-specific questions regarding the service you want and then asks for your contact information. Here’s what it might look like if you were trying to hire a DJ:
In this post, I will outline the top five changes that emerged from our testing and can be applied to almost any form.
1. Use standard form elements
When creating any interactive elements, there are many reasons you may want to make your own GUI controls that better support that particular interaction. Jakob Nielsen lists this as his [number 1 application-design mistake], saying,
Users will most likely fail if you deviate from expectations on something as basic as the controls to operate a UI. And, even if they don't fail, they'll expend substantial brainpower trying to operate something that shouldn't require a second thought.
And now we have our own data to back that up.
With the previous version of the form, we attempted to make the form take up less screen space by creating ‘faux’ select inputs that relied on some fancy Backbone views to do their magic. The rationale behind this was that a smaller form would be less daunting and encourage more users to place a request.
However, when we started running A/B tests on the form we found that replacing these ‘faux’ selects with standard form elements increased conversion site-wide, with some pages’ conversion going up 20%.
Why were the standard elements so much better? With the faux selects we failed to take into account the difficulty of learning a new interaction model. Ours was especially tricky because it looked like the standard select elements people had seen before, but behaved differently. This difficulty isn’t necessarily noticeable on a conscious level, but every little frustration depletes your user’s reservoir of goodwill (see: *Don't Make Me Think*) and lowers their chance of converting.
The standard form elements also have the benefit of being easier to maintain, so this was a clear winner for us.
2. Keep it simple
Your site may offer many different services, but once a user is filling out a form it is usually pretty clear what they want to do. Make it simple for them to complete that task and avoid distracting them. This is on Steve Krug’s *Don't Make Me Think* list of things that increase goodwill: “know the main things that people want to do on your site and make them obvious and easy.”
We have tried adding several features to the form that we thought would be helpful, but they ended up decreasing conversion or showing no significant change. One such feature, outlined in red above, was intended to give the user more context while requesting a specific service pro. The problem with most of these was that they distracted from the flow of the form. We don’t keep features like these unless they show a clear performance increase, since each new feature complicates the user’s experience. Our most successful changes have been the ones that integrate naturally into the flow of placing a request.
3. Optimize for mobile
Mobile traffic now constitutes a significant percentage of total traffic for many websites and will only continue to grow. Not optimizing for mobile makes the mobile experience difficult and clumsy, preventing many users from converting on your site. Even if you don’t see much mobile traffic yet, Luke Wroblewski warns, “don't wait too long to change as the shift from desktop to mobile can happen faster than you think.”
Since we introduced a mobile-optimized version of our form, we’ve seen a 40% increase in conversions on mobile. Furthermore, designing for mobile forces you to rethink your form and simplify the experience, which leads to ideas for improving the desktop version as well. Many of the tests we ran on the desktop form were inspired by our experience designing the mobile version.
If a significant portion of your traffic is mobile, this is a no-brainer.
4. Only ask for what you need
Users are hesitant to give away personal information unless it is clear why you need it. Think carefully about what information you actually need, then present each field so its purpose is obvious. “Not Indicating How Info Will Be Used” is number 9 on Jakob Nielsen’s [list of top application-design mistakes][number 1 application-design mistake], and “Asking me for information you don’t really need” appears on *Don't Make Me Think*’s list of things that diminish goodwill.
We used to require a phone number for every request, even if the user opted not to share their phone number with our service pros. The phone number was useful to verify the user’s identity, but from a user’s perspective we were asking for information that was irrelevant to their request.
We tested this against a version where the phone number was optional and a dynamic version where the phone number field would only appear if they opted to receive quotes by phone. Incredibly, the dynamic version showed a 15% increase in requests while the optional phone number only showed a 4% increase. Although both of the new versions effectively made the phone number optional, the dynamic version created an intuitive interface that made it clear to the user why they were being asked for this information.
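The dynamic variant boils down to deriving the visible contact fields from the user’s quote preferences. Here’s a minimal sketch of that decision logic (the field and function names are hypothetical, not our actual code):

```javascript
// Hypothetical sketch of the dynamic variant: the phone field is only
// shown when the user has opted in to receiving quotes by phone, so the
// reason we're asking for the number is self-evident.
function visibleContactFields(quoteChannels) {
  const fields = ['name', 'email'];
  if (quoteChannels.includes('phone')) {
    fields.push('phone');
  }
  return fields;
}
```

In the real form this would drive rendering (for us, a Backbone view re-rendering when the preference changes), but the key idea is that which fields appear follows directly from choices the user has already made.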
5. Test everything
We like to A/B test everything. There were many features that we thought were obvious improvements but turned out to decrease conversion or make no difference. We have to remind ourselves that our users are not us; they often use things differently than we do, so we not only test everything, but we also try to understand why particular changes increase or decrease conversion.
Since we use A/B testing so extensively, we built a tool called ABBA for analyzing the results of A/B tests. It lets us check exactly how each test affects our metrics and tells us how confident we can be that the observed changes aren’t just due to chance. The source code is available as well.
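ABBA’s internals aren’t reproduced here, but the core of any such tool is a significance test on two observed conversion rates. As a rough sketch, here is one common choice, a two-proportion z-test (this is an illustration of the general idea, not ABBA’s actual code):

```javascript
// Sketch of a two-proportion z-test: compares conversion in a baseline
// (A) against a variant (B). A |z| above ~1.96 corresponds to roughly
// 95% confidence that the difference isn't just chance.
function zTestTwoProportions(successesA, trialsA, successesB, trialsB) {
  const pA = successesA / trialsA;
  const pB = successesB / trialsB;
  // Pooled rate under the null hypothesis that A and B convert equally.
  const pPooled = (successesA + successesB) / (trialsA + trialsB);
  const stdErr = Math.sqrt(
    pPooled * (1 - pPooled) * (1 / trialsA + 1 / trialsB)
  );
  return (pB - pA) / stdErr;
}
```

For example, 100 conversions out of 1,000 visitors versus 150 out of 1,000 yields a z-score of about 3.4, well past the 95% threshold.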
Of course, conversion isn’t everything. You also need to focus on the quality of the request you’re receiving and the quality of the user’s experience. We have tested features that showed increases in conversion but ended up pulling them because we believed they compromised the quality of the user’s experience and the tradeoff wasn’t worth it. These are difficult decisions to make, but A/B testing will at least give you the data you need to make an educated decision.
Since we began working on the latest version of our request form, we’ve seen our conversion rate increase by more than 50%. We hope that these tips will help you create better forms and see similar improvements on your site.