by Robert Lerose.



John Wanamaker, who founded one of the first department store empires in the United States, famously complained that half his advertising was wasted; he just didn't know which half. Today, businesses can better track the return on their marketing dollars by rigorously testing their message. Indeed, improvements and breakthroughs in testing tools make it possible to experiment with virtually every element of a business's public marketing efforts and see results quickly.


Perhaps the best-known testing concept is the A/B split test: two nearly identical ads are shown to customers, with one variable changed, to determine which version performs better. Typical variables include headlines and price points. The history of advertising is replete with stories of how altering a headline, or even a single word of copy, resulted in a dramatic spike in response. Case in point: when a book on auto repair was marketed to mechanics with the headline "Fix any part of any car," the ad failed. But when it was tested with a new headline that said "Fix almost any part of almost any car," sales of the book skyrocketed.
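To make the mechanics concrete, here is a minimal sketch, in Python with illustrative names, of how a split test might assign each visitor to version A or B. It is a generic illustration, not any particular vendor's implementation.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor into variant "A" or "B".

    Hashing (experiment + visitor_id) keeps assignments stable across
    visits, so a returning visitor always sees the same version.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Example: split a stream of visitors roughly 50/50.
for visitor in ["u1001", "u1002", "u1003", "u1004"]:
    print(visitor, assign_variant(visitor, "headline-test"))
```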


Test high-traffic webpages

"A/B testing is a low-cost tool that small businesses in particular can use to test incremental changes and see what the net benefit is going to be for their user base," says Lara Swanson, the user experience manager at Dyn, a New Hampshire-based company that specializes in infrastructure services such as email delivery and managed networks.  


The first step is deciding what should be tested. This will vary depending on the nature of your business and what you want a visitor to your website to do. For example, an online retailer or ecommerce site could test its product page or checkout page, whereas a lead-generating site looking to collect email addresses might test its homepage.


Testing a webpage that gets a lot of traffic is a smart rule of thumb. So is testing one thing at a time. "Small businesses typically don't have the budget to test multiple variables all at the same time," Swanson explains. "So A/B testing is the simplest way to go about establishing what site works best for your user base."


A/B testing is also a numbers game. More visitors mean more accurate results, something known as statistical significance. "You want to get enough traffic in a test [to show] that the results you're getting are actually viable and you can be confident that's a real number," says Kim Ann King, chief marketing officer at SiteSpect, a Boston-based web and mobile optimization firm.


"If you have a low traffic website and you're testing a lot of different things, it's going to take you a long time to get to statistical significance," she says. Sites with marginal traffic should expect to run a test for at least two weeks or have a minimum of 2,000 visitors to be confident of the results.

Be clear about your metrics

Another question to consider as you're designing a test is: What do you want to measure? The answer will stem from the action that you want your visitors to take. At Dyn, they keep a close eye on the number of users who finish the checkout process and actually buy something. "Other sites that aren't revenue-driven [such as those that provide information without selling something] may look at the amount of time that users spend on their sites, bounce rates, the number of times users return to their site or how long they spend on the page," Swanson says. "It really depends on what they want their users to do."
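For sites that measure engagement rather than revenue, the metrics Swanson lists reduce to simple arithmetic over a session log. The sketch below uses hypothetical data and field names, not Dyn's, to compute three of them.

```python
from dataclasses import dataclass

@dataclass
class Session:
    user_id: str
    pages_viewed: int
    seconds_on_site: int

# Hypothetical session log; values are illustrative.
sessions = [
    Session("u1", 1, 12),
    Session("u2", 5, 340),
    Session("u1", 3, 95),
    Session("u3", 1, 8),
]

# Bounce rate: share of sessions that viewed only a single page.
bounce_rate = sum(s.pages_viewed == 1 for s in sessions) / len(sessions)
# Average time on site across all sessions.
avg_time = sum(s.seconds_on_site for s in sessions) / len(sessions)
# Repeat sessions: visits beyond each user's first.
repeats = len(sessions) - len({s.user_id for s in sessions})
print(f"bounce: {bounce_rate:.0%}, avg time: {avg_time:.0f}s, repeat sessions: {repeats}")
```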


For WePay, a Palo Alto, California-based company that provides tools to let small businesses accept online payments easily, testing their homepage started with a friendly internal competition that had a surprising outcome.


They held a contest for their engineers to build different homepages, rotated them on their website, and then measured the conversion rate from visit to sign-up. One of the test pages outperformed their then-current homepage by around two percent.


"That's a pretty significant lift in conversion for us," says Bill Clerico, WePay's CEO. "We have tens of thousands of visitors a day visiting our homepage. To be able to convert two percent more of them into sign-ups was a really big win for us."


In addition to enlarging their customer base, testing also uncovered a key insight about what WePay's users wanted from the site. "When we initially built the homepage, we were graphics intensive. It was all about pictures," Clerico says. "We found out that people actually wanted to read [more], so we ended up with more text on our homepage based on one of those designs."


Free testing tools suitable for small businesses abound, such as Google Analytics, but Clerico is particularly fond of KISSmetrics and Optimizely.


Small changes make a big difference

Optimizely, founded in 2009 by two former Google product managers, lets businesses make and test changes to their website in an easy-to-use platform. It also tracks responses for a statistically significant sample. 


"One of the things we like to preach to people is, make sure your test is going to run for a long enough time," says Jodie Ellis, director of marketing at the San Francisco-based company. "The Optimizely system will actually tell you that you need to wait a little longer. That's something we've done to correct for that common mistake that a first-time tester might make."


In the past, setting up a test was the domain of engineers, whose technical and coding expertise was critical. Today, almost any department at a small business can test variations on its own. "It's really empowered marketers," Ellis explains. "It's also freed up a lot more engineering resources. You're seeing a lot more people who are empowered to use their skill set without hindering other parts of the business."


Optimizely's own homepage asks for a visitor's URL, with a call to action in the form of a button. Recently, the company tested variations against its "Get Started" button. Every variation beat it, most notably one that said "Test It Out," which increased conversions by 25 percent.


"It was an experiment that ran across about 14,000 visitors," Ellis adds. "With this small language change, we were able to achieve a remarkable jump in the number of people who completed an action that's very important to our business."


SiteSpect's King recalls a similar revelatory moment when her company ran a simple A/B test for an auction house in Boston: enlarging a product image on one of its webpages from 250 pixels to 350 pixels generated a 329 percent boost in bids. For more real-world examples, Dyn's Swanson points to Which Test Won, a site that runs examples of actual split tests.


Today's testing tools can accelerate the success of small businesses, but that doesn't mean they should be the final word on everything. As WePay's Clerico notes, "I don't think Apple could have invented the iPhone through doing a series of A/B tests. Sometimes with a really big breakthrough, it requires some sort of thought and creative vision."