The Digital Marketing Podcast
Episode: "Test Everything"
Hosts: Daniel Rowles & Ciaran Rogers
Guest: Kirti Man Koholi
Date: September 17, 2017
Overview
This episode dives into the crucial practice of "testing everything" in digital marketing—challenging assumptions through rigorous A/B and multivariate testing to drive success. Hosts Daniel and Ciaran share their own experiences and insights on testing methodologies, highlight the importance of moving beyond received wisdom, and feature an in-depth case study from listener Kirti Man Koholi in India about optimizing Facebook ad campaigns for better results.
Key Discussion Points & Insights
1. The Value of Testing in Digital Marketing
- Challenging Assumptions: The hosts emphasize that best practices aren't always best for every context, and that real users often behave unexpectedly.
  “The key thing here is to realize that we know nothing and essentially that you do need to test everything because you need to look at challenging our assumptions about things.” – Daniel Rowles [00:50]
- A/B Testing Example: Daniel describes a real scenario in which adding a security certificate badge (a common best practice) actually decreased form conversions by 12.5%. The badge, intended to increase trust, made users more suspicious.
  “You see [the] certificate, you start thinking about trust, and you think, well, maybe I shouldn't be putting my email address everywhere.” – Daniel Rowles [02:33]
- Removing Subjectivity: Both hosts point out that everyone in an organization feels like an expert on copy and design, but only testing provides real evidence for decisions.
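To make "real evidence" concrete: one common way to judge whether an A/B result like the 12.5% badge drop is real or just noise is a two-proportion z-test. The episode doesn't specify a method, so this is a hedged sketch with invented counts, using only the Python standard library:

```python
from statistics import NormalDist

def ab_significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-sided z-test on two conversion rates (illustrative sketch)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = (p * (1 - p) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return p_value < alpha, p_value

# Invented numbers: 10% vs 8.75% conversion (a 12.5% relative drop).
print(ab_significant(200, 2000, 175, 2000))
```

With samples this small, the drop is not statistically significant, which is exactly why testing tools wait for enough traffic before declaring a winner.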
2. A/B Testing vs. Multivariate Testing
- A/B Testing (Split Testing): Comparing two or more variations of a single element (e.g., email subject lines, landing pages).
  “A/B testing may actually be ABCDE testing… but you’re taking an original version and then you’re testing something against it... Normally using a system online to do that.” – Daniel Rowles [03:00]
- Multivariate Testing: Testing multiple elements and their combinations simultaneously (e.g., mixing different headings, images, and text). Useful for more complex optimization.
  “For example, you could take five versions, the heading, two versions, the text, six versions the image, mix them up, spew out different versions of that web page, and then decide which version... is working better.” – Daniel Rowles [03:47]
- Automation Tools: Multivariate testing platforms determine statistical significance automatically.
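The combinatorics behind Daniel's example are easy to see in code: crossing five headings, two text blocks, and six images yields 60 distinct page versions, which is why automation matters. A minimal sketch (element names are illustrative, not from the episode):

```python
from itertools import product

# Element variations from the episode's example: five headings,
# two text blocks, and six images (names are placeholders).
headings = [f"heading_{i}" for i in range(1, 6)]
texts = [f"text_{i}" for i in range(1, 3)]
images = [f"image_{i}" for i in range(1, 7)]

# A multivariate test generates every combination of elements.
variants = list(product(headings, texts, images))
print(len(variants))  # 5 * 2 * 6 = 60 candidate page versions
```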
3. Testing Tools & Recent Changes
- Google Content Experiments vs. Google Optimize: The landscape has shifted from Google Content Experiments (a more basic A/B tool, now being deprecated) to Google Optimize (more robust, with full multivariate testing).
  “There’s a new tool called Google Optimize... It’s not just a Google Premium analytics feature. This is something that they’re offering to everybody, which is great.” – Daniel Rowles [04:53]
- Google Surveys for Pricing & Market Research: Daniel shares how he used Google Surveys to set pricing for their online training: a simple, scalable way to test market responsiveness.
  “What it made me realize is it made it a bit of a no brainer. And we wanted to build numbers initially as well. So we said, okay, well we’ll kind of start at that point.” – Daniel Rowles [07:16]
4. Importance of User-Centric Testing
- From Analytics Aggregates to Customer Journeys: While Google Analytics is useful, Ciaran and Daniel make the case for deep-diving into real user journeys with tools like Hotjar.
  “You would never get these kind of insights from your analytics reports because they’re looking at everything in aggregate but when you start going through maybe five or ten different users, actual journeys on the website, you start to see the power of multivariate testing.” – Daniel Rowles [08:27]
- Usability Principle: Even informal testing with just five users can reveal around 85% of a site's usability problems.
  “Five people doing the same thing, identifying 85% of the problems of your website. And that completely holds true.” – Daniel Rowles [09:34]
- Real-Time Feedback: Encouraging direct user feedback via the website’s feedback tool for continuous improvement.
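The five-user figure quoted above matches the well-known Nielsen/Landauer usability model (an assumption about its source, since the hosts don't name one): the share of problems found by n testers is 1 − (1 − L)^n, where L ≈ 0.31 is the average per-user detection rate. A quick check:

```python
# Nielsen/Landauer model: fraction of usability problems found by n
# testers, assuming each user finds about 31% of problems on average.
L = 0.31

for n in (1, 3, 5, 15):
    found = 1 - (1 - L) ** n
    print(n, round(found, 2))
# n = 5 already finds roughly 84-85% of problems, with sharply
# diminishing returns for each additional tester after that.
```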
Key Case Study: Kirti Man Koholi and Multivariate Facebook Ad Testing
[12:35] – [21:59]
Background
- Kirti Man is a Digital Marketing Manager for 91Mobiles (India’s largest gadget research site) and Killer Features.
- Faced rapidly escalating Facebook cost-per-click (CPC) as they scaled campaigns for a new content brand.
Challenge
- As volume increased, CPC for acquiring engaged readers increased significantly; targeting more broadly was less effective and more expensive.
“With a small budget, we used to get a great CPC... But after a while, when we wanted to scale this, the CPC went up exponentially.”
— Kirti Man Koholi [14:55]
Solution: Systematic A/B and Multivariate Testing with AdEspresso
- Adopted AdEspresso, a Facebook ad management tool, to automate and scale A/B testing across images and headlines.
- Created multiple ad variations (e.g., 5 images × 5 headlines = 25 variations) for each campaign.
- Ran each variation on a small budget for a short period (24 hours), then scaled spend only on the best performers.
“...create multiple variations of the same ad and you can just put a few headliners, put a few images across and the tool does the rest.”
— Kirti Man Koholi [17:07]
- Key findings:
- Short headlines with hard-hitting numbers/statistics performed best
- High-definition product images outperformed generic ones
- Repurposing a curated image library improved consistency and results
“Shorter headlines with good numbers, with hard hitting numbers and stats used to work a lot and good looking HD images work a lot.”
— Kirti Man Koholi [18:35]
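The workflow Kirti Man describes (cross images with headlines, run a cheap 24-hour test, scale only the winners) can be sketched generically. This is not AdEspresso's API; variant names and performance numbers below are invented for illustration:

```python
from itertools import product

# Cross five images with five headlines: 25 ad variants, as in the case study.
images = [f"img_{i}" for i in range(1, 6)]
headlines = [f"headline_{i}" for i in range(1, 6)]
variants = list(product(images, headlines))  # 25 combinations

# Placeholder results from a 24-hour, small-budget test run
# (clicks rise with the index purely so the example is deterministic).
results = {v: {"clicks": 10 + 3 * i, "impressions": 1000}
           for i, v in enumerate(variants)}

def ctr(stats):
    """Click-through rate: clicks per impression."""
    return stats["clicks"] / stats["impressions"]

# Keep only the top performers and scale spend on those.
winners = sorted(results, key=lambda v: ctr(results[v]), reverse=True)[:3]
print(winners)
```

The point of the sketch is the shape of the process: generate combinations cheaply, measure briefly, then concentrate budget on what the data (not intuition) says works.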
Measurable Impact
- Achieved a 5x increase in traffic with only marginal increases in CPC.
- Systematically used learnings to improve campaign performance over time.
“We have taken our traffic up to 5x with just a marginal increase in CPC.”
— Kirti Man Koholi [20:54]
- Advocates for making testing a repeatable process, adapting recommendations for context and audience.
“It’s better to get all the relevant learnings yourself. And over a period of time... you can get all the learnings you want.”
— Kirti Man Koholi [19:57]
Memorable Quotes & Moments
- On Assumptions: “You know nothing, Jon Snow.” – Daniel Rowles, referencing Game of Thrones [01:32]
- On Removing Subjectivity: “Everyone in every organization is an expert in at least two things: copywriting and web design.” – Daniel Rowles [02:50]
- On Listening to Real Users: “You have to walk in your customer’s shoes and there are tools out there that enable you to do that really, really effectively.” – Daniel Rowles [09:02]
- On Adopting Testing as Routine: “Always test everything... get all the relevant learnings yourself.” – Kirti Man Koholi [19:57]
Important Timestamps
- 00:50 – Challenging assumptions and the importance of testing best practices
- 02:33 – Real-life A/B testing result that contradicts common wisdom
- 03:00 – Definitions: A/B vs. Multivariate testing
- 04:53 – Transition from Google Content Experiments to Google Optimize
- 07:00 – 08:00 – Using Google Surveys for price testing and market feedback
- 09:20 – 10:31 – The value of user journey tools like Hotjar and one-on-one usability
- 11:46 – Introducing Kirti Man Koholi and his Facebook ad testing story
- 14:55 – Describing the scaling CPC challenge
- 17:07 – 19:16 – Implementing AdEspresso, campaign architecture, & key learnings
- 20:54 – Reporting 5x increase in traffic with minimal CPC increase
Takeaways
- Never rely on assumptions, even if they’re industry best practices. Test everything—from web forms to ad campaigns.
- A/B and multivariate testing should be systematic, ongoing processes.
- Use analytics, user journey tools, and direct feedback channels to truly understand and improve user experience.
- Ad management tools like AdEspresso can vastly enhance the speed and reliability of testing on platforms like Facebook, delivering dramatic improvements in efficiency and ROI.
- Short, punchy headlines and high-quality images matter—at least for tech audiences, but always test for yourself.
This episode offers a practical blend of strategic advice, hands-on tools, and a real-world case study to make the case for a testing-driven culture in digital marketing. Whether you’re optimizing a landing page or scaling paid social campaigns, “test everything” remains timeless advice.
