How-to Guides

The Meta Ads “Creative Testing Methodology”

The paid social landscape has changed drastically in the last 2 years. No more:

  • Lookalike audiences
  • Interest audience experiments
  • or other big hacks

Everything boils down to your creatives, your offer, and your landing pages.

The Meta Ads "Creative Testing Methodology" is a guide for advertisers to get the best out of their ad campaigns on Meta platforms like Facebook and Instagram.

It's all about testing different ad formats and designs to see what works best.

The guide explains how to:

  1. Set up your ad campaigns,
  2. Decide on budgets, and
  3. Figure out which ads are winners based on their performance.

It also covers different situations you might face, like ads that look good but don't sell enough, or those that get a lot of attention but not enough clicks.

The goal is to help you find and use the most effective ads, and to keep improving them so your campaigns stay strong. It's a smart way to make sure your ads are doing their job right and helping your business and sales grow.

Table of contents


  1. Campaign Creation and Structure
  2. Setting Control Rules
  3. Testing Budgets
  4. Data Analysis and Identifying Winners
  5. Scaling Winners from Testing
  6. Creative Testing Prioritization Framework
  7. Iterations Based on Existing Winners
  8. Frequently Asked Questions

1. Campaign Creation and Structure


  • Avoid launching net new creatives in your core campaigns, as some of them can eat up budget without yielding positive results, or they won’t spend at all 🤔
  • Begin by creating 3 or 4 identical broad ad sets within a conversion campaign, each containing your creative ads. Please, please don’t optimize for “add to cart” or “traffic”. Ideally these should be CBO (campaign budget optimization) so Meta dynamically reallocates budgets, but ABO (ad set budget optimization) is fine as well.
  • Each ad set should contain creatives with the same theme, e.g. Founder Story and its variants as ad set 1, Interview-on-the-Street UGC and its variants as ad set 2.
  • See my testing structure from one of my ad accounts below.
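To make that structure concrete, here’s a rough sketch of how such a testing campaign could be written out as a config. The theme names and variant counts are placeholders, not taken from any real account - just follow the 3-4 broad ad set rule above.

```python
# Illustrative layout of a creative-testing campaign (all names are placeholders).
# Each ad set = one creative theme; each ad = a variant of that theme.
testing_campaign = {
    "name": "Creative Testing | Conversions",
    "objective": "conversions",            # conversion campaign, per the bullets above
    "budget_type": "CBO",                  # ABO also works, per the note above
    "ad_sets": [
        {"theme": "Founder Story",        "targeting": "broad", "ads": ["v1", "v2", "v3"]},
        {"theme": "Street Interview UGC", "targeting": "broad", "ads": ["v1", "v2", "v3"]},
        {"theme": "Statistics / Press",   "targeting": "broad", "ads": ["v1", "v2", "v3"]},
    ],
}
```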

2. Setting Control Rules


  • Set a spend threshold of at least 3x your AOV (average order value), or a minimum of 10,000 impressions per ad, before judging an ad.
  • Consider pausing an ad if it does not yield the desired CPA or ROAS after hitting that threshold.
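Here’s a minimal sketch of these control rules as code. The spend and impression thresholds mirror the bullets above; the target CPA/ROAS values are whatever you’ve set for your own account.

```python
def control_rule(spend, impressions, cpa, roas, aov, target_cpa, target_roas):
    """Decide whether a test ad has earned a verdict yet, per the rules above."""
    # Let the ad spend at least 3x AOV or collect ~10,000 impressions first.
    if spend < 3 * aov and impressions < 10_000:
        return "keep running - not enough data yet"
    # Once the threshold is met, judge it on the hard metrics.
    if cpa <= target_cpa or roas >= target_roas:
        return "keep running - meeting targets"
    return "consider pausing - threshold hit, targets missed"

# Example: $60 AOV, ad has spent $180 at a $55 CPA and 1.1 ROAS
print(control_rule(spend=180, impressions=6_500, cpa=55, roas=1.1,
                   aov=60, target_cpa=40, target_roas=2.0))
# -> consider pausing - threshold hit, targets missed
```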

3. Testing Budgets


  • Dedicate a minimum of 25% of your overall budget towards testing.
  • Allow testing campaigns a minimum of 4-5 days before pulling the plug.
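A quick worked example of the budget rule - the $200/day total is just an assumed figure for illustration:

```python
total_daily_budget = 200                       # illustrative account: $200/day total spend
testing_budget = 0.25 * total_daily_budget     # at least 25% to testing -> $50/day
test_window_days = 4                           # give each test 4-5 days before judging
spend_before_judging = testing_budget * test_window_days
print(testing_budget, spend_before_judging)    # 50.0 200.0
```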

4. Data Analysis and Identifying Winners


  • Hard metrics - Always your CPA or ROAS and your click-to-purchase rate.
  • Soft metrics or storytelling metrics:
    → Hook rate > 30%
    → Hold rate > 10%
    → CTR > 0.5-1.2% (depending on your niche, markets, etc.)

💡 Also read: The Ultimate Guide To Meta Ads Custom Metrics You Need To Track
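Exact definitions of these soft metrics vary from team to team (the custom metrics guide linked above goes deeper), but as one common set of definitions, here’s a sketch that computes them from standard Meta columns. Treat the formulas as assumptions, not the only valid way to define them.

```python
def soft_metrics(impressions, video_plays_3s, thruplays, link_clicks):
    """Common (but not universal) definitions of the storytelling metrics."""
    hook_rate = video_plays_3s / impressions   # % who stopped scrolling (>30% target)
    hold_rate = thruplays / impressions        # % who kept watching (>10% target)
    ctr       = link_clicks / impressions      # link CTR (0.5-1.2% target, niche-dependent)
    return {"hook_rate": hook_rate, "hold_rate": hold_rate, "ctr": ctr}

# Example: 10,000 impressions, 3,400 3-second plays, 1,200 ThruPlays, 80 link clicks
print(soft_metrics(10_000, 3_400, 1_200, 80))
# {'hook_rate': 0.34, 'hold_rate': 0.12, 'ctr': 0.008}
```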

Scenario 1:

Hitting soft-metric goals but not meeting north-star goals (ROAS)

  • As shown above, the ad shows strong storytelling metrics (strong hook and hold rate) but fails to meet the ROAS target.
  • If we dig deeper, we see this is due to the low CTR (around 0.2%) and high CPC.
  • This reflects a gap in driving clicks and enticing users to take action, so the ad should be paused.

Scenario 2:

Meeting North-Star Goals but Low on Soft Metrics

In this scenario, your creatives are meeting or exceeding the hard metrics like CPA or ROAS, which are critical for the financial success of the campaign. However, they are underperforming in soft metrics like hook rate, hold rate, or CTR. This situation can be indicative of a few potential issues:

  • Limited Long-term Engagement - While the ad is effective in driving conversions, it may need to be more engaging to build long-term brand awareness or customer loyalty.
  • Action Plan - Investigate the creative elements that might be lacking in terms of engagement or storytelling. Try testing variations of your current ads with more engaging visuals or copy to see if this improves the soft metrics without hurting the hard metrics.

Scenario 3:

High Engagement but Low Conversion Rates

In this scenario, your ads are performing well in terms of engagement metrics (high hook rate, hold rate, and CTR), but they are not converting effectively, as indicated by a higher-than-desired CPA or lower-than-desired ROAS.

  • Misalignment with Target Audience: The ads are appealing and engaging but might not reach the right audience, or the messaging might not align with what motivates your audience to convert.
  • Action Plan: Reassess your target audience to ensure it aligns with your product or service. You should also refine your ad's messaging or call-to-action to better cater to the audience's needs and drive conversions.
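Pulling the three scenarios together, a rough triage could look like the sketch below. The soft-metric thresholds come from section 4; the wording of the verdicts is just illustrative.

```python
def triage(hits_hard_targets, hook_rate, hold_rate, ctr):
    """Rough mapping of an ad's results onto the three scenarios above."""
    storytelling_ok = hook_rate > 0.30 and hold_rate > 0.10
    ctr_ok = ctr >= 0.005                      # lower bound of the 0.5-1.2% range
    if storytelling_ok and not ctr_ok and not hits_hard_targets:
        return "Scenario 1: strong hook/hold but weak clicks - likely pause"
    if hits_hard_targets and not (storytelling_ok and ctr_ok):
        return "Scenario 2: hard metrics fine, soft metrics lagging - iterate on engagement"
    if storytelling_ok and ctr_ok and not hits_hard_targets:
        return "Scenario 3: high engagement, low conversion - revisit audience and messaging"
    return "Winner - shortlist for scaling" if hits_hard_targets else "Keep watching or pause per control rules"

print(triage(hits_hard_targets=False, hook_rate=0.36, hold_rate=0.14, ctr=0.002))
# -> Scenario 1: strong hook/hold but weak clicks - likely pause
```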

5. Scaling Winners from Testing


→ Once you evaluate results from #4, shortlist the winning ads and move them to core campaigns via the post ID method. Here’s a YouTube tutorial explaining how it works.

→ It’s recommended that you don’t pause your winning ad in the testing ad set (going by the popular saying: if it isn’t broken, don’t fix it).

→ Also, at times we have seen the same ad work well in testing but not yield the same results once moved to core campaigns or Advantage+ Shopping.
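If you push ads programmatically, here’s a minimal sketch of the post ID approach using the facebook_business Python SDK. All IDs are placeholders, and the exact wiring is my assumption rather than a prescribed workflow; in Ads Manager, the equivalent is choosing “Use existing post” when creating the ad, as the tutorial shows.

```python
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="YOUR_ACCESS_TOKEN")      # placeholder token
account = AdAccount("act_1234567890")                      # placeholder ad account ID

# Reuse the winning ad's existing post so its social proof
# (likes, comments, shares) carries over to the core campaign.
creative = account.create_ad_creative(params={
    "name": "Winner - Founder Story (existing post)",
    "object_story_id": "PAGE_ID_POST_ID",                  # placeholder, format <page_id>_<post_id>
})

account.create_ad(params={
    "name": "Winner - Founder Story",
    "adset_id": "CORE_ADSET_ID",                           # placeholder core / Advantage+ ad set
    "creative": {"creative_id": creative.get_id()},
    "status": "PAUSED",                                    # review before enabling
})
```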

6. Creative Testing Prioritization Framework


Your testing priority depends on two things:

a/ Performance - Which type of creative will help you convert the best?

b/ Time - Which type of creative will take the least amount of time to execute?
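One illustrative way to weigh performance against time is a simple impact-per-effort ranking - the concepts, scores, and timelines below are made up for the sake of example, not part of the original framework.

```python
# Rank creative concepts by expected performance per unit of production time.
concepts = [
    {"name": "Static: post-it note", "expected_impact": 3, "days_to_make": 0.5},
    {"name": "UGC street interview", "expected_impact": 5, "days_to_make": 5},
    {"name": "Founder story video",  "expected_impact": 4, "days_to_make": 3},
]
for c in sorted(concepts, key=lambda c: c["expected_impact"] / c["days_to_make"], reverse=True):
    print(f'{c["name"]}: {c["expected_impact"] / c["days_to_make"]:.1f}')
# Static: post-it note: 6.0
# Founder story video: 1.3
# UGC street interview: 1.0
```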

Other things to consider

  • What are the low-hanging fruits: creatives that are super easy to execute but still get you good results?
  • What are the big gaps the brand hasn’t executed on at all but that have immense potential? For example, if a brand has never done UGC, it should probably try UGC.
  • If you haven’t done a lot of testing, you should do rapid format testing.
  • Go back to “10 Free Static Ad Concepts To Test” and start there.
  • It only takes 1 second for someone to determine if they want to watch your ad or not. So, testing those 10 formats will help you figure out which format works to get attention.
  • After you get the format down, you start testing different types of messaging.

7. Iterations Based on Existing Winners


Now, we're diving into a fascinating aspect of our journey in the "Creative Strategy" course: iterating on creatives, especially the winners, to enhance their performance even further!

Identifying the Impactful Variable

When you spot a top-performing creative that you wish to iterate on, here’s a golden tip: Identify the element that’s likely driving its performance. Is it the imagery? The messaging? Or perhaps the format?

To decipher this, let’s glance at your creative test. Ideally, you should be testing at least 3 variations per creative test. For instance, if you’re testing a statistics-based ad, like Mott and Bow did, and find a winner, ask yourself: What’s the pivotal variable here?

  • Messaging: Statistics?
  • Imagery: A particular shot?
  • Format: Image or video?

If all variations perform decently in terms of spend and CPA/ROAS, it’s plausible that the format is a key player. But if one outshines the others, it’s time to zoom in on imagery and messaging for further insights.
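As a small sketch of that rule of thumb (the variation names, ROAS numbers, and the 0.3 cutoff are all made up for illustration):

```python
def key_variable(roas_by_variation):
    """Guess which element to iterate on, per the rule of thumb above."""
    values = list(roas_by_variation.values())
    spread = max(values) - min(values)
    if spread <= 0.3:                 # all variations perform similarly (illustrative cutoff)
        return "format is likely the key variable"
    return "one variation stands out - dig into its imagery and messaging"

print(key_variable({"v1_statistic": 2.1, "v2_lifestyle_shot": 2.0, "v3_ugc": 2.2}))
# -> format is likely the key variable
```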

Diving into Iterations

Now, let’s explore the iterations I’d recommend:

  • Identify the Major Variable: And experiment with it across different formats. Why? Because format is a colossal variable.
  • Image to Video: Consider adapting a top-performing image into a video hook.
  • Post-It Note Test: Try embedding your message in a post-it note format.
  • Testimonial to Social Proof: Leverage testimonials and social proof to enhance credibility.

Add-Ons for Video and Images:

  • Social Proof: Showcase user numbers or other social validation.
  • Testimonials/Golden Nuggets: Integrate powerful user statements.
  • Press Mentions: Highlight any media coverage.
  • Before/After Shots: Demonstrate your product’s impact.
  • Statistics: Bolster your message with data.
  • Bullet Points: Summarize key benefits succinctly.
  • Carousel Format: Combine top-performing images into a carousel, like a dynamic creative testing campaign. For instance, Arrae utilized multiple formats in one ad creative.
  • Headline Bar: Experiment with different headlines.

Quick Swaps:

  • Image Swap: Try different environments or subjects.
  • Message Swap: Explore various human desires or pain points.

Video Iterations:

  • Problem and Agitator Swap: Maintain the format but experiment with different problems and agitators.
  • People Swap: Use the same script but with different individuals.
  • Hook Swap: Experiment with greenscreens, TikTok response bubbles, or split screens.
  • Texture Play: Explore reverse loops or texture plays for an oddly satisfying effect.
  • Story Build: Expand upon the story, like Hexclad did, building on Gordon’s golden nugget review.
  • Audience Call-Out: Directly address your audience.
  • Pet Inclusion: Test the impact of adding a pet.
  • Post-It Note Ad with VO: Convert your best-performing script into a post-it note ad with voice-over.

8. Frequently Asked Questions


  1. What is the ideal number of creatives to test in one ad set?
     Answer: Ideally, test 3-5 different creative formats within each ad set. This variety allows for a comprehensive comparison without overwhelming the ad set.
  2. How long should I run my creative test for reliable results?
     Answer: Run the test until each creative accumulates 5,000-8,000 impressions, or for 4-5 days. This ensures enough data for a valid comparison.
  3. What common mistakes should I avoid in creative testing?
     Answer: Avoid using too many similar creatives, not allocating a sufficient budget, ending tests too early, and not considering audience-specific responses.
  4. How should I adjust my budget during the testing phase?
     Answer: Start with a smaller budget, then adjust based on ad performance. Ensure each ad set has enough budget to reach the desired impressions and clicks.
  5. What should I do if my test results are inconclusive?
     Answer: If results are inconclusive, consider extending the testing period or testing with a different set of creatives or audiences.
  6. How can I prevent 'ad fatigue' in my campaigns?
     Answer: Regularly update and rotate creatives and test new ones to maintain audience engagement.
  7. Can I use the same testing methodology for different types of campaigns (e.g., brand awareness vs. conversion)?
     Answer: While the fundamental methodology can be applied across campaign types, the KPIs and creative strategies should be tailored to the specific goals of each campaign.

Still not convinced?

Read other Case Studies