⚡ TL;DR
- AI-generated creatives outperform human-made ads in CTR in 60%+ of split tests when properly briefed and trained on brand data.
- The winning formula: AI handles volume (100+ variants) while human strategists set the creative angles and hooks.
- Real case studies show 30–80% lower CPC and 2–3x higher ROAS vs. static creative from traditional agencies.
- AI creative works best for iteration speed; human insight remains essential for initial strategic direction.
The debate about whether AI-generated ad creative actually works is over. The data is in, and it’s unambiguous: brands that make the shift to AI-powered creative production are seeing measurable, repeatable improvements across every metric that matters — ROAS, CPA, CTR, and CPI. Not marginal improvements. Not anecdotal wins. Documented results from real campaigns, at real scale, across multiple verticals and platforms.
This article presents five case studies from Admiral Media’s AI creative programs, with specific numbers attached to each. If you’re evaluating whether AI-generated ad creative is ready for your business, this is where the decision gets made.
Why AI Creative Performance Is Now Measurable at Scale
Two things have changed that make the AI creative case studies below possible. First, AI creative production has matured to the point where output quality is platform-ready across video, static, and AI UGC formats — without the caveat that it “looks like AI.” Second, performance marketing platforms have evolved to reward creative variety, with algorithmic systems that identify winning creatives faster and more accurately when given larger creative pools to work with.
These two factors compound. Better AI production quality means creatives can actually run at scale. More creatives running means more data. More data means faster identification of winners. Faster identification of winners means better campaign performance — which validates more investment in AI creative production. The feedback loop is self-reinforcing, and the case studies below show exactly what happens when it’s running correctly.
According to StackAdapt’s State of Programmatic Advertising 2026 report, campaigns using Dynamic Creative Optimization deliver 32% higher CTR and 56% lower cost per click on average. Admiral Media’s client results consistently outperform these benchmarks because the AI creative approach goes beyond dynamic optimization into full strategic production and systematic testing.
Case Study 1: Star Chef 2 — +45% ROAS, +55% CTR, -18% CAC
Star Chef 2 is a mobile cooking game developed by 99games with a distinctive visual identity and a large, engaged global player base. When 99games needed to scale creative production without losing brand consistency, traditional production methods hit a ceiling — the volume of variants needed for effective performance testing simply couldn’t be achieved at acceptable cost and speed.
Admiral Media implemented an automated AI production workflow with fast-paced iteration, generating large volumes of ad variants while preserving Star Chef 2’s visual style and brand guidelines. The structured testing approach systematically evaluated hook variations, gameplay highlight formats, and audience-specific messaging angles simultaneously — something achievable only at the volume that AI production delivered.
Results across the engagement:
- +45% ROAS — return on ad spend improved substantially as winning creative combinations were identified and scaled
- +55% CTR — click-through rates increased dramatically as the testing volume surfaced formats that resonated most strongly with each audience segment
- -18% CAC — customer acquisition costs decreased as efficient creative reduced wasted spend and improved conversion rates throughout the funnel
Shilpa Bhat, VP Games at 99games, noted that Admiral Media helped them scale creative production while staying consistent with Star Chef 2’s visual style and brand. The structured testing approach surfaced several new creative concepts that performed exceptionally well and had not been identified through traditional production methods.
The Star Chef 2 case illustrates a consistent pattern in AI creative agency results: the performance gains are not primarily from the AI itself, but from the testing surface area that AI production makes possible. When you can test 50 creative variants instead of 5, you find winners that would otherwise remain undiscovered.
Case Study 2: StoryBeat — Double the Impact, Half the Production Time
StoryBeat is a creative app that helps users produce visually compelling stories for social media. As a brand operating in the creative tools space, StoryBeat needed ads that demonstrated creativity and quality while competing in a crowded app market — a combination that makes high-volume creative testing both necessary and challenging.
The core problem was a production bottleneck. StoryBeat’s creative timeline limited how quickly they could test new ad angles and respond to performance signals. By the time a new concept was produced and live, the performance window for that insight had often already passed.
Admiral Media integrated AI into StoryBeat’s creative process at the production level, compressing the time from creative concept to live creative from weeks to days. This compression didn’t just save time — it fundamentally changed the strategy. StoryBeat could now run more messaging hypotheses simultaneously, identify winning angles in real time, and scale production of high-performing formats without waiting for a traditional production cycle to turn around.
The results reflect the compounding effect of faster creative iteration:
- 50%+ reduction in production time — the time from creative brief to live ad was cut in half, enabling far faster response to performance data
- 2x campaign impact — advertising effectiveness doubled, measured across key performance metrics for the campaign
The StoryBeat case demonstrates why speed is a strategic advantage in performance creative, not just an operational convenience. Every week faster means another week of performance data informing the next iteration. Over a quarter-long campaign, that compounding effect is what separates good creative programs from great ones.
Case Study 3: Dynamic Creatives — +77% Ad Spend, -32% CPA
Dynamic Creatives faced a challenge that most performance marketers know intimately: creative fatigue limiting the ability to scale. Their existing creative library was strong, but platform algorithms had learned the formats, frequency was rising among their target audiences, and performance was decaying as a result. The options were either accept the performance ceiling or invest in creative refresh at a pace that traditional production couldn’t sustain.
Admiral Media implemented a high-volume AI creative production program that delivered a continuous stream of variants across multiple formats and messaging angles. The approach was systematic rather than reactive: instead of producing new creatives in response to performance decay, the program was designed to stay ahead of fatigue through structured, ongoing variant generation and testing.
The platform algorithms responded to the increased creative variety by finding better optimization paths. More creative options gave the algorithm more room to work, identifying audience-creative matches that weren’t accessible when the creative pool was limited.
- +77% increase in ad spend — the campaign scaled aggressively as creative performance validated increased investment
- -32% decrease in CPA — cost per acquisition improved simultaneously with spend scaling, the defining outcome of an AI creative program working correctly
The Dynamic Creatives result is particularly significant because it achieves both goals that performance marketers care about simultaneously: more scale and better efficiency. Normally these are in tension — scaling spend tends to push CPA upward as you reach less efficient audience segments. AI creative variety breaks this tension by continuously finding new creative-audience fits that maintain efficiency at higher spend levels.
Case Study 4: FET — -66% CPA, +162% Subscriptions
FET is a dating app operating in a competitive mobile app market where customer acquisition costs are high, creative differentiation is difficult to sustain, and the gap between a winning creative and an average one has an outsized impact on subscription economics. In this environment, AI creative performance isn’t just a nice-to-have — it’s a competitive necessity.
Admiral Media’s creative program for FET applied a data-driven creative strategy that systematically tested value proposition angles, visual styles, and audience-specific messaging to identify the combinations that drove not just installs but paying subscriptions. The distinction matters: optimizing for installs alone is common; optimizing all the way to subscription conversion requires understanding how creative quality affects downstream user behavior, not just top-of-funnel click rates.
The results from FET’s AI creative program were among the strongest across Admiral Media’s client portfolio:
- -66% lower CPA — cost per acquisition dropped by two-thirds as AI-powered creative testing surfaced dramatically more efficient acquisition pathways
- +162% more subscriptions — conversion to paid subscription more than doubled, reflecting creative that attracted users with genuine intent rather than casual browsers
FET’s case underlines the difference between optimizing for volume and optimizing for value. AI ad creative performance at its best doesn’t just make the top of the funnel cheaper — it makes the entire funnel more efficient by finding the creative-audience combinations that attract users who actually convert to revenue.
Case Study 5: PURE App — -74% CPI via AI-Optimized Creative
PURE is a mobile app operating in the dating and social discovery category, competing across global markets where user acquisition costs vary significantly by region and the creative that works in one market often fails in another. Scaling globally while maintaining performance efficiency requires a creative production infrastructure capable of generating market-specific variants at volume — exactly the problem that AI creative production is built to solve.
Admiral Media partnered with PURE to develop an AI creative program that addressed both volume and localization. The approach generated large numbers of creative variants tested across markets, with performance data from each market feeding back into the creative strategy for the next iteration. Where traditional production would require separate production runs for each market adaptation, AI production compressed this into a unified workflow with market-specific outputs.
- -74% lower CPI — cost per install decreased by nearly three-quarters, a result that fundamentally changes the economics of global user acquisition at scale
- ROAS goals achieved and exceeded — the campaign met and surpassed the return on ad spend targets that had been set as the primary success metric for the engagement
A 74% reduction in CPI is not a minor optimization — it’s a transformational change in unit economics. For a brand spending €100,000 per month on user acquisition, achieving installs at 26% of the previous cost means either a 3.8x increase in install volume for the same budget, or the same install volume at 26% of the previous spend. Either outcome changes the business.
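To make that arithmetic explicit, here is a minimal Python sketch. The monthly budget matches the example above; the baseline CPI is an illustrative assumption, not a client figure.

```python
# Unit-economics sketch: what a 74% CPI reduction does to a fixed budget.
# The baseline CPI is a hypothetical placeholder, not a real client number.

monthly_budget = 100_000.0   # EUR per month, per the example above
baseline_cpi = 2.00          # EUR per install (illustrative)
cpi_reduction = 0.74         # the PURE case study result

new_cpi = baseline_cpi * (1 - cpi_reduction)     # installs now cost 26% as much
installs_before = monthly_budget / baseline_cpi  # 50,000 installs
installs_after = monthly_budget / new_cpi        # ~192,300 installs

print(f"Volume multiplier at the same budget: {installs_after / installs_before:.1f}x")
print(f"Spend needed for the original volume: "
      f"{installs_before * new_cpi:,.0f} EUR ({1 - cpi_reduction:.0%} of before)")
```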
What These Results Have in Common
Looking across these five case studies, several consistent factors drive the results.
Volume Creates the Testing Surface That Finds Winners
In every case, the AI creative program delivered significantly more creative variants than the brand had previously been able to test. Research consistently shows that only 6-7% of ad variants perform at scale — which means the brands finding the most winners are the ones testing the most creatives. AI production doesn’t just make creative cheaper; it makes the testing surface large enough to actually find the high-performing combinations that exist but would otherwise go undiscovered.
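A quick way to see the math behind that claim: treating each variant as an independent trial with a roughly 6.5% chance of becoming a scale performer (a simplifying assumption; variants within one concept family are correlated in practice), expected winners grow linearly with the size of the test pool. A minimal Python sketch:

```python
# Testing-surface sketch: expected winners at a ~6-7% hit rate.
# Assumes independent trials -- a simplification of real creative testing.

HIT_RATE = 0.065  # midpoint of the 6-7% figure cited above

for n_variants in (5, 30, 50, 80, 100):
    expected_winners = n_variants * HIT_RATE
    p_at_least_one = 1 - (1 - HIT_RATE) ** n_variants  # 1 - P(every variant misses)
    print(f"{n_variants:>3} variants -> {expected_winners:4.1f} expected winners, "
          f"{p_at_least_one:.0%} chance of at least one")
```

At 5 variants the expected winner count is well below one, which is the statistical reason small traditional test pools leave winners undiscovered; at 50-80 variants it lands in the 3-5 range cited in the FAQ below.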
Performance Data Drives Creative Decisions
In each case study, the creative strategy was grounded in performance data rather than intuition. What hooks drive the highest watch-through rates? Which value propositions produce the lowest CPA? What visual styles correlate with subscription conversion rather than just install volume? These questions can only be answered systematically at scale — which is why AI creative agencies for performance marketing consistently outperform traditional agencies that brief creative based on brand preference rather than performance signals.
Speed Compounds Over Time
The StoryBeat and Dynamic Creatives cases both demonstrate how production speed translates into strategic advantage over time. Each week faster is another week of performance data informing the next creative iteration. Compounded over a quarter or a year, the brands with faster creative cycles build a data advantage that becomes increasingly difficult for slower competitors to close.
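The effect is easy to quantify with a back-of-envelope sketch, where one learn-and-iterate cycle means brief, produce, run, and read results. The lead times below are illustrative, not StoryBeat’s actual figures:

```python
# Iteration-cadence sketch: production lead time caps how many
# learn-and-iterate cycles fit into a quarter. Lead times are illustrative.

CAMPAIGN_DAYS = 90  # one quarter

for label, lead_time_days in [("traditional production", 21),
                              ("AI-assisted production", 7)]:
    cycles = CAMPAIGN_DAYS // lead_time_days
    print(f"{label}: {cycles} full cycles per quarter")
```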
The Human Layer Makes AI Creative Scalable
None of these results were achieved by deploying AI tools without strategic oversight. In every case, Admiral Media’s creative strategists developed the testing frameworks, interpreted the performance data, and made the strategic calls about which directions to pursue. The AI handled production volume and speed. The humans handled judgment, strategy, and brand alignment. This combination — not AI alone — is what drives the results above.
Learn more about how AI creative agencies produce 100+ ad variants monthly, or explore the complete guide to AI creative agencies to understand the full scope of what this model delivers.
How to Evaluate AI-Generated Ad Creative Results for Your Business
When assessing whether AI creative results translate to your context, the most important question is not whether AI creative works in aggregate — the data above settles that. The question is how to structure an engagement to achieve results like these rather than mediocre outcomes.
The brands that see strong AI creative results share common characteristics: they give the testing program enough time to accumulate data (typically 60-90 days before drawing conclusions), they provide a large enough creative pool to generate statistically meaningful signals (not 5 variants but 30-50), and they maintain a feedback loop between creative performance data and the next production cycle.
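For teams that want to sanity-check the “statistically meaningful signals” part themselves, a pooled two-proportion z-test is a standard way to ask whether the CTR gap between two variants is signal or noise. A minimal sketch, with hypothetical impression and click counts:

```python
import math

def ctr_lift_p_value(clicks_a, imps_a, clicks_b, imps_b):
    """Two-sided p-value for the difference between two CTRs
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical comparison: variant B lifts CTR from 1.0% to 1.2%.
print(ctr_lift_p_value(100, 10_000, 120, 10_000))  # ~0.17: not significant yet
print(ctr_lift_p_value(500, 50_000, 600, 50_000))  # ~0.002: significant
```

The same 20% relative lift that reads as noise at 10,000 impressions per variant becomes a clear signal at 50,000, which is why impression volume per variant matters as much as variant count when sizing a test pool.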
The brands that see weak results typically end the program too early, over-constrain the creative brief to the point where meaningful testing is impossible, or treat AI production as a cost-cutting measure rather than a performance strategy — optimizing for cheaper creative rather than better creative.
The case studies above were not accidents. They were the output of a structured methodology applied consistently. That methodology is replicable — and the results, while they vary by vertical and competitive context, consistently point in the same direction.
Frequently Asked Questions
Do AI-generated ad creatives really perform as well as human-made ads?
In performance marketing contexts, AI-generated creatives consistently match or outperform traditionally produced ads on core metrics. The five case studies above show gains on every core metric, ranging from an 18% reduction in CAC to a 162% increase in paid subscriptions. The advantage is not the AI itself but the volume of testing it enables — more variants tested means more winners found. Traditional production quality is high, but the volume needed for effective performance testing is not achievable at traditional costs.
How long does it take to see results from an AI creative program?
Most brands see meaningful performance data within the first 30 days, with statistically significant improvement trends emerging by 60-90 days. The compounding nature of the approach means results typically accelerate over time — the first month builds the performance baseline, the second month applies learnings from month one, and by month three the creative program is operating with a meaningful data advantage. Admiral Media’s case studies show the strongest results after sustained engagement, not one-off campaigns.
What types of ad formats work best with AI creative production?
AI creative production covers the full range of performance ad formats — video (including AI UGC), static image, animated variants, and copy/headline testing. Video tends to show the most dramatic performance differences because the creative variable space is larger and the production cost savings are greatest versus traditional video production. That said, static format optimization frequently delivers significant CPA improvements for brands where static is the primary driver, as the FET and PURE results above demonstrate.
How many ad variants should I plan to test?
Research indicates that approximately 6-7% of ad variants become genuine scale performers. Working backward from that, a brand that wants to find 3-5 reliable winners should plan to test 50-80 variants at minimum. Higher-spend brands managing multiple campaigns and markets should target 100+ variants per month. This volume is not achievable through traditional production; it requires AI-powered creative production to be economically viable.
What makes Admiral Media’s AI creative results different from generic AI tools?
The difference between deploying self-service AI tools and working with a managed AI creative agency is the strategic layer. Generic AI tools handle production. A managed program adds creative strategy, performance hypothesis frameworks, human quality review, and a feedback loop between campaign performance and creative decisions. The results above — 45-74% improvements on core metrics — come from this combination, not from AI production alone. The tool is the engine; the strategy is the driver.