
This past June, Meta set off a bomb in the marketing world when it announced that it would fully automate the advertising on its platforms by 2026. People in advertising wondered: Is this the end of ad agencies as we know them? Has the AI “slopification” of social media finally been fully realized?
The hyperbolic reaction is understandable—maybe even justified. With 3.43 billion unique active users across its platforms around the world, and an advertising machine that brought in $47.5 billion in Q2 sales alone (up 22% over last year), Meta is a reliable bellwether for where the ad business is heading.
Meta has been working for years to build a machine that is already pretty damn close to automating its entire ad system, from creative concept generation to determining whose eyeballs see the final product. Its current capabilities are good enough to give most advertising creatives the flop sweats. But now is not the time for marketers to cower in fear.
The opposite, actually.
This is a great moment for marketers to face head-on how Meta views its relationship with creatives, agencies, and brands as it continues to roll out new technologies and features. To help, we asked Meta ad execs to break down their strategy.
Below is a detailed explainer to help you understand how Meta is thinking about its role in the advertising space, and what brands, agencies, and even consumers can do to better prepare themselves for what’s to come.
In this premium piece, you’ll learn:
- What Meta’s new AI advertising tools are and how they work, straight from the people creating them
- Why agencies will always be a part of Meta’s advertising equation
- Which tools are turbocharging growth for marketers, according to Helen Ma, Meta’s VP of product management (GenAI ad formats, video growth, creative diversification)
Five key breakthroughs
Earlier this month, Meta announced a slew of new features for its AI-powered ad platform, including virtual try-on tech, AI-generated video for advertisers, and generative CTA (call to action) stickers to replace the common “Buy Now” button. But to understand the significance of the new tools, it’s important to step back for a moment and dig into the technology infrastructure that powers Meta’s advertising system.
Over the past few years, Meta has systematically rebuilt its entire ad infrastructure around AI. Each innovation builds on previous advances, creating compounding improvements in the effectiveness of ads on its platforms. (Take haircare brand Living Proof, for example, which saw an 18% boost in purchases after using Meta’s generative AI feature for ad creative, compared to using its usual campaign strategy and creative.)
This all-in-one-place approach to marketing tools reduces the operational burden for advertisers while increasing their dependence on Meta’s systems. The goal for Meta is to be as embedded as possible in a brand’s overall marketing operation.
Matt Steiner, Meta’s VP of monetization infrastructure for ranking and AI foundations, says there are essentially five key technological breakthroughs that underpin Meta’s AI advertising platforms. The focus is on automating and optimizing every part of the advertising process, from creative generation to targeting and performance measurement. Here’s what you need to know:
Advantage Plus Shopping Campaigns: The Automated Ad Manager
Instead of advertisers needing large teams to constantly monitor their ads, analyze spreadsheets, and manually decide when to increase or decrease spending, Meta introduced Advantage Plus in 2022 to let machine learning models do the heavy lifting. The AI constantly monitors which campaigns and audiences are performing well, then automatically reallocates budget and adjusts bid strategies around the clock to maximize results. “I think the key innovation that drives it is that machine learning models don’t get tired,” Steiner says.
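To picture what that kind of automation looks like in practice, here is a bare-bones sketch—purely illustrative, not Meta’s actual system—of a routine that shifts a fixed daily budget toward whichever campaigns are converting most efficiently. Every campaign name and number in it is hypothetical.

```python
# Illustrative sketch only -- not Advantage Plus or any Meta code.
# Reallocates a fixed daily budget toward campaigns with the best
# conversions per dollar, the kind of decision the system automates 24/7.

DAILY_BUDGET = 10_000.0  # hypothetical total daily spend, in dollars

# Hypothetical recent performance per campaign: spend and conversions.
campaigns = {
    "spring_sale":  {"spend": 3_000.0, "conversions": 150},
    "new_arrivals": {"spend": 4_000.0, "conversions": 120},
    "retargeting":  {"spend": 3_000.0, "conversions": 300},
}

def reallocate(campaigns, total_budget):
    """Give each campaign a budget share proportional to its conversions
    per dollar spent (a crude stand-in for an ML model's predicted return)."""
    efficiency = {
        name: stats["conversions"] / stats["spend"]
        for name, stats in campaigns.items()
    }
    total_eff = sum(efficiency.values())
    return {name: total_budget * eff / total_eff
            for name, eff in efficiency.items()}

if __name__ == "__main__":
    for name, budget in reallocate(campaigns, DAILY_BUDGET).items():
        print(f"{name}: ${budget:,.0f} for tomorrow")
```

A real system would also adjust bids and fold in predictions about future performance, but the core loop—measure, compare, shift money—is the part that never needs to sleep.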
He notes that Advantage Plus was key to Meta’s ad business when Apple introduced anti-tracking changes for iPhone users. Historically, Meta could track whether ads you saw on its platforms ultimately led to a purchase elsewhere, and the anti-tracking change cut off its lifeline to that information. Meta bypassed the blockage by using transfer learning and by combining its own app data with advertisers’ first-party data about who visits their sites and makes purchases.
Meta Lattice: The “Shared Knowledge” System
This is a deep machine learning technology that allows different AI models to learn from each other. Traditionally, Meta had separate AI models predicting different user behaviors. For example, one model would predict who would click on an ad, while another would predict who would actually buy the product. Announced in 2023, Lattice utilizes transfer learning, which allows these models to share knowledge: what a model learns on one task, such as predicting clicks, can be reused to improve its performance on a related task, such as predicting purchases.
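As a rough illustration of the transfer learning idea—not Lattice itself—here is a minimal PyTorch sketch: a shared network “body” learns from plentiful click data, then a small new “head” reuses that knowledge to predict purchases from far less data. All of the data, dimensions, and labels are made up.

```python
# Illustrative transfer-learning sketch (PyTorch) -- not Meta Lattice.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical user/ad feature vectors and labels.
X_clicks = torch.randn(1_000, 16)             # plenty of click data
y_clicks = (X_clicks[:, 0] > 0).float()       # toy click signal
X_buys = torch.randn(100, 16)                 # far less purchase data
y_buys = (X_buys[:, 0] + X_buys[:, 1] > 0).float()

body = nn.Sequential(nn.Linear(16, 32), nn.ReLU())   # shared representation
click_head = nn.Linear(32, 1)
buy_head = nn.Linear(32, 1)
loss_fn = nn.BCEWithLogitsLoss()

# 1) Train the body + click head on the abundant click data.
opt = torch.optim.Adam(
    list(body.parameters()) + list(click_head.parameters()), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(click_head(body(X_clicks)).squeeze(1), y_clicks)
    loss.backward()
    opt.step()

# 2) Freeze the body and train only the purchase head on the scarce
#    purchase data: knowledge from the click task transfers over.
for p in body.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(buy_head.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(buy_head(body(X_buys)).squeeze(1), y_buys)
    loss.backward()
    opt.step()

print("final purchase-prediction loss:", round(loss.item(), 4))
```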
Generative AI for Ads Creative: The Automatic Ad Designer
This set of tools, originally introduced in May 2024, automatically creates variations of a brand’s ad content—rewriting ad text and swapping image backgrounds—and can also generate entirely new images from scratch. It then optimizes them to look good and perform well on Meta platforms. This saves advertisers time by allowing them to test and learn what consumers respond to much faster than a human team could.
“Humans are best at coming up with novel ideas,” says Steiner. “They’re not really good at thinking of all variations of the word buy or sale, and that’s not something people are really excited to do. So with machine learning models to automate that, they can spend their time doing the things that are uniquely human-skilled, like coming up with new ideas and really understanding why a campaign will resonate with people—things that are not really automatable today.”
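Here’s a toy sketch of the underlying idea: generate many variations of a line of ad copy, then let a simple test converge on the winner. It’s illustrative only—made-up copy and made-up click-through rates—not Meta’s tool.

```python
# Illustrative sketch of automated creative variation + testing.
import itertools
import random

random.seed(0)

# Hypothetical template pieces a generative system might permute.
verbs = ["Shop", "Grab", "Discover"]
offers = ["20% off", "free shipping", "the new collection"]
variants = [f"{v} {o} today" for v, o in itertools.product(verbs, offers)]

# Pretend "true" click-through rates, unknown to the optimizer.
true_ctr = {v: random.uniform(0.01, 0.05) for v in variants}

def observed_ctr(v):
    return clicks[v] / shows[v] if shows[v] else 0.0

# Simple epsilon-greedy test: keep showing variants, favor the winners.
shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}
for _ in range(20_000):
    if random.random() < 0.1:                 # explore occasionally
        v = random.choice(variants)
    else:                                     # exploit the best so far
        v = max(variants, key=observed_ctr)
    shows[v] += 1
    clicks[v] += random.random() < true_ctr[v]

print("winning variant:", max(variants, key=observed_ctr))
```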
Andromeda: The High-Speed Ad Finder
The goal of all Meta advertising is to match the right brand’s ad to the right person at the time that person is most likely to click (and buy). Thanks to the new AI ad tools rolled out over the past year, the number of ads available in Meta’s system increased rapidly. Within a month of launching its first AI tools in 2024, more than a million advertisers used Meta’s generative AI tools to create more than 15 million ads. This essentially clogged the system and made it harder and slower for Meta to search through all those ads to find the few that might be relevant to any particular user.
In December 2024, Meta introduced Andromeda, a massive technical and hardware upgrade to Meta’s backend infrastructure that gives it up to 10,000 times more computing power. Codesigned with Meta Training and Inference Accelerator (MTIA) and Nvidia’s Grace Hopper Superchip, Andromeda allows Meta’s system to handle the massive increase in demand for computing power from all the ads being created using its generative AI tools.
Steiner says the result has been a dramatic improvement in the selection of relevant ads, increasing the likelihood of people finding a useful ad and ultimately driving up conversions for advertisers. According to the company, so far it has boosted conversions on Facebook mobile Feed and Reels by 4%.
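Conceptually, retrieval systems like this place ads and users in a shared vector space and pull back only the closest matches instead of scoring every ad individually. The sketch below shows that basic pattern with NumPy on made-up embeddings; it is not Andromeda, which layers specialized indexes and custom hardware on top.

```python
# Illustrative sketch of large-scale ad retrieval -- not Andromeda.
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_ADS, TOP_K = 64, 200_000, 5

# Hypothetical embeddings: one vector per ad, one for the current user.
ad_embeddings = rng.standard_normal((NUM_ADS, DIM)).astype(np.float32)
user_embedding = rng.standard_normal(DIM).astype(np.float32)

# One matrix-vector product scores every ad at once; production systems
# add approximate-nearest-neighbor indexes so they never touch most ads.
scores = ad_embeddings @ user_embedding
top_k = np.argpartition(scores, -TOP_K)[-TOP_K:]
top_k = top_k[np.argsort(scores[top_k])[::-1]]   # sort best-first

print("candidate ad ids:", top_k.tolist())
```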
Generative Ads Recommendation Model (GEM): The Customer Map
Introduced in April, GEM is a new AI model architecture for deciding which ad to show you, based on predictions of your future behavior. Just as an LLM uses sequence learning to predict the next item in a sequence, GEM does the same for ads.
Instead of just predicting whether you’ll click on the next single ad, GEM tracks your entire history of ad interactions and purchases. This allows the model to recognize that you might be on several different, parallel “purchase journeys” simultaneously and react accordingly. The company says these improvements increased ad conversions by approximately 5% on Instagram and 3% on Facebook Feed and Reels in Q2.
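To make the analogy concrete, here is a small PyTorch sketch of next-interaction prediction: it treats a user’s ad events like words in a sentence and trains a model to guess what comes next. It is a toy stand-in for the idea, not GEM’s architecture, and the events and sequences are invented.

```python
# Illustrative next-interaction sketch (PyTorch) -- not Meta's GEM.
import torch
import torch.nn as nn

torch.manual_seed(0)

EVENTS = ["view_shoe_ad", "click_shoe_ad", "view_tent_ad",
          "click_tent_ad", "buy_shoes", "buy_tent"]
idx = {e: i for i, e in enumerate(EVENTS)}

# Hypothetical interaction histories with two parallel "purchase journeys".
sequences = [
    ["view_shoe_ad", "click_shoe_ad", "view_tent_ad", "buy_shoes"],
    ["view_tent_ad", "view_shoe_ad", "click_tent_ad", "buy_tent"],
] * 30

class NextEventModel(nn.Module):
    def __init__(self, vocab, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h)   # logits for the next event at every position

model = NextEventModel(len(EVENTS))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(30):
    for seq in sequences:
        ids = torch.tensor([[idx[e] for e in seq]])
        logits = model(ids[:, :-1])              # predict each next event
        loss = loss_fn(logits.reshape(-1, len(EVENTS)), ids[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

# Given a partial history, which interaction is most likely next?
history = torch.tensor([[idx["view_shoe_ad"], idx["click_shoe_ad"]]])
print("predicted next event:", EVENTS[model(history)[0, -1].argmax().item()])
```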
Ad feeds of the future
This new backbone technology powers all the ads you see, but it’s more or less invisible. Here’s what Meta is betting on to get you buying more across Instagram, Facebook, and WhatsApp:
Virtual try-on:
This is exactly what it sounds like. Meta is now testing with select advertisers the ability to see how clothing featured in an ad looks on them after they upload a photo of themselves.
AI Sticker CTA (call to action):
Most of the time you see an ad on Instagram, there’s a generic “Shop Now” button at the bottom. Now brands are going to be using custom AI-generated stickers that could be a product photo or a logo graphic to add a bit more flair. “We’re seeing something like 50% to 200% higher click-through rates on these AI-generated CTA stickers, because they’re fun and visually appealing and bring the product to life,” says Helen Ma, Meta’s VP of product management.
Previously announced at Cannes, this visual enhancement is now available to more advertisers globally for Facebook Stories, and is in testing for Facebook Reels as well as Instagram Stories and Reels.
Creative generation upgrades:
Meta rolled out two notable updates to its generative AI tool kit. The first is an AI-generated music feature that understands the content of an ad and produces unique, custom music that reflects the product, style, and sentiment a brand wants to convey. The update will also include AI dubbing for international and multilingual audiences.
The other is what Meta calls “persona-based image generation,” which helps advertisers further personalize ads for different customers by changing the look and feel of an ad to fit specific audiences. If you’re selling headphones, it can create one image that focuses on style for a fashion angle, one that highlights sound quality for audiophiles, and another that emphasizes comfort for travelers, all from the same product image.
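If you’re wondering what “persona-based” could look like under the hood, one plausible version is simply persona-specific prompts fed to an image model. The snippet below sketches that idea; the product, personas, and the generate_image function are all placeholders, not Meta’s tool or API.

```python
# Illustrative sketch of persona-based prompt construction.
# The "image generator" here is a stub, not a real model or Meta API.
PERSONAS = {
    "fashion":     "styled flat-lay, bold colors, streetwear vibe",
    "audiophiles": "close-up on drivers and materials, studio lighting",
    "travelers":   "worn on a long-haul flight, soft cabin lighting",
}

def generate_image(prompt: str) -> str:
    """Placeholder for any text-to-image model; returns a fake filename."""
    return f"render_{abs(hash(prompt)) % 10_000}.png"

product = "NovaSound wireless headphones"   # hypothetical product
for persona, style in PERSONAS.items():
    prompt = f"Ad image of {product}, {style}, aimed at {persona}"
    print(persona, "->", generate_image(prompt))
```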
Facebook creator discovery API:
This makes it easier for brands to find creators on Facebook by allowing them and third-party partners (like agencies) to search for creators using keywords. It also helps agencies and brands explore creator insights like audience demographics and average engagement rate to find the best match.
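Functionally, that kind of discovery boils down to keyword matching plus filtering and ranking on creator insights. The snippet below sketches the idea over invented creator records; it does not call Meta’s actual API.

```python
# Illustrative sketch of keyword-based creator discovery over made-up data.
creators = [
    {"name": "trailchef", "keywords": ["camping", "cooking"],
     "followers": 120_000, "avg_engagement_rate": 0.042},
    {"name": "citybaker", "keywords": ["baking", "desserts"],
     "followers": 450_000, "avg_engagement_rate": 0.018},
    {"name": "peakrunner", "keywords": ["trail running", "camping"],
     "followers": 80_000, "avg_engagement_rate": 0.061},
]

def find_creators(keyword, min_engagement=0.03):
    """Return creators matching a keyword, highest engagement first."""
    matches = [c for c in creators
               if keyword in c["keywords"]
               and c["avg_engagement_rate"] >= min_engagement]
    return sorted(matches,
                  key=lambda c: c["avg_engagement_rate"], reverse=True)

for c in find_creators("camping"):
    print(c["name"], f'{c["avg_engagement_rate"]:.1%}')
```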
Meta AI assistant-informed ads:
The Meta AI digital assistant has more than 1 billion monthly active users and is available as a stand-alone app, as well as across Meta’s other apps: Facebook, Instagram, WhatsApp, and Messenger. Starting December 16, the company will use people’s interactions with the AI assistant to inform the ads and other content it shows them.
Krassimir Karamfilov, Meta Advertising’s VP of product, says that the number of Meta platforms, combined with billions of users, makes it impossible for individual marketers to get the most out of their ads without the help of tools like this.
“It’s just impossible to manually test all the potential variants, so this is why AI is just making it easier to experiment efficiently and then home in on what works,” Karamfilov says.
He knows that some advertisers have expressed concerns over a lack of control, but he counters that ads perform better when they’re not limited to the brand’s initial parameters. “We see a lot of suboptimal usage of our products,” he says. “What we’re doing is all about aligning our systems to the way the advertisers measure value.”
Enter the AI Concierge
Meta isn’t stopping at the ads in your feed. It sees a bigger business opportunity in helping brands—especially small and midsize businesses—utilize AI agents in their own business operations like customer service.
Earlier this month, the company launched Business AI, which acts as a sales concierge to help take a consumer from an ad in a Meta feed all the way to purchasing a product. It works as a personalized AI agent within Meta ads and messaging threads inside Meta platforms, and can even extend to a brand’s own website.
Clara Shih, VP of Business AI, says Meta’s clients were asking for help beyond the advertising side. “Our customers have said, ‘We want AI to not only help with product discovery and generating leads, help us all the way to closure, help us with our business operations, help us with customer support questions,’” Shih says.
A recent MIT study reported that 95% of enterprise generative AI pilots fail to deliver measurable business impact, despite billions of dollars invested collectively. Shih says Business AI takes the infrastructure burden off companies so they can actually see that impact.
“It’s just very hard, and a lot of companies don’t have big machine learning and AI teams where they can piece all these things together,” she says. “So something else that’s been really important to us is creating something that’s easy to set up and maintain.”
The benefit to brands is that a Meta-powered AI chatbot doesn’t have to learn about a brand from scratch, because so many businesses have been active on Meta for years. Shih says all of their past ads and social posts are a gold mine of tacit knowledge about a business within the Meta universe, giving the Business AI chatbot a lot of information to work with from the start.
“They don’t have to hire a consultant and pay millions of dollars to set up their chatbot. We could just look at what they’ve said and what they’ve done and what their brand is all about,” she says. “And just by using LLMs to mine that information, we’ve been able to create the world’s first turnkey business agent that just works because it’s them. It’s all based on what they’ve done.”
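One plausible way to picture that mining step: retrieve the most relevant past posts for a customer’s question and hand them to a language model as context. The sketch below shows only the retrieval half, with invented posts and a simple word-overlap ranking standing in for the embeddings and LLM a real system would use; it is not Meta’s Business AI.

```python
# Illustrative sketch of mining a brand's past posts to ground a chatbot.
past_posts = [  # hypothetical history of a brand's posts and ad copy
    "Our canvas totes are handmade in Portland and ship within 2 days.",
    "Holiday sale: 20% off all backpacks through December 24.",
    "Every bag comes with a lifetime repair guarantee.",
]

def retrieve(question: str, posts, k: int = 2):
    """Rank past posts by word overlap with the question (a real system
    would use embeddings and pass the results to an LLM as context)."""
    q_words = set(question.lower().split())
    scored = sorted(posts,
                    key=lambda p: len(q_words & set(p.lower().split())),
                    reverse=True)
    return scored[:k]

question = "Do your bags come with any guarantee?"
print("context for the assistant:", retrieve(question, past_posts))
```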
It’s all very fun and convenient in the short term. And Meta’s recent earnings prove that it’s working. Second-quarter ad sales hit more than $47 billion, a 22% increase year over year. Every executive I spoke to emphasized that these are tools for humans to use, and that the company’s relationships with agencies and creatives are crucial to any of this working—at least for now.
Just keep in mind that the end game here is still full automation. “Mark [Zuckerberg] has talked about how in the future, the dream state for a business is to come to Meta, share their product catalog, share their business outcomes, and then we can automate the rest through a combination of Advantage Plus AI features, all as a business agent,” Shih says. “And we are getting closer and closer to that.”