The First Rule of Black Box AI is You Do Not Talk About Black Box AI
Giving Up Our Agency in the Age of AI Advertising
“Advertising has us chasing cars and clothes, working jobs we hate, so we can buy shit we don’t need. We are the middle children of history, raised by television to believe that someday we’ll be millionaires and movie stars and rock stars. But we won’t. And we’re slowly learning that fact. And we’re very, very pissed off.” – Tyler Durden
The irony of an advertising industry being manipulated by its own methods isn’t lost on me.
But that’s exactly what’s happening in the age of AI.
Because as an industry, we never question the premise. Or who is telling it to us. Instead, we reach for the easy button.
Meta CEO Mark Zuckerberg—who understands this tendency all too well—is offering advertisers his biggest promise yet: a customer vending machine.
“We’re going to get to a point where you’re a business, you come to us, you tell us what your objective is, you connect to your bank account, you don’t need any creative, you don’t need any targeting demographic, you don’t need any measurement except to be able to read the results that we spit out.”
Well, when you put it that way, who needs their agency anyway?
*************************
It’s easy to be convinced when you’re being told exactly what you want to hear. It’s so effective, in fact, there’s an entire industry built on this idea.
We are all too willing to buy into the premise when we like the sound of the promise.
In David Fincher’s 1999 film Fight Club, the narrator/protagonist, “Jack,” is depicted as an unwitting captive to consumerism—beholden not to the promise of actual success but merely the appearance of success. He accepts the premise without ever questioning whether it’s worth the price. Until a chance airplane encounter with soap salesman Tyler Durden:
TYLER: You know why they have oxygen masks on planes?
JACK: Supply oxygen?
TYLER: That’s a sharp answer. The oxygen gets you high. You're taking in giant, panicked breaths and, suddenly, you become euphoric and docile, and you accept your fate.
The advertising industry has bought into every recent promise—personalization at scale, perfectly measurable outcomes, cost-efficient media buys, AI-driven optimization—without ever considering the veracity of the premise. We are fed results that offer the appearance of success. Despite so much evidence pointing to the opposite.
And we fall for it again, and again, and again.
Editor’s Note: Fight Club (and Tyler Durden specifically) has been co-opted in recent years as a totem for the alt-right and embrace of toxic masculinity. I’ve always interpreted the film as the opposite—a strong rebuke of misogyny and other toxic aspects of our culture. Fortunately, David Fincher agrees with my interpretation: “It’s impossible for me to imagine that people don’t understand that Tyler Durden is a negative influence. People who can’t understand that, I don’t know how to respond and I don’t know how to help them… We didn’t make it for them, but people will see what they’re going to see in a Norman Rockwell painting, or [Picasso’s] Guernica.”
Principal-Based Buying: Giving Up Your Agency
Advertisers—sleepwalking through all of this—briefly awaken every few years to the cold reality: they’ve lost their grip on their agency, which has devised yet another new scheme to extract profits from them. In the 2010s, agency rebate scandals rocked the industry. Today, the issue is principal-based buying.
Principal-based buying is an arbitrage scheme where agencies buy ad inventory at a volume discount—in effect, becoming the principal—and sell it to their clients at a markup. The Association of National Advertisers (ANA), concerned over the practice’s opacity and potential conflicts of interest, urges marketers to demand transparency from their agencies.
Once the agency becomes the principal, its fundamental incentive structure has changed. Without inventory transparency, the advertiser can no longer trust that the media is being recommended in its best interest.
But look at the discount on those CPMs!
Surely that’s worth the cost of transparency… right?
*************************
The agency has the advantage of information asymmetry: it knows the advertiser’s objectives, the price the advertiser is willing to pay, and the cost of the media. By purchasing large quantities of inventory, the agency has maximal flexibility to distribute media in a way that optimizes its spread rather than the effectiveness of the campaign.
Jared Belsky, founder and CEO of Acadia, an independent agency, explains how a scenario like this might play out at agencies engaging in this practice:
“Let’s say there’s this 27-year-old media planner two days away from presenting her media plan to her client, Converse. And on her plan, she's got a cool digital-out-of-home [activation] to appear on basketball courts. And she spent all the time thinking about how ads could show up on a podcast dedicated to hip-hop culture. And she's worked out a really cool media buy with some influencers that are into hoops culture.
All of a sudden, her SVP or the chief investment officer comes in and says, ‘Where's the NBCU allocation? Where's the Warner Brothers allocation? Doesn't someone know that's how we make our money?’ A day later, the plan that was created with all this care has magically been changed to include NBCU and Warner Brothers and Meta because that’s where the margin is for the holding company.”
Why do advertisers allow this to happen? Belsky explains:
“First, a majority of clients don’t understand it—or they’re even being opted into something unbeknownst to them. Second, they’re being fed that it’s in their best interest. And third, the agency is linking a better CPM to a better outcome, even when there are a hundred arguments for why a higher CPM on CTV, for example, could be better than low-CPM tonnage on a major broadcast network. It creates this narrative that cheap is better.”
According to a survey from the ANA’s 2024 report The Acceleration of Principal Media, brands said the top benefit of principal-based buying is reduced cost. At the same time, they said the biggest problem was the potential conflict of interest it introduced, combined with the lack of transparency.
The problem with sacrificing pricing transparency for reduced media costs is its circular logic: without transparency, there’s no way to know whether the reduced costs are ultimately cost-effective. Meanwhile, the media they’re getting is not being selected to maximize performance for the brand—it’s being selected to maximize the agency’s margins.
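Belsky’s point that “cheap is better” is a false narrative can be made concrete with a back-of-the-envelope comparison. The numbers below are invented for illustration, but the arithmetic is the point: a lower CPM only wins if the audience converts well enough, and what actually matters is the cost per result.

```python
# Hypothetical numbers illustrating why a lower CPM is not
# automatically a better outcome: the metric that matters is
# the cost per conversion, not the cost per thousand impressions.

def cost_per_conversion(cpm: float, impressions: int, conversion_rate: float) -> float:
    """Media spend divided by conversions delivered."""
    spend = cpm * impressions / 1000
    conversions = impressions * conversion_rate
    return spend / conversions

# Low-CPM "tonnage" on a broad network vs. a pricier, better-matched buy,
# both spending the same $4,000.
tonnage = cost_per_conversion(cpm=4.0, impressions=1_000_000, conversion_rate=0.0005)
targeted = cost_per_conversion(cpm=20.0, impressions=200_000, conversion_rate=0.004)

print(f"$4 CPM tonnage buy:  ${tonnage:.2f} per conversion")   # $8.00
print(f"$20 CPM targeted buy: ${targeted:.2f} per conversion")  # $5.00
```

With these (made-up) conversion rates, the buy with a 5x higher CPM delivers conversions at a lower cost, which is exactly the kind of trade-off that the "better CPM = better outcome" narrative obscures.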
Black Box AI: Buying Into a False Premise
The agencies aren’t alone in playing this game. Ad platforms like Meta and Google have also turned obfuscation into a business model with their black box AI optimization tools.
And while they’ve long promised advertisers an easy button for more cost-efficient campaigns, their rhetoric has ratcheted up. Zuckerberg recently characterized Meta’s black box AI as “the ultimate business machine” that will be “one of the most important and valuable AI systems that gets built.”
This glossy veneer distracts advertisers from what’s really happening: their data is being extracted.
It’s a familiar playbook, and it has been surprisingly effective. Over the years, advertisers have willingly uploaded their CRMs (“custom audiences”), added the Meta pixel to their checkout pages, and implemented the Conversions API based on the promise of better business results.
Never mind that they’re giving Meta premium data on their most-likely-to-convert customers.
It shouldn’t come as a surprise that Meta is now getting into principal-based buying. After all, agencies are just another source of valuable data to feed the algorithms. If the long game is to disintermediate agencies, now is the perfect time to get into this racket.
Although you could argue that Meta’s black box AI optimization tool, Advantage+, already puts it in this racket. The same goes for Google Performance Max (PMax).
PMax and Advantage+ require the advertiser only to input a campaign objective and budget, and the black box algos go to work.
But there’s a catch, which Eric Seufert of MobileDevMemo calls a “Faustian bargain” in his article “The Advertiser’s Dilemma with PMax and Advantage+”:
“[T]hese so-called “black box” campaign optimization tools come at a cost: transparency. By design, the targeting options for these automated campaign types are limited: the tools perform best when given maximum agency for exploration, and so they limit the controls that advertisers can exert for targeting.
…
But it’s not just audience targeting that these tools restrict: they also restrict placement targeting. And further, they also mostly obscure audience-level and placement-level performance in reporting.”
Pooling the inventory is essential to the scheme. PMax campaigns can run on all of Google’s inventory, including Search, YouTube, and the Display Network, while Advantage+ campaigns can run on Facebook, Instagram, or Meta Audience Network. The advertiser gets limited-to-no post-campaign visibility into which audiences were reached and where the placements ran—just the results.
This creates a misalignment of incentives, as Seufert explains:
“A ‘black box’ automation tool need only meet the advertiser’s performance goals. In this way, the advertiser and the platform have incentives that are only tangentially, obliquely aligned. The advertiser’s goal is to maximize ROAS within some budget constraints. The platform’s goal is to maximize the advertiser’s spend at whatever performance standard keeps them spending. When the advertiser has total control over targeting, the platform must deliver performance objectives that conform to the advertiser’s natural inclination to maximize ROAS for every targeting segment. When the advertiser surrenders that control to the platform, the platform needs only to achieve the budget-level performance standard established by the advertiser — not to maximize ROAS for every targeting segment and placement it experiments with.”
If this sounds familiar, that’s because it’s a direct parallel to principal-based buying.
1. Extract data from the advertiser about its customers and desired campaign outcomes.
2. Commoditize the inventory while limiting advertisers’ ability to select specific placements.
3. Distribute the inventory non-transparently to meet the advertiser’s desired outcome.
4. Profit from the surplus margin via data-driven arbitrage of the inventory.
The result: the advertiser leaves value on the table from inventory that could’ve maximized their outcomes if the platform were incentivized to do so. This scheme only works when advertisers don’t require transparency. Information asymmetry is the fundamental driver of margin.
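The misalignment Seufert describes can be sketched as a toy model. Everything here is invented for illustration (the segment names, ROAS figures, and margins are assumptions, not real platform data), but it shows the mechanism: a black box only has to clear the advertiser’s aggregate target, so it can dilute high-performing inventory with high-margin junk and still “succeed.”

```python
# Toy model of the black-box incentive gap: the platform must hit
# the advertiser's budget-level ROAS target, not maximize ROAS.
# All segments, ROAS values, and margins are hypothetical.

TARGET_ROAS = 3.0
BUDGET = 10_000.0

# (segment, ROAS the platform can deliver there, platform margin per $ spent)
inventory = [
    ("prime_feed",       5.0, 0.20),
    ("mid_tier",         3.0, 0.40),
    ("audience_network", 1.0, 0.70),  # cheap, low quality, high margin
]

def blended(alloc: dict) -> tuple:
    """Return (blended ROAS, platform margin) for a budget allocation."""
    spend = sum(alloc.values())
    revenue = sum(alloc[seg] * roas for seg, roas, _ in inventory)
    margin = sum(alloc[seg] * m for seg, _, m in inventory)
    return revenue / spend, margin

# Advertiser-optimal: the whole budget in the best-performing segment.
best = {"prime_feed": BUDGET, "mid_tier": 0.0, "audience_network": 0.0}

# Platform-optimal under a black box: dilute with high-margin inventory
# until the blend only just clears the advertiser's target.
mix = {"prime_feed": 5_000.0, "mid_tier": 0.0, "audience_network": 5_000.0}

roas_b, margin_b = blended(best)
roas_m, margin_m = blended(mix)
print(f"advertiser-optimal: ROAS {roas_b:.1f}, platform margin ${margin_b:,.0f}")
print(f"black-box mix:      ROAS {roas_m:.1f}, platform margin ${margin_m:,.0f}")
```

In this sketch the diluted mix hits the 3.0 target exactly while more than doubling the platform’s margin; the advertiser never sees the segment-level breakdown, so the value left on the table is invisible.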
Results are promised in exchange for advertisers’ surrendering control to the platform. But the promise is just an illusion. The black box enables the platform to steer impressions toward sub-prime inventory from their audience networks. Much of this traffic (probably most of it) is invalid traffic (IVT), as noted ad fraud researcher Augustine Fou has previously documented.
But the platform’s sleight of hand focuses the advertiser’s attention on the performance—clicks, conversions, and ROAS—all of which can be easily manipulated by bot traffic.
Many advertisers remain in the dark about the quality of the inventory they’re getting when they give up control over ad delivery. They think they’re reaching users on the primary domains and apps, but the reality is a much different picture when the audience networks are mixed in.
The platforms are supposed to work on behalf of advertisers—or at least that’s what they tell them. Yet they refuse to give advertisers any agency in the process.
Agentic AI: Giving Up Your Self
Now Big Tech is using billions in profit from black box tools to invest in their pursuit of AI supremacy. The next front is agentic AI, where the ambitions get even bigger. And it’s not just Meta and Google, but Sam Altman’s OpenAI and Elon Musk’s xAI entering the fray.
Earlier this year, Zuckerberg promised that highly intelligent and personalized AI assistants—tailored to the individual’s context, interests, and personality—will reach “more than 1 billion people, and I expect Meta AI to be that leading AI assistant.”
It’s not only individuals but businesses that will benefit.
“I think we're going to live in a world where there are going to be hundreds of millions or billions of different AI agents eventually, probably more AI agents than there are people in the world... Every business in the future, just like they have an email address, a website, and a social media presence today, is going to have an AI agent that their customers can talk to in the future.”
Maybe the only person more ambitious than Zuckerberg is Altman, who sees agentic AI as merely a step towards a “glorious future” of superintelligence.
“We believe that, in 2025, we may see the first AI agents ‘join the workforce’ and materially change the output of companies. We continue to believe that iteratively putting great tools in the hands of people leads to great, broadly-distributed outcomes.”
One industry Altman expects to undergo material change is advertising:
“It will mean that 95% of what marketers use agencies, strategists, and creative professionals for today will easily, nearly instantly and at almost no cost be handled by the AI... all free, instant, and nearly perfect. Images, videos, campaign ideas? No problem.”
Where will the creative come from? The picked-over carcasses of once-proud publishers and ad agencies.
Extract. Commoditize. Distribute. Profit.
In this instance, the businesses in the media and advertising value chain are the victims.
But agentic AI is also coming for you. And its appetite is even more voracious.
My recent attempt to download Tactiq, a virtual meeting AI transcription tool, hinted at what’s on the horizon.
“It could: Read and change all your data on all websites.”
A data extraction tool, powered by OpenAI, sold to you with the promise of efficiency. “It will save you time, hassle, and money! It will make your life easier!”, they proclaim.
*************************
“Lye—the crucial ingredient,” said Tyler, describing the origins of the soap he fashioned from the stolen waste of liposuction clinics. “Ancient peoples found their clothes got cleaner if they washed them at a certain spot in the river. Why? Because human sacrifices were once made on the hills above this river. Year after year, bodies burnt. Rain fell. Water seeped through the wood ashes to become lye. The lye combined with the melted fat of the bodies till a thick white soapy discharge crept into the river.”
“Tyler sold his soap to department stores at $20 a bar,” explained Jack. “Lord knows what they charged. It was beautiful. We were selling rich women their own fat asses back to them.”
Extract. Commoditize. Distribute. Profit.
In the AI arms race, you can never have too much data. So, the promises must get even bigger.
According to Zuckerberg, in the future all facets of our lives will be intermediated by AI agents. They’ll be your personal assistants, your friends, your trainers, even your therapists.
“I think people are going to want a system that knows them well and that kind of understands them in the way that their feed algorithms do,” Zuckerberg said during a recent interview.
All you need to do is give the AI access to your data. Your financial records, your employment records, your health records. Tell it your motivations, your desires, your fears.
All to be thrown into the black box and fed back to you as “helpful” recommendations from a friendly agent.
What could possibly go wrong?
Question the Premise
“If you don't know what you want, you end up with a lot you don't.”
What is it about the promise of an easy button that convinces everyone to reach for the snooze button?
This moment is a wake-up call for the advertising industry. And it’s a wake-up call for individuals.
We must question the premise of black box AI.
Does it really promise us a “glorious future”? Is it a future we even want? What are we willing to give up in exchange for it?
All these companies ask of us is to give up our agency.
But as Altman and Zuckerberg are quick to point out: your agency is no longer needed.