What 2025 Actually Taught Us About AI in Marketing (and What to Do Differently in 2026)

A year ago, most marketing teams were experimenting with AI. Trying ChatGPT for drafts. Testing an AI tool for social scheduling. Dipping a toe in. By December 2025, the experiments are over. AI is embedded in how marketing teams work — 91% of marketers now report using AI regularly, up from 63% the year before. The tools improved dramatically. The adoption curve steepened. And several things happened that nobody fully anticipated.

Looking back over my own work with founders and marketing teams this year, and at the broader industry shifts, I can see five lessons that matter more than any individual tool or trend. These aren't retrospective observations for their own sake. They're the things that should change how you plan, hire, and invest in 2026.

Lesson 1: The search traffic model broke — and most marketers are still pretending it didn't

This is the one that caught the industry off guard, even though the warning signs were there. Google AI Overviews rolled out broadly in mid-2025, and the impact on organic click-through rates was severe. Studies throughout the year showed CTR declines of 34% to 61% on queries where AI summaries appeared. For informational and educational content — the bread and butter of most content marketing strategies — the drop was even steeper.

HubSpot, widely regarded as running one of the best SEO operations in the industry, saw its organic traffic roughly halve. Their CEO acknowledged it publicly: organic search traffic is declining globally, and AI overviews are giving answers that reduce click-through. If HubSpot couldn't avoid it, the rest of us need to take this seriously.

Meanwhile, ChatGPT passed 800 million weekly active users. Perplexity processed 780 million queries in a single month. These platforms are increasingly where buyers start their research — and they don't send traffic the way Google used to. They answer the question inside the interface and occasionally cite a source. The click is disappearing.

For the marketing teams I've been working with, this was the year the penny dropped. The content strategies that drove growth for the past decade — produce high-volume blog content, optimise for keywords, capture traffic, convert through forms — are generating diminishing returns. Not because the content is worse, but because the distribution mechanism has fundamentally changed.

What to do differently in 2026: Stop measuring content success primarily by traffic. Start measuring by citation, visibility, and share of voice in AI-generated responses. Audit how your brand appears in ChatGPT, Perplexity, and Gemini. Structure your content for AI readability — schema markup, clear FAQ formatting, authoritative sourcing. This is what people are starting to call Generative Engine Optimisation (GEO) and Answer Engine Optimisation (AEO), and it should be a priority for every marketing team heading into the new year.
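To make the "structure your content for AI readability" point concrete, here's a minimal sketch of FAQPage structured data (schema.org) generated in Python. The questions and answers are placeholders, not a recommendation for your specific content; the resulting JSON-LD would be embedded in the page's head.

```python
import json

# Minimal FAQPage structured data (schema.org) -- the kind of markup
# that helps answer engines and AI assistants parse Q&A content.
# The question and answer below are placeholders.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is Generative Engine Optimisation (GEO)?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Structuring content so AI systems can parse and cite it accurately.",
            },
        }
    ],
}

# Embed the output as <script type="application/ld+json"> in the page head.
print(json.dumps(faq_schema, indent=2))
```

The same pattern extends to Article, HowTo, and Organization markup — anything that turns prose into machine-readable claims an AI model can attribute to you.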

Lesson 2: AI went from content tool to operating system — faster than anyone expected

At the start of 2025, most marketing teams were using AI for content creation. Writing drafts. Generating social posts. Producing email subject line variations. By the end of the year, AI had moved well beyond content into campaign management, analytics, lead scoring, personalisation, and workflow orchestration.

Google's Performance Max and Meta's Advantage+ campaigns became standard practice rather than experimental features. These platforms now handle targeting, bidding, creative selection, and budget allocation with minimal human input. The marketer's job shifted from managing campaigns to managing inputs — the creative assets, audience signals, and conversion goals that the AI optimises against.

The emergence of AI agents — systems that can pursue goals autonomously rather than just follow rules — moved from concept to early production. Automation platforms introduced agent frameworks connecting AI models to hundreds of marketing tools. Teams started designing workflows where agents could pull data, generate reports, flag anomalies, and trigger actions, with humans reviewing rather than initiating each step.
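As a toy illustration of that review-rather-than-initiate pattern, here is a minimal anomaly check of the kind an agent workflow might run before escalating to a human. All numbers and the threshold are hypothetical; a production system would use something more robust than a standard-deviation rule.

```python
from statistics import mean, stdev

def flag_anomalies(daily_values, threshold=2.0):
    """Flag days where a metric deviates more than `threshold` standard
    deviations from the mean -- a crude check an agent might run before
    queuing the result for human review rather than acting on it."""
    mu, sigma = mean(daily_values), stdev(daily_values)
    return [
        i for i, v in enumerate(daily_values)
        if sigma > 0 and abs(v - mu) > threshold * sigma
    ]

# Hypothetical daily ad-spend figures; the last day is an outlier
spend = [120, 118, 125, 122, 119, 121, 340]
review_queue = flag_anomalies(spend)  # a human reviews these, not the agent
```

The design choice worth noting: the function returns items for review instead of triggering an action, which is exactly where the human sits in these workflows.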

For my own work, this shift was significant. I started the year using AI primarily as a research and writing assistant. By the end of it, I was building Custom GPTs for clients, designing AI-assisted workflows for content production and reporting, and thinking about AI as infrastructure rather than a tool. That progression — from using AI for individual tasks to designing systems that run on AI — mirrored what I saw happening across the industry.

What to do differently in 2026: Stop evaluating AI tool by tool. Start thinking about your marketing operation as a system and ask where AI should be the operating layer. Map your workflows. Identify the ones that are most repeatable and most time-consuming. Those are your first candidates for AI-assisted or agent-driven automation. And invest in the integration layer — connecting your tools so data flows between them — because an AI tool that sits in isolation delivers a fraction of its potential value.

Lesson 3: Content volume became a liability, not an advantage

This one surprised me, even though it shouldn't have. The AI content explosion of 2024 and early 2025 — when every team discovered they could produce five times more content with AI — ran headlong into a reality check: more content doesn't mean more impact when everyone else is also producing five times more content.

By mid-2025, the industry started talking openly about "AI slop" — the flood of competent but undifferentiated content that fills feeds and search results without adding anything distinctive. Brands that leaned heavily into AI-generated volume found that their content performed worse over time as algorithms struggled to surface anything meaningful from the noise.

The counterintuitive lesson: the teams that got the best results from AI in content weren't the ones that produced the most. They were the ones that produced less content overall but used AI to make their production process more efficient, freeing human time for the creative and strategic work that makes content worth reading.

The winning formula that emerged over the course of the year: use AI for the production system (research, first drafts, repurposing, formatting, distribution, A/B testing), but invest human time in the editorial layer (original thinking, distinctive voice, strong opinions, creative risk-taking). AI handles the how. Humans handle the what and why.

What to do differently in 2026: Audit your content output. If you've been producing more content but seeing flat or declining results, the problem is probably differentiation, not volume. Redirect your AI productivity gains toward fewer, better pieces of original content and use AI to maximise the distribution and repurposing of each one. One genuinely original article that gets cited by AI models and shared by your audience is worth more than twenty competent articles that disappear into the noise.

Lesson 4: The measurement conversation got harder, not easier

This was a theme I kept running into across my client work this year, and it's backed up by the data. Despite all the AI tools available for analytics and attribution, the percentage of marketers who can prove AI ROI actually dropped — from 49% to 41% over the course of the year. Not because AI was delivering less value, but because the expectations for what counts as proof kept rising.

Part of this is the search traffic problem I described in Lesson 1. When organic traffic was a reliable growth metric, marketing had a clear story to tell: we invested in content, traffic grew, pipeline followed. With traffic declining even as content quality improves, that story no longer holds. Marketing needs new metrics and new frameworks for proving its contribution.

Part of it is the challenge of measuring AI's impact specifically. When AI is embedded across every workflow, isolating its contribution becomes difficult. Did the campaign perform well because of the AI-generated creative, the AI-optimised targeting, the AI-adjusted bidding, or the human strategy underneath all of it? Attribution gets tangled when AI touches everything.

The companies I saw making the most progress on this problem were the ones that established clear baselines before implementing AI changes, ran structured experiments to isolate impact, and invested in incrementality testing — comparing AI-driven outcomes against control groups to prove causal impact rather than just correlation.

What to do differently in 2026: Before launching any new AI initiative, document your current baselines. Agree with your finance team or leadership on what success looks like in financial terms. Run incrementality tests where possible — even basic A/B comparisons between AI-assisted and non-AI-assisted workflows. And shift your measurement language from traffic and engagement toward revenue, pipeline, CAC, and ROI. The CFO doesn't care how many blog posts AI produced. They care whether the investment is generating returns. (I'll be writing more about this specific topic in the new year — the intersection of AI marketing measurement and financial accountability is one of the most important and underserved areas I can see.)
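For the basic A/B comparison described above, a two-proportion z-test is one simple way to check whether an AI-assisted workflow genuinely outperformed a control. This is a sketch with hypothetical conversion numbers, not a full incrementality framework (which would also need randomised assignment and holdout design).

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between
    an AI-assisted group (a) and a control group (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF, Phi(z) = 0.5*(1+erf(z/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a - p_b, p_value

# Hypothetical: 130/2000 conversions with AI assistance vs 100/2000 without
lift, p = two_proportion_z(conv_a=130, n_a=2000, conv_b=100, n_b=2000)
print(f"lift: {lift:.3%}, p-value: {p:.3f}")
```

Even a rough test like this moves the conversation with finance from "the AI workflow feels faster" to "it added 1.5 points of conversion, and here's the probability that was luck".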

Lesson 5: Governance stopped being optional

Early 2025 was characterised by a "just ship it" mentality around AI in marketing. Try the tool. Deploy the workflow. See what happens. By the end of the year, the mood had shifted noticeably. As AI moved from individual tools to systems that take autonomous actions — sending emails, adjusting ad spend, generating customer-facing content, responding to reviews — the question of oversight became unavoidable.

Several high-profile incidents throughout the year reinforced this: AI-generated content that went live with factual errors. Personalisation engines that crossed the line from helpful to intrusive. Chatbots that gave customers inaccurate information. None of these were catastrophic on their own, but cumulatively they taught marketing teams that AI without governance is a liability waiting to materialise.

The companies that handled this well didn't build elaborate compliance frameworks. They did something much simpler: they wrote down what their AI tools were allowed to do, what they weren't allowed to do, and when a human needed to review before anything reached a customer. A one-page document. Shared with the team. Updated as they learned. That basic level of intentionality made an enormous difference compared to the teams that were running AI with no explicit guardrails.

What to do differently in 2026: If you don't have an AI governance document for your marketing team — even a basic one — write it in January. Three sections: what AI can do autonomously, what it can never do without human approval, and the escalation path when something goes wrong. Then build a review cadence: monthly for the first quarter, quarterly after that. The goal isn't to slow AI adoption down. It's to make it sustainable and defensible as you scale.

Looking ahead

If I had to summarise 2025 in a single sentence, it would be this: AI stopped being something marketing teams use and started being something marketing teams run on.

That shift — from tool to infrastructure — has implications for everything: how you hire, how you measure, how you structure your team, how you allocate budget, and how you think about competitive advantage. The companies that recognised this early and adapted their operating model have a meaningful head start. The ones that treated AI as a collection of productivity hacks are going to spend 2026 catching up.

The year ahead will be defined by three questions:

How do you maintain brand visibility when AI mediates discovery?

How do you prove marketing's value when the old metrics don't work?

How do you govern AI at scale without slowing your team down?

I don't have complete answers to any of these yet. But I've been working on them all year, and I'll be writing more about each one in the months ahead. If you're wrestling with the same questions, I'd love to hear what you're seeing.

Here's to a more intentional 2026.