The GTM Slop Problem, Part 2: Why the Old Martech Stack Is Finally Breaking

Scott Brinker of chiefmartec has a new report with Databricks, The New Martech “Stack” for the AI Age, and it lands on a point a lot of people in product, marketing, RevOps, and GTM have felt for years but have not always said clearly enough: the old stack is not just messy. It is starting to get in the way. AI is exposing every weak seam in it. Too many tools. Too much plumbing. Dozens of layers and systems pretending to be connected. We’ve spent too much of our teams' time trying to make the stack behave instead of focusing on the business at hand. Anyone who has been through a multi-instance Salesforce integration can attest: it’s a nightmare.
What struck me about the latest Brinker report was that it’s neither a martech landscape piece telling you there are more logos than ever (think Luma Partners Lumascapes), nor a suite-versus-best-of-breed argument. What this report does well is explain why the old way of thinking about the stack is finally giving way to something else.
Brinker calls it a “composable canvas.” The wording may sound a little abstract at first, but the point is practical: if the old stack was a bunch of boxes tied together with integrations, the new model is a shared data foundation with apps, services, and agents assembled around it as needed.
From a GTM standpoint, that is a very big deal, because it's not really a martech story. It is about whether your growth engine can actually move as fast as your market.
The report says most teams have assembled dozens of tools over time, each solving a real problem, but together creating a web of integration headaches. AI raises the stakes because new capabilities are showing up constantly and competitors are moving faster.
The report’s answer is not rip-and-replace. It is a shift in architecture, from a rigid stack to a more fluid model built on a shared data foundation. And it argues that the business payoff is real: faster execution, lower integration drag, more adaptability, and more room for differentiation through custom capabilities. This last point is the one I would underline.
When every competitor can buy roughly the same commercial tools, the advantage moves away from procurement and toward what you can uniquely compose on top of your own data, workflows, and operating logic. That is where this gets interesting for GTM leaders. Because the question is no longer just “What tools do we use?” It becomes “What kind of operating model are we actually building?”
The report frames this as the Third Age of Martech. The first age was defined by hard tradeoffs: suite versus best-of-breed, software versus services, build versus buy. The second age started to soften those lines with platform ecosystems, blended software and services, and customization on top of commercial platforms. The third age pushes further. The old either-or categories start to blur, and the stack starts to dissolve into something more composable and more dynamic.

That tracks with what a lot of us have been seeing in market. Most enterprise stacks still look like a mural of boxes. CRM here. CDP there. CMS, DAM, MAP, DSP, BI, and now AI agents are orbiting the whole thing. This is more flexible than the old suite model, sure, but still rigid in practice. Integration is partial and data is often duplicated. Arguably, true composability has remained mostly aspirational.
AI is what turns that structural weakness into a real business problem.
“This is where dashboards end and agents begin.” This point from the report gets to the heart of it. Dashboards were built for a world where humans looked at signals and decided what to do next. Agents do not work that way. They need context across systems, data across the business, and the ability to act in real time. That is why the old layered stack starts to buckle.
Traditional SaaS vendors are embedding AI into their products. Agent builders want to orchestrate across any system. Data platforms are pushing up and saying, not unreasonably, that if they already house the data, maybe they should be where the apps and agents run too. Everyone wants to be the place where agents run.
This is not just technical pressure, it is commercial pressure too. If your GTM team cannot get from data to decision to action without waiting on integration work, you are already slower than the market.
Marketing has the opportunity to lead enterprise AI adoption because campaigns are always-on, data-rich, and powered by continuous feedback loops, which is exactly what agents need to learn, decide, and act in real time. That is a fancy way of saying: marketing now has one of the clearest use cases for this shift, and one of the clearest penalties for missing it.
The report’s “everything is data” section is also stronger than it might sound at first glance.
The idea is simple. In the old model, data moved between systems. In the new model, data becomes the operating layer everything runs on, or as Brinker says, “Martech no longer sits on top of the data, it is the data.” Data of all types (customer, company, content, and code) becomes part of one operating foundation.

Once you see the world this way, the question changes. It is no longer, “How do I integrate these products?” It becomes, “What do I want my business operations and customer experience to actually do?” That is a much better GTM question, and that shift has some very practical consequences.
Attribution gets less theatrical and more operational. The report says when campaign data connects to pipeline and revenue data in the same foundation, attribution stops being a quarterly debate and becomes a continuous conversation with the CFO. That is a major change. It means marketing can spend less time defending itself and more time driving decisions.
Content becomes operational too, not just creative. The report describes content data as something queryable, governed, and dynamically assembled by agents that understand the rules. That is a useful way to think about the next wave of content operations. Content stops being a file cabinet problem and starts becoming part of the execution layer.
And then there is the semantic layer, which most teams still underestimate. The semantic layer sits between the underlying data and the tools that use it. It applies shared definitions, relationships, calculation logic, and access rules so metrics and business objects are interpreted the same way everywhere. Call it the keeper of coherence. If sales, marketing, finance, and service all define customer value differently, agents are just going to operationalize your inconsistency faster. Semantic mess used to create bad dashboards; now it can create bad automated decisions.
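To make that concrete, here is a minimal sketch of what a semantic layer enforces. Everything in it (the metric name, the formula, the field names) is my own hypothetical illustration, not anything from the report; the point is simply that the calculation logic lives in one governed place and every consumer resolves against it.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class MetricDefinition:
    name: str
    description: str
    compute: Callable[[dict], float]  # calculation logic lives in ONE place

# Hypothetical shared definition of "customer value" that every downstream
# tool (dashboard, report, agent) resolves against, instead of each team
# hard-coding its own version of the formula.
SEMANTIC_LAYER = {
    "customer_value": MetricDefinition(
        name="customer_value",
        description="Trailing-12-month revenue net of refunds and discounts",
        compute=lambda row: row["ttm_revenue"] - row["refunds"] - row["discounts"],
    ),
}

def resolve_metric(metric: str, row: dict) -> float:
    """All consumers, human or agent, go through the same definition."""
    return SEMANTIC_LAYER[metric].compute(row)

account = {"ttm_revenue": 120_000.0, "refunds": 5_000.0, "discounts": 15_000.0}

# Marketing's dashboard and sales' agent both get the same number.
assert resolve_metric("customer_value", account) == 100_000.0
```

If sales and finance each kept their own copy of that lambda, an agent acting on "customer value" would faithfully automate the disagreement.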
Finally there is the concept of context graphs, which Brinker defines as the reasoning behind decisions that is, more often than not, lost across an amalgamation of Slack threads, calls, emails, and people’s heads. Traditional systems capture what happened, but do a terrible job capturing why it happened.
This is a big GTM idea because context is what separates blind automation from intelligent execution, and for this to be real, reasoning should become durable, searchable, and operational too. Think of all the machinations behind the simplest of questions: Why did a lead get routed? Why did a discount get approved? Why did a budget shift? Why did a human override the model? Hard to get straight answers when you're bereft of context.
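A toy sketch of what "durable, searchable reasoning" could look like in practice. This is my own illustration, assuming a hypothetical decision store and field names; the report does not prescribe an implementation. The key move is recording the why next to the what at decision time, so the question can be replayed later instead of reconstructed from memory.

```python
import datetime

decisions = []  # in practice this would be a governed, searchable store

def record_decision(subject, action, reasoning, decided_by, inputs):
    """Capture not just what happened, but why, who (or what) decided, and
    which signals drove it -- the context usually lost in Slack threads."""
    decisions.append({
        "subject": subject,
        "action": action,
        "reasoning": reasoning,
        "decided_by": decided_by,  # human, rule, or agent
        "inputs": inputs,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

record_decision(
    subject="lead:acme-co",
    action="routed_to:enterprise_team",
    reasoning="Employee count > 1000 and an open opportunity on the parent account",
    decided_by="routing_rule_v3",
    inputs={"employee_count": 4200, "parent_open_opp": True},
)

def why(subject, action_prefix):
    """Answer 'why did X happen?' by replaying the recorded context."""
    return [d["reasoning"] for d in decisions
            if d["subject"] == subject and d["action"].startswith(action_prefix)]

assert why("lead:acme-co", "routed_to") == [
    "Employee count > 1000 and an open opportunity on the parent account"
]
```

The same pattern answers the discount, budget, and override questions above: the answer is a query, not an archaeology project.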
Integration has been martech’s invisible tax for years.
The report gives a clean way to think about how that tax evolves. In the first age, integrations were mostly point to point. In the second age, hub-and-spoke made things better, but the hub became the bottleneck.
In the third age, when data lives in a unified foundation and applications operate on the same data layer, the integration burden starts to shrink in a much more meaningful way. This is not just a technical cleanup; it is a scaling decision.
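The scaling intuition behind those three ages can be made concrete with a little arithmetic (my illustration, not the report's):

```python
# Rough model of how pairwise integration burden grows with n tools
# under each architecture. The "zero" for a shared foundation is the
# idealized limit: in practice there is a small, bounded onboarding
# cost per tool, not n-squared pairwise sync jobs.

def point_to_point(n):      # first age: every tool wired to every other
    return n * (n - 1) // 2

def hub_and_spoke(n):       # second age: every tool wired to one hub
    return n                # (the hub itself becomes the bottleneck)

def shared_foundation(n):   # third age: tools operate on the same data layer
    return 0

for n in (10, 50, 100):
    print(n, point_to_point(n), hub_and_spoke(n), shared_foundation(n))

# At 100 tools, point-to-point implies 4,950 potential connections
# versus 100 spokes; a unified foundation removes pairwise sync entirely.
```

That quadratic-versus-linear gap is the "invisible tax," and it compounds every time a new tool is added.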
This matters because if every new capability takes months of integration work, your stack becomes a ceiling on GTM performance. If the underlying architecture reduces that drag, your team gets more time back for actual growth work. That is a much more useful way to evaluate martech than simply asking whether a tool has a feature.
The point solution problem is real, and only getting worse.
The martech landscape has gone from roughly 150 solutions in 2011 to over 15,000 today, and this commercial growth is only a prelude. The next wave will not only come from vendors, but also from inside companies: IT-built apps, citizen-developed workflows, and agent-generated software created on the fly for specific tasks.
Brinker calls this the shift from the commercial long tail to the “hypertail” of custom-built martech. That is one of the most important ideas in the report, because this is where differentiation starts to live.

Commercial software still matters, but your competitors can buy the same thing. The hypertail is where companies encode what is unique about their own customer understanding, decisions, and execution logic. Custom software is where you build what competitors cannot buy.
Meagen Eisenberg, CMO of Samsara, made a sharp and fair point: “If a SaaS product is just making data easy to access, companies are increasingly going to do that themselves, so vendors need something more defensible at the experience layer.” This points to the fact that point solutions that mostly wrap access without creating a deeper operating advantage are likely to feel pressure first.
Why should GTM teams care?
Mainly, because this is not a martech ops story. It is a story about whether your growth system can actually move at the speed your market requires. The martech stack is no longer just a stack; it is a control system, and the winners are likely to look different than they did in the last era.
A stronger data foundation will sit underneath everything, enabling "meaning" to be standardized before action is automated. Custom apps and agents will get used where they create real advantage. Integration will be treated less like a background chore and more like a strategic speed lever.
Finally, tools will be judged less by whether they fill a box and more by whether they fit into a composable operating model without adding new drag.
What should I do now?
I would not start by asking what to replace; I would ask what is slowing the business down. Then I would do these five things:
First, get the data foundation right. If your data is still scattered and you are still fighting over a single version of truth, everything else gets harder.
Second, build the semantic layer before you overbuild the agent layer. If your definitions are messy, your automations will just be messy faster.
Third, move attention from plumbing to decisioning. Every hour you save on integration drag is time you can put back into actual growth work.
Fourth, rethink your categories as capabilities, not boxes. Ask what job needs to be done, what data needs to be shared, and what should be bought versus built.
And fifth, manage the hybrid state with purpose. Existing tools and new capabilities are going to coexist for years. Fine. But know what stays, what gets replaced, and what gets custom-built over time.
The old stack helped organize a messy world, but the new one has to do more than that. It has to help you move faster, decide better, and execute without so much friction.
