Proprietary Data Is the Key to Surviving the AI Boom

Our SaaS and software expert, Vijay Raina, joins us today to dissect the turbulent and thrilling landscape of artificial intelligence. With over $100 billion invested since late 2022 and predictions of massive industry consolidation, we’re navigating a period of what some call “irrational exuberance.” Vijay, a specialist in enterprise SaaS technology and software architecture, will help us understand whether we’re in a revolutionary boom or a speculative bubble. We’ll explore the fundamental shifts this wave is causing, from the very structure of SaaS companies potentially dissolving into APIs to the strategies businesses must adopt to build defensible moats in an era where open-source models are rapidly closing the gap on proprietary leaders. Finally, we’ll look at how enterprises can move beyond scattered pilot projects to find real value and what this all means for the future role of the software engineer.

The article notes over $100 billion invested in AI and a prediction that nearly 30% of some startups could be acquired soon. Citing the dot-com era, how do you differentiate this consolidation from a bubble bursting? Please share a few key metrics you look for.

It’s a completely understandable reaction. Anyone who remembers the fire sales on foosball tables in 2001 feels a little twitchy seeing this much money flood into a new technology so quickly. But what we’re seeing now feels more like a maturation than a meltdown. In a true bubble bursting, you see panic selling and companies liquidated for parts. What’s happening here is strategic. The first metric I look at is the acquirer. We see established giants like Nvidia, Databricks, and ServiceNow making these purchases. They aren’t just buying hype; they have roadmaps that stretch out three years, and they know that’s too slow. As Crunchbase’s CEO noted, their corporate development teams are actively hunting for specific points of ingenuity to integrate. This isn’t panic; it’s a calculated race for innovation. A second key metric is the motivation. It’s about filling a strategic gap, not just grabbing market share. The cost of missing the next big thing is seen as far greater than the cost of a failed investment. It’s part of what Carlota Perez calls the “installation phase” of a technology cycle—you need that irrational exuberance to finance the foundational build-out whose ROI is still uncertain. So, I look for strategic acquisitions by established players to accelerate their own roadmaps, which points to healthy consolidation, not a chaotic burst.

It’s suggested that SaaS companies might become APIs, with AI creating a “canvas to rule them all” as the UI. What specific, step-by-step changes should a traditional SaaS company make to its product roadmap and engineering team structure to prepare for this future?

That’s the trillion-dollar question, isn’t it? The idea that twenty years of “calcified software” is suddenly up for grabs is both terrifying and exhilarating. For a traditional SaaS company, the pivot has to be deliberate. First, the leadership must fundamentally shift its mindset. You have to accept that your beautiful, hand-crafted user interface is no longer your primary asset. The value is now in your unique business logic and proprietary data access. Your product roadmap needs to reflect this immediately. Stop prioritizing pixel-perfect UI tweaks and start treating your API as your number one product. This means a relentless focus on documentation, reliability, and performance, designing it for an AI agent, not just a human developer. Second, your engineering team structure must evolve. You can’t just have front-end and back-end teams anymore. You need to create a dedicated “Platform & Services” team whose sole mission is to build and maintain these world-class APIs. You also need a new breed of engineer, maybe an “AI Integration” team, who understands how to make your service indispensable to the LLMs and agents that will live on that “canvas to rule them all.” They become the evangelists ensuring your service is the one the AI reaches out to when it needs to get something done.
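To make "designing your API for an AI agent, not just a human developer" concrete, here is a minimal, hypothetical sketch. Everything in it is invented for illustration (the `create_invoice` tool, the "AcmeBilling" service, the parameter names): the idea is that instead of shipping only a UI, the SaaS company publishes a machine-readable tool definition that an LLM agent can read and call with structured arguments.

```python
import json

# Hypothetical tool definition: a SaaS capability described in a schema
# an LLM agent can discover, rather than a screen a human clicks through.
CREATE_INVOICE_TOOL = {
    "name": "create_invoice",
    "description": "Create a draft invoice for a customer in AcmeBilling.",
    "parameters": {
        "type": "object",
        "properties": {
            "customer_id": {"type": "string", "description": "Internal customer ID"},
            "amount_cents": {"type": "integer", "description": "Invoice total in cents"},
            "currency": {"type": "string", "enum": ["USD", "EUR"], "default": "USD"},
        },
        "required": ["customer_id", "amount_cents"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> dict:
    """Dispatch a structured call from an agent to the underlying business logic."""
    if name != CREATE_INVOICE_TOOL["name"]:
        raise ValueError(f"Unknown tool: {name}")
    required = CREATE_INVOICE_TOOL["parameters"]["required"]
    missing = [p for p in required if p not in arguments]
    if missing:
        # A good agent-facing API returns machine-readable errors, not HTML.
        return {"status": "error", "missing_parameters": missing}
    # In a real system this would invoke the same service logic the old UI used.
    return {"status": "ok", "invoice_id": f"inv_{arguments['customer_id']}_draft"}

# An agent, having read the schema, emits a structured call instead of clicking a UI:
call = json.dumps({"name": "create_invoice",
                   "arguments": {"customer_id": "c42", "amount_cents": 9900}})
parsed = json.loads(call)
result = handle_tool_call(parsed["name"], parsed["arguments"])
```

The point of the sketch is the shift in emphasis: the schema, the error contract, and the reliability of `handle_tool_call` become the product, because that is what the agent on the "canvas" actually interacts with.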

With open-source models catching up to proprietary leaders in as little as 41 days, the article mentions trust as a key differentiator. Beyond trust, what specific services or data-driven features can a company offer to build a defensible moat that an open-source model can’t easily replicate?

Trust is the foundation, but you can’t build a fortress on just one stone, especially when the landscape is shifting so fast. The DeepSeek model matching a leader at a 96% lower production cost is a massive signal. Beyond trust, the first line of defense is the classic open-source business model, but on steroids: services. We’re not just talking about support tickets. We’re talking about enterprise-grade security, guaranteed SLAs, managed hosting, and indemnification. As Stefan Weitz mentioned, even Microsoft wouldn’t touch open source back in the day because of licensing and trust issues. Enterprises will always pay to offload that risk and complexity. You’re not selling the software; you’re selling operational peace of mind. But the real, unbreachable moat is data. This is the holy grail. An open-source model is trained on the world’s public information. It can’t know what my best customers look like. A feature that can analyze a company’s unique, proprietary sales data and, as Jager McConnell posited, say, “I’m going to go and find new customers that look just like your most successful ones”—that is something an open model simply cannot invent. AI can’t make up proprietary data, and the companies that build services to extract unique value from it will be the ones that are impossible to disrupt.
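As a rough illustration of the "find new customers that look just like your most successful ones" idea, here is a deliberately simple sketch: rank prospects by cosine similarity to the centroid of your best customers, computed over proprietary feature vectors. The feature dimensions, company names, and numbers below are all made up; a real system would use far richer features and models, but the moat is the same — the feature vectors themselves, which only you have.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike_scores(best_customers, prospects):
    """Rank prospects by similarity to the average profile of the best customers."""
    dims = len(next(iter(best_customers.values())))
    centroid = [sum(vec[i] for vec in best_customers.values()) / len(best_customers)
                for i in range(dims)]
    scored = {name: cosine(vec, centroid) for name, vec in prospects.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Invented proprietary features: [annual_spend, seats, engagement_score]
best = {"acme": [120.0, 50.0, 0.9], "globex": [100.0, 40.0, 0.8]}
prospects = {"initech": [110.0, 45.0, 0.85],   # profile resembles the best customers
             "tinyco":  [1.0, 90.0, 0.1]}      # very different profile
ranking = lookalike_scores(best, prospects)
```

An open model can reproduce the algorithm in seconds; what it cannot reproduce is `best` and `prospects` — the proprietary data that makes the ranking worth anything.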

One company was described as running 230 AI pilot projects simultaneously. For an enterprise with unique, proprietary data, what is a more focused, three-step strategy they can use to identify the single most valuable AI use case and successfully bring it from pilot to production?

Seeing a company run 230 AI pilots gives me anxiety. It sounds like the shotgun-style VC model, throwing everything at the wall and hoping something sticks, but that’s a terrible strategy for an established enterprise. It’s a symptom of not having a real AI strategy at all. A much more effective, focused approach starts with the data, not the tech. First, an enterprise must identify its “crown jewel” data set—the unique, proprietary information that no one else has. Forget about AI for a moment and ask, “What is our most valuable, inimitable data asset?” For a logistics company, it’s shipping routes; for a healthcare provider, it’s patient outcomes. Second, connect that data to a critical, high-value business problem. Don’t just ask “What can AI do?” Ask, “How can we use our unique shipping data to reduce fuel costs by 15%?” or “How can our patient data predict infection risk?” This reframes the entire effort around a tangible business outcome. Finally, create a small, interdisciplinary team to attack that single problem. The article mentions Caltech researchers solving the catheter infection problem by combining medical knowledge with AI. That’s the model. You need the domain expert who understands the problem, the data scientist who understands the data, and the engineer who can build the solution, all working together. This focused approach delivers a real, measurable win that builds momentum, instead of 230 experiments that go nowhere.

What is your forecast for how the role of a typical software engineer will change over the next five years, especially considering the disruption of traditional UIs and workflows?

I believe the role of the software engineer is heading for a significant transformation, not an extinction. The days of simply translating a spec into lines of code are numbered. The future software engineer will become more of a “systems thinker” and an “AI orchestrator.” Instead of spending weeks building a user interface from scratch, their primary role will be to design and integrate complex systems, using AI as a powerful component. They’ll be selecting the right foundational models, fine-tuning them on proprietary data, and building the logical scaffolding that allows AI agents to perform complex business workflows. We already see that “vibe coding” without deep engineering discipline fails spectacularly. The value will shift from the act of writing code to the architectural decisions behind it. Essentially, engineers will move up the abstraction ladder. They’ll spend less time on the mundane and more time on high-level problem-solving, ensuring the systems they build are secure, scalable, and actually solve the right business problem. The job will be less about being a bricklayer and more about being an architect who has a super-powered, AI-driven construction crew.
