AI Transformation

From Noise to Signal: a 2024 AI Memo

November 29, 2023   |   7 min read

Here we are. The proverbial crossroads!

For the last couple of decades, Anand, Costa, and I have been on a journey to understand the brain, our society, systems, and people, and how they influence the way we build technology. Our conundrums over the past decade – formulating our thesis for MSD and constructing it over the last 7 years – have given us a unique perspective on what has transpired in AI in this time.

If you happen to be the AI evangelist, the active user, the probable consumer, the decision maker for the use of AI, or the one mandated to “implement AI” in your organization, this piece is for you!

A Contrarian View

If we were your archetypal AI hype company, this story could have been told as simply as a South Park gag:

Source: South Park, the internet, with non-generative modification

We are, for better or indeed for worse, not that company. This is not that piece.

This is our journey of understanding the tight dependencies of good ol’ software engineering and the advances in machine learning, resulting in a “what if” question of an ideal AI native data platform; not quite AGI, but an important step towards that holy grail.

This is our journey of looking beyond the corner, anticipating exponential growth in compute power ubiquity, and fine-tuning network architectures to exploit that growth, culminating in non-linear effects that are super exciting, opening doors to new breakthroughs. 

AI that glitters is not gold

While the world has been busy swinging from the skeptical extreme of will-AI-ever-make-it-out-of-Asimov’s-writing to today’s near-dystopian how-long-before-sentience, AI implementations have spawned a generation of practitioners with bad habits, and entire industries applying sub-optimal practices, tools, and processes. Most importantly, they have produced enormous tech debt that urgently needs undoing.

The damage caused is predominantly in three ways:

  • Vanity AI projects that have gated meaningful implementations of AI and the outcomes they could have delivered
  • A generation of talent that believes building models is the only job worth doing, while paying scant heed to fixing the data – the lifeblood of every model
  • Tens of billions lost to suboptimal, fragmented efforts not driven by business goals

Taming of the (not so) Shrew-d data stack

The journey ahead is about taming the modern data stack. In the last 5 years, we’ve successfully created a multitude of parts that help move, organize, and unify data. No good deed, though, goes unpunished. AI is no exception. What has happened is this:

  • Budgets in the clouds: Spiralling cloud costs that beguile CTOs and Financial Controllers in equal measure
  • One more thing: A journey from Apple’s ‘There’s an app for that’ to the hyper scaler’s ‘There’s a modern data stack component for that’
  • The journey itself is the destination: An architectural nightmare for data and transformation leaders who survey an entire enterprise’s data and try to bring some sense of progress to it – all while fielding a daily knock on the door from yet another company saying “I’ll do this for you”, amid 20 vendors, 40 moving parts, a plethora of tools, and an interminable rolling 5-year transformation plan

A Broken Promise

There are two common questions in every enterprise board room today:

  1. The investor and executive ask: How can we realize the promise of AI for ourselves? It seems more ‘within reach’ than ever.
  2. The operator asks: Are we realistic about our readiness, given business units with divergent processes, unwieldy data, and talent attrition like never before?

A two-word phrase, more than any other, has in the past decade sent billions down the drain in its mythical pursuit – Digital Transformation. The truth, unfortunately, is that snake oil has many incarnations, Digital Transformation simply being the latest. According to MIT Sloan, failed digital transformation projects have cost $50B.

The total cost of operating AI in the enterprise is making leaders everywhere take a long pause. The CEO of a large company we met recently told us that they had invested heavily in multi-year data efforts, but attrition in their organization had outrun them: the last one standing in their data transformation journey was a third-party consulting vendor accountable for none of the architecture decisions made over the preceding 3 years.


The Model Blue Pill 


Customer experience does not wait for the ‘transformation project’ train to arrive. SBUs want results, and they want them now. They unleash their Neo, or whatever their equivalent of taking the blue pill is. Some models get built on some data, leading to failures of epic proportions and the labeling of AI as a fraud – which results in increased AI budgets (budgets are funny that way) and in laurels and promotions for transformational leaders. Failing upward has never been easier.

Bain published a study showing that a meager 25% of all models built in the last few years actually made it into production environments. Meanwhile, the total cost of building and deploying AI continues to soar; the complexity of managing disparate systems grows while available talent remains sparse at best; and time to market and meaningful business outcomes suffer.


A Data-Centric + Automation-Centric Approach to AI

A Promise to Our Customers

At our company, we have always believed in a data-centric approach to operationalizing AI at scale. This has meant a relentless focus on enriching and extracting the most out of data, and on creating taxonomies that live and change progressively as businesses change, while letting the AI decide which models get applied when, for whom (and, often, by whom), based on interaction with its users.

This approach has allowed us to apply our platform toward fulfilling these key promises to large enterprises:

  1. Fix the data in an automated, cost-efficient manner, creating a unified, centralized, living multi-modal data repository that constantly enriches itself
  2. Auto-mine data with AI and deploy contextual graphs rapidly, driving actionable intelligence into applications with real-time, accurate, relevant information
  3. Straight-through automation of connected workflows, enabling STP like you have never seen before (RPA is dead, thanks to the need for enriched, unified data!)
  4. Drive actionable intelligence through current enterprise apps (rapid AI deployment), with proof of financial ROI in 90 days

Now that organizations are in the early days of a consolidation cycle of point solutions and multiple vendors, customers (CFOs and CTOs) need a partner that fulfills the promise of a simplified architecture and improved metrics, and that has skin in the game from raw data to models to apps.

At our company, we understand this all too well, having served large enterprises over the last 7+ years. With our generalizable AI engine, we’re ushering in the era of the platform that delivers outcomes, rather than one that sells tools to a new generation of model builders.

Your AI Orchestration Platform

We’ve grown with our customers, as we’ve forayed into retail, finance, insurance, healthcare, pharma, and more with our AI suite of products. Across each of them, we realize that all our customers care about are outcomes, data & automation. A culmination of this has been a time-tested, outcome-driven, data-first approach to AI and this has meant a complex orchestration of 3 things:

  1. The efficient movement of clean, multi-modal, AI-enriched datasets that grow and change with time, connect to multiple applications with no divergence of data, and carry an accurate representation of lineage
  2. Real-time orchestration of models: a federated intelligence engine applied contextually, observing changing data and end users’ interactions with it, and responding appropriately to allow for a more optimal set of outcomes
  3. Deep connection between workflows: a seamless, end-to-end automated approach for a connected enterprise with minimal repetitive human intervention

Today, we’re happy to reveal our next chapter of growth with new branding, and a new category – AI Orchestration. Orchestration is the ability to move seamlessly between the 3 layers above: enable enriched, living, progressively changing datasets for entire organizations and across teams; apply a federated engine that contextually understands what the user wants and responds with the right models; and coordinate endpoints across workflows, creating seamless journeys across your enterprise.

For example, if you’re looking to enable personalization, we layer intelligence across search, recommendations, cross-sells and upsells, forecasting, notifications, support, and customer interactions from everywhere. If you’re looking to enable document automation, we layer intelligence across extraction, classification, matching, and reconciliation, feeding all of the derived intelligence into unified customer profiles and centralized data systems for everyone – and feeding it right back into customer experiences or optimal workflow management in your organization.

With this renewed focus on orchestration, we’re promising to help your entire organization adopt AI across its functions, undoing the messiness of the last 5+ years.

We believe orchestration simplifies.

Here to serve your analytics, IT & business teams as their system of intelligence

We’re at a time when AI tools and solutions are almost a commodity. The question for the enterprise today is not which applications to buy or build. The steep path ahead is one of orchestration across its functions and departments, each with its own needs and its own state of data.

At our company, we have deep, long-standing partnerships with large enterprises and partners around the globe who put their data’s quality, and their vision for a unified and connected system, before a large budget for model building. We have successfully become the System of Intelligence – and, increasingly, a System of Record – for our trusted customers, who constantly mine meaning and context from all that data, keeping it readily accessible to applications from IT to business units.

Our suite of native applications, Hotstart, along with our preset industry-specific model library, has made for the perfect entry point for business teams beginning their journey toward outcome-centric AI adoption. Whether it’s personalization, document processing and process automation, content tagging, or forecasting and predictive analytics, our native applications focus on delivering results quickly.

With this new chapter, we get ready to expose our model-building tools and our data stack to our customers as well, allowing them to work off a single orchestration platform to mediate data and serve all their AI and automation needs.

Toward that vision, in this new chapter, we uncomplicate. We were thinking of coining something to the effect of ‘complify’ or ‘simplicate’, but that would not be uncomplicating, would it?

Welcome to the New Vue! Uncomplicate.


Founders' desk

Meet the A-Team of AI orchestration: Anand, Ashwini and Costa. When they aren't busy saving enterprises from fragmented workflows and frenzied model-building cycles, they're writing killer blog posts to help you navigate the AI transformation maze.