
Why The AI Revolution Demands A New Control Plane

Serge Lucio is the VP and GM of Agile Operations Division, Broadcom Inc.

The conversation surrounding enterprise IT operations has been dominated by two critical pillars: systems of record (SORs), where core data resides, and systems of intelligence (SOIs), where actionable insights are derived. For decades, the primary challenge for large organizations has been building the connective tissue—the conduits and pipelines—between these two realms.

Today, however, the arrival of large language models (LLMs) and generative AI isn’t just accelerating this connection; it’s fundamentally altering the landscape, moving us from a structured, centralized environment to one defined by proliferation and, critically, chaos.

If organizations fail to establish a sophisticated control plane now, the very tools designed to drive efficiency will instead introduce unprecedented governance, security and operational risk.

The Problem Of Fragmentation And Brittle Pipelines

For the last 20 years, enterprise IT has been plagued by the fragmentation of automation tools. Whether we look at traditional workload schedulers, ETL tooling or modern technologies like Apache Kafka for streaming and Apache Airflow for orchestration, the result is a complex, often brittle patchwork.

This brittleness has serious consequences. In large financial institutions, for instance, some of our customers may experience 1% to 5% failure rates on mission-critical jobs—this can mean hundreds of thousands of daily failures that force manual and costly remediation, not to mention business impact.

Furthermore, the intelligence derived from these pipelines is often built on questionable data. Consider the common business process of sales forecasting. Traditionally, systems of intelligence rely on dashboards populated by customer relationship management (CRM) tools. The status of an opportunity is often proxy data, manually entered and frequently outdated. This creates an enormous overhead as users chase down why the data is incorrect or stale.

This is where AI can be a game-changer.

The AI Leap: From Proxy Data To Source Truth

The true power of AI, particularly LLMs, is its ability to unlock the vast potential of unstructured data. For a decade, enterprises tried to tame unstructured data through data lakes, which often devolved into unmanageable “data swamps.” LLMs finally provide the engine to process this raw material—the meeting notes, emails and calendar invitations—and transform them into structured intelligence.

Take sales forecasting. Instead of relying on a manually updated CRM status, an LLM can parse every piece of communication related to an opportunity and deduce its true state. This allows business users to quickly generate powerful, ad-hoc insights based on the source data, not slow, hand-fed approximations.
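
As a concrete illustration, a minimal sketch of this pattern might look like the following. It assumes a governed llm_complete() gateway and an illustrative list of stages; none of these names refer to a specific product or API.

```python
# Hypothetical sketch: inferring an opportunity's stage from unstructured
# communications instead of a manually entered CRM field. llm_complete()
# stands in for whatever approved model endpoint an organization exposes.
from dataclasses import dataclass

STAGES = ["prospecting", "negotiation", "verbal commit", "closed won", "closed lost"]

@dataclass
class Opportunity:
    opportunity_id: str
    communications: list[str]  # meeting notes, emails, calendar entries

def llm_complete(prompt: str) -> str:
    """Placeholder for a call to an enterprise-approved LLM endpoint."""
    raise NotImplementedError("wire this to your governed model gateway")

def infer_stage(opp: Opportunity) -> str:
    # Ask the model to classify the stage from the raw source communications.
    prompt = (
        "Classify the sales stage of this opportunity as one of: "
        + ", ".join(STAGES) + ".\n\n"
        + "\n---\n".join(opp.communications)
        + "\n\nAnswer with the stage only."
    )
    answer = llm_complete(prompt).strip().lower()
    # Fall back to a conservative default if the model answers off-list.
    return answer if answer in STAGES else "prospecting"
```

The key design point is that the stage is derived on demand from source communications rather than copied from a hand-maintained field, so it can be regenerated whenever the underlying record changes.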

This leap in insight, however, comes with a monumental tradeoff: the loss of governance and control.

The Governance Crisis: The Rise Of Ad-Hoc Automation

When business users realize they can bypass slow, governed pipelines and load sensitive company data directly into an LLM—be it a public cloud model or an internal solution—the organization loses all visibility and control.

We’re witnessing an explosive proliferation of ad-hoc AI agents. Although these empower users, they introduce severe risks:

• Security And Data Leakage: When an employee exports proprietary data and feeds it to an external LLM, that data may be retained by the provider, logged or indexed into retrieval-augmented generation (RAG) stores. This PII or corporate IP has suddenly left the control of the enterprise.

• Compliance: If an organization operates in the EU, data residency and the “right to be forgotten” are non-negotiable. If proprietary derived data is scattered across multiple LLM instances (some potentially transatlantic or cloud-based), compliance becomes impossible.

• Auditability: Who used the data? How was the insight generated? In an ungoverned environment, the chain of custody is broken, making auditability and regulatory response unfeasible.

This sudden shift—from centralized, highly regimented systems of intelligence to decentralized, ad-hoc agent proliferation—is the chaos that IT must now confront.

Building The Control Plane: The Imperative For Adaptive Automation

To reap the benefits of AI without succumbing to data chaos, organizations must focus on building a robust, centralized control plane for all automation. This isn’t about replacing existing technologies; it’s about establishing orchestration and governance over them.

There are two immediate, interconnected imperatives for operations teams and enterprise architects:

1. Hardening The Automation

We must leverage AI to make the pipelines themselves more resilient. By baking LLM capabilities directly into the automation platform, we can create adaptive workflows that automatically respond to unexpected changes. If an upstream job is late or a database schema changes, the automation system should intelligently adjust rather than fail. The goal is to drive job failure rates down from the current 1% to 5% range we’re seeing to 0.1% or less, transforming brittle pipelines into rock-solid infrastructure.
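
As a rough sketch of what “intelligently adjust rather than fail” can mean in practice, the following illustrative wrapper polls a late upstream dependency and tolerates schema drift instead of aborting. The helper names (upstream_ready, current_schema, run_job, notify_operator) are assumptions for illustration, not a real scheduler API.

```python
# Hedged sketch of an adaptive job wrapper: rather than failing outright when
# an upstream dependency is late or a table's schema has drifted, the job
# waits, reconciles or routes to remediation.
import time

EXPECTED_SCHEMA = {"opportunity_id": "string", "amount": "decimal", "close_date": "date"}

# Illustrative stubs; a real platform would supply these.
def upstream_ready(dataset: str) -> bool: ...
def current_schema(dataset: str) -> dict: ...
def run_job(dataset: str) -> None: ...
def notify_operator(msg: str) -> None: ...

def adaptive_run(dataset: str, max_wait_minutes: int = 60) -> None:
    # 1. Tolerate a late upstream job by polling instead of failing at a fixed time.
    waited = 0
    while not upstream_ready(dataset):
        if waited >= max_wait_minutes:
            notify_operator(f"{dataset}: upstream still late after {waited} min")
            return
        time.sleep(60)
        waited += 1

    # 2. Detect schema drift and adapt: proceed with the known-compatible columns
    #    and flag the new ones for review rather than aborting the pipeline.
    drift = set(current_schema(dataset)) - set(EXPECTED_SCHEMA)
    if drift:
        notify_operator(f"{dataset}: new columns {sorted(drift)} detected, proceeding with known columns")

    run_job(dataset)
```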

2. Enabling Governed Ad-Hoc Automation

This is the future of enterprise automation. We need to empower business users to create those crucial ad-hoc insights—but within a secure, controlled framework.

The vision is a conversational interface—a chat function—where a line-of-business user can ask for complex data insights (“Show me all high-value opportunities forecasted to close next quarter”) and the automation platform handles the complexity seamlessly.

Behind the scenes, the control plane performs three critical functions:

1. Orchestration: Coordinating the necessary blend of technologies (e.g., Apache Airflow, file transfer and streaming) required to fulfill the request.

2. Governance And Guardrails: Automatically applying data masking and filtering to ensure the user only accesses data they’re authorized to see.

3. Auditability: Logging the entire workflow, verifying that the data remained within approved systems of record and maintaining compliance.
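
A minimal sketch of how these three functions could fit together is shown below, with entitlements, masking and audit records represented as simple in-memory structures. Every name here is illustrative rather than a description of any particular platform.

```python
# Sketch of a control-plane request handler: orchestration, guardrails, audit.
# USER_ENTITLEMENTS, plan_tasks and audit_log are assumptions for illustration;
# a real platform would back these with its own orchestrator, policy engine
# and tamper-evident audit store.
import datetime
import json

USER_ENTITLEMENTS = {
    "lob_analyst": {"region": "EMEA", "columns": ["opportunity_id", "amount", "close_date"]}
}
audit_log: list[dict] = []

def plan_tasks(request: str) -> list[str]:
    """Orchestration: translate a natural-language request into pipeline steps."""
    return ["extract_crm_opportunities", "join_activity_stream", "aggregate_forecast"]

def apply_guardrails(rows: list[dict], role: str) -> list[dict]:
    """Governance: filter and mask so users only see data they are entitled to."""
    allowed = USER_ENTITLEMENTS[role]["columns"]
    return [{k: v for k, v in row.items() if k in allowed} for row in rows]

def handle_request(user: str, role: str, request: str) -> list[dict]:
    tasks = plan_tasks(request)
    # Stand-in for the rows the orchestrated pipeline would actually produce.
    rows = [{"opportunity_id": "OPP-1", "amount": 250000,
             "close_date": "2025-09-30", "owner_email": "owner@example.com"}]
    result = apply_guardrails(rows, role)
    # Auditability: record who asked, what ran, and which systems held the data.
    audit_log.append({
        "user": user, "role": role, "request": request, "tasks": tasks,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "systems": ["crm", "warehouse"],
    })
    return result

if __name__ == "__main__":
    print(json.dumps(handle_request(
        "jane", "lob_analyst",
        "Show me all high-value opportunities forecasted to close next quarter"), indent=2))
```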

The responsibility for leading this transformation falls squarely on the enterprise architects. But IT operations teams, who own the platforms capable of providing this robust orchestration and governance, have an imperative to educate architects on how to leverage and modernize their existing tools to enable this new era of intelligent, ad-hoc automation.

In this moment, we’re building the steam engine for this new natural resource—data. We must ensure that we build it not only for power but for control.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.
