CData, Databricks, and the Expanding MCP Ecosystem


Key Takeaways

CData has been named a Featured Launch Partner for the Model Context Protocol (MCP) within Databricks Marketplace, enhancing AI agent development by providing direct access to live data from over 350 enterprise systems.

The integration of CData Connect AI allows Databricks users to build context-aware AI agents without the need for data ingestion into Databricks, streamlining the path from prototype to production, and reducing latency and architectural overhead.

MCP establishes a common standard for agent access to enterprise systems, enabling teams to treat external systems as first-class inputs, which minimizes custom integration efforts and enhances the scalability and reliability of AI agents.

CData has been named a Featured Launch Partner for the Model Context Protocol (MCP) within the Databricks Marketplace. The partnership makes CData Connect AI available inside Databricks, positioning it as foundational infrastructure for agent development.

The integration allows developers to use Databricks’ AI tooling, including Mosaic AI, to build agents that access live data through MCP from external enterprise systems such as NetSuite, Salesforce, ServiceNow, and more than 350 additional systems.

AI agents built in Databricks increasingly require access to live operational systems. CData Connect AI extends that reach, allowing agents built with Databricks’ AI tooling to interact directly with ERP, CRM, and service platforms through MCP.

This capability matters most for enterprises that use Databricks as a core data platform and want to move AI agents into production workflows. As agents move closer to systems of record, success depends on reliable connectivity, real-time access, and governed execution paths.

An Emerging Standard for Enterprise Agent Access

MCP adoption now extends beyond a single platform. Following its earlier Microsoft integration, CData brings the same connectivity model into Databricks, reinforcing MCP as a common standard for agent access to enterprise systems.

As with CData’s integration with Microsoft Copilot Studio and Microsoft Agent 365, teams building AI agents with Databricks’ native tooling can treat external enterprise systems as first-class inputs, without separate integration layers or custom connectors.

CData’s MCP layer operates alongside Databricks’ AI tooling, shortening the path from agent design to production deployment. Within Databricks, new data sources often stall the move from prototype to production; the MCP layer helps remove that bottleneck.

With CData Connect AI, agent development inside Databricks remains decoupled from data access. Teams can build agents using Mosaic AI, LangChain, or Microsoft Copilot Studio while relying on a neutral MCP layer that abstracts enterprise systems from Databricks’ intelligence and orchestration layer. This separation lets Databricks users evolve models, pipelines, and agent logic without reworking connections.

Agents query and act directly on source data, using a consistent interface across Databricks and other platforms. Standardization reduces custom code, limits integration sprawl, and lowers the cost of maintaining production agents over time.
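To make that consistent interface concrete, here is a minimal sketch of what an MCP tool call can look like from Python using the open-source MCP client SDK. The server URL, tool name, and query arguments are placeholders for illustration, not CData’s documented Connect AI interface.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Placeholder endpoint: a Connect AI-style MCP server would expose enterprise
# systems (Salesforce, NetSuite, ServiceNow, ...) behind a single URL.
MCP_URL = "https://example-connect-ai-host/mcp"


async def main() -> None:
    # Open a streamable-HTTP transport to the MCP server and start a session.
    async with streamablehttp_client(MCP_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # Discover the tools the server exposes. The same discovery step
            # works against any MCP server, which is the point of the standard.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Invoke a tool against live source data. The tool name and
            # arguments here are hypothetical, not a published schema.
            result = await session.call_tool(
                "query",
                arguments={"sql": "SELECT Id, Name FROM Salesforce.Account LIMIT 5"},
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

Because discovery and invocation follow the same protocol everywhere, swapping the system behind a tool does not change the agent code that calls it.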

“We’re excited to partner with Databricks to eliminate the organizational context gaps that prevent enterprises from deploying truly intelligent agents at scale,” said Amit Sharma, CEO of CData. “This gives Agent Bricks seamless access to data context outside the Databricks Data Intelligence Platform.”

Databricks Without Mandatory Data Movement

Databricks has traditionally required data to be ingested into its Lakehouse before it could be analyzed or used by downstream applications. That model works well for analytics, but it slows agent-driven use cases that depend on live operational data.

MCP changes that constraint. Through Connect AI, agents built with Databricks tooling can query and, where permitted, act on source systems directly, without first replicating data into Databricks-managed storage. Databricks remains the control plane for AI logic and orchestration, while enterprise systems continue to serve as systems of record.
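As an illustration of that split, the sketch below binds MCP-exposed tools into an agent built with LangChain and LangGraph tooling; each tool call reaches the source system at agent runtime rather than reading data previously loaded into Databricks-managed storage. The packages used (langchain-mcp-adapters, databricks-langchain, langgraph), the MCP server URL, and the model serving endpoint name are assumptions for this example, not a documented CData or Databricks recipe.

```python
import asyncio

from databricks_langchain import ChatDatabricks  # Databricks-served LLM wrapper
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Placeholder: one MCP server fronting the operational systems the agent needs.
    mcp_client = MultiServerMCPClient(
        {
            "connect_ai": {
                "url": "https://example-connect-ai-host/mcp",
                "transport": "streamable_http",
            }
        }
    )

    # MCP tools are converted to LangChain tools. No data is replicated first;
    # each tool invocation queries the source system when the agent runs.
    tools = await mcp_client.get_tools()

    # The serving endpoint name is a placeholder for a model hosted in Databricks.
    llm = ChatDatabricks(endpoint="databricks-meta-llama-3-3-70b-instruct")
    agent = create_react_agent(llm, tools)

    response = await agent.ainvoke(
        {"messages": [("user", "List open ServiceNow incidents assigned to my team.")]}
    )
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

In this arrangement the model, the agent graph, and the MCP connection layer can each be replaced independently, which is the separation the integration is meant to preserve.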

“This capability significantly accelerates [Agent Bricks users’] ability to build context-aware AI apps and agents that deliver real business impact,” explained Ariel Amster, director of technology partner management at Databricks.

Teams working in Databricks can avoid building and maintaining ingestion pipelines for every use case, reduce data latency, and limit duplication of sensitive information. More importantly, the integration lets Databricks users extend agent capabilities across ERP, CRM, and service platforms without re-architecting existing data estates.

What This Means for ERP Insiders

Databricks bridges analytics and operations through CData. Agents can access live enterprise systems without staging data in the Lakehouse, reducing latency and architectural overhead. Databricks-based AI moves beyond analysis to participate directly in operational workflows, where timing and accuracy matter.

Operational AI reshapes enterprise data architecture. Agents no longer require every system to be centralized before they can act. Direct access to systems of record reduces pipeline sprawl and forces CIOs to rethink how data platforms support execution at scale.

Standards, not models, determine which agents scale. As Databricks-based agents move into production, open and consistent standards like MCP define how reliably they access data, enforce permissions, and operate across platforms.