How governed, real-time enterprise data enables AI to reason, justify outcomes, and operate with trust at scale.
Alberto, before we get into the specifics of DeepQuery, let’s start with your perspective—how has your journey at Denodo shaped your vision for where data management and AI intersect today?
Over the years I’ve watched enterprises run faster, more complex analytics on ever-broader datasets, and AI raises the bar even further. The hard problems have not changed much: breaking data silos, delivering data with the right business context, and enabling control and governance, all within a heavily distributed data ecosystem with equally distributed data ownership. That’s why Denodo invested in a logical data layer and a rich semantic model: to give AI and humans the same governed, real-time view across distributed systems without necessarily moving the data. Our recent releases double down on this, advancing self-service, AI support, and a marketplace experience, so AI can reliably work with fresh, explainable data instead of stale copies.
DeepQuery introduces a new paradigm in how GenAI engages with enterprise data. What was the core problem you aimed to solve when conceiving this capability?
GenAI applications so far are great at rephrasing known facts and answering direct, simple questions; they struggle with investigation. DeepQuery was built to let AI ask multi-step “why” and “what if” questions across live, governed enterprise data—discovering which systems-of-record matter, querying them in sequence, reconciling results, and returning a report with justified answers, conclusions, and recommendations, not just a fluent paragraph. That’s the gap we targeted: moving from retrieval to deep research over enterprise data.
Many GenAI tools today rely on static or pre-indexed sources. How does DeepQuery’s ability to access live, governed enterprise data change the game for organizations?
Most enterprise questions depend on today’s numbers under today’s business context and priorities. Also, an organization’s most valuable data lives in its databases, not only in its documents. DeepQuery operates over live sources through Denodo’s logical layer and semantic model, so answers reflect current reality, and security is enforced with the same row-level, column-level, and masking controls your BI tools use. The result is more value, higher accuracy, lower risk, and fewer “data detours,” the ad-hoc extracts built just for AI.
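The uniform enforcement described above can be sketched in a few lines. This is plain Python, not the Denodo API; the roles, filters, and data are illustrative assumptions. The point is a single governed access layer that every consumer, human or AI, must pass through:

```python
# Illustrative sketch (not Denodo's API): one governed access layer applying
# the same row filters and column masks to every consumer.

ROW_FILTERS = {"analyst_eu": lambda row: row["region"] == "EU"}   # row-level security
MASKED_COLUMNS = {"analyst_eu": {"ssn"}}                          # column masking

def governed_view(rows, role):
    """Return only the rows and columns this role may see."""
    keep = ROW_FILTERS.get(role, lambda r: True)
    masked = MASKED_COLUMNS.get(role, set())
    return [
        {k: ("***" if k in masked else v) for k, v in row.items()}
        for row in rows
        if keep(row)
    ]

accounts = [
    {"id": 1, "region": "EU", "ssn": "123-45-6789", "balance": 1000},
    {"id": 2, "region": "US", "ssn": "987-65-4321", "balance": 2500},
]

# A BI dashboard and an AI agent both go through the same function,
# so neither can bypass the policy.
print(governed_view(accounts, "analyst_eu"))
```

Because the policy lives in one layer rather than in each application, there is no separate, weaker copy of the rules for AI workloads.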
Let’s talk about complexity—what kind of business questions does DeepQuery excel at answering that conventional GenAI solutions typically struggle with?
Cross-functional “investigations” that span systems and timeframes, such as “Why did fund outflows spike last quarter, and which client segments drove it?” or “What’s driving churn in Region A vs. B?”, where the path to the answer isn’t known in advance. DeepQuery orchestrates multi-hop exploration across finance, CRM, supply chain, telemetry, and more, and returns a structured, sourced explanation.
DeepQuery emphasizes not just answers, but explainability. Why is this level of transparency so critical in enterprise AI applications today?
Enterprises need answers they can trust—with lineage, source-of-truth, and policy context. And in some regions, it’s the law—for example, the European Union’s AI Act. DeepQuery returns not only conclusions, but the steps, queries, and sources used, aligning with our platform’s catalog, lineage, and governance, so teams can audit how an answer was produced and re-run it under the same controls. That transparency is essential for regulated use cases and executive confidence.
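To make the idea of an auditable answer concrete, here is one possible shape such output could take. This is an illustrative sketch, not DeepQuery’s actual schema; the field names, queries, and findings are invented for the example:

```python
# Illustrative shape (not DeepQuery's actual schema) of an explainable answer:
# conclusions plus the steps, queries, and sources that produced them.

answer = {
    "question": "Why did fund outflows spike last quarter?",
    "conclusion": "Outflows were concentrated in retail clients in Region A.",
    "steps": [
        {"step": 1,
         "query": "SELECT segment, SUM(outflow) FROM flows GROUP BY segment",
         "source": "finance.flows",
         "finding": "Retail segment drove most of the outflows."},
        {"step": 2,
         "query": "SELECT region, churn_rate FROM churn",
         "source": "crm.churn",
         "finding": "Region A churn rose quarter over quarter."},
    ],
}

# An auditor can inspect and replay each step under the same access controls.
for s in answer["steps"]:
    print(f"step {s['step']}: {s['source']} -> {s['finding']}")
```

The key property is that every conclusion carries the chain of queries and sources behind it, so a reviewer can re-run the same steps under the same policies instead of trusting a fluent paragraph.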
With the integration of external public and partner data, how does DeepQuery ensure the balance between comprehensive insights and data governance?
We extend your governed semantic layer to include approved public and partner feeds, then apply the same security, masking, and auditing across all sources—internal or external. DeepQuery can enrich an investigation with market data, news, or partner signals, but every step inherits enterprise policy, and every output is traceable back to sources and access rules.
How does the DeepQuery capability integrate with the broader Denodo AI SDK, and what flexibility does it offer for developers and AI teams building their own copilots or agents?
DeepQuery is delivered through the Denodo AI SDK alongside Query-RAG, AI development tool APIs, and now MCP support. Developers can call DeepQuery programmatically, combine it with their own tools, or embed it into vertical copilots while keeping security and semantics centralized. We’re also investing in talent enablement: the new Denodo AI SDK Certified Developer Associate certification trains AI developers to use these capabilities effectively, driving productivity and faster time to innovation.
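A programmatic call from a copilot or agent might look roughly like the sketch below. The function, mode names, and fields are assumptions for illustration only; they are not the documented Denodo AI SDK interface:

```python
# Hypothetical sketch of invoking a DeepQuery-style capability from code.
# The field names and mode values are assumptions, not Denodo's documented API.
import json

def build_deepquery_request(question, mode="deep_research"):
    """Assemble a request payload an agent or copilot might send."""
    return {
        "question": question,
        "mode": mode,            # e.g. a simple lookup vs. a multi-step investigation
        "return_trace": True,    # ask for the steps, queries, and sources back
    }

payload = build_deepquery_request("What is driving churn in Region A vs. B?")
print(json.dumps(payload, indent=2))
```

The design point the interview emphasizes is that however the call is made, security and semantics stay centralized in the platform rather than being re-implemented in each copilot.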
From your perspective, what is the biggest misconception enterprises have about their data needs when scaling GenAI initiatives?
Forgetting about metadata and semantics. LLMs are great, but they cannot read minds: they need comprehensive, detailed context to use the organization’s data. Our experience with customers’ GenAI initiatives has shown this time and again: you need to invest time in building and refining a robust semantic layer for the data your AI use cases need. Simply putting “more data in a vector store” will not produce better enterprise answers.
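A minimal sketch of this point: the same table, with and without business context. The table, column names, and descriptions below are invented for illustration; the contrast is what matters, since an LLM can do little with the bare schema but a lot with the annotated one:

```python
# Illustrative sketch: raw schema vs. semantic-layer metadata rendered
# into prompt context an LLM can actually use. All names are invented.

schema_only = "TABLE churn(cid INT, dt DATE, flg INT)"  # opaque to an LLM

semantic_layer = {
    "table": "churn",
    "description": "Monthly customer churn events for retail banking.",
    "columns": {
        "cid": "Customer ID, joins to crm.customers.id",
        "dt": "Month the customer closed their last account",
        "flg": "1 = voluntary churn, 0 = involuntary closure by the bank",
    },
}

def build_llm_context(layer):
    """Render governed metadata into text an LLM can reason over."""
    lines = [f"Table {layer['table']}: {layer['description']}"]
    lines += [f"- {col}: {desc}" for col, desc in layer["columns"].items()]
    return "\n".join(lines)

print(build_llm_context(semantic_layer))
```

With only `schema_only`, a model must guess what `flg` means; with the rendered context, it can write correct queries and explain its answers, which is the semantic-layer investment the interview argues for.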
Denodo’s inclusion of Model Context Protocol (MCP) support signals a move toward open standards. How important is interoperability in the future of agentic AI ecosystems?
Agentic AI will be an ecosystem game: multiple agents, tools, and hosts working together. MCP gives us an open, interoperable way to plug Denodo’s trusted data services into all the main frameworks and tools for developing AI agents.
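The interoperability argument can be illustrated with an MCP-style tool declaration. MCP tool listings describe each tool with a name, a description, and a JSON Schema for its input; the specific tool name and schema below are assumptions for illustration, not a real Denodo tool:

```python
# Illustrative MCP-style tool declaration: an agent host discovers this
# description and can call the tool with no vendor-specific glue code.
# The tool name and schema are invented for the example.
import json

tool = {
    "name": "query_governed_data",
    "description": "Run a governed query through the logical data layer.",
    "inputSchema": {                  # JSON Schema, as MCP tool listings use
        "type": "object",
        "properties": {
            "question": {"type": "string"},
        },
        "required": ["question"],
    },
}

print(json.dumps(tool, indent=2))
```

Because the contract is an open standard rather than a proprietary SDK, any MCP-capable host can discover and invoke the tool, which is what makes the ecosystem play possible.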
Lastly, what does success look like for Denodo’s AI Accelerator Program and the early adopters of DeepQuery?
Success is measured in weeks, not months: pilot use cases that move from analyst-days to minutes, with clear governance and auditability, and then scale to more domains. We’re inviting select customers to accelerate their adoption, and momentum with enterprises like NEC, which adopted Denodo for unified, governed access, reinforces the path to value we expect DeepQuery to shorten further.
A quote from the author: “LLMs are great, but they cannot read minds. Invest time in building a robust semantic layer that gives your AI comprehensive, detailed business context about your data and use cases.”

Alberto Pan
Chief Technology Officer, Denodo
Alberto Pan is Chief Technology Officer at Denodo and Associate Professor at the University of A Coruña. He has led product development for all versions of the Denodo Platform and has authored more than 25 scientific papers in areas such as data virtualization, data integration, and web automation.
