
Export Data from D365 FO using Synapse Link

Microsoft introduced Azure Synapse Link for Dataverse to export data from D365 finance and operations apps into Azure Synapse Analytics or Azure Data Lake Storage Gen2. The current version supports both D365FO entities and raw tables. Previous technologies such as Bring Your Own Database (BYOD) and ‘Export to Data Lake’ need to be transitioned to ‘Synapse Link for Dataverse’. The aim is to unify the approach to exporting data from the Dynamics 365 platform, regardless of whether it is D365CE or D365FO. The new feature provides scalability, high availability, and disaster recovery capabilities.

The Azure Synapse Link for Dataverse allows you to choose standard and custom finance and operations entities and tables. It supports continuous entity and table data replication, including create, update, and delete (CUD) transactions. It provides an option to link or unlink the D365FO environment to Azure Synapse Analytics and/or Data Lake Storage Gen2 in your Azure subscription without the need for external tools or configuration. Data is stored in the Common Data Model format, which provides semantic consistency across apps and deployments.

Export Dynamics 365 data using Azure Synapse Link with Azure Data Lake, Synapse, and Spark

Azure Synapse Link for Dataverse offers the following features that you can use with finance and operations data:

  1. Both standard and custom entities and tables can be exported. Older techniques supported only one or the other: BYOD supported only entities, while ‘Export to Data Lake’ supported only raw tables. Synapse Link supports both, although entity support is still limited to a subset of entities and grows with each D365FO release.
  2. Continuous replication of data: create, update, and delete (CUD) transactions are continuously pushed to Azure Data Lake. Unlike BYOD, this doesn’t impact D365FO performance.
  3. It is possible to link or unlink the D365FO environment to multiple Azure Synapse workspaces and/or Data Lake Storage Gen2 accounts in your Azure subscription. The Data Lake account and Synapse workspace must be in the same region as the D365FO environment.
  4. Data can be exported in both CSV and Delta Parquet format. Delta Parquet is recommended for its better read performance and its support for ACID transactions and schema evolution/drift; ACID transactions keep data consistent even under concurrent reads and writes.
  5. The table-count limit that applied to the Export to Data Lake feature doesn’t apply to Azure Synapse Link for Dataverse.

There are three options for exporting D365FO tables using Synapse Link.

Option 1: This option exports D365FO data to Azure Data Lake in CSV format within your Azure subscription, mirroring the familiar ‘Export to Data Lake’ feature in D365FO. The data resides in Azure Data Lake, enabling downstream applications to build their own data pipelines for optimal consumption. Organizations with dedicated data and analytics capabilities may find this option ideal. Costs are primarily associated with Data Lake storage, with additional expenses tied to downstream data reads, copies, and transformations. Integration with Synapse and an Apache Spark pool is optional for this option.

Export Dynamics 365 data to Azure Data Lake in CSV format (BYOL CSV) using Azure Synapse Link
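
As a rough sketch of downstream consumption under this option, the PySpark snippet below reads one exported table. The storage account, container, and table name are assumptions for illustration; in practice the exported CSV files typically carry no header row, so column names come from the model.json metadata file that Synapse Link writes alongside the data.

```python
# Minimal sketch (assumed paths/names): read a Synapse Link CSV export.
import json
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-synapse-link-csv").getOrCreate()

lake = "abfss://dataverse-env@mydatalake.dfs.core.windows.net"  # assumed location

# model.json describes each exported table; the CSV files have no header row.
model = json.loads(spark.read.text(f"{lake}/model.json", wholetext=True).first()["value"])
entity = next(e for e in model["entities"] if e["name"] == "custtable")  # example table
columns = [a["name"] for a in entity["attributes"]]

df = (
    spark.read
    .option("header", "false")
    .option("multiLine", "true")  # text fields may contain embedded newlines
    .csv(f"{lake}/custtable/*.csv")
    .toDF(*columns)               # assumes file and metadata column counts match
)
df.show(10)
```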

Option 2: Choose this option to export D365FO data to Azure Data Lake, but in the Delta Parquet format. This format improves efficiency and query performance, and supports ACID transactions and schema evolution/drift. To enable this conversion, you must provision a Synapse workspace and an Apache Spark pool within your Azure subscription for the CSV-to-Delta transformation. As with Option 1, downstream applications retain the flexibility to customize their pipelines to suit their specific needs. Costs encompass Data Lake storage, the Azure Synapse workspace, and the Apache Spark pool. Note that frequent Delta conversions over a high volume of data changes can lead to substantial Spark pool costs, and downstream data consumption and conversion expenses should also be considered. This option offers enhanced performance and adaptability but requires careful monitoring to manage costs effectively.

Export Dynamics 365 data to Azure Data Lake in Delta Parquet format using Azure Synapse Link
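
To sketch what a downstream read looks like under this option (the lake path below is an assumption), the Delta table can be queried from any Spark environment with Delta Lake support, and its transaction log also enables point-in-time reads:

```python
# Minimal sketch (assumed path): query a Delta table produced by the
# CSV-to-Delta conversion. Requires a Spark runtime with Delta Lake
# support, e.g. a Synapse Spark pool.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-synapse-link-delta").getOrCreate()

path = "abfss://dataverse-env@mydatalake.dfs.core.windows.net/deltalake/custtable"

df = spark.read.format("delta").load(path)  # ACID: always a consistent snapshot
df.groupBy("dataareaid").count().show()     # dataareaid: standard D365FO company column

# The Delta transaction log keeps history, so earlier versions stay queryable:
first_version = spark.read.format("delta").option("versionAsOf", 0).load(path)
```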

Option 3: For a streamlined solution, opt to export D365FO data directly to OneLake. This approach minimizes the need for downstream applications to manage data conversion or copying. OneLake acts as a centralized repository, simplifying data access and utilization. This option is well suited for organizations seeking a seamless and integrated data export process. Costs are associated with OneLake storage, which is part of the Dataverse storage cost; there are no Azure Synapse workspace or Apache Spark pool expenses. However, note that OneLake support is limited to D365FO and CE entities.

Export Dynamics 365 data to OneLake in Delta Parquet format using Azure Synapse Link
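
In a Microsoft Fabric notebook attached to a lakehouse that surfaces this OneLake data, the synced tables are directly queryable with no copy step; the table and column names below are illustrative:

```python
# Minimal sketch (illustrative names): query D365FO data from a Fabric notebook.
# In Fabric notebooks a `spark` session is predefined, and tables of the
# default lakehouse are addressable by relative path.
df = spark.read.format("delta").load("Tables/custtable")
df.createOrReplaceTempView("custtable")
spark.sql("SELECT dataareaid, COUNT(*) AS n FROM custtable GROUP BY dataareaid").show()
```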

The table below compares the three options.

| Criteria | (Option 1) Synapse Link BYOL (CSV) | (Option 2) Synapse Link BYOL (Delta) | (Option 3) Microsoft OneLake with Fabric |
| --- | --- | --- | --- |
| Ease of administration | Customer-managed PaaS resources: storage account | Customer-managed PaaS resources: storage account, Synapse workspace and Spark pool | Managed by Microsoft |
| Security | Firewall support on storage account | Firewall support on storage account and Synapse workspace | Managed by Microsoft |
| Query performance | Data format: CSV. No read-write contention when reading a completed incremental-update folder. | Data format: Delta. No read-write contention, as Delta supports ACID; better read performance. | Data format: Delta. No read-write contention, as Delta supports ACID; better read performance. |
| End-to-end data freshness | Configurable: 15 minutes – 24 hours | Configurable: 15 minutes – 24 hours | ~1 hour |
| Data write and storage costs | Storage + transaction cost | Storage + transaction + Delta conversion (Spark pool) cost | Dataverse capacity (entitlement + add-on) |
| Compute cost | Downstream data pipeline cost | Downstream data pipeline cost | Fabric capacity cost |

The following prerequisites apply:

  • D365FO environment on version 10.0.34 (PU 58) or later
  • Microsoft Power Platform integration enabled for the D365FO environment
  • The SQL row version change tracking configuration key enabled
  • Access to an Azure subscription with the following resources provisioned:
    • Azure Data Lake Storage Gen2 account in the same region as the D365FO environment
    • Azure Synapse Analytics workspace in the same region as the D365FO environment
    • Azure Synapse Spark pool, version 3.1 or later

The following access is required to configure Synapse Link. More information can be found at https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-select-fno-data.
| Required permission | Description |
| --- | --- |
| Dataverse system administrator security role for the Dataverse environment | A D365FO environment is connected to a Dataverse environment; to enable Synapse Link, the user must be the administrator of that environment. |
| Owner and Storage Blob Data Contributor roles on the Azure Data Lake Storage Gen2 account | Synapse Link exports data from Dataverse to blob storage via a microservice. The user configuring Synapse Link needs Owner access to manage the storage account’s access control. |
| Synapse Administrator security role for the Synapse workspace | Synapse Link connects to an Azure Synapse workspace, so the user connecting the two services must be the workspace administrator. This workspace also creates a Spark pool instance to convert files from CSV to Parquet format. |
| User Access Administrator and Contributor permissions on the resource group | In the resource group where the Azure Data Lake account and Synapse workspace reside, the user must be able to manage these resources, for example to create a Spark pool. |

The following link explains how to connect Azure Synapse Link for Dataverse (which is connected to the D365FO environment) with your Azure Synapse workspace, along with other frequently asked questions.

Azure Synapse Link for Dataverse FAQ

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/export-data-lake-faq

Azure Synapse Link for D365FO offers a seamless data export solution for reporting, analysis, and integration. Its scalability ensures efficient handling of growing data volumes without impacting D365FO performance, while enhanced performance capabilities enable real-time data processing, analysis, and integration.

The Integration Gap: Why 95% of Enterprise AI Projects Fail and How Azure Integration Services Solves It

The enterprise AI landscape faces a sobering reality: despite huge annual investment, 95% of generative AI pilots fail to deliver measurable business returns. The culprit isn’t the AI models themselves—it’s the missing middleware layer that connects intelligent systems to operational infrastructure. This article examines how Azure Integration Services (AIS), powered by the Model Context Protocol (MCP), transforms isolated AI experiments into production-grade enterprise solutions.

Recent research paints a troubling picture of enterprise AI adoption:

  • 95% failure rate: MIT’s Project NANDA found that virtually all generative AI pilots yield no measurable business return
  • 88% never reach production: IDC reports that nearly nine out of ten AI proof-of-concepts fail to transition beyond experimental stages
  • 42% abandonment rate: S&P Global reveals that 42% of companies now abandon most of their AI initiatives before production, up from just 17% the previous year
  • Zero ROI for 42%: over two-fifths of AI projects deliver no return on investment whatsoever

The failure isn’t about model capability. Today’s foundation models can understand context, reason through complex problems, and generate human-quality responses. The breakdown happens at the integration layer: the critical middleware that connects AI to enterprise reality.

According to Informatica’s CDO Insights 2025 survey, the top obstacles to AI success are:

  1. Data quality and readiness (43%)
  2. Lack of technical maturity (43%)
  3. Shortage of skills and data literacy (35%)

But beneath these symptoms lies a fundamental architectural problem: AI models exist in isolation from the systems that run businesses.

Most enterprise AI implementations excel at conversation but struggle with operations. They can:

  • Answer questions about customer data
  • Generate reports and summaries
  • Provide recommendations

But they cannot:

  • Update CRM records
  • Trigger workflow approvals
  • Execute transactions across systems
  • Orchestrate multi-step business processes

This is the integration gap—the chasm between what AI can understand and what it can do.

To move from experimental to operational, AI systems require six foundational capabilities:

1. Secure Connectivity

Unified access to both cloud services and on-premises systems, with proper authentication and authorisation at every layer.

2. Data Transformation

Seamless mapping between disparate data formats, protocols, and schemas—translating between REST, SOAP, GraphQL, and proprietary APIs.
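
As a toy illustration of what this layer does (all field names are invented for the example), translating a record between two systems’ schemas boils down to a declarative field map plus normalisation of types and conventions:

```python
# Toy sketch of schema mapping between two systems (all field names invented).
from datetime import datetime, timezone

FIELD_MAP = {                      # source field -> target field
    "CustAccount": "customer_id",
    "NameAlias":   "display_name",
    "CreatedOn":   "created_at",
}

def to_target(record: dict) -> dict:
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    # Normalise the timestamp to ISO-8601 UTC, the target's convention.
    out["created_at"] = (
        datetime.fromisoformat(out["created_at"])
        .astimezone(timezone.utc)
        .isoformat()
    )
    return out

print(to_target({"CustAccount": "C-001", "NameAlias": "Contoso",
                 "CreatedOn": "2024-05-01T09:30:00+02:00"}))
```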

3. Identity Management

Centralised authentication through Azure Entra ID (formerly Azure AD), with support for OAuth2, managed identities, and credential management.
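
A minimal sketch of what this looks like in application code, using the azure-identity library (the token scope below is just an example): DefaultAzureCredential resolves to a managed identity in Azure and to developer credentials locally, so no secrets appear in code.

```python
# Minimal sketch: keyless token acquisition via Entra ID with azure-identity.
# In Azure this resolves to the resource's managed identity; locally it falls
# back to environment or developer credentials.
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://management.azure.com/.default")  # example scope
print("token acquired, expires at", token.expires_on)
```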

4. Observability

End-to-end telemetry, logging, and monitoring that provides visibility into every model-to-tool interaction.

5. Rate Control & Quotas

Enforced throttling, token management, and cost controls to prevent runaway usage and manage budgets.
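
In APIM these limits are declared as gateway policy rather than written as application code, but the mechanics are worth seeing; the sketch below is a generic token-bucket limiter of the kind a gateway enforces, with arbitrary example numbers:

```python
# Illustrative token-bucket rate limiter (the kind of logic an AI gateway
# enforces via policy). Capacity and refill rate are arbitrary examples.
import time

class TokenBucket:
    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = capacity
        self.refill = refill_per_sec
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.refill)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

bucket = TokenBucket(capacity=100, refill_per_sec=10)  # ~10 requests/s sustained
print(bucket.allow(cost=5))                            # e.g. 5 "tokens" per LLM call
```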

6. Resilience Patterns

Built-in retries, fallback mechanisms, circuit breakers, and error handling to ensure stability under load.
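
A compact sketch of two of these patterns, retry with exponential backoff and a minimal circuit breaker; all thresholds and timings are arbitrary examples:

```python
# Illustrative resilience patterns: retry with exponential backoff plus a
# minimal circuit breaker. All thresholds/timings are arbitrary examples.
import random
import time

def with_retries(call, attempts: int = 4, base_delay: float = 0.5):
    for i in range(attempts):
        try:
            return call()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i) + random.uniform(0, 0.1))  # backoff + jitter

class CircuitBreaker:
    def __init__(self, max_failures: int = 5, reset_after: float = 30.0):
        self.failures, self.max_failures = 0, max_failures
        self.opened_at, self.reset_after = None, reset_after

    def call(self, fn):
        # While open and within the cooldown window, fail fast.
        if self.opened_at and time.monotonic() - self.opened_at < self.reset_after:
            raise RuntimeError("circuit open: failing fast")
        try:
            result = fn()
            self.failures, self.opened_at = 0, None   # success closes the circuit
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()      # open the circuit
            raise
```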

Without middleware implementing these capabilities, AI agents become ad-hoc, ungoverned, and insecure—creating compliance gaps and operational blind spots that make production deployment impossible.

Azure Integration Services provides the complete middleware stack that enterprises need to operationalise AI. It consists of four core services that work together seamlessly:

1. Azure API Management (APIM): The AI Gateway

APIM acts as the front door for AI tools, providing enterprise-grade gateway capabilities.

Security & Authentication

  • Managed identities for keyless authentication to Azure services
  • OAuth2 and Entra ID integration for user-level authorisation
  • Credential manager for secure token storage and rotation
  • Policy-based access control with fine-grained permissions

Traffic Management

  • Rate limiting and quota enforcement
  • Request/response transformation
  • Load balancing across multiple backends
  • Caching for improved performance

Observability

  • Full request/response logging
  • Token usage tracking across applications
  • Performance metrics and analytics
  • Integration with Azure Monitor and Application Insights

AI-Specific Capabilities

The AI gateway in APIM provides specialised features for generative AI:

  • Token consumption monitoring and billing
  • Model endpoint routing and failover
  • Prompt injection detection
  • Response validation and filtering

Export as MCP Server

APIM includes a one-click “Export as MCP Server” wizard that converts existing REST APIs into MCP-compatible endpoints, eliminating manual integration work.
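
Under the hood MCP is JSON-RPC 2.0, so a tool-discovery call to such an exported endpoint has the shape below. This is illustrative only: the URL and subscription key are placeholders, and real clients use an MCP SDK with the protocol’s streamable HTTP transport rather than a bare POST.

```python
# Illustrative only: the JSON-RPC shape of an MCP tool-discovery request.
# Endpoint URL and key are placeholders; real clients use an MCP SDK.
import requests

MCP_ENDPOINT = "https://my-apim.azure-api.net/orders/mcp"  # placeholder

resp = requests.post(
    MCP_ENDPOINT,
    json={"jsonrpc": "2.0", "id": 1, "method": "tools/list"},
    headers={"Ocp-Apim-Subscription-Key": "<key>"},        # typical APIM auth header
    timeout=30,
)
for tool in resp.json()["result"]["tools"]:
    print(tool["name"], "-", tool.get("description", ""))
```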

2. Azure API Centre (APIC): The Agent Store

APIC serves as the centralised MCP registry, providing comprehensive cataloguing, automated discovery, and governance at scale.

Comprehensive Cataloguing

Register APIs, MCP servers, and tools with rich metadata:

  • Technical documentation
  • SLA definitions
  • Cost information
  • Usage examples
  • Version history

Automated Discovery

AI agents can query APIC to discover available tools, with support for:

  • Semantic search
  • Tag-based filtering
  • Role-based access
  • Environment-specific catalogues (dev, staging, production)

Governance at Scale

  • Approval workflows for new tool registration
  • Compliance tracking and audit trails
  • Lifecycle management (deprecated, beta, GA)
  • Usage analytics and optimisation recommendations

3. Logic Apps: Intelligent Workflow Orchestration

Logic Apps evolve from static automation into intelligent agent loops.

1,400+ Connectors

Pre-built integrations to:

  • Enterprise Systems: SAP, Dynamics 365, Salesforce, ServiceNow
  • Databases: SQL Server, PostgreSQL, MongoDB, CosmosDB
  • Cloud Services: AWS, Google Cloud, Azure services
  • Productivity: Office 365, SharePoint, Teams
  • Custom: REST APIs, SOAP services, on-premises systems

Visual Design Experience

Low-code/no-code workflow designer that enables business users and developers to collaborate on process automation.

Agent Loop Pattern

Logic Apps implement the “Think-Act-Reflect” pattern:

  1. Think: Analyse business goals and current context
  2. Act: Execute operations through connectors
  3. Reflect: Evaluate outcomes and adjust strategy

This pattern enables AI agents to not just execute predefined workflows, but to dynamically orchestrate multi-step processes based on real-time conditions.
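
A skeletal version of that loop, with stub functions standing in for the model call, the connector invocation, and the outcome check (all names are placeholders, not a Logic Apps API):

```python
# Skeletal Think-Act-Reflect loop. The three helpers are stubs standing in
# for an LLM planning call, a connector invocation, and an outcome check.
def plan_next_step(goal, history):
    # Stub "Think": stop after two steps; a real agent would call a model here.
    return None if len(history) >= 2 else {"action": "lookup", "input": goal}

def execute(step):
    # Stub "Act": a real agent would invoke a Logic App / MCP tool here.
    return {"status": "ok", "action": step["action"]}

def evaluate(step, outcome):
    # Stub "Reflect": record the result so the next Think sees it.
    return {"step": step, "outcome": outcome, "success": outcome["status"] == "ok"}

def agent_loop(goal, max_steps=10):
    history = []
    for _ in range(max_steps):
        step = plan_next_step(goal, history)      # Think
        if step is None:                          # goal judged complete
            break
        outcome = execute(step)                   # Act
        history.append(evaluate(step, outcome))   # Reflect
    return history

print(agent_loop("check order 1234 status"))
```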

MCP Integration

Every Logic App HTTP endpoint can be exposed as an MCP tool, instantly making it discoverable to AI agents through APIC.
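
Continuing the JSON-RPC sketch from the APIM section, invoking such a tool is a tools/call request; the endpoint, tool name, and arguments below are all placeholders:

```python
# Illustrative tools/call request against a Logic-App-backed MCP tool.
# Endpoint, tool name, and arguments are placeholders.
import requests

resp = requests.post(
    "https://my-apim.azure-api.net/orders/mcp",  # placeholder
    json={
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": {"name": "approve_order", "arguments": {"order_id": "1234"}},
    },
    timeout=30,
)
print(resp.json()["result"])
```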

The failure of enterprise AI isn’t about weak models—it’s about weak integration. Azure Integration Services, powered by MCP, provides the missing middleware layer that transforms isolated AI experiments into production-grade, ROI-driven enterprise solutions.

By combining secure connectivity, governance, orchestration, and interoperability, AIS ensures that AI doesn’t just talk—it acts.

The integration gap is closing. The enterprises that embrace this architecture will be the ones that finally unlock AI’s full business value.