Dynamics 365 for Unified Operations has evolved into purpose-built applications that help you manage specific business functions, which means it must integrate with a diverse set of systems. This blog describes integration patterns, integration scenarios, and best practices. There are several ways users and systems can interact with D365 UO, both to populate data into it and to retrieve data from it. In my opinion, the integration option can be chosen based on the following three criteria:
- Retrieve data from D365 UO or populate data to D365 UO
- Real-time interaction with D365 UO or batch processing of data
- Amount of data that needs to be exchanged (data volume)
| |Real-Time/Near Real-Time (Small Data Volume)|Batch Job (Large Data Volume)|
|---|---|---|
|Retrieve Data from D365 UO|OData, Custom web service|Data Management Framework|
|Populate Data to D365 UO|OData, Custom web service|Data Management Framework|
Dynamics 365 UO Real-Time Integration Options
Dynamics 365 UO Bulk/Batch Processing
Dynamics 365 UO OData and REST
Dynamics 365 UO provides a REST API to interact with the system via data entities. The OData endpoint offers a real-time or near real-time way to interact with D365 UO and can be used to create, retrieve, update, and delete (CRUD) data in Dynamics 365 UO.
Data Entity: A data entity in D365 is an abstraction from the physical implementation of database tables. A data entity is a simplified, de-normalized representation of underlying tables. A data entity represents a common data concept or piece of functionality (e.g. Vendors V2, where the details are stored in normalized relational tables but are represented as one flat view in the Vendor Details data entity).
The data flow for interacting with Dynamics 365 UO using oData:
The Technical implementation of oData with Dynamics 365 UO can be found here
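As a minimal sketch of the OData interaction described above, the snippet below builds an authenticated CRUD request against a data entity. The environment URL, entity name, and field names are illustrative assumptions; the `/data/<Entity>` path and `cross-company` query option follow the standard D365 F&O OData conventions.

```python
import json
import urllib.request

BASE_URL = "https://yourenv.cloudax.dynamics.com"  # hypothetical environment URL

def odata_url(entity, key=None, cross_company=True):
    """Build an OData resource URL for a D365 UO data entity."""
    url = f"{BASE_URL}/data/{entity}"
    if key:
        url += f"({key})"  # e.g. (dataAreaId='usmf',VendorAccountNumber='V001')
    if cross_company:
        url += "?cross-company=true"
    return url

def make_request(method, url, token, payload=None):
    """Prepare an authenticated OData request; method is GET/POST/PATCH/DELETE."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(url, data=data, headers=headers, method=method)

# Create a vendor via the Vendors V2 entity (entity and field names are illustrative)
req = make_request(
    "POST",
    odata_url("VendorsV2", cross_company=False),
    token="<access-token>",
    payload={"VendorAccountNumber": "V001", "VendorName": "Contoso"},
)
# urllib.request.urlopen(req) would send it; omitted to keep the sketch offline.
```

A GET on the same URL retrieves the entity, PATCH with a key updates it, and DELETE removes it.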
Dynamics 365 UO Business Event
Dynamics 365 UO Business Events can send events/notifications to external applications such as Azure Integration services, which can use the trigger to handle specific integration or business-process scenarios.
Events in Finance and Operations were previously confined to use within Finance and Operations. The new capability provides a framework that allows business processes in Finance and Operations to capture business events as the processes are executed and send the events to an external system or application.
More about the business event can be found here
Business events provide a perfect integration scenario when an event occurs in D365FO and the information needs to be passed on to third-party systems.
These business events can be consumed by:
- Azure Service Bus
- Azure Logic Apps
- Microsoft Flow
- Azure Functions
- HTTPS Trigger
Since these events happen in the context of business processes, they are called business events that enable business process integration. External business processes will subscribe to specific business events from Finance and Operations to get notified when they occur. The business events can also be consumed as “triggers” in the Finance and Operations connector.
A custom or OOTB business event can trigger Azure Integration Services to process or forward the trigger to Third-party applications.
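To make the subscription side concrete, here is a minimal sketch of a receiver that dispatches an incoming business event based on its type. The handler logic and the specific event name are illustrative; the envelope fields (`BusinessEventId`, `EventId`, `EventTime`, `LegalEntity`) follow the documented business-event payload shape.

```python
import json

def handle_business_event(body):
    """Dispatch a D365 UO business event to a handler based on its type.

    The envelope fields below follow the documented business-event payload;
    the routing and handler behaviour are illustrative.
    """
    event = json.loads(body)
    event_id = event.get("BusinessEventId", "")
    if event_id == "PurchaseOrderConfirmedBusinessEvent":  # standard OOTB event
        return f"PO confirmed in {event.get('LegalEntity')} at {event.get('EventTime')}"
    # Unknown events are acknowledged but not processed
    return f"ignored: {event_id}"

# A simplified sample payload as it might arrive via Service Bus or an HTTPS trigger
sample = json.dumps({
    "BusinessEventId": "PurchaseOrderConfirmedBusinessEvent",
    "EventId": "6bd95c28-0000-0000-0000-000000000000",
    "EventTime": "2020-01-01T12:00:00Z",
    "LegalEntity": "USMF",
})
print(handle_business_event(sample))
```

In a real deployment this function body would sit inside an Azure Function or Logic App action rather than being called directly.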
Dynamics 365 UO Custom webservice
In Microsoft Dynamics 365 UO, a developer can create custom services to expose X++ functionality to external clients. Any existing X++ code can be exposed as a custom service by adding an attribute. D365 UO provides standard attributes that can be set on a data contract class and its members to automatically serialize and deserialize data sent and received across a network connection. Many predefined types, such as collections and tables, are also supported. When a developer writes a custom service under a service group, the service group is always deployed on two endpoints:
- SOAP endpoint
- JSON endpoint
SOAP-based custom service
SOAP-based services remain the same as they were in Dynamics AX 2012.
- All the service groups under the AOTService group node are automatically deployed.
- All services that must be deployed must be part of a service group.
Example endpoint for a dev environment: https://<environment>/soap/services/<ServiceGroupName>
JSON-based custom service
The JSON endpoint follows the pattern https://<environment>/api/services/<ServiceGroupName>/<ServiceName>/<OperationName>
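A minimal sketch of invoking a JSON-based custom service is shown below. The service group, service, operation, and contract member names are hypothetical; only the `/api/services/...` URL pattern is taken from the standard custom-service deployment.

```python
import json
import urllib.request

def custom_service_url(base, group, service, operation):
    """JSON endpoint pattern for D365 UO custom services:
    https://<environment>/api/services/<ServiceGroup>/<Service>/<Operation>"""
    return f"{base}/api/services/{group}/{service}/{operation}"

url = custom_service_url(
    "https://yourenv.cloudax.dynamics.com",              # hypothetical environment
    "MyServiceGroup", "MyCustomService", "getCustomerBalance",  # illustrative names
)
req = urllib.request.Request(
    url,
    data=json.dumps({"customerAccount": "C001"}).encode(),  # contract members are illustrative
    headers={"Authorization": "Bearer <token>", "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would invoke the service; omitted to keep the sketch offline.
```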
Bulk or Batch Data Processing
Data Management Framework
Data Management Framework: DMF is the all-in-one data management concept introduced by Microsoft in Dynamics 365 for Finance and Operations. It supports and manages all core data-management tasks and enables asynchronous, high-performing data insertion and extraction scenarios. Examples include interactive file-based import/export and recurring integrations (file, queue, and so on).
Data Package: a data package is a simple .zip file that contains the source (import) or target (export) data itself. The zip file contains the data file plus two manifest files, which hold metadata about the data entity and the processing instructions for DMF.
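The package layout can be sketched with a few lines of code. The file names `Manifest.xml` and `PackageHeader.xml` follow the standard DMF package structure; in practice the manifest contents are copied from a package previously exported from the target environment rather than authored by hand.

```python
import io
import zipfile

def build_data_package(entity_file_name, entity_data, manifest_xml, package_header_xml):
    """Assemble a DMF data package: the data file plus the two manifest files."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(entity_file_name, entity_data)            # e.g. "Customers.csv"
        zf.writestr("Manifest.xml", manifest_xml)             # entity metadata
        zf.writestr("PackageHeader.xml", package_header_xml)  # processing instructions
    return buf.getvalue()

# Dummy contents, for illustration only
pkg = build_data_package(
    "Customers.csv", b"CUSTOMERACCOUNT\n1001",
    b"<Manifest/>", b"<PackageHeader/>",
)
```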
Interacting with Dynamics 365 UO DMF REST API
In order to call the D365 F&O APIs, it is necessary to authenticate with a valid access token. The token can be retrieved from Azure Active Directory using a valid application ID and secret key that have access to the D365FO environment. The application ID and secret key are created by registering an application in Azure Active Directory. Then the DMF REST API can be invoked.
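The token request above can be sketched as a standard client-credentials call against Azure AD. The tenant, application ID, secret, and environment URL below are placeholders; the `/oauth2/token` endpoint and the `grant_type`/`resource` parameters follow the Azure AD v1 client-credentials flow.

```python
import json
import urllib.parse
import urllib.request

def build_token_request(tenant, client_id, client_secret, resource):
    """Client-credentials token request against the Azure AD v1 endpoint."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # the D365 F&O environment URL
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

req = build_token_request(
    "contoso.onmicrosoft.com",  # hypothetical tenant
    "<application-id>", "<secret>",
    "https://yourenv.cloudax.dynamics.com",
)
# token = json.loads(urllib.request.urlopen(req).read())["access_token"]  # network call omitted
```

The returned `access_token` is then passed as a `Bearer` token on every DMF call.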
Interaction using REST API to Export Data
The high-level interaction of API calls to retrieve the data package via the REST API is shown below.
The detailed technical implementation of Dynamics 365 UO DMF interaction using the REST API has been described here
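The export sequence can be sketched as three OData actions on the `DataManagementDefinitionGroups` entity: start the export, poll the execution status, then fetch a download URL for the package. The environment URL, export project name, and legal entity below are assumptions; the action names follow the standard DMF package API.

```python
BASE = "https://yourenv.cloudax.dynamics.com"  # hypothetical environment

def dmf_action(name):
    """URL of a DMF OData action on DataManagementDefinitionGroups."""
    return f"{BASE}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.{name}"

# 1. Start the export (definitionGroupId is an export project that must already exist in DMF)
export_payload = {
    "definitionGroupId": "ExportCustomers",  # illustrative export project name
    "packageName": "CustomersPackage",
    "executionId": "",                       # empty lets DMF generate one
    "reExecute": False,
    "legalEntityId": "USMF",
}
start_url = dmf_action("ExportToPackage")

# 2. Poll until the job reports success, posting {"executionId": ...}
status_url = dmf_action("GetExecutionSummaryStatus")

# 3. Fetch a temporary download URL for the package, again with the executionId
package_url = dmf_action("GetExportedPackageUrl")
```

Each URL is called with a POST carrying the bearer token from the authentication step; the final download URL is a time-limited blob link.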
Dynamics 365 UO Recurring Integration
Recurring integration does the following things:
- It builds on data entities and the Data management framework.
- It enables the exchange of documents or files between Finance and Operations and any third-party application or service.
- It supports several document formats, source mapping, Extensible Stylesheet Language Transformations (XSLT), and filters.
- It uses secure REST application programming interfaces (APIs) and authorization mechanisms to receive data from, and send data back to, integration systems.
The complete flow to submit an import job to recurring integration is shown below:
- The third-party client application authenticates against the Azure AD token issuance endpoint and requests an access token.
- The Azure AD token issuance endpoint issues the access token.
- The access token is used to authenticate to the D365FO DMF and initiate the import or export job. The endpoints are:
The following set of APIs is used to exchange data between the Dynamics 365 F&O Recurring Integrations client and Finance and Operations.
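As a sketch of that exchange, the recurring-integration endpoints can be built as below. The environment URL and activity ID are placeholders (the activity ID comes from the recurring data job created in DMF); the `enqueue`/`dequeue` paths follow the standard recurring integrations API.

```python
BASE = "https://yourenv.cloudax.dynamics.com"  # hypothetical environment

def enqueue_url(activity_id, entity):
    """Submit a file to a recurring import job (activity ID from the DMF recurring job)."""
    return f"{BASE}/api/connector/enqueue/{activity_id}?entity={entity}"

def dequeue_url(activity_id):
    """Download the next file produced by a recurring export job."""
    return f"{BASE}/api/connector/dequeue/{activity_id}"

# The file body is POSTed to the enqueue URL with a bearer token; after a
# successful dequeue, a follow-up acknowledgment call confirms receipt.
print(enqueue_url("00000000-0000-0000-0000-000000000001", "Customers"))
```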
The detailed Technical implementation of Recurring integration can be found here
Microsoft integration patterns can be found here