Dynamics 365 UO: Set financial dimensions using OData integration

Use case: I had to set financial dimension values for purchase order lines using OData while inserting data into D365 Unified Operations.

To set the financial dimensions, we need to set the value of the field “DefaultLedgerDimensionDisplayValue”. When I tried to set the value, I got the following error:


Write failed for table row of type ‘PurchPurchaseOrderLineEntity’. Infolog: Error: The number of dimension values specified in XXXX is different from the number set up in the active format for data entities. Things to try:

  • Make sure the number of dimension values specified matches the active format for data entities.
  • Make sure that any dimension values specified that include the chart of accounts delimiter character in the name escape the delimiter character with a backslash character ‘\’.

Error: Error found when validating record.

The error pointed to the dimension settings in Dynamics 365 UO. To confirm this, I manually set the dimension values and retrieved that record using OData, and found that the dimension value was empty in the JSON.
Request: https://XXX.sandbox.operations.dynamics.com/data/PurchaseOrderHeadersV2?$filter=PurchaseOrderNumber%20eq%20%27XXXXX%27&$top=2. The value of the field DefaultLedgerDimensionDisplayValue was empty: “DefaultLedgerDimensionDisplayValue”: “”.
This confirmed that the dimension settings were not correctly configured in D365UO. D365UO has separate dimension settings for integrations, found under General ledger > Chart of accounts > Dimensions > Financial dimension configuration for integrating applications.


I then added a dimension format for the OData data import: click Add, enter the “Name of the dimension format”, select “Default dimension format”, and set “Active form type” to Yes. Then add the financial dimensions needed; this defines the format to use in the OData call.

The financial dimension format for my use case was ContractHeader|ContractNumber|Crop|Producer|QualityCode|Partnercode. The delimiter for the dimension values is |. The delimiter character can be changed in D365 FO under General ledger > Ledger setup > General ledger parameters, on the Chart of accounts and dimensions tab.

Change the Financial Dimension Delimiter

So let us say we need to fill in only ContractHeader and ContractNumber; then the value should be passed as 0001|0200|||| in the OData call. In other words, we always have to stick to the dimension format, even when no values need to be filled for a given dimension. For example:

  • When ContractHeader, ContractNumber, and Partnercode need to be filled: 0001|0200||||4555
  • When Crop, Producer, QualityCode, and Partnercode need to be filled: ||Onion|33445|HQ|4555
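The padding logic above can be captured in a small helper. A minimal sketch (Python; the format list and sample values come from the use case above, and the function name is hypothetical):

```python
# Build DefaultLedgerDimensionDisplayValue from the active dimension format.
# Every dimension keeps its position; unfilled dimensions stay empty between
# the delimiters.

DIMENSION_FORMAT = ["ContractHeader", "ContractNumber", "Crop",
                    "Producer", "QualityCode", "Partnercode"]

def build_dimension_value(values, fmt=DIMENSION_FORMAT, delimiter="|"):
    """Join dimension values in format order; missing ones stay empty."""
    return delimiter.join(values.get(dim, "") for dim in fmt)

build_dimension_value({"ContractHeader": "0001", "ContractNumber": "0200"})
# -> "0001|0200||||"
build_dimension_value({"Crop": "Onion", "Producer": "33445",
                       "QualityCode": "HQ", "Partnercode": "4555"})
# -> "||Onion|33445|HQ|4555"
```

Building the string this way guarantees the number of delimiter positions always matches the active format, which is exactly what the error message was complaining about.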

I then submitted my OData call, which executed successfully, and the output of the call showed the correct dimension value.

Azure Integration using Managed Identity

This blog describes managed identities and the different types of managed identities. Managed identities are in essence service principals; in fact, they are service principals. The biggest difference is that Azure manages the initial creation of the service principal and renews it automatically, without any additional workload required, which is great and highly recommended.

A managed identity can be used when the source resource and the target resource are in the same Azure AD tenant, that is, when a Logic App or Function App in the tenant wants to access information from a target application in the same tenant, such as D365 CE, D365 FO, or Key Vault.

Service Principal

A service principal is an application whose tokens can be used to authenticate and grant access to specific Azure or cloud resources from a user app, third-party application, service, or automation tool when an organization uses Azure Active Directory. This is the approach usually used in integration designs. A service principal can be considered a replacement for the service account concept in on-premises Active Directory.

Managed Identity

A managed identity is similar to a service principal, but it is always linked to an Azure resource (such as a Logic App or Azure Function), not to an application or third-party connector. It is created automatically when Managed Identity is enabled for an Azure resource. This does create a service principal behind the scenes, but no one knows its credentials. There are two types of managed identities:

  • System assigned
  • User Assigned

System assigned

These identities are tied directly to an Azure resource, such as a Logic App or Azure Function, and follow that resource’s lifecycle: if the resource is deleted, the identity is removed with it.

Let’s walk through the scenario of enabling a system-assigned managed identity for a Logic App, and then using that managed identity to access secrets from Azure Key Vault.
Steps to enable managed identity for a Logic App

  • Go to the Azure Portal
  • Select the Logic App and then Open the Logic App
  • Under the Settings Tab on the Left, find Identity and click on it
  • Then select System assigned
  • Set Status to On and save the changes. This creates a managed identity for the Logic App

Now let’s go to the Key Vault where the secrets are stored, which the Logic App needs to access.

  • Select the Azure Key Vault and then Open the Key vault
  • Under the Settings Tab on the Left, find Access Policies and click on it
  • Then Click on Add Access Policy

Then select Secret Management for Configure from template, and set the Secret permissions to Get and List. Then click Select principal, search for the managed identity of the Logic App, click Select, and click Add.
Then click Save to add the Logic App’s managed identity to the Key Vault.


Now the Logic App is able to access the Key Vault secrets using the following action:

(Screenshot: HTTP action accessing Key Vault using the system-assigned managed identity)
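For reference, the underlying HTTP action can be sketched in JSON roughly as follows. This is a hedged sketch, not taken from the screenshot: the vault name, secret name, and api-version are placeholders; the authentication block is what makes the call use the system-assigned identity:

```json
{
    "Get_secret_from_Key_Vault": {
        "type": "Http",
        "inputs": {
            "method": "GET",
            "uri": "https://<vault-name>.vault.azure.net/secrets/<secret-name>?api-version=7.3",
            "authentication": {
                "type": "ManagedServiceIdentity",
                "audience": "https://vault.azure.net"
            }
        },
        "runAfter": {}
    }
}
```

No credentials appear anywhere in the action; Azure acquires the token for the https://vault.azure.net audience on the Logic App's behalf.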

User Assigned 

These identities are created independently of any Azure resource and, as such, can be shared between different resources. Removing them is a manual step, done whenever you see fit.

This means that you first create it as a stand-alone Azure resource by itself, after which it can be linked to multiple Azure resources. An example is integration with Key Vault, where different Azure services belonging to the same application stack (such as Logic Apps, Azure Functions, and Web Apps) need to read information from Key Vault. In this case, we could create a single managed identity and link it to the function and the multiple Logic Apps belonging to the same application architecture.
Let’s walk through the scenario of creating a user-assigned managed identity, enabling it for a Logic App, and then using it to access secrets from Azure Key Vault.
Steps to enable managed identity for a Logic App

  • Go to the Azure Portal
  • Create a new resource and search for “Managed Identities”
  • Click Create
  • Specify the Resource Group, Azure Region, and Name for this resource
  • Select the Logic App and then Open the Logic App
  • Under the Settings Tab on the Left, find Identity and click on it
  • Then select User assigned and click Add
  • Then search for the user-assigned managed identity created earlier and select it. Then click Add

Now let’s go to the Key Vault where the secrets are stored, which the Logic App needs to access.

  • Select the Azure Key Vault and then Open the Key vault
  • Under the settings Tab on the Left, find Access Policies and click on it
  • Then Click on Add Access Policy

Then select Secret Management for Configure from template, and set the Secret permissions to Get and List. Then click Select principal, search for the user-assigned managed identity created earlier, click Select, and click Add.
Then click Save to add the user-assigned managed identity to the Key Vault.


Now the Logic App is able to access the Key Vault secrets using the following action:

(Screenshot: HTTP action accessing Key Vault using the user-assigned managed identity)
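The HTTP action is the same as in the system-assigned case, except that the authentication block also references the identity's resource ID, since the Logic App may have several user-assigned identities attached. A sketch, with the subscription, resource group, and identity name as placeholders:

```json
"authentication": {
    "type": "ManagedServiceIdentity",
    "identity": "/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<identity-name>",
    "audience": "https://vault.azure.net"
}
```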

Conclusion

Azure handles the creation of the managed identity and the automatic rollover of its service principal for you. This is done by Azure in the background and requires no human or customer intervention; by default, the credentials are rotated every 46 days. Put simply, the difference between a managed identity and a service principal is that a managed identity manages the creation and automatic renewal of a service principal on your behalf.

D365 Finance and Operations integration using BYOD

This blog explains the integration option to export data entities from Dynamics 365 F&O into a Microsoft Azure SQL database. D365FO provides a feature via the Data Management Framework called bring your own database (BYOD). The BYOD feature lets D365 administrators export one or more data entities that are available in D365FO into an Azure SQL database.


The BYOD feature lets:

  • Define one or more SQL databases that administrators can export D365 FO entity data into.
  • Export either all the records (full push) or only the records that have changed or been deleted (incremental push).
  • Use the rich scheduling capabilities of the batch framework to enable periodic exports.
  • Access the entity database by using Transact-SQL (T-SQL), and even extend the database by adding more tables.

The BYOD feature is recommended for the following use cases:

  • You must export data into your own data warehouse.
  • You use analytical tools other than Power BI, and those tools require T-SQL access to data.
  • You must perform batch integration with other systems.

D365 FO BYOD: Steps to setup the BYOD for Integration


D365 FO BYOD: Steps to Create Azure SQL Database

To create a resource group, server, and single database in the Azure portal:

  1. Sign in to the portal.
  2. From the Search bar, search for and select Azure SQL.
  3. On the Azure SQL page, select Add.

  1. On the Select SQL deployment options page, select the SQL databases tile, with Single database under Resource type. You can view more information about the different databases by selecting Show details.
  2. Select Create.
  1. On the Basics tab of the Create SQL database form, under Project details, select the correct Azure Subscription if it isn’t already selected.
  2. Under Resource Group, select Create new, enter resourceGroupName, and select OK.
  3. Under Database details, for Database name enter D365FOBYODDEV.
  4. For Server, select Create New, and fill out the New server form as follows:
    • Server name: Enter D365FOBYODDEV plus some characters for uniqueness (server names must be globally unique).
    • Server admin login: Enter azureuser.
    • Password: Enter a password that meets requirements, and enter it again in the Confirm password field.
    • Location: Choose a location from the drop-down, such as West Europe. Then select OK.
  1. Record the server admin login and password so you can log in to the server and its databases. If you forget your login or password, you can get the login name or reset the password on the SQL server page after database creation. To open the SQL server page, select the server name on the database Overview page.
  2. Under Compute + storage, if you want to reconfigure the defaults, select Configure database. On the Configure page, you can optionally:
    • Change the Compute tier from Provisioned to Serverless.
    • Review and change the settings for vCores and Data max size.
    • Select Change configuration to change the hardware generation. After making any changes, select Apply.
  3. Select Next: Networking at the bottom of the page.
  1. On the Networking tab, under Connectivity method, select Public endpoint.
  2. Under Firewall rules, set Add current client IP address to Yes.
  3. Select Next: Additional settings at the bottom of the page.
  1. For more information about firewall settings, see Allow Azure services and resources to access this server and Add a private endpoint.
  2. On the Additional settings tab, in the Data source section, for Use existing data, select Sample.
  3. Select Review + create at the bottom of the page.

Get the connection string details from the Azure portal

Log in to the Azure portal, go to your Azure SQL server, and then click Connection strings to get the connection string for the BYOD instance:


D365 FO BYOD Data source Configuration

  1. Log in to D365FO
  2. Then go to System administration > Workspaces > Data management
  3. Click Configure Entity export to database
  1. Then click New and fill in the data:
    • Source name
    • Description
    • Connection string. It should be in the format:
      Data Source={azure.database.windows.net},1433;Initial Catalog={database};Integrated Security=False;User ID={userid};Password={password}
  2. Then click Validate
    Note: You have to add the client IP to the Azure SQL firewall
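A quick way to sanity-check the connection string before pasting it into D365FO is to split it into its parts. A minimal sketch (Python; the server, database, and user values reuse the examples above, the password is a placeholder, and the helper name is hypothetical):

```python
def parse_connection_string(conn_str):
    """Split a 'Key=Value;Key=Value' connection string into a dict."""
    pairs = (part.split("=", 1) for part in conn_str.split(";") if part)
    return {key.strip(): value for key, value in pairs}

conn = ("Data Source=d365fobyoddev.database.windows.net,1433;"
        "Initial Catalog=D365FOBYODDEV;Integrated Security=False;"
        "User ID=azureuser;Password=<password>")

settings = parse_connection_string(conn)
# settings["Initial Catalog"] -> "D365FOBYODDEV"
```

If any of the expected keys are missing or the server is not suffixed with the 1433 port, the Validate step in D365FO is likely to fail.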

Publishing Data Entities

To make a data entity available in your BYOD database, you need to publish it first. Publishing simply creates the table in the BYOD database. The Publish page enables several scenarios:

  • Publish new entities to the database.
  • Delete previously published entities from the database. (For example, you might want to re-create the schema.)
  • Compare published entities with the entity schema. (For example, if new fields are added later, you can compare the fields with your database schema.)
  • Configure change tracking functionality that enables incremental updates of your data.
  1. Now go back to System administration > Workspaces > Data management
  2. Click Data entities
  1. Select the entity you want to export, then click Publish
  2. Select the BYOD source name created in the previous step and click Publish
  1. A job will be scheduled; a message appears once the job completes, and the table is created in the target database.

There is an option to set change tracking for each entity. If change tracking is enabled for an entity, you can export incrementally; if not, only a full export is possible. This feature must be set individually for each exported entity.

Create Data Job and Export

  1. Now go back to System administration > Workspaces > Data management
  2. Click Export data
  1. Fill in the required fields, such as Name; select the BYOD data source as the Target data format; set the Default refresh type to Incremental push or Full push (based on the use case); then click Add
  1. Once all the required fields are filled, you are ready to export
  1. After the data export completes, you can verify the data in the target Azure SQL database

The BYOD feature has the following limitations:

  • There should be no active locks on your database during synchronization
  • You can’t export composite entities into your own database
  • Entities that don’t have unique keys can’t be exported by using incremental push

A good article on debugging BYOD errors is described here, and more detailed information can be found in Microsoft Docs.

Azure Integration: Azure Service Bus and Logic App integration Pattern using PeekLock

This blog entry describes the best practice for reading a message from Azure Service Bus in a Logic App using the peek-lock mechanism.

Azure Integration: Azure Service Bus

The central capability of a message broker such as Service Bus is to accept messages into a queue or topic and hold them available for later retrieval. When the broker transfers a message to a client, the broker and client want to establish an understanding of whether the message has been successfully processed and can therefore be removed, or whether the delivery or processing failed and the message might have to be delivered again. There are two modes of delivery:

  • ReceiveAndDelete
  • Peek-Lock Message

The Receive-and-Delete mode tells the broker to consider all messages it sends to the receiving client as settled when sent. That means the message is considered consumed as soon as the broker has put it onto the wire; if the message transfer fails, the message is lost. This is not desirable for our use case, so we will concentrate on the peek-lock option.

Peek-Lock (Non-Destructive Read) using Logic App


The Logic App atomically retrieves and locks a message from a queue or subscription for processing. The message is guaranteed not to be delivered to other receivers during the lock duration specified in the queue/subscription description. When the lock expires, the message becomes available to other receivers. To complete processing of the message, the receiver should issue a delete (complete) command with the lock ID received from this operation. To abandon processing and unlock the message for other receivers, an unlock (abandon) command should be issued; otherwise the lock simply expires after the lock duration. This operation should be used in applications that require at-least-once delivery assurances: if the receiver fails to delete the message after processing, another receiver is able to attempt processing once the lock duration expires.

In the Logic App we use a variable to track the status of the message. The message can have the following states; based on the state, we perform the related actions (the Handle Message Lock loop in the image below performs these actions):

  • Processing: if processing the message takes a long time, the message lock needs to be renewed by the Logic App. The default lock duration is 60 seconds.
  • Processed: the message has been successfully processed by the Logic App, and we need to tell Service Bus to complete the message on the queue.
  • Error: the message couldn’t be processed, so we need to tell Service Bus to abandon the message.
(Screenshot: the Handle Message Lock loop)
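The switch inside Handle Message Lock boils down to a simple state-to-operation mapping. A minimal sketch (Python; the function is a hypothetical illustration, with operation names mirroring the Service Bus connector actions in the Logic App JSON):

```python
# Map the MessageState variable to the Service Bus operation the
# Logic App Switch action invokes.

def handle_message_lock(message_state):
    """Return the Service Bus operation for the current MessageState."""
    actions = {
        "Processing": "renewlock",  # still working: renew before the 60 s lock expires
        "Processed": "complete",    # done: delete the message from the queue
        "Error": "abandon",         # failed: release the lock so the message is redelivered
    }
    return actions.get(message_state)  # None once the state is Done/Terminate
```

The Until loop keeps calling this mapping until the state settles on Done or Terminate, which is exactly the loop exit condition in the JSON below.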

When the message is processed successfully (i.e. at the end of the Try Processing the Message scope), we set the message state to Processed. In the Catch the Exception scope, we set MessageState to Error. Note that Handle Message Lock runs alongside the rest of the operations; they coexist. We use the Logic App Until loop to monitor the message state and perform the relevant action; the loop ends when the message state is either Done (processed) or Terminate (error).


The following is the Logic App JSON that handles the lock correctly:

{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Finally": {
                "actions": {
                    "Condition": {
                        "actions": {
                            "Terminate": {
                                "inputs": {
                                    "runError": {
                                        "code": "500",
                                        "message": "Message processing encountered an Error!"
                                    },
                                    "runStatus": "Failed"
                                },
                                "runAfter": {},
                                "type": "Terminate"
                            }
                        },
                        "expression": {
                            "and": [
                                {
                                    "equals": [
                                        "@variables('MessageState')",
                                        "Terminate"
                                    ]
                                }
                            ]
                        },
                        "runAfter": {},
                        "type": "If"
                    }
                },
                "runAfter": {
                    "Process_the_Message": [
                        "Succeeded"
                    ]
                },
                "type": "Scope"
            },
            "Initialize_MessageState": {
                "inputs": {
                    "variables": [
                        {
                            "name": "MessageState",
                            "type": "string",
                            "value": "Processing"
                        }
                    ]
                },
                "runAfter": {},
                "type": "InitializeVariable"
            },
            "Process_the_Message": {
                "actions": {
                    "Catch_the_Exception": {
                        "actions": {
                            "Set_variable": {
                                "inputs": {
                                    "name": "MessageState",
                                    "value": "Error"
                                },
                                "runAfter": {},
                                "type": "SetVariable"
                            }
                        },
                        "runAfter": {
                            "Try_processing_the_Message": [
                                "Failed"
                            ]
                        },
                        "type": "Scope"
                    },
                    "Handle_Message_Lock": {
                        "actions": {
                            "Switch": {
                                "cases": {
                                    "Abandon_the_Message": {
                                        "actions": {
                                            "Abandon_the_message_in_a_queue": {
                                                "inputs": {
                                                    "host": {
                                                        "connection": {
                                                            "name": "@parameters('$connections')['servicebus']['connectionId']"
                                                        }
                                                    },
                                                    "method": "post",
                                                    "path": "/@{encodeURIComponent(encodeURIComponent(parameters('WebFromCEQueuName')))}/messages/abandon",
                                                    "queries": {
                                                        "lockToken": "@triggerBody()?['LockToken']",
                                                        "queueType": "Main",
                                                        "sessionId": ""
                                                    }
                                                },
                                                "runAfter": {},
                                                "type": "ApiConnection"
                                            },
                                            "Set_the_MessageState_to_Error": {
                                                "inputs": {
                                                    "name": "MessageState",
                                                    "value": "Terminate"
                                                },
                                                "runAfter": {
                                                    "Abandon_the_message_in_a_queue": [
                                                        "Succeeded"
                                                    ]
                                                },
                                                "type": "SetVariable"
                                            }
                                        },
                                        "case": "Error"
                                    },
                                    "Complete_the_Message": {
                                        "actions": {
                                            "Complete_the_message_in_a_queue": {
                                                "inputs": {
                                                    "host": {
                                                        "connection": {
                                                            "name": "@parameters('$connections')['servicebus']['connectionId']"
                                                        }
                                                    },
                                                    "method": "delete",
                                                    "path": "/@{encodeURIComponent(encodeURIComponent(parameters('WebFromCEQueuName')))}/messages/complete",
                                                    "queries": {
                                                        "lockToken": "@triggerBody()?['LockToken']",
                                                        "queueType": "Main",
                                                        "sessionId": ""
                                                    }
                                                },
                                                "runAfter": {},
                                                "type": "ApiConnection"
                                            },
                                            "Set_the_Message_State_to_Done": {
                                                "inputs": {
                                                    "name": "MessageState",
                                                    "value": "Done"
                                                },
                                                "runAfter": {
                                                    "Complete_the_message_in_a_queue": [
                                                        "Succeeded"
                                                    ]
                                                },
                                                "type": "SetVariable"
                                            }
                                        },
                                        "case": "Processed"
                                    },
                                    "Renew_the_Message_lock": {
                                        "actions": {
                                            "Renew_lock_on_the_message_in_a_queue": {
                                                "inputs": {
                                                    "host": {
                                                        "connection": {
                                                            "name": "@parameters('$connections')['servicebus']['connectionId']"
                                                        }
                                                    },
                                                    "method": "post",
                                                    "path": "/@{encodeURIComponent(encodeURIComponent(parameters('WebFromCEQueuName')))}/messages/renewlock",
                                                    "queries": {
                                                        "lockToken": "@triggerBody()?['LockToken']",
                                                        "queueType": "Main"
                                                    }
                                                },
                                                "runAfter": {},
                                                "type": "ApiConnection"
                                            },
                                            "Wait_half_the_standard_queue_lock_time": {
                                                "inputs": {
                                                    "interval": {
                                                        "count": 30,
                                                        "unit": "Second"
                                                    }
                                                },
                                                "runAfter": {
                                                    "Renew_lock_on_the_message_in_a_queue": [
                                                        "Succeeded"
                                                    ]
                                                },
                                                "type": "Wait"
                                            }
                                        },
                                        "case": "Processing"
                                    }
                                },
                                "default": {
                                    "actions": {}
                                },
                                "expression": "@variables('MessageState')",
                                "runAfter": {},
                                "type": "Switch"
                            }
                        },
                        "expression": "@or(equals(variables('MessageState'), 'Done'),equals(variables('MessageState'), 'Terminate'))",
                        "limit": {
                            "count": 60,
                            "timeout": "PT1H"
                        },
                        "runAfter": {},
                        "type": "Until"
                    },
                    "Try_processing_the_Message": {
                        "actions": {
                            "Set_the_Message_State_to_Processed": {
                                "inputs": {
                                    "name": "MessageState",
                                    "value": "Processed"
                                },
                                "runAfter": {},
                                "type": "SetVariable"
                            }
                        },
                        "runAfter": {},
                        "type": "Scope"
                    }
                },
                "runAfter": {
                    "Initialize_MessageState": [
                        "Succeeded"
                    ]
                },
                "type": "Scope"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {
            "$connections": {
                "defaultValue": {},
                "type": "Object"
            },
            "WebFromCEQueuName": {
                "defaultValue": "webform-ce-queue-dev",
                "type": "String"
            }
        },
        "triggers": {
            "When_a_message_is_received_in_a_queue_(peek-lock)": {
                "inputs": {
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['servicebus']['connectionId']"
                        }
                    },
                    "method": "get",
                    "path": "/@{encodeURIComponent(encodeURIComponent(parameters('WebFromCEQueuName')))}/messages/head/peek",
                    "queries": {
                        "queueType": "Main",
                        "sessionId": "None"
                    }
                },
                "recurrence": {
                    "frequency": "Minute",
                    "interval": 3
                },
                "type": "ApiConnection"
            }
        }
    },
    "parameters": {
        "$connections": {
            "value": {
                "servicebus": {
                    "connectionId": "/subscriptions/SubscriptionID/resourceGroups/RG/providers/Microsoft.Web/connections/servicebus",
                    "connectionName": "servicebus",
                    "id": "/subscriptions/SubscriptionID/providers/Microsoft.Web/locations/westeurope/managedApis/servicebus"
                }
            }
        }
    }
}
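To make the control flow above easier to follow: the Until loop keeps renewing the message lock every 30 seconds (half of a standard 60-second queue lock) until the MessageState variable reaches Done or Terminate. A minimal Python sketch of that pattern, where `get_state` and `renew_lock` are hypothetical stand-ins for the variable read and the Service Bus renew-lock action:

```python
import time

def keep_message_locked(get_state, renew_lock, half_lock_seconds=30, max_iterations=60):
    """Renew the peek-lock on a queue message until processing reaches a
    terminal state ('Done' or 'Terminate'), mirroring the Until loop above."""
    iterations = 0
    while get_state() not in ("Done", "Terminate") and iterations < max_iterations:
        renew_lock()                   # Renew_lock_on_the_message_in_a_queue
        time.sleep(half_lock_seconds)  # Wait_half_the_standard_queue_lock_time
        iterations += 1
    return get_state()
```

The `max_iterations`/timeout pair corresponds to the loop's `limit` (count 60, PT1H) in the JSON above.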

Azure Integration: Generate a Flat file with ANSI encoding using Logic App

Handling non-UTF-8 encoding in Logic App

Recently there was a requirement to generate a flat file in an Azure Logic App and deliver it to an Azure File Share with ANSI encoding, because the target application could only process ANSI-encoded files. Most cloud services assume that a text payload is in some form of Unicode (UTF) encoding; Azure Logic Apps assumes UTF-8. When your text payload is in a different encoding, such as a code-page-based encoding, the non-basic-Latin characters get mangled. This is particularly common with flat files, because they integrate with legacy systems that often were not written with Unicode support.

My approach to solving the problem was to create an Azure Function App that converts the encoding from UTF-8 to windows-1252 (or any other encoding) and then stores the file content in Azure File storage.

Azure Integration Logic APP ANSI Encoding using Azure Function App

This seemed to be an easy fix, but the main problem was that the Azure Logic App did not like the output from the Azure Function App and threw the exception shown below:

BadRequest. Http request failed as the content was not valid: ‘Unable to translate bytes [E4] at index 83 from specified code page to Unicode.’.

Azure Integration Logic APP ANSI Encoding using Azure Function App failure exception

The solution is to use Base64 encoding. Base64 encoding ensures that none of the services in the Azure integration chain will assume a UTF encoding. Once you wrap a non-UTF flat file (such as windows-1252) in Base64, it travels through the pipeline safely; the receiver can then Base64-decode it and process it with Flat File Decode.
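To illustrate why this works: wrapping the windows-1252 bytes in Base64 makes the payload pure ASCII, which every service in the chain can pass through without any UTF-8 assumption corrupting it. A small Python sketch:

```python
import base64

text = "Gütersloh"                      # contains a non-ASCII character (ü)

# Encode the text as windows-1252 bytes, then wrap them in Base64.
ansi_bytes = text.encode("windows-1252")
wire_value = base64.b64encode(ansi_bytes).decode("ascii")

# Any intermediary can treat wire_value as plain ASCII text.
# The receiver reverses the steps without loss:
restored = base64.b64decode(wire_value).decode("windows-1252")
```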

Azure Function App to change the encoding

The following Azure Function App converts the encoding of Base64-wrapped text:

using System;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace PnJ.FunctionApp.ConvertEncoding
{
    public static class ChangeBase64Encoding
    {
        [FunctionName("ChangeBase64Encoding")]
        public static async Task<object> Run([HttpTrigger(WebHookType = "genericJson")]HttpRequestMessage req, TraceWriter log)
        {
            log.Info($"Change base 64 Encoding function App was triggered");

            Encoding inputEncoding = null;

            string jsonContent = await req.Content.ReadAsStringAsync();
            dynamic data = JsonConvert.DeserializeObject(jsonContent);

            if (data == null || data.text == null || data.encodingInput == null || data.encodingOutput == null)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Please pass text, encodingInput and encodingOutput properties in the input JSON object."
                });
            }

            try
            {
                string encodingInput = data.encodingInput;
                inputEncoding = Encoding.GetEncoding(name: encodingInput);
            }
            catch (ArgumentException)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Input char set value '" + data.encodingInput + "' is not supported. Supported values are listed at https://msdn.microsoft.com/en-us/library/system.text.encoding(v=vs.110).aspx."
                });
            }

            Encoding encodingOutput;
            try
            {
                string outputEncoding = data.encodingOutput;
                encodingOutput = Encoding.GetEncoding(outputEncoding);
            }
            catch (ArgumentException)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Output char set value '" + data.encodingOutput + "' is not supported. Supported values are listed at https://msdn.microsoft.com/en-us/library/system.text.encoding(v=vs.110).aspx."
                });
            }

            string input = data.text;
            var outputBytes = Encoding.Convert(srcEncoding: inputEncoding, dstEncoding: encodingOutput, bytes: Convert.FromBase64String(input));

            var response = req.CreateResponse(HttpStatusCode.OK);
            response.Content = new StringContent(content: JsonConvert.SerializeObject(new
            {
                text = Convert.ToBase64String(outputBytes)
            }).ToString(), encoding: encodingOutput, mediaType: "application/json");

            return response;
        }
    }
}

The Function App receives the following input, converts the text to the desired encoding, and sends it back:

{
  "encodingInput": "utf-8",
  "encodingOutput": "windows-1252",
  "text": "U0hQMDAwMDExMzZ8VHN1YmFraSBFdXJvcGUgQi5WLnxBdmVudHVyaWpufDMzMTYgTEJ8RG9yZHJlY2h0fE5MfDE4NDMwOHxSdXRoZW5iZXJnIExhbmR0ZWNobmlrfENhcmwtQm9yZ3dhcmQtU3RyIDF8R8O8dGVyc2xvaHwzMzMzNXxERXwyMDIwMDQyOXw0OE58U3xCb3ggM3xDaGFpbiBhbmQgcGFydHN8M3wwLjAyfA=="
}
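On the client side, the caller Base64-encodes its text before posting and Base64-decodes the response with the target encoding. A sketch of helpers for that round trip (the field names follow the JSON contract above; the helper names are illustrative):

```python
import base64
import json

def build_request(text, encoding_input="utf-8", encoding_output="windows-1252"):
    """Build the JSON body expected by the function: the text field
    carries the Base64 of the input-encoded bytes."""
    return json.dumps({
        "encodingInput": encoding_input,
        "encodingOutput": encoding_output,
        "text": base64.b64encode(text.encode(encoding_input)).decode("ascii"),
    })

def decode_response(body, encoding_output="windows-1252"):
    """Decode the function's response: Base64 text in the output encoding."""
    return base64.b64decode(json.loads(body)["text"]).decode(encoding_output)
```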

If the above approach doesn't work, use the following approach, which gave me the desired results.

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace PnJ.FunctionApp.ConvertEncoding
{
    public static class ChangeEncoding
    {
        [FunctionName("ChangeEncoding")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            log.Info($"Change Encoding function App was triggered!");

            Encoding inputEncoding = null;

            string jsonContent = await req.Content.ReadAsStringAsync();
            dynamic data = JsonConvert.DeserializeObject(jsonContent);

            if (data == null || data.text == null || data.encodingInput == null || data.encodingOutput == null)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Please pass text, encodingInput and encodingOutput properties in the input JSON object."
                });
            }
            try
            {
                string encodingInput = data.encodingInput;
                inputEncoding = Encoding.GetEncoding(name: encodingInput);
            }
            catch (ArgumentException)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Input char set value '" + data.encodingInput + "' is not supported. Supported values are listed at https://msdn.microsoft.com/en-us/library/system.text.encoding(v=vs.110).aspx."
                });
            }

            Encoding encodingOutput = null;
            try
            {
                string outputEncoding = data.encodingOutput;
                encodingOutput = Encoding.GetEncoding(outputEncoding);
            }
            catch (ArgumentException)
            {
                return req.CreateResponse(HttpStatusCode.BadRequest, new
                {
                    error = "Output char set value '" + data.encodingOutput + "' is not supported. Supported values are listed at https://msdn.microsoft.com/en-us/library/system.text.encoding(v=vs.110).aspx."
                });
            }

            string input = data.text;
            var outputBytes = Encoding.Convert(srcEncoding: inputEncoding, dstEncoding: encodingOutput, bytes: inputEncoding.GetBytes(input));
            var response = req.CreateResponse(HttpStatusCode.OK);
            MemoryStream ms = new MemoryStream(outputBytes);
            response.Content = new StreamContent(ms);
            response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
            return response;
        }
    }
}

The request body is the same as for the previous function:

{
  "encodingInput": "utf-8",
  "encodingOutput": "windows-1252",
  "text": "U0hQMDAwMDExMzZ8VHN1YmFraSBFdXJvcGUgQi5WLnxBdmVudHVyaWpufDMzMTYgTEJ8RG9yZHJlY2h0fE5MfDE4NDMwOHxSdXRoZW5iZXJnIExhbmR0ZWNobmlrfENhcmwtQm9yZ3dhcmQtU3RyIDF8R8O8dGVyc2xvaHwzMzMzNXxERXwyMDIwMDQyOXw0OE58U3xCb3ggM3xDaGFpbiBhbmQgcGFydHN8M3wwLjAyfA=="
}

I created the following Logic App to test the outcome. The second approach gave me the desired results.

Azure Integration Logic APP ANSI Encoding using Azure Function App

Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps

This blog describes how to implement Dynamics 365 UO's Data Management Framework (DMF) Recurring Integration module using a Logic App. It provides a technical implementation of queuing and dequeuing jobs using the REST API of the Recurring Integration module. Dynamics 365 for Operations provides two primary sets of APIs to support file-based integration scenarios: Recurring Integration APIs and Data Management Framework (DMF) package APIs. These APIs allow DMF data files as well as data projects to be imported into and exported from Dynamics 365 for Operations.
In this scenario, data is exported from Dynamics 365 UO to a downstream system on a recurring schedule using a Logic App and stored in Blob Storage. The blog shows how to export a data package, save it to a blob, and notify downstream systems using the Azure Service Bus.

Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps

More on Recurring Integration can be found here

Dynamics 365 UO Recurring Integration API for Export (dequeue)

To return a data package that contains all the data entities that were defined in the data project, and that the client application can unzip and consume, use the following structure.

https://<base URL>/api/connector/dequeue/<activity ID>

After the client downloads the data, an acknowledgment must be sent back to Finance and Operations so that the data can be marked as received. If no file was uploaded to the blob, the dequeue API returns a response indicating as much. The execution ID of the DMF job is returned to the client application and can be used to monitor the progress of the job.
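The dequeue-then-acknowledge sequence can be sketched as follows (Python; `http_get` and `http_post` are hypothetical stand-ins for your HTTP client, and the URL shapes follow the endpoints above):

```python
def dequeue_url(base_url, activity_id):
    """URL that returns the next available data package for the recurring job."""
    return f"https://{base_url}/api/connector/dequeue/{activity_id}"

def ack_url(base_url, activity_id):
    """URL used to acknowledge a successful download so the package is marked as received."""
    return f"https://{base_url}/api/connector/ack/{activity_id}"

def export_flow(http_get, http_post, base_url, activity_id):
    """Dequeue a package; if one is available, download it from the returned
    DownloadLocation and acknowledge the receipt."""
    status, job = http_get(dequeue_url(base_url, activity_id))
    if status != 200:
        return None                                  # nothing to download yet
    _, package = http_get(job["DownloadLocation"])
    http_post(ack_url(base_url, activity_id), job)   # mark the package as received
    return package
```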

The setup involves the following steps; the API supports both import and export of DMF data projects.

oData end point for Dynamics 365 UO Integration Design Patterns: Recurring Integration
Authorization for the Dynamics 365 UO Recurring Integration API

The integration REST API uses the same OAuth 2.0 authentication model as the other service endpoints. Before a client application can consume this endpoint, you must create an application ID in Microsoft Azure Active Directory (Azure AD) and grant it the appropriate permissions. When you create and enable a recurring job, you're prompted to enter the Azure AD application ID that will interact with that recurring job, so be sure to make a note of the application ID.

Set up a Dynamics 365 UO data project and Dynamics 365 UO recurring data jobs

Create a data project
  1. On the main dashboard, select the Data management tile to open the Data management workspace.
  2. Select the Import or Export tile to create a new data project.
  3. Enter a valid job name, data source, and entity name.
  4. Download a data file for one or more entities. Make sure that each entity is added, and that no errors occur.
  5. For Target data format, select XML Element (any format can be used)
  6. Select Save.
Create a Dynamics 365 UO recurring data job
  1. On the Data project page, select Create a recurring data job.
  2. Enter a valid name and a description for the recurring data job.
  3. On the Set-up authorization policy tab, enter the application ID that was generated for your application, and mark it as enabled.
  4. Expand the Advanced options tab and specify either File or Data package.
  5. Select Set processing recurrence, and then, in the Define recurrence dialog box, set up a valid recurrence for your data job
  6. Select OK, and then select Yes in the confirmation message box.
Download data from Dynamics 365 UO using recurring data jobs

You can use integration REST endpoints to integrate with the client, submit documents (import), or poll available documents for download (export). These endpoints support OAuth.
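The OAuth 2.0 client-credentials request that precedes these calls can be sketched like this (Python; the login.windows.net authority matches the one used later in the Logic App, and all credential values are placeholders):

```python
from urllib.parse import urlencode

def token_request(tenant, client_id, client_secret, resource):
    """Build the Azure AD client-credentials token request (OAuth 2.0)
    used before calling the recurring integration endpoints."""
    url = f"https://login.windows.net/{tenant}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # the D365UO environment URL
    })
    return url, body
```

POSTing this body to the URL returns a JSON payload whose access_token goes into the Authorization header of subsequent calls.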

Dequeue the Dynamics 365 UO recurring Export Job
Dynamics 365 UO recurring data jobs API for Export (dequeue)

Make an HTTP GET call against the following URL; the response body contains the exported data as a stream.

https://<base URL>/api/connector/dequeue/<activity ID>

The following approach shows how to dequeue the export job through recurring integration. It uses a data package-based export; recurring integration supports both data package export and file export. The following parameters will be used:

  • The D365UO environment
  • The Legal entity Name
  • The ID of the recurring Job which was created in the previous step
  • Name or description for the Export Job

Logic App and D365UO Export

The following section describes how to create the Logic App that downloads the data package from recurring integration.

  1. Sign in to Azure portal: https://portal.azure.com
  2. Select Create a resource, then select Logic Apps and click on Create
  3. Select the appropriate Resource Group and a Logic App Name and then click on Review+Create
  4. On the Logic App page select the Logic app designer option and then select Blank Logic App
  5. Search for Recurrence and select the Trigger.
  6. Select the Interval and the Frequency (based on your needs; the shorter the interval, the more expensive the integration will be)
  7. Add Parameters for the Logic App
    "ContainerName": { "value": "The Blob container to store exported files" },
    "Dyn365fOClientId": { "value": "Client ID used for the authentication" },
    "Dyn365foURL": { "value": "The URL of the D365UO environment, xxxxxx.dynamics.com" },
    "processQueueName": { "value": "The queue name for the export job" },
    "ExportJobId": { "value": "The batch job execution ID" }
  8. Add a new step, search for Key Vault, and add an action called "Get Secret"
  9. Add a connection to the Key Vault and enter the name of the Key Vault.
Access secret for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  10. Add a new step and search for Until
  11. Before you add a value to control the loop, add an action:
oData endpoint to loop for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  12. Search for HTTP and in Actions select the HTTP action. [This is the action that will call the export job that we created earlier. Provide an appropriate name for the action.]
  13. Fill in the parameters:
    URI: the URI of the DMF dequeue endpoint
    Authentication Type: Active Directory OAuth
    Authority: https://login.windows.net/
    Tenant: name of the tenant
    Audience: URL of the D365UO environment (without the trailing /)
    Client ID: the client ID registered in Azure AD and authorized in DMF recurring integration
    Client Secret: the application secret for connecting to Dynamics 365 UO (the secret should be stored in Key Vault)
oData endpoint to get the export job for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  14. Now you can add a value to control the Until loop: in the 'Choose a value' field, select the Status Code output and set the condition 'is not equal to 200'
  15. Add a +New step and search for Condition
  16. Add a +New step, search for Until, and add the Until action
  17. Search for HTTP and in Actions select the HTTP action. [This is the action that will download the data package file. Provide an appropriate name for the action.]
  18. Fill in the parameters:
    Method: GET
    URI: the DownloadLocation output from the dequeue action: replace(replace(body('HTTP')['DownloadLocation'], 'http:', 'https:'), ':80', ':443')
    Authentication Type: Active Directory OAuth
    Authority: https://login.windows.net/
    Tenant: name of the tenant
    Audience: URL of the D365UO environment (without the trailing /)
    Client ID: the client ID registered in Azure AD and authorized in DMF recurring integration
    Client Secret: the application secret for connecting to Dynamics 365 UO (the secret should be stored in Key Vault)
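The replace expression in the URI above rewrites a possibly plain-http DownloadLocation to https before downloading. The equivalent transformation, for clarity:

```python
def to_https(download_location):
    """Mirror the Logic App expression:
    replace(replace(location, 'http:', 'https:'), ':80', ':443')"""
    return download_location.replace("http:", "https:").replace(":80", ":443")
```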
oData endpoint to download the export job for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  19. Now define the condition for the Until action: the loop exits when the Status Code output of the download action equals 200.
oData endpoint to download the export job for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  20. The false branch of the condition indicates that the package can't be downloaded. We will send an email with the message. Add an action and search for Send an email (V2):
  21. Sign in with your Outlook credentials and include the email information:
  22. Add an action for the True branch. Search for Create blob
  23. Add a connection to the blob storage for the Create blob action
  24. Then add a new action to send a message to the Azure Service Bus
  25. Acknowledge the package download to Dynamics 365 UO recurring integration using an HTTP action. The HTTP action looks like this:
{
    "inputs": {
        "method": "POST",
        "uri": "@{concat('https://', parameters('Dyn365fOURL'), '/api/connector/ack/', encodeURIComponent(parameters('ExportJobId')))}",
        "body": "@body('Get_the_Export_Job')",
        "authentication": {
            "audience": "@concat('https://', parameters('Dyn365fOURL'))",
            "authority": "https://login.windows.net/",
            "clientId": "@parameters('Dyn365fOClientId')",
            "secret": "@parameters('Dyn365fOClientSecret')",
            "tenant": "@parameters('Tenant')",
            "type": "ActiveDirectoryOAuth"
        }
    }
}
  26. Save the Logic App. The flow should look like this:
oData endpoint to acknowledge the export job for Azure Integration: Dynamics 365 UO DMF Data Export using Logic Apps
  27. Run the Logic App
  28. Go back to the D365FO portal and review the job history in the Data management workspace. A new export job with the name ExpensesExport should have been executed

The JSON for the Logic App is shown below:

{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Condition": {
                "actions": {
                    "Check_the_Download_is_successful": {
                        "actions": {
                            "Acknowledge_The_Package_Download": {
                                "inputs": {
                                    "authentication": {
                                        "audience": "@concat('https://', parameters('Dyn365fOURL'))",
                                        "authority": "https://login.windows.net/",
                                        "clientId": "@parameters('Dyn365fOClientId')",
                                        "secret": "@body('The_Secret_for_D365UO')?['value']",
                                        "tenant": "@parameters('Tenant')",
                                        "type": "ActiveDirectoryOAuth"
                                    },
                                    "body": "@body('Get_the_Export_Job')",
                                    "method": "POST",
                                    "uri": "@{concat('https://', parameters('Dyn365fOURL'), '/api/connector/ack/', encodeURIComponent(parameters('ExportJobId')))}"
                                },
                                "runAfter": {
                                    "Send_message": [
                                        "Succeeded"
                                    ]
                                },
                                "type": "Http"
                            },
                            "Create_blob": {
                                "inputs": {
                                    "body": "@body('Download_The_Package_File')",
                                    "host": {
                                        "connection": {
                                            "name": "@parameters('$connections')['azureblob']['connectionId']"
                                        }
                                    },
                                    "method": "post",
                                    "path": "/datasets/default/files",
                                    "queries": {
                                        "folderPath": "/@{parameters('ContainerName')}",
                                        "name": "@guid()"
                                    }
                                },
                                "runAfter": {},
                                "type": "ApiConnection"
                            },
                            "Send_message": {
                                "inputs": {
                                    "body": {
                                        "ContentData": "@{base64(body('Create_blob')?['Path'])}",
                                        "MessageId": "@{body('Create_blob')?['Path']}"
                                    },
                                    "host": {
                                        "connection": {
                                            "name": "@parameters('$connections')['servicebus']['connectionId']"
                                        }
                                    },
                                    "method": "post",
                                    "path": "/@{encodeURIComponent(parameters('processQueueName'))}/messages"
                                },
                                "runAfter": {
                                    "Create_blob": [
                                        "Succeeded"
                                    ]
                                },
                                "type": "ApiConnection"
                            }
                        },
                        "expression": "@and(equals(outputs('Download_The_Package_File')?['StatusCode'], 200), not(equals(body('Download_The_Package_File'), parameters('EmptyFile'))))",
                        "runAfter": {
                            "Wait_Until_The_Package_Is_Downloaded": [
                                "Succeeded"
                            ]
                        },
                        "type": "If"
                    },
                    "Wait_Until_The_Package_Is_Downloaded": {
                        "actions": {
                            "Download_The_Package_File": {
                                "inputs": {
                                    "authentication": {
                                        "audience": "@concat('https://', parameters('Dyn365fOURL'))",
                                        "authority": "https://login.windows.net/",
                                        "clientId": "@parameters('Dyn365fOClientId')",
                                        "secret": "@body('The_Secret_for_D365UO')?['value']",
                                        "tenant": "@parameters('Tenant')",
                                        "type": "ActiveDirectoryOAuth"
                                    },
                                    "method": "GET",
                                    "uri": "@{replace(replace(body('Get_the_Export_Job')['DownloadLocation'], 'http:', 'https:'), ':80', ':443')}"
                                },
                                "runAfter": {},
                                "type": "Http"
                            }
                        },
                        "expression": "@equals(outputs('Download_The_Package_File')['statusCode'], 200)",
                        "limit": {
                            "count": 60,
                            "timeout": "PT1H"
                        },
                        "runAfter": {},
                        "type": "Until"
                    }
                },
                "else": {
                    "actions": {
                        "Terminate": {
                            "inputs": {
                                "runStatus": "Cancelled"
                            },
                            "runAfter": {},
                            "type": "Terminate"
                        }
                    }
                },
                "expression": "@equals(outputs('Get_the_Export_Job')['statusCode'], 200)",
                "runAfter": {
                    "Wait_Until_Export_Job_Is_Complete": [
                        "Succeeded"
                    ]
                },
                "type": "If"
            },
            "The_Secret_for_D365UO": {
                "inputs": {
                    "host": {
                        "connection": {
                            "name": "@parameters('$connections')['keyvault_1']['connectionId']"
                        }
                    },
                    "method": "get",
                    "path": "/secrets/@{encodeURIComponent('\"SecretName\"')}/value"
                },
                "runAfter": {},
                "type": "ApiConnection"
            },
            "Wait_Until_Export_Job_Is_Complete": {
                "actions": {
                    "Get_the_Export_Job": {
                        "inputs": {
                            "authentication": {
                                "audience": "@concat('https://', parameters('Dyn365fOURL'))",
                                "authority": "https://login.windows.net/",
                                "clientId": "@parameters('Dyn365fOClientId')",
                                "secret": "@body('The_Secret_for_D365UO')?['value']",
                                "tenant": "@parameters('Tenant')",
                                "type": "ActiveDirectoryOAuth"
                            },
                            "method": "GET",
                            "uri": "@{concat('https://', parameters('Dyn365fOURL'), '/api/connector/dequeue/', encodeURIComponent(parameters('ExportJobId')))}"
                        },
                        "runAfter": {},
                        "type": "Http"
                    }
                },
                "expression": "@equals(outputs('Get_the_Export_Job')['statusCode'], 200)",
                "limit": {
                    "count": 5,
                    "timeout": "PT1H"
                },
                "runAfter": {
                    "The_Secret_for_D365UO": [
                        "Succeeded"
                    ]
                },
                "type": "Until"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {
            "$connections": {
                "defaultValue": {},
                "type": "Object"
            },
            "Dyn365fOClientId": {
                "type": "String"
            },
            "Dyn365fOURL": {
                "type": "String"
            },
            "EmptyFile": {
                "defaultValue": "",
                "type": "String"
            },
            "ExportJobId": {
                "defaultValue": "{18FA7FDE-557F-4421-BF6C-2C4417963C69}",
                "type": "String"
            },
            "Tenant": {
                "defaultValue": "heiway.net",
                "type": "String"
            },
            "processQueueName": {
                "type": "String"
            }
        },
        "triggers": {
            "Recurrence": {
                "recurrence": {
                    "frequency": "Minute",
                    "interval": 2
                },
                "runtimeConfiguration": {
                    "concurrency": {
                        "runs": 1
                    }
                },
                "type": "Recurrence"
            }
        }
    },
    "parameters": {
        "$connections": {
            "value": {
                "azureblob": {
                    "connectionId": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/resourcegroups/RG-Tasubaki-Encoding/providers/Microsoft.Web/connections/azureblob",
                    "connectionName": "azureblob",
                    "id": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/providers/Microsoft.Web/locations/westeurope/managedApis/azureblob"
                },
                "keyvault_1": {
                    "connectionId": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/resourceGroups/RG-Tasubaki-Encoding/providers/Microsoft.Web/connections/keyvault-2",
                    "connectionName": "keyvault-2",
                    "id": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/providers/Microsoft.Web/locations/westeurope/managedApis/keyvault"
                },
                "servicebus": {
                    "connectionId": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/resourcegroups/RG-Tasubaki-Encoding/providers/Microsoft.Web/connections/servicebus",
                    "connectionName": "servicebus",
                    "id": "/subscriptions/9ce707d5-a33e-4243-ba8e-9a3826893136/providers/Microsoft.Web/locations/westeurope/managedApis/servicebus"
                }
            }
        },
        "Dyn365fOClientId": {
            "value": "0ae2fd1b-7b98-457c-b1c5-108824f924ec"
        },
        "Dyn365fOURL": {
            "value": "nl1hhs-acc.sandbox.operations.dynamics.com"
        },
        "processQueueName": {
            "value": "ExportJob"
        }
    }
}

D365UO https vs finance connectors

HTTPS vs D365UO connectors for integration with D365UO

Microsoft Dynamics 365 for Unified Operations provides project-driven organizations with the tools they need to connect and manage their clients, finances, projects, and people in a unified, integrated environment. D365UO helps service organizations streamline key finance and project-related business processes, simplify the management of their human capital, and gain the visibility they need to maximize their revenue and profitability. The Microsoft D365UO connector provides the following operations:

ACTIONS

  • Create record: Create a new record in an entity
  • Delete record: Deletes a single record in an entity
  • Execute action: Execute an action
  • Get a record: Retrieves a single record
  • Get list of entities: Retrieves a list of entities
  • Lists items present in table: Lists items present in a table
  • Update a record: Updates a single record in an entity

D365UO connector operations

The Microsoft Dynamics 365 for Finance and Operations connector uses the OData version 4.0 protocol. The connector has the following limitations:

  • It does not support cross company queries
  • This connector does not currently have built-in proxy support and does not honor the Atom proxy settings

So I have been using the HTTPS connector to perform CRUD operations on D365UO. The following comparison shows the reasons to use the HTTPS connector instead of the D365UO connector.

A comparison between the HTTPS connector and the Dynamics connector:

  • Performance: the HTTPS connector performs much better (~8x faster) because it makes a direct connection to the D365FO environment. With the Dynamics connector, performance is poorer because the call goes via the connector infrastructure and then to D365FO.
  • Cross-company queries: supported by the HTTPS connector; not supported by the Dynamics connector.
  • Throttling: with the HTTPS connector, the probability of throttling exceptions is lower and limited to D365FO throttling; with the Dynamics connector it is higher, since both the connector and D365FO can throttle.
  • Authentication: the HTTPS connector supports app and managed identity-based authentication (more secure); the Dynamics connector does not.
  • Nesting of queries ($expand): supported by the HTTPS connector, resulting in fewer actions and therefore cheaper and faster Logic App execution; not supported by the Dynamics connector, resulting in more actions and more expensive, slower execution.
  • Microsoft changes: the HTTPS connector is not impacted by Microsoft changes to the Dynamics connector; connector-based integrations could break due to such changes.
  • Development effort: the HTTPS connector requires more development time and technical skill; the Dynamics connector helps the developer and reduces development time.
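To illustrate the nesting point, a direct HTTP call can expand related entities in a single request instead of chaining several connector actions. A minimal Python sketch of composing such a URL (the environment URL and the navigation property name "PurchaseOrderLines" are assumptions, not taken from the post):

```python
def build_expand_url(base_url, entity, expand, top=None):
    """Compose a direct OData URL that nests related data via $expand,
    so one HTTP action can replace several connector actions."""
    query = [f"$expand={expand}"]
    if top is not None:
        query.append(f"$top={top}")
    return f"{base_url}/data/{entity}?" + "&".join(query)

# Example: pull purchase order headers together with their lines in one call.
url = build_expand_url(
    "https://dev-testdevaos.sandbox.ax.dynamics.com",
    "PurchaseOrderHeadersV2",
    "PurchaseOrderLines",
    top=10,
)
```

The resulting URL can then be called with a bearer token, exactly as with any other OData query against the environment.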

Dynamics 365 UO Integration Design Patterns

Dynamics 365 for Unified Operations has evolved into purpose-built applications to help you manage business functions, which means there will be integration with diverse systems. This blog describes integration patterns, integration scenarios, and best practices. There are a number of ways users can interact with D365 UO, and different ways to populate data to and retrieve data from it. In my personal opinion, the integration option can be decided based on the following three criteria:

  • Retrieve data from D365 UO or Populate data to D365 UO
  • Real-Time interaction with D365UO or Batch processing of Data
  • Amount of Data which needs to be exchanged (Data volume)
  • Retrieve data from D365UO
    • Real-time/near real-time (small data volume): oData, Custom web service, Business Events, Business Alerts
    • Batch job (large data volume): Data Management Framework, Recurring Integration
  • Populate data to D365UO
    • Real-time/near real-time (small data volume): oData, Custom web service
    • Batch job (large data volume): Data Management Framework, Recurring Integration
Dynamics 365 UO Integration Patterns
Dynamics 365 UO Real-Time Integration Options
Dynamics 365 UO Integration Design Patterns for real time data integration
Dynamics 365 UO Bulk/Batch Processing
Dynamics 365 UO Integration Design Patterns for Bulk data integration processing

Dynamics 365 UO oData and REST

Dynamics 365 UO provides a REST API to interact with D365UO via data entities. The REST API provides a mechanism to interact with D365 UO in real-time or near real-time. oData can be used to populate, retrieve, update, and delete (CRUD) data in Dynamics 365 UO.
oData: Open Data Protocol (OData) is a standard protocol for consuming data exposed by Dynamics 365 for Operations. OData is a Representational State Transfer (REST) based protocol for CRUD operations – Create, Read, Update, and Delete – that allows for integration with Dynamics 365 for Operations. It is applied to all types of web technologies, such as HTTP and JavaScript Object Notation (JSON).
Data Entity: A data entity in D365 is an abstraction from the physical implementation of database tables. A data entity is a simplified, de-normalized representation of underlying tables. A data entity represents a common data concept or functionality (e.g. Vendors V2, where the details are stored in normalized relational tables, but all details are represented in one flat view in the Vendor Details data entity).
The data flow for interacting with Dynamics 365 UO using oData:

Dynamics 365 UO Integration Design Patterns: oData REST API authentication against azure AD

The Technical implementation of oData with Dynamics 365 UO can be found here

Dynamics 365 UO Business Event

The Dynamics 365 UO Business Events can send events/triggers/notifications to external applications such as Azure Integration services, which can use these triggers to handle specific integration or business process scenarios.
Events in Finance and Operations were previously confined to use within Finance and Operations. The new capability provides a framework that allows business processes in Finance and Operations to capture business events as the processes are executed and send the events to an external system or application.
More about the business event can be found here

Business events provide a perfect integration scenario when an event occurs in D365FO and the information needs to be passed on to third-party systems.

These business events can be used by

  • Azure Service Bus
  • Azure Logic Apps
  • Microsoft Flow
  • Azure Functions
  • HTTPS Trigger

Since these events happen in the context of business processes, they are called business events that enable business process integration. External business processes will subscribe to specific business events from Finance and Operations to get notified when they occur. The business events can also be consumed as “triggers” in the Finance and Operations connector.
A custom or OOTB business event can trigger Azure Integration Services to process or forward the trigger to Third-party applications.
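On the receiving side, a subscriber typically just parses the JSON payload and routes on the event identifier. A minimal Python sketch (the field names BusinessEventId and EventId are assumptions based on typical payloads; inspect a real payload from your environment to confirm):

```python
import json

def parse_business_event(body):
    """Extract the routing fields from a business event payload.
    Field names are assumptions; verify against an actual payload."""
    event = json.loads(body)
    return event.get("BusinessEventId"), event.get("EventId")

# A hypothetical payload as it might arrive on a Service Bus queue
# or an HTTPS trigger.
sample = '{"BusinessEventId": "PurchaseOrderConfirmedEvent", "EventId": "42"}'
event_type, event_id = parse_business_event(sample)
```

The same parsing step applies whether the event is consumed by a Logic App, an Azure Function, or a plain HTTPS endpoint.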

Dynamics 365 UO Integration Design Patterns using Business Events

Dynamics 365 UO Custom webservice

In Microsoft Dynamics 365 UO, a developer can create custom services to expose X++ functionality to external clients. Any existing X++ code can be exposed as a custom service by adding an attribute. D365 UO provides standard attributes that can be set on the data contract class and its members to automatically serialize and de-serialize data that is sent and received across a network connection. Many predefined types, such as collections and tables, are also supported. When a developer writes a custom service under a service group, the service group is always deployed on two endpoints:

  • SOAP endpoint
  • JSON endpoint

SOAP-based custom service

SOAP-based services remain the same as they were in Dynamics AX 2012.
Key changes

  • All the service groups under the AOTService group node are automatically deployed.
  • All services that must be deployed must be part of a service group.

Example endpoint for a dev environment

https://XXXX.dynamics.com/soap/services/DevServices?wsdl

JSON-based custom service

This feature enables X++ classes to be consumed as JSON services. In other words, the return data set is in JSON format. JSON, which stands for JavaScript Object Notation, is a compact, lightweight format that is commonly used to communicate data between the client and the server.

The JSON Endpoint is https://host_uri/api/services/service_group_name/service_group_service_name/operation_name.
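Following that endpoint pattern, the URL for a JSON custom service can be composed and invoked with a bearer token. A Python sketch (the service group, service, and operation names below are hypothetical placeholders):

```python
import json
import urllib.request

def json_service_url(host_uri, service_group, service, operation):
    """Compose the JSON custom service endpoint following the pattern:
    https://host_uri/api/services/service_group_name/
        service_group_service_name/operation_name"""
    return f"{host_uri}/api/services/{service_group}/{service}/{operation}"

def call_json_service(url, token, payload):
    """POST the request contract as JSON with a bearer token."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # "DevServices", "DevService", "getDetails" are placeholder names.
    url = json_service_url("https://XXXX.dynamics.com",
                           "DevServices", "DevService", "getDetails")
```

The SOAP endpoint of the same service group can be reached via the ?wsdl URL shown earlier.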

Bulk or Batch Data Processing

Data Management Framework

Data Management Framework: DMF is the all-in-one concept introduced by Microsoft in Dynamics 365 for Finance and Operations. It supports and manages all core data management related tasks, enabling asynchronous and high-performing data insertion and extraction scenarios. Examples include interactive file-based import/export and recurring integrations (file, queue, and so on).

Data Package: A data package is a simple .zip file that contains the source (import) or target (export) data itself. The zip file contains three files: the data file and the manifest files (Manifest.xml and PackageHeader.xml), which contain metadata of the data entity and the processing instructions for DMF.

Interacting with Dynamics 365 UO DMF REST API

In order to call the D365 F&O APIs, it is necessary to authenticate with a valid access token. The token can be retrieved from Azure Active Directory using a valid Application Id and secret key, which has access to the D365FO environment. The application ID and secret key are created by registering an application in Azure Active directory. Then the DMF REST API can be invoked.
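The token acquisition described above can be sketched in Python. This is a minimal illustration of the OAuth 2.0 client credentials flow, not a definitive implementation; note the resource URL is trimmed of any trailing slash, since a trailing "/" causes access denied:

```python
import json
import urllib.parse
import urllib.request

def build_token_request(client_id, client_secret, resource):
    """Form fields for the OAuth 2.0 client credentials grant.
    The resource URL must not end with '/', or access is denied."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource.rstrip("/"),
    }

def get_access_token(tenant, client_id, client_secret, resource):
    """POST to the Azure AD token endpoint and return the access token."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    data = urllib.parse.urlencode(
        build_token_request(client_id, client_secret, resource)).encode()
    with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
        return json.loads(resp.read())["access_token"]
```

The returned token is then sent as an `Authorization: Bearer` header on every DMF REST API call.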

Dynamics 365 UO Integration Design Patterns: DMF REST API
Interaction using REST API to Export Data

The high-level interaction of API calls to retrieve the data package via the REST API is shown below.

Dynamics 365 UO Integration Design Patterns: steps to export a DMF data package and download it using oData

The detailed technical implementation of Dynamics 365 UO DMF interaction using the REST API has been described here

Dynamics 365 UO Recurring Integration

Recurring integration does the following things:

  • It builds on data entities and the Data management framework.
  • It enables the exchange of documents or files between Finance and Operations and any third-party application or service.
  • It supports document and file exchange in several document formats, source mapping, Extensible Stylesheet Language Transformations (XSLT), and filters.
  • It uses secure REST application programming interfaces (APIs) and authorization mechanisms to receive data from, and send data back to, integration systems.

The complete flow to import a job via recurring integration is shown below.

Dynamics 365 UO Integration Design Patterns: Recurring Integration oData API for authentication and download
Recurring Integration using REST API
  1. The third-party client application authenticates to the Azure AD token issuance endpoint and requests an access token.
  2. The Azure AD token issuance endpoint issues the access token.
  3. The access token is used to authenticate to the D365FO DMF and initiate the import or export job. The endpoints are:

The following set of APIs is used to exchange data between the Dynamics 365 F&O Recurring Integrations client and Finance and Operations.
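Based on the documented Recurring Integrations API, the exchange is keyed by the activity ID of the recurring job: files are pushed via an enqueue endpoint and picked up via a dequeue endpoint. A Python sketch of the URL patterns (treat the exact paths as assumptions to verify against your environment):

```python
def enqueue_url(base_url, activity_id, entity):
    """URL to push a file into a recurring import job. activity_id is the
    recurring job's activity GUID, entity the target entity name."""
    return f"{base_url}/api/connector/enqueue/{activity_id}?entity={entity}"

def dequeue_url(base_url, activity_id):
    """URL to pick up the next available export document."""
    return f"{base_url}/api/connector/dequeue/{activity_id}"

# Hypothetical example against a sandbox environment.
push = enqueue_url("https://XXX.sandbox.operations.dynamics.com",
                   "18FA7FDE-557F-4421-BF6C-2C4417963C69", "Vendors")
```

Both calls carry the same Azure AD bearer token obtained in step 1 of the flow above.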

The detailed Technical implementation of Recurring integration can be found here

Microsoft integration patterns can be found here

Azure Integration: Setting null value in Logic App Action to Dynamics 365

In one of my Azure integrations involving Dynamics 365, I had to send a null value when a field value is empty. The Dynamics 365 "Update a record" or "Create a record" action in the Logic App was always sending an empty string ("") instead of null, which resulted in the integration failure. Sending null fixed the integration, but this was not possible via the designer.

The expression was sending out null, but the Logic App converted it to "".

@{if(equals(items('XXX')?['Type'], null), null, replace(items('XXX')?['FieldName'], ';', ','))}

The expression for the field value resulted in the following error: "Edm Object passed should have the options selected".

{
  "status": 400,
  "message": "--batchresponse_94b2278b-f1fd-4f65-94c9-c3640fba018b\r\nContent-Type: application/http\r\nContent-Transfer-Encoding: binary\r\n\r\nHTTP/1.1 204 No Content\r\nOData-Version: 4.0\r\n\r\n\r\n--batchresponse_94b2278b-f1fd-4f65-94c9-c3640fba018b\r\nContent-Type: application/http\r\nContent-Transfer-Encoding: binary\r\n\r\nHTTP/1.1 400 Bad Request\r\nREQ_ID: 64fce8db-bc4c-4653-92f7-8d976c4da1c7\r\nContent-Type: application/json; odata.metadata=minimal\r\nOData-Version: 4.0\r\n\r\n{\"error\":{\"code\":\"0x0\",\"message\":\"Edm Object passed should have the options selected. \",\"innererror\":{\"message\":\"Edm Object passed should have the options selected. \",\"type\":\"Microsoft.Crm.CrmHttpException\",\"stacktrace\":\"   at Microsoft.Crm.Extensibility.OData.TypeConverters.OptionSetValueCollectionEdmTypeConverter.ConvertToCrmTypeInternal(String edmTypeValue, String operationName)\\r\\n 
  },
  "source": "XXXXXXX.crm4.dynamics.com",
  "errors": [],
  "debugInfo": "clientRequestId: 2e965f81-110b-4f77-964c-057f4651c702"
}

Azure Integration with Dynamics 365: Use case

The use case was to create or update an Address entity in Dynamics 365 based on specific conditions. I used the standard Dynamics 365 "Update a record" / "Create a record" action. There is a specific field that accepts a specific value or null, but not an empty string.

Azure Integration: Setting null value in Logic App Action to Dynamics 365

Azure Integration with Dynamics 365: Resolution

When the Logic App designer wraps an expression in curly braces {} (string interpolation), the runtime converts a null result to an empty string. The trick is to go to the code view of the Logic App designer, find the action, and remove the curly braces {} around the expression:

@{if(equals(items('XXX')?['Type'], null), null, replace(items('XXX')?['FieldName'], ';', ','))}

to

@if(equals(items('XXX')?['Type'], null), null, replace(items('XXX')?['FieldName'], ';', ','))

Dynamics 365 UO: Integration using oData/REST API

This blog describes how to interact with Dynamics 365 Unified Operations using oData. Dynamics 365 UO provides a REST API to interact with data via data entities.

oData: Open Data Protocol (OData) is a standard protocol for consuming data exposed by Dynamics 365 for Operations. OData is a Representational State Transfer (REST) based protocol for CRUD operations – Create, Read, Update, and Delete – that allows for integration with Dynamics 365 for Operations. It is applied to all types of web technologies, such as HTTP and JavaScript Object Notation (JSON).

Data Entity: A data entity in D365 is an abstraction from the physical implementation of database tables. A data entity is a simplified, de-normalized representation of underlying tables. A data entity represents a common data concept or functionality (e.g. Vendors V2, where the details are stored in normalized relational tables, but all details are represented in one flat view in the Vendor Details data entity).

The data flow for querying data using oData:

Dynamics 365 UO: Integration using oData/REST API

Dynamics 365 UO: oData Features

  1. CRUD operations are handled through HTTP verb support for POST, PATCH, PUT, and DELETE.
  2. D365 UO supports paging, and the maximum page size is 1,000.
  3. Available query options include $filter, $select, $expand, $top, $skip, and $orderby.
  4. Filter options are: equals, not equals, greater than, greater than or equal, less than, less than or equal, and, or, not.
  5. D365 FO provides an option to query data cross-company.

URI conventions for oData are described in detail here

Querying Data Cross-Company

By default, OData returns only data that belongs to the user's default company. To query data outside the user's default company, append the keyword ?cross-company=true to the query. This option returns data from all companies that the user has access to.

Example: http://[baseURI]/data/PurchaseOrderHeadersV2?cross-company=true

To filter by a particular company that isn’t your default company, use the following syntax:

http://[baseURI]/data/PurchaseOrderHeadersV2?$filter=dataAreaId eq 'usrt'&cross-company=true
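The rule can be made explicit with a small helper: use "?" when the query string is empty and "&" when a filter is already present. A minimal Python sketch:

```python
def with_cross_company(url):
    """Append cross-company=true, using '&' when a query string
    already exists and '?' otherwise."""
    separator = "&" if "?" in url else "?"
    return url + separator + "cross-company=true"

base = "http://[baseURI]/data/PurchaseOrderHeadersV2"
all_companies = with_cross_company(base)
one_company = with_cross_company(base + "?$filter=dataAreaId eq 'usrt'")
```

The helper is purely illustrative; any HTTP client can of course build the query string directly.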

Azure Active Directory Authentication

In order to call the D365 UO oData endpoints, it is necessary to authenticate with a valid access token. The token can be retrieved from Azure Active Directory using a valid application ID and secret key which has access to the D365FO environment. The application ID and secret key are created by registering an application in Azure Active Directory.

Pre-requisite :

  1. Register an application in Azure AD and grant access to D365FO. The detailed steps are described here. Instead of Dynamics CRM, select Dynamics ERP.
  2. Register the AAD application in D365FO
    • System administration > Setup > Azure Active Directory applications
    • Click "New" -> enter the App ID (created as part of the previous step), a meaningful name, and the User ID (the permission you would like to assign).
Dynamics 365 UO: Integration using oData/REST API
  1. The client application authenticates to the Azure AD token issuance endpoint and requests an access token.
  2. The Azure AD token issuance endpoint issues the access token.
  3. The access token is used to authenticate to D365FO and invoke the oData endpoint.
  4. Data is returned to the third-party application.
Http Method: POST
Request URL: https://login.microsoftonline.com/{tenant}/oauth2/token
Parameters:
grant_type: client_credentials [Specifies the requested grant type. In a client credentials grant flow, the value must be client_credentials.]
client_id: the registered App ID of the AAD application
client_secret: a key of the registered application in AAD
resource: the URL of the D365FO environment (e.g. https://dev-d365-fo-ultdeabc5b35da4fe25devaos.cloudax.dynamics.com)

The resource URL should not have a trailing "/", otherwise you will always get access denied while accessing the target resource.

C# Code

//Azure AAD application settings
//The tenant URL (use the friendly name or the tenant ID)
static string aadTenant = "https://login.windows.net/dev.onmicrosoft.com";
//The URL of the resource you will access using the access token. Ensure there is no trailing "/"
static string aadResource = "https://dev-testdevaos.sandbox.ax.dynamics.com";
//Application ID. Store it securely (encrypted config file or secure store)
static string aadClientAppId = "GUID Of the Azure application";
//Application secret. Store it securely (encrypted config file or secure store)
static string aadClientAppSecret = "Secret of the Azure application";

/// <summary>
/// Retrieves the authentication header (access token) for the Web API call.
/// </summary>
private static string GetAuthenticationHeader()
{
    //using Microsoft.IdentityModel.Clients.ActiveDirectory;
    AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
    var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
    AuthenticationResult authenticationResult =
        authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
    return authenticationResult.AccessToken;
}

CRUD Operations on Data Entities

The following code shows examples of creating, reading, updating, and deleting a PurchaseOrderHeader entity. More detailed information on oData can be found here

private static async Task CRUDonPurchaseOrderHeader()
{
    string authHeader = GetAuthenticationHeader();
    HttpClient client = new HttpClient();
    client.BaseAddress = new Uri(aadResource);
    client.DefaultRequestHeaders.Clear();
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);

    //Initiate the PurchaseOrderHeader object
    var payload = new PurchaseOrderHeader()
    {
        PurchaseOrderNumber = "001-000234",
        DataAreaId = "001",
        OrderVendorAccountNumber = "000001",
        DeliveryAddressCountryRegionId = "NL",
        DeliveryAddressDescription = "Business Location",
        AccountingDate = "2019-09-19T12:00:00Z",
        PurchaseOrderName = "JDE Professional",
        RequestedDeliveryDate = "2019-09-19T12:00:00Z",
        ExpectedStoreAvailableSalesDate = "2019-09-19T12:00:00Z",
        ConfirmedDeliveryDate = "2019-09-19T12:00:00Z",
        ExpectedStoreReceiptDate = "2019-09-19T12:00:00Z",
        FixedDueDate = "2019-09-19T12:00:00Z",
        ExpectedCrossDockingDate = "2019-09-19T12:00:00Z"
    };
    var stringPayload = JsonConvert.SerializeObject(payload);
    var httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");

    //Create the Purchase Order
    var result = await client.PostAsync("/data/PurchaseOrderHeadersV2", httpContent);
    string resultContent = await result.Content.ReadAsStringAsync();
    JObject joResponse = JObject.Parse(resultContent);

    //Get a Purchase Order
    result = await client.GetAsync("/data/PurchaseOrderHeadersV2?$filter=PurchaseOrderNumber eq '001-000234'");
    resultContent = await result.Content.ReadAsStringAsync();
    joResponse = JObject.Parse(resultContent);

    //Update the PurchaseOrderHeader object
    payload = new PurchaseOrderHeader()
    {
        PurchaseOrderNumber = "001-000234",
        DataAreaId = "001",
        OrderVendorAccountNumber = "000001",
        DeliveryAddressCountryRegionId = "NL",
        DeliveryAddressDescription = "Business Location Address changed",
        AccountingDate = "2019-09-19T12:00:00Z",
        PurchaseOrderName = "JDE Professional",
        RequestedDeliveryDate = "2019-09-19T12:00:00Z",
        ExpectedStoreAvailableSalesDate = "2019-09-19T12:00:00Z",
        ConfirmedDeliveryDate = "2019-09-19T12:00:00Z",
        ExpectedStoreReceiptDate = "2019-09-19T12:00:00Z",
        FixedDueDate = "2019-09-19T12:00:00Z",
        ExpectedCrossDockingDate = "2019-09-19T12:00:00Z"
    };
    stringPayload = JsonConvert.SerializeObject(payload);
    httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
    result = await client.PatchAsync("/data/PurchaseOrderHeadersV2(dataAreaId='001',PurchaseOrderNumber='001-000234')", httpContent);

    //Delete the PurchaseOrderHeader object
    result = await client.DeleteAsync("/data/PurchaseOrderHeadersV2(dataAreaId='001',PurchaseOrderNumber='001-000234')");
}