Dynamics 365 UO: Recurring Integration to avoid DMF parallel execution issue

This blog describes a limitation of the Dynamics 365 UO Data Management Framework (DMF) around parallel execution and how to work around it using the Dynamics 365 UO Recurring Integration module. It also provides a technical implementation in .NET for queuing and dequeuing jobs using the Recurring Integration module's REST API.

Use Case: Dynamics 365 UO Recurring Integration

Dynamics 365 UO DMF import and export fail during parallel execution of jobs

The Data Management Framework works well when large amounts of data must be imported into or exported from Dynamics 365 UO. However, the DMF REST API for import and export does not work when third-party applications queue import and export jobs in parallel: it results in unexpected exceptions from the DMF endpoints and failures in the DMF data job. Parallel import/export data jobs work for different entities and fail only when the jobs target the same entity. For example, Vendors, Customers and General Journal can be imported in parallel, but multiple parallel import jobs for the same entity are not possible.

Figure: DMF parallel execution issue

The following exceptions were observed during the import:

  • XML is not in correct format and thus 0 General Journal records are inserted in staging.
  • Cannot edit a record in Entities for a processing group (DMFDefinitionGroupEntity). The record has never been selected.
  • Cannot delete a record in Source (DMFDefinitionGroupEntityXMLFields). Deadlock, where one or more users have simultaneously locked the whole table or part of it.
  • Exception occurred while executing action ImportFromPackage on Entity DataManagementDefinitionGroup: BOX API can't be used from non-interactive sessions.
  • The record already exists.
  • Cannot create a record in Source (DMFDefinitionGroupEntityXMLFields). Entity: General Journal, ACCOUNTINGDATE. Deadlock, where one or more users have simultaneously locked the whole table or part of it.

Resolution: Dynamics 365 UO Recurring Integration

The resolution is to use the Dynamics 365 UO Recurring Integration module, which provides:

  • A queuing mechanism for data jobs (import/export)
  • Sequential execution of jobs
  • The option of ordered execution of jobs

Figure: Recurring Integration

Dynamics 365 UO Recurring Integration

Recurring integration does the following:

  • It builds on data entities and the Data management framework.
  • It enables the exchange of documents or files between Finance and Operations and any third-party application or service.
  • It supports several document formats, source mapping, Extensible Stylesheet Language Transformations (XSLT), and filters.
  • It uses secure REST application programming interfaces (APIs) and authorization mechanisms to receive data from, and send data back to, integration systems.

The complete flow for submitting an import job to recurring integration is shown below.

Figure: Recurring Integration using REST API
  1. The third-party client application authenticates to the Azure AD token issuance endpoint and requests an access token.
  2. The Azure AD token issuance endpoint issues the access token.
  3. The access token is used to authenticate to the D365FO DMF and initiate the import or export job using the following endpoints.

The following set of APIs is used to exchange data between the Dynamics 365 F&O Recurring Integrations client and Finance and Operations.

Dynamics 365 UO Recurring Integration API for Import (enqueue)

Make an HTTP POST call against the following URL.

https://<base URL>/api/connector/enqueue/<activity ID>?entity=<entity name>

In the message body, you can pass the data as a memory stream.

To get the activity ID, on the Manage scheduled data jobs page, in the ID field, copy the globally unique identifier (GUID).

Dynamics 365 UO Recurring Integration API for Export (dequeue)

To return a data package that contains all the data entities that were defined in the data project, and that the client application can unzip and consume, use the following structure.

https://<base URL>/api/connector/dequeue/<activity ID>

After the client downloads the data, an acknowledgment must be sent back to Finance and Operations so that the data can be marked as received. If no file has been uploaded to the blob, the dequeue API returns a response indicating that.
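A minimal C# sketch of the dequeue and acknowledgment flow is shown below. The class and method names are illustrative, AuthManager is the helper class shown in the authorization section further down, and the HTTP verbs used here (GET for dequeue, POST for acknowledgment) as well as the acknowledgment route (/api/connector/ack/<activity ID>) should be verified against the recurring integrations documentation for your version.

 // Requires System.Net.Http, System.Net.Http.Headers, System.Text and System.Threading.Tasks.
 internal static class RecurringExportClient
 {
     // Placeholders: replace with your environment URL and the ID of the recurring export job.
     static string baseUrl = "https://XXXXX.cloudax.dynamics.com";
     static string activityId = "<<ID of the recurring export job>>";

     internal static async Task DequeueAndAcknowledgeAsync()
     {
         string authHeader = AuthManager.GetAuthenticationHeader();
         using (HttpClient httpClient = new HttpClient())
         {
             httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);

             // 1. Dequeue: returns a JSON message that describes the exported file/package (for example, a download location).
             HttpResponseMessage dequeueResponse = await httpClient.GetAsync(baseUrl + "/api/connector/dequeue/" + activityId);
             string dequeueMessage = await dequeueResponse.Content.ReadAsStringAsync();

             // If nothing has been exported yet, there is nothing to download or acknowledge.
             if (!dequeueResponse.IsSuccessStatusCode || string.IsNullOrWhiteSpace(dequeueMessage))
             {
                 return;
             }

             // 2. Download the file from the location referenced in the dequeue message (omitted here),
             //    then acknowledge receipt by posting the dequeue message back to the ack endpoint.
             using (StringContent ackBody = new StringContent(dequeueMessage, Encoding.UTF8, "application/json"))
             {
                 await httpClient.PostAsync(baseUrl + "/api/connector/ack/" + activityId, ackBody);
             }
         }
     }
 }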

The execution ID of the DMF job is returned to the client application and can be used to monitor the progress of the job.
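As a sketch of that monitoring step (assuming, as described above, that the ID returned by the enqueue call is a DMF execution ID), the status can be polled with the GetExecutionSummaryStatus OData action that is also used in the Data Management package API section further down. The method and parameter names below are illustrative.

 // Requires Newtonsoft.Json and an HttpClient whose BaseAddress points to the D365FO environment
 // and whose Authorization header carries the bearer token.
 internal static async Task<string> GetExecutionStatusAsync(HttpClient httpClient, string executionId)
 {
     var payload = new StringContent(
         JsonConvert.SerializeObject(new { executionId = executionId }),
         Encoding.UTF8,
         "application/json");

     HttpResponseMessage response = await httpClient.PostAsync(
         "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus",
         payload);

     string content = await response.Content.ReadAsStringAsync();
     // The action returns a value such as NotRun, Executing, Succeeded, PartiallySucceeded, Failed or Canceled.
     return JObject.Parse(content).GetValue("value").ToString();
 }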

The setup involves the following steps; the API supports both import and export of DMF data projects.

Figure: The recurring integration setup

Authorization for the Dynamics 365 UO Recurring Integration API

The integration REST API uses the same OAuth 2.0 authentication model as the other service endpoints. Before the integrating client application can consume this endpoint, you must create an application ID in Microsoft Azure Active Directory (Azure AD) and grant it the appropriate permissions. When you create and enable a recurring job, you're prompted to enter the Azure AD application ID that will interact with that recurring job. Therefore, be sure to make a note of the application ID.

 // Requires the Microsoft.IdentityModel.Clients.ActiveDirectory (ADAL) NuGet package.
 internal static class AuthManager
    {
        static string aadTenant = "https://login.windows.net/<<TenantName>>";
        internal static string aadResource = "https://XXXXX.cloudax.dynamics.com";
        static string aadClientAppId = "The client ID";
        static string aadClientAppSecret = "The Client Secret";

        /// <summary>
        /// Retrieves an authentication header from the service.
        /// </summary>
        /// <returns>The authentication header for the Web API call.</returns>
        internal static string GetAuthenticationHeader()
        {
            AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
            var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
            AuthenticationResult authenticationResult = authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
            return authenticationResult.AccessToken;
        }
    }

Set up a Dynamics 365 UO data project and Dynamics 365 UO recurring data jobs

Create a data project

  1. On the main dashboard, select the Data management tile to open the Data management workspace.
  2. Select the Import or Export tile to create a new data project.
  3. Enter a valid job name, data source, and entity name.
  4. Upload a data file for one or more entities. Make sure that each entity is added, and that no errors occur.
  5. Select Save.

Create a Dynamics 365 UO recurring data job

  1. On the Data project page, select Create recurring data job.
  2. Enter a valid name and a description for the recurring data job.
  3. On the Set-up authorization policy tab, enter the application ID that was generated for your application, and mark it as enabled.
  4. Expand Advanced options tab and specify either File or Data package.
  5. Select Set processing recurrence, and then, in the Define recurrence dialog box, set up a valid recurrence for your data job
  6. Select OK, and then select Yes in the confirmation message box.

Submitting data to Dynamics 365 UO recurring data jobs

You can use integration REST endpoints to integrate with the client, submit documents (import), or poll available documents for download (export). These endpoints support OAuth.

Queue the Dynamics 365 UO recurring Import Job

Dynamics 365 UO recurring data jobs API for import (enqueue)

Make an HTTP POST call against the following URL. In the message body, you can pass the data as a memory stream.

https://<base URL>/api/connector/enqueue/<activity ID>?entity=<entity name>

The following code shows how to queue an import job to recurring integration. This approach uses a data package-based import; recurring integration supports both data package and file imports. The following parameters are used:

  • The D365UO environment
  • The Legal entity Name
  • The ID of the recurring Job which was created in the previous step
  • The entity name which needs to be imported
  • Name or description for the import Job
  public static class RecurringIntegration
    {
        /// <summary>
        /// Sends a POST request to the enqueue endpoint.
        /// </summary>
        /// <param name="uri">Enqueue endpoint URI</param>
        /// <param name="authenticationHeader">Authentication header (bearer token)</param>
        /// <param name="bodyStream">Body stream containing the data package</param>
        /// <param name="externalCorrelationHeaderValue">Optional external correlation ID header value</param>
        /// <returns>The HTTP response message</returns>
        public static async Task<HttpResponseMessage> SendPostRequestAsync(Uri uri, string authenticationHeader, Stream bodyStream, string externalCorrelationHeaderValue = null)
        {
            string externalidentifier = "x-ms-dyn-externalidentifier";
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls |
                    SecurityProtocolType.Tls11 |
                    SecurityProtocolType.Tls12;

            using (HttpClientHandler handler = new HttpClientHandler() { UseCookies = false })
            {
                using (HttpClient httpClient = new HttpClient(handler))
                {
                    httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authenticationHeader);

                    // Add external correlation ID header if specified and valid
                    if (!string.IsNullOrEmpty(externalCorrelationHeaderValue))
                    {
                        httpClient.DefaultRequestHeaders.Add(externalidentifier, externalCorrelationHeaderValue);
                    }

                    if (bodyStream != null)
                    {
                        using (StreamContent content = new StreamContent(bodyStream))
                        {
                            return await httpClient.PostAsync(uri, content);
                        }
                    }
                }
            }

            return new HttpResponseMessage()
            {
                Content = new StringContent("Request failed at client.", Encoding.ASCII),
                StatusCode = System.Net.HttpStatusCode.PreconditionFailed
            };
        }

        /// <summary>
        /// Get the Enqueue URI
        /// </summary>
        /// <returns>Enqueue URI</returns>
        private static Uri GetEnqueueUri(string recurringJobId, string legalEntity, string entityName)
        {
            string environmentUrl = "https://XXXXXXX.cloudax.dynamics.com";
            string enqueueUrl = "/api/connector/enqueue/";
            // Build the URI against the connector API
            UriBuilder enqueueUri = new UriBuilder(environmentUrl);
            enqueueUri.Path = enqueueUrl + recurringJobId;
            // Data package        
            string enqueueQuery = "entity=" + entityName;
            if (!string.IsNullOrEmpty(legalEntity))
            {
                enqueueQuery += "&company=" + legalEntity;
            }
            enqueueUri.Query = enqueueQuery;        

            return enqueueUri.Uri;
        }
        public static Stream Read(string fullFilePath)
        {
            if (File.Exists(fullFilePath))
            {
                return new FileStream(fullFilePath,
                            FileMode.Open,
                            FileAccess.Read,
                            FileShare.Read,
                            0x1000,
                            true);
            }
            return null;
        }

        /// <summary>
        /// Enqueue the data package to recurring integration.
        /// </summary>
        internal static async void QueueImport()
        {
            Stream stream = Read(@"C:\Temp\GL\General Journal.zip");
            string authHeader = AuthManager.GetAuthenticationHeader();
            Uri enqueueUri = GetEnqueueUri("<<ID of the recurring Job>>", "<<Legal Entity>>", "<<Entity Name>>");
            string jobName = "The name of the Job";
            HttpResponseMessage result = await SendPostRequestAsync(enqueueUri, authHeader, stream, jobName);
            string resultContent = await result.Content.ReadAsStringAsync();
            Console.WriteLine("Response is");
            Console.WriteLine(resultContent);
        }
    }

Dynamics 365 FO Integration using Business Events

This blog describes a method to use Azure Integration with Dynamics 365 FO Business Events. Dynamics 365 FO business events can send events/triggers/notifications to external applications such as Azure Integration services, which can use the trigger to handle specific integration or business process scenarios.

Events in Finance and Operations were previously confined to use within Finance and Operations. The new capability provides a framework that allows business processes in Finance and Operations to capture business events as the processes are executed and send those events to an external system or application.

More about business events can be found here.

Business events provide a perfect integration scenario when an event occurs in D365FO and the information needs to be passed on to third-party systems.

These business events can be consumed by:

  • Azure Service Bus
  • Azure Logic Apps
  • Microsoft Flow
  • Azure Functions
  • HTTPS Trigger

Since these events happen in the context of business processes, they are called business events that enable business process integration.
External business processes will subscribe to specific business events from Finance and Operations to get notified when they occur. The business events can also be consumed as “triggers” in the Finance and Operations connector.

A Dynamics 365 FO integration use case scenario

Use case: Trigger a Third party application when a Vendor Record is created

At a high level, what I am trying to achieve is shown below: a custom business event will be created that triggers a Logic App, which forwards the trigger to a third-party application.

Dynamics 365 FO Integration Design Pattern: Business Events

Creating a custom Dynamics 365 FO business event

In this demo I will create a new business event from scratch to show the steps involved in creating and consuming a business event via Logic App. As the trigger data source, I will use the vendor table (VendTable). To create a custom business event, we need the following three artifacts:

  • BusinessEventsContract class
  • BusinessEventsBase class 
  • A Trigger Class to send the business Event

More information on creating business events can be found here.

BusinessEventContract Class

A business event contract class extends the BusinessEventsContract class and defines and populates the payload of the business event. Implementing a business event contract involves extending the BusinessEventsContract class, defining internal state, implementing an initialization method, implementing a static constructor method, and implementing parm methods to access the contract state.

/// <summary>
/// The data contract for the <c>DevVendorCreatedBusinessEvent</c> business event.
/// </summary>
[DataContract]
public  class DevVendorCreatedBusinessEventContract extends BusinessEventsContract
{   
    private VendAccount vendAccount;
    /// <summary>
    /// Initializes the field values.
    /// </summary>
    private void initialize(VendTable _vendTable)
    {
        vendAccount = _vendTable.AccountNum;
      
    }
    /// <summary>
    /// Creates a <c>VendorCreatedBusinessEventContract</c> from a <c>VendTable</c> record.
    /// </summary>
    /// <param name = "_VendTable">A <c>VendTable</c> record.</param>
    /// <returns>A <c>VendorCreatedBusinessEventContract</c>.</returns>
    public static DevVendorCreatedBusinessEventContract newFromVendTable(VendTable _vendTable)
    {
        var contract =  DevVendorCreatedBusinessEventContract::construct();
        contract.initialize(_vendTable);
        contract.parmVendAccount(_vendTable.AccountNum);
        return contract;
    }

    [DataMember('AccountNumber'), BusinessEventsDataMember("@Dev:AccountNumber")]
    public VendAccount parmVendAccount(VendAccount _vendAccount = vendAccount)
    {
        vendAccount = _vendAccount;

        return vendAccount;
    }
   
    private void new()
    {
    }

    public static DevVendorCreatedBusinessEventContract construct()
    {
        DevVendorCreatedBusinessEventContract retVal = new DevVendorCreatedBusinessEventContract();
        return retVal;
    }    
}
BusinessEventsBase extension

The process of implementing an extension of the BusinessEventsBase class involves extending the BusinessEventsBase class, and implementing a static constructor method, a private new method, methods to maintain internal state, and the buildContract method.

[BusinessEvents(classStr(DevVendorCreatedBusinessEventContract),
"Dev:VendorCreatedEvent","Dev:VendorCreatedEventDescription",ModuleAxapta::Vendor)]
public final class DevVendorCreatedBusinessEvent extends BusinessEventsBase
{
    private VendTable vendTable;
    private VendTable parmVendTable(VendTable _vendTable = vendTable)
    {
        vendTable = _vendTable;
        return vendTable;
    }

    private void new()
    {
        super();
    }

    public static DevVendorCreatedBusinessEvent construct()
    {
        DevVendorCreatedBusinessEvent retVal = new DevVendorCreatedBusinessEvent();
        return retVal;
    }

    [Wrappable(true), Replaceable(true)]
    public BusinessEventsContract buildContract()
    {
        return DevVendorCreatedBusinessEventContract::newFromVendTable(vendTable);
    }

    static public DevVendorCreatedBusinessEvent newFromVendTable(VendTable _vendTable)
    {
        DevVendorCreatedBusinessEvent businessEvent =  DevVendorCreatedBusinessEvent::construct();
        businessEvent.parmVendTable(_vendTable);
        return businessEvent;
    }
    
}
Sending/Triggering a Dynamics 365 FO business event

The trigger class is responsible for triggering the business event. In my use case, I would like to trigger the business event after a vendor record is created in the VendTable, so I will add a data event handler for the VendTable onInserted event to send the business event.

public static class DevVendorCreatedBusinessEventTrigger_Extension
{
    
    /// <summary>
    ///Send the business event on vendor record creation.
    /// </summary>
    /// <param name="sender">Vendor Table</param>
    /// <param name="e"></param>
    [DataEventHandler(tableStr(VendTable), DataEventType::Inserted)]
    public static void VendTable_onInserted(Common sender, DataEventArgs e)
    {
      
        VendTable vendTable = sender;
        DevVendorCreatedBusinessEvent businessEvent = DevVendorCreatedBusinessEvent::newFromVendTable(vendTable);
        if(businessEvent)
        {
            businessEvent.send();
        }
    }
}

Activate the custom Dynamics 365 FO business event

The business event catalog does not get refreshed automatically. To refresh it, go to
System administration -> Business event catalog -> Manage -> Rebuild business event catalog.
Once the rebuild is complete, the new business event will be added to the list.

Dynamics 365 FO Integration Design Pattern: Business Events

Activate the business event and assign an endpoint to it. Once it is activated, the business event should appear on the "Active events" tab.

Consume the Dynamics 365 FO business event in Logic App

The newly created business event will appear as a trigger in Logic Apps under the Dynamics 365 Fin and Ops connector.

Dynamics 365 FO Integration Design Pattern: Business Events

The Logic App can then retrieve the information about the vendor and forward it to the third-party applications.

Dynamics 365 FO Integration Design Pattern: Business Events

    Azure Integration Release Management best practices

    Introduction

    This blog describes the best practices and guidelines for using Azure DevOps for Azure Integration development and release management.

    Figure 1: High Level view of AIS Release management

    Azure DevOps / Release management

    Branches

    For releasing software to other environments, three different branches are used.

    Figure 2: Branching strategy for AIS

    Dev: The dev branch is used by developers to check-in all pending changes in Azure integration solutions.

    Main: The Main branch is used by the release manager to merge all changes from the Dev-branch once these changes are tested and approved by the Tester. The artifacts from this branch will always be used to deploy the solution to Test, Acceptance and Production environment.

    Release: The release branch is created by the release manager once Azure integration solutions for the current iteration are approved. The latest version of the release-branches can be used to apply hotfixes on the current release.

    Check-in policy in DevOps: Every check-in (changeset) must be linked to a related task or issue in DevOps.

    Naming convention for Release Branch

    Release branches should be named with a combination of the business release number and the date of the release in YYYYMMDD format (for example, Release_2.0_20191023).

    Guidelines to Merge the Code

    The following merging guidelines should be followed during code merge

    Manual Merge of code changes from Release to Dev

    The merging of hotfix changes is done manually. The hotfix engineer should communicate the fix to the dev lead of the project, clearly including information such as the actual change, the DevOps issue ID, and the changeset number. The hotfix engineer should ensure that they receive confirmation from the dev lead that the change has been merged. The dev lead should check in the change with the correct issue ID from DevOps.

    Manual Merge of code changes from Dev to Main

    Changes from Dev are merged to Main manually. The release manager of the project should merge the changes and link all the changesets and DevOps tasks/issues that are part of the merge. This helps to create a technical release document listing all the features and issues included in the release.

    Build pipelines

    Two different build pipelines are available:

    Build-Main: Manual build for Main-branch.

    Release Build: Manual build for Release-branch.

    Release pipelines

    Two different Release pipelines are available:

    Release-Main: Release pipeline for Main-branch.

    Release Hotfix: Release pipeline for Release(Hotfix)-branch.

    Resource Group management

    The following is the recommendation for resource group management for Azure Integration.

    Azure Integration: ARM visualiser on Azure portal

    Azure Resource Manager (ARM) visualization is now part of the Azure portal. It provides a visual representation of ARM templates, which helps in understanding the components in a template, how they interact, and which components are used in a resource group.

    Azure Integration: ARM Visualizer

    1. Log in to the release candidate of the Azure portal: https://rc.portal.azure.com/
    2. Select the correct subscription.
    3. Go to the resource group you want to visualize.
    4. Click Export template under the Settings tab.
    5. Then click the Visualizer tab to visualize the components.
    6. It also provides options to filter the components and to show them with or without labels.

    Azure: Enumerating all the Logic App runs history using PowerShell and REST API

    What Are Logic Apps?

    Logic Apps are integration workflows hosted on Azure that are used to create scalable integrations between various systems. They are easy to design and provide connectivity between disparate systems using many out-of-the-box connectors, as well as the ability to design custom connectors for specific purposes. This makes integration easier than ever, as workflows can be designed and running with a minimum of steps.

    Problem Scenario

    When Logic Apps are hosted on the Microsoft Azure platform to integrate various business flows, it becomes imperative that the Logic App runs are monitored on a daily basis to check for errors and to resubmit failed executions. There are scenarios where Azure administrators need to resubmit all failed runs of a Logic App. The problem with the Azure PowerShell cmdlet Get-AzureRmLogicAppRunHistory is that it returns only the latest 30 items.

    The following Azure PowerShell command only returns the latest 30 runs:
    Get-AzureRmLogicAppRunHistory -ResourceGroupName $grpName -Name $logicApp.Name

    To enumerate all the Logic App runs, the REST API should be used with its paging mechanism. The REST API can be called from PowerShell using
    Invoke-RestMethod

    REST APIs allow users to interact with various services over HTTP/HTTPS and follow a common methodology of using methods to read and manipulate information. They return information in a standard way, typically through JavaScript Object Notation (JSON). The Invoke-RestMethod cmdlet is built with REST in mind: it allows the user to invoke various methods of web service APIs and easily parse the output.

    The Invoke-RestMethod cmdlet sends HTTP and HTTPS requests to Representational State Transfer (REST) web services that return richly structured data. PowerShell formats the response based on the data type. For JavaScript Object Notation (JSON) or XML, PowerShell converts (or deserializes) the content into objects.

    Get Logic App Runs using REST

    Azure exposes a REST API to list the workflow runs:

    https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}/runs?api-version=2016-06-01

    The API endpoints are protected using OAuth 2.0 and can be accessed using a bearer token (access token). Bearer authentication (also called token authentication) is an HTTP authentication scheme that involves security tokens called bearer (access) tokens. The client must send the access token in the Authorization header when making requests to protected resources:
    Authorization: Bearer <token>
    The access token from an interactive login can be retrieved using PowerShell as described below. The access token can also be retrieved using registered application credentials in Azure AD.

    Connect-AzureRmAccount
    Get-AzureRmContext
    $subscription = Get-AzureRmSubscription -SubscriptionName $subscriptionName
    $context = $subscription | Set-AzureRmContext
    $tokens = $context.TokenCache.ReadItems() | Where-Object { $_.TenantId -eq $context.Subscription.TenantId } | Sort-Object -Property ExpiresOn -Descending
    $token = $tokens[0].AccessToken

    The Authentication header for Invoke-RestMethod can be passed as described below.

     $headers = @{
        'Authorization' = 'Bearer ' + $token
      }
    Invoke-RestMethod -Method 'POST' -Uri $uri -Headers $headers

    The following code shows how to get all the Logic App runs and resubmit the failed ones. The response contains the nextLink property, which provides the link to the next page of items.

    function Get-LogicAppHistory {
      param
      (
        [Parameter(Mandatory = $true)]
        $Token,
        [Parameter(Mandatory = $true)]
        $subscriptionId,
        [Parameter(Mandatory = $true)]
        $resourceGroupName,
        [Parameter(Mandatory = $true)]
        $logicAppName,
        [Parameter(Mandatory = $false)]
        $status,
        [Parameter(Mandatory = $true)]
        $startDateTime,
        [Parameter(Mandatory = $false)]
        $endDateTime
      )
      $headers = @{
        'Authorization' = 'Bearer ' + $token
      }
      $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/runs?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName
      $method = (Invoke-RestMethod -Uri $uri -Headers $headers -Method Get) 
      $output = $method.value
      foreach ($item in $output) {
        if (($item.properties.status -eq $status) -and ([DateTime]$item.properties.startTime -ge $startDateTime) -and ([DateTime]$item.properties.startTime -le $endDateTime))
        {
          $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/triggers/{3}/histories/{4}/resubmit?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName,$item.properties.Trigger.Name,$item.Name
          Write-Host "Submitting" $uri
          Invoke-RestMethod -Method 'POST' -Uri $uri -Headers $headers
        }
      }
      while ($method.nextLink)
      {
        $nextLink = $method.nextLink; 
        Write-Host $nextLink
        $method = (Invoke-RestMethod -Uri $nextLink -Headers $headers -Method Get)
        $output = $method.value
        foreach ($item in $output) {
          if (($item.properties.status -eq $status) -and ([DateTime]$item.properties.startTime -ge $startDateTime) -and ([DateTime]$item.properties.startTime -le $endDateTime))
          {
            $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/triggers/{3}/histories/{4}/resubmit?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName,$item.properties.Trigger.Name,$item.Name
            Write-Host "Submitting" $uri
            Invoke-RestMethod -Method 'POST' -Uri $uri -Headers $headers
          }
        }
      }
    }

    The complete Code

    The complete code is below.

    function Get-LogicAppHistory {
      param
      (
        [Parameter(Mandatory = $true)]
        $Token,
        [Parameter(Mandatory = $true)]
        $subscriptionId,
        [Parameter(Mandatory = $true)]
        $resourceGroupName,
        [Parameter(Mandatory = $true)]
        $logicAppName,
        [Parameter(Mandatory = $false)]
        $status,
        [Parameter(Mandatory = $true)]
        $startDateTime,
        [Parameter(Mandatory = $false)]
        $endDateTime
      )
      $headers = @{
        'Authorization' = 'Bearer ' + $token
      }
      $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/runs?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName
      $method = (Invoke-RestMethod -Uri $uri -Headers $headers -Method Get) 
      $output = $method.value
      foreach ($item in $output) {
        if (($item.properties.status -eq $status) -and ([DateTime]$item.properties.startTime -ge $startDateTime) -and ([DateTime]$item.properties.startTime -le $endDateTime))
        {
          $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/triggers/{3}/histories/{4}/resubmit?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName,$item.properties.Trigger.Name,$item.Name
          Write-Host "Submitting" $uri
          Invoke-RestMethod -Method 'POST' -Uri $uri -Headers $headers
        }
      }
      while ($method.nextLink)
      {
        $nextLink = $method.nextLink; 
        Write-Host $nextLink
        $method = (Invoke-RestMethod -Uri $nextLink -Headers $headers -Method Get)
        $output = $method.value
        foreach ($item in $output) {
          if (($item.properties.status -eq $status) -and ([DateTime]$item.properties.startTime -ge $startDateTime) -and ([DateTime]$item.properties.startTime -le $endDateTime))
          {
            $uri = 'https://management.azure.com/subscriptions/{0}/resourceGroups/{1}/providers/Microsoft.Logic/workflows/{2}/triggers/{3}/histories/{4}/resubmit?api-version=2016-06-01' -f $subscriptionId,$resourceGroupName,$logicAppName,$item.properties.Trigger.Name,$item.Name
            Write-Host "Submitting" $uri
            Invoke-RestMethod -Method 'POST' -Uri $uri -Headers $headers
          }
        }
      }
    }
    function ResubmitFailedLogicApp {
      param(
        [Parameter(Mandatory = $true)]
        [string]$subscriptionName,
        [Parameter(Mandatory = $true)]
        [string]$resourceGroupName,
        [Parameter(Mandatory = $true)]
        [string]$logicAppName,
        [Parameter(Mandatory = $true)]
        [string]$status
      )
      $currentAzureContext = Get-AzureRmContext
      if (!$currentAzureContext)
      {
        Connect-AzureRmAccount
        $currentAzureContext = Get-AzureRmContext
      }
      $startDateTime = Get-Date -Date '2019-10-14'
      $endDateTime = Get-Date -Date '2019-10-23'
      $subscription = Get-AzureRmSubscription -SubscriptionName $subscriptionName
      $context = $subscription | Set-AzureRmContext
      $tokens = $context.TokenCache.ReadItems() | Where-Object { $_.TenantId -eq $context.Subscription.TenantId } | Sort-Object -Property ExpiresOn -Descending
      $token = $tokens[0].AccessToken
      $subscriptionId = $subscription.Id;
      Write-Host $subscriptionId
      Get-LogicAppHistory -Token $token -SubscriptionId $subscriptionId -resourceGroupName $resourceGroupName -logicAppName $logicAppName -Status $status -startDateTime $startDateTime -endDateTime $endDateTime
    }
    Write-Host "#######  Example  #######"
    Write-Host "ResubmitFailedLogicApp -subscriptionName 'New ENT Subscription' -resourceGroupName 'resourceName' -logicAppName 'LogicAppName' -status 'Failed'"
    Write-Host "#######  Example  #######"
    ResubmitFailedLogicApp


    D365FO: Interacting with Data Management framework using REST API

    This blog describes how to interact with the Data Management Framework using the REST API to export the delta changes of an entity. The package API lets third-party applications integrate with D365FO by using data packages.

    Use case scenario:

    The vendor changes are tracked using the "Change Tracker" functionality available in D365FO at entity level. Using change tracking on the Vendors V2 entity, the DMF incremental export job can export only the records modified since the last export execution. Using the REST API, third-party applications initiate the export job. The following image describes the data flow at a high level.

    Introduction to the terms

    Data Management Framework: DMF is the new all-in-one concept introduced by Microsoft in Dynamics 365 for Finance and Operations. It supports and manages all core data management related tasks. This enables asynchronous and high-performing data insertion and extraction scenarios. Here are some examples: Interactive file-based import/export, Recurring integrations (file, queue, and so on)

    Data Entity: A data entity in D365 is an abstraction from the physical implementation of database tables. A data entity is a simplified, de-normalized representation of underlying tables. A data entity represents a common data concept or functionality (e.g. Vendors V2, where the details are stored in normalized relational tables but are represented as one flat view in the data entity).

    Data Package: A data package is a simple .zip file that contains the source data (import) or target data (export) itself. The zip file contains three files: the data file and the manifest files, which hold the metadata of the data entity and the processing instructions for DMF.

    Implementation Details

    The integration involves the following steps

    • Enable change tracking
    • Creation of the Data Export DMF Project
    • Authentication against Azure AD
    • Interact with DMF using REST API

    Enable change tracking

    Change tracking enables incremental export of data from Finance and Operations by using Data management. In an incremental export, only records that have changed are exported. To enable incremental export, you must enable change tracking on entities. If you don't enable change tracking on an entity, you can only perform a full export each time.

    The vendor changes are tracked using the "Change Tracker" functionality available at entity level. Using change tracking on the Vendors V2 entity, the DMF incremental export job can export the records modified since the last export execution. The following steps are used to enable change tracking in D365FO:

    1. Go to Data Management work space-> Data Entities.
    2. Select the “Vendors V2” Entity for which you want to enable Change Tracking.
    3. In the Action Pane, go to Change Tracking and select one of the following options:
      • Enable entire entity – Enables tracking for all writable data sources used in the entity. This may have a negative performance impact on the system.
      • Enable primary table
      • Enable Custom query

    Creation of the DMF Data Project

    In D365FO, an export job should be created in the Data Management Framework to export the vendor changes. The export will contain the changes made to vendor records since the previous successful export. Below are the steps to import or export data.

    1. Create an import or export job (more on this can be found here)
      • Define the project category (Export/Import): Export
      • Identify the entities to import or export: "Vendors V2"
      • Set the data format for the job: Excel, CSV, XML etc.
      • Determine whether to use staging tables : No
      • Group Name: VendorIncrementalChanges
      • Default refresh type: Incremental Push Only
    2. Validate that the source data and target data are mapped correctly.

    Azure AD Authentication

    In order to call the D365 F&O APIs, it is necessary to authenticate with a valid access token. The token can be retrieved from Azure Active Directory using a valid Application Id and secret key, which has access to the D365FO environment. The application ID and secret key are created by registering an application in Azure Active directory.

    Pre-requisite :

    1. Register an application in Azure AD and grant it access to D365FO. The detailed steps are described here. Instead of Dynamics CRM, select Dynamics ERP.
    2. Register the AAD application in D365FO
      • System administration > Setup > Azure Active Directory applications
      • Click “New” -> Enter APP-ID(created as part of the previous step), Meaningful name and User ID (the permission you would like to assign).
    1. The client application authenticates to the Azure AD token issuance endpoint and requests an access token.
    2. The Azure AD token issuance endpoint issues the access token.
    3. The access token is used to authenticate to the D365FO DMF and initiate DMF Job.
    4. Data from the DMF is returned to the third-party application.
    HTTP method: POST
    Request URL: https://login.microsoftonline.com/<tenant-id>/oauth2/token
    Parameters:
      • grant_type: client_credentials [Specifies the requested grant type. In a client credentials grant flow, the value must be client_credentials.]
      • client_id: The registered app ID of the AAD application
      • client_secret: A key (secret) of the registered application in AAD
      • resource: The URL of the D365FO environment (e.g. https://dev-d365-fo-ultdeabc5b35da4fe25devaos.cloudax.dynamics.com)
    

    The resource URL should not have a trailing "/"; otherwise, you will always get access denied while accessing the target resource.

    C# Code

    //Azure AAD application settings
    //The tenant URL (use the friendly name or the tenant ID)
    static string aadTenant = "https://login.windows.net/dev.onmicrosoft.com";
    //The URL of the resource you will be accessing using the access token. Ensure there is no trailing "/" at the end of the URL
    static string aadResource = "https://dev-testdevaos.sandbox.ax.dynamics.com";
    //Application ID. Store it securely (encrypted config file or secure store)
    static string aadClientAppId = "GUID Of the Azure application";
    //Application secret. Store it securely (encrypted config file or secure store)
    static string aadClientAppSecret = "Secret of the Azure application";

    /// <summary>
    /// Retrieves an authentication header from the service.
    /// </summary>
    /// <returns>The authentication header for the Web API call.</returns>
    private static string GetAuthenticationHeader()
    {
        //using Microsoft.IdentityModel.Clients.ActiveDirectory;
        AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
        var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
        AuthenticationResult authenticationResult = authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
        return authenticationResult.AccessToken;
    }
    

    Interaction using REST API

    The high level interaction of API calls to get the delta package via REST API is shown below.

    Step 1: Export to Package:

    The Export to Package API is used to initiate an export of a data package.
    Request Message:

    POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage
    Message Body
    {
        "definitionGroupId":" The name of the data project for export.",
        "packageName":" The name of the exported data package.",
        "executionId":" The ID to use for the job. If an empty ID is assigned, a new execution ID will be created.",
        "reExecute”: True,
        "legalEntityId":" The legal entity for the data import."
    }
    
    

    C# Code

     string authHeader = GetAuthenticationHeader();
     HttpClient client = new HttpClient();
     client.BaseAddress = new Uri(aadResource);
     client.DefaultRequestHeaders.Clear();
     client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);

     //Initiate the export
     string execytionID = Guid.NewGuid().ToString();
     var payload = new DMFExport()
     {
         DefinitionGroupId = jobName,
         PackageName = packageName,
         ExecutionId = execytionID,
         ReExecute = true,
         LegalEntityId = legalEntity
     };
     var stringPayload = JsonConvert.SerializeObject(payload);
     var httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
     var result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage", httpContent).Result;
     string resultContent = await result.Content.ReadAsStringAsync();
     JObject joResponse = JObject.Parse(resultContent);
     string outPut = string.Empty;
     if (result.StatusCode == System.Net.HttpStatusCode.OK)
     {
         // Success
     }
     else
     {
         // Failure
     }
    

    Step 2: GetExecutionSummaryStatus

    The GetExecutionSummaryStatus API is used for both import jobs and export jobs. It is used to check the status of a data project execution job. The possible values for the execution status are:
    Unknown / NotRun / Executing / Succeeded / PartiallySucceeded / Failed / Canceled
    If the status is 'Executing', resubmit the request every 60 seconds until the response indicates a completion status (success or failure).

    Request:

    POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus
    BODY
    {"executionId":"Execution Id Provided to the Previous Request"}
    

    C#

    int maxLoop = 15;
    do
    {
        //Waiting for package execution to complete
        Thread.Sleep(5000);
        maxLoop--;
        if (maxLoop <= 0)
        {
            break;
        }
        //Checking status...
        stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = execytionID });
        httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
        result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus", httpContent).Result;
        resultContent = await result.Content.ReadAsStringAsync();
        outPut = JObject.Parse(resultContent).GetValue("value").ToString();
        //Status of export is: outPut
    }
    while (outPut == "NotRun" || outPut == "Executing");
    

    Step 3 GetExportedPackageUrl

    The GetExportedPackageUrl API is used to get the URL of the data package that was exported by a call to Export Package.
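    Mirroring the previous steps, the request takes the execution ID of the export run (this is the same call that appears in the complete code further down). The response contains a value property with a blob URL (typically a time-limited SAS URL) from which the package can be downloaded.

    POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl
    BODY
    {"executionId":"Execution Id provided to the export request"}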

    Step 4 Download package File

    The file can be downloaded with an HTTP GET request using the URL provided in the response to the previous request. The downloaded file is a zip file and needs to be extracted. The extracted zip contains three files, but "Vendors V2.xml" is the only file used for further processing.
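    As a minimal sketch (the complete code below uses the Azure Storage SDK's CloudBlockBlob instead; the method name and parameters here are illustrative), the package can also be downloaded with a plain HTTP GET against the URL returned in the previous step:

    // Requires System.IO, System.Net.Http and System.Threading.Tasks; downloadUrl and targetPath are placeholders.
    static async Task DownloadPackageAsync(string downloadUrl, string targetPath)
    {
        using (HttpClient httpClient = new HttpClient())
        using (Stream blobStream = await httpClient.GetStreamAsync(downloadUrl))
        using (FileStream fileStream = File.Create(targetPath))
        {
            await blobStream.CopyToAsync(fileStream);
        }
        // The downloaded .zip can then be extracted; only "Vendors V2.xml" is needed for further processing.
    }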

    Complete Code C#

    using Microsoft.Azure.Storage.Blob;
    using Microsoft.IdentityModel.Clients.ActiveDirectory;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Linq;
    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading;
    
    namespace Dev.DMF.Interface
    {
        public class DMFExport
        {
            [JsonProperty("definitionGroupId")]
            public string DefinitionGroupId { get; set; }
    
            [JsonProperty("packageName")]
            public string PackageName { get; set; }
    
            [JsonProperty("executionId")]
            public string ExecutionId { get; set; }
    
            [JsonProperty("reExecute")]
            public bool ReExecute { get; set; }
    
            [JsonProperty("legalEntityId")]
            public string LegalEntityId { get; set; }
        }
    
        public class DMFExportSummary
        {      
            [JsonProperty("executionId")]
            public string ExecutionId { get; set; }     
        }
        internal class DMFManager
        {
          
            static string downloadUrl = string.Empty;
    
            //Azure AAD application settings
            //The tenant URL (use the friendly name or the tenant ID)
            static string aadTenant = "https://login.windows.net/dev.onmicrosoft.com";
            //The URL of the resource you will be accessing using the access token
            //Ensure there is no trailing "/" at the end of the URL
            static string aadResource = "https://dev-testdevaos.sandbox.ax.dynamics.com";
            //Application ID. Store it securely (encrypted config file or secure store)
            static string aadClientAppId = "GUID Of the Azure application";
            //Application secret. Store it securely (encrypted config file or secure store)
            static string aadClientAppSecret = "Secret of the Azure application";
    
            /// <summary>
            /// Retrieves an authentication header from the service.
            /// </summary>
            /// <returns>The authentication header for the Web API call.</returns>
            private static string GetAuthenticationHeader()
            {
                //using Microsoft.IdentityModel.Clients.ActiveDirectory;
                AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
                var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
                AuthenticationResult authenticationResult = authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
                return authenticationResult.AccessToken;
            }
    
            // Setup Step 
            // - Create an export project within Dynamics called ExportVendors in company USMF before you run the following code
            // - It can use any data format (for example, XML) and can include any number of data entities
            // 1. Initiate export of a data project to create a data package within Dynamics 365 for Operations
    
            private static async void Export(string jobName, string packageName, string legalEntity, string filePath, string fileName)
            {
                string authHeader = GetAuthenticationHeader();
                HttpClient client = new HttpClient();
                client.BaseAddress = new Uri(aadResource);
                client.DefaultRequestHeaders.Clear();        
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);
    
                //Initiate the Export
                string execytionID = Guid.NewGuid().ToString();
                var payload = new DMFExport()
                {
                    DefinitionGroupId = jobName,
                    PackageName = packageName,
                    ExecutionId =execytionID,
                    ReExecute = true,
                    LegalEntityId =legalEntity
                };
                Console.WriteLine("Initiating export of a data project...");
                var stringPayload = JsonConvert.SerializeObject(payload);
                var httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
                var result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage", httpContent).Result;         
                string resultContent = await result.Content.ReadAsStringAsync();
                JObject joResponse = JObject.Parse(resultContent);
                string outPut = string.Empty;
                if (result.StatusCode == System.Net.HttpStatusCode.OK)
                {
               
                    Console.WriteLine("Initiating export of a data project...Complete");
                    int maxLoop = 15;
                    do
                    {
                        Console.WriteLine("Waiting for package to execution to complete");
    
                        Thread.Sleep(5000);
                        maxLoop--;
    
                        if (maxLoop <= 0)
                        {
                            break;
                        }
    
                        Console.WriteLine("Checking status...");
    
                        stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = execytionID });
                        httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
    
                         result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus", httpContent).Result;
                         resultContent = await result.Content.ReadAsStringAsync();
                         outPut = JObject.Parse(resultContent).GetValue("value").ToString();
                      
                        Console.WriteLine("Status of export is "+ outPut);
    
                    }
                    while (outPut == "NotRun" || outPut == "Executing");
    
                    if (outPut != "Succeeded" && outPut != "PartiallySucceeded")
                    {
                        throw new Exception("Operation Failed");
                    }
                    else
                    {
                        // 3. Get the downloadable URL to download the package
                        //    POST / data / DataManagementDefinitionGroups / Microsoft.Dynamics.DataEntities.GetExportedPackageUrl
                        stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = execytionID });
                        httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
    
                        result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl", httpContent).Result;
                        resultContent = await result.Content.ReadAsStringAsync();
                        downloadUrl = JObject.Parse(resultContent).GetValue("value").ToString();
                    }
    
                    // 4. Download the file from Url to a local folder
                    Console.WriteLine("Downloading the file ...");
                    var blob = new CloudBlockBlob(new Uri(downloadUrl));
                    blob.DownloadToFile(Path.Combine(filePath, fileName + ".zip"), System.IO.FileMode.Create);
                    Console.WriteLine("Downloading the file ...Complete");
    
    
                }
                else
                {
                    Console.WriteLine("Initiating export of a data project...Failed");
                }
            }       
        }
    }
    
    

    Dynamics 365 UO: Data Task automation

    Dynamics 365 UO Data task automation is a framework that helps with the following:

    • Demo Data Setup
    • Golden Configuration Setup
    • Data Migration Validation
    • Data Entities and Integration Test automation

    D365 UO: Introduction to Data Task Automation

    Data packages from the Shared Asset Library or Project Asset Library can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace. The high level data flow diagram is shown below:

    The following image shows the features of Data Task Automation

    The automation tasks are configured through a manifest. The following figure shows an example of a DTA manifest file.

    The above manifest file can be loaded into Data management and results in the creation of several data automation tasks as shown below.

    The combination of data packages and data task automation allows users to build a flexible framework that automates the generation of all relevant data in a new deployment from templates, and to create test cases for recurring integration testing as well as normal test cases.

    Data Task Automation: The Manifest file

    The manifest file provides the mechanism to define the test cases/tasks for data task automation. The manifest has two main sections: the shared setup and the task definitions.

    Data Task Automation: SharedSetup

    The Shared setup section defines general task parameters and behaviors for all tasks in the manifest.

    1: Data files from LCS: This element defines the data packages and data files that the tasks in the manifest will use. The data files must be either in the LCS asset library of your LCS project or in the Shared asset library.

    • Name
    • Project ID (the LCS project ID; when it is empty, the package is accessed from the Shared asset library)

    2: Generic Job Definition: This section defines the data project definition. There can be more than one job definition in a manifest. A test case can override this definition at task level.

    The 'Recurring batch' type defined under the Mode element helps in creating recurring integration scenarios. There are other modes to simulate different scenarios, such as initiating the export/import from the UI.

    3: Generic Entity Definition

    This section defines the characteristics of an entity that a task in the manifest will use. There can be more than one definition, one for each entity that is used by the tasks in the manifest. A test case can override the entity definition at task level.

    Data Task Automation: Test Group definition

    Test groups can be used to organize related tasks in a manifest, and there can be more than one test group in a manifest. A test group contains a number of test cases. A test case definition can override the generic job and entity definitions by specifying values specific to the test case (e.g. DataFile, Mode, etc.), as sketched below.
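    As an illustrative sketch (the element and attribute names follow the standard data task automation manifest schema, the RefID values reference the shared setup of the sample manifest at the end of this post, and the test group/test case names and IDs are placeholders), a test group could look like this:

    <TestGroup name='Customer import'>
        <TestCase Title='Import CustomersV3 package' ID='CustomersV3Import' RepeatCount='1' TraceParser='off' TimeOutInMinutes='20'>
            <DataFile RefID='CustomersV3' />
            <JobDefinition RefID='GenericIntegrationTestDefinition' />
            <EntitySetup RefID='Generic' />
        </TestCase>
    </TestGroup>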

    Data Task Automation: Best practices for manifest design

    Granularity

    • Define the granularity of your manifests based on functional use cases.
    • During development, start with many small manifests; as the project progresses, merge them at a functional level.
    • Consider separation of duties. For example, you might have one manifest for the setup of demo data and another manifest for the setup of the golden configuration for your environment. In this way, you can make sure that team members use only the manifests that they are supposed to use.

    Inheritance

    • The manifest schema supports inheritance of common elements that will apply to all tasks in the manifest. A task can override a common element to define a unique behavior. The purpose of the Shared setup section is to minimize repetition of configuration elements, so that elements are reused as much as possible. The goal is to keep the manifest concise and clean, to improve maintenance and readability.

    Source Code

    • Manifests that must be used by all the members of an implementation team should be stored in source control in the Application Object Tree (AOT).

    Benefits of Data Task Automation

    Data task automation in Microsoft Dynamics 365 for Finance and Operations lets you easily repeat many types of data tasks and validate the outcome of each task. The following list summarizes the benefits of DTA:

    • Built into the D365 Product – Environment Agnostic / Available to everyone
    • Low Code/No Code – Declarative Authoring of Tasks [xml]
    • LCS Integration for Test Data (As Data Packages)
      • Shared Asset Library
      • Project Asset Library
    • Upload data to multiple legal entities in one go using DTA
    • It can be included in AOT resources using VS packages (Dev ALM)
      • It will be available in the D365FO UI
    • It also supports loading the manifest from the file system
    • Good Validations
      • Job, batch and Integration status
      • Records Count
      • Staging and Target status
      • Other options
        • Skip staging
        • Truncations

    Steps to Setup Data Task Automation

    Create the required Data Packages

    Identify the required data for the test cases (Tasks) and create the Data packages.

    Upload the Data packages to LCS

    Upload the data packages to the LCS asset library of your project; the steps to create data packages are described here.

    Create the Manifest file for Data Task Automation

    Identify the operations and test cases to be performed via DTA, then create the manifest file required for these operations. Detailed information about the manifest can be found here. Store the manifest file in source control and follow the best practices above when designing it.

    Below is a simple example manifest file that imports the CustomersV3 entity. It expects a data package named “CustomersV3” in LCS (the result of the previous two steps). Visual Studio or VS Code is ideal for authoring the manifest.

    <?xml version='1.0' encoding='utf-8'?>
    <TestManifest name='HSO-DTA-Testing-Intro'>
        <SharedSetup>
            <DataFile ID='CustomersV3' name='CustomersV3' assetType='Data package' lcsProjectId='1368270'/>
            <JobDefinition ID='GenericIntegrationTestDefinition'>
                <Operation>Import</Operation>
                <SkipStaging>Yes</SkipStaging>
                <Truncate></Truncate>
                <Mode>Recurring batch</Mode>
                <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
                <NumberOfTimesToRunBatch>2</NumberOfTimesToRunBatch>
                <UploadFrequencyInSeconds>1</UploadFrequencyInSeconds>
                <TotalNumberOfTimesToUploadFile>1</TotalNumberOfTimesToUploadFile>
                <SupportedDataSourceType>Package</SupportedDataSourceType>
                <ProcessMessagesInOrder>No</ProcessMessagesInOrder>
                <PreventUploadWhenZeroRecords>No</PreventUploadWhenZeroRecords>
                <UseCompanyFromMessage>Yes</UseCompanyFromMessage>
                <LegalEntity>DAT</LegalEntity>
            </JobDefinition>
            <EntitySetup ID='Generic'>
                <Entity name='*'>
                    <SourceDataFormatName>Package</SourceDataFormatName>
                    <ChangeTracking></ChangeTracking>
                    <PublishToBYOD></PublishToBYOD>
                    <DefaultRefreshType>Full push only</DefaultRefreshType>
                    <ExcelWorkSheetName></ExcelWorkSheetName>
                    <SelectFields>All fields</SelectFields>
                    <SetBasedProcessing></SetBasedProcessing>
                    <FailBatchOnErrorForExecutionUnit>No</FailBatchOnErrorForExecutionUnit>
                    <FailBatchOnErrorForLevel>No</FailBatchOnErrorForLevel>
                    <FailBatchOnErrorForSequence>No</FailBatchOnErrorForSequence>
                    <ParallelProcessing>
                        <Threshold></Threshold>
                        <TaskCount></TaskCount>
                    </ParallelProcessing>
                    <!-- <MappingDetail StagingFieldName='devLNFN' AutoGenerate='Yes' AutoDefault='No' DefaultValue='' IgnoreBlankValues='No' TextQualifier='No' UseEnumLabel='No'/> -->
                </Entity>
            </EntitySetup>
        </SharedSetup>
        <TestGroup name='Manage Integration Test for Entities'>
            <TestCase Title='Adding New Customer  via Integration' ID='AddNewCustomeViaIntegration' RepeatCount='1' TraceParser='on' TimeOut='20'>
                <DataFile RefID='CustomersV3' />
                <JobDefinition RefID='GenericIntegrationTestDefinition'/>
                <EntitySetup RefID='Generic' />
            </TestCase>
        </TestGroup>
    </TestManifest>                   

    Upload Manifest to Dynamics 365 UO

    1. Log in to the D365FO portal.

    2. Navigate to the Data management workspace.

    3. Navigate to DTA by clicking Data task automation.

    4. Click “Load tasks from file” in the top left corner.

    5. Select and upload the manifest XML file.

    6. The upload lists all the tasks defined in the manifest. Select all of them and click Run.

    7. A dialog appears; click “Click here to connect to Lifecycle Services”, which should complete successfully.

    8. Then click OK.

    9. The tasks start running, and the outcome of the test cases is shown once they complete.

    10. Select a task/test case and click “Show validation results” to see its detailed results.

    Summary of Data Task Automation

    • DTA is a great way to automate test cases during development.
    • It provides a repeatable mechanism to test data entities using data packages.
    • The business logic associated with the entities is executed during DTA runs.
    • DTA also helps with performance baselining and regression testing.
    • It still needs improvements, especially in the validation phase: the outcome only says whether a record was added successfully, but it says nothing about the integrity of the data.
    • DTA test cases are not yet integrated with DevOps, but DTA is already part of ALM.

    Azure Integration: Connecting to an On-Premise web service and file system from a Logic App

    This blog entry describes the Azure Integration approach to connect to on-premises web services using the on-premises data gateway, Azure Logic Apps and a Logic App custom connector. It also describes an approach to read files from an on-premises file system using Azure Integration.

    Use Case: Azure Integration for On-Premises Systems

    Consuming an on-premises web service from Azure integration services. There are a number of cases where enterprises want to connect their on-premises environments to the cloud and expose data via web services. This blog describes a method to consume an on-premises web service in Azure Logic Apps and to read files from the on-premises file system.

    Azure Gateway LogicApp: Connecting to an OnPremise webservice and file system from a Logic App

    The components used are:

    • A custom web service hosted on-premises
    • OnPremise Data Gateway
    • Logic Apps custom connector
    • Logic Apps

    The on-premises web service could be a REST or a WCF web service. The communication between on-premises and the cloud happens through the on-premises data gateway. Steps to configure and install a data gateway are described here.

    Export Webservice collection using Postman for Azure Integration

    We need to create a Postman collection of our web services in order to create a custom Logic App connector. Detailed steps and additional information can be found here.

    • In Postman, on the Builder tab, select the HTTP method, enter the request URL for the API endpoint, and select an authorization protocol, if any.
    • Enter key-value pairs for the request header. For common HTTP headers, you can select from the dropdown list.
      • Content-Type: application/json
    • Enter content that you want to send in the request body. To check that the request works by getting a response back, choose Send.
    • Choose Save.
    • Under Save Request, provide a request name and a request description. The custom connector uses these values for the API operation summary and description.
    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Postman collection
    • Choose + Create Collection and provide a collection name. The custom connector uses this value when you call the API.
    • Above the response window, choose Save Response.
    • At the top of the app, provide a name for your example response, and choose Save Example.
    • Under Headers, hover over each header, and choose the X next to the header to remove it. Choose Save to save the collection again.
    • Choose the ellipsis (...) next to the collection, then choose Export.
    • Choose the Collection v1 export format, choose Export, then browse to the location where you want to save the JSON file.
    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Postman collection


    Create an Azure Integration Custom Logic App Connector

    1. Log on to https://portal.azure.com/ with the account used for Gateway registration
    2. Select the correct Subscription and search for Logic Apps Custom Connector
    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

    3. Click on Logic Apps Custom Connector and click Add.
    4. Then fill in the following information: the correct subscription, resource group, custom connector name and location.

    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

    5. Click Review + create, and then click Create.
    6. Once the connector is created, go to the resource and click Edit. Then fill in the following information:
       API endpoint: REST (or SOAP, depending on your service type).
       Select Postman collection V1 and import the file exported from Postman.
       This fills in information such as the description, scheme, host name and base URL. Ensure the information is correct; if not, edit it.
       Make sure to select “Connect via on-premises data gateway” and optionally upload an image.

    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

    7. Click on Security, select the correct authentication scheme, and then click on the definition.
    8. Validate the definition and click “Update connector”.

    Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

    Interact with an on-premise webservice in Azure Integration Logic App

    • Create a new Logic App with a start condition, either a scheduler or an HTTP trigger.
    • Add an action and search for the custom connector, which appears in the Custom section.
    Azure Gateway LogicApp: Showing the response from onPrem web service: Select the OnPrem API connection
    • Select one of the actions from the custom connector. Each method in the web service appears as an action here.
    • Selecting an action brings up a dialog to create the API connection required to communicate with the web service. Fill in the following information:
      Connection Name: a name to identify the API connection
      Authentication Type: the authentication type used by the web service
      Subscription: the subscription where the gateway is present
      Connection Gateway: the on-premises gateway you want to use to connect to the on-premises web service
      Then click Create.
    Azure Gateway LogicApp: Showing the response from onPrem web service: Create the API connection
    • Enter the values required to call the web method.
    Azure Gateway LogicApp: Showing the response from onPrem web service

    Interact with an on-premise file system from Azure integration Logic Apps

    • Click on a new action in the Logic App, search for “File System” and select one of the following actions.
    Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service: Select File system action
    • This brings up a dialog to create an API connection to the on-premises file system. Fill in the details and click Create.
    Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service: Create File system API connection
    • Click Save and then run the Logic App. The run should show the results from the on-premises web service and the file system.
    Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service

    Azure Integration: Installing and configuring On-premises data gateway

    The on-premises data gateway provides quick and secure data transfer between on-premises data sources and Azure components. It acts as a bridge between on-premises systems and the Azure cloud. Currently, the gateway can be used with the following cloud services:

    • Azure Logic Apps
    • Power BI
    • Power Apps
    • Power Automate
    • Azure Analysis Services 

    Azure Integration: Install Azure Data Gateway

    Before installing the gateway, it helps to understand how a request flows through it:

    1. The cloud service creates a query, together with the encrypted credentials for the on-premises data source (for example, a call to a custom web service or a request to read a file from an on-premises server), and sends it to a queue for the gateway to process.
    2. The gateway cloud service analyzes the query and pushes the request to the Azure Service Bus.
    3. The on-premises data gateway polls the Azure Service Bus for pending requests.
    4. The gateway gets the query, decrypts the credentials, and connects to the data source with those credentials.
    5. The gateway sends the query to the data source for execution.
    6. The results are sent from the data source, back to the gateway, and then to the gateway cloud service. The gateway cloud service then uses the results.
    How Logic App on-premise data gateway works

    Install data gateway

    1. Download and run the gateway installer on a local computer.
    2. After the installer opens, select Next.
    3. Select On-premises data gateway (recommended), which is standard mode, and then select Next.
    Installer intro
    4. Review the minimum requirements, keep the default installation path, accept the terms of use, and then select Install.
    Select gateway mode
    5. After the gateway successfully installs, provide the email address for your Azure account, and then select Next.

    Your gateway installation can link to only one Azure account.

    6. Select Register a new gateway on this computer > Next. This step registers your gateway installation with the gateway cloud service.
    Sign in with work or school account
    7. Provide this information for your gateway installation:
      • A gateway name that’s unique across your Azure AD tenant
      • The recovery key, which must have at least eight characters
      • Confirmation of your recovery key
    Register gateway

     Important: Save and keep your recovery key in a safe place. You need this key if you ever want to change the location, move, recover, or take over a gateway installation.

    Check the region for the gateway cloud service and Azure Service Bus that’s used by your gateway installation. By default, this region is the same location as the Azure AD tenant for your Azure account.

    To accept the default region, select Configure. However, if the default region isn’t the one that’s closest to you, you can change the region.

    More information on Data Gateway can be found here

    Register the on-premises gateway in Azure

    1. Log on to https://portal.azure.com/ with the account used for Gateway registration
    2. Select the correct subscription and search for On-premises Data Gateways.

    3. Click on On-premises Data Gateways and fill in the required information.

    The name of the on-premises gateway will appear in the installation name field. If it does not appear, make sure the correct subscription is selected (the domain used for registration) and that the account has sufficient permissions.

    4. Click Create, and within a moment you will be able to use the on-premises gateway from the cloud.