D365FO: Interacting with Data Management framework using REST API

This blog describes how to interact with the Data Management Framework (DMF) using the REST API to export the delta changes of an entity. The package API lets third-party applications integrate with D365FO by using data packages.

Use case scenario:

Vendor changes are tracked using the “Change Tracking” functionality available in D365FO at entity level. With change tracking enabled on the Vendors V2 entity, the DMF incremental export job exports only the records that were modified since the last export execution. Using the REST API, third-party applications initiate the export job. The following image describes the data flow at a high level.

Introduction to the terms

Data Management Framework: DMF is the all-in-one data management concept introduced by Microsoft in Dynamics 365 for Finance and Operations. It supports and manages all core data management tasks and enables asynchronous, high-performing data insertion and extraction scenarios. Examples include interactive file-based import/export and recurring integrations (file, queue, and so on).

Data Entity: A data entity in D365 is an abstraction from the physical implementation of database tables. A data entity is a simplified, de-normalized representation of underlying tables. A data entity represents a common data concept or functionality (e.g. Vendors V2, where the details are stored in normalized relational tables but are all represented in one flat view in the data entity).

Data Package: A data package is a simple .zip file that contains the source data (import) or target data (export) itself. The zip file contains three files: the data file and the manifest files, which contain the metadata of the data entity and the processing instructions for DMF.
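
For example, an exported package for this scenario would typically contain the following files (file names vary with the entity and project configuration, so treat this as an illustration):

  • Vendors V2.xml – the exported entity data
  • Manifest.xml – the entity and mapping metadata
  • PackageHeader.xml – the processing instructions for DMF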

Implementation Details

The integration involves the following steps

  • Enable change tracking
  • Creation of the Data Export DMF Project
  • Authentication against Azure AD
  • Interact with DMF using REST API

Enable change tracking

Change tracking enables incremental export of data from Finance and Operations by using Data management. In an incremental export, only records that have changed are exported. To enable incremental export, you must enable change tracking on entities. If you don’t enable change tracking on an entity, you can only perform a full export each time.

Vendor changes are tracked using the “Change Tracking” functionality available at entity level. With change tracking enabled on the Vendors V2 entity, the DMF incremental export job exports only the records modified since the last export execution. The following steps are used to enable change tracking in D365FO:

  1. Go to the Data management workspace -> Data entities.
  2. Select the “Vendors V2” entity for which you want to enable change tracking.
  3. In the Action Pane, go to Change tracking and select one of the following options:
    • Enable entire entity – enables tracking for all writable data sources used in the entity. This can have a negative performance impact on the system.
    • Enable primary table
    • Enable custom query

Creation of the DMF Data Project

In D365FO, an export job (data project) must be created in the Data Management Framework to export the vendor changes. The export will contain the changes made to vendor records since the previous successful export. Below are the steps to set up the import or export job.

  1. Create an import or export job (more on this can be found here)
    • Define the project category (Export/Import): Export
    • Identify the entities to import or export: “Vendors V2”
    • Set the data format for the job: Excel, CSV, XML etc.
    • Determine whether to use staging tables : No
    • Group Name: VendorIncrementalChanges
    • Default refresh type: Incremental Push Only
  2. Validate that the source data and target data are mapped correctly.

Azure AD Authentication

In order to call the D365FO APIs, it is necessary to authenticate with a valid access token. The token can be retrieved from Azure Active Directory using a valid application ID and secret key that have access to the D365FO environment. The application ID and secret key are created by registering an application in Azure Active Directory.

Pre-requisite :

  1. Register an application in Azure AD and grant it access to D365FO. The detailed steps are described here. Instead of Dynamics CRM, select Dynamics ERP.
  2. Register the AAD application in D365FO
    • System administration > Setup > Azure Active Directory applications
    • Click “New” -> Enter the App ID (created as part of the previous step), a meaningful name and the user ID (the permissions you would like to assign).

The authentication flow works as follows:

  1. The client application authenticates to the Azure AD token issuance endpoint and requests an access token.
  2. The Azure AD token issuance endpoint issues the access token.
  3. The access token is used to authenticate to the D365FO DMF and initiate the DMF job.
  4. Data from the DMF is returned to the third-party application.
Http Method: POST
Request URL: https://login.microsoftonline.com/<tenant-id>/oauth2/token
Parameters:
grant_type: client_credentials [Specifies the requested grant type. In a Client Credentials Grant flow, the value must be client_credentials.]
client_id: The registered App ID of the AAD application
client_secret: A key (secret) of the registered application in AAD
resource: The URL of the D365FO environment (e.g. https://dev-d365-fo-ultdeabc5b35da4fe25devaos.cloudax.dynamics.com)

The resource URL should not have “/” at the end, otherwise you will always get access denied while accessing the target resource.
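
For clarity, the raw token request looks roughly like this (placeholder values; the body is form-urlencoded):

POST https://login.microsoftonline.com/<tenant-id>/oauth2/token HTTP/1.1
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id=<application-id>&client_secret=<application-secret>&resource=https%3A%2F%2Fdev-d365-fo-ultdeabc5b35da4fe25devaos.cloudax.dynamics.com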

C# Code

//Azure AAD Application settings
//The tenant URL (use the friendly name or the tenant ID)
static string aadTenant = "https://login.windows.net/dev.onmicrosoft.com";
//The URL of the resource you would be accessing using the access token.
//Please ensure "/" is not there at the end of the URL
static string aadResource = "https://dev-testdevaos.sandbox.ax.dynamics.com";
//Application ID. Store it securely (encrypted config file or secure store)
static string aadClientAppId = "GUID of the Azure application";
//Application secret. Store it securely (encrypted config file or secure store)
static string aadClientAppSecret = "Secret of the Azure application";

/// <summary>
/// Retrieves an authentication header from the service.
/// </summary>
/// <returns>The authentication header for the Web API call.</returns>
private static string GetAuthenticationHeader()
{
    //using Microsoft.IdentityModel.Clients.ActiveDirectory;
    AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
    var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
    AuthenticationResult authenticationResult = authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
    return authenticationResult.AccessToken;
}

Interaction using REST API

The high level interaction of API calls to get the delta package via REST API is shown below.

Step 1: Export to Package:

The Export to Package API is used to initiate an export of a data package.
Request Message:

POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage
Message Body
{
    "definitionGroupId": "The name of the data project for export",
    "packageName": "The name of the exported data package",
    "executionId": "The ID to use for the job. If an empty ID is assigned, a new execution ID will be created",
    "reExecute": true,
    "legalEntityId": "The legal entity for the data import"
}

C# Code

string authHeader = GetAuthenticationHeader();
HttpClient client = new HttpClient();
client.BaseAddress = new Uri(aadResource);
client.DefaultRequestHeaders.Clear();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);

//Initiate the export
string executionId = Guid.NewGuid().ToString();
var payload = new DMFExport()
{
    DefinitionGroupId = jobName,
    PackageName = packageName,
    ExecutionId = executionId,
    ReExecute = true,
    LegalEntityId = legalEntity
};
var stringPayload = JsonConvert.SerializeObject(payload);
var httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
var result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage", httpContent).Result;
string resultContent = await result.Content.ReadAsStringAsync();
JObject joResponse = JObject.Parse(resultContent);
string outPut = string.Empty;
if (result.StatusCode == System.Net.HttpStatusCode.OK)
{
    // Success
}
else
{
    // Failure
}

Step 2: GetExecutionSummaryStatus

The GetExecutionSummaryStatus API is used for both import jobs and export jobs. It is used to check the status of a data project execution job. The possible values for the execution status are:
Unknown / NotRun / Executing / Succeeded / PartiallySucceeded / Failed / Canceled
If the status is ‘Executing’, resubmit the request after 60 seconds and repeat until the response returns a completion status (success or failure).

Request:

POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus
BODY
{"executionId":"Execution Id Provided to the Previous Request"}

C#

int maxLoop = 15;
do
{
    //"Waiting for package execution to complete"
    Thread.Sleep(5000);
    maxLoop--;
    if (maxLoop <= 0)
    {
        break;
    }
    //"Checking status..."
    stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = executionId });
    httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
    result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus", httpContent).Result;
    resultContent = await result.Content.ReadAsStringAsync();
    outPut = JObject.Parse(resultContent).GetValue("value").ToString();
    //"Status of export is " + outPut
}
while (outPut == "NotRun" || outPut == "Executing");

Step 3 GetExportedPackageUrl

The GetExportedPackageUrl API is used to get the URL of the data package that was exported by a call to Export Package.
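
The request follows the same pattern as the previous calls; a sketch based on the complete code further below:

POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl
BODY
{"executionId":"Execution Id provided to the export request"}

The value field of the response contains the downloadable blob URL of the exported package.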

Step 4 Download package File

The file can be downloaded with an HTTP GET request using the URL provided in the response to the previous request. The downloaded file is a zip file that needs to be extracted. The extracted package contains three files, but “Vendors V2.xml” is the only file used for further processing.
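
As a minimal sketch, the package can also be downloaded with a plain HTTP GET (the complete code below uses the Azure Storage SDK via CloudBlockBlob instead; downloadUrl, filePath and fileName are the same variables used there):

// Download the exported package to a local zip file using the URL returned by GetExportedPackageUrl
using (var downloadClient = new HttpClient())
{
    byte[] packageBytes = downloadClient.GetByteArrayAsync(downloadUrl).Result;
    File.WriteAllBytes(Path.Combine(filePath, fileName + ".zip"), packageBytes);
}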

Complete Code C#

using Microsoft.Azure.Storage.Blob;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading;

namespace Dev.DMF.Interface
{
    public class DMFExport
    {
        [JsonProperty("definitionGroupId")]
        public string DefinitionGroupId { get; set; }

        [JsonProperty("packageName")]
        public string PackageName { get; set; }

        [JsonProperty("executionId")]
        public string ExecutionId { get; set; }

        [JsonProperty("reExecute")]
        public bool ReExecute { get; set; }

        [JsonProperty("legalEntityId")]
        public string LegalEntityId { get; set; }
    }

    public class DMFExportSummary
    {      
        [JsonProperty("executionId")]
        public string ExecutionId { get; set; }     
    }
    internal class DMFManager
    {
      
        static string downloadUrl = string.Empty;

        //Azure AAD Application settings
        //The tenant URL (use the friendly name or the tenant ID)
        static string aadTenant = "https://login.windows.net/dev.onmicrosoft.com";
        //The URL of the resource you would be accessing using the access token
        //Please ensure "/" is not there at the end of the URL
        static string aadResource = "https://dev-testdevaos.sandbox.ax.dynamics.com";
        //Application ID. Store it securely (encrypted config file or secure store)
        static string aadClientAppId = "GUID of the Azure application";
        //Application secret. Store it securely (encrypted config file or secure store)
        static string aadClientAppSecret = "Secret of the Azure application";      

        /// <summary>
        /// Retrieves an authentication header from the service.
        /// </summary>
        /// <returns>The authentication header for the Web API call.</returns>
        private static string GetAuthenticationHeader()
        {
            //using Microsoft.IdentityModel.Clients.ActiveDirectory;
            AuthenticationContext authenticationContext = new AuthenticationContext(aadTenant);
            var credential = new ClientCredential(aadClientAppId, aadClientAppSecret);
            AuthenticationResult authenticationResult = authenticationContext.AcquireTokenAsync(aadResource, credential).Result;
            return authenticationResult.AccessToken;
        }

        // Setup Step 
        // - Create an export project within Dynamics called ExportVendors in company USMF before you run the following code
        // - It can be of any data format (e.g. XML) and can include any number of data entities
        // 1. Initiate export of a data project to create a data package within Dynamics 365 for Operations

        private static async void Export(string jobName, string packageName, string legalEntity, string filePath, string fileName)
        {
            string authHeader = GetAuthenticationHeader();
            HttpClient client = new HttpClient();
            client.BaseAddress = new Uri(aadResource);
            client.DefaultRequestHeaders.Clear();        
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authHeader);

            //Initiate the Export
            string executionId = Guid.NewGuid().ToString();
            var payload = new DMFExport()
            {
                DefinitionGroupId = jobName,
                PackageName = packageName,
                ExecutionId = executionId,
                ReExecute = true,
                LegalEntityId = legalEntity
            };
            Console.WriteLine("Initiating export of a data project...");
            var stringPayload = JsonConvert.SerializeObject(payload);
            var httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");
            var result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage", httpContent).Result;         
            string resultContent = await result.Content.ReadAsStringAsync();
            JObject joResponse = JObject.Parse(resultContent);
            string outPut = string.Empty;
            if (result.StatusCode == System.Net.HttpStatusCode.OK)
            {
           
                Console.WriteLine("Initiating export of a data project...Complete");
                int maxLoop = 15;
                do
                {
                    Console.WriteLine("Waiting for package to execution to complete");

                    Thread.Sleep(5000);
                    maxLoop--;

                    if (maxLoop <= 0)
                    {
                        break;
                    }

                    Console.WriteLine("Checking status...");

                    stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = executionId });
                    httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");

                     result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus", httpContent).Result;
                     resultContent = await result.Content.ReadAsStringAsync();
                     outPut = JObject.Parse(resultContent).GetValue("value").ToString();
                  
                    Console.WriteLine("Status of export is "+ outPut);

                }
                while (outPut == "NotRun" || outPut == "Executing");

                if (outPut != "Succeeded" && outPut != "PartiallySucceeded")
                {
                    throw new Exception("Operation Failed");
                }
                else
                {
                    // 3. Get the downloadable URL to download the package
                    //    POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl
                    stringPayload = JsonConvert.SerializeObject(new DMFExportSummary() { ExecutionId = executionId });
                    httpContent = new StringContent(stringPayload, Encoding.UTF8, "application/json");

                    result = client.PostAsync("/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl", httpContent).Result;
                    resultContent = await result.Content.ReadAsStringAsync();
                    downloadUrl = JObject.Parse(resultContent).GetValue("value").ToString();
                }

                // 4. Download the file from Url to a local folder
                Console.WriteLine("Downloading the file ...");
                var blob = new CloudBlockBlob(new Uri(downloadUrl));
                blob.DownloadToFile(Path.Combine(filePath, fileName + ".zip"), System.IO.FileMode.Create);
                Console.WriteLine("Downloading the file ...Complete");


            }
            else
            {
                Console.WriteLine("Initiating export of a data project...Failed");
            }
        }       
    }
}

Dynamics 365 UO: Data Task automation

Dynamics 365 UO Data task automation is a framework that helps with the following features:

  • Demo Data Setup
  • Golden Configuration Setup
  • Data Migration Validation
  • Data Entities and Integration Test automation

D365 UO: Introduction to Data Task Automation

Data packages from the Shared Asset Library or Project Asset Library can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace. The high level data flow diagram is shown below:

The following image shows the features of Data Task Automation

The automation tasks are configured through a manifest. The following figure shows an example of a DTA manifest file.

The above manifest file can be loaded into Data management and results in the creation of several data automation tasks as shown below.

The combination of data packages and data task automation allows users to build a flexible framework that automates the generation of all relevant data in a new deployment from a template, and to create test cases for recurring integration testing as well as normal test cases.

Data Task Automation: The Manifest file

The manifest file provides the mechanism to create test cases/tasks for Data Task Automation. The manifest has two main sections: Shared setup and Task definition.

Data Task Automation: SharedSetup

The Shared setup section defines general task parameters and behaviors for all tasks in the manifest.

1: Data files from LCS: This element defines the data packages and data files that the tasks in the manifest will use (an example element is shown after the list below). The data files must be either in the LCS asset library of your LCS project or in the Shared asset library.

  • Name
  • Project ID (the LCS project ID; when it is empty, the package is accessed from the Shared Asset Library)
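
For example, the full manifest later in this post references a data package from an LCS project like this:

<DataFile ID='CustomersV3' name='CustomersV3' assetType='Data package' lcsProjectId='1368270'/>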

2: Generic Job Definition: This section defines the data project definition. There can be more than one job definition in a manifest. A test case can override the job definition at task level.

The Recurring batch type, defined under the Mode element, helps in creating recurring integration scenarios. There are other mode types to simulate different scenarios, such as initiating the export/import from the UI.
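
As an illustration, the job definition used in the example manifest later in this post runs as a recurring batch import (excerpt):

<JobDefinition ID='GenericIntegrationTestDefinition'>
    <Operation>Import</Operation>
    <Mode>Recurring batch</Mode>
    <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
    <LegalEntity>DAT</LegalEntity>
</JobDefinition>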

3: Generic Entity Definition

This section defines the characteristics of an entity that a task in the manifest will use. There can be more than one definition, one for each entity that is used by the tasks in the manifest. A test case can override the entity definition at task level.

Data Task Automation: Test Group definition

Test groups can be used to organize related tasks in a manifest. There can be more than one test group in a manifest. A test group contains a number of test cases. A test case definition can override the generic job and entity definitions by specifying values specific to the test case (e.g. DataFile, Mode, etc.).

Data Task Automation: Best practices for manifest design

Granularity

  • Define the granularity of your manifest based on the functional use case.
  • During development, start with as many manifests as needed; as the project progresses, merge manifests at a functional level.
  • Consider separation of duties. For example, you might have one manifest for the setup of demo data and another manifest for the setup of the golden configuration for your environment. In this way, you can make sure that team members use only the manifests that they are supposed to use.

Inheritance

  • The manifest schema supports inheritance of common elements that will apply to all tasks in the manifest. A task can override a common element to define a unique behavior. The purpose of the Shared setup section is to minimize repetition of configuration elements, so that elements are reused as much as possible. The goal is to keep the manifest concise and clean, to improve maintenance and readability.

Source Code

  • Manifests that must be used by all the members of an implementation team should be stored in source control in the Application Object Tree (AOT).

Benefits of Data Task Automation

Data task automation in Microsoft Dynamics 365 for Finance and Operations lets you easily repeat many types of data tasks and validate the outcome of each task. The following lists the benefits of DTA:

  • Built into the D365 Product – Environment Agnostic / Available to everyone
  • Low Code/No Code – Declarative Authoring of Tasks [xml]
  • LCS Integration for Test Data (As Data Packages)
    • Shared Asset Library
    • Project Asset Library
  • Upload data to multiple legal entities in one go using DTA
  • It can be included in AOT resources using VS packages (Dev ALM)
    • It will be available in the D365FO UI
  • It also supports loading the manifest from the file system
  • Good Validations
    • Job, batch and Integration status
    • Records Count
    • Staging and Target status
    • Other options
      • Skip staging
      • Truncations

Steps to Setup Data Task Automation

Create the required Data Packages

Identify the required data for the test cases (Tasks) and create the Data packages.

Upload the Data packages to LCS

Upload the data packages to the LCS asset library. The steps to create and upload data packages can be found here.

Create the Manifest file for Data Task Automation

Identify the operations and test cases to be performed via DTA, then create the manifest file required for these operations. Detailed information about the manifest can be found here. Store the manifest file in source control and follow the best practices described above while designing it.

Below is a simple example manifest file to import the CustomersV3 entity. It expects a data package named “CustomersV3” in LCS (as a result of the previous two steps). Visual Studio or VS Code is ideal for authoring the manifest.

<?xml version='1.0' encoding='utf-8'?>
<TestManifest name='HSO-DTA-Testing-Intro'>
    <SharedSetup>
        <DataFile ID='CustomersV3' name='CustomersV3' assetType='Data package' lcsProjectId='1368270'/>
        <JobDefinition ID='GenericIntegrationTestDefinition'>
            <Operation>Import</Operation>
            <SkipStaging>Yes</SkipStaging>
            <Truncate></Truncate>
            <Mode>Recurring batch</Mode>
            <BatchFrequencyInMinutes>1</BatchFrequencyInMinutes>
            <NumberOfTimesToRunBatch >2</NumberOfTimesToRunBatch>
            <UploadFrequencyInSeconds>1</UploadFrequencyInSeconds>
            <TotalNumberOfTimesToUploadFile>1</TotalNumberOfTimesToUploadFile>
            <SupportedDataSourceType>Package</SupportedDataSourceType>
            <ProcessMessagesInOrder>No</ProcessMessagesInOrder>
            <PreventUploadWhenZeroRecords>No</PreventUploadWhenZeroRecords>
            <UseCompanyFromMessage>Yes</UseCompanyFromMessage>
            <LegalEntity>DAT</LegalEntity>
        </JobDefinition>
        <EntitySetup ID='Generic'>
            <Entity name='*'>
                <SourceDataFormatName>Package</SourceDataFormatName>
                <ChangeTracking></ChangeTracking>
                <PublishToBYOD></PublishToBYOD>
                <DefaultRefreshType>Full push only</DefaultRefreshType>
                <ExcelWorkSheetName></ExcelWorkSheetName>
                <SelectFields>All fields</SelectFields>
                <SetBasedProcessing></SetBasedProcessing>
                <FailBatchOnErrorForExecutionUnit>No</FailBatchOnErrorForExecutionUnit>
                <FailBatchOnErrorForLevel>No</FailBatchOnErrorForLevel>
                <FailBatchOnErrorForSequence>No</FailBatchOnErrorForSequence>
                <ParallelProcessing>
                    <Threshold></Threshold>
                    <TaskCount></TaskCount>
                </ParallelProcessing>               
                <!-- <MappingDetail StagingFieldName='devLNFN' AutoGenerate='Yes' AutoDefault='No' DefaultValue='' IgnoreBlankValues='No' TextQualifier='No' UseEnumLabel='No'/> -->
            </Entity>
        </EntitySetup>
    </SharedSetup>
    <TestGroup name='Manage Integration Test for Entities'>
        <TestCase Title='Adding New Customer  via Integration' ID='AddNewCustomeViaIntegration' RepeatCount='1' TraceParser='on' TimeOut='20'>
            <DataFile RefID='CustomersV3' />
            <JobDefinition RefID='GenericIntegrationTestDefinition'/>
            <EntitySetup RefID='Generic' />
        </TestCase>
    </TestGroup>
</TestManifest>                   

Upload Manifest to Dynamics 365 UO

1. Log into the D365FO portal.

2. Navigate to Data management from the workspace.

3. Navigate to DTA by clicking on Data task automation.

4. Click Load tasks from file in the top left corner.

5. Select and upload the manifest XML.

6. The upload will list all the tasks defined in the manifest. Select All and click Run.

7. This will open a dialog; click “Click here to connect to Lifecycle Services”, which should result in success.

8. Then click OK.
9. The tasks will start running and the outcome of the test cases will be shown once the run is complete.

10. Select the task/test case and click “Show validation results” to see the detailed results of the test case.

Summary of Data Task Automation

  • The DTA is a great way to automate the test cases during development.
  • It provides a repeatable mechanism to test data entities using data packages.
  • The business logic associated with the entities is executed during DTA.
  • DTA also helps with performance baselining and regression testing.
  • It still requires improvements, especially in the validation phase. The outcome only says whether the record has been added successfully; it doesn’t say anything about the integrity of the data.
  • The DTA test cases are not yet integrated with DevOps, but they are already part of ALM.

Azure Integration: Connecting to an On-Premise web service and file system from a Logic App

This blog entry describes the Azure Integration approach to connect to on-premise web services using the on-premises data gateway, Azure Logic Apps and a Logic Apps custom connector. It also describes an approach to read files from on-premise using Azure Integration.

Use case: Azure Integration for OnPremise

Consuming an on-premise web service in the Azure integration service. There are a number of cases where enterprises want to connect their on-premise environments to the cloud and expose data via a web service. This blog describes a method to consume an on-premise web service in Azure Logic Apps and to read files from the on-premise file system.

Azure Gateway LogicApp: Connecting to an OnPremise webservice and file system from a Logic App

The components used are:

  • Custom webservice on onPrem
  • OnPremise Data Gateway
  • Logic Apps custom connector
  • Logic Apps

The on-premise web service can be a REST or a WCF web service. The communication between on-premise and the cloud happens via the on-premises data gateway. Steps to configure and install the data gateway are described here.

Export Webservice collection using Postman for Azure Integration

We need to create a Postman collection of our web services for creating a custom Logic Apps connector. Detailed steps and additional information can be found here.

  • In Postman, on the Builder tab, select the HTTP method, enter the request URL for the API endpoint, and select an authorization protocol, if any.
  • Enter key-value pairs for the request header. For common HTTP headers, you can select from the dropdown list.
    • Content-Type: “application/json”
  • Enter content that you want to send in the request body. To check that the request works by getting a response back, choose Send.
  • Choose Save.
  • Under Save Request, provide a request name and a request description. The custom connector uses these values for the API operation summary and description.
Azure Gateway LogicApp: Showing the response from onPrem web service: Create Postman collection
  • Choose + Create Collection and provide a collection name. The custom connector uses this value when you call the API.
  • Above the response window, choose Save Response.
  • At the top of the app, provide a name for your example response, and choose Save Example.
  • Under Headers, hover over each header, and choose the X next to the header to remove it. Choose Save to save the collection again.
  • Choose the ellipsis (…) next to the collection, then choose Export.
  • Choose the Collection v1 export format, choose Export, then browse to the location where you want to save the JSON file.
Azure Gateway LogicApp: Showing the response from onPrem web service: Create Postman collection

Detailed steps and additional information can be found here

Create an Azure Integration Custom Logic App Connector

  1. Log on to https://portal.azure.com/ with the account used for Gateway registration
  2. Select the correct Subscription and search for Logic Apps Custom Connector
Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

3. Click on the Logic Apps Custom Connector and Click on ADD
4. Then fill in the following information: Correct Subscription, Resource group, Custom Connector name and location

Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

5. Click Review + Create and then Click on Create
6. Once it is created, then Go to the Resource and click on Edit
Then fill in the following information:
API Endpoint: REST (or SOAP, based on your service type)
Select Postman collection V1 and import the file downloaded from Postman.
This will fill in the information such as description, scheme, hostname and base URL. Ensure the information is correct; if not, edit it.
Make sure to select “Connect via on-premises data gateway” and optionally upload an image.

Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

7. Click on Security, then select the correct authentication scheme and click on Definition.
8. Validate the definition and click on “Update connector”.

Azure Gateway LogicApp: Showing the response from onPrem web service: Create Custom Logic App connector

Interact with an on-premise webservice in Azure Integration Logic App

  • Create A new Logic App with a start condition either with a scheduler or an HTTP trigger
  • Add an action and search for the custom connector, which would appear in the custom section
Azure Gateway LogicApp: Showing the response from onPrem web service: Select the OnPrem API connection
  • Select one of the actions from the custom connector. Each method in the webservice would appear as the actions here.
  • Selecting the action brings up a dialog to create the API connection required to communicate with the web service. Fill in the following information:
    Connection Name: A name to identify the API connection,
    Authentication Type : Auth type used by the web service
    Subscription: The subscription where Gateway is present
    Connection Gateway: Choose your on-premise gateway which you would want to use to connect to on-premise web service
    Then click on Create
Azure Gateway LogicApp: Showing the response from onPrem web service: Create the API connection
  • Enter the values required to call the web method.
Azure Gateway LogicApp: Showing the response from onPrem web service

Interact with an on-premise file system from Azure integration Logic Apps

  • Click on a new action in the Logic App, search for “File System” and select one of the available actions.
Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service: Select File system action
  • This brings up a dialog to create an API connection to the file system; fill it in and click Create.
Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service: Create File system API connection
  • Click Save and then run the Logic App. The run should show the results from the on-premise web service and the file system.
Azure Gateway LogicApp: Showing the results from Onprem file system and response from onPrem web service

Azure Integration: Installing and configuring On-premises data gateway

The on-premises data gateway provides quick and secure data transfer between on-premises data sources and Azure components. It acts as a bridge between on-premise and the Azure cloud. Currently, the gateway can be used from the following cloud services:

  • Azure Logic Apps
  • Power BI
  • Power Apps
  • Power Automate
  • Azure Analysis Services 

Azure Integration: How the on-premises data gateway works

  1. The query created by the cloud service, with the encrypted credentials for the on-premises data source (e.g. a call to a custom web service or a call to read a file from an on-premise server), is sent to a queue for the gateway to process.
  2. The gateway cloud service analyzes the query and pushes the request to the Azure Service Bus.
  3. The on-premises data gateway polls the Azure Service Bus for pending requests.
  4. The gateway gets the query, decrypts the credentials, and connects to the data source with those credentials.
  5. The gateway sends the query to the data source for execution.
  6. The results are sent from the data source, back to the gateway, and then to the gateway cloud service. The gateway cloud service then uses the results.
How Logic App on-premise data gateway works

Install data gateway

  1. Download and run the gateway installer on a local computer.
  2. After the installer opens, select Next.
  3. Select On-premises data gateway (recommended), which is standard mode, and then select Next.
Installer intro
  4. Review the minimum requirements, keep the default installation path, accept the terms of use, and then select Install.
Select gateway mode
  5. After the gateway successfully installs, provide the email address for your Azure account, and then select Next, for example:

Your gateway installation can link to only one Azure account.

  6. Select Register a new gateway on this computer > Next. This step registers your gateway installation with the gateway cloud service.
Sign in with work or school account
  7. Provide this information for your gateway installation:
    • A gateway name that’s unique across your Azure AD tenant
    • The recovery key, which must have at least eight characters, that you want to use
    • Confirmation for your recovery key
Register gateway

 Important: Save and keep your recovery key in a safe place. You need this key if you ever want to change the location, move, recover, or take over a gateway installation.

Check the region for the gateway cloud service and Azure Service Bus that’s used by your gateway installation. By default, this region is the same location as the Azure AD tenant for your Azure account.

To accept the default region, select Configure. However, if the default region isn’t the one that’s closest to you, you can change the region.

More information on Data Gateway can be found here

Register the on-premises gateway in Azure

  1. Log on to https://portal.azure.com/ with the account used for Gateway registration
  2. Select the correct subscription and search for On-Premises Data Gateways

3. Click on On-Premises Data Gateways and fill in the required information.

The name of the on-premises gateway will appear under the installation name. If it does not appear, make sure the correct subscription is selected (the domain used for registration) and that the account has sufficient permissions.

4. Click on Create, and within a moment you will be able to use the on-premises gateway in the cloud.