Asynchronous D365 Fin & Ops OData action with Power Automate

Hi! It has been a while since my last post 🙂

I recently had a 2-minute gateway timeout issue with my custom OData action called from Power Automate. The timeout was causing Power Automate to retry the OData action, hence calling my action multiple times.

The Fin & Ops connector's execute action in Power Automate has settings to configure a timeout for the asynchronous pattern, but from what I understand this is not supported by Fin & Ops.

Fin & Ops OData and custom services are synchronous integration patterns and return a 200 response when successful (not 202). By default, the timeout for synchronous requests in Power Automate is 2 minutes and it cannot be changed 😦

Similar to what Microsoft does with ImportFromPackage and GetExecutionSummaryStatus in the batch data API, you can have your OData action call a class method asynchronously and then check a status or a flag from Power Automate in a do-until loop.

In my example below, I'm calling the execute action with the unique identifier of a record, which then calls myMethod in myClass asynchronously. myMethod will retrieve the record using the unique identifier, execute some processes, and finally update a flag on the record to mark it as completed.

Then after the execute action in Power Automate, I can check the record flag in a do-until loop every 30 seconds for 10 minutes. The loop exits when the flag is Yes, or when the count limit or the timeout is reached.

OData action code below. The Global::runAsync method only takes a container of parameters. I didn't need a callback class here as this is triggered from outside of Fin & Ops.

public class myCustomEntity extends common
{
    [SysODataActionAttribute('PostProcessRunAsync', false)]
    public static void PostProcessRunAsync(DataAreaId _dataAreaId, SalesIdBase _salesId)
    {
        // put the parameters in a container
        container parms = [_dataAreaId, _salesId];

        // execute myMethod asynchronously using the Global class
        Global::runAsync(classNum(myClass), staticMethodStr(myClass, myMethod), parms);
    }
}

myMethod code:

class myClass
{
    static void myMethod(container _con)
    {
        DataAreaId  dataAreaId  = conPeek(_con, 1);
        SalesIdBase salesId     = conPeek(_con, 2);

        // Run some process on the sales order using the salesId

        // Update the new custom flag to Yes on the sales order header
    }
}

That way, if your action runs for more than 2 minutes, you bypass the gateway timeout. I found this approach much easier and faster to implement than trying to leverage ImportFromPackage in the batch data API, as that requires a file to be generated in Power Automate beforehand.
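The do-until pattern on the Power Automate side can be sketched as a generic polling loop. Here is a minimal sketch in Python, where check_flag is a hypothetical callable standing in for the OData read of the record's completion flag (the 30-second interval and 10-minute timeout are the values from the example above):

```python
import time

def wait_for_completion(check_flag, interval_sec=30, timeout_sec=600):
    """Poll a completion flag, mirroring the Power Automate do-until loop:
    check every 30 seconds for up to 10 minutes.
    Returns True once the flag is set, False on timeout."""
    deadline = time.monotonic() + timeout_sec
    while time.monotonic() < deadline:
        if check_flag():
            return True
        time.sleep(interval_sec)
    return False
```

The key point is that the long-running work happens server side in the async method, while the client only ever issues short synchronous requests that stay well under the 2-minute gateway limit.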


Reference: Synchronous and asynchronous operations in AX 7 – Sertan's .dev blog


Making fields editable in a read only form using event handlers

Requirement: make a custom field editable in the read-only invoice journal form.

By default the Tax invoice journal form is read only; there are obviously a lot of reasons why it is designed this way. But sometimes you actually need to be able to edit custom fields in the form (I had to implement this requirement a couple of times). It is not overly complex, but there are a few things to know/consider:

  1. In CustInvoiceJour table extension, add your custom field e.g. VNGCustomField
  2. In CustInvoiceJournal form extension, add the custom field,
  3. In a handler class, change your custom field’s form datasource allowEdit property to true,
  4. Using method allowEditFieldsOnFormDS_W, set all form datasource fields to allowEdit No,
  5. Enable allowEdit Yes for the custom field only,
  6. The custom field will still be read only, so we need to force the form design ‘View Edit Mode’ property to Edit instead of View,
  7. Finally, I was getting a validation error on the custTable.taxGroup field. Because the ‘View Edit Mode’ changed to Edit, the validateWrite method on custTable was triggered. As the form datasource ‘Only Fetch Active’ property is Yes, it is only selecting the partyId. I had to force the custTable datasource to select all fields instead.

You can use this on any form.

And here is the code:

class VNGCustInvoiceJournalFrm_EventHandler
{
    /// <summary>
    /// Making the custom field editable
    /// </summary>
    /// <param name="sender"></param>
    /// <param name="e"></param>
    [FormEventHandler(formStr(CustInvoiceJournal), FormEventType::Initialized)]
    public static void CustInvoiceJournal_OnInitialized(xFormRun sender, FormEventArgs e)
    {
        // change design view edit mode from view to edit
        sender.design().viewEditMode(ViewEditMode::Edit);

        // get form datasource
        FormDataSource custInvoiceJour_ds = sender.dataSource(formDataSourceStr(CustInvoiceJournal, CustInvoiceJour));

        // allow edit on the datasource
        custInvoiceJour_ds.allowEdit(true);

        // set all datasource fields to non editable
        allowEditFieldsOnFormDS_W(custInvoiceJour_ds, false);

        // allow edit on custom field only
        custInvoiceJour_ds.object(fieldNum(CustInvoiceJour, VNGCustomField)).allowEdit(true);
    }

    /// <summary>
    /// The form has custTable datasource with only fetch active = true, therefore only partyId is selected.
    /// When it goes in the custTable.validateWrite() it will fail as taxGroup is blank.
    /// Need to change the query to select all fields from CustTable
    /// </summary>
    /// <param name="sender"></param>
    /// <param name="e"></param>
    [FormDataSourceEventHandler(formDataSourceStr(CustInvoiceJournal, CustTable), FormDataSourceEventType::Initialized)]
    public static void CustTable_OnInitialized(FormDataSource sender, FormDataSourceEventArgs e)
    {
        Query query = sender.query();
        QueryBuildDataSource qbds = query.dataSourceTable(tableNum(CustTable));

        // select all fields instead of only the active ones
        qbds.fields().dynamic(true);
    }
}

Until next time!

D365 Finance and Operations & IoT


I attended a Microsoft training a few weeks ago where I saw a very cool demo with IoT, so I decided to buy my own MXChip IoT DevKit to try it out.

Scenario: using your IoT device's temperature sensor, you want to see the temperature in real time in a Power BI dashboard, and then when the temperature reaches a certain threshold, you want to raise a purchase order in FinOps for a bottle of water.


  1. First, you will have to connect your MXChip IoT DevKit to Azure IoT Hub
  2. I had to make a small code change in the get-started sample to update the threshold to 20 degrees (as it's winter down here) and also to always send temperature and humidity even when the values have not changed since last time (Power BI was not seeing all fields before I changed it)
  3. Once done, you should see temperature, humidity and a counter displayed on your device's OLED screen
  4. The next step is to configure a Stream Analytics job in Microsoft Azure to send your IoT messages to Power BI
  5. When it is successfully configured and running, you can create a new dashboard in Power BI using the Custom Streaming Data tile
  6. Now that we can see the temperature in real time in Power BI, let's see how we can connect Logic Apps to Azure IoT Hub
  7. After creating your Service Bus queue, IoT Hub custom endpoint and a routing rule, I only changed the Logic App to add the FinOps connector to create the PO header first and then the PO line, between the trigger from the queue and the send email action


  1. Turn on your IoT device and put it behind your laptop to increase the temperature
  2. When it reaches 20 degrees, the message sent to IoT Hub will be routed to my Service Bus queue
  3. Then the Logic App will pick up the message from the queue and create a PO in FinOps (PO00000156)
  4. Send an email with the PO number
  5. Finally, the bottle of water is delivered to my door 😉


This is just a very simple scenario to show how you can easily interface an IoT device with D365 for Finance and Operations by leveraging the power of the Azure stack. In a real-life implementation, you will probably have to create a purchase requisition instead of a purchase order so that it can go through PR workflow for approval.

One thing to note in my example: because my device sends messages every 5 seconds (configurable), every time the temperature is above 20 degrees it creates a PO in FinOps, so I actually ended up with a lot of POs in FinOps!
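One way to avoid that flood of POs is to only act on the rising edge of the threshold, i.e. fire once when the temperature crosses 20 degrees and re-arm only after it drops back below. A minimal sketch in Python, where on_cross is a hypothetical stand-in for the Logic App / FinOps call:

```python
def make_threshold_trigger(threshold, on_cross):
    """Return a feed(temperature) callable that fires on_cross only when
    the value rises across the threshold, not on every reading above it."""
    state = {"above": False}

    def feed(temperature):
        if temperature >= threshold and not state["above"]:
            state["above"] = True
            on_cross(temperature)      # e.g. create the purchase order once
        elif temperature < threshold:
            state["above"] = False     # re-arm once we drop back below

    return feed
```

In the Azure pipeline this logic could live in the device firmware, the Stream Analytics query, or the Logic App itself; the sketch just shows the debouncing idea.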

That’s all for today and remember to stay hydrated 🙂

Configuring your second build pipeline in Azure DevOps for a new D365 Finance and Operations application version

Scenario: you just went live with D365 Finance and Operations (e.g. application version 8.0) and you are planning to update to a new version (e.g. application version 8.1). Best practice is to have as many build servers as the number of application versions you need to manage.

Unfortunately, online documentation is not very clear regarding steps to provision and configure a second build environment for your project:

For the build environment, create a new agent pool and assign it to this environment on the Advanced options screen.

In Azure DevOps, visit your existing Build Definition and ensure that it is not using your new agent pool for 8.1/10.0. This will keep your new build agent from trying to compile older application code.

When you provision your first build environment from LCS, it will automatically create and configure a build pipeline in Azure DevOps. However, when provisioning a second build environment with your new version, no build pipeline is created at all in Azure DevOps.

Here I am assuming that you have already provisioned your new build environment with your target application (version 8.1) and a new agent pool, and created a new RELEASE branch (version 8.0) from your MAIN branch, which will become version 8.1.

For the existing build pipeline hooked to the MAIN branch, we want to make sure that it is executed on the new build environment (version 8.1):

  1. Edit your build pipeline, select ‘Run on agent’ on the left, then on the right-hand side under ‘Demands’, add a new line with Agent.Name equals <your build agent name> (you configured that one when provisioning the new build environment, e.g. BuildAgent2)
  2. Then save the pipeline

Now we can clone this pipeline to create a new one for the RELEASE branch (version 8.0) and edit it:

  1. In ‘Get sources’, update the server path to the RELEASE branch
  2. In ‘Run on agent’, update the agent name to the previous agent pool from the original build environment (e.g. BuildAgent)
  3. In ‘Build the solution’, update the path for the AXModulesBuild.proj with the RELEASE branch
  4. Then save the pipeline

You now have 2 automated builds running for 2 different application versions 🙂

If you cannot remember your agent pool names, you can still review it in Azure DevOps > Project settings > Agent pools.


Hope this helps.

Fixing the InvalidCastException error with ODataClient in Visual Studio for Dynamics 365 for Finance and Operations application version 8.0

Following my post from last year, Consuming Dynamics 365 for Finance and Operations OData services from .NET console application, I had to reconfigure my ODataConsoleApplication and ran into quite a few errors 4 months ago.

If you remember, in my previous post, we had to open the file and update the MetadataDocumentUri variable to your organization's URL with /data/ added at the end. The problem when doing it with the new application version 8.0 is that you will get an ugly error: Running transformation: System.InvalidCastException: Unable to cast object of type ‘Microsoft.OData.Edm.Csdl.CsdlSemantics.UnresolvedType’ to type ‘Microsoft.OData.Edm.IEdmCollectionType’, and it will not generate the ODataClient1.cs file… At the time I’m writing this, I am still not sure whether it is fixed or not.

Update: it does not seem to be fixed, but the issue is mentioned here and there is a metadata validator available on GitHub to get a more detailed error when this happens.

Below are the steps to work around this issue.

  1. In your favorite web browser, open your D365 Finance and Operations organization’s URL and add “/data/$metadata” at the end
  2. Select all the content and paste it into a file that you will save locally as metadata.xml
  3. Now open that file with Notepad++ as it is quite big (15Mb), look for <EntityType Name=”ItemType”> and rename it to <EntityType Name=”ItemTypes”>
  4. Also update the entity type under EntitySet from Microsoft.Dynamics.DataEntities.ItemType to Microsoft.Dynamics.DataEntities.ItemTypes
  5. Now that the metadata.xml file is fixed, let’s go back in Visual Studio and check the version of our Microsoft.OData.Client; when I tested it 4 months back I used version 7.4.4
  6. Then delete the OData client files from your project
  7. And add a new OData client
  8. Edit the newly created file, but this time we will not put the organization’s URL in the MetadataDocumentUri variable but the metadata.xml file that you saved locally, e.g. File:///C:/D365FO/metadata.xml
  9. You should get 6 compile errors
  10. For the methods ending with Changing or Changed, you only need to remove the space character
  11. Then there should be one error left for the EdmxReader; double click on the error and it will open the code
  12. Replace EdmxReader with CsdlReader
  13. The ODataClient1.cs file should be successfully generated and you should be able to run the sample code from the ODataConsoleApplication, yay!!!

Hope this helps.



Consuming Dynamics 365 for Finance and Operations OData services from .NET console application

After attending Tech Conference this year in Seattle, I decided to try consuming Dynamics 365 for Finance and Operations data entities through OData services from a .NET application (it is about time…). As you may know, this is replacing the old Dynamics AX 2012 AIF document services (AXDs).

I found a few links on the Internet but it was not that straightforward to configure/implement, so I thought I would just share the steps here 🙂

Azure app configuration

  1. Sign in to the Azure Portal with your O365 account which has access to D365 for Finance and Operations.
  2. Select Azure Active Directory icon on the left-hand side and App registrations then click New application registration.
  3. Enter your app name, select application type Native, enter your organization’s URL in Redirect URI, and then click Create.
  4. In the Search by name or AppId field, enter your app name (or part of it) to filter the list of apps and select it from the list.
  5. Now let’s create a new key for the registered app. Enter a Key description, Duration and click Save. It will automatically generate a key value that you will need to save so that we can use it later in the Visual Studio project. Also save the Application ID for later.
  6. Next you will need to add required permissions for Dynamics 365 for Financials.
  7. Tick the Access as the signed-in user delegated permission and click Select.
  8. Verify that you can see the Dynamics 365 for Financials delegated permission.

D365 for Finance and Operations configuration

  1. Open your Dynamics 365 for Finance and Operations in your preferred web browser.
  2. In System administration > Setup > Azure Active Directory applications, click the New button to create a new record.
  3. Enter your Azure Application Id into the Client Id field.
  4. Enter your Azure Application name into the name field.
  5. Select Admin in User ID field.
  6. Click the Save button.

OData client configuration

  1. Download the codebase from Microsoft on GitHub here.
  2. Start a new instance of Visual Studio 2015 and open the ServiceSamples solution.
  3. As suggested in the first link in References, I could not simply update the MetadataDocumentUri variable in the file and save it to re-generate the ODataClient.cs.
  4. Instead I had to delete the file and ODataClient.ttinclude and re-create the OData client.
  5. Right click on the ODataUtility project > Add > New item, search for ‘OData’ in Online and rename it (if you already installed it, it will appear under Installed on the left-hand side).
  6. In Solution Explorer, open the file that you just added and update the MetadataDocumentUri variable to your organization’s URL with “/data/” at the end.
  7. Then save it to generate the proxy classes (might take a few min).
  8. You should be able to see proxy classes in the ODataClient.cs.

Client configuration

  1. Still in the ServiceSamples solution in Visual Studio 2015, open the ClientConfiguration.cs under the AuthenticationUtility project.
  2. Update all variables under OneBox:
    1. UriString with your organization’s URL
    2. Username with your O365 account with access to Dynamics 365 for Finance and Operations
    3. Password with your password
    4. ActiveDirectoryResource, same as your organization’s URL without the “/” at the end
    5. ActiveDirectoryTenant with your tenant Id
    6. ActiveDirectoryClientAppId with your Azure application Id
    7. ActiveDirectoryClientAppSecret with your Azure key value
  3. Save your changes.
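Under the hood, those ClientConfiguration values end up in an Azure AD (v1 endpoint) token request. A minimal sketch in Python that only builds the request the sample's AuthenticationUtility effectively performs; the dictionary keys here are illustrative names, not the sample's exact identifiers, and nothing is sent over the wire:

```python
def build_token_request(config):
    """Sketch of the AAD username/password (resource owner) token request
    behind the console sample's authentication."""
    endpoint = "https://login.windows.net/%s/oauth2/token" % config["TenantId"]
    body = {
        "grant_type": "password",                      # username/password flow
        "resource": config["UriString"].rstrip("/"),   # org URL, no trailing slash
        "client_id": config["ClientAppId"],
        "username": config["Username"],
        "password": config["Password"],
    }
    return endpoint, body
```

This also shows why the resource value must not have a trailing slash: AAD matches it against the registered resource URI exactly.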

Running the ODataConsoleApplication with simple CRUD operations for asset major types

  1. Select ODataConsoleApplication project in Solution Explorer, Right click > Set as Startup Project.
  2. Open the Program.cs file under ODataConsoleApplication
  3. Comment all lines except SimpleCRUDExamples.SimpleCRUD(context).
  4. Add 3 breakpoints in SimpleCRUDExamples() method for d365.SaveChanges() lines 23, 34 and 43.
  5. Run the code and on the first breakpoint have a look in Dynamics 365 for Finance and Operations, Fixed assets > Setup > Fixed asset attributes > Major types, and check that Test01 is not created yet.
  6. Then back in Visual Studio, press F10 to step over and refresh your browser; you can now see the new Test01 major type is created.
  7. Back in Visual Studio again, press F5 to go to the next breakpoint and then F10 to step over (almost there!!!).
  8. Check asset major types: the description field is now updated from Description of Test01 to Updated description.
  9. Press F5 twice to finish the execution, asset major type Test01 is deleted.
  10. Review the console output.
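For reference, the SimpleCRUD calls above boil down to plain OData HTTP requests. A minimal sketch in Python that only builds the method/URL/payload for each step; the AssetMajorTypes entity set and field names are assumptions for illustration, and nothing is sent over the wire:

```python
def crud_requests(base_url, major_type, description):
    """Build the create/update/delete OData requests that roughly mirror
    the SimpleCRUD sample for asset major types."""
    base = base_url.rstrip("/") + "/data"
    key = "AssetMajorTypes(MajorType='%s')" % major_type
    return [
        ("POST",   base + "/AssetMajorTypes",
         {"MajorType": major_type, "Description": description}),   # create Test01
        ("PATCH",  "%s/%s" % (base, key),
         {"Description": "Updated description"}),                  # update it
        ("DELETE", "%s/%s" % (base, key), None),                   # delete it
    ]
```

The DataServiceContext in the C# sample hides these requests behind AddObject/UpdateObject/DeleteObject plus SaveChanges, which is why each breakpoint on d365.SaveChanges() corresponds to one of them.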


In summary, I have explained how to configure and consume simple CRUD operations for the asset major type data entity using OData services and a .NET application. Obviously we used the asset major type as an example, but you can use any standard data entity, or even your own custom data entity as long as the isPublic property is set to Yes.

I did not cover the OData query options or save changes options, which are quite powerful. In addition, feel free to have a look at the other projects for JSON, SOAP and tests; I am sure they are worth a try.

I hope you will find this post helpful and I look forward to any feedback or question you may have.

Also, it would be interesting to compare data entity features with the AX2012 AIF document services. For example, what are the equivalents of the prepareForSave(), processingRecord()… methods? How do you implement external codes for items, customers… as it was previously configuration-only in AX2012? How do you trigger a packing slip posting after importing multiple lines?


  1. Consume the OData API from an external console application
  2. OData services endpoint
  3. Register a native application with AAD
  4. Microsoft Dynamics AX Integration: A Customer Perspective (if you have access to Microsoft Dynamics Learning Portal)

Dynamics 365 for Operations – Spring 2017 Release

I was lucky enough to attend the Dynamics 365 Tech Conference a couple of weeks ago and there were some really exciting announcements for the Spring Release during the opening keynote from Sri Srinivasan:

  • On premises for Q2 CY 2017 and hybrid (cloud + edge) deployment options for H2 CY 2017,
  • Data upgrade process for AX2012 to Dynamics 365 for Operations,
  • Dynamics 365 for Sales and Operations Prospect-to-Cash integration,
  • Big focus on application extensions as core models to be hard sealed by Fall Release 2017/Spring Release 2018,
  • Dynamics 365 Integration Platform leveraging Common Data Services (CDS),
  • Embedded Power BI dashboards within the Dynamics 365 for Operations web client

Stay tuned, more to come…