I attended a Microsoft training a few weeks ago where I saw a very cool IoT demo, so I decided to buy my own MXChip IoT DevKit to try it out.
Scenario: using your IoT device's temperature sensor, you want to see the temperature in real time on a Power BI dashboard, and when the temperature reaches a certain threshold, you want to raise a purchase order in FinOps for a bottle of water.
- First, you will have to connect your MXChip IoT DevKit to Azure IoT Hub
- I had to make a small code change in the Get Started sample to update the threshold to 20 degrees (it's winter down here) and to always send temperature and humidity even when the values have not changed since the last message (Power BI was not seeing all fields before this change)
- Once that is done, you should see temperature, humidity and a counter displayed on your device's OLED screen
- The next step is to configure a Stream Analytics job in Microsoft Azure to send your IoT messages to Power BI
- When it is successfully configured and running, you can create a new dashboard in Power BI using the Custom Streaming Data tile to get something like this
- Now that we can see the temperature in real time in Power BI, let's see how we can connect Logic Apps to Azure IoT Hub
- After creating your Service Bus queue, IoT Hub custom endpoint and routing rule, I only changed the Logic App to add the FinOps connector (creating the PO header first, then the PO line) between the trigger from the queue and the send email action
- Turn on your IoT device and put it behind your laptop to increase the temperature
- When it reaches 20 degrees, the message sent to IoT Hub will be routed to my Service Bus queue
- Then the Logic App will pick up the message from the queue and create a PO in FinOps
- And send an email with the PO number
- Finally, the bottle of water is delivered to my door 😉
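For reference, the Stream Analytics job described above only needs a simple pass-through query like the one below. This is a sketch: the input/output alias names and the eventtime column are my own, so adjust them to match your job. The IoT Hub routing rule, for its part, can filter on the temperatureAlert application property that the DevKit sample sets when the threshold is exceeded (e.g. temperatureAlert = "true").

```sql
-- Illustrative Stream Analytics query: forward device telemetry to Power BI.
-- [iothub-input] and [powerbi-output] are the input/output aliases defined on the job.
SELECT
    temperature,
    humidity,
    EventEnqueuedUtcTime AS eventtime
INTO
    [powerbi-output]
FROM
    [iothub-input]
```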
This is just a very simple scenario to show how you can easily interface an IoT device with D365 for Finance and Operations by leveraging the power of the Azure stack. In a real-life implementation, you will probably create a purchase requisition instead of a purchase order so that it can go through the PR workflow for approval.
One thing to note in my example: because my device sends messages every 5 seconds (configurable), a PO is created in FinOps every time the temperature is above 20 degrees, so I actually ended up with a lot of POs in FinOps!
That’s all for today and remember to stay hydrated 🙂
Scenario: you just went live with D365 Finance and Operations (e.g. application version 8.0) and you are planning to update to a new version (e.g. application version 8.1). Best practice is to have as many build servers as application versions you need to manage.
Unfortunately, the online documentation is not very clear regarding the steps to provision and configure a second build environment for your project:
For the build environment, create a new agent pool and assign it to the environment on the Advanced options screen.
In Azure DevOps, visit your existing Build Definition and ensure that it is not using your new agent pool for 8.1/10.0. This will keep your new build agent from trying to compile older application code.
When you provision your first build environment from LCS, it will automatically create and configure a build pipeline in Azure DevOps. However, when provisioning a second build environment with your new version, no build pipeline is created at all in Azure DevOps.
Here I am assuming that you have already provisioned your new build environment with your target application version (8.1) and a new agent pool, and created a new RELEASE branch (version 8.0) from your MAIN branch, which will become version 8.1.
For the existing build pipeline hooked to the MAIN branch, we want to make sure that it is executed on the new build environment (version 8.1):
- Edit your build pipeline, select ‘Run on agent’ on the left, then on the right-hand side under ‘Demands’, add a new line with Agent.Name equals <your build agent name> (you configured that name when provisioning the new build environment, e.g. BuildAgent2)
- Then save the pipeline
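As an aside, if you ever move these classic pipelines to YAML, the same agent targeting can be expressed as a pool demand. The pool and agent names below are examples, not the ones LCS created for you:

```yaml
# Pin the pipeline to a specific build agent via a demand
pool:
  name: MyAgentPool-81        # agent pool assigned to the 8.1 build environment
  demands:
  - Agent.Name -equals BuildAgent2
```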
Now we can clone this pipeline to create a new one for the RELEASE branch (version 8.0) and edit it:
- In ‘Get sources’, update the server path to the RELEASE branch
- In ‘Run on agent’, update the agent name to the previous agent pool from the original build environment (e.g. BuildAgent)
- In ‘Build the solution’, update the path of AXModulesBuild.proj to point to the RELEASE branch
- Then save the pipeline
You now have 2 automated builds running for 2 different application versions 🙂
If you cannot remember your agent pool names, you can review them in Azure DevOps > Project settings > Agent pools.
Hope this helps.
Following my post from last year, Consuming Dynamics 365 for Finance and Operations OData services from .NET console application, I had to reconfigure my ODataConsoleApplication 4 months ago and ran into quite a few errors.
If you remember, in my previous post we had to open the ODataClient.tt file and update the MetadataDocumentUri variable to your organization's URL with /data/ appended. The problem when doing this with the new application version 8.0 is that you get an ugly error, Running transformation: System.InvalidCastException: Unable to cast object of type ‘Microsoft.OData.Edm.Csdl.CsdlSemantics.UnresolvedType’ to type ‘Microsoft.OData.Edm.IEdmCollectionType’, and the ODataClient1.cs file is not generated… At the time of writing, I am still not sure whether it has been fixed.
Update: it does not seem to be fixed, but the issue is mentioned on docs.microsoft.com here, and there is a metadata validator available on GitHub to get a more detailed error when this happens.
Below are the steps to work around this issue.
- In your favorite web browser, open your D365 Finance and Operations organization’s URL and add “/data/$metadata” at the end
- Select all the content and paste it into a file that you save locally as metadata.xml
- Now open that file with Notepad++ as it is quite big (15 MB), look for <EntityType Name="ItemType"> and rename it to <EntityType Name="ItemTypes">
- Also update the entity type under EntitySet from Microsoft.Dynamics.DataEntities.ItemType to Microsoft.Dynamics.DataEntities.ItemTypes
- Now that the metadata.xml file is fixed, let's go back to Visual Studio and check the version of our Microsoft.OData.Client package; when I tested this 4 months back, I used version 7.4.4
- Then delete the ODataClient1.tt from your project
- And add a new OData client
- Edit the newly created ODataClient1.tt, but this time do not put the organization's URL in the MetadataDocumentUri variable; point it instead to the metadata.xml file that you saved locally, e.g. file:///C:/D365FO/metadata.xml
- You should get 6 compile errors
- For the methods ending with Changing or Changed, you only need to remove the space character
- Then there should be one error left for the EdmxReader; double-click on the error and it will open the code
- Replace EdmxReader with CsdlReader
- The ODataClient1.cs file should now be generated successfully and you should be able to run the sample code from the ODataConsoleApplication, yay!!!
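To make the metadata edit from the steps above concrete, this is the rename in metadata.xml. It is a trimmed illustration: the real elements contain many more attributes and children, and your EntitySet name may differ.

```xml
<!-- Before: the entity type name clashes during code generation -->
<EntityType Name="ItemType">
  <!-- properties omitted -->
</EntityType>
<EntitySet Name="ItemTypes" EntityType="Microsoft.Dynamics.DataEntities.ItemType" />

<!-- After: renamed to ItemTypes, with the EntitySet updated to match -->
<EntityType Name="ItemTypes">
  <!-- properties omitted -->
</EntityType>
<EntitySet Name="ItemTypes" EntityType="Microsoft.Dynamics.DataEntities.ItemTypes" />
```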
Hope this helps.
Following my post earlier this year about Tech Conference 2017, here is more information regarding the Prospect-to-Cash integration between Dynamics 365 for Sales and Dynamics 365 for Finance and Operations:
After attending Tech Conference this year in Seattle, I decided to try consuming Dynamics 365 for Finance and Operations data entities through OData services from a .NET application (it was about time…). As you may know, this replaces the old Dynamics AX 2012 AIF document services (AXDs).
I found a few links on the Internet, but it was not that straightforward to configure and implement, so I thought I would share the steps here 🙂
Azure app configuration
- Sign in to the Azure Portal with your O365 account which has access to D365 for Finance and Operations.
- Select Azure Active Directory icon on the left-hand side and App registrations then click New application registration.
- Enter your app name, select application type Native, enter your organization’s URL in Redirect URI (e.g. https://demoXXXaos.cloudax.dynamics.com) and then click Create.
- In the Search by name or AppId field, enter your app name (or part of it) to filter the list of apps, and select yours from the list.
- Now let's create a new key for the registered app. Enter a key description and duration, and click Save. A key value will be generated automatically; save it so that we can use it later in the Visual Studio project. Also save the Application ID for later.
- Next you will need to add required permissions for Dynamics 365 for Financials.
- Tick the Access as the signed-in user delegated permission and click Select.
- Verify that you can see the Dynamics 365 for Financials delegated permission.
D365 for Finance and Operations configuration
- Open your Dynamics 365 for Finance and Operations in your preferred web browser.
- In System administration > Setup > Azure Active Directory applications, click the New button to create a new record.
- Enter your Azure Application Id into the Client Id field.
- Enter your Azure application name into the Name field.
- Select Admin in User ID field.
- Click the Save button.
OData client configuration
- Download the codebase from Microsoft in GitHub here.
- Start a new instance of Visual Studio 2015 and open the ServiceSamples solution.
- As suggested in the first link under References, I could not simply update the MetadataDocumentUri variable in the ODataClient.tt file and save it to re-generate ODataClient.cs.
- Instead I had to delete the ODataClient.tt and ODataClient.ttinclude and re-create the OData client.
- Right-click the ODataUtility project > Add > New Item, search for ‘OData’ under Online and rename the item to ODataClient.tt (if you have already installed it, it will appear under Installed on the left-hand side).
- In Solution Explorer, open the ODataClient.tt file that you just added and update the MetadataDocumentUri variable to your organization's URL with “/data/” appended, for example “https://demoXXXaos.cloudax.dynamics.com/data/”.
- Then save it to generate the proxy classes (might take a few min).
- You should be able to see proxy classes in the ODataClient.cs.
- Still in the ServiceSamples solution in Visual Studio 2015, open ClientConfiguration.cs under the AuthenticationUtility project.
- Update all variables under OneBox:
- UriString with your organization’s URL
- UserName with your O365 account that has access to Dynamics 365 for Finance and Operations
- Password with your password
- ActiveDirectoryResource, same as your organization’s URL but without the trailing “/”
- ActiveDirectoryTenant with your Azure AD tenant, e.g. https://login.microsoftonline.com/yourtenant.onmicrosoft.com
- ActiveDirectoryClientAppId with your Azure application Id
- ActiveDirectoryClientAppSecret with your Azure key value
- Save your changes.
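Put together, the OneBox entry should end up looking something like this. All values below are placeholders, and the property names follow the AuthenticationUtility sample as it was at the time I tested it:

```csharp
public static ClientConfiguration OneBox = new ClientConfiguration()
{
    // Your D365 for Finance and Operations environment
    UriString = "https://demoXXXaos.cloudax.dynamics.com/",
    UserName = "admin@yourtenant.onmicrosoft.com",
    Password = "<your password>",

    // Same as UriString but without the trailing '/'
    ActiveDirectoryResource = "https://demoXXXaos.cloudax.dynamics.com",
    ActiveDirectoryTenant = "https://login.microsoftonline.com/yourtenant.onmicrosoft.com",
    ActiveDirectoryClientAppId = "<your Azure application Id>",
    ActiveDirectoryClientAppSecret = "<your Azure key value>",
};
```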
Running the ODataConsoleApplication with simple CRUD operations for asset major types
- Select the ODataConsoleApplication project in Solution Explorer, right-click > Set as Startup Project.
- Open the Program.cs file under ODataConsoleApplication.
- Comment out all lines except SimpleCRUDExamples.SimpleCRUD(context).
- Add 3 breakpoints in the SimpleCRUDExamples() method on the d365.SaveChanges() calls at lines 23, 34 and 43.
- Run the code and, at the first breakpoint, have a look in Dynamics 365 for Finance and Operations under Fixed assets > Setup > Fixed asset attributes > Major types and check that Test01 has not been created yet.
- Then back in Visual Studio, press F10 to step over and refresh your browser: you can now see that the new Test01 major type has been created.
- Back in Visual Studio again, press F5 to go to the next breakpoint and then F10 to step over (almost there!!!).
- Check the asset major types again: the description field is now updated from Description of Test01 to Updated description.
- Press F5 twice to finish the execution; the asset major type Test01 is deleted.
- Review the console output.
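In essence, the SimpleCRUD() walkthrough above performs the following against the generated proxy classes. This is a simplified sketch whose names mirror the sample, not production code:

```csharp
// 'context' is the Microsoft.Dynamics.DataEntities.Resources container
// generated by ODataClient.tt (authentication setup omitted here).
var majorType = new AssetMajorType
{
    MajorType = "Test01",
    Description = "Description of Test01"
};

// Create
context.AddToAssetMajorTypes(majorType);
context.SaveChanges();   // breakpoint 1: Test01 appears under Major types

// Update
majorType.Description = "Updated description";
context.UpdateObject(majorType);
context.SaveChanges();   // breakpoint 2: description updated

// Delete
context.DeleteObject(majorType);
context.SaveChanges();   // breakpoint 3: Test01 removed
```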
In summary, I have explained how to configure and consume simple CRUD operations for the asset major type data entity using OData services and a .NET application. We used the asset major type as an example, but you can use any standard data entity, or even a custom data entity, as long as its IsPublic property is set to Yes.
I did not cover the OData query options or the save-changes options, which are quite powerful. In addition, feel free to have a look at the other projects for JSON, SOAP and tests; I am sure they are worth a try.
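To give a taste of the query options, LINQ against the generated DataServiceQuery translates directly into OData query strings. This is illustrative only; the filter value is made up:

```csharp
// Translates to .../data/AssetMajorTypes?$filter=startswith(MajorType,'Test')&$top=10
var testTypes = context.AssetMajorTypes
    .Where(t => t.MajorType.StartsWith("Test"))
    .Take(10)
    .ToList();
```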
I hope you will find this post helpful and I look forward to any feedback or question you may have.
Also, it would be interesting to compare data entity features with the AX2012 AIF document services. For example, what are the equivalents of the prepareForSave(), processingRecord()… methods? How do you implement external codes for items, customers… as it was previously configuration-only in AX2012? How do you trigger a packing slip posting after importing multiple lines?
References
- Consume the OData API from an external console application
- OData services endpoint
- Register a native application with AAD
- Microsoft Dynamics AX Integration: A Customer Perspective (if you have access to Microsoft Dynamics Learning Portal)
I was lucky enough to attend the Dynamics 365 Tech Conference a couple of weeks ago, and there were some really exciting announcements for the Spring Release during the opening keynote from Sri Srinivasan:
- On premises for Q2 CY 2017 and hybrid (cloud + edge) deployment options for H2 CY 2017,
- Data upgrade process for AX2012 to Dynamics 365 for Operations,
- Dynamics 365 for Sales and Operations Prospect-to-Cash integration,
- Big focus on application extensions as core models to be hard sealed by Fall Release 2017/Spring Release 2018,
- Dynamics 365 Integration Platform leveraging Common Data Services (CDS),
- Embedded Power BI dashboards within the Dynamics 365 for Operations web client
Stay tuned, more to come…