Wednesday, December 20, 2017

D365 task recorder - capture screenshots

To enable the "capture screenshots" function in the D365 task recorder, you have to use Chrome and install the “D365 for Finance and Operations Task Recorder” Chrome extension.

Unfortunately, this is not yet mentioned in the Dynamics 365 for Finance and Operations documentation.

Wednesday, December 13, 2017

How to increase ItemId size in D365

You will need to increase the length of the ItemIdBase and EcoResProductNumber extended data types.

The EcoResProductNumber data type is used in one of the staging tables for a data entity.

Friday, November 3, 2017

How to add a new operating unit type

For AX2012, you can follow the steps in this article: https://msdn.microsoft.com/en-us/library/gg989762.aspx
A couple of important points:

  1. The new enum value must use the immediate next number for that enum (you should not skip any number)
  2. The new enum name drives the name of the view that you need to create. AX will look for a view named with the "DimAttribute" prefix followed by the enum name.
    For example, if the new enum name is OMBranch then the view name must be DimAttributeOMBranch.

For D365, the steps are quite similar to AX2012; however, the view should also have a method called registerDimensionEnabledTypeIdentifier that registers the view name you just created. This then adds the new operating unit as a dimension type.

Keep in mind that in D365, there are three additional operating unit types (branch, rental location, region) that are part of the Fleet Management model, which is a sample model and doesn't get installed by default on tier 2 or production environments.

Wednesday, November 1, 2017

Management Reporter scripts

A few scripts that can be used to check the MR integration tasks:

Check all integration tasks that have been executed:

--List details about each DDM task: state, progress, last/next run time in local time, and the interval at which each runs
--StateType mapping used below: 3=Processing; 5=Complete; 7=Error
select CIG.[Description] 
, STK.[Name] 
, STS.[Progress] 
, CASE STS.[StateType]  
WHEN 3 THEN 'Processing' 
WHEN 5 THEN 'Complete' 
WHEN 7 THEN 'Error' 
END AS StateType
, DATEADD(minute, DATEDIFF(minute,GETUTCDATE(),GETDATE()), STS.[LastRunTime]) as LocalLastRunTime 
, DATEADD(minute, DATEDIFF(minute,GETUTCDATE(),GETDATE()), STS.[NextRunTime]) as LocalNextRunTime 
, CM.[ContinueOnRecordError] 
, STRG.[Interval] 
, CASE STRG.[UnitOfMeasure] 
WHEN 2 THEN 'Minutes' 
ELSE 'Seconds' 
END AS IntervalTiming
, CASE STRG.[IsEnabled] 
WHEN 1 THEN 'Enabled' 
ELSE 'Disabled' 
END AS NameStatus
from [Connector].[Map] CM with (nolock) 
inner join [Scheduling].[Task] STK with (nolock) on STK.[Id] = CM.[MapId] 
inner join [Scheduling].[TaskState] STS with (nolock) on STK.[Id] = STS.[TaskId] 
inner join [Connector].[IntegrationGroup] CIG with (nolock) on CIG.[IntegrationId] = STK.[CategoryId] 
inner join [Scheduling].[Trigger] STRG with (nolock) on STK.[TriggerId] = STRG.[Id] 
order by CIG.[Description], STK.[Name]


Check the task outcomes

select CIG.[Description], ST.[Name], SM.[Text], 
DATEADD(minute, DATEDIFF(minute,GETUTCDATE(),GETDATE()), SL.[StartTime]) as LocalStartTime, 
DATEADD(minute, DATEDIFF(minute,GETUTCDATE(),GETDATE()), SL.[EndTime]) as LocalEndTime, 
SL.[TotalRetryNumber], SL.[IsFailed], STT.[Name] as TaskType 
from [Scheduling].[Log] SL with (nolock) 
inner join [Scheduling].[Task] ST with (nolock) on SL.TaskId = ST.Id 
inner join [Scheduling].[Message] SM with (nolock) on SL.Id = SM.LogId 
inner join [Scheduling].[TaskType] STT with (nolock) on ST.TypeId = STT.Id 
inner join [Connector].[IntegrationGroup] CIG with (nolock) on CIG.[IntegrationId] = ST.[CategoryId] 
order by SL.[StartTime] desc


Check the current integration activities

SELECT sqltext.TEXT, 
req.session_id, 
req.status, 
req.command, 
req.cpu_time, 
req.total_elapsed_time 
FROM sys.dm_exec_requests req 
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext

Sunday, October 8, 2017

How to get the field type/size and mandatory fields out of data management import/export project

Whenever we do data migration tasks, there is always a need to know the field type/size of the data entities that will be used, and also to identify which fields are mandatory. Unfortunately, in Dynamics 365 for Operations there is no easy way to get this information from the data management workspace.

I found out that there are "legacy" DMF forms from AX2012 that were upgraded to Dynamics 365 for Operations; however, they are not exposed through menus or buttons. These particular forms contain the entity "attributes", and here are the steps to access them:

  1. Open D365O and in the url, replace the mi value with DMFDefinitionGroup
    Usually when you first open D365O, the url will look like
    https://<axurl>/?cmp=USMF&mi=DefaultDashboard
    This needs to be changed to
    https://<axurl>/?cmp=USMF&mi=DMFDefinitionGroup
  2. It will then show a form where you can select the import/export project (or back in AX2012 it was called DMF definition group). Select the record and click the "Entities" button
  3. Highlight one of the data entities in the project, and click the "Entity attributes" button
  4. You will then see a form that displays all the entity fields, showing the field type and size, as well as indicating which fields are mandatory


Friday, September 29, 2017

How to do simultaneous request calls using Postman collections

Newman is a command line collection runner for Postman. It allows you to run and test a Postman Collection directly from the command line. It is built with extensibility in mind so that you can easily integrate it with your continuous integration servers and build systems.

With Newman and Async, you can do simultaneous request calls which will be very helpful for performance tests.

Here's what you'll need to do to get it working:


  1. Install Newman by following the instructions on http://blog.getpostman.com/2015/04/09/installing-newman-on-windows/
    In my case, I didn’t have to install Python to get it working.
  2. Create/use a folder for the tests
  3. Export a Postman collection into a file
  4. Export a Postman environment that you want to invoke the calls to
  5. Open a command prompt in administrator mode on the folder from step 2
  6. Install the async and newman modules by typing npm install --save async newman (the script below loads newman as a local module)
  7. Create a .js file with the following content:

    var path = require('path'),
        async = require('async'), // https://www.npmjs.com/package/async
        newman = require('newman');

    var parametersForTestRun = {
      collection: path.join(__dirname, 'postman_collection.json'), // your collection
      environment: path.join(__dirname, 'postman_environment.json') // your environment
    };

    var parallelCollectionRun = function(done) {
      newman.run(parametersForTestRun, done);
    };

    // Runs the Postman collection three times, in parallel.
    async.parallel([
        parallelCollectionRun,
        parallelCollectionRun,
        parallelCollectionRun
      ],
      function(err, results) {
        if (err) {
          console.error(err);
          return;
        }

        results.forEach(function(result) {
          var failures = result.run.failures;
          console.info(failures.length ? JSON.stringify(failures, null, 2) :
            `${result.collection.name} ran successfully.`);
        });
      });

  8. Run the .js file from step 7 from a command prompt: node fileName.js
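
If you need more than three simultaneous runs, a small variation (a sketch reusing the parallelCollectionRun function from step 7) is to build the task array dynamically:

// Variation: run the collection N times in parallel (N = 10 here).
var N = 10;
async.parallel(new Array(N).fill(parallelCollectionRun), function(err, results) {
  // same result handling as in step 7
});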

Saturday, September 16, 2017

More on Postman functions

Continuing from my posts on Postman, here are some of the Postman functions that I use from time to time.


  1. To assign a value from the response to an environment variable:
    var json = JSON.parse(responseBody);
    postman.setEnvironmentVariable("bearerToken", json.access_token);


    In that example, it assigns the access_token from the response to a variable called bearerToken.
  2. Test conditions to decide the test result, making it easier to see which requests are failing and which are passing:
    tests["Status OK"] = responseCode.code == 201

    In this case, "Status OK" will be marked as failed if the response status code is NOT 201.
  3. Randomise an integer between certain values
    _.random(1,5)

    In this case, it will return any integer number between 1 and 5.
  4. Randomise an integer value
    {{$randomInt}}

    In this case, it will return a random integer number (Postman's $randomInt generates a value between 0 and 1000).
  5. When you create a collection of requests in Postman, you can then run them using the collection runner. By default it will go by the request order in the collection. You can use this function to direct the sequence if you need to deviate from the default order:
    postman.setNextRequest("call name");
    If you put that call in the "Tests" script, then after running the current call, it will run another call named "call name".

    postman.setNextRequest(null);

    The above will stop the execution.

    With this function you can basically call the same request more than once (for example to create multiple lines; see the sketch after this list).
    Please note that this function only works in the Collection Runner.
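
Here is a minimal sketch of that looping pattern, assuming a hypothetical request named "Create line" and a hypothetical counter variable called lineCount (this goes in the "Tests" script of that request):

// Hedged sketch: re-run the request named "Create line" five times, then stop.
// "Create line" and lineCount are hypothetical names for illustration.
var count = parseInt(postman.getEnvironmentVariable("lineCount") || "0", 10);
if (count < 5) {
    postman.setEnvironmentVariable("lineCount", String(count + 1));
    postman.setNextRequest("Create line"); // run this request again
} else {
    postman.clearEnvironmentVariable("lineCount"); // reset for the next run
    postman.setNextRequest(null); // stop the collection run
}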
In the next post, I will explain how to use Postman collection and environment data to do multiple concurrent calls, which will be very useful for performance tests.

Wednesday, August 30, 2017

Testing Dynamics 365 for Operations Recurring Integrations with Postman

Continuing on the last post, let's do some tests on the D365O recurring integrations with Postman.

Initial setup

We still need the "GetToken" call as before. Please check the previous post for information on how to set it up.

Next, in the D365O data management workspace, we need to create an import project and an export project. To keep it simple, let's use the "Customer groups" entity.

Create an import project as usual; you will need a sample import file. Once you have clicked the "Upload" button, click the "Create recurring data job" button at the top. You will then need to:

  • Specify a name
  • Specify the application ID, and tick the checkbox beside it
    (This is the client ID that we use in the GetToken call)
  • Click the "Set processing recurrence" and set the job recurrence
  • Click the "Set monitoring recurrence" and set the monitoring recurrence
  • Keep note of the job ID
  • Click OK
Next, create an export project as usual. After you click the "Add entity" button, click the "Create recurring data job" button at the top. You will then need to do the same as above.

Send a file to the recurring import job

Open Postman, and execute the "GetToken" call to get the access token. Then create a new POST call with this url https://<axurl>/api/connector/enqueue/<activity id>?entity=<entity name>

The activity ID is the job ID that was displayed when you created the recurring job.


Then click on the Body tab, choose Binary, and click the "Choose Files" button to choose the import file. After that just click the "Send" button.

If it's successful, it will return with a job ID and HTTP 200.

You can then query the status of the job by doing a GET call to this url: https://<axurl>/api/connector/jobstatus/<activity_id>?jobId=<job_id> 
The result will show you the job status, including the job start/completion date times and the execution logs.

Please note that you always need to add the authorisation and the access token in the message header with every call.
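
For scripted testing outside Postman, here is a minimal Node.js sketch of the enqueue call. This is a sketch under assumptions: it requires Node 18+ for the built-in fetch, and every argument passed at the bottom is a placeholder you must replace.

// Minimal sketch: send a file to the recurring import job (enqueue).
const fs = require('fs');

async function enqueueFile(axUrl, activityId, entityName, accessToken, filePath) {
  const url = `${axUrl}/api/connector/enqueue/${activityId}?entity=${encodeURIComponent(entityName)}`;
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${accessToken}` }, // token from the GetToken call
    body: fs.readFileSync(filePath) // the import file, sent as binary
  });
  console.log(response.status, await response.text()); // expect HTTP 200 and a job ID
}

// All arguments below are placeholders.
enqueueFile('https://<axurl>', '<activity_id>', '<entity name>', '<access_token>', 'customergroups.csv')
  .catch(console.error);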

Receive a file from the recurring export job

Open Postman, and execute the "GetToken" call to get the access token. Then create a new GET call with this url https://<axurl>/api/connector/dequeue/<activity id>
If it's successful, it will return with a download location and HTTP 200.

Then you'll need to create a new GET call and use the download location as the URL, and instead of clicking the "Send" button, click the "Send and Download" button. Postman will then display a dialog where you can select the location where you want the file to be saved.

After downloading the file, you should send an acknowledgement:
https://<axurl>/api/connector/ack/<activity_id>
{
  "CorrelationId": "<CorrelationId>",
  "PopReceipt": "<PopReceipt>",
  "DownloadLocation": "<DownloadLocation>"
}

The body (CorrelationId, PopReceipt, DownloadLocation) should use the values returned by the original dequeue call.

Please note that you always need to add the authorisation and the access token in the message header with every call.
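
Similarly, here is a minimal Node.js sketch of the dequeue/download/ack sequence. Again a sketch under assumptions: Node 18+ for fetch, and that the dequeue response carries the same CorrelationId, PopReceipt, and DownloadLocation properties used in the ack body above; all values passed in are placeholders.

// Minimal sketch: receive a file from the recurring export job and acknowledge it.
async function dequeueAndAck(axUrl, activityId, accessToken) {
  const headers = {
    'Authorization': `Bearer ${accessToken}`, // token from the GetToken call
    'Content-Type': 'application/json'
  };

  // Dequeue: returns the download location plus the correlation details.
  const dequeueResponse = await fetch(`${axUrl}/api/connector/dequeue/${activityId}`, { headers });
  const message = await dequeueResponse.json();

  // Download the exported file from the returned location.
  const file = await fetch(message.DownloadLocation, { headers });
  console.log('Downloaded', (await file.arrayBuffer()).byteLength, 'bytes');

  // Acknowledge, echoing the values from the dequeue response.
  const ack = await fetch(`${axUrl}/api/connector/ack/${activityId}`, {
    method: 'POST',
    headers,
    body: JSON.stringify({
      CorrelationId: message.CorrelationId,
      PopReceipt: message.PopReceipt,
      DownloadLocation: message.DownloadLocation
    })
  });
  console.log('Ack status:', ack.status);
}

dequeueAndAck('https://<axurl>', '<activity_id>', '<access_token>').catch(console.error);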

Thursday, August 17, 2017

Accessing Dynamics 365 for Operations ODATA services with Postman

Postman is another application that you can use to call ODATA services. It supports JavaScript scripting, which makes it more powerful than Fiddler for calling ODATA services.

As before, you will need an application registered through Azure portal so that you'll get a client ID with a client secret key, or a client ID with a username and password.

After that you'll need to download Postman from https://www.getpostman.com/

Open Postman, and if you're going to use your local D365O VM, then you'll need to go to the settings and turn off the "SSL certificate verification".

In Postman, you have Environments and Collections. An environment contains settings (e.g. credentials) for a particular environment (e.g. production, UAT, dev, or a local VM), while a collection contains the ODATA requests. The idea is that you can run requests from any collection against any environment that you have specified.

To create a new environment, click the gear icon on the top right side of the application, then choose the Manage Environments and click the Add button.
Give a name for the new environment, and click the Bulk Edit link. Then type in the environment variables that you want to save.
I use something like:
ClientID:<client_Id>
ClientSecret:<client_secret>
Resource:https://<axurl>
Tenant:https://login.windows.net/<tenant>
You don't have to use Bulk Edit; you can enter the same information in the table instead. However, it is usually faster to gather and format the information in Bulk Edit mode.

If you don't have a client secret key but have a username and password combination for the client ID, then you can use something like this:
ClientID:<client_Id>
Resource:https://<axurl>
Tenant:https://login.windows.net/<tenant>
username:<username>
password:<password>
Replace the values in angle brackets with valid values and click Add to save the new environment.

Get the access token

The next step is to get the access token for D365O. There is more than one way to do this, such as using Postman's Get New Access Token function; however, I haven't been able to make it work that way. It might be that you'll need to register a new application in the Azure portal and use the specific callback url. If you want to do it that way, this post might be helpful: https://community.dynamics.com/ax/b/dynamicsnavax/archive/2017/05/23/dynamics-365-for-operation-web-service-calls-with-postman

I'm going to do it the old way so that I don't have to do anything special in the Azure portal. In the main section, make sure you can see a new tab; otherwise click the plus sign.

Set the action to POST and the url to {{Tenant}}/oauth2/token. Then, in the Body tab, select x-www-form-urlencoded and add these key/value pairs: grant_type=client_credentials, client_id={{ClientID}}, client_secret={{ClientSecret}}, and resource={{Resource}} (the same parameters used for the token call in the Fiddler post). Please note that anything in double curly brackets refers to a Postman variable; in this case, they all refer to the variables that we saved in the environment.
Please also note that we use "No auth" in the Authorization tab.
You could click the Send button now and get the access token; however, to make things easier for the next request, go to the Tests tab and enter this code:
var json = JSON.parse(responseBody);
postman.setEnvironmentVariable("bearerToken", json.access_token);
This pretty much instructs Postman to get the access_token value from the result and store it as an environment variable called bearerToken.

Click the Send button, and make sure that it returns the result with status HTTP 200. After that, click the eye icon (the one before the gear icon) and check whether the bearerToken has been added there.

You might want to save this request, and add it into a new collection.

Now that we have the access token, let's do some calls to the LedgerJournalHeaders service, similar to what we did in Fiddler before.

Get ledger journal headers

Create a new request, set the action to GET and enter this as the url: {{Resource}}/data/LedgerJournalHeaders
Then go to the Headers tab, and add a new entry with Authorization as the key, and set the value to Bearer {{bearerToken}}

Click the Send button and you'll get the result back. That's quite easy, isn't it?
Any time the token expires, just open the GetToken request that you saved before and run it again.
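
As an alternative to re-running GetToken manually, here is a sketch of a Pre-request Script that fetches a fresh token before each call. It assumes the environment variables defined earlier and Postman's pm.sendRequest scripting API (a newer API than the legacy postman.* functions used above):

// Sketch: fetch a fresh token before each request, using the environment
// variables (Tenant, ClientID, ClientSecret, Resource) defined earlier.
pm.sendRequest({
    url: pm.environment.get('Tenant') + '/oauth2/token',
    method: 'POST',
    header: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: {
        mode: 'urlencoded',
        urlencoded: [
            { key: 'grant_type', value: 'client_credentials' },
            { key: 'client_id', value: pm.environment.get('ClientID') },
            { key: 'client_secret', value: pm.environment.get('ClientSecret') },
            { key: 'resource', value: pm.environment.get('Resource') }
        ]
    }
}, function (err, res) {
    if (!err) {
        pm.environment.set('bearerToken', res.json().access_token);
    }
});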

Insert a new ledger journal header

Create a new request, set the action to POST, and enter this as the url: {{Resource}}/data/LedgerJournalHeaders
Then go to the Headers tab, and add a new entry with Authorization as the key, and set the value to Bearer {{bearerToken}}
Next go to the Body tab and select raw, change the Text selection to JSON (application/json), and enter this:
{
"dataAreaId":"USMF",
"JournalName":"GenJrn",
"Description":"Test journal"
}
Click the Send button and you'll get the result back.

Update a ledger journal header

Create a new request, set the action to PATCH, and enter this as the url: {{Resource}}/data/LedgerJournalHeaders(JournalBatchNumber='<journalNumber>',dataAreaId='<dataAreaId>')
Replace the journalNumber and dataAreaId with valid values.

Then go to the Headers tab, and add a new entry with Authorization as the key, and set the value to Bearer {{bearerToken}}
Next go to the Body tab and select raw, change the Text selection to JSON (application/json), and enter this:
{
"dataAreaId":"USMF",
"Description":"Edited journal description"
}
Click the Send button; if you get status HTTP 204, the request was successful.

Please note that D365O by default will get the values from the integration user's default company, regardless of the dataAreaId that you specify on the call. If you need to access a record in a different company, you'll need to add ?cross-company=true to the url (e.g. {{Resource}}/data/LedgerJournalHeaders?cross-company=true).


Delete a ledger journal header

Create a new request, set the action to DELETE, and enter this as the url: {{Resource}}/data/LedgerJournalHeaders(JournalBatchNumber='<journalNumber>',dataAreaId='<dataAreaId>')
Replace the journalNumber and dataAreaId with valid values.

Then go to the Headers tab, and add a new entry with Authorization as the key, and set the value to Bearer {{bearerToken}}
Click the Send button; if you get status HTTP 204, the request was successful.


Friday, August 4, 2017

Accessing Dynamics 365 for Operations ODATA services with Fiddler

Before you start, you will first need to register a new application through the Azure portal to get the client ID and the client secret key.

After that, download and install Fiddler from http://www.telerik.com/fiddler

Open Fiddler and go to the Composer [tab] and Options [tab]
Enable "Inspect session" and "Fix content-length header"

Then open the Scratchpad [tab] and paste this

POST https://login.windows.net/<tenant>/oauth2/token HTTP/1.1 
Content-Type: application/x-www-form-urlencoded
Host: login.windows.net

resource=https://<axurl>&client_id=<client-id>&client_secret=<client-secret>&grant_type=client_credentials

Replace the <tenant>, <axurl>, <client-id> and <client-secret> with the valid values, then select/highlight the statements and click the Execute button
The view should be switched to the Inspector [tab] and you can click on the Raw [tab] to see the raw result.
Copy the value of the access_token as you will need this for the subsequent service calls.


If you don't have the client secret key, but you have the username and password, you can use this to get the token:
POST https://login.windows.net/<tenant>/oauth2/token HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: login.windows.net

resource=https://<axurl>&client_id=<client-id>&authorityURL=https://login.windows.net/<tenant>&username=<username>&password=<password>&grant_type=password
Replace the <tenant>, <axurl>, <client-id>, <username> and <password> with the valid values, then select/highlight the statements and click the Execute button
The view should be switched to the Inspector [tab] and you can click on the Raw [tab] to see the raw result.
Copy the value of the access_token as you will need this for the subsequent service calls.


After we get the access token, let's try to call the LedgerJournalHeaders service.

To do a GET call:

GET https://<axurl>/data/LedgerJournalHeaders HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>

Replace the <axurl> and <access_token> with the valid values, then select/highlight the statements and click the Execute button.
The view should be switched to the Inspector [tab] and you can click on the JSON [tab] to see the result.

To insert a new journal header:

POST https://<axurl>/data/LedgerJournalHeaders HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>

{
"@odata.type":"#Microsoft.Dynamics.DataEntities.LedgerJournalHeader",
"dataAreaId":"USMF",
"JournalName":"GenJrn",
"Description":"Test journal"
}

Replace the <axurl> and <access_token> with the valid values, then select/highlight the statements and click the Execute button.
The view should be switched to the Inspector [tab], if the insert is successful then it will return the newly inserted record as the result.


To update a journal header:

PATCH https://<axurl>/data/LedgerJournalHeaders(JournalBatchNumber='<journalNumber>',dataAreaId='<dataAreaId>') HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>

{
"@odata.type":"#Microsoft.Dynamics.DataEntities.LedgerJournalHeader",
"dataAreaId":"USMF",
"Description":"Edited journal description"
}

Replace the <axurl>, <access_token>, <journalNumber> and <dataAreaId> with the valid values, then select/highlight the statements and click the Execute button.
Please note that D365O by default will get the values from the integration user's default company, regardless of the dataAreaId that you specify on the call. If you need to access a record in a different company, you'll need to add ?cross-company=true to the url.

If the update is successful then it will return HTTP status 204, otherwise it will return HTTP status 400 with the error message.


To delete a journal header:

DELETE https://<axurl>/data/LedgerJournalHeaders(JournalBatchNumber='<journalNumber>',dataAreaId='<dataAreaId>') HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>

Replace the <axurl>, <access_token>, <journalNumber> and <dataAreaId> with the valid values, then select/highlight the statements and click the Execute button.
If the delete is successful then it will return HTTP status 204, otherwise it will return HTTP status 400 with the error message.


To insert multiple journal headers in one request:

POST https://<axurl>/data/$batch HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: multipart/mixed; boundary=batch_boundary
Accept: multipart/mixed
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>
 
--batch_boundary
Content-Type: multipart/mixed; boundary=changeset_boundary
 
--changeset_boundary
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

POST https://<axurl>/data/LedgerJournalHeaders HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>

{
"@odata.type":"#Microsoft.Dynamics.DataEntities.LedgerJournalHeader",
"dataAreaId":"USMF",
"JournalName":"GenJrn",
"Description":"Test journal 1"
}
 
--changeset_boundary
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

POST https://<axurl>/data/LedgerJournalHeaders HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: application/json;odata.metadata=minimal
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>

{
"@odata.type":"#Microsoft.Dynamics.DataEntities.LedgerJournalHeader",
"dataAreaId":"USMF",
"JournalName":"GenJrn",
"Description":"Test journal 2"
}

--changeset_boundary--
--batch_boundary--


To call an ODATA action (if the action takes parameters, pass them as a JSON object in the request body):

POST https://<axurl>/data/<EntityPublicCollectionName>(<EntityKey>)/Microsoft.Dynamics.DataEntities.<ActionName> HTTP/1.1
OData-Version: 4.0
OData-MaxVersion: 4.0
Content-Type: application/json
Accept: application/json;odata.metadata=minimal
Accept-Charset: UTF-8
Authorization: Bearer <access_token>
Host: <axurl>



As you can see, Fiddler can be used as a simple tool to access the D365O services. I think this is easier and faster than having to build custom code in Visual Studio to make the calls.
However, depending on how many tests you need to perform, it can be a bit painful to copy and paste the access token for every call.

Credit to Kalle Sõber on his post http://www.k3technical.com/testing-ax7-odata-services-fiddler/ 

In the next post, I will show how to use Postman to call D365O services. Postman has a variable system that makes the calls easier, so you don't have to copy and paste the access token as we just did with Fiddler.