The Microsoft Lifecycle Services team has been working on a data loader for Dynamics CRM Online for a while now.  It is still in ‘Preview’ mode and under active development, but it is available for Dynamics CRM Online users.  In fact, since I started this blog post a month or so ago, I’ve had to update several of my screenshots for this publication.

However, even though it is still being tweaked, it is available and can be used.  Again, a cautionary note: while putting it through its paces, I was surprised to log in one day and find that it was ‘down for development’.  Keep that in mind if you are brave enough to do production work with it at this point.

I believe the point of this tool is to provide options and flexibility for data migration that the Data Import Wizard does not currently offer.  It also boasts much better performance because all the processing is done in the cloud.  The CRM Connections runtime is created in the cloud and, as you will see, you upload your migration data to the cloud as well, so all processing is done on the Microsoft side.

Here is the URL for the data loader: https://lcs.dynamics.com/DataLoader/Index.  You must be a Dynamics CRM Administrator to log in to Lifecycle Services.

Review Comments

As you will see in the following screenshots, the steps for creating a migration job are fairly straightforward.  One of the nice features is picklist value transformation.  If you have values in your source data that match picklist values within the CRM entity, your source data can contain either the text value or the numeric value that resides in CRM.  If you use the text value, provided there is a matching value in CRM, the migration job will automatically transform it into the associated numeric value of the picklist.
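Just to illustrate the idea, here is a minimal sketch of that transformation in Python.  The picklist labels and numeric values below are made-up examples, not the Data Loader’s actual implementation, but the concept is the same: a text label is looked up and replaced with its numeric option value before the record is written to CRM.

    # Sketch of picklist value transformation (illustrative labels/values only).
    picklist_map = {
        "Accounting": 1,
        "Agriculture": 2,
        "Broadcasting": 3,
    }

    def transform_picklist(source_value):
        # Numeric source values pass straight through; text labels are looked up.
        if str(source_value).isdigit():
            return int(source_value)
        return picklist_map[str(source_value)]

    print(transform_picklist("Accounting"))  # 1
    print(transform_picklist("2"))           # 2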

If you are importing, for example, account data in one job and related contact data in another job, you will need to extract the previously imported account records (at least a small subset of columns) along with the GUIDs that were created when the migration job ran.  You’ll then need to add each account’s GUID as the ‘parentcustomerid’ value in your contact source data to ensure the contacts are correctly related to their parent accounts.  Apply this approach to any records that have a parent/child relationship.

However, if there is only one contact associated with each account being migrated, that contact can be included in the account source data and mapped as part of the account migration job.
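Going back to the two-job scenario above, here is a rough sketch in Python of that preparation step, joining exported account GUIDs onto a contact file before upload.  The file names and column names (‘accounts_export.csv’, ‘contacts_source.csv’, ‘accountname’) are assumptions for illustration only; nothing here is required by the Data Loader itself.

    import csv

    # Build a lookup of account name -> GUID from the exported account data
    # (hypothetical file layout: name, accountid).
    with open("accounts_export.csv", newline="") as f:
        account_ids = {row["name"]: row["accountid"] for row in csv.DictReader(f)}

    # Add a parentcustomerid column to the contact source data
    # (hypothetical file layout: firstname, lastname, accountname).
    with open("contacts_source.csv", newline="") as src, \
         open("contacts_upload.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["parentcustomerid"])
        writer.writeheader()
        for row in reader:
            # Relate each contact to the GUID created when its account was imported.
            row["parentcustomerid"] = account_ids[row["accountname"]]
            writer.writerow(row)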

As mentioned previously, the following will give you a good idea of the steps involved in creating migration jobs in the Lifecycle Services Data Loader.

Step One – Deploy the Data Loader CRM Connections Runtime

1.    Click the ‘CRM Connections’ tile.

2.    Click the plus sign (+) to add a new CRM connection.  Enter your CRM Online login name and password.  Data Loader will validate your login and then present a list of instances (organizations) to choose from based on your credentials; the CRM Instances dropdown will only be populated after your credentials are validated.

Once the CRM Connections runtime has been deployed (which could take up to ½ hour), it will be listed as a deployed CRM Connection with a status of ‘Running’.

Step Two – Configure the File Format

This is where you set the format of the file to be uploaded for either Insert or Upsert to Dynamics CRM.  This does not insert records into CRM nor does it identify the file used for upload.  It only defines the file format that you will use in a later step.
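For context, a source file described by such a format might look something like the snippet below.  The columns and comma delimiter are just a hypothetical example, not the exact format shown in my screenshots and not a requirement of the tool; your own format will depend on the entity and fields you are loading.

    name,telephone1,industrycode
    Contoso Ltd,555-0100,Accounting
    Fabrikam Inc,555-0101,3

Note that the ‘industrycode’ column mixes a text label and a numeric value, which is the picklist transformation scenario described earlier.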

1.    Click the ‘Configure file format’ tile, then choose the CRM Instance from the dropdown menu (you may have deployed several runtimes to meet your organization’s requirements).

2.    Click the + icon to add a file format to your runtime deployment.  The screenshot below is the format I set up for testing.

3.    Configure the file format.

4.    Click ‘Save’.

Step Three – Importing Data

There are limitations to using the Lifecycle Services Data Loader for anything other than fairly straightforward migrations, but if you are just looking to slam in some data with good load performance, this tool could be a good option.