3 Things That Will Save You Time Before You Do a Data Migration on Salesforce

If you are planning a data migration to Salesforce from an external database like Oracle, SQL Server, or any other system, there are always many things you need to do to get the migration done quickly. In my experience, there are a few pitfalls that people ignore when they do a data migration, which hit them hard mid-project and waste a lot of time in fixes. So here are 3 key things to consider in a data migration project.
1. Auto-number fields need to be set to the right starting value and deployed with that value
If you have created an auto-number field on your custom object and want it populated during data migration, Salesforce will automatically increment the value for every record that is inserted. So if you have a business need for something like an application number to start at 9000 or 10,000, you can set the starting value when you create the field. This works well for testing in a sandbox. But what if you want to deploy to production and want the Salesforce auto-number to start from 9000 again in production?
a. Before you do the data migration in production, convert the auto-number field to a text field. This also means you have to comment out all references to the field in your Apex classes. Then convert the text field back to an auto-number, set the starting value to 9000, and deploy to production.
b. Now when you do the data migration in production, numbering will begin at the starting value you set.

2. Beware of test methods' impact on auto-number fields in production
Now if you have a lot of test classes and each one inserts a record into the custom object, the auto-number advances with every test run. So when you do the data migration, you will find to your surprise that your sequence numbers are 9025, 9046, and so on, caused by the test methods that inserted test data. How do you avoid this?
a. Call Salesforce Support and have them enable the option that keeps auto-number sequences generated by test methods separate from the org's real sequence. You might have to push the support people on this, but it is worth the fight!

3. Data population approach for child records after populating master records
With any data migration there is always the case of exceptional records, where you might have to query multiple tables and populate some fields on custom objects after populating the master records. For example, suppose you have an Asset object, you have to populate different types of assets like bank, CD, and property, and you have a custom object with multiple lookup fields that must be populated. If you are a traditional programmer, you would think you can write an anonymous Apex script that queries some objects, loops through them, and populates the custom fields programmatically. But you will quickly hit governor limit exceptions such as "Too many DML rows: 10001" and "Too many SOQL queries: 101". So what is the solution?
a. First do a rough count of how many records you would need to populate in Salesforce with custom SQL or Apex classes.
b. If the count is less than 5,000, you can create an Apex class and run it in the org after conversion.
c. The safest approach is to use SQL in your legacy database: extract the records with outer joins or inner joins, create a data file, and load it using Data Loader or Jitterbit.
d. If the Oracle table uses a composite key with multiple fields, create an external ID field on your Salesforce object and populate it by combining the key fields with a dash, like 100-01, 100-02. The reason I recommend dashes is that they stop Excel from mangling the numeric keys into exponential notation.
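The extract-and-load approach in steps c and d can be sketched roughly as follows. This is a minimal illustration, not a production script: it uses an in-memory SQLite database as a stand-in for the legacy Oracle system, and all table, column, and Salesforce field names (legacy_asset, legacy_owner, Legacy_Key__c, and so on) are hypothetical examples.

```python
import csv
import sqlite3

# Stand-in for the legacy database (hypothetical schema and sample data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_asset (acct_no TEXT, seq_no TEXT, asset_type TEXT);
    CREATE TABLE legacy_owner (acct_no TEXT, owner_name TEXT);
    INSERT INTO legacy_asset VALUES ('100', '01', 'Bank'), ('100', '02', 'CD');
    INSERT INTO legacy_owner VALUES ('100', 'Acme Corp');
""")

# Step c: extract the exceptional records with an outer join so assets
# without an owner row still come through.
rows = conn.execute("""
    SELECT a.acct_no, a.seq_no, a.asset_type, o.owner_name
    FROM legacy_asset a
    LEFT OUTER JOIN legacy_owner o ON o.acct_no = a.acct_no
""").fetchall()

# Step d: write a data file for Data Loader, combining the composite key
# into a dash-separated external ID (100-01) so Excel cannot reinterpret
# the combined key as a number in exponential notation.
with open("assets_load.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Legacy_Key__c", "Asset_Type__c", "Owner_Name__c"])
    for acct_no, seq_no, asset_type, owner_name in rows:
        writer.writerow([f"{acct_no}-{seq_no}", asset_type, owner_name or ""])
```

The resulting assets_load.csv can then be upserted through Data Loader against the external ID field, which also lets you rerun the load safely if some records fail.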

By considering the three factors above, you can save time on your data migration, go to production peacefully, and take a day off after the launch!!
