Data migration from one system to another is a bit like moving house. However, unlike moving house, when you migrate to Salesforce from another system, you can’t risk the equivalent of losing a box in the move.
In fact, the potential headaches, disruptions and risks from migrating are one of the main reasons companies stick with legacy systems and don’t take advantage of the opportunities and innovations offered by Salesforce.
Fortunately, migrating to Salesforce doesn’t have to be risky or cause business disruption. With the correct procedure, it is possible to maintain business operations, avoid data loss and corruption, and ensure full adoption of the new system. And that’s exactly what I’ll take you through in today’s post.
To show you how VRP Consulting solves these problems for our clients, let’s go through our process together with the benefit of an example: a large German retailer who migrated their business from SAP C4C to Salesforce Service Cloud.
We all know the famous saying that “failure to plan is planning to fail” for a good reason. It is imperative to start your Salesforce migration by analysing your organisation’s business processes and systems.
For our clients, we organise focus groups of stakeholders and define requirements from their statements. Be sure to include end-users and team leads as well as higher-ups; often surprising insights come from the grassroots.
After the sessions with our German retail client, we discovered their system architecture looked as follows:
The SAP system was integrated with a range of eCommerce, Marketing, ERP, and other systems. This meant that not only was it necessary to migrate from one system to another, but we also had to migrate third-party integrations from SAP to Salesforce. Hardly a unique problem.
Ideally, you would have the architectural designs of the initial system landscape dataflow and business logic, but more often than not, either there’s no tech specification at all, or it’s high-level at best. This makes reverse engineering a necessary process to move forward.
In the case of our client, we proceeded with verifying the results of our reverse engineering efforts with stakeholders in order not to miss a single key consideration.
With the architecture revealed, now we need to configure Salesforce according to the business needs we uncovered.
The first step here is data model configuration – we need to define how data will be stored in the database. Which standard Salesforce objects can we use, based on the Salesforce licences the customer has purchased, and which custom objects should we create for the rest of the data?
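To make that decision concrete, here is a minimal Python sketch of the mapping exercise. The legacy entity names and the standard-object choices are illustrative assumptions, not the client’s actual model; the only Salesforce-specific convention used is that custom object API names end in `__c`:

```python
# Hypothetical mapping from legacy entities to standard Salesforce objects.
# Entity names here are assumptions for illustration only.
STANDARD_OBJECT_MAP = {
    "BusinessPartner": "Account",
    "ContactPerson": "Contact",
    "ServiceRequest": "Case",
}

def target_object(legacy_entity: str) -> str:
    """Map a legacy entity to a standard object, or fall back to a custom one."""
    if legacy_entity in STANDARD_OBJECT_MAP:
        return STANDARD_OBJECT_MAP[legacy_entity]
    # Anything without a standard equivalent becomes a custom object
    # (Salesforce custom object API names end in __c).
    return f"{legacy_entity}__c"
```

In a real project this mapping is a design document agreed with stakeholders, not code, but writing it down in one place early keeps the licence and data-volume discussions grounded.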
As our retail client had a large bank of historical data, we identified which data was crucial to keep in Salesforce as it was, which historical data could be aggregated or summarised, and which data was best kept outside of Salesforce. This kept their data volumes manageable and optimised their licences while still ensuring that every user had access to the data and reports they needed.
The second critical step is to configure the Security Model. For this, we need to understand the level of functionality required and data access required for all groups of users and set Roles, Profiles & Permissions for them accordingly.
The third step is applying business logic & user interface elements. This involves configuring Validation and Workflow rules, creating custom logic using Process and Flow builders, as well as developing custom components, pages, and services for a dramatically improved all-round user experience.
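A Salesforce validation rule is essentially a formula that blocks a save when it evaluates to true. The hypothetical rule below, sketched in Python rather than Salesforce’s formula language, would block oversized discounts on unapproved records; the field names and the 30% threshold are assumptions for the example, not the client’s actual configuration:

```python
def violates_discount_rule(record: dict) -> bool:
    """Return True when the record should be blocked, mirroring a
    hypothetical validation rule: no discount over 30% unless the
    record has already been approved."""
    discount = record.get("Discount__c", 0)
    status = record.get("Status__c")
    return discount > 0.30 and status != "Approved"
```

Thinking through rules in this explicit form before configuring them declaratively makes it easier to review edge cases with the business side.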
Data Migration Phase
Once Salesforce is ready for use, it is time to start the Data Migration process. In our client’s case, there were vast quantities of data in the previous system that needed to be carried over.
While manual data entry is theoretically an option, it isn’t realistic given the time required and the risk of human error. An ETL tool like Jitterbit is needed to transfer data in bulk.
Even with a tool like Jitterbit, you can’t just download a file and upload it to Salesforce. Every system has its own standards, and sometimes it is not possible to simply lift and shift data in its original format. For this reason, a crucial part of the data migration phase includes the transformation of legacy data into the right target format.
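As an illustration of that transformation step, here is a minimal Python sketch that converts a hypothetical legacy record into Salesforce-style fields, including a date-format change. All field names and the date formats are assumptions for the example, not the client’s real schema:

```python
# Hypothetical field mapping from a legacy export to Salesforce field names.
FIELD_MAP = {"partner_name": "Name", "phone_no": "Phone"}

def to_salesforce(legacy: dict) -> dict:
    """Transform one legacy record into the target Salesforce format."""
    record = {sf: legacy[src] for src, sf in FIELD_MAP.items() if src in legacy}
    # Normalise a legacy DD.MM.YYYY date into ISO YYYY-MM-DD,
    # the format Salesforce date fields expect.
    if "created_on" in legacy:
        day, month, year = legacy["created_on"].split(".")
        record["CreatedDate"] = f"{year}-{month}-{day}"
    return record
```

In practice the ETL tool performs these transformations, but defining them as explicit, testable mappings first is what the data templates capture.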
This requires creating data templates, where data is converted into the new standard for the Salesforce system. With these data templates in place, we can map the object relationships and plan the proper sequence of data transfer to ensure object relationships are maintained and avoid data loss.
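The load-sequencing idea can be sketched as a topological sort over object dependencies: an object that looks up to a parent must be loaded after that parent, or the relationship cannot be set. The objects and lookups below are illustrative assumptions, not the client’s actual model:

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each object lists the objects it references
# via lookup fields, so those parents must be loaded first.
DEPENDS_ON = {
    "Account": [],
    "Contact": ["Account"],
    "Case": ["Account", "Contact"],
    "CaseComment": ["Case"],
}

def load_order(deps: dict) -> list:
    """Return a load sequence that never loads a child before its parent."""
    return list(TopologicalSorter(deps).static_order())
```

Planning the sequence this way up front is what prevents orphaned child records and broken relationships during the bulk load.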
Usually, the data migration phase takes a few weeks, but of course, this depends on data volumes, mapping complexities and requirements for aggregating or archiving certain data sets. In the case of our German client, it took longer than some other data migrations because SAP stores data in more complicated multi-table structures, whereas Salesforce uses a simpler object model.
Parallel Run Phase
Once the data had been migrated, we needed to synchronise both systems. This allowed us to start a Parallel Run phase – the foundation of our approach.
The Parallel Run phase is a temporary solution when you have both of your systems functioning simultaneously.
This approach enables you to swap between systems seamlessly. During this phase, both of your systems are completely synchronised, and you can find all the recent information via Salesforce. With both systems fully functional, you can start training the first user group with real data. At the same time, untrained users can continue using the old system without any impact.
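One simple way to reason about keeping two live systems consistent is a last-write-wins merge: when the same record exists in both, keep whichever copy was modified most recently. The sketch below uses a hypothetical timestamp field and is a conceptual illustration, not how any particular sync tool is implemented:

```python
from datetime import datetime

def newer_record(legacy: dict, salesforce: dict) -> dict:
    """Pick the more recently modified copy of a record during a
    parallel run (last-write-wins; field names are assumptions)."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    legacy_ts = datetime.strptime(legacy["modified_at"], fmt)
    sf_ts = datetime.strptime(salesforce["modified_at"], fmt)
    return legacy if legacy_ts > sf_ts else salesforce
```

Real synchronisation also has to handle deletions, conflicting field-level edits and ID cross-references, which is why the parallel run deserves careful design rather than an ad-hoc script.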
One of the main benefits of running in parallel for some time is that, in case of emergency or mistakes, you can always fall back to the old system and continue working there.
When migrating legacy systems to Salesforce, some great opportunities may present themselves, such as improving data structures or removing unnecessary records and obsolete user interfaces. Taking such actions can lead to a further optimised and streamlined system, which in turn can boost user productivity and engagement dramatically. Better engagement means better data input, which leads to more effective decision making for your business. Everyone wins!
Once we had set up our client’s Service Cloud system and connected it to SAP, we were able to proceed to the User Acceptance Testing (UAT) stage. This was when we asked the client to make sure that the functionality lived up to their expectations in full.
During our client’s UAT, we noticed a couple of UI elements took users longer to find than we had expected. Accordingly, we made them more easily identifiable and discoverable, which sped up several processes.
As a result, the Parallel Run phase enabled our client to transfer all their employees to Salesforce smoothly, fix the issues that arose, and all without stopping business for a single moment.
As soon as we had made sure that everything was functioning correctly, we got down to the Decommissioning phase. The task was to turn the SAP C4C system off, but as that system was the connection point between Service Cloud and all the integrations, we first had to connect all the third-party systems to Salesforce directly.
We opted to break the task into several steps, integrating the systems with Salesforce one by one. As a result, we came to the following architecture:
The schema above shows all the systems integrating with Salesforce directly. Once our QA team had verified that everything was functioning as expected, we went ahead with the step-by-step SAP user deactivation, followed by a complete SAP shutdown.
With this step complete, our client’s Service Cloud system was fully operational, integrated with all their third-party systems, and with full adoption thanks to user onboarding training.
It would be great if changing platforms and moving to Salesforce were as simple as swapping cars, but since a business can never stop, it’s more like disassembling and then reassembling the whole car while driving down the motorway. However, with a proven system and Salesforce experts who have gone through the process before, you can mitigate risks, keep your business running and migrate faster.
About the Author
Dmitry Zhugin is the Chief Technology Officer at VRP Consulting. After joining VRP Consulting as a Senior Software Developer in 2011, he rose through the ranks and has led many Salesforce projects including migrations from other systems.