A Salesforce Administrator’s Guide to Duplicate Management

Admin Configuration, Salesforce

Managing duplicates is an ongoing problem that Salesforce administrators deal with on a daily basis, and as the volume of data grows, the problem only worsens. That is why Salesforce administrators are always on the lookout for new methods and tools to help them tackle it. We created this guide to help Salesforce administrators be more proactive in the duplicate detection process and make sure such issues are identified and resolved before they snowball into big problems. 

What are the True Costs of Duplicates?  

While many people think the damage caused by duplicates is limited to calling somebody over and over again, the impact is far greater. First, let's look at the costs. It typically costs companies $1 to prevent a duplicate, $10 to correct one, and $100 per duplicate they simply ignore. These costs can really start to snowball, since a recent study found that, on average, 15% of companies' sales leads are duplicates. So if you have hundreds of thousands or even a million records in your Salesforce org, you can imagine the costs you are incurring by keeping those duplicates around. 
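
To make that concrete, here is a quick back-of-the-envelope calculation in Python using the 1-10-100 figures above; the record count and duplicate rate are illustrative placeholders, so plug in your own org's numbers.

```python
# Back-of-the-envelope cost estimate using the 1-10-100 figures above.
# The record count and duplicate rate are illustrative placeholders.
total_records = 1_000_000    # records in your Salesforce org
duplicate_rate = 0.15        # ~15% of sales leads are duplicates (per the study above)
cost_to_ignore = 100         # $ per duplicate left in place
cost_to_correct = 10         # $ per duplicate merged after the fact

duplicates = total_records * duplicate_rate
print(f"Estimated duplicates: {duplicates:,.0f}")
print(f"Cost of ignoring them:   ${duplicates * cost_to_ignore:,.0f}")
print(f"Cost of correcting them: ${duplicates * cost_to_correct:,.0f}")
```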

A lot of these costs come in the form of wasted employee productivity and missed opportunities. In fact, according to MIT's Sloan School of Management, employees waste as much as 50% of their time on routine data quality tasks. Duplicate data also prevents your sales and marketing teams from getting an accurate picture of the customer journey while driving up the cost of analytics. From this standpoint, everybody in the company should be concerned about duplicates, since they drain company resources and damage customer relationships. 

Should You Start Deduping Today?

A lot of companies tend to put off dealing with duplicates, but that is just a huge problem waiting to happen. When companies finally notice all of the issues duplicates are causing, they want to dedupe everything at once, which could mean hundreds of thousands of records. That is risky: you are putting your data in jeopardy, and by rushing the process you make it very difficult to get all of the adjustments exactly right. So when you start the dedupe process, understand that it will not happen overnight, and leave yourself some time before your next critical deadline to get all of the duplicates eliminated. 

Calculating the ROI of Deduping

If you are a Salesforce administrator looking to get executive buy-in to purchase new deduping software, you will have to present the ROI that such a tool would offer. The best way to do this is to add up all of the lost hours sales reps waste sorting through data, identifying duplicates, and merging them manually. Ask them how many duplicates they deal with per day or week and multiply that by how much time it takes them to resolve a single one. 
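
As a rough sketch, you could turn that calculation into a few lines of Python; every input below is a placeholder to be replaced with numbers from your own team.

```python
# Rough ROI estimate for a deduping tool, based on rep time spent handling duplicates manually.
# Every input below is a placeholder; substitute your own team's numbers.
reps = 20                         # sales reps affected
duplicates_per_rep_per_week = 15  # duplicates each rep handles manually per week
minutes_per_duplicate = 10        # time to spot, compare, and merge one duplicate
loaded_hourly_rate = 60           # fully loaded cost of a rep hour, in dollars
tool_cost_per_year = 5_000        # annual cost of the deduping tool

hours_wasted_per_year = reps * duplicates_per_rep_per_week * minutes_per_duplicate / 60 * 52
cost_of_manual_deduping = hours_wasted_per_year * loaded_hourly_rate
roi = (cost_of_manual_deduping - tool_cost_per_year) / tool_cost_per_year

print(f"Hours lost to manual deduping per year: {hours_wasted_per_year:,.0f}")
print(f"Cost of that time: ${cost_of_manual_deduping:,.0f}")
print(f"ROI of a ${tool_cost_per_year:,} tool: {roi:.0%}")
```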

Step 1: Understand Why Duplicates Occur 

Image credit: https://contentohana.com/

In order to resolve any issue, you need to understand its underlying cause(s). Duplicates in Salesforce typically come from: 

  • Imports from external databases.
  • Data migrations.
  • Web-to-Lead entries.
  • Manual entry by users.

To avoid these issues, administrators create automated processes to prevent them, but it is not possible to write a rule for every scenario. And simply piling on more rules can exacerbate the problem, since the system can start merging records that should not be merged. 

Step 2: Know the Limitations of the Out-of-the-Box Product 

We mentioned that the limitations of the out-of-the-box version of Salesforce contribute to the problem of duplicates, but administrators need to be aware of precisely what those limitations are. Salesforce offers some standard matching rules for the following objects: Accounts, Contacts, and Leads. For most companies, this is simply not enough, because there are so many possible duplicate variations, and the warnings are not always helpful either. You also need to be aware that Salesforce does not perform a duplicate search across these objects. For example, if a record for John Smith already exists among an employee's Contacts and they then create a Lead under the same name, the matching algorithm will not pick this up. 
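
If you need a cross-object check today, one workaround is to query both objects yourself. The sketch below, which assumes the simple_salesforce Python library and uses email as the match key (with placeholder credentials), shows the general idea; it is not a substitute for proper matching rules.

```python
# Minimal sketch of a cross-object duplicate check (Contact vs. Lead) that the
# standard matching rules will not do for you. Assumes the simple_salesforce
# library; the credentials and the email-based match key are illustrative.
from simple_salesforce import Salesforce

sf = Salesforce(username="admin@example.com",   # placeholder credentials
                password="password",
                security_token="token")

email = "john.smith@example.com"
contacts = sf.query(f"SELECT Id, Name FROM Contact WHERE Email = '{email}'")
leads = sf.query(f"SELECT Id, Name FROM Lead WHERE Email = '{email}' AND IsConverted = false")

if contacts["totalSize"] > 0 and leads["totalSize"] > 0:
    print(f"Possible cross-object duplicate for {email}:")
    for rec in contacts["records"] + leads["records"]:
        print(f"  {rec['attributes']['type']}: {rec['Name']} ({rec['Id']})")
```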

If your organization works with large data volumes, the duplicate detection capabilities offered out of the box will probably fall short of your needs. Salesforce itself acknowledges this and warns users on its website: "In an org with many records, duplicate jobs can fail." You also need to be mindful of the edition of Salesforce your company is using, since many of the best duplicate-catching capabilities are limited to the most expensive Salesforce packages. If your organization is on a cheaper edition, you may be out of luck and have to fend off duplicates on your own. 

Step 3: Take Steps to Prevent Users from Manually Inputting Duplicates 

Image credit: https://www.salesforce.com/

Salesforce gives you the ability to alert users when they manually input duplicate information, with the option to ignore the alert if they feel they are not entering a duplicate. Such rules can be scoped to particular scenarios, with profile and username being the most common conditions used for this purpose. Once you have rules in place to catch duplicates, you also need to know how well they are working. This is where reports come in handy, showing whether or not the rules are having the desired impact. You can use the duplicate record set object to determine if you are catching the number of duplicates you were hoping for. 
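
If you would rather pull those numbers programmatically than build a report, a rough sketch along these lines might work; it assumes the standard DuplicateRecordSet object with its RecordCount and DuplicateRuleId fields is available in your edition, and uses the simple_salesforce Python library with placeholder credentials.

```python
# Sketch: summarize how many duplicate record sets each duplicate rule has produced.
# Assumes the standard DuplicateRecordSet object (with RecordCount and DuplicateRuleId)
# is exposed in your edition; credentials are placeholders.
from collections import Counter
from simple_salesforce import Salesforce

sf = Salesforce(username="admin@example.com",   # placeholder credentials
                password="password",
                security_token="token")

rows = sf.query_all(
    "SELECT Id, RecordCount, DuplicateRuleId FROM DuplicateRecordSet"
)["records"]

sets_per_rule = Counter(row["DuplicateRuleId"] for row in rows)
records_per_rule = Counter()
for row in rows:
    records_per_rule[row["DuplicateRuleId"]] += row["RecordCount"] or 0

for rule_id, set_count in sets_per_rule.most_common():
    print(f"Rule {rule_id}: {set_count} duplicate sets, "
          f"{records_per_rule[rule_id]} records involved")
```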

If you notice too many false positives showing up in the reports, you have a couple of options. The first is to redefine the matching rules; however, it is time-consuming to create a rule for every single scenario. It could also mean that more user training is needed, if the right alerts are popping up but being ignored. Finally, you can opt for third-party software that helps you and the end users catch the duplicates. This takes us to selecting the right tools for the job.  

Step 4: Select the Right Tools 

Image credit: https://www.express-journal.com/

When you are ready to start looking at tools to help you manage all of the duplicates, look for ones that offer a comprehensive solution. For example, a lot of tools out there will only identify 100% carbon copies, or they will not perform a cross-object search. It is also a good idea to address the problem of constantly having to create new rules to detect duplicates: even if a tool addresses the so-called fuzzy duplicate issue, you may still have to create rules for each scenario employees encounter. 
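
To see why carbon-copy matching falls short, here is a minimal fuzzy-comparison sketch using Python's standard difflib module; the example records and the 0.7 threshold are purely illustrative.

```python
# Exact matching vs. a simple fuzzy comparison, using only the standard library.
# The example records and the 0.7 threshold are purely illustrative.
from difflib import SequenceMatcher

record_a = "John Smith, Acme Inc., 415-555-0100"
record_b = "Jon Smith, Acme Incorporated, (415) 555-0100"

exact_match = record_a == record_b
similarity = SequenceMatcher(None, record_a.lower(), record_b.lower()).ratio()

print(f"Exact match: {exact_match}")          # False -- a carbon-copy check misses this pair
print(f"Fuzzy similarity: {similarity:.2f}")  # a high ratio suggests the same person
print("Likely duplicate" if similarity > 0.7 else "Probably distinct")
```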

What you need to be aware of is that machine learning has advanced to the point where it can spot duplicates and learn to identify more hidden ones as well. The way it works is that the system presents the user with two records it suspects of being duplicates in a side-by-side comparison. If the user labels them as duplicates, the system will automatically detect such records in the future and merge them. Imagine how much time that automation saves and how much confusion it eliminates. 
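
As a highly simplified illustration of that learn-from-labels loop (not any vendor's actual algorithm), you could train a small classifier on similarity features computed from the pairs a user has already reviewed; scikit-learn, the feature choices, and the tiny training set here are all assumptions made for the sketch.

```python
# Toy illustration of learning from user-labeled duplicate pairs.
# This is NOT any vendor's actual algorithm; scikit-learn, the features,
# and the tiny hand-made training set are all illustrative assumptions.
from difflib import SequenceMatcher
from sklearn.linear_model import LogisticRegression

def features(rec_a, rec_b):
    """Similarity of name and email, each between 0 and 1."""
    return [
        SequenceMatcher(None, rec_a["name"].lower(), rec_b["name"].lower()).ratio(),
        SequenceMatcher(None, rec_a["email"].lower(), rec_b["email"].lower()).ratio(),
    ]

# Pairs a user has already reviewed side by side: 1 = duplicate, 0 = distinct.
labeled_pairs = [
    ({"name": "John Smith", "email": "jsmith@acme.com"},
     {"name": "Jon Smith",  "email": "j.smith@acme.com"}, 1),
    ({"name": "Ann Lee",    "email": "ann@globex.com"},
     {"name": "Ann Lee",    "email": "alee@globex.com"}, 1),
    ({"name": "John Smith", "email": "jsmith@acme.com"},
     {"name": "Mary Jones", "email": "mjones@initech.com"}, 0),
    ({"name": "Ann Lee",    "email": "ann@globex.com"},
     {"name": "Bob Chan",   "email": "bchan@umbrella.com"}, 0),
]

X = [features(a, b) for a, b, _ in labeled_pairs]
y = [label for _, _, label in labeled_pairs]
model = LogisticRegression().fit(X, y)

# Score a new candidate pair that the user has not reviewed yet.
candidate = ({"name": "Jonathan Smith", "email": "jsmith@acme.com"},
             {"name": "John Smith",     "email": "john.smith@acme.com"})
prob = model.predict_proba([features(*candidate)])[0][1]
print(f"Probability these records are duplicates: {prob:.2f}")
```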

Step 5: Get Feedback 

Image credit: https://energycentral.com/

Now that users have had a chance to compare their experience with the duplicate detection built into Salesforce against the third-party tool, it is important to get feedback on usability. For example, you may notice that a particular tool is effective at catching duplicates but such a pain to use that people avoid it altogether. Striking the right balance here is very important, because this is not strictly a technical issue, and user feedback always needs to be taken into account before deciding to invest in a particular product. 

This is why tools powered by machine learning work best in such scenarios. They work and learn alongside your employees to identify duplicates and, with the knowledge they accumulate, get better at preventing them in the future. On the technical side, this eliminates the need to keep creating new rules to catch duplicates, since the system learns on its own, which increases your ROI even more. 

Datagroomr Takes Duplicate Management to a Whole New Level 

Datagroomr is one of the easiest duplicate detection tools to use because it is powered by machine learning. Your employees will appreciate how well it detects fuzzy duplicates, and it takes the burden off your Salesforce admins, freeing up their time to focus on more critical issues. As the figures above show, duplicates left alone inside Salesforce start to snowball, eating up resources and wasting your employees' time. Take the necessary preemptive steps to cut the costs of your duplicate problem before it spirals out of control. 

Thanks for reading 🙂 Happy Learning

#AwesomeAdmins #SFDCPanther #Salesforce
