Two Steps to Optimize Your Data Model and Avoid CRM Performance Degradation

By Tom Leddy | September 27, 2018 | Ask an Architect, Higher Education, Nonprofit

“In our organization, each member of our staff amasses hundreds (sometimes thousands) of records daily for the cases they work on. It’s an overflowing amount of data! What’s the best way to manage this volume of data to ensure there is no degradation in Salesforce performance?”

Improve your nonprofit and higher ed data management with these Salesforce tips

This question is more common than you may think! Though not all of you deal with thousands of records on a daily basis, understanding how to resolve data management issues is something that all Salesforce customers should know. A solid data management and architecture structure to deal with large volumes of data sets you up for success now and in the future.

The key thing to remember is: You, yes you, should worry about Large Data Volume (LDV).

Though there’s no single measurement that determines whether or not your organization has a large volume of data, both large and small orgs are at risk. While the number of records in an org is a reliable indicator, the way the data is structured is another factor: organizations with smaller data volumes but poorly architected data models can still have data-related performance issues. In general, though, an organization will most likely be considered to have large data volumes if it meets one or more of the following qualifications:

  • More than 5 million records
  • Thousands of users with concurrent access
  • Parent objects with more than 10,000 child records
  • More than 100 GB of used storage space
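The qualifications above can be sketched as a simple check. This is a minimal illustration, not an official Salesforce formula, and the function and field names are my own:

```python
# Hedged sketch: flag an org as a Large Data Volume (LDV) candidate if it
# meets one or more of the common indicators discussed above. The thresholds
# mirror the bullet list; the function signature is illustrative.

def is_ldv_candidate(total_records, concurrent_users,
                     max_child_records_per_parent, storage_used_gb):
    """Return True if the org meets one or more common LDV indicators."""
    return (
        total_records > 5_000_000
        or concurrent_users >= 1_000          # "thousands of users"
        or max_child_records_per_parent > 10_000
        or storage_used_gb > 100
    )

# Example: a small org with one heavily parented object still qualifies.
print(is_ldv_candidate(
    total_records=250_000,
    concurrent_users=40,
    max_child_records_per_parent=25_000,
    storage_used_gb=12,
))  # True
```

Note that any single indicator is enough, which is why "small" orgs can still be at risk.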

Issues that can be caused by Large Data Volumes

Large data volumes typically manifest as performance-related issues: longer than expected search times, long waits for fields to populate when a record is opened, slow record saves, and other actions taking excessively long to complete.

In addition to performance issues, large data volumes can also lead to record locking on parent records with large numbers of child records. Every time a child record is saved, its parent record is temporarily locked, so if thousands of child records are being accessed and updated simultaneously, their parent records can remain locked for extended periods of time. A huge inconvenience.
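One common mitigation for this kind of lock contention is to group child records by their parent before bulk updates, so each parent's children travel in the same batch rather than being spread across many parallel batches that all contend for the same parent lock. A hedged sketch, with an illustrative record shape:

```python
from collections import defaultdict

def batches_grouped_by_parent(child_records, batch_size=200):
    """Yield batches of child records such that records sharing a parent
    are kept contiguous, reducing parent-record lock contention."""
    by_parent = defaultdict(list)
    for rec in child_records:
        by_parent[rec["parent_id"]].append(rec)

    batch = []
    # Emit parents one at a time so a parent's children cluster together.
    for parent_id in sorted(by_parent):
        for rec in by_parent[parent_id]:
            batch.append(rec)
            if len(batch) == batch_size:
                yield batch
                batch = []
    if batch:
        yield batch

# Interleaved input: children of parents "A" and "B" arrive mixed together.
records = [
    {"id": 1, "parent_id": "B"},
    {"id": 2, "parent_id": "A"},
    {"id": 3, "parent_id": "A"},
    {"id": 4, "parent_id": "B"},
]
for batch in batches_grouped_by_parent(records, batch_size=2):
    print([r["id"] for r in batch])  # [2, 3] then [1, 4]
```

The same idea applies whatever tool performs the update (Data Loader, middleware, or Apex batches): ordering by parent ID keeps each lock held by one batch at a time.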

So let’s talk now about some solutions and proactive steps you can take to protect your org’s performance and see success with Salesforce.org Nonprofit Cloud, Education Cloud, and more.

1. Develop an Effective Archiving and Reporting Strategy

Salesforce provides a wide variety of tools to help organizations with large data volumes get a handle on their data. These tools can do wonders for an organization, but they won’t be nearly as effective without an underlying plan for their proper utilization. So if you’re going to take one thing away from this blog post, it should be that the most important aspect of handling any amount of data is to create an effective data management and reporting strategy before implementing a technical solution.

Some of the data management related questions you’re going to want to consider are:

What does my current data model look like? Create a data model diagram and roadmap that show your current and potential future states for any objects you’re utilizing. Then highlight areas where large numbers of records may be a concern.

For help: Trailhead: Data Modeling

How long do I need to retain my data? Think about retention requirements for each object. Donations may need to be retained for tax reporting purposes. Student Records may need to be retained for a certain period of time. Document your retention policy.
What other systems does my Salesforce data need to flow to? Create a data flow diagram showing source and target applications, the data elements that flow between them, and their associated volumes and other attributes, to ensure you won’t cause any downstream effects in other systems.
What are the source systems for my Salesforce data? Create a data flow diagram showing data that’s created in other systems and replicated to Salesforce, and consider whether external data should instead remain in its source system and be accessed from Salesforce via lookups.
How will archiving or purging an object affect related objects? Document whether your organization has any scenarios where child records must be retained longer than their parents, and create a plan.

For help: Data Management Trailhead

Does all of your data need to be stored in Salesforce, or can it be stored externally and accessed through reporting? Many Salesforce customers maintain a year of data in Salesforce itself, store additional years in a data warehouse, and access older records by simply running a report.
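The "one year live, older years in the warehouse" pattern boils down to partitioning records around a cutoff date. A minimal sketch; the record shape, field name, and one-year cutoff are illustrative, and your actual cutoff should come from the retention requirements you documented per object:

```python
from datetime import date

def partition_for_archive(records, cutoff):
    """Split records into (keep_in_salesforce, send_to_warehouse)
    based on a close/activity date cutoff."""
    keep, archive = [], []
    for rec in records:
        (keep if rec["close_date"] >= cutoff else archive).append(rec)
    return keep, archive

# Illustrative donation records; "close_date" is an assumed field name.
donations = [
    {"id": "006A", "close_date": date(2018, 6, 1)},
    {"id": "006B", "close_date": date(2016, 3, 15)},
]
keep, archive = partition_for_archive(donations, cutoff=date(2017, 9, 27))
print([r["id"] for r in keep], [r["id"] for r in archive])
```

In practice the "archive" list is what your middleware or archiving tool would move out of Salesforce on a schedule.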

Objects that Salesforce.org customers can typically consider for archiving:

In addition to the standard Salesforce objects, nonprofits and Higher Ed institutions have special sets of additional objects that should be taken into consideration during any discussions about data. It’s up to each individual organization to determine what its processes should be around these objects based on its own unique use cases, but they should be included in any well-constructed data management strategy:

Higher Ed Organizations:

  • Former students who have graduated, withdrawn, or haven’t enrolled in any new classes for a certain amount of time
  • Unaccepted applicants
  • Old Course Enrollments

Nonprofits:

  • Inactive Donors
  • Donations older than a certain time period (e.g. LYBUNT) and their related records
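LYBUNT ("gave Last Year But Unfortunately Not This year") donors are easy to identify once giving history is in hand. A hedged sketch with an illustrative data shape, just to make the definition concrete:

```python
# Hedged sketch: identify LYBUNT donors from (donor_id, gift_year) pairs.
# Real implementations would query donation records; this shape is invented.

def lybunt_donors(gifts, this_year):
    """Return donors who gave in this_year - 1 but not in this_year."""
    gave_last_year = {donor for donor, year in gifts if year == this_year - 1}
    gave_this_year = {donor for donor, year in gifts if year == this_year}
    return gave_last_year - gave_this_year

gifts = [("don1", 2017), ("don2", 2017), ("don2", 2018), ("don3", 2016)]
print(sorted(lybunt_donors(gifts, this_year=2018)))  # ['don1']
```

A donor who lapsed (don1) shows up; a donor who renewed (don2) and one who lapsed earlier (don3) do not.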

2. Choose a Data Management Tool

Once a data management strategy has been put in place, the next step is to determine the best way to execute it. Salesforce provides a number of tools that can help in this area. Check the table below to find the solution that fits your needs the best.

Big Objects

Good for:
  • Data structures that allow you to store and manage massive amounts of data
  • Native to the Salesforce platform
  • A great way to archive data without having to utilize an external application

Except that:
  • There are no standard reports, search, sharing rules, or UI
  • Data is inserted into and queried from Big Objects via manually created Async SOQL queries

Data Storage Optimizer

Good for:
  • A new product, designed exclusively for Salesforce.org customers
  • Built on Big Objects, with new features that improve the user experience
  • Organizations with over a million opportunity records

Except that:
  • It supports a limited number of objects
  • It’s a paid add-on that’s available for purchase (after Dreamforce 2018)
  • Data migration into Data Storage Optimizer is manual

Off-Platform Data Archiving

Good for:
  • Moving org data to an external platform for both archiving and reporting
  • Larger organizations that have a centralized data warehouse

Except that:
  • A middleware tool is required to periodically migrate data from Salesforce to an external database
  • An external reporting tool is needed to combine archived Salesforce data with archived data from other applications

Indexes and Skinny Tables

Good for:
  • Keeping data in Salesforce (nothing is removed)
  • Speeding up processing through efficient searching and copying of data
  • Skinny tables are read-only tables that mirror Salesforce database tables and contain frequently used fields, which helps avoid expensive joins and improves performance

Except that:
  • Additional indexes can only be added to a table by Salesforce Support
  • Skinny table setup also requires the involvement of Salesforce Support
  • With skinny tables, additional processing is done during record saves to keep the data in sync
  • The system determines if and when skinny tables are used during reporting and Apex code execution (i.e. developers have no control over this)
  • Field types and column counts are limited
  • If new fields are added to a source table, they are not automatically added to its associated skinny table
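Since both Big Objects and Data Storage Optimizer lean on Async SOQL, it’s worth seeing what that looks like. Async SOQL jobs are submitted as JSON to a REST endpoint on your instance. Everything here is an assumption for illustration: the instance URL, API version, the hypothetical `Donation_Archive__b` Big Object, and the target object names; verify the payload keys against the current Async SOQL documentation before relying on them:

```python
import json

# Hedged sketch of an Async SOQL request body. Donation_Archive__b and its
# fields are invented for illustration; the endpoint shape follows the
# documented pattern /services/data/vXX.0/async-queries/ at time of writing.

INSTANCE = "https://yourInstance.salesforce.com"   # assumption
ENDPOINT = f"{INSTANCE}/services/data/v44.0/async-queries/"

payload = {
    "query": (
        "SELECT Donor__c, Amount__c, Close_Date__c "
        "FROM Donation_Archive__b "
        "WHERE Close_Date__c < 2017-09-27T00:00:00Z"
    ),
    # Async SOQL writes results into a target object you specify; these
    # keys and names are assumptions to be checked against current docs.
    "targetObject": "Donation_Archive_Report__c",
    "targetFieldMap": {"Donor__c": "Donor__c", "Amount__c": "Amount__c"},
}

print(json.dumps(payload, indent=2))
```

In a real integration you’d POST this payload to the endpoint with an OAuth bearer token and then poll the job for completion; that plumbing is omitted here.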

Helpful Resources for Your Salesforce Data Management

About the author
Tom Leddy is a Principal Customer Success Architect at Salesforce.org. He plays a critical role within Salesforce.org Advisory Services, helping Higher Ed and Nonprofit customers accelerate their use of Salesforce technology and best practices.

This blog is part of our larger “Ask an Architect” content series. To learn more about engaging a Salesforce.org Customer Success Architect in your organization, please contact your Account Executive.