For many companies, Salesforce has become an integral part of their business processes. Although losing Salesforce data would be devastating for these businesses, many entrust its backup entirely to Salesforce, at their own risk.
Salesforce does provide some backup, including real-time data replication to disk at each data center and between the production data center and the disaster recovery center. However, it doesn’t provide comprehensive backup that protects against every form of data loss, such as data corruption or a data migration that needs to be rolled back.
Recover from data corruption (unintended user error or malicious activity)
While Salesforce will most likely not lose a customer’s data, nearly half of data loss in the cloud is caused by user error, including accidental deletion and data overwrites. When a Salesforce user deletes company data within Salesforce, it is only retained in Salesforce’s recycle bin for 15 days. Salesforce does offer a Data Recovery Service that extends to lost data up to 90 days old, but it can be expensive.
Other reasons to archive Salesforce data include:
- Prepare for a data migration rollback
- Archive data to reduce volumes
- Replicate data to a data warehouse/BI
- Take snapshots of development versions
Implementing a robust Salesforce backup and restore solution has many technical challenges, and there are many questions that need answering before you decide which method to choose, including:
What sort of backup do you want?
When it comes to the type of Salesforce backup, you can go with a full, incremental or partial backup. Each has pros and cons that you should consider:
| Type | Description | Pros | Cons |
|---|---|---|---|
| Full | Contains all data | Contains all information that may be required | Can represent a large volume; takes more time to retrieve a subset of data and react quickly to a data incident |
| Incremental | Backs up the differences since the last full backup (daily, weekly, monthly, etc.); the data replication API is particularly well suited to this type of backup | Efficient for retrieving a change that took place on a specific date; smaller files that are easier to handle | May lack related information and take longer to rebuild a complete picture (merge the full backup with the latest incremental backups) |
| Partial | Backs up a subset of data (for example, closed cases only) | Efficient for retrieving a record from a subset of data; an ideal approach for archiving purposes (e.g. records older than 5 years) | May lack related information |
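The incremental row above notes that rebuilding a complete picture means merging the full backup with the incremental backups taken since. A minimal sketch of that merge step, keyed on the Salesforce record `Id` (the function name and record shape here are illustrative, not part of any Salesforce API):

```python
from typing import Dict, List

Record = Dict[str, str]

def merge_backups(full: List[Record], incrementals: List[List[Record]]) -> List[Record]:
    """Rebuild the latest picture from a full backup plus incremental backups.

    `incrementals` must be ordered oldest to newest; a later snapshot of a
    record (matched on its `Id` field) overwrites an earlier one.
    """
    merged = {rec["Id"]: rec for rec in full}
    for batch in incrementals:
        for rec in batch:
            merged[rec["Id"]] = rec  # newer version of the record wins
    return list(merged.values())
```

Note that this simple version does not handle deletions; a real merge would also apply the deleted-record list that the data replication API can return for each window.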
It’s also possible to use several backup types to optimize backup time and target business critical information. A strong Salesforce backup plan could include a weekly full backup, daily incremental backup and a monthly partial backup.
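A plan like that boils down to a simple scheduling policy. As an illustration only (the exact cadence is a business decision, not a Salesforce requirement), a mixed plan could be expressed as:

```python
from datetime import date
from typing import List

def backups_for(day: date) -> List[str]:
    """Sample mixed backup plan: weekly full on Sundays, monthly partial
    (archive) run on the 1st, and a daily incremental on every other day."""
    runs = []
    if day.weekday() == 6:  # Sunday
        runs.append("full")
    if day.day == 1:
        runs.append("partial")
    if not runs:
        runs.append("incremental")
    return runs
```

A scheduler (cron, a middleware job, or an AppExchange app’s own engine) would call a policy like this each day to decide which backup jobs to launch.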
When selecting a backup method to suit your organization’s needs, you should also consider whether you want to use Salesforce native features (Weekly Data Export, Data Loader), build your own solution leveraging APIs, or use a dedicated AppExchange app. There are also pros and cons to these approaches, so it’s best to first consider the following factors.
Security
Security should be top of mind when implementing a local backup, which could involve encryption of local data and/or other security measures. Consider the following factors to ensure a backup is secure.
- File system storage (data location, disk capacity, redundancy, availability, durability, encryption of the data at rest, encryption key management, physical security access, authentication, access logs, and other security requirements)
- Backup and Restore server configuration, run and maintenance (availability, redundancy, encryption of data in transit, encryption of cached data, network/CPU/memory capacity, etc.)
- Backup archives retention (retain backups for a specific amount of time)
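The retention point above is easy to automate. A minimal sketch of a retention sweep over a set of named archives (the archive-name scheme and timestamps here are hypothetical):

```python
from datetime import datetime, timedelta
from typing import Dict, List

def expired_archives(archives: Dict[str, datetime],
                     retention_days: int, now: datetime) -> List[str]:
    """Return the names of backup archives older than the retention window,
    so a cleanup job can delete (or move to cold storage) only those."""
    cutoff = now - timedelta(days=retention_days)
    return sorted(name for name, created in archives.items() if created < cutoff)
```

In practice the same sweep should respect legal-hold exceptions, and deletions should themselves be written to an access log.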
Restore
Restoration of data is the ultimate purpose of any backup. Consider the following:
- Capability to quickly restore a few records
- Capability to mass restore and rebuild all relationships between objects
- Restore process and data quality
- Restore process and system integrations
- Restore process and automation (validation rules, workflows, triggers, etc.)
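Rebuilding relationships is the hard part of a mass restore: re-inserted records receive new Salesforce Ids, so lookup fields on child records must be remapped before the children are inserted. A sketch of that remapping step (the field names and Id values are illustrative):

```python
from typing import Dict, List

def remap_lookups(records: List[Dict[str, str]], lookup_fields: List[str],
                  id_map: Dict[str, str]) -> List[Dict[str, str]]:
    """Rewrite lookup fields after a mass restore.

    `id_map` maps each old (backed-up) Id to the new Id Salesforce assigned
    when the parent record was re-inserted. Lookups pointing at records that
    were not re-inserted are left unchanged.
    """
    out = []
    for rec in records:
        rec = dict(rec)  # don't mutate the caller's backup data
        for field in lookup_fields:
            old = rec.get(field)
            if old in id_map:
                rec[field] = id_map[old]
        out.append(rec)
    return out
```

A full restore tool would apply this object by object in dependency order (parents before children), accumulating the old-to-new Id map as each object is inserted.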
Fault Tolerance and Scalability
Backup solutions that run on a regular basis should handle backup failures with minimal user interaction. Asynchronous solutions (such as the Bulk API) have some basic fault tolerance built in, including automatic retries for failed records. If the following factors are important for your backup process, consider a solution that provides fault tolerance.
- Volumes (need for extreme data volumes strategy)
- Backup and Restore required stability & retry capacity
- Backup and Restore required degree of automation
- Backup monitoring and optimizing capacities
- Backup and Restore required performance (i.e. need for parallel processing)
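Beyond what the Bulk API retries for you, a backup runner can wrap each step in its own retry logic. A minimal sketch of retry with exponential backoff (the attempt count and delays are placeholders; the injectable `sleep` just makes the policy testable):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(op: Callable[[], T], attempts: int = 3, base_delay: float = 1.0,
                 sleep: Callable[[float], None] = time.sleep) -> T:
    """Run one backup step, retrying with exponential backoff on failure.

    Waits base_delay, then 2x, 4x, ... between attempts; the last failure
    is re-raised so the monitoring layer can alert on it.
    """
    for attempt in range(attempts):
        try:
            return op()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

In a real solution the except clause would distinguish retryable errors (timeouts, `REQUEST_LIMIT_EXCEEDED`) from permanent ones (bad credentials), and log every attempt for monitoring.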
Customization and Automation
With Salesforce APIs, you have control over the entire backup process. If the following factors are important for your backup process, consider a solution that provides maximum flexibility, like Salesforce APIs.
- Backup and Restore scope (files, metadata, data)
- Backup automation frequency
- Need for Backup plan personalization (mix full backups and incremental backups, give higher priority to specific objects/fields/type of records, etc.)
- Backup plan maintenance (environment change detection, support new Salesforce releases and API changes)
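As one concrete example of that API-level control, the REST API exposes a "Get Updated" resource per object that returns the Ids of records changed in a time window, which is the natural building block for incremental backups. A sketch of building that request URL (the instance URL, object name, and API version below are placeholder values; the window must fall within the recent change-tracking period, roughly the last 30 days):

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

def updated_records_url(instance_url: str, sobject: str,
                        start: datetime, end: datetime,
                        api_version: str = "v59.0") -> str:
    """Build the REST "Get Updated" URL for an incremental backup window.

    The endpoint returns the Ids of records created or updated between
    `start` and `end`; timestamps are sent as URL-encoded ISO 8601 UTC.
    """
    iso = lambda d: d.astimezone(timezone.utc).isoformat(timespec="seconds")
    qs = urlencode({"start": iso(start), "end": iso(end)})
    return f"{instance_url}/services/data/{api_version}/sobjects/{sobject}/updated/?{qs}"
```

An incremental backup job would issue this GET (with an OAuth bearer token) per object, then fetch the full record bodies for the returned Ids, and make a matching "Get Deleted" call for the same window.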