“It’s all in the cloud, isn’t it?” is the biggest refrain I hear from nearly every newly acquired customer I look after. But Salesforce is not like Google. It is a cloud-based service, but that doesn’t mean it hoovers up all your data and stores it whether you like it or not. Backups are a choice.
Still, it’s mainly a point-and-click platform. How bad can things really get?
Read on to find out what 99% of people get wrong about Salesforce’s (Weekly) Data Export, along with some suggestions for paid-for products.
Bonus note: Salesforce used to offer an emergency backup facility but this cost upwards of $20k as and when required, sometimes took weeks to perform, and it was withdrawn a few years ago anyway. Since then they’ve introduced a new product which I discuss below.
The problem is that humans are involved. Despite the best of intentions we make errors: whether it’s typos, misunderstandings, not thinking through or the situation changing dramatically once we’ve taken a course of action, they happen to all of us. Especially typos.
So, what to do? Decide what data needs to be backed up, and how frequently.
Note: I’m not covering how to restore. If you need to restore and don’t know how, speak to a Salesforce consultant/practitioner; at least you’ll have the data, and they’ll be able to do something with it!
What to back up
There are two types of data you need to think about.
Normal “input” data*
*there is probably a nicer name for this, but I can’t recall it at the moment, so needs must!
This is the stuff we enter manually day after day, insert via Data Loader or via API. If you make a simple typo it’s (usually) easy enough to spot and fix, but if you overwrite a number field that doesn’t have tracking enabled, you could be in a sticky situation. And a simple Data Loader operation gone wrong can overwrite or delete masses of data in seconds. So, yes, this data needs to be backed up.
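One cheap insurance policy against a Data Loader or API mishap is to snapshot the fields you are about to overwrite before you run the update, so the old values can be re-loaded if it goes wrong. Here is a minimal sketch in Python; the function name is my own illustration, and the records could come from any query or export:

```python
import csv

def snapshot_records(records, fields, out_path):
    """Write a CSV snapshot of records you are about to overwrite, so a
    botched bulk update can be reversed by re-loading this file.
    `records` is a list of dicts (e.g. rows from a SOQL query/export);
    only the named `fields` are captured."""
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for rec in records:
            # Restrict each row to the fields being changed; missing
            # values are written as empty strings.
            writer.writerow({k: rec.get(k, "") for k in fields})
    return out_path
```

Run it once before the update, stash the CSV somewhere safe, and you have a targeted, per-operation backup even if your org-wide backup is only weekly.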
Metadata
This is the business logic that is the gold dust Salesforce is based on. Whether it’s page layouts, fields, Flow or Apex, without this your Salesforce is a Microsoft Access table from 1990. But get one changeset wrong and you can wipe it all back to scratch, or at least back to a prior state from three months ago. I’ve seen it happen, and hearing a developer say the equivalent of “Houston, we have a problem” isn’t something I wish on anyone.
When to back up
Ok, so now we know what we’re backing up. The next question is how often:
If you are only tracking opportunities (a bit of a waste of Salesforce, but it happens) then perhaps a weekly backup is fine. But, even then, I think this is suitable for only a few organisations.
For anyone else, we must be talking daily. Who’s going to remember which Accounts, Contacts, Opportunities and any number of custom objects were edited yesterday, let alone seven days ago? Especially if the data is arriving in Salesforce via webform or an integration; not even by manual intervention from anyone within your company.
What are the options?
I’m going to name names! All prices correct as of April 2023.
BRONZE STANDARD: Weekly is Fine
Salesforce Export: Salesforce’s own export utility.
Input Data: This tool creates a manual backup of all your data, but restoring requires manual effort; there’s no automation to help you. Also beware the scheduled weekly export option: it does not update its object list automatically, so any new objects (whether ones you create or ones added via a managed package) will not be included in the automatic weekly backup.
Metadata: Salesforce Export does not cover metadata, but the workaround is to regularly create sandboxes instead as that does copy all metadata. Just beware that if no one logs into a sandbox for 180 days it gets automatically deleted.
Final points: any data backed up is only as secure as where you store it (it contains everything, so it should only go where Admins alone have access), and the backup frequency is realistically weekly at best.
Instructions on how to use Salesforce’s Export follow in the Appendix at the end of this article.
SILVER STANDARD: Input Data is important, but budget is limited
CloudAlly
Cost: $3 per user per month, with no minimum spend. 44% discount for nonprofits!
Backed up to EU AWS servers (other regions also available), and secured by Multi Factor Authentication (only if you turn it on though!), your data is backed up automatically – and stored in perpetuity – without having to think about it.
A metadata backup is included in the service, but I’ve found it impractical/immature for a number of reasons: it lacks granularity, and certain types of metadata (e.g. page layouts) struggle where fields have been deleted. I would recommend going with the sandbox workaround, as listed under Salesforce Export above, to be on the safe side.
Advanced features include data/metadata comparing and sandbox seeding, although I haven’t tried these.
GOLD STANDARD: Input Data and Metadata
I’m only listing the options where the pricing information is publicly available. For more choices please see this SalesforceBen article.
Gearset Backup
Cost: $2.50 per user per month, with a monthly minimum spend of $250. 10% discount for nonprofits.
Gearset Backup is usually purchased as part of a DevOps offering (so looking after a full build & development cycle); it also has other features included in the basic price which I haven’t investigated because it operates beyond my scale. They even offer an hourly backup service which is still cheaper than OwnBackup’s daily backup offering. The only thing I particularly note is that their sandbox seeding appears to be considerably more expensive.
OwnBackup
Cost: $3.65 per user per month, with a monthly minimum spend of $500. 15% discount for nonprofits.
This product is the word-of-mouth gold standard within the Salesforce ecosystem; marketing spend perhaps has its advantages. That said, I would strongly caution against OwnBackup’s “Enterprise”-named offering as it doesn’t include metadata restore, so I’ve listed the “Unlimited” price instead. Would you trust an operation that claims to look after your business but ignores your business processes? At least they publish their minimum price and nonprofit discount these days, whereas they didn’t previously, so it shows they do listen to the market.
n.b. If you look at daily backup retention you could also argue that CloudAlly has a better policy, but I suspect (but do not know!) that the restore features are richer on OwnBackup.
Spanning
Cost: Not published any more, but I’m including it for historical reference as the cost used to be published, and I’ve used the product.
I used to recommend Spanning; they used to be competitively priced, but when my UK-based CEO started to pull their hair out because the billing people didn’t speak to the sales people, and Spanning were sending chasing emails without accepting any UK-based methods of payment, I knew there was trouble.
Spanning has been through a few owners but has been owned by Kaseya for a number of years. Having previously spoken to some staff there, investment does not appear to be a priority.
Although it’s a robust product for my basic needs – and technically meets my criteria for “gold standard” – its glory days seem to be long past and I’m not aware of any feature development.
Salesforce’s own backup product
As with many things Salesforce, a product was announced, but it’s hard to find any further details about it. I also operate on the maxim that if I need to ask for a price, I can’t afford it. That said, I include it for completeness, and also to highlight that there are strong reasons both to use, and not to use, a backup provider that is also the vendor of the product you rely on 100%!
e.g. if Salesforce’s infrastructure takes a critical hit, it’s likely to take out the backup with it; conversely, if they don’t understand how to back up their own system, does anyone else? (Actually, on this one, the answer is probably “yes!”)
So, is that all you need to know about backups? Not if you’re serious. As with everything, it looks easy from the outside – everyone else’s job is always simple – but here are some additional considerations:
- Best bonus feature of backups: they can double as (basic) point-in-time data comparisons. You’ll often get a request to compare data over time, and a backup caters for those situations in a way that native Salesforce cannot, unless you had the foresight to run that specific snapshot in advance. If you have a backup, this is sometimes an extremely easy request to pull off!
- Can it cope with complex loops? (e.g. when an object looks up to one – or more – records from the same object)
- Data storage location is, as we’ve hinted above, a high priority for Europe-based customers.
- Restore time. Not much point in having a backup if it takes weeks to get access to the data.
- On-platform restore or off-platform? If Salesforce goes completely down, can you still access the data?
- Speaking of restores, a second bonus: restoring to a different Salesforce org is useful for populating sandboxes or even complete data migrations. Many, but not all, backup products support this as standard.
- GDPR: how do you handle “request to delete” requests? Theoretically, if someone requests you delete their data, you should delete it from your backups too(!), but perhaps you don’t need to over-engineer this. One apparently “reasonable” workaround is to keep a literal paper file listing these requests so they can be (re)fulfilled should the worst happen.
- Retention policy: If you want to be really awesome about this, GDPR says that you shouldn’t keep data for longer than necessary, so will your backups (purposely) expire after a certain period?
If you haven’t already worked it out: get your backups up and running. Anything is better than nothing.
You can do it for free with Salesforce Export, but weekly is never going to be enough if Salesforce is one of your core systems, and it’s not like it’s user-friendly in terms of restoring data.
For my money, and with no financial incentive, CloudAlly is a very cost-effective option at the more budget-constrained end of the market.
Appendix: Backing up using Data Export
So, if you are going to go down the free Data Export route, this is how to do it.
Put a weekly recurring event/reminder in your calendar called “Salesforce Backup”. Don’t schedule it for a Monday due to bank holidays/long weekends.
Within the calendar entry put the following text:
- Go to Setup | Data | Data Export
- Click “Export Now” – do not use “Scheduled Export” for reasons discussed earlier in this article
- Change the Export File Encoding to “Unicode (UTF-8)” as this will ensure the broadest support for accents etc (top row in the screenshot above)
- Tick the two boxes at the top of the page which are to do with Images and Salesforce Files (as highlighted in the screenshot above)
- Tick “Include all data” (as, again, highlighted in the screenshot above)
- Press “Start Export”
- You’ll then get an email alert when the export is ready, which can be anything between 2 and 24 hours later, depending on the size of your org.
- Make sure you click and download the zip files in a timely fashion – the link expires after 48 hours.
- Make sure you move the zip files from your downloads folder to a secure area where only Salesforce Admins can access them, otherwise all the work on Profiles and Permission Sets has been wasted!
- And if there’s loads of files and it’s all too much of a hassle… this is a clear sign you need an automated backup solution, such as one of the ones I’ve mentioned elsewhere in this guide.
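Those last two steps (downloading the zips and squirrelling them away somewhere only Admins can reach) are the easiest to fumble when an export produces lots of files. As a minimal sketch, a small Python helper could sweep the export zips out of your downloads folder into a restricted directory; the function name and folder layout here are my own illustration, not part of any Salesforce tooling:

```python
import shutil
from pathlib import Path

def move_exports(downloads: Path, secure_dir: Path):
    """Move any export zips out of the downloads folder into a
    restricted directory, tightening their file permissions.
    Returns the list of files moved."""
    secure_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for zf in sorted(downloads.glob("*.zip")):
        dest = secure_dir / zf.name
        shutil.move(str(zf), str(dest))
        dest.chmod(0o600)  # readable/writable only by the owner (POSIX)
        moved.append(dest)
    return moved
```

Adjust the `*.zip` glob if other zips land in the same folder, and remember the directory itself still needs to live somewhere access-controlled (an Admin-only network share, for example).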
That said, once you run Data Export, if it produces more than (just) two or three backup files, the cost of any eventual restore will be extremely high due to the likely complexity of stitching the files back together. Your organisation is highly likely to have the volume of data to really necessitate the Silver or Gold options. To be blunt: Data Export isn’t suitable for you.
Also add to your diary to create a new Developer Sandbox once a month. This will back up your metadata, although it is – by default – only kept for 180 days. If you log in to your developer sandbox within that period, you can keep the metadata backup as long as you want (until you run out of sandboxes!).
- Go to Setup | Sandboxes
- Click “New Sandbox”
- Name: backupXX (e.g. backup01)
- Description: “Metadata backup – will autodelete in 180 days”
- Click “Next” under the column which has “Developer” as the header
- Click “Create”
- All done – the configuration backup will stay within the Salesforce system, but a certified Salesforce Admin should know how to restore it.
If you have gone for the manual Data Export approach, seven years appears to be a good rule of thumb for UK legal records, although I’m not going to defend you in a court of law! With that in mind:
- Keep weekly backups for three months
- Keep monthly backups for a year (keep only the first backup per month, once they are three months old)
- Thereafter keep quarterly backups for seven years
- If using the Sandbox strategy to retain metadata/configuration information, I tend to keep backups made just before “big releases” (lots of changes made in one go) for two years, and let the rest auto-expire after 180 days. After two years, if people haven’t highlighted that something is broken, it is unlikely that they ever will!
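If you want to automate the pruning of old export files, the retention rules above can be sketched in a few lines of Python. This is my own illustrative interpretation of the rule of thumb, not official guidance; since exports run weekly, it approximates “first backup of the month” as any backup taken in the first seven days of a month:

```python
from datetime import date

def should_keep(backup: date, today: date) -> bool:
    """Apply the retention rule of thumb to a weekly backup taken on
    `backup`, evaluated on `today`:
    - keep every weekly backup for ~three months,
    - keep the first backup of each month for a year,
    - keep the first backup of each quarter for seven years."""
    age_days = (today - backup).days
    if age_days < 0:
        return False  # ignore future-dated files
    if age_days <= 91:  # ~three months: keep everything
        return True
    first_of_month = backup.day <= 7  # approximation for weekly exports
    if age_days <= 365 and first_of_month:
        return True
    first_of_quarter = first_of_month and backup.month in (1, 4, 7, 10)
    return age_days <= 7 * 365 and first_of_quarter
```

A wrapper that walks your backup folder and deletes anything where `should_keep` returns `False` would complete the job, but I’d run it in a dry-run mode (printing rather than deleting) for a few months first.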
Better to have it and not need it, than the alternative!
If you like this sort of content, there are more relatively random Salesforce-related items queued up. Just Subscribe (details usually on the right-hand side, but mobile views may differ), if you haven’t done so already, to make sure you don’t miss out!