If your org holds a large volume of transactional data in Salesforce, such as Tasks and Activities, a cheaper custom archiving solution quickly becomes attractive.
This is where Azure products and services come in: Azure Blob Storage is usually the first landing zone. From there, Azure Data Factory, Logic Apps, or Functions can load the data into Azure SQL or a data warehouse.
This post walks through:
- Creating an Azure Storage Account
- Creating a Blob Container
- Generating a SAS token (so Salesforce can write files)
- Testing upload from the Azure Portal
- Wiring up Salesforce: config + Apex callout
- Putting it together in a batch-style pattern
1. Create an Azure Storage Account
- Log in to the Azure portal.
- In the search bar, type Storage accounts and click + Create.
- On the Basics tab:
- Subscription: choose your organisation’s subscription.
- Resource group: existing or new, e.g. rg-salesforce-archive.
- Storage account name: must be unique, lowercase, no special characters (e.g. archivestorage).
- Region: ideally the same region as your other Azure services.
- Performance: Standard.
- Redundancy: LRS (Locally redundant) is often enough for archiving.
- Leave the rest as default unless your infra team has standards.
- Click Review + create, then Create.
You now have a Storage Account that exposes REST APIs for containers and blobs.
2. Create a Blob Container
Think of a container like a folder where Salesforce will drop CSV/JSON files.
- Open your new Storage Account in the Azure portal.
- In the left-hand menu, select Containers.
- Click + Container:
- Name: e.g. salesforce-archives
- Public access level: Private (No anonymous access)
- Click Create.
This container will host your archive blobs.
3. Generate a SAS Token (Shared Access Signature)
Salesforce doesn’t know your storage account key (and shouldn’t). Instead, we grant it limited permissions via a SAS token – essentially a signed URL with time-bound and permission-bound access.
3.1 Generate container SAS in the portal
- In your Storage Account, go to Containers → open salesforce-archives.
- Click Generate SAS (sometimes called Shared access tokens depending on portal UX).
- Configure:
  - Permissions:
    - Write (w)
    - Create (c)
    - (Optional) List (l) if you want Salesforce to list blobs.
  - Start time: now (or a few minutes in the past to avoid clock skew).
  - Expiry time: depends on policy – e.g. +1 year for a first version.
  - Allowed protocols: HTTPS only.
- Click Generate SAS.
You’ll get:
- Blob SAS URL (container URL with query string)
- SAS token (the query part, starting with sv=...)
Example shape:
Blob SAS URL:
https://archivestorage.blob.core.windows.net/salesforce-archives?sv=2023-08-01&ss=b&srt=sco&sp=rcw&se=2026-01-01T00:00Z&sig=...
SAS token:
sv=2023-08-01&ss=b&srt=sco&sp=rcw&se=2026-01-01T00:00Z&sig=...
We’ll use this SAS token in Salesforce to build upload URLs.
For production, consider user delegation SAS with Entra ID for stronger security, but container SAS from the portal is fine to get started.
4. Sanity Check: Upload a File From the Portal
Before pulling Salesforce into the mix, make sure Blob is working.
- Go into the salesforce-archives container.
- Click Upload.
- Choose a small file (e.g. test.txt) and upload.
- Confirm it appears in the blob list.
If this works, your Storage Account and container are correctly set up.
5. Salesforce Side – Storing the SAS and Endpoint
5.1 Decide where to store configuration
You want to avoid hard-coding secrets in Apex.
Typical pattern:
- Custom Metadata Type or Custom Settings to store:
  - Storage account name (e.g. archivestorage)
  - Container name (e.g. salesforce-archives)
  - SAS token (the query string without the leading ?)
Or you can use a Named Credential with the SAS embedded in the URL, but custom metadata is nice and flexible.
For this blog we’ll assume:
- A Custom Metadata Type Azure_Storage_Config__mdt with a single record Default:
  - Storage_Account__c = archivestorage
  - Container_Name__c = salesforce-archives
  - Sas_Token__c = sv=2023-08-01&ss=b&srt=sco&sp=rcw&se=2026-01-01T00:00Z&sig=...
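As a side note, if you'd rather not query the record with SOQL, custom metadata can also be read via getInstance. A minimal sketch, assuming the Azure_Storage_Config__mdt record described above exists:
// Alternative to a SOQL query: read the custom metadata record directly
Azure_Storage_Config__mdt cfg = Azure_Storage_Config__mdt.getInstance('Default');
System.debug(cfg.Storage_Account__c + ' / ' + cfg.Container_Name__c);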
5.2 Remote Site or Named Credential
You must allow callouts to Azure:
- Either add a Remote Site Setting:
  - Name: Azure_Storage
  - URL: https://archivestorage.blob.core.windows.net
- Or configure a Named Credential (unauthenticated) with that URL and then use callout:Azure_Storage in Apex.
For simplicity, we’ll assume Remote Site + direct URL.
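If you later switch to the Named Credential option, only the endpoint construction changes. A minimal sketch, assuming a Named Credential called Azure_Storage that points at https://archivestorage.blob.core.windows.net (the container, blob name, and SAS token below are placeholders):
// Sketch: building the PUT endpoint via a Named Credential instead of a hard-coded host
String containerName = 'salesforce-archives';
String blobName = 'test-upload.csv';
String sasToken = 'sv=...'; // SAS token from section 3 (placeholder)

HttpRequest req = new HttpRequest();
req.setMethod('PUT');
req.setHeader('x-ms-blob-type', 'BlockBlob');
req.setEndpoint('callout:Azure_Storage/' + containerName + '/' + blobName + '?' + sasToken);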
6. Apex: Uploading a CSV to Azure Blob via SAS
Azure’s REST API for Put Blob requires the x-ms-blob-type header and supports BlockBlob, AppendBlob, or PageBlob. For simple file uploads, BlockBlob is standard.
6.1 Helper class: AzureBlobArchiver
This helper:
- Reads config from custom metadata
- Builds the full SAS URL
- Uploads the content as a Block Blob
public with sharing class AzureBlobArchiver {

    private class Config {
        String storageAccount;
        String container;
        String sasToken;
    }

    // Read the storage account, container and SAS token from custom metadata
    private static Config getConfig() {
        Azure_Storage_Config__mdt cfg = [
            SELECT Storage_Account__c, Container_Name__c, Sas_Token__c
            FROM Azure_Storage_Config__mdt
            WHERE DeveloperName = 'Default'
            LIMIT 1
        ];
        Config c = new Config();
        c.storageAccount = cfg.Storage_Account__c;
        c.container = cfg.Container_Name__c;
        c.sasToken = cfg.Sas_Token__c;
        return c;
    }

    public static void uploadCsv(String fileName, String csvBody) {
        Config cfg = getConfig();

        // Encode the file name for use in the URL path
        String encodedName = EncodingUtil.urlEncode(fileName, 'UTF-8')
            .replace('+', '%20'); // urlEncode uses + for spaces; blob paths expect %20

        String endpoint = 'https://' + cfg.storageAccount + '.blob.core.windows.net/' +
            cfg.container + '/' + encodedName +
            '?' + cfg.sasToken;

        HttpRequest req = new HttpRequest();
        req.setMethod('PUT');
        req.setEndpoint(endpoint);
        req.setHeader('x-ms-blob-type', 'BlockBlob');
        req.setHeader('Content-Type', 'text/csv');
        req.setBody(csvBody);

        Http http = new Http();
        HttpResponse res = http.send(req);

        if (res.getStatusCode() < 200 || res.getStatusCode() >= 300) {
            System.debug('Azure Blob upload failed: ' + res.getStatus() + ' - ' + res.getBody());
            throw new CalloutException('Azure Blob upload failed: ' + res.getStatus());
        }
    }
}
6.2 Quick test from Anonymous Apex
String csv = 'Id,Name\n001xx000003DhpX,Example Account';
AzureBlobArchiver.uploadCsv('test-upload.csv', csv);
If it succeeds, refresh the container in the Azure portal – you should see test-upload.csv.
7. Putting It Together With Batch Apex (Archiving Tasks)
Now let’s show a realistic pattern: archive old Tasks to Blob in batches.
7.1 Batch Apex skeleton
global class TaskArchiveBatch implements Database.Batchable<SObject>, Database.AllowsCallouts {

    global Database.QueryLocator start(Database.BatchableContext ctx) {
        return Database.getQueryLocator([
            SELECT Id, Subject, Status, ActivityDate, OwnerId, CreatedDate, LastModifiedDate
            FROM Task
            WHERE ActivityDate < :Date.today().addMonths(-6)
        ]);
    }

    global void execute(Database.BatchableContext ctx, List<Task> scope) {
        String csvBody = buildCsv(scope);

        // One file per batch chunk: date + timestamp + start of the job Id for uniqueness
        String fileName =
            'task-archive-' +
            Date.today().format() + '-' +
            String.valueOf(System.currentTimeMillis()) + '-' +
            String.valueOf(ctx.getJobId()).substring(0, 8) +
            '.csv';

        AzureBlobArchiver.uploadCsv(fileName, csvBody);

        // Optional: delete the tasks *after* you’re confident in the archive
        // delete scope;
    }

    global void finish(Database.BatchableContext ctx) {
        // Log results, send email, etc.
    }

    private String buildCsv(List<Task> scope) {
        List<String> rows = new List<String>();
        rows.add('Id,Subject,Status,ActivityDate,OwnerId,CreatedDate,LastModifiedDate');
        for (Task t : scope) {
            List<String> cols = new List<String>{
                escapeCsv(t.Id),
                escapeCsv(t.Subject),
                escapeCsv(t.Status),
                escapeCsv(String.valueOf(t.ActivityDate)),
                escapeCsv(String.valueOf(t.OwnerId)),
                escapeCsv(String.valueOf(t.CreatedDate)),
                escapeCsv(String.valueOf(t.LastModifiedDate))
            };
            rows.add(String.join(cols, ','));
        }
        return String.join(rows, '\n');
    }

    private String escapeCsv(String value) {
        if (value == null) return '';
        Boolean mustQuote = value.contains('"') || value.contains(',') || value.contains('\n');
        String result = value.replace('"', '""');
        return mustQuote ? '"' + result + '"' : result;
    }
}
Then schedule it with a Schedulable or via the UI to run nightly/weekly.
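For example, a small Schedulable wrapper could look like the sketch below (the class name, batch size and cron expression are illustrative only):
global class TaskArchiveScheduler implements Schedulable {
    global void execute(SchedulableContext ctx) {
        // A scope of 200 keeps each CSV small; tune to your data volumes
        Database.executeBatch(new TaskArchiveBatch(), 200);
    }
}

// Schedule from Anonymous Apex, e.g. every night at 02:00:
// System.schedule('Nightly Task Archive', '0 0 2 * * ?', new TaskArchiveScheduler());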
Full source can be downloaded from GitHub here.
8. Where to Go Next
Once Blob uploads are working from Apex, you can:
- Use Azure Data Factory or Logic Apps to automatically load those CSVs into Azure SQL, using a Copy activity.
- Add APIM or an Azure Function as a front-door later if you want to hide SAS usage from Salesforce completely.
- Build Power BI or other reporting on top of Azure SQL / Synapse later.