Salesforce Guide: Salesforce Data Developer Apex Best Practices
Note: This guide focuses on Apex best practices specifically for interacting with Salesforce data (SOQL, DML, SOSL). While general Apex best practices (e.g., test coverage, security) are also critical, they are outside the primary scope of this data-centric guide.
1. Introduction
In the Salesforce ecosystem, data is at the heart of everything. As a Salesforce Data Developer, you are responsible for writing Apex code that interacts with this data – querying, inserting, updating, and deleting records. Unlike traditional development environments, Salesforce operates on a multi-tenant architecture, meaning your code shares resources with thousands of other customers. This necessitates a strict set of limitations, known as Governor Limits, to ensure fair resource allocation and system stability.
Adhering to Apex best practices for data operations isn't just about writing "good" code; it's about writing efficient, scalable, and robust code that avoids hitting these governor limits, performs optimally, maintains data integrity, and provides a seamless user experience. Neglecting these practices can lead to unhandled exceptions, slow performance, data corruption, and ultimately, a poor application. This guide will walk you through the essential principles and techniques to master data manipulation with Apex.
2. Step-by-Step Instructions with Examples
2.1. Bulkification: The Golden Rule
Always assume your Apex code will handle multiple records. Operations that work for a single record will often fail for many. This principle is called bulkification.
Example 1: Avoiding SOQL/DML in Loops
This is the most critical concept. Executing SOQL queries or DML statements inside a for or while loop will quickly exhaust governor limits (e.g., 100 SOQL queries, 150 DML statements per transaction).
❌ Anti-Pattern (Bad):
// Trigger context example
for (Account acc : Trigger.new) {
    // Bad: SOQL inside a loop. (Note: assigning a zero-row query directly to a
    // single sObject would also throw a QueryException, so query into a list.)
    List<Contact> cons = [SELECT Id FROM Contact WHERE AccountId = :acc.Id LIMIT 1];
    // Bad: DML inside a loop
    if (!cons.isEmpty()) {
        cons[0].Description = 'Updated from Trigger';
        update cons[0];
    }
}
✅ Best Practice (Good):
// Step 1: Collect all necessary IDs into a Set
Set<Id> accountIds = new Set<Id>();
for (Account acc : Trigger.new) {
    accountIds.add(acc.Id);
}

// Step 2: Query all related records in a single SOQL statement
List<Contact> contactsToUpdate = [SELECT Id, Description, AccountId
                                  FROM Contact WHERE AccountId IN :accountIds];

// Step 3: Use a Map for efficient lookup (optional but recommended;
// this keeps one contact per account, mirroring the LIMIT 1 above)
Map<Id, Contact> accountIdToContactMap = new Map<Id, Contact>();
for (Contact con : contactsToUpdate) {
    accountIdToContactMap.put(con.AccountId, con);
}

// Step 4: Iterate through Trigger.new, perform logic, and collect records for DML
List<Contact> contactsForDML = new List<Contact>();
for (Account acc : Trigger.new) {
    Contact relatedContact = accountIdToContactMap.get(acc.Id);
    if (relatedContact != null) {
        // Perform logic
        relatedContact.Description = 'Updated from Bulkified Trigger - Account: ' + acc.Name;
        contactsForDML.add(relatedContact);
    }
}

// Step 5: Perform DML operation once outside the loop
if (!contactsForDML.isEmpty()) {
    update contactsForDML;
}
2.2. Query Selectivity for Large Data Volumes
For queries against objects with large data volumes (roughly 100,000+ records), performance depends on selectivity. A query is selective when its filter conditions can use an index to locate the matching rows efficiently, rather than scanning the entire object.
- A custom index is automatically created for external ID fields and unique fields.
- Standard fields often have indexes (e.g., Id, Name, CreatedDate, LastModifiedDate, Lookup/Master-Detail fields).
Example: Selective Query
// Good: Uses an indexed field (AccountId is a lookup field)
List<Contact> specificContacts = [SELECT Id, Name FROM Contact
                                  WHERE AccountId = '001XXXXXXXXXXXXXXX'];

// Good: Uses an indexed custom field
// (assuming My_External_Id__c is marked as External ID or Unique)
List<MyObject__c> myRecords = [SELECT Id, Name FROM MyObject__c
                               WHERE My_External_Id__c = 'EXT-12345'];
Considerations: Avoid filters that defeat indexes on large datasets, such as LIKE '%searchTerm%' with a leading wildcard, negative operators (e.g., !=, NOT IN) without other selective filters, or conditions on non-indexed fields in WHERE clauses.
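To illustrate the point above, here is a small sketch contrasting filters the query optimizer can and cannot serve from an index (the search terms are hypothetical):

```apex
// ❌ Non-selective: a leading wildcard cannot use the index on Name,
// forcing a full scan of the object
List<Account> slowMatches = [SELECT Id FROM Account WHERE Name LIKE '%corp%'];

// ✅ More selective: a prefix match can use the standard index on Name
List<Account> fastMatches = [SELECT Id FROM Account WHERE Name LIKE 'Acme%'];

// For genuine full-text search, prefer SOSL over a wildcard SOQL filter
List<List<SObject>> found = [FIND 'Acme' IN NAME FIELDS
                             RETURNING Account(Id, Name)];
```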
2.3. Efficient SOQL/SOSL Techniques
2.3.1. Query Only Necessary Fields
Retrieving more fields than you need, especially large text fields, inflates each record in memory and can push a transaction toward the heap size limit (6 MB for synchronous Apex, 12 MB for asynchronous).
❌ Anti-Pattern (Bad):
// Querying all fields when only Id and Name are needed
Account acc = [SELECT Id, Name, Phone, Website, AnnualRevenue, Description
               /* ... many more fields ... */
               FROM Account WHERE Id = '001XXXXXXXXXXXXXXX'];
✅ Best Practice (Good):
// Only query Id and Name if that's all you need
Account acc = [SELECT Id, Name FROM Account WHERE Id = '001XXXXXXXXXXXXXXX'];
2.3.2. Use SOQL FOR Loops for Large Result Sets
A SOQL for loop retrieves records in batches of 200, allowing you to process large query results without hitting heap limits.
// Process all active accounts without loading the full result set into memory at once
for (Account activeAccount : [SELECT Id, Name FROM Account WHERE IsActive__c = true]) {
    // Perform operations on 'activeAccount'
    System.debug('Processing Account: ' + activeAccount.Name);
    // Careful: DML/SOQL inside this loop still needs bulkification!
}
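The SOQL for loop also has a list form: iterating over a `List<Account>` yields chunks of up to 200 records at a time, which lets you perform bulk DML per chunk while keeping heap usage low. A minimal sketch, reusing the hypothetical IsActive__c field from above:

```apex
// Each iteration receives up to 200 records
for (List<Account> accountChunk : [SELECT Id, Name FROM Account
                                   WHERE IsActive__c = true]) {
    for (Account acc : accountChunk) {
        acc.Description = 'Processed in chunk';
    }
    // One DML statement per chunk of up to 200 records;
    // this still counts against the 150 DML statement limit
    update accountChunk;
}
```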
2.3.3. Understanding Relationship Queries (Parent-to-Child & Child-to-Parent)
Relationship queries are powerful, but remember the limits: child-to-parent queries can traverse up to five levels up (e.g., Contact.Account.Owner.Name), while parent-to-child subqueries reach only one level down.
Child-to-Parent:
// Query Contacts and their related Account's Name and Industry
List<Contact> contacts = [SELECT Id, Name, Email, Account.Name, Account.Industry
FROM Contact WHERE AccountId != NULL LIMIT 10];
for (Contact con : contacts) {
    System.debug('Contact: ' + con.Name + ', Account: ' + con.Account.Name +
                 ', Industry: ' + con.Account.Industry);
}
Parent-to-Child (Subquery):
// Query Accounts and their related Contacts
// (assumes accountIds is a Set<Id> built earlier)
List<Account> accountsWithContacts = [SELECT Id, Name,
                                      (SELECT Id, Name, Email FROM Contacts)
                                      FROM Account
                                      WHERE Id IN :accountIds LIMIT 5]; // Limit accounts as well
for (Account acc : accountsWithContacts) {
    System.debug('Account: ' + acc.Name);
    for (Contact con : acc.Contacts) {
        System.debug(' - Contact: ' + con.Name);
    }
}
2.4. Using Maps for Efficient Data Access
Maps are invaluable for quickly associating records by ID or other keys, especially when processing related records in bulk.
// Scenario: Process Opportunity Contact Roles for a list of Opportunities
List<Opportunity> oppsToProcess = [SELECT Id, Name, StageName FROM Opportunity
                                   WHERE StageName = 'Closed Won' LIMIT 50];
Set<Id> oppIds = new Set<Id>();
for (Opportunity opp : oppsToProcess) {
    oppIds.add(opp.Id);
}

// Query related Opportunity Contact Roles in bulk
List<OpportunityContactRole> ocrList = [SELECT Id, OpportunityId, Role
                                        FROM OpportunityContactRole
                                        WHERE OpportunityId IN :oppIds];

// Create a Map to group OCRs by OpportunityId
Map<Id, List<OpportunityContactRole>> oppIdToOcrsMap =
    new Map<Id, List<OpportunityContactRole>>();
for (OpportunityContactRole ocr : ocrList) {
    if (!oppIdToOcrsMap.containsKey(ocr.OpportunityId)) {
        oppIdToOcrsMap.put(ocr.OpportunityId, new List<OpportunityContactRole>());
    }
    oppIdToOcrsMap.get(ocr.OpportunityId).add(ocr);
}

// Now you can easily access OCRs for each Opportunity without additional queries
for (Opportunity opp : oppsToProcess) {
    List<OpportunityContactRole> relatedOcrs = oppIdToOcrsMap.get(opp.Id);
    if (relatedOcrs != null) {
        System.debug('Opportunity: ' + opp.Name + ' has ' +
                     relatedOcrs.size() + ' contact roles.');
        // Perform logic on relatedOcrs
    }
}
2.5. Robust Error Handling and Transactions
DML operations can fail. Implement proper error handling to manage exceptions and ensure data integrity.
2.5.1. Using try-catch for DML
List<Account> newAccounts = new List<Account>();
newAccounts.add(new Account(Name = 'Test Account 1'));
newAccounts.add(new Account(Name = 'Test Account 2'));

try {
    insert newAccounts;
    System.debug('Accounts inserted successfully.');
} catch (DmlException e) {
    System.debug('Error inserting accounts: ' + e.getMessage());
    // Log the error, send an email, or surface it to the user
    // e.g., ApexPages.addMessages(e);
}
2.5.2. Partial DML Success with Database Methods
Using Database.insert(records, false) allows some records to succeed even if others fail, and returns a list of Database.SaveResult objects.
List<Account> accountsToInsert = new List<Account>();
accountsToInsert.add(new Account(Name = 'Valid Account'));
accountsToInsert.add(new Account(Name = null)); // Missing required Name causes an error

Database.SaveResult[] results = Database.insert(accountsToInsert, false); // false allows partial success

for (Database.SaveResult sr : results) {
    if (sr.isSuccess()) {
        System.debug('Successfully inserted account with ID: ' + sr.getId());
    } else {
        for (Database.Error err : sr.getErrors()) {
            System.debug('Error inserting account: ' + err.getMessage());
            System.debug('Fields: ' + err.getFields());
            System.debug('StatusCode: ' + err.getStatusCode());
        }
    }
}
2.5.3. Savepoints and Rollbacks for Complex Transactions
For operations involving multiple DML statements where you need an "all or nothing" approach, use savepoints.
Savepoint sp = Database.setSavepoint();
try {
    // DML operation 1
    insert new Account(Name = 'Main Account');

    // DML operation 2 (fails: the Id is well-formed but matches no Account,
    // so the insert throws a DmlException)
    insert new Contact(FirstName = 'Test', LastName = 'Contact',
                       AccountId = '001000000000000AAA');

    // DML operation 3
    insert new Opportunity(Name = 'New Opp', StageName = 'Prospecting',
                           CloseDate = Date.today().addDays(30));
} catch (DmlException e) {
    System.debug('Transaction failed. Rolling back: ' + e.getMessage());
    Database.rollback(sp); // Roll back all DML operations since the savepoint
}
2.6. Preventing Recursive Triggers
Triggers can fire other triggers or themselves, leading to infinite loops and governor limit exceptions. Use a static variable in a helper class to prevent this.
// MyTriggerHandler.cls
public class MyTriggerHandler {
    public static Boolean hasRun = false; // Static flag

    public void onAfterUpdate(List<Account> newAccounts, Map<Id, Account> oldMap) {
        if (!hasRun) {
            hasRun = true; // Set flag to true to prevent re-entry

            // Collect the accounts whose Name changed
            Set<Id> changedAccountIds = new Set<Id>();
            for (Account acc : newAccounts) {
                if (acc.Name != oldMap.get(acc.Id).Name) {
                    changedAccountIds.add(acc.Id);
                }
            }
            if (changedAccountIds.isEmpty()) {
                return;
            }

            // Bulkified: a single query for all related contacts (no SOQL in loops)
            Map<Id, Account> newAccountsById = new Map<Id, Account>(newAccounts);
            List<Contact> contactsToUpdate = new List<Contact>();
            for (Contact con : [SELECT Id, Description, AccountId FROM Contact
                                WHERE AccountId IN :changedAccountIds]) {
                con.Description = 'Account Name changed to ' +
                                  newAccountsById.get(con.AccountId).Name;
                contactsToUpdate.add(con);
            }

            if (!contactsToUpdate.isEmpty()) {
                update contactsToUpdate;
            }
        }
    }
}
// AccountTrigger.trigger
trigger AccountTrigger on Account (after update) {
    if (Trigger.isAfter && Trigger.isUpdate) {
        new MyTriggerHandler().onAfterUpdate(Trigger.new, Trigger.oldMap);
    }
}
3. Best Practices and Common Pitfalls
3.1. Best Practices
- Bulkification is King: Always design your code to handle collections of records, not just single records.
- SOQL/DML Outside Loops: This is the most crucial rule for avoiding governor limits. Gather all data/records first, then perform operations.
- Query Selectivity: Optimize SOQL queries by using indexed fields in WHERE clauses, especially for large datasets, to ensure efficient performance.
- Leverage Maps and Sets: Use these collections for efficient lookups and to avoid repeated queries or loops when correlating data.
- Robust Error Handling: Implement try-catch blocks for DML operations. Use Database.insert(records, false) for partial success and detailed error reporting.
- Prevent Recursion: Employ static variables in trigger handler classes to prevent infinite loops.
- Query Only Necessary Fields: SOQL has no SELECT *, but avoid the equivalent (e.g., FIELDS(ALL) or listing every field). Only retrieve the fields you intend to use to conserve heap size and bandwidth.
- Delegate to Asynchronous Apex: For operations that process extremely large data volumes, involve callouts, or are non-critical, use @future methods, Queueable Apex, or Batch Apex.
- Code Reusability: Create helper classes and service layers to encapsulate data logic, making your code modular, maintainable, and testable.
- Use WITH SECURITY_ENFORCED: Apply this clause to SOQL queries to automatically enforce field-level security (FLS) and object-level security for the running user, reducing the need for manual security checks.
- Validate Inputs: Always validate data inputs before performing DML to prevent errors and ensure data quality.
- Unit Testing: Write comprehensive unit tests that cover bulk data scenarios and error conditions to ensure your data operations are robust.
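To make the security bullet above concrete, here is a minimal sketch (it assumes an accountIds set built as in the earlier examples):

```apex
// WITH SECURITY_ENFORCED makes the query respect the running user's
// field- and object-level permissions; if the user lacks access to any
// queried field or object, the query throws a System.QueryException.
Set<Id> accountIds = new Map<Id, Account>(
    [SELECT Id FROM Account LIMIT 200]).keySet();

List<Contact> visibleContacts = [SELECT Id, Name, Email
                                 FROM Contact
                                 WHERE AccountId IN :accountIds
                                 WITH SECURITY_ENFORCED];
```

On newer API versions, WITH USER_MODE offers a similar effect with more granular error reporting; either way, be prepared to catch the resulting exception where a permissions gap is a realistic runtime condition.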
3.2. Common Pitfalls
- SOQL/DML in Loops: The absolute most common cause of governor limit exceptions (e.g., "Too many SOQL queries", "Too many DML statements").
- Non-Selective Queries: Queries on large objects without selective filters can lead to performance issues and query timeout exceptions.
- Lack of Error Handling: Unhandled DML exceptions can result in partial data saves and a poor user experience.
- Recursive Triggers: Triggers firing themselves repeatedly, leading to infinite loops and governor limit breaches.
- Hardcoding IDs: Embedding Salesforce record IDs (e.g., '001XXXXXXXXXXXXXXX') directly in code is a bad practice, because IDs differ between sandboxes and production. Use custom settings, custom metadata, or query records dynamically.
- Querying Too Many Fields: Selecting every field on an object can lead to heap size limit exceptions, especially with large query results.
- Ignoring Governor Limits: Assuming your code will always work just because it's handling a small number of records in development.
- Not Considering Future Scalability: Writing code that works for current data volumes but breaks as data grows.
- Inefficient String Operations: Excessive string concatenation or manipulation on large strings can consume heap memory.
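As one example of avoiding the hardcoded-ID pitfall above, a record type Id can be resolved through the describe cache at runtime instead of being embedded in code (the 'Partner_Account' record type name is hypothetical):

```apex
// Resolve the RecordTypeId dynamically; this survives deployments
// between orgs where record Ids differ, and the describe cache
// lookup does not consume a SOQL query.
Id partnerRtId = Schema.SObjectType.Account
    .getRecordTypeInfosByDeveloperName()
    .get('Partner_Account')
    .getRecordTypeId();

Account acc = new Account(Name = 'Acme Partner', RecordTypeId = partnerRtId);
insert acc;
```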
4. Use Cases and Scenarios Where This Is Applied
These best practices are fundamental across almost all Apex development scenarios involving data, but here are some specific use cases:
- Trigger Development:
- Updating child records when a parent record changes (e.g., updating contact addresses when an account address is updated).
- Rolling up data from child to parent (e.g., calculating sum of opportunity amounts on an account).
- Performing complex data validations before records are saved.
- Batch Apex:
- Processing large datasets for data migration, cleanup, or periodic updates.
- Recalculating fields or statuses across many records on a schedule.
- Integrations (API Calls):
- Receiving data from external systems and inserting/updating multiple records in Salesforce.
- Extracting large volumes of data from Salesforce to send to an external system.
- Custom UI with Apex Controllers (Visualforce/LWC):
- Saving multiple records from a custom form or data table.
- Displaying data from multiple related objects efficiently.
- Data Migration Scripts:
- Writing Apex scripts to transform and load data from legacy systems.
- Cleaning up existing data or backfilling missing information.
- Apex Scheduled Jobs:
- Running nightly jobs to synchronize data, send scheduled reports, or perform automated tasks.
5. Final Summary / Conclusion
Mastering Salesforce data development with Apex is a continuous journey that hinges on understanding and respecting the platform's multi-tenant architecture and governor limits. By consistently applying the best practices outlined in this guide – primarily bulkification, efficient querying, robust error handling, and preventing recursion – you will write Apex code that is not only functional but also scalable, performant, and maintainable.
These practices are not merely suggestions; they are critical for building reliable applications on the Salesforce platform. They ensure your solutions can handle growing data volumes, provide a consistent user experience, and prevent common pitfalls that lead to unexpected failures. Always remember to test your code thoroughly, especially with bulk data scenarios, and to continuously refine your approach as you encounter new challenges and leverage new platform features. Embrace these principles, and you'll become a highly effective Salesforce Data Developer.