Salesforce data grows as the number of transactions, customer sales and service records, leads, opportunities, and other records increases. As data volume increases, queries take longer to execute, governor limits are reached more quickly (leading to errors and reduced functionality), and page loads and reports slow down. This post explains several ways to query large data sets efficiently.
1. SOQL for Loops
Initially, a Salesforce org has very few records, so a SOQL query returns quickly. As the organization grows, these records keep increasing with customer data, which can lead to governor limit errors during SOQL query execution. As an example, assume the organization uses the Case object to store all support requests.
List<Case> cases = [SELECT Id, Status, Subject FROM Case];
for (Case cs : cases) {
    // DML code here
}
The query above returns few records initially, but after years of application usage the record count will exceed 50,000 and the query itself will start throwing a limit exception. Even when the query succeeds, running DML inside the loop for each record will hit other limits.
When we need a loop to iterate over and work on the records, we can use the approach below instead. The following SOQL for loop retrieves records in batches, which keeps heap usage low, and is suitable when only local variable assignment or computation is required.
for (Case c : [SELECT Id, Status, Subject FROM Case WHERE Subject LIKE 'Inquiry%']) {
    // Your code without DML statements here
}
When the loop also needs to perform DML, iterate over lists of records so that the records are processed in batches and the DML runs once per batch instead of once per record.
for (List<Case> cases : [SELECT Id, Status, Subject FROM Case WHERE Subject LIKE 'Inquiry%']) {
    for (Case c : cases) {
        // Your code here
    }
    update cases;
}
If we expect this SOQL query to hit a governor limit, it is better to use Batch Apex for such operations. Refer to the post Optimizing Loop in Apex Code to learn more about how to use loops efficiently.
2. Use Selective Query
Selective filters reduce the number of records returned by a query, resulting in faster query execution times. Salesforce can retrieve relevant data more quickly by narrowing the dataset, which improves the application’s overall performance. By limiting the number of records returned, selective filters also reduce memory consumption and help avoid governor limits. This is critical in avoiding heap size limits, which can result in errors and performance degradation.
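As a minimal sketch (the status value and date range are illustrative), compare an unfiltered query with one that narrows the result set:
// Non-selective: retrieves every Case record in the org
List<Case> allCases = [SELECT Id, Subject FROM Case];

// Selective: filters (CreatedDate is a standard indexed field)
// return only a small, relevant subset of records
List<Case> recentOpenCases = [
    SELECT Id, Subject
    FROM Case
    WHERE Status = 'New'
    AND CreatedDate = LAST_N_DAYS:30
];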
Check out the post Optimize SOQL Filter in Apex Code to learn more about selective filters in Salesforce Apex.
3. Avoid Unwanted Fields in SOQL
Avoid unwanted fields in SOQL and query only the required fields. This reduces the amount of data retrieved and processed. Sometimes we include unwanted fields, and when we perform a DML operation it may fire a trigger that should not fire. This adds extra time to the operation and might cause other governor limit exceptions.
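For example (the fields shown are illustrative), query only the fields the logic actually uses:
// Avoid: retrieves fields that are never read
// List<Case> cases = [SELECT Id, Status, Subject, Description, Origin, Priority FROM Case WHERE Status = 'New'];

// Better: only the fields the logic needs
List<Case> cases = [SELECT Id, Subject FROM Case WHERE Status = 'New'];
for (Case c : cases) {
    System.debug(c.Subject);
}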
Try to change existing code where unwanted fields are mentioned. If it is hard to change the existing codebase, then at least avoid them in new code.
4. Use Relationship Query Wisely
Relationship queries allow us to traverse relationships between objects (for example, fetching Contacts associated with an Account) in a single query. This helps to minimize the query overhead and improve query execution speed.
We can improve performance by efficiently processing related data in a single query, particularly when dealing with large datasets. It minimizes database round-trips and improves data retrieval, resulting in faster execution and better resource utilization.
Let us take an example: we want to process accounts and their related contacts. The code below, a sketch with an illustrative filter, uses a separate contact query per account and fetches the data inefficiently.
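// Inefficient sketch (filter is illustrative): a separate contact query per account.
// SOQL inside a loop quickly reaches the limit of 100 queries per transaction.
List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = 'Technology'];
for (Account acc : accounts) {
    List<Contact> contacts = [SELECT Id, LastName FROM Contact WHERE AccountId = :acc.Id];
    for (Contact con : contacts) {
        // process each related contact here
    }
}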
We can query the accounts and their related contacts efficiently with a single relationship query, as sketched below.
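// Relationship query sketch (same illustrative filter): the nested subquery
// returns each account's contacts in a single SOQL statement.
List<Account> accounts = [
    SELECT Id, Name,
        (SELECT Id, LastName FROM Contacts)
    FROM Account
    WHERE Industry = 'Technology'
];
for (Account acc : accounts) {
    for (Contact con : acc.Contacts) {
        // process each related contact here
    }
}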
5. Use LIMIT to Restrict the Number of Records Returned
Salesforce imposes a variety of governor limits to ensure that resources are used efficiently in a multitenant environment. Querying too many records can exceed these limits, resulting in runtime exceptions. Limiting the number of records returned can improve query performance by reducing processing time and resource usage.
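A minimal sketch (the filter and limit value are illustrative): cap the result set when only a subset of records is needed.
// Return at most 100 of the newest matching cases instead of the full dataset
List<Case> recentCases = [
    SELECT Id, Subject
    FROM Case
    WHERE Status = 'New'
    ORDER BY CreatedDate DESC
    LIMIT 100
];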
6. Use Batch Apex
If we have a large number of records to process, we can use an asynchronous process such as Queueable Apex or Batch Apex. These processes are suited to different scenarios. Batch Apex breaks the processing into smaller chunks, so it can handle a large number of records.
The Batch Apex sketch below (the class name and the contact update are illustrative) processes all accounts and their associated contacts. The system divides the records into chunks and processes each chunk asynchronously.
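public class AccountContactBatch implements Database.Batchable<SObject> {

    // start: define the full set of accounts to process
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }

    // execute: called once per chunk of accounts (200 records by default)
    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // One query per chunk for the related contacts
        List<Contact> contactsToUpdate = [
            SELECT Id, Description, Account.Name
            FROM Contact
            WHERE AccountId IN :scope
        ];
        for (Contact con : contactsToUpdate) {
            // Illustrative processing: stamp the parent account name on the contact
            con.Description = 'Account: ' + con.Account.Name;
        }
        if (!contactsToUpdate.isEmpty()) {
            update contactsToUpdate;
        }
    }

    // finish: post-processing such as logging or sending a notification
    public void finish(Database.BatchableContext bc) {
    }
}
The job can be started with Database.executeBatch(new AccountContactBatch(), 200);, where 200 is the chunk size.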
Refer to the post Avoid Batch Apex and Use Queueable Class to learn more about Batch Apex and Queueable jobs.
7. Use SOSL for Text Searches
SOSL (Salesforce Object Search Language) should be used to search text in Salesforce Apex for large datasets. It allows for quick full-text searches across multiple objects and fields. SOSL queries can return data from multiple objects and are designed for search operations, making them an effective tool for dealing with large datasets.
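A minimal sketch (the search term and returned fields are illustrative): a single SOSL search across multiple objects.
// One SOSL search returns matches from Account, Contact, and Case together
List<List<SObject>> results = [
    FIND 'Acme'
    IN ALL FIELDS
    RETURNING Account(Id, Name), Contact(Id, LastName), Case(Id, Subject)
];
List<Account> accounts = (List<Account>) results[0];
List<Contact> contacts = (List<Contact>) results[1];
List<Case> cases = (List<Case>) results[2];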
8. Use Custom Indexes
Custom indexes significantly reduce the time required to search and retrieve records from large datasets. Without indexes, queries might require full table scans, which are resource-intensive and slow. Custom indexes help to ensure that queries are selective, meaning they return a smaller subset of records rather than the entire dataset. Salesforce has a threshold for query selectivity, and non-selective queries can cause performance issues. Custom indexes are created by the Salesforce support team, so when one is required we need to raise a request with complete details.
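As an illustration, External_Reference__c below is a hypothetical custom field that is filtered frequently, making it a typical candidate for a custom index request.
// Without an index on External_Reference__c this filter can force a full scan
// of a large Case table; with a custom index it becomes a selective lookup.
List<Case> cases = [
    SELECT Id, Subject
    FROM Case
    WHERE External_Reference__c = 'REF-001245'
];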
Check out our post Optimize SOQL Filter in Apex Code to understand selective queries in Salesforce.
Related Posts
- Optimizing Salesforce Apex Code
- Optimizing Loop in Apex Code
- Handle Heap Size for Apex Code Optimization
- Enhance Apex Performance with Platform Caching
- Apex Trigger Code Optimization
- Optimize Code by Disabling Debug Mod
- Optimize Apex Code by Metadata Caching