To handle business requirements, we often have to do a lot of customization in the Salesforce application. Most of this customization is Apex code that runs on record DML operations: performing business logic when a record is created or updated, or running validation logic when a record is deleted. This customization adds complexity to our application, and if it is not coded well it will impact our application's performance. This post will help in optimizing Salesforce Apex code written to handle business requirements.
Below are the code optimizations that can be applied while querying data or performing DML operations in Apex code.
- DML or SOQL Inside Loops
- Limiting data rows for lists
- Disabling Debug Mode for production
- Stop Describing every time and use Caching of the described object
- Avoid .size() and use .isEmpty()
- Use Filter in SOQL
- Limit depth relationship code
- Avoid Heap Size
- Optimize Trigger
- Bulkify Code
- Foreign Key Relationship in SOQL
- Defer Sharing Rules
- Use collections like Map
- Querying Large Data Sets
- Use Asynchronous methods
- Avoid Hardcode in code
- Use Platform Caching
- Use property initializer for test class
- Unwanted Code Execution
We will discuss all the above approaches one by one in this code practice series. Let us see the first good code practice (DML or SOQL inside loops) in this post.
DML or SOQL Inside Loops:
SOQL is a query language that retrieves data from Salesforce objects, and DML is the data manipulation language that inserts, updates, or deletes records in Salesforce objects.
What problems can occur if we use DML or SOQL in loops?
Salesforce has a governor limit of 100 SOQL queries and 150 DML statements in one transaction. If we write code that executes more than 100 queries or more than 150 DML statements, we will get an error: System.LimitException: Too many SOQL queries: 101 or System.LimitException: Too many DML statements: 151. So we should code in such a way that we never reach those thresholds.
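To see how close a transaction is to these governor limits, Apex provides the built-in Limits class. A minimal sketch (the class name LimitMonitor and the method logUsage are hypothetical names chosen for illustration):

```apex
// Sketch: log consumed vs. allowed SOQL and DML usage
// for the current transaction using the Limits class.
public class LimitMonitor {
    public static void logUsage() {
        System.debug('SOQL queries used: ' + Limits.getQueries()
            + ' of ' + Limits.getLimitQueries());
        System.debug('DML statements used: ' + Limits.getDmlStatements()
            + ' of ' + Limits.getLimitDmlStatements());
    }
}
```

Calling LimitMonitor.logUsage() inside a suspect code path makes it easy to spot loops that burn through the per-transaction allowance.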
So let us see examples where we can face a limit exception, and the ways we can avoid DML/SOQL inside a loop.
Take an example: we have an Account object that holds insurance customer information. When an account record is updated, we want to update the related address records. This is a very simple requirement, and we can write the below code for it. Yes, we could do this using a Flow, but for understanding we will write code.
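A minimal sketch of such a trigger is shown below. It assumes a custom Address__c object with an Account__c lookup and a City__c field that mirrors the account's BillingCity; these names are illustrative, not from the original post.

```apex
// Anti-pattern sketch: SOQL and DML both inside a loop.
// Assumes a custom Address__c object with Account__c and City__c fields.
trigger AccountTrigger on Account (after update) {
    for (Account acc : Trigger.new) {
        // SOQL inside the loop: one query per updated account
        List<Address__c> addresses = [
            SELECT Id, City__c FROM Address__c WHERE Account__c = :acc.Id
        ];
        for (Address__c addr : addresses) {
            addr.City__c = acc.BillingCity;
            // DML inside the loop: one update statement per address record
            update addr;
        }
    }
}
```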
If we look at the above code, it works perfectly when the action is taken from the UI, such as a Lightning page, because it runs for a single record only. But suppose that later on the Account object is updated from a data import tool, or in a batch job that updates thousands of records at once. Will the above code still work?
No, it will not work; it will throw the above-mentioned System.LimitException. Why? If you look at the code carefully, we are using SOQL inside a loop, and we can only run 100 SOQL queries in one transaction. So how do we handle this SOQL limit exception?
We can resolve the SOQL limit exception by collecting all the account ids in the loop and then using those ids in a single SOQL query (line# 3-10). Now the code will not throw an exception, as we have handled the SOQL-inside-for-loop issue. But when this trigger executes from a data update tool, it will still throw a Too many DML statements exception when there are more than 150 records to update. Let us handle that exception also.
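The intermediate step can be sketched as follows (same assumed Address__c object as above): the query is moved outside the loop, but the DML still runs once per record.

```apex
// Partial fix sketch: one SOQL query, but DML is still inside the loop.
trigger AccountTrigger on Account (after update) {
    Set<Id> accountIds = new Set<Id>();
    for (Account acc : Trigger.new) {
        accountIds.add(acc.Id);
    }
    // Single query for all updated accounts
    List<Address__c> addresses = [
        SELECT Id, City__c, Account__c FROM Address__c
        WHERE Account__c IN :accountIds
    ];
    for (Address__c addr : addresses) {
        addr.City__c = Trigger.newMap.get(addr.Account__c).BillingCity;
        // DML still in a loop: fails once more than 150 records are processed
        update addr;
    }
}
```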
Instead of updating records in the loop, we can add the modified records to a collection and update them all at once (code line# 11-15). Now our code is clean with respect to DML or SOQL inside a loop: neither the SOQL query nor the DML statement is in the loop. There is still one issue in the code related to unwanted code execution, which we will see in a later post.
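Putting both fixes together, a fully bulkified sketch (still using the assumed Address__c object) issues exactly one SOQL query and one DML statement per transaction, regardless of how many accounts were updated:

```apex
// Bulkified sketch: one SOQL query and one DML statement total.
trigger AccountTrigger on Account (after update) {
    // Single query using the ids of all updated accounts
    List<Address__c> addresses = [
        SELECT Id, City__c, Account__c FROM Address__c
        WHERE Account__c IN :Trigger.newMap.keySet()
    ];
    List<Address__c> toUpdate = new List<Address__c>();
    for (Address__c addr : addresses) {
        // Modify in memory only; no DML inside the loop
        addr.City__c = Trigger.newMap.get(addr.Account__c).BillingCity;
        toUpdate.add(addr);
    }
    if (!toUpdate.isEmpty()) {
        update toUpdate; // single bulk DML statement
    }
}
```

Note that this version also uses .isEmpty() instead of .size() before the DML call, one of the other practices listed above.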
Apex code executes in an atomic transaction. When we code, we should avoid more than 100 SOQL queries or 150 DML statements in a single transaction. While coding, we should never think only about single-record processing; always write code that works for bulk record processing. We should also consider that a particular code block can be executed as a side effect of other records' insertion or update. For example, an account trigger can also be fired from a contact record update, so the number of SOQL queries can grow if we have multiple dependent objects. We will see these fixes in later posts.
This optimization technique also applies to Flow execution. If we have created a record-triggered flow, that flow will execute for each record when bulk records are processed, so a similar approach is required in Flows as well.