When designing a Salesforce data model, one of the first architectural decisions involves choosing the right type of relationships between objects. Lookups and Master-Detail relationships are powerful tools to link data, enforce business rules, and drive reporting. However, overusing them can cause performance, scalability, and maintenance challenges as your org grows.
Did you know?
- Salesforce allows a maximum of 40 relationship fields per object (2 Master-Detail by default, with the rest as Lookups). This limit can be raised to a maximum of 50 by opening a case with Salesforce.
- A single report type can only include relationships up to 3 object levels deep.
- Orgs with 100M+ records often face query timeouts and locking issues if too many relationships are used.
Let’s explore why this matters, with examples from real-world scenarios.
1. Performance & Scalability Issues
1.1 Query Performance
SOQL queries that join across five or more related objects often slow down drastically. Imagine a custom Transaction__c object with 15 Lookup fields to different reference tables (Branch, Customer, Region, Product, etc.). A report that pulls data from all of these related objects becomes extremely slow, especially when the table holds 50M records. Even a simple query like SELECT Id, Customer__r.Name, Region__r.Name FROM Transaction__c could time out.
Solution:
- Use Skinny Tables for frequently accessed fields.
- Denormalize data by storing high-usage attributes (Region, Segment) directly on the child object (see the query sketch below).
- Create custom indexes on filter fields to speed up queries.
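As a rough illustration (Customer_Name__c and Region_Name__c below are hypothetical denormalized copies, not fields from the original schema), compare a join-heavy query against a flat, indexed one:

```apex
// Join-heavy query: every returned row resolves two parent lookups at query time.
List<Transaction__c> joined = [
    SELECT Id, Customer__r.Name, Region__r.Name
    FROM Transaction__c
    WHERE CreatedDate = LAST_N_DAYS:30
];

// Denormalized alternative: flat fields stored on the child object, filtered
// on a custom-indexed field so the query optimizer can stay selective.
List<Transaction__c> flat = [
    SELECT Id, Customer_Name__c, Region_Name__c
    FROM Transaction__c
    WHERE Region_Name__c = 'EMEA'
    AND CreatedDate = LAST_N_DAYS:30
];
```
On large tables, the second shape usually performs far better: the optimizer can use the index on Region_Name__c and no parent joins are needed at read time.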
1.2 Record Locking
A Master-Detail relationship enforces cascading updates to child records. In high-volume transaction systems this leads to frequent “UNABLE_TO_LOCK_ROW” errors, especially when 100K+ child records point to a single parent. Suppose your Invoice__c object has a Master-Detail relationship with Account. If 100,000 invoices are linked to one Account and a bulk job changes the Account owner, Salesforce attempts to touch all 100K invoices, producing row lock (UNABLE_TO_LOCK_ROW) errors.
Solution:
- Avoid Master-Detail for very high-volume child objects; use Lookup + custom roll-ups (DLRS) instead.
- Distribute ownership to reduce skew (don’t assign all records to one owner).
- Break bulk operations into smaller batch sizes (see the batch sketch below).
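A minimal Batch Apex sketch of that last point, assuming a hypothetical Owner_Sync_Pending__c checkbox on Invoice__c that flags rows to process; the key idea is that each execute() call runs in its own transaction, so only a small slice of child rows is locked at any one time:

```apex
// Minimal sketch: process Invoice__c rows in small, independent transactions
// to reduce lock contention on a heavily referenced parent Account.
global class InvoiceOwnerSyncBatch implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id FROM Invoice__c WHERE Owner_Sync_Pending__c = true'
        );
    }

    global void execute(Database.BatchableContext bc, List<Invoice__c> scope) {
        // Only this small scope (and its parents) is locked in this transaction.
        for (Invoice__c inv : scope) {
            inv.Owner_Sync_Pending__c = false;
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}

// Run with a deliberately small scope size (e.g. 50 instead of the default 200):
// Database.executeBatch(new InvoiceOwnerSyncBatch(), 50);
```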
1.3 Data Skew
Salesforce officially warns about ownership and lookup skew when more than 10,000 child records reference the same parent; at that point inserts and updates become painfully slow. For example, a retail customer stored all orders under a single “Central Account” record for reporting convenience. With more than 1M child records pointing to one parent, inserts and updates on Order__c took several minutes, breaking integrations.
Solution:
- Distribute children across multiple parent records (see the sketch after this list).
- Use junction objects for many-to-many instead of dumping all into one parent.
- Enable Deferred Sharing Calculation for large data changes.
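The sketch below illustrates the first point: spreading new child records round-robin across a pool of “bucket” parents instead of one central Account. The Central_Bucket__c flag and field names are assumptions for illustration:

```apex
// Minimal sketch: assign each new Order__c to one of several bucket Accounts
// so no single parent accumulates enough children to cause lookup skew.
public class OrderParentDistributor {
    public static void assignBuckets(List<Order__c> newOrders) {
        List<Account> buckets = [
            SELECT Id FROM Account
            WHERE Central_Bucket__c = true
            ORDER BY Name
        ];
        if (buckets.isEmpty()) {
            return; // nothing to distribute against
        }
        for (Integer i = 0; i < newOrders.size(); i++) {
            // Round-robin keeps each parent well under the ~10,000-child
            // threshold where ownership/lookup skew starts to hurt.
            newOrders[i].Account__c = buckets[Math.mod(i, buckets.size())].Id;
        }
    }
}
```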
2. Complex Sharing & Security Model
2.1 Inherited Sharing
A Master-Detail child automatically inherits its parent’s sharing. With too many such relationships, this can unintentionally grant visibility to thousands of records. Say your Case__c object has a Master-Detail relationship to Product__c. A user with access to the Product now automatically sees all related Cases — even if Cases were meant to be restricted by region. That is unintended data exposure.
Solution:
- Use Lookup with “Controlled by Parent” sharing only when truly required.
- For sensitive objects, prefer Lookup with independent sharing rules.
- Regularly review Field-Level Security & Sharing Reports.
2.2 Complex Security Rules
Orgs with 15+ Lookups on a core object often need layered sharing rules and permission sets, and admins spend more time debugging “why can or can’t this user see this record?” scenarios. A Project__c object with 12+ Lookup fields (to Account, Contact, Department, Vendor, etc.) required admins to manage dozens of sharing rules and permission sets. Each new Lookup field created cascading impacts on visibility, confusing both admins and auditors.
Solution:
- Simplify design: collapse reference data into picklists or denormalized fields if security isn’t critical.
- Group users with Role Hierarchies instead of object-specific sharing.
- Use Enterprise Territory Management (ETM) or Criteria-Based Sharing Rules for scalable visibility.
2.3 Maintenance Risk
Adding or modifying a single Lookup can require regression testing across dozens of profiles, permission sets, and sharing rules. When a new Region__c Lookup was added to Project__c, it broke seven existing sharing rules; regression testing took three weeks and delayed a release.
Solution:
- Document security model clearly (use ERD + sharing diagram).
- Introduce new relationships in sandboxes with regression testing.
- Use Permission Set Groups instead of piling on multiple rules.
3. Reporting Limitations
3.1 Standard Report Types
Salesforce reports allow relationships a maximum of 3 object levels deep. If you overuse relationships, business users will constantly run into reporting gaps. For example, you can report on Account → Opportunity → Opportunity Product, but you cannot easily add a related Supplier__c at a fourth level. If your schema is full of Lookups, business users hit these limits quickly.
Solution:
- Flatten schema for reporting by denormalizing common attributes.
- Build Custom Report Types for critical cross-object use cases.
- Use Data Cloud or Analytics (Tableau CRM) for complex reporting.
3.2 Performance in Reports & Dashboards
In an org with 50M+ records, related-object reports can take minutes to run, frustrating business users. A dashboard displaying revenue by region and product took 10 minutes to load because the underlying report pulled data from five lookup objects. Users stopped trusting Salesforce for reporting and exported data to Excel, defeating the purpose.
Solution:
- Pre-aggregate data with scheduled jobs or ETL pipelines (sketched below).
- Leverage Summary fields, roll-ups, or Big Objects for historical data.
- Educate users to filter reports with indexed fields.
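To illustrate the first point, here is a rough Schedulable sketch that pre-aggregates revenue into a hypothetical flat Region_Revenue_Summary__c object that dashboards can read without any cross-object joins (all object and field names are assumptions):

```apex
// Minimal sketch: nightly rollup into a small summary table so dashboards
// don't have to join Transaction__c to its lookup parents at run time.
global class RegionRevenueRollup implements Schedulable {
    global void execute(SchedulableContext ctx) {
        List<Region_Revenue_Summary__c> summaries = new List<Region_Revenue_Summary__c>();
        for (AggregateResult ar : [
            SELECT Region_Name__c region, SUM(Amount__c) total
            FROM Transaction__c
            WHERE CreatedDate = LAST_N_DAYS:30
            GROUP BY Region_Name__c
        ]) {
            summaries.add(new Region_Revenue_Summary__c(
                Region__c = (String) ar.get('region'),
                Revenue__c = (Decimal) ar.get('total')
            ));
        }
        // Replace the previous snapshot with the fresh one.
        delete [SELECT Id FROM Region_Revenue_Summary__c];
        insert summaries;
    }
}
// Schedule nightly at 2 AM:
// System.schedule('Region revenue rollup', '0 0 2 * * ?', new RegionRevenueRollup());
```
At very large data volumes the same aggregation would need to run through Batch Apex or an external ETL pipeline to stay within query limits; the sketch only shows the shape of the approach.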
4. Data Integrity & Maintenance
4.1 Cascading Deletes
Deleting a parent with thousands of Master-Detail children triggers a massive cascade delete, straining governor limits and risking accidental data loss. Suppose your Campaign__c object has a Master-Detail relationship with Lead__c. If someone deletes the Campaign, Salesforce automatically deletes all related Leads; in one case, a marketing team lost 20,000 leads in a single click.
Solution:
- For critical objects, use Lookup instead of Master-Detail.
- Add a “restrict deletion” guard (a validation rule or a before-delete trigger, sketched below) so a parent with children cannot be removed.
- Keep archival backups (weekly export, Big Objects).
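A hedged sketch of that “restrict deletion” guard, assuming the Campaign__c/Lead__c relationship described above (with a plain Lookup instead of Master-Detail, the field’s “don’t allow deletion” option is the declarative equivalent):

```apex
// Minimal sketch: block deletion of a Campaign__c that still has Lead__c
// children, instead of letting a cascade delete wipe them out.
trigger CampaignRestrictDelete on Campaign__c (before delete) {
    Set<Id> parentsWithChildren = new Set<Id>();
    for (AggregateResult ar : [
        SELECT Campaign__c parentId
        FROM Lead__c
        WHERE Campaign__c IN :Trigger.oldMap.keySet()
        GROUP BY Campaign__c
    ]) {
        parentsWithChildren.add((Id) ar.get('parentId'));
    }
    for (Campaign__c c : Trigger.old) {
        if (parentsWithChildren.contains(c.Id)) {
            c.addError('Cannot delete this Campaign: related Lead__c records exist. Reparent or archive them first.');
        }
    }
}
```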
4.2 Orphan Records
Lookup fields that are not marked required often leave 10–15% of records orphaned in real-world orgs, damaging data quality. Suppose a Ticket__c object has an optional Lookup to Customer__c. Over time, 15% of Tickets ended up without a Customer because users skipped the field, and reports on “Tickets per Customer” became unreliable.
Solution:
- Enforce required lookups with validation rules or flows (see the sketch below).
- Run data integrity dashboards to monitor orphan rate.
- Use before-save flows to auto-populate missing lookups.
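A simple before-save guard along the lines of the first bullet; a declarative validation rule on Customer__c would achieve the same thing, and the trigger form is shown only to keep the examples in one language:

```apex
// Minimal sketch: refuse to save a Ticket__c without a Customer__c so the
// orphan rate stays at zero and "Tickets per Customer" reports stay reliable.
trigger TicketRequireCustomer on Ticket__c (before insert, before update) {
    for (Ticket__c t : Trigger.new) {
        if (t.Customer__c == null) {
            t.addError('Please link a Customer before saving this Ticket.');
        }
    }
}
```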
4.3 Harder Auditing
With 20+ relationships on a core object, a change like “make Customer__c mandatory” required checking all integrations, flows, and reports to ensure they wouldn’t fail. This auditing took dozens of hours.
Solution:
- Maintain ERDs (Entity Relationship Diagrams) in tools like Lucidchart/ERBuilder.
- Use Salesforce Schema Builder or metadata dependency APIs to analyze impacts.
- Establish a Data Governance Council to approve schema changes.
5. Design Best Practices: Normalization vs. Denormalization
5.1 Over-Normalization
While reducing redundancy, over-normalization in Salesforce creates highly fragmented data. A schema with 25+ Lookups on a core object often confuses both developers and end-users.
A team modeled their schema academically, splitting Address into Street__c, City__c, and Zip__c and linking it via Lookups to every transaction object. This created over 20 joins just to display a simple customer order in a report. The model was normalized but unusable.
Solution:
- Denormalize—keep Address as fields on the core object.
- Use Text/Formula fields instead of Lookup when reference data won’t change often.
5.2 De-Normalization
In some cases, it’s more efficient to store key information directly on an object rather than creating a chain of 10+ Lookups. For example, instead of linking an Order__c object to Customer → Region → Country through multiple Lookups, store the Region and Country fields directly on the Order__c record. This reduces query complexity, speeds up reporting, and simplifies data access; in one org it cut reporting time from 6 minutes to under 30 seconds.
Solution:
- Use denormalized fields for reporting convenience and performance (see the trigger sketch below).
- Regularly review schema against data volume forecasts.
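Here is a sketch of how that denormalization could be kept in sync at save time, assuming a custom Customer__c object that already carries Region__c and Country__c values (names are illustrative; a record-triggered flow could do the same declaratively):

```apex
// Minimal sketch: copy the customer's Region and Country onto each Order__c
// so reports can filter and group without traversing a chain of lookups.
trigger OrderDenormalizeGeo on Order__c (before insert, before update) {
    Set<Id> customerIds = new Set<Id>();
    for (Order__c o : Trigger.new) {
        if (o.Customer__c != null) {
            customerIds.add(o.Customer__c);
        }
    }
    Map<Id, Customer__c> customers = new Map<Id, Customer__c>([
        SELECT Id, Region__c, Country__c
        FROM Customer__c
        WHERE Id IN :customerIds
    ]);
    for (Order__c o : Trigger.new) {
        Customer__c c = customers.get(o.Customer__c);
        if (c != null) {
            o.Region__c = c.Region__c;
            o.Country__c = c.Country__c;
        }
    }
}
```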
Summary
Lookups and master-detail relationships are foundational to Salesforce data modeling, but like any powerful tool, they should be used thoughtfully. Overuse can cause performance bottlenecks, reporting challenges, security complexity, and maintenance headaches.
When designing your Salesforce architecture:
- Think about the volume of data and growth patterns.
- Plan for reporting and analytics needs upfront.
- Balance normalization with practical denormalization.
By following these principles, you’ll ensure your Salesforce org remains scalable, secure, and maintainable as it evolves.
Need Help in Database Modelling?
Connect with us to discuss your requirements, and we’ll handle your database modelling work.