Exploring Big Objects in Salesforce

I recently embarked on a journey through the landscape of Salesforce, tasked with storing specific field values from the Opportunity and Campaign objects every time a Contact gets linked to them.

My solution? Big Objects.

Their capacity to store vast amounts of data without affecting the standard data limit made them an obvious choice. If, like me, you’re new to Big Objects, allow me to share my insights.



Understanding Big Objects
Big Objects in Salesforce are built for managing massive volumes of data.

They deliver consistent query performance whether you store millions or billions of records.

They’re ideal for archiving Salesforce data or even housing data from external systems. Under the hood, Big Objects are built on well-established big data technologies, including Apache HBase and Apache Phoenix.

  • Standard Big Objects: These are pre-included by Salesforce, like the FieldHistoryArchive, which is a part of the Field Audit Trail. They come as default in all editions and aren’t customizable.
  • Custom Big Objects: As the name suggests, these are tailor-made by Salesforce users. For instance, they can be employed to retain Order transaction information for compliance or auditing purposes for long durations.

Distinguishing Big Objects from Regular Salesforce Objects
There are a few key differences:

  • Data Infrastructure: Big Objects use a horizontally scalable distributed database based on cutting-edge big data technologies. In contrast, Salesforce objects lean on relational databases, which could be overwhelmed beyond millions of records.
  • Permission Levels: Big Objects focus on Object and Field permissions, excluding sharing capabilities. Regular Salesforce objects encompass both.
  • UI Differences: Big Objects lean more towards custom Salesforce Lightning components or Visualforce pages, unlike Salesforce objects that cater to standard UI elements.
  • Record Creation: Big Object writes are idempotent — inserting a record whose index (primary key) values match an existing record updates that record rather than creating a duplicate. For regular Salesforce objects, each insertion attempt creates a new record.
  • Limitations: Big Objects don’t work with features like Flows, Triggers, Processes, or the Salesforce mobile app. Moreover, they aren’t exposed through the standard REST sObject API, and they can’t be accessed from another org via Salesforce Connect external objects. Lastly, Big Objects do not offer encryption.

Harnessing Big Objects
Creating a custom Big Object can be done through Setup or the Metadata API.

After defining fields, the next crucial step is indexing. This is paramount since the index stands as the composite primary key, aiding in data queries. However, be wary: post-creation, indexes can’t be modified or removed.
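To make the Metadata API route concrete, here is a minimal sketch of a custom Big Object definition. The object and field names (Contact_Link_History__b, Contact__c, Linked_Date__c) are hypothetical examples for the Contact-linking scenario above, not names from any real org; note that every field included in the index must be marked required.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical example: Contact_Link_History__b.object -->
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <fields>
        <fullName>Contact__c</fullName>
        <label>Contact</label>
        <referenceTo>Contact</referenceTo>
        <relationshipName>ContactLinkHistories</relationshipName>
        <required>true</required>
        <type>Lookup</type>
    </fields>
    <fields>
        <fullName>Linked_Date__c</fullName>
        <label>Linked Date</label>
        <required>true</required>
        <type>DateTime</type>
    </fields>
    <!-- The index is the composite primary key; it cannot be changed later -->
    <indexes>
        <fullName>ContactLinkIndex</fullName>
        <label>Contact Link Index</label>
        <fields>
            <name>Contact__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Linked_Date__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
    </indexes>
    <label>Contact Link History</label>
    <pluralLabel>Contact Link Histories</pluralLabel>
</CustomObject>
```

The order of the `fields` entries inside `indexes` matters: SOQL filters against a Big Object must follow that same field order.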

Best Practices with Big Objects
Considering the vastness of data in Big Objects, certain practices can ensure smooth operations:

  • Retry Mechanism: When writing records via the API or batch Apex, a retry mechanism lets you recover from failed insert attempts instead of silently losing data.
  • Asynchronous Apex: Writing to a Big Object in the same transaction as standard-object DML can trigger mixed DML errors; moving the Big Object write into asynchronous Apex keeps reads and writes on standard objects working smoothly.
  • Exception Logging: Create custom objects to log anomalies or errors, providing clarity on any issues.
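The first and third practices can be combined into one small routine. The sketch below is an illustrative Apex example, assuming the hypothetical Contact_Link_History__b Big Object from earlier and an assumed custom logging object named Big_Object_Error_Log__c:

```apex
// Sketch: retry failed Big Object inserts, then log whatever still fails.
// Contact_Link_History__b and Big_Object_Error_Log__c are assumed names.
public class BigObjectWriter {
    public static void writeWithRetry(List<Contact_Link_History__b> records) {
        List<Contact_Link_History__b> pending = records;
        Integer attempts = 0;
        while (!pending.isEmpty() && attempts < 3) {
            attempts++;
            List<Database.SaveResult> results = Database.insertImmediate(pending);
            List<Contact_Link_History__b> failed = new List<Contact_Link_History__b>();
            for (Integer i = 0; i < results.size(); i++) {
                if (!results[i].isSuccess()) {
                    failed.add(pending[i]);
                }
            }
            pending = failed; // retry only the failures
        }
        if (!pending.isEmpty()) {
            // Exception logging via a custom object (assumed schema)
            insert new Big_Object_Error_Log__c(
                Message__c = 'Big Object insert failed for '
                    + pending.size() + ' record(s) after 3 attempts'
            );
        }
    }
}
```

Because inserts are idempotent on the index, retrying a record that actually succeeded is harmless — it simply overwrites the same row.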

Interacting with Big Object Data
Given the vast data storage, direct report generation isn’t practical. Async SOQL queries have long been the lifeline here, but be mindful: they’re set for retirement in the Summer ’23 release. Alternatives include Bulk API 2.0 queries and batch Apex.
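A batch Apex read might look like the following sketch (again assuming the hypothetical Contact_Link_History__b object). Keep in mind that SOQL filters on a Big Object must use the index fields, in the order they were defined:

```apex
// Sketch: reading Big Object rows in bulk via batch Apex (assumed schema).
public class ContactLinkHistoryBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Contact__c, Linked_Date__c FROM Contact_Link_History__b'
        );
    }
    public void execute(Database.BatchableContext bc,
                        List<Contact_Link_History__b> scope) {
        // Process each archived row here, e.g. copy a summary into a
        // regular custom object that reports and dashboards can use.
    }
    public void finish(Database.BatchableContext bc) {}
}
```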

For writing into a Big Object, options like the Bulk API and Apex’s Database.insertImmediate() method come in handy. Always validate and sanitize your inputs before writing, since the index values become the record’s permanent primary key. For deletions, Apex and the SOAP API serve well, with methods like Database.deleteImmediate().
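Putting those write and delete methods together, here is a brief hedged sketch using the same assumed Contact_Link_History__b schema:

```apex
// Sketch: writing and deleting Big Object rows (assumed object/field names).
Contact_Link_History__b row = new Contact_Link_History__b(
    Contact__c   = someContactId,   // hypothetical Contact Id variable
    Linked_Date__c = System.now()
);
Database.SaveResult sr = Database.insertImmediate(row);

// Deletion: query the rows first, filtering on the indexed fields.
List<Contact_Link_History__b> toDelete = [
    SELECT Contact__c, Linked_Date__c
    FROM Contact_Link_History__b
    WHERE Contact__c = :someContactId
];
Database.deleteImmediate(toDelete);
```

Note that both insertImmediate and deleteImmediate run outside the normal transaction, which is part of why the asynchronous-Apex practice above matters.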


Navigating the vast seas of Salesforce, it’s apparent that Big Objects offer a robust solution for managing enormous data sets efficiently. Their design, backed by proven big data technologies, ensures they’re both scalable and reliable. Whether you’re archiving historical data or streamlining business processes, harnessing the power of Big Objects can redefine your Salesforce experience.

As technology continues to evolve, staying updated and adaptable is the key.

Feel free to reach out to us if you have any questions or insights to share. Here’s to a future with better data management, one Big Object at a time!