Salesforce Certified Data Architecture Practice Test


Prepare for the Salesforce Certified Data Architecture Test. Access comprehensive flashcards and multiple choice questions, each with hints and explanations. Get exam-ready today!

Each practice test/flash card set has 50 randomly selected questions from a bank of over 500. You'll get a new set of questions each time!



Which solution should be recommended to manage performance and storage issues for historical case records?

  1. Export data out of Salesforce to flat files

  2. Create a custom object to store case history

  3. Leverage on-premise data archival

  4. Leverage big object to archive case data

The correct answer is: Leverage on-premise data archival

Leveraging on-premise data archival addresses performance and storage issues by moving infrequently accessed data out of the Salesforce environment while keeping it available for reporting and compliance. Historical case records can be stored in an environment where storage is managed more cost-effectively, and the live Salesforce instance stays responsive because it holds only the most relevant, currently needed data. By archiving historical data off-platform, organizations follow data-management best practices while still being able to access that data through appropriate systems when needed.

The other options are less effective at managing both performance and storage:

  1. Exporting data to flat files may relieve storage in the moment, but it creates challenges with data accessibility, integrity, and consistency.

  2. Creating a custom object to store case history can lead to further complications, since it does not optimize performance or storage without a structured data-lifecycle approach.

  3. Big objects can help with storage, but they suit specific use cases and may not integrate seamlessly with standard reporting requirements, making data access more challenging for users.
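The core of any archival approach, on-premise or otherwise, is deciding which records are old enough to leave the live org. A minimal Python sketch of that selection step is below; the record dictionaries and the `partition_for_archive` helper are illustrative assumptions, not a Salesforce API (in a real org the records would come from a SOQL query or Bulk API export).

```python
from datetime import date, timedelta

# Hypothetical closed-case records; real data would come from a
# SOQL query or a Bulk API export, not an in-memory list.
cases = [
    {"Id": "500A1", "ClosedDate": date(2018, 3, 14), "Status": "Closed"},
    {"Id": "500B2", "ClosedDate": date(2024, 1, 5), "Status": "Closed"},
    {"Id": "500C3", "ClosedDate": None, "Status": "Open"},
]

def partition_for_archive(records, retention_days=730, today=None):
    """Split records into (archive, keep) using a retention window.

    Cases closed before the cutoff are candidates for off-platform
    archival; open cases (no ClosedDate) are always kept live.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    archive, keep = [], []
    for rec in records:
        closed = rec.get("ClosedDate")
        if closed is not None and closed < cutoff:
            archive.append(rec)
        else:
            keep.append(rec)
    return archive, keep

archive, keep = partition_for_archive(cases, today=date(2025, 1, 1))
print([r["Id"] for r in archive])  # → ['500A1']
```

The same partition logic applies whether the archive target is an on-premise database or a big object; only the destination of the `archive` batch changes.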