Universal Containers (UC) has a requirement to migrate 100 million order records from a legacy ERP application into the Salesforce Platform. UC does not have any requirements around reporting on the migrated data.
What should a data architect recommend to reduce the performance degradation of the platform?
A. Create a custom object to store the data.
B. Use a standard big object defined by Salesforce.
C. Use the standard "Order" object to store the data.
D. Implement a custom big object to store the data.
All accounts and opportunities are created in Salesforce. Salesforce is integrated with three systems:
An ERP system feeds order data into Salesforce and updates both Account and Opportunity records.
An accounting system feeds invoice data into Salesforce and updates both Account and Opportunity records.
A commission system feeds commission data into Salesforce and updates both Account and Opportunity records.
How should the architect determine which of these systems is the system of record?
A. Account and opportunity data originates in Salesforce, and therefore Salesforce is the system of record.
B. Whatever system updates the attribute or object should be the system of record for that field or object.
C. Whatever integration data flow runs last will, by default, determine which system is the system of record.
D. Data flows should be reviewed with the business users to determine the system of record per object or field.
A customer is facing locking issues when importing large data volumes of order records that are children in a master-detail relationship with the Account object. What is the recommended way to avoid locking issues during import?
A. Import Account records first followed by order records after sorting orders by OrderID.
B. Import Account records first followed by order records after sorting orders by AccountID.
C. Change the relationship to Lookup and update the relationship to master-detail after import.
D. Import Order records and Account records separately and populate AccountID in orders using batch Apex.
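The locking behavior behind this question: each Order insert locks its parent Account, so parallel batches that touch the same Account contend for the same lock. Sorting children by parent ID groups them into the same batch. A minimal sketch of that grouping (field names follow the question; the batch size is illustrative):

```python
def batches_by_parent(orders, batch_size):
    """Sort child rows by parent AccountID, then split into batches,
    so rows that lock the same parent Account land in the same batch
    instead of being spread across parallel batches."""
    ordered = sorted(orders, key=lambda o: o["AccountID"])
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]

orders = [
    {"OrderID": "O-1", "AccountID": "A-2"},
    {"OrderID": "O-2", "AccountID": "A-1"},
    {"OrderID": "O-3", "AccountID": "A-2"},
    {"OrderID": "O-4", "AccountID": "A-1"},
]

# Each batch now touches a single parent Account, minimizing lock contention.
batches = batches_by_parent(orders, batch_size=2)
```

Sorting by OrderID (option A) would not help, because orders for the same Account would still be scattered across batches.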
UC needs to load a large volume of leads into Salesforce on a weekly basis. During this process the validation rules are disabled.
What should a data architect recommend to ensure data quality is maintained in Salesforce?
A. Activate validation rules once the leads are loaded into Salesforce to maintain quality.
B. Allow validation rules to be activated during the load of leads into Salesforce.
C. Develop a custom Apex batch process to improve quality once the load is completed.
D. Ensure the lead data is preprocessed for quality before loading into Salesforce.
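Preprocessing (option D) means reproducing the disabled validation rules outside Salesforce so that only clean rows reach the weekly load. A minimal sketch, assuming illustrative rules (required LastName and Company, well-formed Email); a real job would mirror the org's actual validation rules:

```python
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def preprocess_leads(rows):
    """Split raw lead rows into loadable rows and rejects with reasons."""
    clean, rejects = [], []
    for row in rows:
        errors = []
        if not row.get("LastName"):
            errors.append("missing LastName")
        if not row.get("Company"):
            errors.append("missing Company")
        email = row.get("Email", "")
        if email and not EMAIL_RE.match(email):
            errors.append("malformed Email")
        (rejects if errors else clean).append({**row, "errors": errors})
    return clean, rejects

clean, rejects = preprocess_leads([
    {"LastName": "Ng", "Company": "Acme", "Email": "a@acme.com"},
    {"LastName": "", "Company": "Acme", "Email": "bad-email"},
])
```

Rejected rows carry their error reasons, so they can be corrected and resubmitted rather than silently dropped.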
Universal Containers (UC) is facing data quality issues where Sales Reps are creating duplicate customer accounts, contacts, and leads. UC wants to fix this issue immediately by prompting users about a record that possibly exists in Salesforce. UC wants a report regarding duplicate records. What would be the recommended approach to help UC start immediately?
A. Create an after insert and update trigger on the account, contact and lead, and send an error if a duplicate is found using a custom matching criteria.
B. Create a duplicate rule for account, lead, and contact, use standard matching rules for these objects, and set the action to report and alert for both creates and edits.
C. Create a duplicate rule for account, lead, and contact, use standard matching rules for these objects, and set the action to block for both creates and edits.
D. Create a before insert and update trigger on account, contact, and lead, and send an error if a duplicate is found using a custom matching criteria.
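A duplicate rule in "report and alert" mode (option B) flags matches without blocking the save, and the flagged record set feeds the duplicate report UC wants. As a rough illustration of that report logic, here is a sketch using a simplified matching key (lower-cased email); Salesforce's standard matching rules use fuzzier logic than this:

```python
from collections import defaultdict

def duplicate_report(records, key_field="Email"):
    """Group records sharing a matching key; return only groups with >1 record."""
    groups = defaultdict(list)
    for rec in records:
        key = (rec.get(key_field) or "").strip().lower()
        if key:
            groups[key].append(rec["Id"])
    return {k: ids for k, ids in groups.items() if len(ids) > 1}

report = duplicate_report([
    {"Id": "003A", "Email": "pat@uc.com"},
    {"Id": "003B", "Email": "PAT@uc.com"},
    {"Id": "003C", "Email": "sam@uc.com"},
])
```

The declarative duplicate rule does this matching for free; custom triggers (options A and D) would reimplement it at higher cost and without the built-in reporting.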
A manager at Cloud Kicks is importing Leads into Salesforce and needs to avoid creating duplicate records.
Which two approaches should the manager take to achieve this goal? (Choose two.)
A. Acquire an AppExchange Lead de-duplication application.
B. Implement Salesforce Matching and Duplicate Rules.
C. Run the Salesforce Lead Mass de-duplication tool.
D. Create a Workflow Rule to check for duplicate records.
Universal Containers (UC) is implementing Salesforce Sales Cloud and Service Cloud. As part of their implementation, they are planning to create a new custom object (Shipments), which will have a lookup relationship to Opportunities. When creating shipment records, Salesforce users need to manually input a customer reference, which is provided by customers, and will be stored in the Customer_Reference__c text custom field. Support agents will likely use this customer reference to search for Shipment records when resolving shipping issues. UC is expecting to have around 5 million shipment records created per year. What is the recommended solution to ensure that support agents using global search and reports can quickly find shipment records?
A. Implement an archiving process for shipment records older than five years.
B. Implement an archiving process for shipment records older than three years.
C. Set Customer_Reference__c as an External ID (non-unique).
D. Set Customer_Reference__c as an External ID (unique).
A company has 12 million records, and a nightly integration queries these records.
Which two areas should a Data Architect investigate during troubleshooting if queries are timing out? (Choose two.)
A. Make sure the query doesn't contain NULL in any filter criteria.
B. Create a formula field instead of having multiple filter criteria.
C. Create custom indexes on the fields used in the filter criteria.
D. Modify the integration user's profile to have View All Data.
Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.
An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce.
Which two limits should the Architect be aware of? (Choose two.)
A. Data storage limits
B. Workflow rule limits
C. API Request limits
D. Webservice callout limits
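The volumes in this question make the two relevant limits concrete. A back-of-the-envelope sizing, assuming Salesforce's standard 2 KB of data storage per record and batched inserts of 200 records per API call (both assumptions; all other figures come from the question):

```python
servers = 15_000
reports_per_server_per_day = 24 * 60 // 10   # one status report every 10 minutes

records_per_day = servers * reports_per_server_per_day    # 2,160,000
records_per_week = records_per_day * 7                    # 15,120,000

kb_per_record = 2                                         # assumed storage per record
storage_gb_per_week = records_per_week * kb_per_record / (1024 * 1024)

api_calls_per_day = records_per_day // 200                # batched inserts
```

Roughly 29 GB of data storage for just one week of detail, and over ten thousand API calls a day even with batching: both data storage limits and API request limits are in play.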
Cloud Kicks needs to purge detailed transactional records from Salesforce. The data should be aggregated at a summary level and available in Salesforce.
What are two automated approaches to fulfill this goal? (Choose two.)
A. Third-party Integration Tool (ETL)
B. Schedulable Batch Apex
C. Third-party Business Intelligence system
D. Apex Triggers
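Either automated answer (an ETL tool or schedulable Batch Apex) implements the same aggregate-then-purge pattern: roll detail transactions up to one summary row per key, store the summaries in Salesforce, then delete the details. A minimal sketch of that pattern with illustrative field names:

```python
from collections import defaultdict

def summarize_and_purge(transactions):
    """Return (summary_rows, detail_ids_to_delete) from detail transactions."""
    totals = defaultdict(lambda: {"Amount__c": 0, "Count__c": 0})
    for tx in transactions:
        key = (tx["AccountId"], tx["Period"])
        totals[key]["Amount__c"] += tx["Amount"]
        totals[key]["Count__c"] += 1
    summaries = [
        {"AccountId": acct, "Period": period, **agg}
        for (acct, period), agg in totals.items()
    ]
    to_delete = [tx["Id"] for tx in transactions]
    return summaries, to_delete

summaries, to_delete = summarize_and_purge([
    {"Id": "T1", "AccountId": "A1", "Period": "2024-01", "Amount": 100},
    {"Id": "T2", "AccountId": "A1", "Period": "2024-01", "Amount": 50},
    {"Id": "T3", "AccountId": "A2", "Period": "2024-01", "Amount": 75},
])
```

Apex triggers fire per record change and cannot run on a schedule, and a BI system keeps the summaries outside Salesforce, which is why those options fall short.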