Certified Data Architect

Description: Salesforce
Creation Date: 2022/12/21
Category: Others
Number of questions: 65

Content:

Universal Containers keeps its Account data in Salesforce and its Invoice data in a third-party ERP system. They have connected the Invoice data through a Salesforce external object. They want data from both Accounts and Invoices visible in one report in one place. What two approaches should an architect suggest for achieving this solution? Choose 2 answers.
A. Create a report combining data from the Account standard object and the Invoices external object.
B. Create a Visualforce page combining Salesforce Account data and Invoice external object data.
C. Create a report in an external system combining Salesforce Account data and Invoice data from the ERP.
D. Create a separate Salesforce report for Accounts and Invoices and combine them in a dashboard.

UC is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner. What should a data architect do to minimize data load times due to system calculations?
A. Leverage the Bulk API and concurrent processing with multiple batches.
B. Load the data through Data Loader, and turn on parallel processing.
C. Enable granular locking to avoid the "UNABLE_TO_LOCK_ROW" error.
D. Enable defer sharing calculations, and suspend sharing rule calculations.

Universal Containers has a legacy system that captures Conferences and Venues. These Conferences can occur at any Venue. They create hundreds of thousands of Conferences per year. Historically, they have only used 20 Venues. Which two things should the data architect consider when denormalizing this data model into a single Conference object with a Venue picklist? Choose 2 answers.
A. Limitations on master-detail relationships.
B. Bulk API limitations on picklist fields.
C. Standard list view in-line editing.
D. Org data storage limitations.

Get Cloudy Consulting is migrating their legacy system's users and data to Salesforce. They will be creating 15,000 users, 1.5 million Account records, and 15 million Invoice records. The visibility of these records is controlled by 50 owner-based and criteria-based sharing rules. Get Cloudy Consulting needs to minimize data loading time during this migration to a new organization. Which two approaches will accomplish this goal? Choose 2 answers.
A. Create the users, upload all data, and then deploy the sharing rules.
B. Contact Salesforce to activate indexing before uploading the data.
C. First, load all account records, and then load all user records.
D. Defer sharing calculations until the data has finished uploading.

Ursa Major Solar has 4 million rows of data in Salesforce that are used in reports to assess historical trends. Both performance and data storage limits have become an issue. Which two strategies are appropriate when discussing the issue with stakeholders? Choose 2 answers.
A. Utilize Data Loader to extract data, aggregate it, and write it back to a custom object, then delete the original records.
B. Utilize scheduled batch Apex to copy aggregate information into a custom object and delete the original records.
C. Combine Analytics Snapshots with a purging plan by reporting on the snapshot data and deleting the original records.
D. Configure the Salesforce Archiving feature to archive older records and remove them from the data storage limits.
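Option B is straightforward to sketch. Below is a minimal, illustrative Batch Apex outline of the aggregate-then-purge pattern; the Sales_Summary__c object and its fields are assumptions, not part of the question, and a production job would consolidate summary rows across batches.

    global class AggregateAndPurgeBatch implements Database.Batchable<sObject> {
        global Database.QueryLocator start(Database.BatchableContext bc) {
            // Select only rows old enough to be summarized and purged.
            return Database.getQueryLocator(
                'SELECT AccountId, Amount FROM Opportunity WHERE CloseDate < LAST_N_YEARS:2');
        }
        global void execute(Database.BatchableContext bc, List<Opportunity> scope) {
            Map<Id, Decimal> totals = new Map<Id, Decimal>();
            for (Opportunity opp : scope) {
                Decimal running = totals.containsKey(opp.AccountId) ? totals.get(opp.AccountId) : 0;
                totals.put(opp.AccountId, running + (opp.Amount == null ? 0 : opp.Amount));
            }
            List<Sales_Summary__c> summaries = new List<Sales_Summary__c>();
            for (Id acctId : totals.keySet()) {
                summaries.add(new Sales_Summary__c(Account__c = acctId, Total_Amount__c = totals.get(acctId)));
            }
            insert summaries; // one row per account per batch; consolidate downstream
            delete scope;     // purge the originals once aggregated
        }
        global void finish(Database.BatchableContext bc) {}
    }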

Universal Containers (UC) is planning to launch its Customer Community. The community will allow users to register shipment requests, which are then processed by UC employees. Shipment requests contain header information and then a list of no more than 5 items being shipped. UC will initially roll out its community to 5,000 customers in Europe, and will ultimately roll out to 20,000 customers worldwide within the next two years. UC expects an average of 10 shipment requests per week per customer. UC wants customers to be able to view up to three years of shipment requests and use Salesforce reports. What is the recommended solution for UC's data architect to address the requirements?
A. Create a custom object to track shipment requests and a child custom object to track shipment items. Implement an archiving process that moves data off-platform after three years.
B. Create an external custom object to track shipment requests with five lookup custom fields for each item being shipped. External objects are stored off-platform in Heroku's Postgres database.
C. Create an external custom object to track shipment requests and a child external object to track shipment items. External objects are stored off-platform in Heroku's Postgres database.
D. Create a custom object to track shipment requests with five lookup custom fields for each item being shipped. Implement an archiving process that moves data off-platform after three years.

Northern Trail Outfitters (NTO) is in the process of evaluating big objects to store large amounts of asset data from an external system. NTO will need to report on this asset data weekly. Which two native tools should a data architect recommend to achieve this reporting requirement? Choose 2 answers.
A. Standard reports and dashboards.
B. Einstein Analytics.
C. Async SOQL with a custom object.
D. Standard SOQL queries.

A large insurance provider is looking to implement Salesforce. The following conditions exist:
1. Multiple channels for lead acquisition.
2. Duplicate leads across channels.
3. Poor customer experience and higher costs.
On analysis, it was found that duplicate leads are causing these issues. Which three steps should be taken to mitigate the issues? Choose 3 answers.
A. Build a custom solution to identify and merge duplicate leads.
B. Standardize lead information across all channels.
C. Build a process to manually search and merge duplicates.
D. Implement a de-duplication strategy to prevent duplicate leads.
E. Implement a third-party solution to clean and enrich lead data.

Universal Containers has a large number of Opportunity fields (100) that they want to track field history on. Which two actions should an architect perform in order to meet this requirement? Choose 2 answers.
A. Create a custom object to store a copy of the record when changed.
B. Select the 100 fields in the Opportunity Set History Tracking page.
C. Create a custom object to store the previous and new field values.
D. Use Analytic Snapshots to store a copy of the record when changed.

Which three options can prevent your SOQL queries from being selective? Choose 3 answers.
A. Performing large loads and deletions.
B. Using trailing % wildcards.
C. Using leading % wildcards.
D. Using a custom index on a deterministic formula field.
E. Using NOT and != operators.
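To make the wildcard and operator traps concrete, here is a short, illustrative SOQL contrast; the object and filter values are generic examples, not from the question.

    // Leading % wildcards and negative operators defeat index use, forcing a full scan:
    List<Account> nonSelective1 = [SELECT Id FROM Account WHERE Name LIKE '%Containers'];
    List<Account> nonSelective2 = [SELECT Id FROM Account WHERE Industry != 'Shipping'];
    // A trailing wildcard on an indexed field can still be resolved from the index:
    List<Account> stillSelective = [SELECT Id FROM Account WHERE Name LIKE 'Universal%'];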

A large retail company has recently chosen Salesforce as its CRM solution. They have the following record counts:
* 2,500,000 Accounts
* 25,000,000 Contacts
When doing an initial performance test, the data architect noticed an extremely slow response for reports and list views. What should a data architect do to solve the performance issues?
A. Load only data that the user is permitted to access.
B. Add custom indexes on frequently searched Account and Contact object fields.
C. Create a skinny table to represent the Account and Contact objects.
D. Limit data loading to the 2,000 most recently created records.

Northern Trail Outfitters has implemented Salesforce for its associates nationwide. Senior management is concerned that the executive dashboard is not reliable for their real-time decision-making. On analysis, the team found the following issues with data entered in Salesforce:
- Information in certain records is incomplete.
- Incorrect entry in certain fields causes records to be excluded from report filters.
- Duplicate entries cause incorrect counts.
Which three steps should a data architect recommend to address the issues? Choose 3 answers.
A. Periodically export data to cleanse it and import it back into Salesforce for executive reports.
B. Build a sales data warehouse with purpose-built data marts for dashboards and senior management reporting.
C. Explore third-party data providers to enrich and augment information entered in Salesforce.
D. Leverage Salesforce features, such as validation rules, to avoid incomplete and incorrect records.
E. Design and implement a data-quality dashboard to monitor and act on records that are incomplete or incorrect.

UC has to build a B2C e-commerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC currently uses Postgres as the single source of truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can act as the system of record. Which three considerations should a data architect weigh before implementing this requirement? Choose 3 answers.
A. Ensure there is a tight relationship between order data and an enterprise resource planning (ERP) application.
B. Determine if the data is a driver of key processes implemented within Salesforce.
C. Consider whether the data is required for sales reports, dashboards, and KPIs.
D. Ensure the data is CRM-centric and able to populate standard or custom objects.
E. A selection of the tools required to replicate the data.

Universal Containers (UC) has multi-level account hierarchies that represent departments within their major Accounts. Users are creating duplicate Contacts across multiple departments. UC wants to clean the data so as to have a single Contact across departments. What two solutions should UC implement to cleanse its data? Choose 2 answers.
A. Use Data.com to standardize Contact address information to help identify duplicates.
B. Make use of the Merge Contacts feature of Salesforce to merge duplicates for an Account.
C. Use Workflow rules to standardize Contact information to identify and prevent duplicates.
D. Make use of a third-party tool to help merge duplicate Contacts across Accounts.

Northern Trail Outfitters (NTO) runs its entire business out of an enterprise data warehouse (EDW). NTO's sales team is starting to use Salesforce after a recent implementation, but currently lacks the data required to advance leads and opportunities to the next stage. NTO's management has researched Salesforce Connect and would like to use it to virtualize and report on data from the EDW within Salesforce. NTO will be running thousands of reports per day across 10 to 15 external objects. What should a data architect consider before implementing Salesforce Connect for reporting?
A. OData callout limits per day.
B. Maximum external objects per org.
C. Maximum number of records returned.
D. Maximum page size for server-driven paging.

Universal Containers (UC) is migrating from a legacy system to Salesforce CRM. UC is concerned about the quality of data being entered by users and through external integrations. Which two solutions should a data architect recommend to mitigate data quality issues? Choose 2 answers.
A. Leverage picklist and lookup fields where possible.
B. Leverage Apex to validate the format of data being entered via a mobile device.
C. Leverage validation rules and workflows.
D. Leverage third-party AppExchange tools.

Northern Trail Outfitters uses Salesforce to manage relationships and track sales opportunities. It has 10 million customers and 100 million opportunities. The CEO has been complaining that a dashboard is taking 10 minutes to run and sometimes fails to load, throwing a time-out error. Which three options should help improve the dashboard performance? Choose 3 answers.
A. Run the dashboard for the CEO and send it via email.
B. Denormalize the data by reducing the number of joins.
C. Remove widgets from the dashboard to reduce the number of graphics loaded.
D. Reduce the amount of data queried by archiving unused opportunity records.
E. Use selective queries to reduce the amount of data being returned.

Universal Containers has 30 million case records. The Case object has 80 fields. Agents are reporting performance issues and time-outs while running case reports in the Salesforce org. Which solution should a data architect recommend to improve reporting performance?
A. Create a custom object to store aggregate data and run reports.
B. Build reports using custom Lightning components.
C. Move data off of the platform, run reporting outside Salesforce, and give access to the reports.
D. Contact Salesforce support to enable a skinny table for cases.

UC has a roll-up summary field on Account to calculate the count of Contacts associated with an Account. During the Account load, Salesforce is throwing an "Unable to lock a row" error. Which solution should a data architect recommend to resolve the error?
A. Perform the batch job in parallel mode, and reduce the batch size.
B. Defer roll-up summary field calculation during data migration.
C. Leverage the Data Loader platform API to load data.
D. Perform the batch job in serial mode, and reduce the batch size.

Northern Trail Outfitters (NTO) has the following systems:
- Customer master: source of truth for customer information
- Service Cloud: customer support
- Marketing Cloud: marketing support
- Enterprise data warehouse: business reporting
The customer data is duplicated across all these systems and is not kept in sync. Customers are also complaining that they get repeated marketing emails and have to call to update their information. NTO is planning to implement a master data management (MDM) solution across the enterprise. Which three data issues will an MDM tool solve? Choose 3 answers.
A. Data standardization.
B. Data completeness.
C. Data accuracy and quality.
D. Data loss and recovery.
E. Data duplication.

Northern Trail Outfitters (NTO) has a complex Salesforce org that has been developed over the past 5 years. Internal users are complaining about multiple data issues, including incomplete and duplicate data in the org. NTO has decided to engage a data architect to analyze and define data quality standards. Which three key factors should a data architect consider while defining data quality standards? Choose 3 answers.
A. Define data duplication standards and rules.
B. Define key fields in a staging database for data cleansing.
C. Finalize an extract, transform, load (ETL) tool for data migration.
D. Measure data timeliness and consistency.
E. Measure data completeness and accuracy.

UC has a Salesforce org with multiple automated processes defined for the group membership process. UC also has multiple admins on staff who perform manual adjustments to the role hierarchy. The automated tasks and manual tasks overlap daily, and UC is experiencing "lock errors" consistently. What should a data architect recommend to mitigate these errors?
A. Enable granular locking.
B. Ask Salesforce support for additional CPU power.
C. Remove SOQL statements from Apex loops.
D. Enable sharing recalculations.

Universal Containers (UC) has deployed Salesforce to manage Marketing, Sales, and Support efforts in a multi-system ERP environment. After reaching the limits of native reports and dashboards, UC leadership is looking to understand what options can be used to provide more analytical insights. What two approaches should an architect recommend? Choose 2 answers.
A. Setup Audit Trails.
B. Wave Analytics.
C. AppExchange Apps.
D. Weekly Snapshots.

DreamHouse Realty needs an architect to develop a solution that will integrate data and resolve duplicates and discrepancies between Salesforce and one or more external systems. What are two important questions the architect should answer when determining whether to use Master Data Management in the solution? Choose 2 answers.
A. How many systems are integrating with each other?
B. Will Salesforce replace a legacy system?
C. Does the system of record change for different tables?
D. Are the systems cloud-based or on-premise?

Universal Containers (UC) wants to ensure their data on 100,000 Accounts pertaining mostly to US-based companies is enriched and cleansed on an ongoing basis. UC is looking for a solution that allows easy monitoring of key data quality metrics. What should be the recommended solution to meet this requirement?
A. Implement Batch Apex that calls out to a third-party data quality API in order to monitor Account data quality.
B. Use a declarative approach by installing and configuring Data.com Prospector to monitor Account data quality.
C. Use a declarative approach by installing and configuring Data.com Clean to monitor Account data quality.
D. Implement an Apex trigger on Account that queries a third-party data quality API to monitor Account data quality.

UC has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B). UC has decided to keep the orgs running separately but would like to bidirectionally share opportunities between the orgs in near-real time. Which three options should a data architect recommend to share data between Org A and Org B? Choose 3 answers.
A. Develop an Apex class that pushes opportunity data between orgs daily via the Apex Scheduler.
B. Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
C. Leverage middleware tools to bidirectionally send Opportunity data across orgs.
D. Install a third-party AppExchange tool to handle the data sharing.
E. Use Salesforce Connect and the cross-org adapter to visualize Opportunities in external objects.

Universal Containers (UC) has 1,000 accounts and 50,000 opportunities. UC has an enterprise security requirement to export all sales data outside of Salesforce on a weekly basis. The security requirement also calls for exporting key operational data that includes events such as file downloads, logins, logouts, etc. Which two recommended approaches would address the above requirement? Choose 2 answers.
A. Use Weekly Export to extract transactional data to on-premise systems.
B. Use Event Monitoring to extract event data to on-premise systems.
C. Use a custom-built extract job to extract operational data to on-premise systems.
D. Use Field Audit History to capture operational data and extract it to on-premise systems.

Universal Containers wishes to send data from Salesforce to an external system to generate invoices from their Order Management System (OMS). They want a Salesforce administrator to be able to customize which fields are sent to the external system without modifying code. What two approaches should an architect recommend to deliver the desired solution? Choose 2 answers.
A. An Outbound Message to determine which fields to send to the OMS.
B. A Set<sObjectFieldSet> to determine which fields to send in an HTTP callout.
C. Enable the field-level security permissions for the fields to send.
D. A Field Set that determines which fields to send in an HTTP callout.
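A minimal sketch of the Field Set approach (option D), assuming a field set named OMS_Invoice_Fields on the Order object; both names are illustrative. The admin edits the field set in Setup, and the callout code reads it at run time, so the payload changes without a deployment.

    public with sharing class OmsPayloadBuilder {
        // Builds the SOQL for the HTTP callout from the admin-editable field set.
        public static String buildQuery(Id orderId) {
            List<String> fieldNames = new List<String>();
            for (Schema.FieldSetMember m : SObjectType.Order.fieldSets.OMS_Invoice_Fields.getFields()) {
                fieldNames.add(m.getFieldPath());
            }
            return 'SELECT ' + String.join(fieldNames, ', ')
                + ' FROM Order WHERE Id = \'' + orderId + '\'';
        }
    }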

Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC's main Salesforce org will be divided into two orgs: Org A and Org B. UC has delivered these requirements to its data architect:
1. The data model for Org B will drastically change with different objects, fields, and picklist values.
2. Three million records will need to be migrated from Org A to Org B for compliance reasons.
3. The migration will need to occur within the next two months, prior to the split.
Which migration strategy should a data architect use to successfully migrate the data?
A. Use an ETL tool to orchestrate the migration.
B. Write a script to use the Bulk API.
C. Use Data Loader for export and Data Import Wizard for import.
D. Use the Salesforce CLI to query, export, and import.

Universal Containers has a Sales Cloud implementation for its sales team and an enterprise resource planning (ERP) system as the customer master. The sales team is complaining about duplicate Accounts and data quality issues with Account data. Which two solutions should a data architect recommend to resolve the complaints? Choose 2 answers.
A. Integrate Salesforce with the ERP, and make the ERP the system of truth.
B. Implement a de-dupe solution and establish account ownership in Salesforce.
C. Build a nightly sync job from the ERP to Salesforce.
D. Build a nightly batch job to de-dupe data, and merge account records.

Universal Containers is exporting 40 million Account records from Salesforce using Informatica Cloud. The ETL tool fails, and the query log indicates a full table scan time-out failure. What is the recommended solution?
A. Modify the export query to include standard indexed field(s).
B. Modify the export job header to specify Sforce-Enable-PKChunking.
C. Modify the export job header to specify Export-in-Parallel.
D. Modify the export query with a LIMIT clause with batch size 10,000.
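For context on option B, here is a hedged sketch of what enabling PK chunking looks like when creating a Bulk API (1.0) query job; an ETL tool sets the same documented Sforce-Enable-PKChunking header on the job it creates. The Apex callout form is purely illustrative (it assumes a remote site setting for the org's own domain).

    HttpRequest req = new HttpRequest();
    req.setEndpoint(URL.getOrgDomainUrl().toExternalForm() + '/services/async/58.0/job');
    req.setMethod('POST');
    req.setHeader('X-SFDC-Session', UserInfo.getSessionId());
    req.setHeader('Content-Type', 'application/xml');
    // The server splits the extract into primary-key ranges of 100,000 records each.
    req.setHeader('Sforce-Enable-PKChunking', 'chunkSize=100000');
    req.setBody('<?xml version="1.0" encoding="UTF-8"?>'
        + '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        + '<operation>query</operation><object>Account</object>'
        + '<contentType>CSV</contentType></jobInfo>');
    HttpResponse res = new Http().send(req);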

A custom pricing engine for a Salesforce customer is determined by factors with the following hierarchy:
1. State in which the customer is located
2. City in which the customer is located, if available
3. Zip code in which the customer is located, if available
4. Changes to this information should require minimal code changes.
What should a data architect recommend to maintain this information for the custom pricing engine that is to be built in Salesforce?
A. Create a custom object to maintain the pricing criteria.
B. Maintain the required pricing criteria in custom metadata types.
C. Assign the pricing criteria within the custom pricing engine.
D. Configure the pricing criteria in price books.
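To illustrate option B, a minimal sketch of resolving criteria by specificity (zip over city over state), assuming a hypothetical Pricing_Criteria__mdt custom metadata type with State__c, City__c, Zip_Code__c, and Discount__c fields. Admins edit rows declaratively in Setup, so the resolution code never changes when criteria change.

    public with sharing class PricingCriteriaResolver {
        // Returns the most specific matching row: zip beats city beats state-only.
        public static Pricing_Criteria__mdt resolve(String state, String city, String zip) {
            Pricing_Criteria__mdt best;
            Integer bestRank = 0; // 3 = zip match, 2 = city match, 1 = state-only row
            for (Pricing_Criteria__mdt c : Pricing_Criteria__mdt.getAll().values()) {
                if (c.State__c != state) continue;
                Integer rank = (zip != null && c.Zip_Code__c == zip) ? 3
                             : (city != null && c.City__c == city) ? 2
                             : (c.City__c == null && c.Zip_Code__c == null) ? 1 : 0;
                if (rank > bestRank) { best = c; bestRank = rank; }
            }
            return best;
        }
    }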

Universal Containers (UC) has implemented Sales Cloud for its entire sales organization. UC has built a custom object called Projects__c that stores customer project details and employee billable hours. The following requirements are needed:
1. A subset of individuals from the finance team will need access to the Projects object for reporting and adjusting employee utilization.
2. The finance users will not need access to any sales objects, but they will need to interact with the custom object.
Which license type should a data architect recommend for the finance team that best meets the requirements?
A. Service Cloud.
B. Sales Cloud.
C. Lightning Platform Starter.
D. Lightning Platform Plus.

Northern Trail Outfitters (NTO) has a loyalty program to reward repeat customers. The following conditions exist:
1. Reward levels are earned based on the amount spent during the previous 12 months.
2. The program will track every item a customer has bought and grant them points for discounts.
3. The program generates 100 million records each month.
NTO Customer Support would like to see a summary of a customer's recent transactions and the reward level(s) they have attained. Which solution should the data architect use to provide the information within Salesforce for the customer support agents?
A. Provide a button so that the agent can quickly open the point-of-sale system that displays the customer's history.
B. Capture the reward program data in an external data store, and present the 12-month trailing summary in Salesforce using Salesforce Connect and an external object.
C. Create a custom big object to capture the reward program data, display it on the contact record, and update it nightly from the point-of-sale system.
D. Create a custom object in Salesforce to capture and store all reward programs, populate it nightly from the point-of-sale system, and present it on the customer record.
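For the big object route (option C), a hedged sketch of the agent-facing lookup, assuming a hypothetical Reward_Transaction__b big object whose index is defined as (Contact__c, Transaction_Date__c). Big object SOQL must filter on index fields in their defined order, which is what keeps this query fast at 100 million rows per month.

    // Fetch one customer's recent reward transactions for display on the contact record.
    Id contactId = [SELECT Id FROM Contact LIMIT 1].Id; // placeholder lookup for the sketch
    List<Reward_Transaction__b> recent = [
        SELECT Contact__c, Transaction_Date__c, Points__c
        FROM Reward_Transaction__b
        WHERE Contact__c = :contactId
    ];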

Northern Trail Outfitters (NTO) has multiple Salesforce orgs based on geographical regions (AMER, EMEA, APAC). NTO products are mastered in the AMER org and need to be created in the EMEA and APAC orgs after the products are approved. Which two features should a data architect recommend to share records between Salesforce orgs? Choose 2 answers.
A. Federated search.
B. Salesforce to Salesforce.
C. Salesforce Connect.
D. Change Data Capture (CDC).

A customer is operating in a highly regulated industry and is planning to implement Salesforce. The customer information maintained in Salesforce includes the following:
1. Personally identifiable information (PII)
2. IP restrictions on profiles organized by geographic location
3. Financial records that need to be private and accessible only by the assigned sales associate
4. Users should not be allowed to export information from Salesforce
Enterprise Security has mandated that access be restricted to users within a specific geography, with detailed monitoring of user activity. Which three Salesforce Shield capabilities should a data architect recommend? Choose 3 answers.
A. Event Monitoring to monitor all user activity.
B. Prevent sales users' access to customer PII information.
C. Restrict access to Salesforce from users outside a specific geography.
D. Encrypt sensitive customer information maintained in Salesforce.
E. Transaction Security policies to prevent export of Salesforce data.

Universal Containers has deployed Salesforce for case management. The company is having difficulty understanding what percentage of cases are resolved from the initial call to their support organization. What first step is recommended to implement a reporting solution to measure the support reps' case closure rates?
A. Create a report on Case analytic snapshots.
B. Enable field history tracking on the Case object.
C. Create Contact and Opportunity reports and dashboards.
D. Install AppExchange packages for available reports.

UC is having issues using Informatica Cloud Loader to export 10M+ Order records. Each Order record has 10 Order Line Items. What two steps can you take to help correct this? Choose 2 answers.
A. Limit batches to 10K records.
B. Export with the Bulk API in parallel mode.
C. Export in multiple batches.
D. Use PK Chunking.

Universal Containers (UC) has users complaining about reports timing out or simply taking too long to run. What two actions should the data architect recommend to improve the reporting experience? Choose 2 answers.
A. Enable Divisions for large data objects.
B. Create one skinny table per report.
C. Share each report with fewer users.
D. Index key fields used in report criteria.

Northern Trail Outfitters (NTO) has one million customer records spanning 25 years. As part of its new Salesforce project, NTO would like to create a master data management strategy to help preserve the history and relevance of its customer data. Which three activities will be required to identify a successful master data management strategy? Choose 3 answers.
A. Define the system of record for critical data.
B. Install a data warehouse.
C. Identify data to be replicated.
D. Create a data archive strategy.
E. Choose a business intelligence tool.

Universal Containers (UC) wants to capture information on how data entities are stored within the different applications and systems used within the company. For that purpose, the architecture team decided to create a data dictionary covering the main business domains within UC. Which two common techniques are used in building a data dictionary to store information on how business entities are defined? Choose 2 answers.
A. Use Salesforce Object Query Language.
B. Use a data definition language.
C. Use an entity relationship diagram.
D. Use the Salesforce Metadata API.

UC has a legacy client-server app with a relational database that needs to be migrated to Salesforce. Which three key actions should be taken when data modeling in Salesforce? Choose 3 answers.
A. Identify data elements to be persisted in Salesforce.
B. Map legacy data to Salesforce objects.
C. Map legacy data to Salesforce custom objects.
D. Work with the legacy application owner to analyze the legacy data model.
E. Implement the legacy data model within Salesforce using custom fields.

Universal Containers (UC) provides shipping services to its customers. They use Opportunities to track customer shipments. At any given time, shipping status can be one of 10 values. UC has 200,000 Opportunity records. When creating a new field to track the shipping status on the Opportunity, what should the architect do to improve data quality and avoid data skew?
A. Create a picklist field, with values sorted alphabetically.
B. Create a Master-Detail to a custom object, ShippingStatus__c.
C. Create a Lookup to a custom object, ShippingStatus__c.
D. Create a text field and make it an external ID.

Universal Containers is planning out its archiving and purging plans going forward for its custom objects Topic__c and Comment__c. Several options are being considered, including analytics snapshots, offsite storage, scheduled purges, etc. Which three questions should be considered when designing an appropriate archiving strategy? Choose 3 answers.
A. Which profiles and users currently have access to these custom object records?
B. If reporting is necessary, can the information be aggregated into fewer, summary records?
C. How many fields are defined on the custom objects that need to be archived?
D. Will the data being archived need to be reported on or accessed in any way in the future?
E. Are there any regulatory restrictions that will influence the archiving and purging plans?

Universal Containers is implementing Salesforce and needs to migrate data from two legacy systems. UC would like to cleanse and de-duplicate the data before migrating it to Salesforce. Which solution should a data architect recommend for a clean migration?
A. Define external IDs for an object, insert data from one database, and use upsert for the second database.
B. Set up a staging database, and define external IDs to merge and clean duplicate data, and load it into Salesforce.
C. Define duplicate rules in Salesforce, and load data into Salesforce from both databases.
D. Define external IDs for an object, migrate the second database into the first database, and load into Salesforce.
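Whichever option is chosen, the load step typically leans on an external ID so reloads are idempotent. A minimal sketch, assuming a hypothetical Legacy_Id__c external ID field on Account: records cleansed and merged in staging upsert cleanly, and a re-run updates rather than duplicates.

    // Records arriving from the staging database after cleansing and merging.
    List<Account> cleansed = new List<Account>{
        new Account(Name = 'Acme Corp', Legacy_Id__c = 'ERP1-0001'),
        new Account(Name = 'Globex',    Legacy_Id__c = 'ERP2-0042')
    };
    // Match on the external ID: existing rows update, new rows insert.
    upsert cleansed Account.Fields.Legacy_Id__c;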

Universal Containers (UC) loads bulk leads and campaigns from third-party lead aggregators on a weekly and monthly basis. The expected lead record volume is 500K records per week, and the expected campaign record volume is 10K campaigns per week. After the upload, Lead records are shared with various sales agents via sharing rules and added as Campaign members via Apex triggers on Lead creation. UC agents work on leads for 6 months but want to keep the records in the system for at least 1 year for reference. Compliance requires them to be stored for a minimum of 3 years. After that, data can be deleted. Which statement is true with respect to a data archiving strategy for UC?
A. UC can leverage the Salesforce Data Backup and Recovery feature for data archival needs.
B. UC can leverage recycle bin capability, which guarantees record storage for 15 days after deletion.
C. UC can leverage a "tier"-based approach to classify the record storage need.
D. UC can store long-term lead records in custom storage objects to avoid counting against storage limits.

Northern Trail Outfitters (NTO) has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process, NTO will be migrating all of its metadata and data. NTO's data model in the source org has a complex relationship hierarchy with several master-detail and lookup relationships across objects, which should be maintained in the target org. Which three things should a data architect do to maintain the relationship hierarchy during migration? Choose 3 answers.
A. Keep the relationship fields populated with the source record IDs in the import file.
B. Create an external ID field for each object in the target org and map source record IDs to this field.
C. Replace source record IDs with new record IDs from the target org in the import file.
D. Redefine the master-detail relationship fields to lookup relationship fields in the target org.
E. Use Data Loader to export the data from the source org and then import/upsert into the target org in sequential order.
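Options A, B, and E combine into a common pattern: the child import keeps the parent's source ID in its relationship column, and the load resolves it through an external ID on the parent. A minimal sketch of that resolution in Apex form, assuming a hypothetical Source_Id__c external ID field on Account.

    // The parent is matched by external ID, so the import file never needs
    // the target org's newly generated Account IDs.
    Contact child = new Contact(LastName = 'Migrated');
    child.Account = new Account(Source_Id__c = '001SOURCEORGID001');
    insert child;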

Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?
A. Create a standard Salesforce dashboard and connect it to reports with the appropriate filters.
B. Create a dashboard in an external reporting tool, export data to the tool, and add a link to the dashboard in Salesforce.
C. Create a dashboard in an external reporting tool, export data to the tool, and embed the dashboard in Salesforce using the Canvas toolkit.
D. Create a dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.

Universal Containers (UC) is implementing its new Internet of Things technology, which consists of smart containers that provide information on container temperature and humidity updated every 10 minutes back to UC. There are roughly 10,000 containers equipped with this technology, with the number expected to increase to 50,000 over the next five years. It is essential that Salesforce users have access to current and historical temperature and humidity data for each container. What is the recommended solution?
A. Create new custom fields for temperature and humidity in the existing Container custom object, as well as an external ID field that is unique for each container. These custom fields are updated when a new measure is received.
B. Create a new Container Reading custom object, which is created when a new measure is received for a specific container. The Container Reading custom object has a master-detail relationship to the Container object.
C. Create a new Lightning component that displays the last humidity and temperature data for a specific container and can also display historical trends, obtaining relevant data from UC's existing data warehouse.
D. Create a new Container Reading custom object with a master-detail relationship to Container, which is created when a new measure is received for a specific container. Implement an archiving process that runs every hour.

Universal Containers (UC) has a custom discount request object set as a detail object with a custom product object as the master. There is a requirement to allow the creation of generic discount requests without the custom product object as its master record. What solution should an architect recommend to UC?
A. Remove the master-detail relationship and keep the objects separate.
B. Create a placeholder product record for the generic discount request.
C. Mandate the selection of a custom product for each discount request.
D. Change the master-detail relationship to a lookup relationship.

UC has multiple Salesforce orgs that are distributed across regional branches. Each branch stores local customer data inside its org's Account and Contact objects. This creates a scenario where UC is unable to view customers across all orgs. UC has an initiative to create a 360-degree view of the customer, as UC would like to see Account and Contact data from all orgs in one place. What should a data architect suggest to achieve this 360-degree view of the customer?
A. Use Salesforce Connect's cross-org adapter.
B. Use an ETL tool to migrate gap Accounts and Contacts into each org.
C. Consolidate the data from each org into a centralized datastore.
D. Build a bidirectional integration between all orgs.

Get Cloudy Consulting uses an invoicing system that has specific requirements. One requirement is that attachments associated with the Invoice__c custom object be classified by Type (i.e., "Purchase Order", "Receipt", etc.) so that reporting can be performed on invoices showing the number of attachments grouped by Type. What should an architect do to categorize the attachments to fulfill these requirements?
A. Create a custom picklist field for the Type on the standard Attachment object with the values.
B. Create a custom object related to the Invoice object with a picklist field for the Type.
C. Add a ContentType picklist field to the Attachment layout and create additional picklist options.
D. Add additional options to the standard ContentType picklist field for the Attachment object.

UC is trying to switch from a legacy CRM to Salesforce and wants to keep the legacy CRM and Salesforce in place until all the functionality is deployed in Salesforce. They want to keep data in sync between Salesforce, the legacy CRM, and SAP. What is the recommendation? Choose 2 answers.
A. Integrate SAP with Salesforce and SAP with the legacy CRM, but not the legacy CRM with Salesforce.
B. Do not integrate the legacy CRM with Salesforce, but integrate Salesforce with SAP.
C. Suggest an MDM solution and link the MDM to Salesforce and SAP.
D. Integrate the legacy CRM with Salesforce and keep data in sync until the new functionality is in place.

Northern Trail Outfitters (NTO) needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query time-out issues while extracting these records. What should a data architect recommend in order to get around the time-out issue?
A. Ask Salesforce support to increase the query time-out value.
B. Use an extract, transform, load (ETL) tool for the extraction of records.
C. Use the REST API to extract data, as it automatically chunks records by 200.
D. Use a custom auto-number and formula field, and use that to chunk records while extracting data.
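Option D deserves a sketch. Assuming a hypothetical auto-number exposed through a numeric formula field Record_Number__c (e.g., VALUE(Auto_Number__c)) with a custom index, the extract becomes a series of selective range queries instead of one 50-million-row scan. The loop below only builds the query strings; in practice each range would be issued through the API by the extraction client.

    Integer chunkSize = 250000;
    for (Integer lower = 0; lower < 50000000; lower += chunkSize) {
        Integer upper = lower + chunkSize;
        String soql = 'SELECT Id, Name FROM Custom_Object__c'
            + ' WHERE Record_Number__c >= ' + lower + ' AND Record_Number__c < ' + upper;
        // Hand each range query to the extraction client (REST/Bulk API).
    }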

Cloud Kicks has the following requirements:
- Their Shipment custom object must always relate to a Product, a Sender, and a Receiver (all separate custom objects).
- If a Shipment is currently associated with a Product, Sender, or Receiver, deletion of those records should not be allowed.
- Each custom object must have separate sharing models.
What should an architect do to fulfill these requirements?
A. Create a Master-Detail relationship to each of the three parent records.
B. Create two Master-Detail and one Lookup relationship to the parent records.
C. Associate the Shipment to each parent record by using a VLOOKUP formula field.
D. Create a required Lookup relationship to each of the three parent records.

Universal Containers (UC) is implementing Salesforce Sales Cloud and Service Cloud. As part of the implementation, they are planning to create a new custom object (Shipments), which will have a lookup relationship to Opportunities. When creating shipment records, Salesforce users need to manually input a customer reference, which is provided by customers and will be stored in the Customer_Reference__c text custom field. Support agents will likely use this customer reference to search for Shipment records when resolving shipping issues. UC is expecting to have around 5 million shipment records created per year. What is the recommended solution to ensure that support agents using global search and reports can quickly find shipment records?
A. Implement an archiving process for shipment records created after three years.
B. Set Customer_Reference__c as an External ID (unique).
C. Implement an archiving process for shipment records created after five years.
D. Set Customer_Reference__c as an External ID (non-unique).

Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. The majority of the automation tools within UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend. What should a data architect do to mitigate any unwanted results during the import?
A. Ensure duplicate and matching rules are defined.
B. Ensure validation rules, triggers, and other automation tools are disabled.
C. Import the data in smaller batches over a 24-hour period.
D. Bulkify the triggers to handle bulk loads.

Get Cloudy Consulting uses Salesforce for tracking opportunities (Opportunity) and currently has the following environment:
- An internal ERP system is in place for tracking services and invoicing.
- The ERP system supports SOAP API and OData for bi-directional integration between Salesforce and the ERP system.
- 950,000 opportunities exist; for each opportunity, the company sends one invoice per month during a 12-month period.
Get Cloudy Consulting sales reps must view the current invoice status and invoice amount from the opportunity page. When creating an object to model invoices, what should the architect suggest, considering performance and data storage space?
A. Create an external object Invoice__x with a Lookup relationship with Opportunity.
B. Create a custom object Invoice__c with a Lookup relationship with Opportunity.
C. Retrieve the current status from the ERP by using the Streaming API, and display it on the Opportunity page.
D. Create a custom object Invoice__c with a Master-Detail relationship with Opportunity.

Every year, Ursa Major Solar has more than 1 million orders. Each order contains an average of 10 line items. The Chief Executive Officer (CEO) needs the sales reps to see how much money each customer generates year-over-year. However, data storage is running low in Salesforce. Which approach for data archiving is appropriate for this scenario?
A. 1. Annually export and delete order line items. 2. Store them in a zip file in case the data is needed later.
B. 1. Annually aggregate order amount data to store in a custom object. 2. Delete those orders and order line items.
C. 1. Annually export and delete orders and order line items. 2. Store them in a zip file in case the data is needed later.
D. 1. Annually delete orders and order line items. 2. Ensure the customer has order information in another system.

Company S was recently acquired by Company T. As part of the acquisition, all of the data for Company S's Salesforce instance (source) must be migrated into Company T's Salesforce instance (target). Company S has 6 million Case records. An architect has been tasked with optimizing the data load time. What should the architect consider to achieve this goal?
A. Pre-process the data, then use Data Loader with the SOAP API to upsert with zip compression enabled.
B. Directly leverage Salesforce-to-Salesforce functionality to load Case data.
C. Load the data in multiple sets using Bulk API parallel processes.
D. Utilize the Salesforce Org Migration Tool from the Setup Data Management menu.

Cloud Kicks needs to optimize data stewardship engagement for a Salesforce instance. Before proposing design recommendations, the data architect is first assessing relevant areas of Salesforce. Which three areas are appropriate to assess? Choose 3 answers.
A. Assess the metadata XML files for redundant fields to consolidate.
B. Determine if any integration points create records in Salesforce.
C. Export the setup audit trail to review what fields are being used.
D. Run key reports to determine what fields should be required.
E. Assess the sharing model to determine the impact on duplicate records.

Cloud Kicks has a Salesforce instance with 12,000 Account records. Managers at the company have noticed similar, but not identical, Account names and addresses. The Chief Technology Officer (CTO) at Cloud Kicks is concerned about proper data quality. Which steps should the CTO take to address this issue?
A. 1. Use a service to standardize Account addresses. 2. Use a third-party tool to merge Accounts based on rules.
B. 1. Run a report. 2. Find Accounts whose name starts with the same five characters, and merge those Accounts.
C. 1. Have the Account Owners clean their Accounts' addresses. 2. Merge Accounts with the same address.
D. 1. Enable Account de-duplication by creating matching rules in Salesforce. 2. The system will then mass merge duplicate Accounts.

Universal Containers developers have created a new Lightning component that uses an Apex controller with a SOQL query to populate a custom list view. Users are complaining that the component often fails to load and returns a time-out error. What tool should a data architect use to identify why the query is taking too long?
A. Enable and use the Query Plan tool in the Developer Console.
B. Use Splunk to query the system logs looking for transaction time and CPU usage.
C. Use Salesforce's query optimizer to analyze the query in the Developer Console.
D. Open a ticket with Salesforce support to retrieve transaction logs to be analyzed for processing time.

Northern Trail Outfitters (NTO) has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer record. In Salesforce, the customer is represented by the Contact object. NTO utilizes a master data management (MDM) solution with these attributes:
1. The MDM solution keeps track of the Customer Master with a Master Key.
2. The Master Key is a map of the record IDs from each external system that customer data is stored within.
3. The MDM solution provides deduplication features, so it acts as the Single Source of Truth.
How should a data architect implement the storage of the Master Key within Salesforce?
A. Create a custom object to store the Master Key with a lookup field to Contact.
B. Store the Master Key on the Contact object as an External ID field for referential integrity.
C. Store the Master Key in Heroku Postgres and use Heroku Connect for synchronization.
D. Create an external object to store the Master Key with a lookup field to Contact.

Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to service agents. UC creates 5 million cases per year. Which two data archiving strategies should a data architect recommend? Choose 2 answers.
A. Use custom objects for cases older than 2 years, and use a nightly batch to move them.
B. Sync cases older than 2 years to an external database, and provide service agents access to the database.
C. Use Big Objects for cases older than 2 years, and use nightly batches to move them.
D. Use Heroku and External Objects to display cases older than 2 years, and use the Bulk API to hard delete them from Salesforce.
