BDC_TRY
Description: BDC CHECK




1. Which programming language is used for scripting in an SAP Analytics Cloud story?
   - Wrangling Expression Language
   - ABAP
   - Python
   - JavaScript

2. Which SAP Analytics Cloud feature uses natural language processing?
   - Smart insight
   - Just Ask feature
   - Data analyzer
   - Digital boardroom

3. What features are supported by the SAP Analytics Cloud data analyzer? Note: There are 3 correct answers to this question.
   - Calculated measures
   - Input controls
   - Conditional formatting
   - Charts
   - Linked dimensions

4. In SAP Analytics Cloud, you have a story based on an import model. The transactional data in the model's data source changes. How can you update the data in the model?
   - Refresh the story
   - Allow model import
   - Refresh the data source
   - Schedule the import

5. For a model in SAP Analytics Cloud you are using a live connection. Where is the data stored?
   - Public dataset
   - SAP Analytics Cloud model
   - Source system
   - Embedded dataset

6. Which automatically created dimension type can you delete from an SAP Analytics Cloud analytic data model?
   - Generic
   - Date
   - Version
   - Organization

7. In an SAP Analytics Cloud planning data model, which dimensions are included by default? Note: There are 2 correct answers to this question.
   - Organization
   - Version
   - Entity
   - Date

8. What is required to use version management in an SAP Analytics Cloud story?
   - Analytic model
   - Classic mode
   - Optimized mode
   - Planning model

9. What source system can you connect to with an SAP Analytics Cloud live connection that is provided by SAP BDC?
   - SAP Business ByDesign Analytics
   - SAP Datasphere
   - SAP SuccessFactors
   - SAP ERP

10. Related to data management, what are some capabilities of SAP Business Data Cloud? Note: There are 2 correct answers to this question.
    - Store customer business data in 3rd-party hyperscaler environments
    - Integrate and enrich customer business data for different analytics use cases
    - Delegate the integration of business data to partners and customers
    - Harmonize customer business data across different Line of Business applications
11. Which steps are executed when an SAP Business Data Cloud Intelligent Application is installed? Note: There are 2 correct answers to this question.
    - Connection of SAP Datasphere with SAP Analytics Cloud
    - Creation of a dashboard for visualization
    - Execution of a machine-learning algorithm
    - Replication of data from the business applications to Foundation Services

12. What is the main storage type of the object store in SAP Business Data Cloud?
    - SAP HANA extended tables
    - SAP BW/4HANA DataStore objects (advanced)
    - SAP HANA data lake files
    - SAP BW/4HANA InfoObjects

13. Which of the following activities does the SAP Business Data Cloud cockpit support? Note: There are 2 correct answers to this question.
    - Enhance an Analytic Model
    - Debug an authorization issue
    - Configure SAP Business Data Cloud
    - Discover and activate data products

14. What is a purpose of SAP Datasphere in the context of SAP Business Data Cloud?
    - To install an intelligent application
    - To define a data product
    - To provide analytic models for intelligent applications
    - To maintain the system landscape for SAP Business Data Cloud

15. Which operation is implemented by the Foundation Services of SAP Business Data Cloud?
    - Execution of machine learning algorithms to generate additional insights
    - Generation of an analytic model by adding semantic information
    - Data transformation and enrichment to generate a data product
    - Storage of raw data inside a CDS view

16. What are some features of the out-of-the-box reporting with intelligent applications in SAP Business Data Cloud? Note: There are 2 correct answers to this question.
    - Automated data provisioning from business application to dashboard
    - Services for transforming and enriching data
    - Manual creation of artifacts across all involved components
    - AI-based suggestions for intelligent applications in the SAP Business Data Cloud cockpit

17. Which of the following data source objects can be used for an SAP Datasphere Replication Flow? Note: There are 2 correct answers to this question.
    - Google BigQuery dataset
    - ABAP CDS view
    - Oracle database table
    - MS Azure SQL table

18. How can you create a local table with a custom name in SAP Datasphere? Note: There are 2 correct answers to this question.
    - By creating an intelligent lookup
    - By importing a CSV file
    - By creating a persistent snapshot of a view
    - By adding an output of a data flow

19. Which options do you have when using the remote table feature in SAP Datasphere? Note: There are 3 correct answers to this question.
    - Data access can be switched from virtual to persisted, but not the other way around
    - Data can be loaded using advanced transformation capabilities
    - Data can be persisted in SAP Datasphere by creating a snapshot (copy of data)
    - Data can be persisted by using real-time replication
    - Data can be accessed virtually by remote access to the source system

20. What do you use to write data from a local table in SAP Datasphere to an outbound target?
    - Transformation Flow
    - Data Flow
    - Replication Flow
    - CSN Export

21. Why would you choose the "Validate Remote Tables" feature in the SAP Datasphere repository explorer?
    - To test if data has been replicated completely
    - To detect if remote tables are defined that are not used in views
    - To preview data of remote tables
    - To identify structure updates of the remote source

22. For which purposes is a database user required in SAP Datasphere? Note: There are 2 correct answers to this question.
    - To directly access the SAP HANA Cloud database of SAP Datasphere
    - To create a graphical view in SAP Datasphere
    - To access all schemas in SAP Datasphere
    - To provide a secure method of data exchange for 3rd-party tools

23. What are some use cases for an SAP Datasphere task chain? Note: There are 3 correct answers to this question.
    - Create or refresh a view persistency
    - Upload a CSV file into a local table
    - Execute a Replication Flow and a Transformation Flow in sequence
    - Run an Open SQL schema procedure
    - Execute a data action for a planning function
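The task chain use case above (executing a Replication Flow and a Transformation Flow in sequence) boils down to ordered execution with fail-fast behavior. As a conceptual analogy only, not the SAP Datasphere API (`run_task_chain` and the step names are invented for illustration), a task chain behaves like a runner that executes each step in order and stops at the first failure:

```python
# Hypothetical sketch of task-chain semantics (NOT the SAP Datasphere API):
# run named steps strictly in sequence; stop and report on the first failure.

def run_task_chain(steps):
    """Execute (name, callable) steps in order; fail fast on any exception."""
    completed = []
    for name, task in steps:
        try:
            task()
        except Exception as exc:
            return {"status": "failed", "failed_step": name,
                    "completed": completed, "error": str(exc)}
        completed.append(name)
    return {"status": "completed", "completed": completed}

# Placeholder callables standing in for a Replication Flow followed by a
# Transformation Flow.
steps = [
    ("replication_flow", lambda: None),
    ("transformation_flow", lambda: None),
]
result = run_task_chain(steps)
```

The fail-fast return mirrors why sequencing matters in the exam scenario: a Transformation Flow that reads the replicated table should not run if the preceding Replication Flow failed.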
24. Which semantic usage type does SAP recommend you use in an SAP Datasphere graphical view to model master data?
    - Analytical Dataset
    - Relational Dataset
    - Fact
    - Dimension

25. What are the prerequisites for loading data using the Data Provisioning Agent (DP Agent) for SAP Datasphere? Note: There are 2 correct answers to this question.
    - The DP Agent is installed and configured on a local host
    - The data provisioning adapter is installed
    - The Cloud Connector is installed on a local host
    - The DP Agent is configured for a dedicated space in SAP Datasphere

26. How can you join two existing artifacts in SAP Datasphere? Note: There are 2 correct answers to this question.
    - Create an Analytic Model based on the first artifact and add the second artifact as the "Used in" property
    - Create a graphical view and select the Join node icon
    - Create an SQL view with a JOIN operation
    - Create a graphical view, drag one artifact to the canvas, and drop the second one on top of the first

27. Which entity can be used as a direct source of an SAP Datasphere analytic model?
    - Business entities of semantic type Dimension
    - Views of semantic type Fact
    - Tables of semantic type Hierarchy
    - Remote tables of semantic type Text

28. You want to combine external data with internal data via product ID. Although the data may be inconsistent (for example, the external data contains the letter "O" where the internal data contains the digit "0"), you still want to combine them. Which artifact should you use for matching?
    - Analytic Model
    - Entity Relationship Model
    - Graphical View
    - Intelligent Lookup

29. Which of the following SAP Datasphere objects can you create in the Data Builder? Note: There are 3 correct answers to this question.
    - Intelligent Lookups
    - Spaces
    - Connections
    - Task Chains
    - Replication Flows

30. Which of the following can you do with an SAP Datasphere Data Flow? Note: There are 3 correct answers to this question.
    - Write data to a table in a different SAP Datasphere tenant
    - Integrate data from different sources into one table
    - Delete records from a target table
    - Fill different target tables in parallel
    - Use a Python script for data transformation
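The product-ID scenario above (letter "O" in the external data where the internal data has the digit "0") is the kind of inconsistent-key matching that Intelligent Lookup handles inside SAP Datasphere. As a hypothetical Python sketch, not the product's actual implementation (both function names are invented), the core idea is canonicalizing the keys before pairing rows:

```python
# Hypothetical illustration (NOT SAP Datasphere's Intelligent Lookup code):
# normalize product IDs so near-identical keys, e.g. "O" vs. "0", still match.

def normalize_product_id(pid: str) -> str:
    """Canonicalize a product ID: trim, uppercase, map letter O to digit 0."""
    return pid.strip().upper().replace("O", "0")

def match_records(internal, external):
    """Pair internal and external rows on the normalized product ID."""
    index = {normalize_product_id(row["product_id"]): row for row in internal}
    matches = []
    for ext in external:
        key = normalize_product_id(ext["product_id"])
        if key in index:
            matches.append((index[key], ext))
    return matches

internal = [{"product_id": "P-1001", "qty": 5}]
external = [{"product_id": "p-1OO1", "price": 9.99}]  # letter "O", not "0"
paired = match_records(internal, external)  # the two rows pair up despite the mismatch
```

The sketch only covers one simple normalization rule; the point is that matching on a cleaned-up key, rather than on the raw values, is what lets inconsistent datasets be combined.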