
DP-900 Practice - Part 3

Title of test:
DP-900 Practice - Part 3

Description:
Azure Data Fundamentals - DP-900

Author:
ECS

Creation Date:
21/11/2021

Category:
Others

Number of questions: 47
Content:
91. You need to design and model a database by using a graphical tool that supports project-oriented offline database development. What should you use?
- Microsoft SQL Server Data Tools (SSDT)
- Microsoft SQL Server Management Studio (SSMS)
- Azure Databricks
- Azure Data Studio
92. Match the security components to the appropriate scenarios.
- Authentication
- Firewall
- Encryption
93. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Azure Table Storage supports multiple read replicas.
- Azure Table Storage supports multiple write regions.
- The Azure Cosmos DB Table API supports multiple read replicas.
- The Azure Cosmos DB Table API supports multiple write regions.
94. Match the types of data stores to the appropriate scenarios.
- Key/value
- Object
- Graph
95. You have an Azure Cosmos DB account that uses the Core (SQL) API. Which two settings can you configure at the container level? Each correct answer presents a complete solution. (Choose two.)
- the throughput
- the read region
- the partition key
- the API
96. Your company is designing a data store that will contain student data. The data has the following format. Which type of data store should you use?
- graph
- key/value
- object
- columnar
97. Which storage solution supports role-based access control (RBAC) at the file and folder level?
- Azure Disk Storage
- Azure Data Lake Storage
- Azure Blob Storage
- Azure Queue Storage
98. You need to store data in Azure Blob storage for seven years to meet your company's compliance requirements. The retrieval time of the data is unimportant. The solution must minimize storage costs. Which storage tier should you use?
- Archive
- Hot
- Cool
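The tier trade-off behind this question can be sketched as a toy decision helper. The tier names (Hot, Cool, Archive) are real Azure Blob access tiers; the decision rules below are a study-note simplification, not Azure's actual pricing logic:

```python
def pick_blob_tier(needs_fast_reads: bool, reads_often: bool) -> str:
    """Toy helper mapping access needs to an Azure Blob access tier.

    Simplified for study purposes: Archive is cheapest to store but slow
    to retrieve; Hot is cheapest to access but dearest to store; Cool
    sits in between.
    """
    if not needs_fast_reads:
        return "Archive"  # lowest storage cost; retrieval can take hours
    if reads_often:
        return "Hot"      # highest storage cost, lowest access cost
    return "Cool"         # cheaper storage, higher access cost than Hot

# Seven-year compliance data with unimportant retrieval time:
print(pick_blob_tier(needs_fast_reads=False, reads_often=False))  # Archive
```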
99. Which type of non-relational data store supports a flexible schema, stores data as JSON files, and stores all the data for an entity in the same document?
- document
- columnar
- graph
- time series
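The "all the data for an entity in the same document" idea can be pictured as a single JSON document. The field names below are invented for illustration:

```python
import json

# A single customer entity, including its orders, kept in one document.
# The schema is flexible: another customer document could omit "orders"
# or add new fields without any migration step.
customer_doc = {
    "id": "customer-001",
    "name": "Contoso Ltd.",
    "orders": [
        {"orderId": 1, "total": 19.99},
        {"orderId": 2, "total": 42.00},
    ],
}

print(json.dumps(customer_doc, indent=2))
```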
100. Match the Azure Cosmos DB APIs to the appropriate data structures.
- Cassandra API
- Gremlin API
- MongoDB API
- Table API
101. To configure an Azure Storage account to support both security at the folder level and atomic directory manipulation:
- enable the hierarchical namespace
- set Account kind to BlobStorage
- set Performance to Premium
- set Replication to Read-access geo-redundant storage (RA-GRS)
102. You can query a graph database in Azure Cosmos DB:
- as a JSON document by using a SQL-like language
- as a partitioned row store by using Cassandra Query Language (CQL)
- as a partitioned row store by using Language-Integrated Query (LINQ)
- as nodes and edges by using the Gremlin language
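A graph query walks nodes and edges rather than scanning rows. A minimal in-memory sketch of that traversal style in pure Python (a real Gremlin query such as `g.V().hasLabel('person').out('knows')` would run against the Cosmos DB Gremlin endpoint; the names here are invented):

```python
# Tiny adjacency-list graph: nodes connected by labeled, directed edges.
edges = {
    "alice": [("knows", "bob")],
    "bob": [("knows", "carol")],
    "carol": [],
}

def out(node: str, label: str) -> list:
    """Follow outgoing edges with a given label, like Gremlin's out() step."""
    return [dst for (lbl, dst) in edges[node] if lbl == label]

print(out("alice", "knows"))  # ['bob']
```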
103. You manage an application that stores data in a shared folder on a Windows server. You need to move the shared folder to Azure Storage. Which type of Azure Storage should you use?
- Queue
- Blob
- File
- Table
104. Your company is designing a database that will contain session data for a website. The data will include notifications, personalization attributes, and products that are added to a shopping cart. Which type of data store will provide the lowest latency to retrieve the data?
- Key/value
- Graph
- Columnar
- Document
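Key/value stores win on latency because retrieval is a single lookup by key, with no joins, scans, or schema interpretation. In Python terms, a session store reduces to a dictionary (the session fields below are invented for illustration):

```python
# Session data keyed by session ID: one O(1) lookup returns the whole value.
sessions = {
    "session-123": {
        "cart": ["widget", "gadget"],
        "notifications": 2,
        "theme": "dark",
    }
}

def get_session(session_id: str) -> dict:
    # The store treats the value as an opaque blob addressed only by its
    # key; never inspecting the value's shape is what keeps reads fast.
    return sessions[session_id]

print(get_session("session-123")["cart"])  # ['widget', 'gadget']
```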
105. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- When ingesting data from Azure Data Lake Storage across Azure regions, you will incur costs for bandwidth.
- You can use blob, table, and file storage in the same Azure Storage account.
- You implement Azure Data Lake Storage by creating an Azure Storage account.
106. When using the Azure Cosmos DB Gremlin API, the container resource type is projected as a:
- Graph
- Table
- Partition Key
- Document
107. Which scenario is an example of a streaming workload?
- sending transactions that are older than a month to an archive
- sending transactions daily from point of sale (POS) devices
- sending telemetry data from edge devices
- sending cloud infrastructure metadata every 30 minutes
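The distinction this question tests: a streaming workload handles each event as it arrives, while a batch workload accumulates events and processes them together on a schedule. A minimal sketch of both styles (the readings and threshold are invented):

```python
def telemetry_stream():
    """Simulated sensor readings arriving one at a time from edge devices."""
    yield from [21.5, 21.7, 22.1]

# Streaming: act on each event as soon as it is produced.
for reading in telemetry_stream():
    if reading > 22.0:
        print(f"alert: {reading}")

# Batch (contrast): accumulate first, then process the whole set at once,
# like the daily point-of-sale upload in the other answer option.
batch = list(telemetry_stream())
print(f"daily average: {sum(batch) / len(batch):.2f}")
```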
108. Batch workloads:
- process data in memory, row-by-row
- collect and process data at most once a day
- process data in near real time as new data is received
- collect data and then process the data when a condition is met
109. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Processing salary payments once a month is an example of a batch workload.
- A wind turbine that sends 50 sensor readings per second is an example of a streaming workload.
- A home electricity meter that sends a reading once a day to an energy provider is an example of a streaming workload.
110. You need to gather real-time telemetry data from a mobile application. Which type of workload describes this scenario?
- Online Transaction Processing (OLTP)
- Batch
- Massively parallel processing (MPP)
- Streaming
111. You have a SQL pool in Azure Synapse Analytics that is only used actively every night for eight hours. You need to minimize the cost of the SQL pool during idle times. The solution must ensure that the data remains intact. What should you do on the SQL pool?
- Scale down the data warehouse units (DWUs)
- Pause the pool
- Create a user-defined restore point
- Delete the pool
112. Which Azure Data Factory component initiates the execution of a pipeline?
- A control flow
- A trigger
- A parameter
- An activity
113. Match the types of activities to the appropriate Azure Data Factory activities.
- Control
- Data movement
- Data transformation
114. What are three characteristics of an Online Transaction Processing (OLTP) workload? Each correct answer presents a complete solution. (Choose three.)
- Denormalized data
- Heavy writes and moderate reads
- Light writes and heavy reads
- Schema on write
- Schema on read
- Normalized data
115. Which two activities can be performed entirely by using the Microsoft Power BI service? Each correct answer presents a complete solution. (Choose two.)
- Report and dashboard creation
- Report sharing and distribution
- Data modeling
- Data acquisition and preparation
116. In Azure Data Factory, you can use ____________ to orchestrate pipeline activities that depend on the output of other pipeline activities.
- a control flow
- a dataset
- a linked service
- an integration runtime
117. You have a quality assurance application that reads data from a data warehouse. Which type of processing does the application use?
- Online Transaction Processing (OLTP)
- Batch processing
- Online Analytical Processing (OLAP)
- Stream processing
118. Which three objects can be added to a Microsoft Power BI dashboard? Each correct answer presents a complete solution. (Choose three.)
- A report page
- A Microsoft PowerPoint slide
- A visualization from a report
- A dataflow
- A text box
119. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- A Microsoft Power BI dashboard is associated with a single workspace.
- A Microsoft Power BI dashboard can only display visualizations from a single dataset.
- A Microsoft Power BI dashboard can display visualizations from a Microsoft Excel workbook.
120. Which Azure Data Factory component provides the compute environment for activities?
- A linked service
- An integration runtime
- A control flow
- A pipeline
121. You need to use Transact-SQL to query files in Azure Data Lake Storage from an Azure Synapse Analytics data warehouse. What should you use to query the files?
- Azure Functions
- Microsoft SQL Server Integration Services (SSIS)
- PolyBase
- Azure Data Factory
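The pattern behind this question: PolyBase exposes files in the data lake as external tables that ordinary T-SQL can then query. A hedged sketch of the DDL involved, held in a Python string for reference; the table, data source, and file format names are invented, and the exact options vary (see the Synapse documentation):

```python
# Illustrative PolyBase external-table DDL; all object names are invented.
# It assumes an external data source and file format were defined earlier.
polybase_ddl = """
CREATE EXTERNAL TABLE dbo.SalesExternal (
    SaleId INT,
    Amount DECIMAL(10, 2)
)
WITH (
    LOCATION = '/sales/',          -- folder in the linked Data Lake store
    DATA_SOURCE = MyDataLake,      -- external data source (assumed)
    FILE_FORMAT = ParquetFormat    -- external file format (assumed)
);
"""

# Once the external table exists, plain T-SQL reads the lake files:
query = "SELECT SaleId, Amount FROM dbo.SalesExternal;"
print(query)
```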
122. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Azure Databricks is an Apache Spark-based collaborative analytics platform.
- Azure Analysis Services is used for transactional workloads.
- Azure Data Factory orchestrates data integration workflows.
123. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Batch processing can output data to a file store.
- Batch processing can output data to a relational database.
- Batch processing can output data to a NoSQL database.
124. Match the types of visualizations to the appropriate descriptions.
- Treemap
- Key influencers
- Scatter
125. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Platform as a service (PaaS) database offerings in Azure require less setup and configuration effort than infrastructure as a service (IaaS) database offerings.
- Platform as a service (PaaS) database offerings in Azure provide administrators with the ability to control and update the operating system.
- All platform as a service (PaaS) database offerings in Azure can be paused to reduce costs.
126. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Platform as a service (PaaS) database offerings in Azure provide built-in high availability.
- Platform as a service (PaaS) database offerings in Azure provide configurable scaling options.
- Platform as a service (PaaS) database offerings in Azure reduce the administrative overhead for managing hardware.
127. You have an application that runs on Windows and requires access to a mapped drive. Which Azure service should you use?
- Azure Files
- Azure Blob Storage
- Azure Cosmos DB
- Azure Table Storage
128. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- The Azure Cosmos DB API is configured separately for each database in an Azure Cosmos DB account.
- Partition keys are used in Azure Cosmos DB to optimize queries.
- Items contained in the same Azure Cosmos DB logical partition can have different partition keys.
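Why items in one logical partition cannot have different partition keys: the key's value is what defines the logical partition, and it also routes each item to a physical partition. A simplified sketch of that routing (the hashing here is illustrative, not Cosmos DB's actual algorithm, and the item fields are invented):

```python
NUM_PHYSICAL_PARTITIONS = 4

def route(partition_key_value: str) -> int:
    """Map a partition key value to a physical partition (toy hash)."""
    return hash(partition_key_value) % NUM_PHYSICAL_PARTITIONS

# Items partitioned by the /city path: every item with the same city
# value belongs to the same logical partition by definition.
items = [
    {"id": "1", "city": "Seattle"},
    {"id": "2", "city": "Seattle"},
    {"id": "3", "city": "London"},
]

# Same key value, same destination; queries scoped to one key value
# touch a single partition, which is what makes them efficient.
print(route(items[0]["city"]) == route(items[1]["city"]))  # True
```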
129. Which two Azure services can be used to provision Apache Spark clusters? Each correct answer presents a complete solution. (Choose two.)
- Azure Time Series Insights
- Azure HDInsight
- Azure Databricks
- Azure Log Analytics
130. Match the Azure services to the appropriate locations in the architecture.
- Azure Cognitive Search
- Azure Data Catalog
- Azure Data Factory
- Azure Synapse Analytics
131. In a data warehousing workload, data:
- from a single source is distributed to multiple locations
- from multiple sources is combined in a single location
- is added to a queue for multiple systems to process
- is used to train machine learning models
132. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- A pipeline is a representation of a data structure within Azure Data Factory.
- Azure Data Factory pipelines can execute other pipelines.
- A processing step within an Azure Data Factory pipeline is an activity.
133. Match the Azure services to the appropriate requirements.
- Azure Data Factory
- Azure Data Lake Storage
- Azure SQL Database
- Azure Synapse Analytics
134. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Azure Synapse Analytics scales storage and compute independently.
- Azure Synapse Analytics can be paused to reduce compute costs.
- An Azure Synapse Analytics data warehouse has a fixed storage capacity.
135. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Azure Data Studio can be used to query an Azure SQL database from a device that runs macOS.
- Microsoft SQL Server Management Studio (SSMS) enables users to create and use SQL Notebooks.
- Azure Data Studio can be used to restore a database.
136. For each of the following statements, select Yes if the statement is true. Otherwise, select No.
- Azure Databricks can consume data from Azure SQL Database.
- Azure Databricks can consume data from Azure Event Hubs.
- Azure Databricks can consume data from Azure Cosmos DB.
137. Match the datastore services to the appropriate descriptions.
- Azure Blob Storage
- Azure Cosmos DB
- Azure Files
- Azure Table Storage