Title of test: Azure
Description: Azure questions
Creation Date: 2025/11/27
Category: Others
Number of questions: 26

Content:

What is a Linked Service in ADF?. A Linked Service defines the connection information (for example, a connection string) that ADF uses to connect to data stores such as SQL Database or Blob Storage; it is similar to a "connection manager" in SSIS. Nothing.
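
As an illustration of what a Linked Service looks like behind the scenes, here is a hedged sketch of an Azure SQL Database linked service definition, written as a Python dict; the name AzureSqlLS and the connection string are placeholders, not real values.

    # Sketch of an ADF linked service definition (all names are placeholders).
    linked_service = {
        "name": "AzureSqlLS",
        "properties": {
            "type": "AzureSqlDatabase",
            "typeProperties": {
                # In practice this is parameterized or resolved from Key Vault.
                "connectionString": "Server=tcp:myserver.database.windows.net;Database=mydb"
            },
        },
    }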

How do you improve Parquet read and write performance in ADF?. Enable V-Order. Enable Optimize Write. Enable SQL.
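
Assuming a Fabric Spark notebook, both options can be toggled at session level; the configuration keys below follow the Fabric documentation but may differ across runtime versions, so treat this as a sketch.

    # Session-level write-optimization toggles in Fabric Spark ("spark" is the
    # notebook's built-in session; key names may vary by runtime version).
    spark.conf.set("spark.sql.parquet.vorder.enabled", "true")             # V-Order
    spark.conf.set("spark.microsoft.delta.optimizeWrite.enabled", "true")  # Optimize Write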

What is the Flowlet transformation?. A flowlet lets you reuse the same transformation logic across multiple data flows. Nothing.

What are the core components of ADF?. Integration Runtime (IR) – Compute used for data movement/transform. Pipelines – Logical containers of activities. Activities – Tasks such as copy, lookup, data transformation. Data Flows – Visual data transformation (Mapping & Wrangling). Datasets – Metadata representing data sources. Linked Services – Connection information. Triggers – Pipeline scheduling.
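
A skeletal pipeline definition shows how these components relate; every name below is a hypothetical placeholder, and the activity is stripped down to its references.

    # Skeleton: pipeline -> activity -> datasets; each dataset would in turn
    # reference a linked service, and a trigger would start the pipeline.
    pipeline = {
        "name": "CopySalesPipeline",
        "properties": {
            "activities": [
                {
                    "name": "CopySales",
                    "type": "Copy",
                    "inputs": [{"referenceName": "SalesSourceDS", "type": "DatasetReference"}],
                    "outputs": [{"referenceName": "SalesSinkDS", "type": "DatasetReference"}],
                }
            ]
        },
    }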

What is Integration Runtime (IR) in ADF?. IR is the compute infrastructure used to run pipelines. Nothing.

What is the difference between Copy Activity and Data Flow?. Copy Activity moves data from a source to a sink with little or no transformation. Data Flow performs visual, scalable transformations on managed Spark.

How do you secure credentials in ADF?. Store secrets in Azure Key Vault. Nothing.
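
A hedged sketch of how a linked service references a Key Vault secret instead of embedding the credential; the linked service name AKV_LS and the secret name sql-conn are hypothetical.

    # Connection string resolved from Azure Key Vault at runtime; no secret is
    # stored in the ADF definition itself. "AKV_LS" and "sql-conn" are placeholders.
    secure_linked_service = {
        "name": "AzureSqlLS",
        "properties": {
            "type": "AzureSqlDatabase",
            "typeProperties": {
                "connectionString": {
                    "type": "AzureKeyVaultSecret",
                    "store": {"referenceName": "AKV_LS", "type": "LinkedServiceReference"},
                    "secretName": "sql-conn",
                }
            },
        },
    }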

What types of triggers does ADF support?. Schedule trigger – Run at specific times. Tumbling window trigger – Time-based with stateful intervals. Event trigger – Blob created/deleted events. Manual trigger – On-demand execution.
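
For example, a schedule trigger's recurrence block might look like the following sketch; the trigger and pipeline names are placeholders.

    # Sketch of an ADF schedule trigger firing a pipeline once a day at 02:00 UTC.
    schedule_trigger = {
        "name": "DailyTrigger",
        "properties": {
            "type": "ScheduleTrigger",
            "typeProperties": {
                "recurrence": {
                    "frequency": "Day",   # Minute, Hour, Day, Week, or Month
                    "interval": 1,
                    "startTime": "2025-12-01T02:00:00Z",
                    "timeZone": "UTC",
                }
            },
            "pipelines": [
                {"pipelineReference": {"referenceName": "NightlyLoad", "type": "PipelineReference"}}
            ],
        },
    }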

A company wants to organize its data in OneLake with a clear separation between raw, transformed, and curated data. What is the recommended approach?. Use different workspaces for each stage of the data lifecycle. Use different folders within a single lakehouse. Use different Fabric capacities for each stage.

You need to grant a group of analysts read access to all the tables in a specific Lakehouse within a workspace. What is the most efficient permission to assign?. Lakehouse Viewer. Workspace Member. Workspace Admin.

Which of the following data file formats are optimized for analytical querying in a Fabric Lakehouse? (Choose two). Delta table. Parquet. XML. CSV.

You are loading data from an on-premises SQL Server database into a Fabric Lakehouse. Which two Fabric data engineering tools can you use for this incremental ETL/ELT process?. Data Pipeline. Dataflow Gen2. Power BI Dataset.
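
The incremental part of such a load usually follows a watermark pattern; here is a minimal PySpark sketch, assuming a staging table landed by the pipeline and a ModifiedDate column (both hypothetical).

    # Watermark-based incremental append ("spark" is the notebook session;
    # table and column names are hypothetical).
    from pyspark.sql import functions as F

    last_watermark = "2025-11-26T00:00:00"  # normally read from a control table

    incoming = (
        spark.read.table("staging_sales")
             .filter(F.col("ModifiedDate") > F.lit(last_watermark))
    )
    incoming.write.format("delta").mode("append").saveAsTable("sales")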

What is the primary purpose of a Shortcut in OneLake?. To create a virtual reference to data stored in a different location without moving or duplicating it. To define a relationship between two tables.
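
Once created, a shortcut under a lakehouse's Tables section is read exactly like a local table; assuming a hypothetical shortcut named ExternalSales:

    # Reading through a shortcut; the data stays at its original location.
    df = spark.read.table("ExternalSales")  # "ExternalSales" is a placeholder name
    df.show(5)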

In which two scenarios would you use a Dataflow Gen2 over a Data Pipeline?. To perform complex data transformations using a Power Query graphical interface. To create reusable data transformation logic. To simply copy data from a source to a sink with minimal transformation.

You need to ensure that sensitive data in your Lakehouse is automatically classified and labeled. Which Fabric service should you use?. Microsoft Purview. Azure Policy.

Which SQL endpoint allows you to run T-SQL queries directly against the tables in your Lakehouse?. The T-SQL endpoint. The Power Query endpoint.

A Spark job you are running is failing due to out-of-memory errors. Which two actions can help resolve this?. Increase the Spark pool's node size. Optimize your Spark code to use fewer shuffles. Decrease the number of nodes in the Spark pool.
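
One common way to reduce shuffles, sketched below, is to broadcast the small side of a join so Spark does not shuffle both inputs; the table and column names are hypothetical.

    # Broadcast join: ships the small dimension table to every executor instead
    # of shuffling both sides. Table and column names are placeholders.
    from pyspark.sql.functions import broadcast

    facts = spark.read.table("sales")      # large fact table
    dims = spark.read.table("products")    # small dimension table

    joined = facts.join(broadcast(dims), on="product_id", how="left")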

You want to run a PySpark script to clean and transform data in your Lakehouse every night. Where can you schedule this job to run?. In a Fabric Notebook activity within a Data Pipeline. In the T-SQL Endpoint.
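
The scheduled notebook would contain the cleaning logic itself; a minimal sketch, with all table and column names assumed:

    # Nightly cleaning step a scheduled Fabric notebook might run
    # ("spark" is the notebook session; names are placeholders).
    from pyspark.sql import functions as F

    raw = spark.read.table("raw_orders")
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .filter(F.col("amount") > 0)
    )
    clean.write.format("delta").mode("overwrite").saveAsTable("orders_clean")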

What is the default and recommended method for storing table data in a Fabric Lakehouse?. Using the Delta format. Using the CSV format.
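
Writing a table in Delta format is the default behavior of saveAsTable in a Fabric notebook; a tiny sketch with a throwaway DataFrame:

    # saveAsTable produces a Delta table, keeping the transaction log that
    # formats like CSV lack.
    df = spark.createDataFrame([(1, "A"), (2, "B")], ["id", "region"])
    df.write.format("delta").mode("overwrite").saveAsTable("sales")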

You need to provide a business user with the ability to refresh a Dataflow Gen2 on an ad-hoc basis. What is the minimum role they need in the workspace?. Workspace Contributor. Workspace Admin.

A Shortcut can be created to data in which of the following locations? (Choose two). Another Lakehouse in a different Fabric workspace. An Azure Data Lake Storage Gen2 account. An on-premises SQL Server database.

Which command is used in a Spark notebook to read a Delta table named 'Sales' into a DataFrame?. spark.read.table("Sales"). spark.read.load("Files/Sales.delta"). SELECT * FROM Sales.
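
The first option is the idiomatic one; the same table can also be queried through Spark SQL, which the last option gestures at:

    df = spark.read.table("Sales")           # reads the managed Delta table by name
    df2 = spark.sql("SELECT * FROM Sales")   # equivalent Spark SQL form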

You want to track changes to a Delta table over time to enable point-in-time queries (time travel). Which feature do you use?. Delta Lake Time Travel. Table Shortcuts.
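
In a notebook, time travel looks like the sketch below; version 0 is an arbitrary example, and the option/SQL syntax follows the Delta Lake documentation.

    # Read an earlier version of the Sales Delta table.
    past = spark.read.option("versionAsOf", 0).table("Sales")

    # Equivalent Spark SQL form:
    past_sql = spark.sql("SELECT * FROM Sales VERSION AS OF 0")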

What is the primary benefit of using a Mirrored Database in Fabric?. It provides real-time, seamless access to your operational databases (like Azure SQL DB) without ETL. It transforms your data into a star schema.

In the Fabric Warehousing experience, what is the default and primary language for querying and transforming data?. T-SQL. V-SQL.

Which feature allows you to create a logical representation of data from a Lakehouse inside a Warehouse, without moving the data?. Shortcut. Dataflow.
