
Free Microsoft DP-600 Practice Exam with Questions & Answers

Question 1

You need to migrate the Research division data for Productline2. The solution must meet the data preparation requirements. How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 1

Options:
Question 2

Which workspace role assignments should you recommend for ResearchReviewersGroup1 and ResearchReviewersGroup2? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 2

Options:
Question 3

You need to refresh the Orders table of the Online Sales department. The solution must meet the semantic model requirements. What should you include in the solution?

Options:
A.

an Azure Data Factory pipeline that executes a dataflow to retrieve the minimum value of the OrderID column in the destination lakehouse

B.

an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the maximum value of the OrderID column in the destination lakehouse

C.

an Azure Data Factory pipeline that executes a dataflow to retrieve the maximum value of the OrderID column in the destination lakehouse

D.

an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the minimum value of the OrderID column in the destination lakehouse
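
For background on the pattern this question references: an incremental load keeps a watermark (here, the highest OrderID already present in the destination lakehouse) and loads only newer rows. The following is a minimal PySpark sketch of that idea, not the Azure Data Factory pipeline itself; the table name Orders and column OrderID come from the scenario, while source_orders is a hypothetical placeholder for the Online Sales source.

```python
# Minimal sketch of a watermark-based incremental load (illustrative only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# 1. Read the current high-water mark from the destination Orders table.
max_order_id = (
    spark.table("Orders")
    .agg(F.max("OrderID").alias("max_id"))
    .collect()[0]["max_id"]
) or 0  # fall back to 0 if the table is empty

# 2. Keep only source rows newer than the watermark.
#    "source_orders" is a hypothetical placeholder for the Online Sales source.
new_rows = spark.table("source_orders").filter(F.col("OrderID") > max_order_id)

# 3. Append the new rows to the destination Orders table.
new_rows.write.mode("append").saveAsTable("Orders")
```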

Question 4

You need to recommend which type of Fabric capacity SKU meets the data analytics requirements for the Research division. What should you recommend?

Options:
A.

EM

B.

F

C.

P

D.

A

Question 5

You need to recommend a solution to group the Research division workspaces.

What should you include in the recommendation? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 5

Options:
Question 6

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.

What should you do?

Options:
A.

Create a pipeline that has dependencies between activities and schedule the pipeline.

B.

Create and schedule a Spark job definition.

C.

Create a dataflow that has multiple steps and schedule the dataflow.

D.

Create and schedule a Spark notebook.

Question 7

You have a Fabric tenant that contains a lakehouse named Lakehouse1.

Readings from 100 IoT devices are appended to a Delta table in Lakehouse1. Each set of readings is approximately 25 KB. Approximately 10 GB of data is received daily.

All the table and SparkSession settings are set to the default.

You discover that queries are slow to execute. In addition, the lakehouse storage contains data and log files that are no longer used.

You need to remove the files that are no longer used and combine small files into larger files with a target size of 1 GB per file.

What should you do? To answer, drag the appropriate actions to the correct requirements. Each action may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

DP-600 Question 7

Options:
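
As background for the two maintenance operations involved here, Delta Lake tables in a Fabric lakehouse can be compacted with OPTIMIZE and purged of unreferenced files with VACUUM. Below is a minimal PySpark sketch assuming the delta-spark API and a table named Table1; the optimize maxFileSize setting used to target roughly 1 GB files is an assumption about the runtime's Delta configuration and should be verified.

```python
# Minimal sketch of Delta table maintenance in a Spark notebook.
# Assumes the delta-spark package is available and the table is named "Table1".
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Target ~1 GB output files during compaction
# (config name is the Delta Lake default; verify against your runtime).
spark.conf.set("spark.databricks.delta.optimize.maxFileSize", 1024 * 1024 * 1024)

table = DeltaTable.forName(spark, "Table1")

# Combine many small files into larger ones.
table.optimize().executeCompaction()

# Remove data files no longer referenced by the table
# (default retention is 7 days = 168 hours).
table.vacuum(168)
```
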
Question 8

You have a Fabric tenant that contains a warehouse named DW1 and a lakehouse named LH1. DW1 contains a table named Sales.Product. LH1 contains a table named Sales.Orders.

You plan to schedule an automated process that will create a new point-in-time (PIT) table named Sales.ProductOrder in DW1. Sales.ProductOrder will be built by using the results of a query that will join Sales.Product and Sales.Orders.

You need to ensure that the types of columns in Sales.ProductOrder match the column types in the source tables. The solution must minimize the number of operations required to create the new table.

Which operation should you use?

Options:
A.

CREATE TABLE AS SELECT (CTAS)

B.

INSERT INTO

C.

CREATE MATERIALIZED VIEW AS SELECT

D.

CREATE TABLE AS CLONE OF
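
For reference, CREATE TABLE AS SELECT (CTAS) creates the target table and infers its column types from the query in a single operation. The sketch below is a hypothetical illustration that submits such a statement to DW1 through its SQL connection using pyodbc; the connection string, join key, and selected columns are placeholders, and the three-part name for the lakehouse table assumes LH1 and DW1 share a workspace.

```python
# Hypothetical sketch: run a CTAS statement against the DW1 warehouse
# via its SQL connection string using pyodbc. Connection string, join key,
# and column list are illustrative placeholders, not from the scenario.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<dw1-sql-endpoint>;Database=DW1;"
    "Authentication=ActiveDirectoryInteractive;"
)

ctas_sql = """
CREATE TABLE Sales.ProductOrder
AS
SELECT o.OrderID, o.OrderDate, p.ProductID, p.ProductName
FROM   LH1.Sales.Orders AS o              -- three-part name: lakehouse table in the same workspace
JOIN   Sales.Product    AS p ON p.ProductID = o.ProductID;
"""

with pyodbc.connect(conn_str) as conn:
    # The new table's column types are taken from the query result.
    conn.execute(ctas_sql)
```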

Question 9

You have a Microsoft Fabric tenant that contains a dataflow.

You are exploring a new semantic model.

From Power Query, you need to view column information as shown in the following exhibit.

DP-600 Question 9

Which three Data view options should you select? Each correct answer presents part of the solution.

NOTE: Each correct answer is worth one point.

Options:
A.

Enable column profile

B.

Show column quality details

C.

Show column profile in details pane

D.

Enable details pane

E.

Show column value distribution

Question 10

You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a table named Table1.

You are creating a new data pipeline.

You plan to copy external data to Table1. The schema of the external data changes regularly.

You need the copy operation to meet the following requirements:

• Replace Table1 with the schema of the external data.

• Replace all the data in Table1 with the rows in the external data.

You add a Copy data activity to the pipeline. What should you do for the Copy data activity?

Options:
A.

From the Source tab, add additional columns.

B.

From the Destination tab, set Table action to Overwrite.

C.

From the Settings tab, select Enable staging.

D.

From the Source tab, select Enable partition discovery.

E.

From the Source tab, select Recursively.