DP-200 Exam - Implementing an Azure Data Solution

certleader.com

Your success in the Microsoft DP-200 exam is our sole target, and we develop all of our DP-200 braindumps to facilitate attaining that target. Our DP-200 study material is not only the best you can find, it is also the most detailed and the most up to date. Our DP-200 Practice Exams for Microsoft Data and AI are written to the highest standards of technical accuracy.

We also have free DP-200 dumps questions for you:

NEW QUESTION 1

You develop data engineering solutions for a company.
You must integrate the company’s on-premises Microsoft SQL Server data with Microsoft Azure SQL Database. Data must be transformed incrementally.
You need to implement the data integration solution.
Which tool should you use to configure a pipeline to copy data?

  • A. Use the Copy Data tool with Blob storage linked service as the source
  • B. Use Azure PowerShell with SQL Server linked service as a source
  • C. Use Azure Data Factory UI with Blob storage linked service as a source
  • D. Use the .NET Data Factory API with Blob storage linked service as the source

Answer: C

Explanation:
The Integration Runtime is a customer-managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments.
A linked service defines the information needed for Azure Data Factory to connect to a data resource. We have three resources in this scenario for which linked services are needed:
• On-premises SQL Server
• Azure Blob storage
• Azure SQL Database
Note: Azure Data Factory is a fully managed cloud-based data integration service that orchestrates and automates the movement and transformation of data. The key concept in the ADF model is pipeline. A pipeline is a logical grouping of Activities, each of which defines the actions to perform on the data contained in Datasets. Linked services are used to define the information needed for Data Factory to connect to the data resources.
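As a rough illustration of how a linked service is registered, here is a minimal Azure PowerShell sketch using the AzureRM.DataFactoryV2 module; the factory name, resource group, file path, and connection string are all hypothetical:
$definition = @"
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": {
        "type": "SecureString",
        "value": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=sqladmin;Password=<secret>"
      }
    }
  }
}
"@
# Write the definition to disk, then register it with the factory.
Set-Content -Path .\AzureSqlLinkedService.json -Value $definition
Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName "rg-dev" -DataFactoryName "adf-dev" `
    -Name "AzureSqlLinkedService" -DefinitionFile ".\AzureSqlLinkedService.json"
Similar definitions would be registered for the on-premises SQL Server (via a self-hosted integration runtime) and for Blob storage.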
References:
https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf

NEW QUESTION 2

You develop data engineering solutions for a company. An application creates a database on Microsoft Azure. You have the following code:
Which database and authorization types are used? To answer, select the appropriate option in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit: code sample omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure Cosmos DB
The DocumentClient.CreateDatabaseAsync(Database, RequestOptions) method creates a database resource as an asynchronous operation in the Azure Cosmos DB service.
Box 2: Master Key
Azure Cosmos DB uses two types of keys to authenticate users and provide access to its data and resources: master keys and resource tokens.
Master keys provide access to all the administrative resources for the database account. Master keys:
• Provide access to accounts, databases, users, and permissions.
• Cannot be used to provide granular access to containers and documents.
• Are created during the creation of an account.
• Can be regenerated at any time.
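Because master keys are created with the account, they are typically retrieved at deployment time. A minimal Azure PowerShell sketch (the account and resource group names are hypothetical):
# List the master keys for a Cosmos DB account.
$keys = Invoke-AzureRmResourceAction -Action listKeys `
    -ResourceType "Microsoft.DocumentDb/databaseAccounts" -ApiVersion "2015-04-08" `
    -ResourceGroupName "rgdev" -Name "cosmosdbdev1" -Force
$keys.primaryMasterKey   # pass as the authKey when constructing a DocumentClient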

NEW QUESTION 3

You plan to create a new single database instance of Microsoft Azure SQL Database.
The database must only allow communication from the data engineer’s workstation. You must connect directly to the instance by using Microsoft SQL Server Management Studio.
You need to create and configure the Database. Which three Azure PowerShell cmdlets should you use to develop the solution? To answer, move the appropriate cmdlets from the list of cmdlets to the answer area and arrange them in the correct order.
[Exhibit: cmdlet list omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Step 1: New-AzureRmSqlServer
Creates a server.
Step 2: New-AzureRmSqlServerFirewallRule
New-AzureRmSqlServerFirewallRule creates a firewall rule for a SQL Database server. It can be used to create a server firewall rule that allows access from a specified IP range.
Step 3: New-AzureRmSqlDatabase
Example: Create a database on a specified server
PS C:\> New-AzureRmSqlDatabase -ResourceGroupName "ResourceGroup01" -ServerName "Server01" -DatabaseName "Database01"
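Putting the three steps together, a minimal end-to-end sketch might look like this (the location, credential, and workstation IP are hypothetical):
# Step 1: create the logical server (prompts for the admin credential).
New-AzureRmSqlServer -ResourceGroupName "ResourceGroup01" -ServerName "server01" `
    -Location "East US" -SqlAdministratorCredentials (Get-Credential)
# Step 2: allow only the data engineer's workstation IP.
New-AzureRmSqlServerFirewallRule -ResourceGroupName "ResourceGroup01" -ServerName "server01" `
    -FirewallRuleName "EngineerWorkstation" -StartIpAddress "203.0.113.10" -EndIpAddress "203.0.113.10"
# Step 3: create the database on the new server.
New-AzureRmSqlDatabase -ResourceGroupName "ResourceGroup01" -ServerName "server01" -DatabaseName "Database01"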
References:
https://docs.microsoft.com/en-us/azure/sql-database/scripts/sql-database-create-and-configure-database-powersh

NEW QUESTION 4

You need to develop a pipeline for processing data. The pipeline must meet the following requirements.
• Scale up and down resources for cost reduction.
• Use an in-memory data processing engine to speed up ETL and machine learning operations.
• Use streaming capabilities.
• Provide the ability to code in SQL, Python, Scala, and R.
• Integrate workspace collaboration with Git.
What should you use?

  • A. HDInsight Spark Cluster
  • B. Azure Stream Analytics
  • C. HDInsight Hadoop Cluster
  • D. Azure SQL Data Warehouse

Answer: A

Explanation:
An HDInsight Spark cluster meets all of the requirements: Apache Spark is an in-memory processing engine, supports streaming workloads through Spark Streaming, and lets you code in SQL, Python, Scala, and R. Azure Stream Analytics offers only a SQL-like query language, so it cannot satisfy the language requirement.

NEW QUESTION 5

You manage a solution that uses Azure HDInsight clusters.
You need to implement a solution to monitor cluster performance and status. Which technology should you use?

  • A. Azure HDInsight .NET SDK
  • B. Azure HDInsight REST API
  • C. Ambari REST API
  • D. Azure Log Analytics
  • E. Ambari Web UI

Answer: E

Explanation:
Ambari is the recommended tool for monitoring utilization across the whole cluster. The Ambari dashboard shows easily glanceable widgets that display metrics such as CPU, network, YARN memory, and HDFS disk usage. The specific metrics shown depend on cluster type. The “Hosts” tab shows metrics for individual nodes so you can ensure the load on your cluster is evenly distributed.
The Apache Ambari project is aimed at making Hadoop management simpler by developing software for provisioning, managing, and monitoring Apache Hadoop clusters. Ambari provides an intuitive, easy-to-use Hadoop management web UI backed by its RESTful APIs.
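For comparison, the same cluster metrics are exposed programmatically through the Ambari REST API; a hedged PowerShell sketch (the cluster name and credential are hypothetical, and the endpoint path assumes the standard HDInsight Ambari URL):
$cred = Get-Credential            # the cluster login (admin) credential
$cluster = "mycluster"            # hypothetical cluster name
Invoke-RestMethod -Uri "https://$cluster.azurehdinsight.net/api/v1/clusters/$cluster" -Credential $cred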
References:
https://azure.microsoft.com/en-us/blog/monitoring-on-hdinsight-part-1-an-overview/
https://ambari.apache.org/

NEW QUESTION 6

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company uses Azure Data Lake Storage Gen1 to store big data related to consumer behavior. You need to implement logging.
Solution: Use information stored in Azure Active Directory reports.
Does the solution meet the goal?

  • A. Yes
  • B. No

Answer: B

NEW QUESTION 7

A company is planning to use Microsoft Azure Cosmos DB as the data store for an application. You have the following Azure CLI command:
az cosmosdb create --name "cosmosdbdev1" --resource-group "rgdev"
You need to minimize latency and expose the SQL API. How should you complete the command? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit: answer area omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Eventual
With Azure Cosmos DB, developers can choose from five well-defined consistency models on the consistency spectrum. From strongest to weakest, the models are: strong, bounded staleness, session, consistent prefix, and eventual consistency.
The following image shows the different consistency levels as a spectrum.
[Image: consistency spectrum omitted]
Box 2: GlobalDocumentDB
Select Core(SQL) to create a document database and query by using SQL syntax.
Note: The API determines the type of account to create. Azure Cosmos DB provides five APIs: Core(SQL) and MongoDB for document databases, Gremlin for graph databases, Azure Table, and Cassandra.
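Putting the two box values back into the stem gives a plausible completed command; --kind and --default-consistency-level are standard az cosmosdb create options, though the mapping to the answer boxes is an assumption:
az cosmosdb create --name "cosmosdbdev1" --resource-group "rgdev" --kind GlobalDocumentDB --default-consistency-level Eventual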
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels
https://docs.microsoft.com/en-us/azure/cosmos-db/create-sql-api-dotnet

NEW QUESTION 8

A company plans to use Azure Storage for file storage purposes. Compliance rules require:
• A single storage account to store all operations, including reads, writes, and deletes
• Retention of an on-premises copy of historical operations
You need to configure the storage account.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Configure the storage account to log read, write and delete operations for service type Blob
  • B. Use the AzCopy tool to download log data from $logs/blob
  • C. Configure the storage account to log read, write and delete operations for service type Table
  • D. Use the storage client to download log data from $logs/table
  • E. Configure the storage account to log read, write and delete operations for service type Queue

Answer: AB

Explanation:
Storage Logging logs request data in a set of blobs in a blob container named $logs in your storage account. This container does not show up if you list all the blob containers in your account but you can see its contents if you access it directly.
To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. Many storage-browsing tools enable you to download blobs from your storage account; you can also use the Azure Storage team-provided command-line Azure Copy Tool (AzCopy) to download your log data.
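A hedged sketch of both actions in Azure PowerShell (classic Azure.Storage module; the account name, key, and local path are hypothetical, and the AzCopy line uses the older v7-style syntax):
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<account-key>"
# Log read, write, and delete operations for the Blob service and retain them for 90 days.
Set-AzureStorageServiceLoggingProperty -ServiceType Blob -LoggingOperations Read,Write,Delete `
    -RetentionDays 90 -Context $ctx
# Download the $logs/blob container to an on-premises folder.
AzCopy /Source:https://mystorageacct.blob.core.windows.net/`$logs/blob /Dest:C:\StorageLogs /SourceKey:"<account-key>" /S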
References:
https://docs.microsoft.com/en-us/rest/api/storageservices/enabling-storage-logging-and-accessing-log-data

NEW QUESTION 9

You develop data engineering solutions for a company.
You need to ingest and visualize real-time Twitter data by using Microsoft Azure.
Which three technologies should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Event Grid topic
  • B. Azure Stream Analytics Job that queries Twitter data from an Event Hub
  • C. Azure Stream Analytics Job that queries Twitter data from an Event Grid
  • D. Logic App that sends Twitter posts which have target keywords to Azure
  • E. Event Grid subscription
  • F. Event Hub instance

Answer: BDF

Explanation:
You can use Azure Logic Apps to send tweets to an event hub and then use a Stream Analytics job to read from the event hub and send them to Power BI.
References:
https://community.powerbi.com/t5/Integrations-with-Files-and/Twitter-streaming-analytics-step-by-step/td-p/95

NEW QUESTION 10

You are the data engineer for your company. An application uses a NoSQL database to store data. The database uses the key-value and wide-column NoSQL database types.
Developers need to access data in the database using an API.
You need to determine which API to use for the database model and type.
Which two APIs should you use? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

  • A. Table API
  • B. MongoDB API
  • C. Gremlin API
  • D. SQL API
  • E. Cassandra API

Answer: BE

Explanation:
B: Azure Cosmos DB is the globally distributed, multi-model database service from Microsoft for mission-critical applications. It is a multi-model database and supports document, key-value, graph, and columnar data models.
E: Wide-column stores store data together as columns instead of rows and are optimized for queries over large datasets. The most popular are Cassandra and HBase.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/graph-introduction
https://www.mongodb.com/scale/types-of-nosql-databases

NEW QUESTION 11

You need to ensure that phone-based polling data can be analyzed in the PollingData database. How should you configure Azure Data Factory?

  • A. Use a tumbling schedule trigger
  • B. Use an event-based trigger
  • C. Use a schedule trigger
  • D. Use manual execution

Answer: C

Explanation:
When creating a schedule trigger, you specify a schedule (start date, recurrence, end date, etc.) for the trigger and associate it with a Data Factory pipeline.
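A minimal sketch of defining and starting such a trigger with Azure PowerShell (AzureRM.DataFactoryV2 module; the factory, pipeline name, and schedule below are hypothetical, and the trigger JSON follows the documented ScheduleTrigger shape):
$trigger = @"
{
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2019-01-01T02:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      { "pipelineReference": { "type": "PipelineReference", "referenceName": "CopyPollingDataPipeline" } }
    ]
  }
}
"@
Set-Content -Path .\NightlyTrigger.json -Value $trigger
Set-AzureRmDataFactoryV2Trigger -ResourceGroupName "rg-dev" -DataFactoryName "adf-dev" `
    -Name "NightlyTrigger" -DefinitionFile ".\NightlyTrigger.json"
Start-AzureRmDataFactoryV2Trigger -ResourceGroupName "rg-dev" -DataFactoryName "adf-dev" -Name "NightlyTrigger" -Force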
Scenario:
• All data migration processes must use Azure Data Factory.
• All data migrations must run automatically during non-business hours.
References:
https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-schedule-trigger

NEW QUESTION 12

You need to set up access to Azure SQL Database for Tier 7 and Tier 8 partners.
Which three actions should you perform in sequence? To answer, move the appropriate three actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit: action list omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Scenario: Tier 7 and Tier 8 data access is constrained to single endpoints managed by partners.
Step 1: Set the Allow Azure Services to Access Server setting to Disabled
Set Allow access to Azure services to OFF for the most secure configuration.
By default, access through the SQL Database firewall is enabled for all Azure services, under Allow access to Azure services. Choose OFF to disable access for all Azure services.
Note: The firewall pane has an ON/OFF button that is labeled Allow access to Azure services. The ON setting allows communications from all Azure IP addresses and all Azure subnets. These Azure IPs or subnets might not be owned by you. This ON setting is probably more open than you want your SQL Database to be. The virtual network rule feature offers much finer granular control.
Step 2: In the Azure portal, create a server firewall rule.
Set up SQL Database server firewall rules:
Server-level IP firewall rules apply to all databases within the same SQL Database server. To set up a server-level firewall rule:
• In the Azure portal, select SQL databases from the left-hand menu, and select your database on the SQL databases page.
• On the Overview page, select Set server firewall. The Firewall settings page for the database server opens.
Step 3: Connect to the database and use Transact-SQL to create a database firewall rule
Database-level firewall rules can only be configured using Transact-SQL (T-SQL) statements, and only after you've configured a server-level firewall rule.
To set up a database-level firewall rule:
• In Object Explorer, right-click the database and select New Query.
• EXECUTE sp_set_database_firewall_rule N'Example DB Rule','0.0.0.4','0.0.0.4';
• On the toolbar, select Execute to create the firewall rule.
References:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-tutorial
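The same three steps can be scripted roughly as follows, assuming the "Allow access to Azure services" toggle corresponds to the built-in firewall rule named AllowAllWindowsAzureIps (an assumption worth verifying); the names, IPs, and credentials are hypothetical, and Invoke-Sqlcmd comes from the SqlServer module:
# Step 1: removing the built-in rule sets "Allow access to Azure services" to OFF.
Remove-AzureRmSqlServerFirewallRule -ResourceGroupName "rg-dev" -ServerName "server01" `
    -FirewallRuleName "AllowAllWindowsAzureIps"
# Step 2: server-level rule for the partner-managed endpoint.
New-AzureRmSqlServerFirewallRule -ResourceGroupName "rg-dev" -ServerName "server01" `
    -FirewallRuleName "Tier7PartnerEndpoint" -StartIpAddress "0.0.0.4" -EndIpAddress "0.0.0.4"
# Step 3: database-level rule, created with T-SQL inside the database.
Invoke-Sqlcmd -ServerInstance "server01.database.windows.net" -Database "db01" `
    -Username "sqladmin" -Password "<secret>" `
    -Query "EXECUTE sp_set_database_firewall_rule N'Tier8 Rule','0.0.0.4','0.0.0.4';"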

NEW QUESTION 13

A company has a SaaS solution that uses Azure SQL Database with elastic pools. The solution contains a dedicated database for each customer organization. Customer organizations have peak usage at different periods during the year.
You need to implement the Azure SQL Database elastic pool to minimize cost. Which option or options should you configure?

  • A. Number of transactions only
  • B. eDTUs per database only
  • C. Number of databases only
  • D. CPU usage only
  • E. eDTUs and max data size

Answer: E

Explanation:
The best size for a pool depends on the aggregate resources needed for all databases in the pool. This involves determining the following:
• Maximum resources utilized by all databases in the pool (either maximum DTUs or maximum vCores, depending on your choice of resourcing model).
• Maximum storage bytes utilized by all databases in the pool.
Note: Elastic pools enable the developer to purchase resources for a pool shared by multiple databases to accommodate unpredictable periods of usage by individual databases. You can configure resources for the pool based either on the DTU-based purchasing model or the vCore-based purchasing model.
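For instance, a pool's eDTUs and maximum data size are set when the pool is created; a sketch with hypothetical sizing (New-AzureRmSqlElasticPool is from the AzureRM.Sql module):
# 200 eDTUs shared by the pool, a per-database eDTU cap, and an explicit max data size.
New-AzureRmSqlElasticPool -ResourceGroupName "rg-dev" -ServerName "server01" `
    -ElasticPoolName "saas-pool" -Edition "Standard" -Dtu 200 `
    -DatabaseDtuMin 0 -DatabaseDtuMax 50 -StorageMB 204800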
References:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-elastic-pool

NEW QUESTION 14

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some questions sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to set up monitoring for Tiers 6 through 8. What should you configure?

  • A. extended events for average storage percentage that emails data engineers
  • B. an alert rule to monitor CPU percentage in databases that emails data engineers
  • C. an alert rule to monitor CPU percentage in elastic pools that emails data engineers
  • D. an alert rule to monitor storage percentage in databases that emails data engineers
  • E. an alert rule to monitor storage percentage in elastic pools that emails data engineers

Answer: E

Explanation:
Scenario:
Tiers 6 through 8 must have unexpected resource storage usage immediately reported to data engineers.
Tier 3 and Tier 6 through Tier 8 applications must use database density on the same server and Elastic pools in a cost-effective manner.
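Such an alert could be scripted roughly as follows with the classic AzureRM.Insights cmdlets; the resource ID, email address, and the metric name (assumed here to be storage_percent) are all hypothetical:
# Email data engineers when elastic pool storage exceeds 90%.
$poolId = "/subscriptions/<subscription-id>/resourceGroups/rg-dev/providers/Microsoft.Sql/servers/server01/elasticPools/saas-pool"
$email = New-AzureRmAlertRuleEmail -CustomEmail "dataengineers@contoso.com"
Add-AzureRmMetricAlertRule -Name "PoolStorageAlert" -Location "East US" -ResourceGroup "rg-dev" `
    -TargetResourceId $poolId -MetricName "storage_percent" `
    -Operator GreaterThan -Threshold 90 -WindowSize 00:05:00 `
    -TimeAggregationOperator Average -Action $email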

NEW QUESTION 15

You need to mask Tier 1 data. Which functions should you use? To answer, select the appropriate option in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit: answer area omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
A: Default
Full masking according to the data types of the designated fields.
For string data types, use XXXX or fewer Xs if the size of the field is less than 4 characters (char, nchar, varchar, nvarchar, text, ntext).
B: Email
C: Custom text
Custom text masking method, which exposes the first and last letters and adds a custom padding string in the middle: prefix, [padding], suffix.
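These functions can also be applied programmatically; a minimal sketch with the AzureRM.Sql data-masking cmdlet (the server, database, table, and column names are hypothetical):
# Default (full) masking for a string column.
New-AzureRmSqlDatabaseDataMaskingRule -ResourceGroupName "rg-dev" -ServerName "server01" `
    -DatabaseName "tier1db" -SchemaName "dbo" -TableName "Customers" `
    -ColumnName "TaxId" -MaskingFunction "Default"
# Email masking.
New-AzureRmSqlDatabaseDataMaskingRule -ResourceGroupName "rg-dev" -ServerName "server01" `
    -DatabaseName "tier1db" -SchemaName "dbo" -TableName "Customers" `
    -ColumnName "Email" -MaskingFunction "Email"
# Custom text: expose the first and last characters, pad the middle.
New-AzureRmSqlDatabaseDataMaskingRule -ResourceGroupName "rg-dev" -ServerName "server01" `
    -DatabaseName "tier1db" -SchemaName "dbo" -TableName "Customers" `
    -ColumnName "Phone" -MaskingFunction "Text" -PrefixSize 1 -SuffixSize 1 -ReplacementString "xxxx"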
Tier 1 Database must implement data masking using the following masking logic:
[Exhibit: masking logic omitted]
References:
https://docs.microsoft.com/en-us/sql/relational-databases/security/dynamic-data-masking

NEW QUESTION 16

You are developing the data platform for a global retail company. The company operates during normal working hours in each region. The analytical database is used once a week for building sales projections.
Each region maintains its own private virtual network.
Building the sales projections is very resource intensive and generates upwards of 20 terabytes (TB) of data. Microsoft Azure SQL Database instances must be provisioned:
• Database provisioning must maximize performance and minimize cost.
• The daily sales for each region must be stored in an Azure SQL Database instance.
• Once a day, the data for all regions must be loaded into an analytical Azure SQL Database instance.
You need to provision Azure SQL Database instances.
How should you provision the database instances? To answer, drag the appropriate Azure SQL products to the correct databases. Each Azure SQL product may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit: answer area omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure SQL Database elastic pools
SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.
Box 2: Azure SQL Database Hyperscale
A Hyperscale database is an Azure SQL database in the Hyperscale service tier that is backed by the Hyperscale scale-out storage technology. A Hyperscale database supports up to 100 TB of data and provides high throughput and performance, as well as rapid scaling to adapt to the workload requirements. Scaling is transparent to the application – connectivity, query processing, and so on, work like any other SQL database.
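A sketch of provisioning both products (all names and sizes are hypothetical, and the Hyperscale vCore parameters assume a recent AzureRM.Sql module version that supports the vCore purchasing model):
# Regional daily-sales databases share an elastic pool on one logical server.
New-AzureRmSqlElasticPool -ResourceGroupName "rg-retail" -ServerName "sales-server" `
    -ElasticPoolName "regional-pool" -Edition "Standard" -Dtu 400 `
    -DatabaseDtuMin 0 -DatabaseDtuMax 100
# Analytical database on the Hyperscale tier for the ~20 TB weekly workload.
New-AzureRmSqlDatabase -ResourceGroupName "rg-retail" -ServerName "sales-server" `
    -DatabaseName "analytics" -Edition "Hyperscale" -ComputeGeneration "Gen5" -VCore 4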

NEW QUESTION 17

You are creating a managed data warehouse solution on Microsoft Azure.
You must use PolyBase to retrieve data from Azure Blob storage that resides in Parquet format and load the data into a large table called FactSalesOrderDetails.
You need to configure Azure SQL Data Warehouse to receive the data.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit: action list omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit: answer omitted]
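Since the exhibit is missing, here is the typical PolyBase setup sequence for Parquet data in Blob storage, sketched as T-SQL submitted through Invoke-Sqlcmd; the object names, column list, storage account, and credentials are hypothetical:
$polybase = @"
CREATE MASTER KEY;
CREATE DATABASE SCOPED CREDENTIAL BlobCred
    WITH IDENTITY = 'user', SECRET = '<storage-account-key>';
CREATE EXTERNAL DATA SOURCE SalesBlob
    WITH (TYPE = HADOOP,
          LOCATION = 'wasbs://sales@mystorageacct.blob.core.windows.net',
          CREDENTIAL = BlobCred);
CREATE EXTERNAL FILE FORMAT ParquetFormat
    WITH (FORMAT_TYPE = PARQUET);
CREATE EXTERNAL TABLE ext_FactSalesOrderDetails
    (SalesOrderId INT, ProductId INT, Quantity INT)
    WITH (LOCATION = '/factsales/', DATA_SOURCE = SalesBlob, FILE_FORMAT = ParquetFormat);
INSERT INTO FactSalesOrderDetails SELECT * FROM ext_FactSalesOrderDetails;
"@
Invoke-Sqlcmd -ServerInstance "dwserver.database.windows.net" -Database "dw01" `
    -Username "sqladmin" -Password "<secret>" -Query $polybase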

NEW QUESTION 18

You develop data engineering solutions for a company.
A project requires the deployment of data to Azure Data Lake Storage.
You need to implement role-based access control (RBAC) so that project members can manage the Azure Data Lake Storage resources.
Which three actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

  • A. Assign Azure AD security groups to Azure Data Lake Storage.
  • B. Configure end-user authentication for the Azure Data Lake Storage account.
  • C. Configure service-to-service authentication for the Azure Data Lake Storage account.
  • D. Create security groups in Azure Active Directory (Azure AD) and add project members.
  • E. Configure access control lists (ACL) for the Azure Data Lake Storage account.

Answer: ADE
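A hedged sketch of the three actions using the AzureAD and AzureRM modules (the group, member ID, subscription, and account names are hypothetical):
# Create a security group and add a project member (AzureAD module).
$group = New-AzureADGroup -DisplayName "DataLakeProject" -MailEnabled $false `
    -SecurityEnabled $true -MailNickName "DataLakeProject"
Add-AzureADGroupMember -ObjectId $group.ObjectId -RefObjectId "<member-object-id>"
# Assign the group a role on the Data Lake Storage account (management-plane RBAC).
New-AzureRmRoleAssignment -ObjectId $group.ObjectId -RoleDefinitionName "Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/rg-dev/providers/Microsoft.DataLakeStore/accounts/mydatalake"
# Data-level access is granted separately via ACLs on folders and files.
Set-AzureRmDataLakeStoreItemAclEntry -AccountName "mydatalake" -Path "/" `
    -AceType Group -Id $group.ObjectId -Permissions All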

NEW QUESTION 19

You manage a Microsoft Azure SQL Data Warehouse Gen 2.
Users report slow performance when they run commonly used queries. Users do not report performance changes for infrequently used queries.
You need to monitor resource utilization to determine the source of the performance issues. Which metric should you monitor?

  • A. Cache used percentage
  • B. Local tempdb percentage
  • C. DWU percentage
  • D. CPU percentage

Answer: B

NEW QUESTION 20

You are developing a solution to visualize multiple terabytes of geospatial data. The solution has the following requirements:
• Data must be encrypted.
• Data must be accessible by multiple resources on Microsoft Azure.
You need to provision storage for the solution.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit: action list omitted]

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
[Exhibit: answer omitted]

NEW QUESTION 21
......

P.S. Easily pass the DP-200 exam with the 88 Q&As in the Passcertsure dumps PDF. Welcome to download the newest Passcertsure DP-200 dumps: https://www.passcertsure.com/DP-200-test/ (88 New Questions)