2023 Provide Updated Microsoft DP-300 Dumps as Practice Test and PDF [Q27-Q46]
---------------------------------------------------
DP-300 Dumps are Available for Instant Access

The Microsoft DP-300 exam has become increasingly popular among database administrators because of the growing demand for cloud-based database solutions. As more businesses move their data to the cloud, the need for skilled professionals who can administer and maintain these databases has increased. The Administering Relational Databases on Microsoft Azure certification provides a competitive edge to professionals who want to advance their careers in database administration.

NEW QUESTION 27
You need to identify the cause of the performance issues on SalesSQLDb1.
Which two dynamic management views should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  sys.dm_pdw_nodes_tran_locks
  sys.dm_exec_compute_node_errors
  sys.dm_exec_requests
  sys.dm_cdc_errors
  sys.dm_pdw_nodes_os_wait_stats
  sys.dm_tran_locks

Explanation:
SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and frequent blocking queries.
A: Use sys.dm_pdw_nodes_tran_locks instead of sys.dm_tran_locks for Azure Synapse Analytics (SQL Data Warehouse) or Parallel Data Warehouse.
E: Example: the following query shows blocking information.

SELECT
    t1.resource_type,
    t1.resource_database_id,
    t1.resource_associated_entity_id,
    t1.request_mode,
    t1.request_session_id,
    t2.blocking_session_id
FROM sys.dm_tran_locks AS t1
INNER JOIN sys.dm_os_waiting_tasks AS t2
    ON t1.lock_owner_address = t2.resource_address;

Note: Depending on the system you are working with, you can access these wait statistics from one of three locations:
sys.dm_os_wait_stats: for SQL Server
sys.dm_db_wait_stats: for Azure SQL Database
sys.dm_pdw_nodes_os_wait_stats: for Azure SQL Data Warehouse
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-tran-locks

NEW QUESTION 28
You have an Azure subscription that contains an Azure SQL database named SQLDb1. SQLDb1 contains a table named Table1.
You plan to deploy an Azure web app named webapp1 that will export rows in Table1 that have changed.
You need to ensure that webapp1 can identify the changes to Table1. The solution must meet the following requirements:
* Minimize compute times.
* Minimize storage.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

1 – Connect to SQLDb1 and run the following Transact-SQL statement: ALTER DATABASE SQLDb1 SET ...
2 – Connect to SQLDb1 and run the following Transact-SQL statement: ALTER TABLE dbo. ...
3 – From webapp1, connect to ...
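The truncated statements above point at a change-detection feature being enabled on SQLDb1 and dbo.Table1. One pattern that fits the stated requirements (minimal compute and minimal storage) is SQL change tracking; the sketch below illustrates that pattern only, it is not a reconstruction of the graded answer, and the retention values, key column name (Id), and sync-version handling are assumptions.

-- Enable change tracking at the database level (retention values are illustrative).
ALTER DATABASE SQLDb1
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Enable change tracking on the table that webapp1 will poll.
ALTER TABLE dbo.Table1
ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = OFF);

-- From webapp1: retrieve rows changed since the last synchronization version.
DECLARE @last_sync_version bigint = 0;  -- value persisted by the app between runs
SELECT ct.SYS_CHANGE_OPERATION, t.*
FROM CHANGETABLE(CHANGES dbo.Table1, @last_sync_version) AS ct
LEFT JOIN dbo.Table1 AS t
    ON t.Id = ct.Id;  -- assumes Id is the primary key; deleted rows return NULLs from t

In this pattern the app would also call CHANGE_TRACKING_CURRENT_VERSION() after each export and store the result to use as @last_sync_version on the next run.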
NEW QUESTION 29
You plan to create a table in an Azure Synapse Analytics dedicated SQL pool.
Data in the table will be retained for five years. Once a year, data that is older than five years will be deleted.
You need to ensure that the data is distributed evenly across partitions. The solution must minimize the amount of time required to delete old data.
How should you complete the Transact-SQL statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: HASH
Box 2: OrderDateKey
In most cases, table partitions are created on a date column.
A way to eliminate rollbacks is to use metadata-only operations such as partition switching for data management. For example, rather than executing a DELETE statement to delete all rows in a table where the order_date was in October of 2001, you could partition your data early. Then you can switch out the partition with data for an empty partition from another table.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/best-practices-dedicated-sql-pool

NEW QUESTION 30
You have an Azure subscription that contains a resource group named RG1. RG1 contains an instance of SQL Server on Azure Virtual Machines named SQL. You need to use PowerShell to enable and configure automated patching for SQL. The solution must include both SQL Server and Windows security updates.
How should you complete the command? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

NEW QUESTION 31
You have an Azure SQL database named db1 on a server named server1.
You use Query Performance Insight to monitor db1.
You need to modify the Query Store configuration to ensure that performance monitoring data is available as soon as possible.
Which configuration setting should you modify and which value should you configure? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
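For context on Question 31: Query Store behavior, such as how often runtime statistics are aggregated and how often in-memory data is flushed to disk, is controlled through ALTER DATABASE ... SET QUERY_STORE. A minimal sketch follows; the option values shown are illustrative and are not presented as the graded answer.

-- Run while connected to db1 (CURRENT avoids hard-coding the database name).
-- Shorter intervals make monitoring data available to Query Performance Insight sooner.
ALTER DATABASE CURRENT
SET QUERY_STORE (INTERVAL_LENGTH_MINUTES = 1, DATA_FLUSH_INTERVAL_SECONDS = 60);

-- Inspect the resulting Query Store configuration.
SELECT actual_state_desc, interval_length_minutes, flush_interval_seconds
FROM sys.database_query_store_options;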
NEW QUESTION 32
You have an Azure subscription.
You plan to deploy an Azure SQL database by using an Azure Resource Manager template.
How should you complete the template? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/single-database-create-arm-template-quickstart

NEW QUESTION 33
You configure version control for an Azure Data Factory instance as shown in the following exhibit.
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/source-control

NEW QUESTION 34
You have an Azure SQL database.
You identify a long-running query.
You need to identify which operation in the query is causing the performance issue.
What should you use to display the query execution plan in Microsoft SQL Server Management Studio (SSMS)?

  Live Query Statistics
  an estimated execution plan
  an actual execution plan
  Client Statistics

Explanation:
To include an execution plan for a query during execution:
1. On the SQL Server Management Studio toolbar, click Database Engine Query. You can also open an existing query and display the estimated execution plan by clicking the Open File toolbar button and locating the existing query.
2. Enter the query for which you would like to display the actual execution plan.
3. On the Query menu, click Include Actual Execution Plan, or click the Include Actual Execution Plan toolbar button.
Note: Actual execution plans are generated after the Transact-SQL queries or batches execute. Because of this, an actual execution plan contains runtime information, such as actual resource usage metrics and runtime warnings (if any). The execution plan that is generated displays the actual query execution plan that the SQL Server Database Engine used to execute the queries.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/performance/display-an-actual-execution-plan

NEW QUESTION 35
HOTSPOT
You need to design an analytical storage solution for the transactional data. The solution must meet the sales transaction dataset requirements.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Explanation:
Box 1: Hash
Scenario: Ensure that queries joining and filtering sales transaction records based on product ID complete as quickly as possible.
A hash-distributed table can deliver the highest query performance for joins and aggregations on large tables.
Box 2: Round-robin
Scenario: You plan to create a promotional table that will contain a promotion ID. The promotion ID will be associated with a specific product. The product will be identified by a product ID. The table will be approximately 5 GB.
A round-robin table is the most straightforward table to create and delivers fast performance when used as a staging table for loads. Choose round-robin distribution in scenarios such as these:
* When you cannot identify a single key to distribute your data.
* If your data doesn't frequently join with data from other tables.
* When there are no obvious keys to join.
Incorrect answers:
Replicated: Replicated tables eliminate the need to transfer data across compute nodes by replicating a full copy of the data of the specified table to each compute node. The best candidates for replicated tables are tables smaller than 2 GB compressed and small dimension tables.
Reference:
https://rajanieshkaushikk.com/2020/09/09/how-to-choose-right-data-distribution-strategy-for-azure-synapse/
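The distribution guidance in Question 35 maps directly to the DISTRIBUTION option of CREATE TABLE in a dedicated SQL pool. A minimal sketch under the scenario described above; all table and column names are illustrative assumptions.

-- Large sales transaction table: hash-distribute on the join/filter column (ProductID).
CREATE TABLE dbo.FactSalesTransaction
(
    SalesTransactionID bigint NOT NULL,
    ProductID int NOT NULL,
    Quantity int NOT NULL
)
WITH (DISTRIBUTION = HASH(ProductID), CLUSTERED COLUMNSTORE INDEX);

-- ~5 GB promotion table with no obvious join pattern: round-robin distribution.
CREATE TABLE dbo.Promotion
(
    PromotionID int NOT NULL,
    ProductID int NOT NULL
)
WITH (DISTRIBUTION = ROUND_ROBIN, CLUSTERED COLUMNSTORE INDEX);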
Question Set 3

NEW QUESTION 36
You plan to create a table in an Azure Synapse Analytics dedicated SQL pool.
Data in the table will be retained for five years. Once a year, data that is older than five years will be deleted.
You need to ensure that the data is distributed evenly across partitions. The solution must minimize the amount of time required to delete old data.
How should you complete the Transact-SQL statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/best-practices-dedicated-sql-pool

NEW QUESTION 37
You plan to create a table in an Azure Synapse Analytics dedicated SQL pool.
Data in the table will be retained for five years. Once a year, data that is older than five years will be deleted.
You need to ensure that the data is distributed evenly across partitions. The solution must minimize the amount of time required to delete old data.
How should you complete the Transact-SQL statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: HASH
Box 2: OrderDateKey
In most cases, table partitions are created on a date column.
A way to eliminate rollbacks is to use metadata-only operations such as partition switching for data management. For example, rather than executing a DELETE statement to delete all rows in a table where the order_date was in October of 2001, you could partition your data early. Then you can switch out the partition with data for an empty partition from another table.
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/best-practices-dedicated-sql-pool
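For the partitioning question repeated above (Questions 29, 36, and 37), the completed statement combines hash distribution with RANGE partitioning on OrderDateKey, and old data can then be removed with a metadata-only partition switch instead of a DELETE. A minimal sketch; the table names, the hash column, and the boundary values are illustrative assumptions, not the exact graded statement.

CREATE TABLE dbo.FactOrder
(
    OrderKey bigint NOT NULL,
    OrderDateKey int NOT NULL,
    Amount decimal(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(OrderKey),
    CLUSTERED COLUMNSTORE INDEX,
    PARTITION (OrderDateKey RANGE RIGHT FOR VALUES
        (20200101, 20210101, 20220101, 20230101, 20240101))
);

-- Yearly cleanup: switch the oldest partition into an empty staging table instead of
-- running a long DELETE. dbo.FactOrder_Stage is assumed to be an empty table created
-- with an identical definition and partition scheme.
ALTER TABLE dbo.FactOrder SWITCH PARTITION 1 TO dbo.FactOrder_Stage PARTITION 1;

Because the switch only updates metadata, it completes in seconds regardless of how many rows the partition holds.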
NEW QUESTION 38
You have SQL Server on an Azure virtual machine.
You need to use Policy-Based Management in Microsoft SQL Server to identify stored procedures that do not comply with your naming conventions.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

1 – Create a custom condition based on a built-in facet.
2 – Create a custom policy based on a condition.
3 – Run a policy evaluation.
Reference:
https://www.mssqltips.com/sqlservertip/2298/enforce-sql-server-database-naming-conventions-using-policy-based-management/

NEW QUESTION 39
You have an Azure SQL database named db1.
You need to retrieve the resource usage of db1 from the last week.
How should you complete the statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: sys.resource_stats
sys.resource_stats returns CPU usage and storage data for an Azure SQL Database. It has database_name and start_time columns.
Box 2: DateAdd
The following example returns all databases that averaged at least 80% compute utilization over the last week.

DECLARE @s datetime;
DECLARE @e datetime;
SET @s = DateAdd(d, -7, GetUTCDate());
SET @e = GETUTCDATE();
SELECT database_name, AVG(avg_cpu_percent) AS Average_Compute_Utilization
FROM sys.resource_stats
WHERE start_time BETWEEN @s AND @e
GROUP BY database_name
HAVING AVG(avg_cpu_percent) >= 80;

Incorrect answers:
sys.dm_exec_requests: sys.dm_exec_requests returns information about each request that is executing in SQL Server. It does not have a column named database_name.
sys.dm_db_resource_stats: sys.dm_db_resource_stats does not have a start_time column.
Note: sys.dm_db_resource_stats returns CPU, I/O, and memory consumption for an Azure SQL Database database. One row exists for every 15 seconds, even if there is no activity in the database. Historical data is maintained for approximately one hour.
sys.dm_user_db_resource_governance returns the actual configuration and capacity settings used by resource governance mechanisms in the current database or elastic pool. It does not have a start_time column.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sys-resource-stats-azure-sql-database

NEW QUESTION 40
You have a Microsoft SQL Server 2019 instance in an on-premises datacenter. The instance contains a 4-TB database named DB1.
You plan to migrate DB1 to an Azure SQL Database managed instance.
What should you use to minimize downtime and data loss during the migration?

  distributed availability groups
  database mirroring
  log shipping
  Database Migration Assistant

NEW QUESTION 41
You have an instance of SQL Server on Azure Virtual Machines named SQL1.
SQL1 contains an Extended Events session named session1 that captures Microsoft SQL Server events.
You need to correlate the session events with events captured by Event Tracing for Windows (ETW).
What should you do for session1?

  Modify the Set Session Event Filters settings.
  Add a target.
  Add an action.
  Modify the Specify Session Data Storage settings.
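For Question 41: correlating an Extended Events session with ETW is typically done by adding the ETW classic synchronous target (package0.etw_classic_sync_target) to the session, which corresponds to the "Add a target" choice. A minimal sketch, assuming session1 is defined at the server scope; stopping and restarting the session around the change is a precaution, not a documented requirement.

-- Stop the session before changing its targets, then restart it.
ALTER EVENT SESSION session1 ON SERVER STATE = STOP;

-- Add the ETW classic synchronous target so SQL Server events can be
-- correlated with other ETW traces collected on the virtual machine.
ALTER EVENT SESSION session1 ON SERVER
    ADD TARGET package0.etw_classic_sync_target;

ALTER EVENT SESSION session1 ON SERVER STATE = START;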
NEW QUESTION 42
You are building a database in an Azure Synapse Analytics serverless SQL pool.
You have data stored in Parquet files in an Azure Data Lake Storage Gen2 container. Records are structured as shown in the following sample. The records contain two applicants at most.
You need to build a table that includes only the address fields.
How should you complete the Transact-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: CREATE EXTERNAL TABLE
An external table points to data located in Hadoop, Azure Storage blob, or Azure Data Lake Storage. External tables are used to read data from files or write data to files in Azure Storage. With Synapse SQL, you can use external tables to read external data using a dedicated SQL pool or a serverless SQL pool.
Syntax:

CREATE EXTERNAL TABLE { database_name.schema_name.table_name | schema_name.table_name | table_name }
    ( <column_definition> [ ,...n ] )
WITH (
    LOCATION = 'folder_or_filepath',
    DATA_SOURCE = external_data_source_name,
    FILE_FORMAT = external_file_format_name
)

Box 2: OPENROWSET
When using a serverless SQL pool, CETAS is used to create an external table and export query results to Azure Storage Blob or Azure Data Lake Storage Gen2.
Example:

AS
SELECT decennialTime, stateName, SUM(population) AS population
FROM OPENROWSET(
    BULK 'https://azureopendatastorage.blob.core.windows.net/censusdatacontainer/release/us_population_county/year=*/*',
    FORMAT = 'PARQUET') AS [r]
GROUP BY decennialTime, stateName
GO

Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables

NEW QUESTION 43
You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and an Azure Data Lake Storage Gen2 account named Account1.
You plan to access the files in Account1 by using an external table.
You need to create a data source in Pool1 that you can reference when you create the external table.
How should you complete the Transact-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Explanation:
Box 1: blob
The following example creates an external data source for Azure Data Lake Gen2:

CREATE EXTERNAL DATA SOURCE YellowTaxi
WITH (
    LOCATION = 'https://azureopendatastorage.blob.core.windows.net/nyctlc/yellow/',
    TYPE = HADOOP
)

Box 2: HADOOP
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables

NEW QUESTION 44
You are building an Azure Stream Analytics job to retrieve game data.
You need to ensure that the job returns the highest scoring record for each five-minute time interval of each game.
How should you complete the Stream Analytics query? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/stream-analytics-query/topone-azure-stream-analytics
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/stream-analytics/stream-analytics-window-functions.md

NEW QUESTION 45
You have SQL Server on an Azure virtual machine that contains a database named Db1.
You need to enable automatic tuning for Db1.
How should you complete the statements? To answer, select the appropriate answer in the answer area.
NOTE: Each correct selection is worth one point.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/automatic-tuning-enable
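For Question 45: on SQL Server running in an Azure virtual machine (as opposed to Azure SQL Database), automatic tuning is enabled per database with ALTER DATABASE and relies on Query Store being active. A minimal sketch for Db1; run the statements in the context of Db1, and treat this as an illustration rather than the exact graded statements.

-- Query Store must be enabled for automatic plan correction to work.
ALTER DATABASE Db1 SET QUERY_STORE = ON;

-- Enable the FORCE_LAST_GOOD_PLAN automatic tuning option.
ALTER DATABASE Db1 SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);

-- Verify the setting from within Db1.
SELECT name, desired_state_desc, actual_state_desc
FROM sys.database_automatic_tuning_options;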
NEW QUESTION 46
You have an Azure Synapse Analytics Apache Spark pool named Pool1.
You plan to load JSON files from an Azure Data Lake Storage Gen2 container into the tables in Pool1. The structure and data types vary by file.
You need to load the files into the tables. The solution must maintain the source data types.
What should you do?

  Load the data by using PySpark.
  Load the data by using the OPENROWSET Transact-SQL command in an Azure Synapse Analytics serverless SQL pool.
  Use a Get Metadata activity in Azure Data Factory.
  Use a Conditional Split transformation in an Azure Synapse data flow.

Explanation:
Serverless SQL pool can automatically synchronize metadata from Apache Spark. A serverless SQL pool database will be created for each database existing in serverless Apache Spark pools.
Serverless SQL pool enables you to query data in your data lake. It offers a T-SQL query surface area that accommodates semi-structured and unstructured data queries.
To support a smooth experience for in-place querying of data located in Azure Storage files, serverless SQL pool uses the OPENROWSET function with additional capabilities. The easiest way to see the content of your JSON file is to provide the file URL to the OPENROWSET function and specify csv FORMAT.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/query-json-files
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/query-data-storage

The DP-300 exam is designed to assess an individual's ability to design, implement, and maintain Azure SQL databases, as well as secure the data within those databases. The DP-300 exam covers a range of topics including SQL Server administration, data security, scalability and performance, and database migration. By earning the DP-300 certification, an individual can demonstrate proficiency in administering relational databases on Microsoft Azure, which is a highly sought-after skill in the IT industry.

Updated DP-300 Dumps Questions For Microsoft Exam: https://www.dumpsmaterials.com/DP-300-real-torrent.html