Microsoft 70-463 Dumps 2021

We provide the Microsoft 70-463 exam material in two formats: a downloadable PDF and practice tests. Pass the Microsoft 70-463 exam quickly and easily. The 70-463 PDF is available for reading and printing, so you can print it and practice as many times as you like. With the help of our 70-463 product and material, you can easily pass the 70-463 exam.

Check 70-463 free dumps before getting the full version:

NEW QUESTION 1
You have a database named Sales and a data warehouse named DataW.
From Sales, you plan to bulk insert data into a table in DataW that is more than 2 TB in size. The process will insert a minimum of 11 million rows simultaneously.
You need to identify which data storage strategy must be used to minimize load times. Which data storage strategy should you identify?

  • A. a nonclustered columnstore
  • B. FileStream
  • C. a durable In-Memory OLTP
  • D. a non-durable staging table

Answer: D

NEW QUESTION 2
You have a SQL Server Integration Services (SSIS) package. The package contains a script task that has the following comment.
// Update DataLoadBeginDate variable to the beginning of yesterday
The script has the following code.
Dts.Variables["User::DataLoadBeginDate"].Value = DateTime.Today.AddDays(-1);
The script task is configured as shown in the exhibit. (Click the Exhibit button.)
[Exhibit image]
When you attempt to execute the package, the package fails and returns the following error message: "Error: Exception has been thrown by the target of an invocation."
You need to execute the package successfully.
What should you do?

  • A. Add the DataLoadBeginDate variable to the ReadOnlyVariables property.
  • B. Add the DataLoadBeginDate variable to the ReadWriteVariables property.
  • C. Modify the entry point of the script.
  • D. Change the scope of the DataLoadBeginDate variable to Package.

Answer: B

Explanation: You add existing variables to the ReadOnlyVariables and ReadWriteVariables lists in the Script Task Editor to make them available to the custom script. Within the script, you access variables of both types through the Variables property of the Dts object.
References:
https://docs.microsoft.com/en-us/sql/integration-services/extending-packages-scripting/task/using-variables-in-the-script-task?view=sql-server-2021
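For reference, a minimal sketch of the corrected Script Task body once the variable is listed in ReadWriteVariables (the variable name comes from the question; the rest is the standard SSIS script template):

```csharp
public void Main()
{
    // Update DataLoadBeginDate variable to the beginning of yesterday.
    // "User::DataLoadBeginDate" must appear in the task's ReadWriteVariables
    // property; otherwise this assignment throws the invocation exception
    // described in the question.
    Dts.Variables["User::DataLoadBeginDate"].Value = DateTime.Today.AddDays(-1);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```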

NEW QUESTION 3
Your company uses a proprietary encryption algorithm to secure sensitive data. A custom cryptographic assembly was developed in Microsoft .NET and is used in several applications. A SQL Server Integration Services (SSIS) package is importing data into a Windows Azure SQL Database database. Some of the data must be encrypted with the proprietary encryption algorithm.
You need to design the implementation strategy to meet the requirements while minimizing development and deployment effort and maximizing data flow performance.
What should you do?

  • A. Create a SQL Common Language Runtime (SQLCLR) function that uses the custom assembly to encrypt the data, deploy it in the Windows Azure SQL Database database, and use it when inserting data.
  • B. Use an SSIS Script transformation that uses the custom assembly to encrypt the data when inserting it.
  • C. Create a SQL Common Language Runtime (SQLCLR) stored procedure that uses the custom assembly to encrypt the data, deploy it in the Windows Azure SQL Database database, and use it when inserting data.
  • D. Use an SSIS Script task that uses the custom assembly to encrypt the data when inserting it.

Answer: B

NEW QUESTION 4
A SQL Server Integration Services (SSIS) package imports daily transactions from several files into a SQL Server table named Transaction. Each file corresponds to a different store and is imported in parallel with the other files. The data flow tasks use OLE DB destinations in fast load data access mode.
The number of daily transactions per store can be very large and is growing. The Transaction table does not have any indexes.
You need to minimize the package execution time. What should you do?

  • A. Partition the table by day and store.
  • B. Create a clustered index on the Transaction table.
  • C. Run the package in Performance mode.
  • D. Increase the value of the Rows per Batch property.

Answer: D

Explanation:
* Data Access Mode – This setting provides the 'fast load' option, which internally uses a BULK INSERT statement for uploading data into the destination table instead of a simple INSERT statement (for each single row), as is the case for the other options.
* BULK INSERT parameters include ROWS_PER_BATCH = rows_per_batch, which indicates the approximate number of rows of data in the data file.
By default, all the data in the data file is sent to the server as a single transaction, and the number of rows in the batch is unknown to the query optimizer. If you specify ROWS_PER_BATCH (with a value > 0), the server uses this value to optimize the bulk-import operation. The value specified for ROWS_PER_BATCH should be approximately the same as the actual number of rows.
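As a sketch, the fast-load behavior described above corresponds roughly to a BULK INSERT like the following (the table name is from the question, but the file path and batch size are illustrative placeholders):

```sql
BULK INSERT dbo.[Transaction]
FROM 'D:\imports\store001_transactions.dat'
WITH (
    ROWS_PER_BATCH = 100000,  -- approximate number of rows in the file
    TABLOCK                   -- corresponds to the fast load table-lock option
);
```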

NEW QUESTION 5
You plan to deploy a package to a server that has SQL Server installed. The server contains a SQL Server Integration Services (SSIS) catalog.
You need to ensure that the package credentials are encrypted. Which protection level should you set for the package?

  • A. ServerStorage
  • B. EncryptAllWithUserKey
  • C. EncryptSensitiveWithPassword
  • D. EncryptSensitiveWithUserKey

Answer: D

Explanation: The SSISDB catalog uses the ServerStorage protection level. When you deploy an Integration Services project to the Integration Services server, the catalog automatically encrypts the package data and sensitive values. The catalog also automatically decrypts the data when you retrieve it.
If you export the project (.ispac file) from the Integration Services server to the file system, the system automatically changes the protection level to EncryptSensitiveWithUserKey.
References:
https://docs.microsoft.com/en-us/sql/integration-services/security/access-control-for-sensitive-data-in-packages?view=sql-server-2021

NEW QUESTION 6
You are the administrator of a server that hosts Data Quality Server for a large retail company. The server had a hardware failure during business hours.
You need to restore the server that hosts Data Quality Server to another server. You have a recent backup of all the required databases.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)

  • A. Restore the DQS_MAIN, DQS_PROJECTS, and DQS_STAGING_DATA databases to another server as soon as possible.
  • B. Execute the DQS_MAIN.internal_core.RestoreDQDatabases stored procedure with the appropriate parameter.
  • C. Restore only the DQS_MAIN and DQS_STAGING_DATA databases to another server as soon as possible.
  • D. Execute the DQS_MAIN.internal_core.InitServer stored procedure with the appropriate parameter.

Answer: AB

Explanation: Steps to restore the DQS databases:
- Restore the DQS_MAIN database.
- Restore the DQS_PROJECTS database.
- Restore the DQS_STAGING_DATA database.
In Object Explorer, right-click the server, and then click New Query.
In the Query Editor window, copy the following SQL statements, and replace <PASSWORD> with the password that you provided during the DQS installation for the database master key:
USE [DQS_MAIN]
GO
EXECUTE [internal_core].[RestoreDQDatabases] '<PASSWORD>'
GO
Press F5 to execute the statements. Check the Results pane to verify that the statements have executed successfully.
Note:
* Backup and restore of SQL Server databases are common operations that database administrators perform to prevent loss of data in the case of a disaster by recovering data from backup databases. Data Quality Server is primarily implemented by two SQL Server databases: DQS_MAIN and DQS_PROJECTS. The backup and restore procedures for the Data Quality Services (DQS) databases are similar to those for any other SQL Server databases.
Ref: http://msdn.microsoft.com/en-gb/library/hh213068(v=sql.110).aspx
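A minimal T-SQL sketch of the restore sequence, assuming backups on local disk (the paths are placeholders):

```sql
RESTORE DATABASE DQS_MAIN
    FROM DISK = N'D:\backup\DQS_MAIN.bak' WITH REPLACE;
RESTORE DATABASE DQS_PROJECTS
    FROM DISK = N'D:\backup\DQS_PROJECTS.bak' WITH REPLACE;
RESTORE DATABASE DQS_STAGING_DATA
    FROM DISK = N'D:\backup\DQS_STAGING_DATA.bak' WITH REPLACE;
```

After the restores complete, run the internal_core.RestoreDQDatabases procedure as shown in the explanation.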

NEW QUESTION 7
HOTSPOT
You administer a Microsoft SQL Server 2012 database. The database contains a table that has the following definition:
[Exhibit image]
You want to export data from the table to a flat file by using the SQL Server Import and Export Wizard.
You need to ensure that the following requirements are met:
-The first row of the file contains the first row of data.
-Each record is of the same length.
-The date follows the U.S. date format.
-The file supports international characters.
What should you do? (To answer, simply select the option or options in the answer area that you would configure.)
[Exhibit image]

    Answer:

    Explanation: [Exhibit image]

    NEW QUESTION 8
    You are creating a SQL Server Integration Services (SSIS) package to retrieve product data from two different sources. One source is hosted in a SQL Azure database. Each source contains products for different distributors.
    Products for each distributor source must be combined for insertion into a single product table destination.
    You need to select the appropriate data flow transformation to meet this requirement.
    Which transformation types should you use? (Each correct answer presents a complete solution. Choose all that apply.)

    • A. Multicast
    • B. Merge Join
    • C. Term Extraction
     • D. Union All
    • E. Merge

    Answer: DE

     Explanation: References:
     http://msdn.microsoft.com/en-us/library/ms141703.aspx
     http://msdn.microsoft.com/en-us/library/ms141775.aspx
     http://msdn.microsoft.com/en-us/library/ms141020.aspx
     http://msdn.microsoft.com/en-us/library/ms141809.aspx
     http://msdn.microsoft.com/en-us/library/ms137701.aspx

    NEW QUESTION 9
    HOTSPOT
    You are developing a SQL Server Integration Services (SSIS) package to load data into a SQL Server
    2012 database.
    The package is allowed to connect to only one database. An Environment variable contains the name of the database.
    The OLE DB project connection manager has been parameterized.
    You need to configure the connection manager property to accept the value of the Environment variable.
    Which property should you use? (To answer, configure the appropriate option or options in the dialog box in the answer area.)
     [Exhibit images]

      Answer:

       Explanation: [Exhibit image]

      NEW QUESTION 10
      HOTSPOT
      You are designing a package control flow. The package moves sales order data from a SQL Azure transactional database to an on-premise reporting database. The package will run several times a day, while new sales orders are being added to the transactional database.
      The current design of the package control flow is shown in the answer area. (Click the Exhibit button.)
       [Exhibit image]
       The Insert New Orders Data Flow task must meet the following requirements:
       -Usage of the tempdb database should not be impacted.
       -Concurrency should be maximized, while only reading committed transactions.
       -If the task fails, only that task needs to be rolled back.
      You need to configure the Insert New Orders Data Flow task to meet the requirements.
       [Exhibit image]
      How should you configure the transaction properties? To answer, select the appropriate setting or settings in the answer area.
       [Exhibit image]

        Answer:

         Explanation: [Exhibit image]

        NEW QUESTION 11
        You are the administrator for a Master Data Services (MDS) environment.
        A user who is not an administrator must be able to access Master Data Manager. You need to assign permissions to the user.
        Which function should you use?

        • A. Explorer
        • B. System Administration
        • C. Version Management
        • D. User and Group Permissions
        • E. Integration Management

        Answer: D

         Explanation: In the User and Group Permissions functional area, administrators can grant permission to functional areas, to attributes (on the Models tab), and to members (on the Hierarchy Members tab). Overlapping permissions are resolved to determine a user's permission to each individual attribute.
         References:
         https://docs.microsoft.com/en-us/sql/master-data-services/user-and-group-permissions-functional-area-master-data-manager?view=sql-server-2021

        NEW QUESTION 12
        You have a data warehouse that contains all of the sales data for your company. The data warehouse contains several SQL Server Integration Services (SSIS) packages.
        You need to create a custom report that contains the total number of rows processed in the package and the time required for each package to execute. Which view should you include in the report?

        • A. catalog.executable_statistics
        • B. catalog.execution_data_taps
        • C. catalog.event_messages
        • D. catalog.execution_data_statistics

        Answer: D

         Explanation: The catalog.execution_data_statistics view displays a row each time a data flow component sends data to a downstream component, for a given package execution. The information in this view can be used to compute the data throughput for a component. Fields in this view include:
         - created_time: the time when the values were obtained.
         - rows_sent: the number of rows sent from the source component.
         References:
         https://docs.microsoft.com/en-us/sql/integration-services/system-views/catalog-execution-data-statistics
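         As a hedged sketch, a report query over this view might aggregate rows sent and join catalog.executions for timing (both views exist in SSISDB; the particular aggregation here is illustrative):

```sql
SELECT e.execution_id,
       e.package_name,
       SUM(s.rows_sent)                           AS total_rows_sent,
       DATEDIFF(SECOND, e.start_time, e.end_time) AS duration_seconds
FROM SSISDB.catalog.executions AS e
JOIN SSISDB.catalog.execution_data_statistics AS s
    ON s.execution_id = e.execution_id
GROUP BY e.execution_id, e.package_name, e.start_time, e.end_time;
```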

        NEW QUESTION 13
        You are designing an enterprise star schema that will consolidate data from three independent data marts. One of the data marts is hosted on SQL Azure.
        Most of the dimensions have the same structure and content. However, the geography dimension is slightly different in each data mart.
        You need to design a consolidated dimensional structure that will be easy to maintain while ensuring that all dimensional data from the three original solutions is represented.
        What should you do?

        • A. Create a conformed dimension for the geography dimension.
        • B. Implement change tracking.
        • C. Create a degenerate dimension for the geography dimension.
         • D. Create a Type 2 slowly changing dimension for the geography dimension.

        Answer: A

        NEW QUESTION 14
        HOTSPOT
        You are a data warehouse developer responsible for developing data cleansing processes. Duplicate employees exist in an employee dimension.
        You need to map, discover, and manage domain values based on the employee dimension.
        Which Data Quality Services (DQS) option should you use? (To answer, select the appropriate option in the answer area.)
         [Exhibit image]

          Answer:

           Explanation: [Exhibit image]

          NEW QUESTION 15
          You are using a SQL Server Integration Services (SSIS) project that is stored in the SSIS catalog. An Environment has been defined in the SSIS catalog.
          You need to add the Environment to the project. Which stored procedure should you use?

          • A. catalog.create_environment_variable
          • B. catalog.create_environment_reference
          • C. catalog.set_execution_parameter_value
          • D. catalog.set_environment_variable_value

          Answer: B

           Explanation: Environments (Test, Production, etc.) are associated with projects by creating references to the environments in the projects.
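           For illustration, a call to this procedure might look like the following (the folder, project, and environment names are hypothetical):

```sql
DECLARE @reference_id BIGINT;
EXEC SSISDB.catalog.create_environment_reference
    @folder_name      = N'ETL',         -- hypothetical folder
    @project_name     = N'LoadSales',   -- hypothetical project
    @environment_name = N'Production',  -- hypothetical environment
    @reference_type   = 'R',            -- 'R' = relative (same folder), 'A' = absolute
    @reference_id     = @reference_id OUTPUT;
```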

          NEW QUESTION 16
          You administer a Microsoft SQL Server 2012 server that has SQL Server Integration Services (SSIS) installed.
          You plan to deploy new SSIS packages to the server. The SSIS packages use the Project Deployment Model together with parameters and Integration Services environment variables.
          You need to configure the SQL Server environment to support these packages. What should you do?

          • A. Create SSIS configuration files for the packages.
          • B. Create an Integration Services catalog.
          • C. Install Data Quality Services.
           • D. Install Master Data Services.

          Answer: B

           Explanation: References:
           http://msdn.microsoft.com/en-us/library/hh479588.aspx
           http://msdn.microsoft.com/en-us/library/hh213290.aspx
           http://msdn.microsoft.com/en-us/library/hh213373.aspx

          NEW QUESTION 17
          DRAG DROP
          You deploy a server that has SQL Server installed.
          You deploy a SQL Server Integration Services (SSIS) package to the server.
           You need to automate the execution of the package. The solution must ensure that you receive a notification if the package fails to execute.
          In which order should you perform all the actions?
           [Exhibit image]

            Answer:

             Explanation: [Exhibit image]

            NEW QUESTION 18
            HOTSPOT
            You are developing a SQL Server Integration Services (SSIS) package to implement an incremental data load strategy. The package reads rows from a source system and compares them to rows in a destination system. New rows will be inserted and changed rows will be updated.
             You have used a Lookup transformation and a Conditional Split transformation. The Lookup transformation joins the source and destination table on the business key, and includes all columns from the destination table in the data flow output. The Conditional Split transformation inspects the destination columns and directs data flow to either insert new records or update existing records.
             You need to configure the Lookup transformation to ensure that all records flow to the Conditional Split transformation, regardless of whether the rows match an existing row in the destination table.
             Which setting should you select? (To answer, select the appropriate option in the answer area.)
             [Exhibit image]

              Answer:

               Explanation: [Exhibit image]

              NEW QUESTION 19
              You are designing a data warehouse for a fresh food distribution business that stores sales by individual product. It stores sales targets by product category. Products are classified into subcategories and categories.
              Each product is included in only a single product subcategory, and each subcategory is included in only a single category.
              The data warehouse will be a data source for an Analysis Services cube. The data warehouse contains two fact tables:
              • factSales, used to record daily sales by product
              • factProductTarget, used to record the monthly sales targets by product category
               Reports that show product sales by product, subcategory, and category, as well as product sales targets, must be developed against the warehouse.
              You need to design the product dimension. The solution should use as few tables as possible while supporting all the requirements.
              What should you do?

               • A. Create two product tables, dimProduct and dimProductCategory. Connect factSales to dimProduct and factProductTarget to dimProductCategory with foreign key constraints. Direct the cube developer to use key granularity attributes.
               • B. Create one product table, dimProduct, which contains product detail, category, and subcategory columns. Connect factSales to dimProduct with a foreign key constraint. Direct the cube developer to use a non-key granularity attribute for factProductTarget.
               • C. Create three product tables, dimProduct, dimProductCategory, and dimProductSubcategory, and a fourth bridge table that joins products to their appropriate category and subcategory table records with foreign key constraints. Direct the cube developer to use key granularity attributes.
               • D. Create three product tables, dimProduct, dimProductCategory, and dimProductSubcategory. Connect factSales to all three product tables and connect factProductTarget to dimProductCategory with foreign key constraints. Direct the cube developer to use key granularity attributes.

              Answer: B
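               To illustrate answer B, the single denormalized product dimension might be sketched as follows (column names and types are illustrative, not from the question):

```sql
CREATE TABLE dbo.dimProduct (
    ProductKey      INT IDENTITY(1,1) PRIMARY KEY,
    ProductName     NVARCHAR(100) NOT NULL,
    SubcategoryName NVARCHAR(50)  NOT NULL,
    CategoryName    NVARCHAR(50)  NOT NULL  -- factProductTarget is joined at this
                                            -- non-key granularity attribute
);
```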

              NEW QUESTION 20
              You develop a SQL Server Integration Services (SSIS) package that imports SQL Azure data into a data warehouse every night.
              The SQL Azure data contains many misspellings and variations of abbreviations. To import the data, a developer used the Fuzzy Lookup transformation to choose the closest-matching string from a reference table of allowed values. The number of rows in the reference table is very large.
              If no acceptable match is found, the Fuzzy Lookup transformation passes a null value. The current setting for the Fuzzy Lookup similarity threshold is 0.50.
              Many values are incorrectly matched.
              You need to ensure that more accurate matches are made by the Fuzzy Lookup transformation without degrading performance.
              What should you do?

              • A. Change the Exhaustive property to True.
              • B. Decrease the maximum number of matches per lookup.
              • C. Change the similarity threshold to 0.85.
               • D. Increase the maximum number of matches per lookup.

              Answer: C

               Thanks for reading the newest 70-463 exam dumps! To keep practicing, try the PREMIUM Simply Pass 70-463 dumps in VCE and PDF here: https://www.simply-pass.com/Microsoft-exam/70-463-dumps.html (270 Q&As Dumps)