70-463 Implementing a Data Warehouse with Microsoft SQL Server 2012/2014

Hi, my name is Jimmy Sam. If you need DBA help, please email jimmy_sam001@yahoo.com for further discussion. Thanks.

70-463 Questions


dumps4mcsa with graph 70-463-exam 
dumps4mcsa with graph 70-463-study-material 
dumps4mcsa with graph Oct 2016  
dumps4mcsa with graph 


mcsepractice

sqlserverdumps q1, single question, with explanation 
sqlserverdumps q189, single question, with explanation


microsoft4shared 
microsoft4shared


epass4sure 181-190 With A few Explanations   
epass4sure With A few Explanations


aiotestking 
aiotestking q221 




SSIS Tutorial


Lesson 1: Create a Project and Basic Package with SSIS ref1

Lesson 2: Adding Looping with SSIS   ref1

Lesson 3: Add Logging with SSIS ref1

Lesson 4: Add Error Flow Redirection with SSIS

Lesson 5: Add SSIS Package Configurations for the Package Deployment Model

Lesson 6: Using Parameters with the Project Deployment Model in SSIS



Design and implement a data warehouse


Design and implement dimensions
       Design shared/conformed dimensions; 
       determine if you need support for slowly changing dimensions; 
       determine attributes; 
       design hierarchies; 
       determine whether you need star or snowflake schema; 
       determine the granularity of relationship with fact tables; 
       determine the need for auditing or lineage; 
       determine keys (business transactional or your own data warehouse/surrogate keys); 
       implement dimensions; 
       implement data lineage of a dimension table
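
A minimal T-SQL sketch of the dimension points above (surrogate vs. business keys, SCD support, lineage); the DimCustomer table and its columns are illustrative assumptions, not required names:

-- Hypothetical customer dimension: surrogate key + business key + SCD Type 2 + lineage columns
CREATE TABLE dbo.DimCustomer
(
    CustomerKey   int IDENTITY(1,1) NOT NULL PRIMARY KEY,   -- surrogate key owned by the data warehouse
    CustomerBK    nvarchar(20)  NOT NULL,                   -- business (source transactional) key
    CustomerName  nvarchar(100) NOT NULL,
    City          nvarchar(50)  NULL,
    ValidFrom     date NOT NULL,                            -- SCD Type 2 validity interval
    ValidTo       date NULL,                                -- NULL = current row
    IsCurrent     bit  NOT NULL DEFAULT (1),
    LineageKey    int  NOT NULL,                            -- auditing/lineage: which ETL run loaded the row
    InsertedOn    datetime2(0) NOT NULL DEFAULT (SYSDATETIME())
);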

Design and implement fact tables
       Design a data warehouse that supports many to many relationships; 
       appropriately index a fact table; 
       using columnstore indexes; 
       partitioning; 
       additive measures; 
       semi additive measures; 
       non additive measures; 
       implement fact tables; 
       determining the loading method for the fact tables; 
       implement data lineage of a fact table; 
       design summary aggregation tables
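
A hedged sketch of the fact-table points above (partitioning, columnstore indexing, additive measures, lineage); the FactSales table, boundary values, and filegroup are assumptions. In SQL Server 2012 a nonclustered columnstore index makes the table read-only, while SQL Server 2014 adds updatable clustered columnstore indexes.

-- Partition the fact table by month on the date key (illustrative boundaries)
CREATE PARTITION FUNCTION pfSalesDate (int)
    AS RANGE RIGHT FOR VALUES (20160101, 20160201, 20160301);
CREATE PARTITION SCHEME psSalesDate
    AS PARTITION pfSalesDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales
(
    DateKey      int   NOT NULL,   -- grain: one row per order line per day
    CustomerKey  int   NOT NULL,
    ProductKey   int   NOT NULL,
    Quantity     int   NOT NULL,   -- additive measure
    SalesAmount  money NOT NULL,   -- additive measure
    LineageKey   int   NOT NULL    -- data lineage of the fact row
) ON psSalesDate (DateKey);

-- SQL Server 2012: nonclustered columnstore (the table becomes read-only while the index exists)
CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_FactSales
    ON dbo.FactSales (DateKey, CustomerKey, ProductKey, Quantity, SalesAmount);

-- SQL Server 2014 alternative: updatable clustered columnstore
-- CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;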

Design and Implement Dimensions and Fact Tables

Introduction to Dimensions (Analysis Services - multidimensional data) 

Dimension Storage 

Dimension Relationships, degenerate dimensions 

Dimensions 

Columnstore Indexes 



Data Flow -- Extract and transform data


Define connection managers
       Plan the configuration of connection managers; 
       package level or project level connection manager; 
       define a connection string; 
       parameterization of connection strings

Design data flow
       Define data sources and destinations; 
       distinguish blocking and non-blocking transformations; 
       use different methods to pull out changed data from data sources; 
       determine appropriate data flow components; 
       determine the need for supporting Slowly Changing Dimensions (SCD); 
       determine whether to use SQL Joins or SSIS lookup or merge join transformations; 
       batch processing versus row by row processing; 
       determine the appropriate transform to use for a specific task; 
       determine the need and method for identity mapping and deduplicating; 
       fuzzy lookup, fuzzy grouping and Data Quality Services (DQS) transformation; 
       determine the need for custom data sources, destinations, and transforms; 
       determine what to do with erroneous rows; determine auditing needs; 
       trusted/authoritative data sources, including warehouse metadata; 
       extracting only changed rows
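
One simple way to cover the "extracting only changed rows" point above is a watermark pattern; the etl.LoadWatermark table and the AdventureWorks-style source names below are assumptions used only for illustration:

-- Incremental-extract query that an SSIS source could run:
-- pull only rows modified since the last successful load.
DECLARE @LastLoadDate datetime2(0);

SELECT @LastLoadDate = LastLoadDate
FROM   etl.LoadWatermark
WHERE  TableName = N'Sales.SalesOrderHeader';

SELECT soh.SalesOrderID, soh.OrderDate, soh.CustomerID, soh.TotalDue, soh.ModifiedDate
FROM   Sales.SalesOrderHeader AS soh
WHERE  soh.ModifiedDate > @LastLoadDate;

-- After the load succeeds, advance the watermark:
UPDATE etl.LoadWatermark
SET    LastLoadDate = SYSDATETIME()
WHERE  TableName = N'Sales.SalesOrderHeader';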

Implement data flow
       Debug data flow; 
       use the appropriate data flow components; 
       SQL / SSIS data transformation; 
       create SSIS packages that support slowly changing dimensions; 
       use the lookup task in SSIS; 
       map identities using SSIS fuzzy lookup (advanced); 
       specify a data source and destination; 
       use data flows; 
       different categories of transformations; 
       read, transform and load data; 
       understand which transforms to use to accomplish a specific business task; 
       data correction transformation; 
       performance tune an SSIS dataflow; 
       optimize Integration Services packages for speed of execution; 
       maintain data integrity, including good data flow

Manage SSIS package execution
       Schedule package execution by using SQL Server Agent; 
       execute packages by using DTEXEC; 
       execute packages by using SQL Server Management Studio; 
       implement package execution; 
       plan and design package execution strategy; 
       use PowerShell to execute script; 
       monitor the execution using Management Studio; 
       use DTEXECUI; 
       ETL restartability
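
Besides DTEXEC, DTEXECUI, and SQL Server Agent, a package deployed to the SSIS catalog (project deployment model) can be started from T-SQL; the folder/project/package names below are assumptions:

-- Start a catalog-deployed package from T-SQL
DECLARE @execution_id bigint, @logging_level smallint = 3;   -- 3 = Verbose

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'ETL',
     @project_name    = N'MyProject',
     @package_name    = N'LoadDW.dtsx',
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,                   -- 50 = system parameter
     @parameter_name  = N'LOGGING_LEVEL',
     @parameter_value = @logging_level;

EXEC SSISDB.catalog.start_execution @execution_id;

The run can then be monitored from the catalog.executions view or the built-in SSMS reports.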

Implement script tasks in SSIS
       Determine if it is appropriate to use a script task; 
       extending the capability of a control flow; 
       perform a custom action as needed (not on every row) during a control flow


Integration Services(SSIS) connections 

Data Flow 

Data Flow Task 

Integration Services Tasks 

Slowly Changing Dimension Transformation 


Data Flow: Fuzzy Lookup Transformation 

Transformation Custom Properties: MaxOutputMatchesPerInput
 


Control Flow -- Load Data


Design control flow
       Determine control flow; 
       determine containers and tasks needed; 
       determine precedence constraints; 
       design an SSIS package strategy with rollback, staging and transaction control; 
       decide between one package or multiple packages; 
       determine event handlers; 
       determine variables; 
       determine parameters on package and project level; 
       determine connection managers and whether they are package or project level; 
       determine the need for custom tasks; 
       determine how much information you need to log from a package; 
       determine the need for checkpoints; 
       determine security needs

Implement package logic by using SSIS variables and parameters
       User variables; 
       variable scope, data type; 
       implement parameterization of properties using variables; 
       using variables in precedence constraints; 
       referring to SSIS system variables; 
       design dynamic SSIS packages; 
       package configurations (file or SQL tables); 
       expressions; 
       package and project parameters; 
       project level connection managers; 
       variables; 
       implement dynamic package behavior; 
       configure packages in SSIS for different environments, package configurations 
           (XML configuration file, 
            SQL Server table, 
            registry entry, 
            parent package variables, 
            environment variable); 
       parameters (package and project level); 
       project connection managers; 
       property expressions (use expressions for connection managers)
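
For the "SQL Server table" configuration type, SSIS stores package configurations in a table it creates for you ([SSIS Configurations] by default); a sketch of that default structure and a sample row, where the filter, server, and connection-manager names are assumptions:

-- Default table created for SQL Server package configurations
CREATE TABLE dbo.[SSIS Configurations]
(
    ConfigurationFilter nvarchar(255) NOT NULL,
    ConfiguredValue     nvarchar(255) NULL,
    PackagePath         nvarchar(255) NOT NULL,
    ConfiguredValueType nvarchar(20)  NOT NULL
);

-- Example row: point a connection manager's connection string at a test server
INSERT INTO dbo.[SSIS Configurations]
VALUES (N'DW_Connections',
        N'Data Source=TESTSQL01;Initial Catalog=DW;Integrated Security=SSPI;',
        N'\Package.Connections[DW].Properties[ConnectionString]',
        N'String');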

Implement control flow
       Checkpoints; 
       debug control flow; 
       implement the appropriate control flow task to solve a problem; 
       data profiling; 
       use sequence containers and loop containers; 
       manage transactions in SSIS packages; 
       managing parallelism; 
       using precedence constraint to control task execution sequence; 
       creating package templates; 
       using the execute package task

Implement data load options
       Implement a full and incremental data load strategy; 
       plan for an incremental update of the relational Data Mart; 
       plan for loads into indexed tables; 
       configure appropriate bulk load options; 
       select an appropriate load technique (SSIS Destination versus T-SQL) and load partitioned tables
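
A hedged sketch of one load technique for partitioned fact tables: bulk load into an empty staging table that matches the target's structure, indexes, and filegroup, then switch it into the target partition (all names, the partition number, and the file path are assumptions):

-- 1) Bulk load the staging table; TABLOCK helps get minimal logging under the simple/bulk-logged recovery model
BULK INSERT dbo.FactSales_Stage
FROM 'D:\loads\sales_201603.dat'
WITH (TABLOCK, BATCHSIZE = 100000, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- 2) Constrain the staging data to the target partition's boundary, then switch it in
ALTER TABLE dbo.FactSales_Stage WITH CHECK
    ADD CONSTRAINT ck_stage_datekey CHECK (DateKey >= 20160301 AND DateKey < 20160401);

ALTER TABLE dbo.FactSales_Stage
    SWITCH TO dbo.FactSales PARTITION 4;   -- partition number assumed for this boundary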

Implement script components in SSIS
       Create an SSIS package that handles SCD Type 2 changes without using the SCD component; 
       work with script component in SSIS; 
       deciding when it is appropriate to use a script component versus a built-in 
       source, transformation, or destination component; 
       use cases: web service source and destination, getting the error message

Integration Services transactions: Required, Supported, NotSupported

Developing a custom task: assembly, gacutil

Custom Objects: global assembly cache (GAC)

Integration Services (SSIS) parameters: Project.params



Configure and deploy SSIS solutions


Troubleshoot data integration issues
       Performance issues; 
       connectivity issues; 
       execution of a task or transformation failed; 
       logic issues; demonstrate awareness of the new SSIS logging infrastructure; 
       troubleshoot a failed package execution to determine the root cause of failure; 
       troubleshoot SSIS package failure from an invalid datatype; 
       implement breakpoints; data viewers; 
       profile data with different tools; 
       batch cleanup

Install and maintain SSIS components
       Software installation (IS, management tools); 
       development box and server; 
       install specifics for remote package execution; 
       planning for installation (32- versus 64-bit); 
       upgrade; 
       provisioning the accounts; 
       creating the catalog

Implement auditing, logging, and event handling
       Audit package execution by using system variables; 
       propagate events; 
       use log providers; 
       log an SSIS execution; 
       create alerting and notification mechanisms; 
       use Event Handlers in SSIS to track ETL events and errors; 
       implement custom logging
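
With the project deployment model, the catalog records executions and events automatically; a sketch of querying SSISDB for failed runs and their error/warning messages (the execution_id filter is a placeholder):

-- Most recent failed executions
SELECT e.execution_id, e.folder_name, e.project_name, e.package_name, e.start_time, e.end_time
FROM   SSISDB.catalog.executions AS e
WHERE  e.status = 4                       -- 4 = failed, 7 = succeeded
ORDER BY e.start_time DESC;

-- Error and warning messages for one execution
SELECT m.message_time, m.message_source_name, m.message
FROM   SSISDB.catalog.event_messages AS m
WHERE  m.operation_id = 12345             -- execution_id from the query above (placeholder)
  AND  m.message_type IN (110, 120)       -- 110 = warning, 120 = error
ORDER BY m.message_time;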

Deploy SSIS solutions
       Create and configure an SSIS catalog; 
       deploy SSIS packages by using the deployment utility; 
       deploy SSIS packages to SQL or file system locations; 
       validate deployed packages; 
       deploy packages on multiple servers; 
       how to install custom components and tasks; 
       deploy SSIS packages by using DTUTIL
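
The project deployment model can also be scripted with T-SQL instead of the wizard or DTUTIL; a sketch that reads an .ispac file and deploys it to the catalog (folder, project, and path are assumptions):

-- Deploy an .ispac project to the SSIS catalog
DECLARE @project varbinary(max);

SELECT @project = BulkColumn
FROM   OPENROWSET(BULK N'C:\Deploy\MyProject.ispac', SINGLE_BLOB) AS ispac;

EXEC SSISDB.catalog.create_folder  @folder_name = N'ETL';

EXEC SSISDB.catalog.deploy_project @folder_name    = N'ETL',
                                   @project_name   = N'MyProject',
                                   @project_stream = @project;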

Configure SSIS security settings
       SSIS catalog database roles; 
       package protection levels; 
       secure Integration Services packages that are deployed at the file system; 
       secure Integration Services parameters, configuration
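
A small sketch of granting catalog access through the SSISDB database roles; the Windows group name is an assumption:

USE SSISDB;
GO
-- assumes the DOMAIN\ETL_Operators login already exists on the instance
CREATE USER [DOMAIN\ETL_Operators] FOR LOGIN [DOMAIN\ETL_Operators];
ALTER ROLE ssis_admin ADD MEMBER [DOMAIN\ETL_Operators];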

Troubleshooting tools for package development 

Load-balancing packages on remote servers by using SQL Server Agent 

Integration Services (SSIS) logging: the sysssislog table in a SQL Server database.

Integration Services (SSIS) Logging 
system table: sysssislog 
Contains one row for each logging entry that is generated by packages or their tasks and containers at run time. 
This table is created in the msdb database when you install Microsoft SQL Server Integration Services. 
If you configure logging to log to a different SQL Server database, a sysssislog table with this format is created 
in the specified database.
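
A sketch of reading that table back for errors and warnings written by the SQL Server log provider:

SELECT  starttime, event, source, message
FROM    dbo.sysssislog              -- msdb by default, or the database the log provider points at
WHERE   event IN (N'OnError', N'OnWarning')
ORDER BY starttime DESC;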

Data Viewer 



Build data quality solutions


Install and maintain data quality services
       Installation prerequisites; 
       .msi package; 
       adding users to the DQ roles; 
       identity analysis, including data governance

Implement master data management solutions
       Install Master Data Services (MDS); 
       implement MDS; 
       create models, entities, hierarchies, collections, attributes; 
       define security roles; 
       import/export; 
       subscriptions

Create a data quality project to clean data
       Profile Online Transaction Processing (OLTP) and other source systems; 
       data quality knowledge base management; 
       create data quality project; 
       use data quality client; 
       improve data quality; 
       identity mapping and deduplicating; 
       handle history and data quality; 
       manage data quality/cleansing

Install Data Quality Services (DQS) 

DQSInstaller: upgradedlls, upgrade, uninstall, collation

Grant DQS Roles to Users: dqs_administrator, dqs_kb_editor, dqs_kb_operator

Install Master Data Services 

Master Data Services features and tasks 

matching policy: each domain in a matching rule has a similarity (Exact or Similar), a weight, and a prerequisite flag
similarity = Exact, weight = 0 or ignored, prerequisite = Yes (a prerequisite domain must match exactly and carries no weight)
similarity = Similar, weight applies (e.g. 99), prerequisite = No (the weights of the non-prerequisite domains must total 100)



Measures, Measure Groups and Semiadditive Behavior


Measures and Measure Groups 

Define Semiadditive Behavior 
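
In SSAS this is configured with the measure's AggregateFunction (e.g. LastNonEmpty); the relational sketch below shows why an account balance is semi-additive, i.e. summable across accounts but not across time (the FactAccountBalance and DimDate names are assumptions):

-- Month-end balance: take the last snapshot date of each month, then sum across accounts
WITH LastDayPerMonth AS
(
    SELECT d.CalendarMonth, MAX(d.DateKey) AS LastDateKey
    FROM   dbo.DimDate AS d
    JOIN   dbo.FactAccountBalance AS f ON f.DateKey = d.DateKey
    GROUP BY d.CalendarMonth
)
SELECT   l.CalendarMonth,
         SUM(f.Balance) AS MonthEndBalance   -- additive across accounts, not across dates
FROM     dbo.FactAccountBalance AS f
JOIN     LastDayPerMonth AS l ON l.LastDateKey = f.DateKey
GROUP BY l.CalendarMonth;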



dtexec


dtexec  dtexec  2008R2 
bimonkey dtexec dtexecui 
erikhaselhofer dtexec etc 
DTS dtexec 

Able to load packages from the following sources:
Integration Services server
.ispac project file
SQL Server database
SSIS Package Store
File system

rem execute an SSIS package saved to SQL Server using Windows authentication
dtexec /SQL my_packagename /SER my_servername\my_instancename
dtexec /sq pkgOne /ser productionServer
rem validate the package only, without executing it
dtexec /sq pkgOne /ser productionServer /va

rem execute an SSIS package saved to the SSIS Package Store (on the file system)
dtexec /DTS "\File System\MyPackage"
dtexec /dts "\File System\MyPackage"

rem execute an SSIS package saved in the file system
dtexec /FILE "C:\myssispackage.dtsx" /MAXCONCURRENT "-1" /CHECKPOINTING OFF  /REPORTING EWCDI

rem logging options
dtexec /FILE "C:\myssispackage.dtsx" /l "DTS.LogProviderTextFile;c:\log.txt"
dtexec /f "c:\pkgOne.dtsx" /l "DTS.LogProviderTextFile;c:\log.txt"

dtexec /f mypackage.dtsx /set \package.variables[myvariable].Value;myvalue



dtexecui


dtexecui (a 32-bit utility)

Able to run packages in one of the following locations:
SQL Server
SSIS Package Store
File system

Option Pages:
Configurations Page (.dtsconfig)
Command Files Page
Connection Managers Page
Execution Options Page
Reporting Page
Logging Page ( Log Provider )
Set Values Page
Verification Page
Command Line Page

dtutil



dtutil 
bimonkey 
sqlblogcasts 

To copy, move, delete or verify a package in the following locations:
SQL Server
SSIS Package Store
File system

On a 64-bit computer, the 64-bit version of DTUTIL is installed.  
If you need the 32-bit version, you will need to install it.  
By default, if both are installed, the 32-bit version will run.



rem Copy a package stored in msdb on localhost to the file system:
dtutil /sourceserver localhost /SQL "Package" /copy file;.\Package.dtsx

rem Checking if the Test folder exists in the MSDB database on the laptop04\dev server:
dtutil /SourceServer laptop04\dev /FExists SQL;\Test

rem Copying a package from the file system to the SSIS Package Store:
dtutil /File C:\Package.dtsx /Copy DTS;Package
dtutil /Fi C:\Package.dtsx /C DT;Package

rem Copying a package from msdb on a named instance of SQL Server (using SQL Server Authentication) to the file system
dtutil /SQL Folder\Package /SourceServer SQLSERVER.INSTANCE /SourceUser Monkey_User /SourcePassword ^
P@$$word /Copy File;C:\Package.dtsx

dtutil /SQ Folder\Package  /SourceS      SQLSERVER.INSTANCE /SourceU    Monkey_User /SourceP   ^
P@$$word /C FI;C:\Package.dtsx


rem Create a Test folder in the root of MSDB database on the laptop04\dev server:
dtutil /SourceServer laptop04\dev /FCreate SQL;\;Test

rem Copy a package from file system into a specified MSDB folder with a specified name:
dtutil /File packagename.dtsx /DestServer laptop04\dev /Copy SQL;\Test\newpackagename

rem Move a package from one MSDB folder into another:
dtutil /SourceServer laptop04\dev /SQL \Test\package /DestServer laptop04\dev /Move SQL;\Prod\package

rem Check if a package exists in the specified MSDB folder:
dtutil /Exists /SourceServer laptop04\dev /SQL \Test\Package

rem Using SQL Server authentication method instead of Windows Authentication for delete:
dtutil /SQL Package /SourceUser SSIS_User /SourcePassword [password] /Delete

 The greatest disadvantage of using dtutil is that you cannot deploy package configurations with it.
 To work around this, specify the configuration at execution time instead:
  1. run the package with DTEXEC (with or without the GUI), where you can set which config file to use, or
  2. create a scheduled SQL Server Agent job, where you can also specify which config file to use.



DTSWizard.exe: Import and Export Wizard



Import and Export Wizard: DTSWizard.exe

C:\Program Files\Microsoft SQL Server\130\DTS\Binn for the 64-bit version
C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn for the 32-bit version

export and import wizard

example

Press the F1 key from any page or dialog box of the wizard to see documentation for the current page.

-- -------------

SQL Server Data Tools (SSDT):
1) Solution Explorer
2) Visual Studio




gacutil



rem list all the assemblies in GAC
gacutil.exe /l

rem list all installed versions of a specific assembly
gacutil.exe /l my_assembly_name

rem install a dll file into the GAC
gacutil.exe /i "C:\Program Files\Microsoft SQL Server\100\DTS\TE.SSIS.DataFlow.dll"

rem uninstall an assembly from GAC
gacutil.exe /u my_assembly_name

rem GAC Directory Structure
C:\Windows\Microsoft.NET\assembly\GAC_32
C:\Windows\Microsoft.NET\assembly\GAC_64
C:\Windows\Microsoft.NET\assembly\GAC_MSIL

gacutil




SSIS Package Store


%Program Files%\Microsoft SQL Server\{Version}\DTS\Packages

SSIS Package Store
The package store can consist of the msdb database, the file system folders listed 
in the Integration Services service configuration file, or both.
Packages that you save to msdb are stored in a table named sysssispackages.
system table: sysssispackages 

The logical folders that you create for grouping packages in msdb are represented as rows in the 
sysssispackagefolders table in msdb.
system table: sysssispackagefolders 
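
A sketch of listing the packages saved to msdb together with their logical folders, using the two system tables above:

SELECT   f.foldername,
         p.[name]        AS package_name,
         p.[description],
         p.createdate
FROM     msdb.dbo.sysssispackages       AS p
JOIN     msdb.dbo.sysssispackagefolders AS f ON f.folderid = p.folderid
ORDER BY f.foldername, p.[name];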

ssis package store 
File System
You can use the SSIS Package Store, which is simply a well-known folder under the installation location.
%Program Files%\Microsoft SQL Server\{Version}\DTS\Packages

Or you can pick anywhere on the file system you like. If you go this route, then you'll need to 
ensure that the SQL Agent account, the credentialed proxies, or, if you are running packages from xp_cmdshell, 
the SQL Server service account has access to that location.

The only advantage, if you want to call it that, of using the Package Store (i.e. the folder mentioned above) 
is that you can use the Integration Services management tool that exists in SSMS 
(by connecting to Integration Services instead of the Database Engine).

However, that approach has a lot of pitfalls, such as not being able to handle multiple instances, 
packages only running in 64-bit mode, no access to proxy accounts, etc. You shouldn't run packages from SSMS anyway.

Saving SSIS results to Log or text file using dtexec 




Question 12: Transformation: Slowly Changing Dimension Transformation


You work as a senior database administrator at ABC.com. The ABC.com network
consists of a single domain named ABC.com. ABC.com makes use of Microsoft SQL
Server 2012 in their environment. You are running a training exercise for Microsoft
SQL Server 2012 junior administrators. You are discussing the use of Slowly
Changing Dimension Transformation Outputs. One of the output options causes
Derived Column transformations to create columns for the expired row and the
current row indicators. Which option is the output that causes this?
A. Unchanged Output
B. Inferred Member Updates Output ( dimension table rows to be loaded )
C. Historical Attributes Inserts Output ( Type II)
D. Fixed Attribute Output
E. Changing Attributes Updates Output (Type I )

Answer C: Historical Attributes Inserts Output

Slowly Changing Dimension Transformation
notes 1: coordinates the updating and inserting of records in data warehouse dimension tables 
notes 2: The Slowly Changing Dimension Wizard only supports connections to SQL Server. 
notes 3: The Slowly Changing Dimension transformation does not support Type 3 changes, which require changes to the dimension table. By identifying columns with the fixed attribute update type, you can capture the data values that are candidates for Type 3 changes.



Question 54


Q54.
You are designing a data warehouse with two fact tables. 
The first table contains sales per month 
and the second table contains orders per day.
Referential integrity must be enforced declaratively. 
You need to design a solution that can join a single time dimension to both fact tables.
What should you do?

A. Create a time mapping table.
B. Change the level of granularity in both fact tables to be the same.
C. Merge the fact tables.
D. Create a view on the sales table.

Correct Answer: B

notes 1: another source says A ("Create a time mapping table") is right, but that is wrong; 
         its reasoning hints that a data warehouse uses fewer tables and can be de-normalized.

notes 2: Create a time dimension that can join to both fact tables at their respective granularity. 
         (In SSAS, create a time dimension; you don't necessarily need to create a time mapping table in the database.)

utestking search term: design a solution that can join a single time dimension to both fact tables



Question 71



QUESTION 71
You are adding a new capability to several dozen SQL Server Integration Services
(SSIS) packages. The new capability is not available as an SSIS task. Each package
must be extended with the same new capability. You need to add the new capability to
all the packages without copying the code between packages. What should you do?
A. Use the Expression task.
B. Use the Script component.
C. Use the Script task.
D. Develop a custom task.
E. Develop a custom component.

Correct Answer: D

notes 1: Script task (control flow), Script component (data flow); 
         custom task (specific), custom component (generic)


Developing Custom Objects for Integration Services
 

Extending Packages with Custom Objects 

Developing a Custom Task (2012) 

"SSIS component" is a general term that covers tasks, data sources, 
and other objects that can be added to the SSIS toolbox.

Some of the custom object types that can be implemented are 
custom tasks, custom connection managers, custom log providers, 
custom enumerators, and custom data flow components.

search term: You need to add the new capability to all the packages without copying the code between packages



Question parallel



Q:
You are performance tuning a SQL Server Integration Services (SSIS) package to 
load sales data from a source system into a data warehouse that is hosted on 
Windows Azure SQL Database. The package contains a data flow task that has 
seven source-to-destination execution trees. Only three of the source-to-destination 
execution trees are running in parallel.

You need to ensure that all the execution trees run in parallel.What should you do?

A. Set the EngineThreads property of the data flow task to 7.

B. Set the MaxConcurrentExcecutables property of the package to 7.

C. Create seven data flow tasks that contain one source-to-destination execution tree each.

D. Place the data flow task in a For Loop container that is configured to execute seven times.

Answer: A
notes: MaxConcurrentExecutables vs. EngineThreads

MaxConcurrentExecutables: a package-level property that limits how many executables (tasks and containers) can run in parallel. 
For example, in a package that has three Data Flow tasks, setting MaxConcurrentExecutables to 3 allows all three Data Flow tasks to run simultaneously. 

EngineThreads: The EngineThreads property is a property of each Data Flow task. 
This property defines how many threads the data flow engine can create and run in parallel. 
The EngineThreads property applies equally to both the source threads that the data flow engine 
creates for sources and the worker threads that the engine creates for transformations 
and destinations. 
Therefore, setting EngineThreads to 10 means that the engine can create up to ten source threads 
and up to ten worker threads.

70463exam 
search term: ensure that all the execution trees run in parallel ssis question 70-463



Question Data Profiling Task



Data Profiling task 

Data Profiling Task and Viewer 

Column Statistics Profile:
Reports statistics, such as minimum, maximum, average, and standard deviation for numeric columns, 
and minimum and maximum for datetime columns.
This profile helps you identify problems in your data, such as dates that are not valid. 
For example, you profile a column of historical dates and discover a maximum date that is in the future.

Column Value Distribution Profile
Reports all the distinct values in the selected column and the percentage of rows in the table 
that each value represents. Can also report values that represent more than a specified percentage of rows in the table.
This profile helps you identify problems in your data, such as an incorrect number of distinct values in a column. 
For example, you profile a column that is supposed to contain states in the United States and discover 
more than 50 distinct values.
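
A rough T-SQL equivalent of a column value distribution profile (the Customers table and StateProvince column are assumptions):

SELECT   StateProvince,
         COUNT(*)                                  AS row_count,
         100.0 * COUNT(*) / SUM(COUNT(*)) OVER ()  AS pct_of_rows
FROM     dbo.Customers
GROUP BY StateProvince
ORDER BY row_count DESC;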

dbj 

simple talk