Parallel Data Warehouse (PDW) Tip: Proof of Concept (POC) Guidelines

As with any other project, our objective is to drive PDW POC success! To clear and cultivate the playing field for successful POC engagements, we must, at a minimum, consider the following guidelines.

Guideline 1: Identify Existing Roadblocks and De-scope
Major POC challenges can manifest themselves in many different forms, e.g. inability to seed data, dealing with sub-par data warehouse architectures, or wrangling poorly conceived T-SQL/ETL architectures.  If these challenges exist in the current production environment and are the primary driver behind the PDW POC initiative, approach with caution: these core issues can take far more time to address than minor issues such as the lack of an SSIS framework, data delivery issues, or the inability to generate very large data volumes.  If show-stoppers exist, either de-scope the issues/components from the POC altogether or find a sustainable workaround.

Guideline 2: Gather Accurate Business and Technical Requirements
This should be fairly standard, as the client will have a core dataset and/or processes that they would like to see run on the PDW appliance, coupled with an 'ideal' runtime. Keep the deliverables manageable and apply the SMART methodology (Specific, Measurable, Achievable, Realistic, Timescale). Be sure to gather base requirements for documentation, data volumes, environmental requirements, etc.  For example, the client may want to stage both a dev and a test environment, they may only want a schema's worth of tables/data brought over from an existing SMP database (note: PDW does not support schemas), or they may want to bring over 2 years of history rather than 10.  Keep historical data requirements to a minimum, as it may be difficult to find extraction windows on existing source systems. Similarly, data loads into the PDW via the dwloader command-line tool or the SSIS PDW destination (slower than dwloader) can take a significant amount of time and may run over the course of several days depending on source system availability and data volumes.  Perhaps most importantly, assess the current environment, data volumes, and processing windows to manage realistic expectations! Although there have been instances where specific loading routines resulted in a 100X improvement in load times, this is not always the case, and that factor should not be used as a baseline.
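
For illustration, a single dwloader invocation along the lines below can bulk load one delimited file into a target table. The server address, credentials, file paths, table name, and delimiters here are purely hypothetical, and the available switches vary by appliance version, so treat this as a sketch rather than a reference:

dwloader.exe -S 10.10.10.100 -U pdw_loader -P ********** ^
    -i \\fileserver\poc\policy_activity_20121015.txt ^
    -T POC_DB.dbo.PolicyActivity ^
    -t "|" -r \r\n -M append ^
    -R \\fileserver\poc\policy_activity_rejects.txt

Multiplied across dozens of historical extract files, even a well-tuned load like this consumes the available load window quickly, which is another reason to trim the historical scope up front.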

Guideline 3: Streamline Testing Requirements
When addressing the testing phase, do not overload the POC activities with extraneous testing/reconciliation requirements, and keep the scope of testing and data validation to a minimum. Simplify the testing criteria to critical dataset load times vs. data volumes.  The success criteria should be clear, e.g. Company GECKO has defined success as loading 2 TB of prior-day policy activity data in under 30 minutes each day for 7 consecutive days, and Company STERNXM defines success as loading 20 regional subscriber datasets via flat file concurrently in under 2 hours once a week.  The client may also be interested in testing ad-hoc queries and data loads while the appliance is actively processing large data load requests. At a minimum, the testing process will address the following (a simple way to capture the timings is sketched after the list):

  • Data Load
  • Data Expansion
  • Query Performance
  • Query Concurrency
  • Mixed Workload Concurrency
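
One lightweight way to capture load and query runtimes for these tests is to read them from the appliance's request DMV instead of timing by hand. The query below is only a sketch: it assumes the sys.dm_pdw_exec_requests DMV exposed by PDW, and the status filter simply limits the output to finished requests.

-- Pull runtimes for completed requests (loads, queries, concurrency tests)
SELECT request_id,
       [status],
       submit_time,
       end_time,
       total_elapsed_time,   -- elapsed time in milliseconds
       command
FROM sys.dm_pdw_exec_requests
WHERE [status] = 'Completed'
ORDER BY submit_time DESC;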

Guideline 4:  Resource Planning and Communication
While keeping project activities within scope, ensure that the team has at least the minimum number of resources dedicated to the project.  In-house members of a POC delivery team usually include, at minimum, a DBA, a Business Analyst, and a Data Developer/Engineer. Keep in mind that most streamlined POC engagements wrap up within a 2-3 week timeframe, so if the delivery requirements are modified (heaven forbid) to, say, include more complex PDW processing routines or additional table loads, always remain communicative and provide updates/recommendations on resource budgets and the overall POC timeline.  Additionally, provide daily email updates and try to incorporate simple graphical representations of project progress (e.g. RAG traffic lights). While never under-estimating, pad for the unexpected and ensure that time is added to the plan to account for contingencies.

Guideline 5: Partnership with Microsoft PDW COE
In a recent PDW implementation, I was involved with a few PDW-specific challenges related to query optimization, data loading, and appliance software upgrades.  As a result of both the client's and the consulting company's partnership with the Microsoft COE, the downtime associated with these challenges was minimized. The partnership led to a smoother and more efficient PDW POC delivery, as the COE was ready and willing to recommend and demonstrate solutions that aligned with MS COE best practices. Ultimately, the COE's recommendations and actions directly benefited system usability and sustainability and ensured a successful POC delivery.

I would like to build a collection of guidelines within this post to benefit future PDW POC implementations!  If you have an idea or want to expand on one of these guidelines, feel free to share your personal experiences and/or workarounds to common POC challenges.

I want to thank the following members of the PDW community for providing insight that led to the formulation of the POC guidelines above:

Matt Goswell
James Herring
Ralph Kemperdick
Derek Comingore
Matthew Winter

Visio 2010 Workaround: Application Hangs when opening ‘Database Model Diagram’ template

I received the following error today when attempting to use Visio's 'Database Model Diagram' template under the 'Software and Database' category.

The program VISIO.EXE version 14.0.6122.5000 stopped interacting with Windows and was closed. To see if more information about the problem is available, check the problem history in the Action Center control panel.  Process ID: 3fa8  Start Time: 01cdad7f36f7bba9  Termination Time: 0  Application Path: D:\Program Files\Microsoft Office\Office14\VISIO.EXE  Report Id: 9c9eef0a-1972-11e2-9a12-005056995c65

Sure enough, per the seemingly improbable suggestion made in the post below, Visio did not hang after changing the default printer to 'Microsoft XPS Document Writer'. Don't make the mistake I did and change the default printer to 'Microsoft XPS Document Writer (redirected 7)'.

http://answers.microsoft.com/en-us/office/forum/office_2010-visio/visio-2010-not-responding-when-opening-database/64b2b582-53f0-41f2-bb47-d3a3f36aeb82

As it turns out, Remote Desktop (RDP) redirects the client's printers into the remote session, and these redirected printers can cause issues with Visio's templates.

Below is similar to what you should see once you’ve assigned the correct default printer.

Parallel Data Warehouse (PDW) Performance Tip: CTAS vs INSERT SELECT

When loading tables in Microsoft's Parallel Data Warehouse (PDW) via T-SQL, one will soon encounter the need to load data into a new table. Whether the developer needs to perform an initial data load of a table or go through the PDW-specific steps to perform a 'trunc and load' of a table, there are two table load methods to consider.

PDW T-SQL Load Methods:

  • INSERT SELECT
  • CREATE TABLE AS SELECT(CTAS)

To make a direct comparison between the two methods, I have provided the actual PDW query and execution plans generated by each method using identical SELECT statements.

Load Method: INSERT SELECT  

Usage
Appends one or more new rows to a table. Add one row by listing the row values or add multiple rows by inserting the results of a SELECT statement.

Example
INSERT INTO TABLEA SELECT only, use, this, method, to, append FROM TABLEB;

Generated Query Plan


Generated Execution Plan

Load Method: Create Table as Select (CTAS)

Usage
Creates a new table in SQL Server PDW that is populated with the results from a SELECT statement.

Example 
CREATE TABLE myTable
WITH
( DISTRIBUTION = HASH (yearId),
  CLUSTERED INDEX (yearId) )
AS
SELECT yearId, some, stuff
FROM TABLEB;

Generated Query Plan

Generated Execution Plan

*When comparing to INSERT SELECT, notice the absence of the Index Scan and Parallelism steps above.  CTAS results in an Index Seek for this particular SELECT and also does away with the Parallelism steps.

The Verdict: Use CREATE TABLE AS SELECT (CTAS) when loading empty tables!

Performance Results
CTAS loaded the test table more than twice as fast as INSERT SELECT.

  • CTAS runtime: 1min  44sec
  • INSERT SELECT runtime: 4min  2sec 

Simple Explanation
On the PDW, INSERT SELECT is a fully logged operation and runs serially per distribution. CTAS, however, is minimally logged and runs in parallel across all nodes.
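
To tie this back to the 'trunc and load' scenario mentioned at the start of this post, the following is a minimal sketch of one way to rebuild a table with CTAS and swap it into place. The table, column, and statistics names are purely illustrative, and the DISTRIBUTION and RENAME OBJECT syntax should be verified against your appliance version:

-- Rebuild the table from a staging table using CTAS (minimally logged, parallel)
CREATE TABLE dbo.FactPolicy_new
WITH
( DISTRIBUTION = HASH (policyId),
  CLUSTERED INDEX (policyId) )
AS
SELECT policyId, policyDate, premiumAmount
FROM dbo.FactPolicy_staging;

-- Swap the rebuilt table into place
DROP TABLE dbo.FactPolicy;
RENAME OBJECT dbo.FactPolicy_new TO FactPolicy;

-- Re-create statistics on the rebuilt table (see the note below)
CREATE STATISTICS stat_FactPolicy_policyDate ON dbo.FactPolicy (policyDate);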

NOTE: When performing a CTAS or re-creating tables in any way, you will need to re-create statistics on those tables after loading.

Ideally, you should create statistics on all columns used in joins, GROUP BY, ORDER BY, and WHERE (restriction) clauses. SQL Server PDW does not automatically create and update statistics on the Control node every time SQL Server creates or updates statistics on the Compute nodes:

-- This will generate a CREATE STATISTICS statement for every column on every table
-- that does not already have a statistics object on that column
select 'create statistics ' + b.name + ' on dbo.' + a.name + ' (' + b.name + ')'
from sys.tables a, sys.columns b
where a.object_id = b.object_id
and not exists (
    select null from sys.stats_columns
    where object_id in (select object_id from sys.stats_columns group by object_id having count(*) >= 1)
    and object_id = b.object_id and column_id = b.column_id)
order by a.name, b.column_id;
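
Note that the statement above only generates the CREATE STATISTICS commands as text; copy the result set into a new query window and execute it to actually build the statistics.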
