WIDBA

WIDBA - The busiest DBA in Wisconsin, it's all I do and I do it well

Wednesday, January 30, 2013

Disabling SSIS properties quickly courtesy of SSISDB

Today I am going to tackle something that came up recently at work - finding the error in an SSIS package and disabling the problematic step so the rest of the processing could continue.  (In a perfect world the package would compensate for the "unimportant" step, but things happen when you have hundreds of packages and release software faster than Java releases updates.)

The problem came up when a small file could not be opened and the failing step was buried inside a much larger SSIS package, so simply skipping the whole package was not prudent.

A few solutions came to mind:

  1. Try to figure out the root problem and fix that
  2. Open up BIDS, simply disable the step, and run the package in its entirety.
  3. Open up the Job and navigate to the configuration for the step and override the property on the step to disabled.
  4. Fix the issue (change a precedent constraint and a property) and just go to production without a second thought to creating more problems :)
I am a recovering DBA, so I went toward number 1 right out of the gate.  That proved not to be feasible given the importance of the step, the fact it involved people outside the team, etc.  Number 2 was a very real possibility, but I hate doing "out of band" things in production.  Number 3 ended up being the method I went with.  Number 4 was the last resort - a DBA's nightmare, and it turns out probably a common one in small Agile shops.

So, I needed to figure out what to disable.  In this case I found the error message using SSMS, navigating through Integration Services Catalogs -> Project -> Package -> Reports -> All Executions and reading the error.  I then opened up the project in BIDS to find the path to the task that needed to be disabled.  (Click on the task; there is a property that holds the "path".)  This tends to take a while to accomplish.  Here is a much quicker way to get there.

Here is the SQL to query the SSISDB catalog in SQL 2012.  The catalog was introduced in SQL 2012, so this will not work on older versions.

DECLARE @project_name varchar(128) = 'BeersOfWisconsin'
DECLARE @package_name varchar(128) = 'LoadBreweryDim.dtsx'
DECLARE @DaysLookBack int = -7;

SELECT start_time,end_time,package_path, message,execution_path
FROM catalog.executions ex
INNER JOIN catalog.event_messages evt on ex.execution_id = evt.operation_id
WHERE ex.project_name = @project_name
AND ex.package_name = @package_name
AND ex.status = 4 -- 4 = failed
AND message_type >= 100 -- as of now 110 is warning, 120 is Error, 100 is QueryCancel, 130 is TaskFailed, 140 is Diagnostic
AND start_time > DATEADD(Day,@DaysLookBack,CURRENT_TIMESTAMP)
ORDER BY execution_id DESC
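
If you would rather pull this from PowerShell, here is a minimal sketch of the same lookup using Invoke-Sqlcmd (from the SQL Server PowerShell tools); the server name is made up, so point it at wherever your SSISDB lives.

 $query = @'
 SELECT ex.start_time, evt.package_path, evt.message, evt.execution_path
 FROM catalog.executions ex
 INNER JOIN catalog.event_messages evt ON ex.execution_id = evt.operation_id
 WHERE ex.project_name = 'BeersOfWisconsin'
  AND ex.package_name = 'LoadBreweryDim.dtsx'
  AND ex.status = 4 -- 4 = failed
  AND evt.message_type >= 100
  AND ex.start_time > DATEADD(Day, -7, CURRENT_TIMESTAMP)
 ORDER BY ex.execution_id DESC
'@
 # server name below is hypothetical
 Invoke-Sqlcmd -ServerInstance 'SQLPROD01' -Database 'SSISDB' -Query $query |
  Select-Object start_time, execution_path | Format-Table -AutoSize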

Most of that is for information, but the last column has an execution_path that in my case yielded exactly what I needed (note the trailing task name):

\BeersOfWisconsin\DimBreweryLoad\DimDistributor\DimDistributor - ReadFile
Doing a little surgery on this, I end up with what I need for the Configuration Property Override path.

\Package\BeersOfWisconsin\DimBreweryLoad\DimDistributor.Properties[Disable]
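
That surgery is easy to script; here is a small sketch using the example path from above (it assumes, as in this case, that you want to disable the parent container of the failing task):

 # execution_path as returned by catalog.event_messages
 $executionPath = '\BeersOfWisconsin\DimBreweryLoad\DimDistributor\DimDistributor - ReadFile'
 # chop off the failing task at the end, keeping its parent container
 $containerPath = $executionPath.Substring(0, $executionPath.LastIndexOf('\'))
 # prefix the package root and append the property to disable
 $overridePath = '\Package' + $containerPath + '.Properties[Disable]'
 $overridePath  # \Package\BeersOfWisconsin\DimBreweryLoad\DimDistributor.Properties[Disable]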

Now simply set the Property Value to 1, save the job, and rerun.

Note:  I have only done this as a back-tracking exercise to solve the problem faster in the future; I am not sure whether this method works with all packages and container structures.

Remember to address the root cause and remove the property!

Saw this image the other day, some days they go hand in hand :)





Sunday, January 20, 2013

Powershell and a poor man's math tutor

Math.  It is an important thing - hard to appreciate when you are 6 and learning to add, subtract and multiply, but as parents we know it's important and want our kids to do better.  To keep math interesting, I have made it increasingly harder and tried to add complexity and different delivery methods all the time; sometimes it's adding coins, or using prices in the grocery store with scenarios to keep them learning.  This weekend I whipped up an Excel spreadsheet where my son could type in answers and see right away whether they were correct.  He liked being able to use the computer for this, so today I wrote a quick program that lets him choose a symbol and the size of the numbers if he wants.  It is pretty basic and could use ShowUI, but it certainly might keep him interested for a few nights.



 [int]$lowNum = 1
 [int]$highNum = 99
 [string]$symbol = "+"
 # a typed [int] can never be $null, so test for 0 (or an empty string) when prompting
 if($lowNum -eq 0) { [int]$lowNum = Read-Host "Please enter low number for equations" }
 if($highNum -eq 0) { [int]$highNum = Read-Host "Please enter high number for equations" }
 if([string]::IsNullOrEmpty($symbol)) { $symbol = Read-Host "Please enter the symbol for the equations (+ - * /)" }
 $rand = New-Object System.Random  # one generator for the whole session
 do
 {
   # Next() treats the upper bound as exclusive, so add 1 to include $highNum
   $first = $rand.Next($lowNum, $highNum + 1)
   $last = $rand.Next($lowNum, $highNum + 1)
   [string]$ansString = "What is {0} {1} {2}?" -f $first,$symbol,$last
   [int]$answer = Read-Host $ansString
   [int]$correctAnswer = Switch($symbol)
   {
     "+" {$first + $last}
     "-" {$first - $last}
     "*" {$first * $last}
     "/" {$first / $last}  # the [int] cast rounds, so stick to friendly divisors
     default {throw "Unknown symbol: $symbol"}
   }
   If($correctAnswer -eq $answer)
   {
     $cont = Read-Host "Correct. Do you want to continue? (y/n)"
   }else
   {
     $cont = Read-Host "Incorrect. Answer is $correctAnswer. Do you want to continue? (y/n)"
   }
 }
 while($cont -eq "y")


Once their math is in good order, you can head to the woods to work on their tracking skills.


Friday, January 11, 2013

Setting up a SSAS Sync with Powershell

In the SSAS world, one method to separate processing from presentation is a two server setup.  The "processing" server is where all the heavy lifting is done, and a secondary server is where the cubes are hit by end users and reports.  Given my newbie status in the BI space I am guessing, but pretty sure, that XMLA and SSIS tasks are the common methods for syncing a cube from one server to another.  A side benefit of the script I am about to share is that it makes it really easy to move a cube from one server to another for any reason - plug in the values and hit the button.

So, a picture is worth a thousand words (this one probably doesn't qualify as I did it with Paint), but here is the general architecture.

Alright, first we will need a function to see if SSAS is actually running (with retry logic) - this is used throughout.


 function Verify-SSASConnection([Parameter(Mandatory=$True,Position=1)][Microsoft.AnalysisServices.Server]$SSASConnection,
                 [Parameter(Mandatory=$True,Position=2)][string]$SSASName)
 {
   if($SSASConnection.Connected) {return;}

   Write-Info "Attempting ReConnect for $SSASName"

   # $retryAttempts, $retryDelay and $cubeName come from the calling script's scope
   [int]$i = 0;
   while(!($SSASConnection.Connected))
   {
     # -gt, not > (in PowerShell ">" is output redirection, not a comparison)
     if($i -gt $retryAttempts) {throw "Alert! $SSASName Unavailable, $cubeName Disconnected"}

     try{
       $SSASConnection.Connect($SSASName)
       }
     catch{
       Write-Info "retry loop for $SSASName - $i - Connected: $($SSASConnection.Connected)"
     }

     if($SSASConnection.Connected) {return;}
     $i++;
     Start-Sleep -Seconds $retryDelay

   } # looping retry
   return;
 }

The steps below can be summarized pretty nicely into this:

  1. Detach the Processing Cube, copy the entire cube folder to the Presentation Server (with "_new" appended to the name), and then reattach the Processing Cube
  2. Detach the Presentation Cube and rename its folder (with "_old")
  3. Rename the copied "_new" folder to the previous Presentation Cube folder name
  4. Bring the Presentation Cube back online
  5. Party

The script should be saved as a file and simply called via SQL Agent (an Operating System Command step in this case).  The parameters are the source (processing) server, the presentation server, the cube, and the mode it should be attached in.


 Param(  
   [Parameter(Mandatory=$True,Position=1)]  
   [alias("SRC")]  
   [string]$srcSSASName ,  
   [Parameter(Mandatory=$True,Position=2)]  
   [alias("DEST")]    
   [string]$destSSASName ,  
   [Parameter(Mandatory=$True,Position=3)]  
   [string]$CubeName ,  
   [Parameter(Mandatory=$True,Position=4)]  
   [ValidateSet("ReadOnly","ReadWrite")]  
   [string]$CubeModeParm    
 )  
 $ErrorActionPreference = "STOP"  
 [int]$retryDelay = 30  
 [int]$retryAttempts = 10  
   
 [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | out-null  
   
 $source = New-Object Microsoft.AnalysisServices.Server  
 $destination = New-Object Microsoft.AnalysisServices.Server  
   
 if($cubeModeParm -eq "ReadOnly"){  
   $cubeMode = 1  
 } else {  
   $cubeMode = 0  
 }    
   
 function Write-Info([string]$msg)  
 {  
   $dt = (Get-Date)  
   if ($host.Name -eq "ConsoleHost"){Write-Host "$msg | $dt"} else {Write-Output "$msg | $dt"}  
 }  


Now grab the server names for the physical copy.  The Regex below was not my first choice, due to the fact I still don't get regular expressions completely, but this hombre advocates for them.


 # keep only the leading machine name, e.g. "SRV01\SSAS" becomes "SRV01"
 $srcServerName = $srcSSASName -replace '^([\w\d]+).*$', '$1'
 $destServerName = $destSSASName -replace '^([\w\d]+).*$', '$1'

You will notice the use of the Write-Info function; it's my hacky way of letting the function decide how to output the myriad debugging steps in this script.  It is nice, when things aren't working, for finding the problem.  Now it's time to connect and get the data directories for each instance.


 Write-Info "Starting process to move $cubeName from $srcServerName to $destServerName"  
   
   # Source Connect  
   try {  
      $source.Connect($srcSSASName)  
   }   
   catch {  
     throw "Source SSAS $srcSSASNAME Connect $Error "  
     Exit 1  
   }    
     
   # Destination Connect  
   try {  
      $destination.Connect($destSSASName)  
   }   
   catch {  
     throw "Destination SSAS $destSSASNAME Connect $Error "  
     Exit 1  
   }    
     
   #Fetch current ssas data directories and unc    
   try {  
     $srcDataDir = $source.ServerProperties.Item("DataDir").Value  
     $destDataDir = $destination.ServerProperties.Item("DataDir").Value  
     $uncSrcDataDir = "\\{0}\{1}\" -f $srcServerName, $srcDataDir.Replace(":","$")  
     $uncDestDataDir = "\\{0}\{1}\" -f $destServerName, $destDataDir.Replace(":","$")  
   
     If((test-path -Path $uncSrcDataDir) -eq $false) {throw "Source Data Directory Does Not Exist - $srcServerName , $srcDataDir"}  
     If((test-path -Path $uncDestDataDir) -eq $false) {throw "Destination Data Directory Does Not Exist - $uncdestDataDir"}   
   
   }  
   catch [system.exception] {  
     throw "Directory setup failed - $Error[0]"  
     Exit 1  
   }  


Ok, we have connected to the SSAS servers and gotten the data directories.  The next statement is a bit trickier: SSAS database folders use a <database name>.<revision>.db naming scheme, and we need to find the most recent revision to sync.


   $srcFolderName = (gci $uncSrcDataDir | ?{$_.Name -match "$cubeName\." -and $_.psIsContainer}| sort -property LastWriteTime -des | select-object -first 1).Name  
    
If the cube exists on the destination, this code gets those pieces of data set up. 
   

   $destFolderName = (gci $uncDestDataDir | ?{$_.Name -match "$cubeName\." -and $_.psIsContainer}| sort -property LastWriteTime -des | select-object -first 1).Name  
   IF($null -eq $destFolderName)
   {
     $destFolderName = $srcFolderName
     Write-Info -msg "Destination does not exist for cube - using $srcFolderName"
   }
   $destLocalFolder = "{0}\{1}" -f $destDataDir,$destFolderName #path to cube to attach for ssas (not UNC)  
     
   #setup vars  
   $uncSrcDataDir = "{0}{1}" -f $uncSrcDataDir,$srcFolderName  
   $baseFolder = "{0}{1}" -f $uncDestDataDir,$destFolderName  
   $newDestFolder = ($uncDestDataDir + $destFolderName + "_new")  
   $oldDestFolder = ($uncDestDataDir + $destFolderName + "_old")  
       
   $srcFolderReAttach = ($baseFolder -replace '\\\\.+?\\(\w)\$','$1:')  
     
   if(test-path -Path $oldDestFolder) # Remove old folder if exists  
   {  
     Remove-Item -Path $oldDestFolder -Force -Recurse  
     Write-Info "Removing $oldDestFoler - still existed!"  
   }  
   # Double check that the "new copy of the cube" isn't already on the destination.
   if(test-path -Path $newDestFolder)   
   {  
     $newestFileDate = (gci -Path $newDestFolder | Sort-Object LastWriteTime -Descending | Select-Object -First 1).LastWriteTime   
     $dateFilter = (Get-Date).AddSeconds(($retryAttempts * $retryDelay) * -1)  
       
     if($newestFileDate -ge $dateFilter) {  
     throw "Existence of $newDestFolder with recent file dates - a Sync is already running"  
     Exit 1;  
     }  
     # wait for one delay - then double check destination connection.  
     Start-Sleep -Seconds $retryDelay   
       
     # Server objects are reference types, so no [ref] is needed when passing them
     Verify-SSASConnection $destination $destSSASName
     Remove-Item -Path $newDestFolder -Force -Recurse  
     Write-Info "$newDestFolder Exists! Likely failure on prior run."  
   }  
   
    
Now separating the wheat from the chaff - detach the source cube so the files can be copied.


   try {  
     Write-Info "Starting Detach of $cubeName"   
     $source.Databases.GetByName($cubeName).Detach();  
   }    
   catch {  
     throw "Detach failed for $cubeName $Error[0]"  
     Exit 1  
   }  
Copy the folder (it will hold lots of files, depending on the cube).  Note the finally block: even if the copy fails, the source cube will be brought back online.


   try {    
     Write-Info "Starting copy $newDestFolder "   
     Copy-Item -Path $uncSrcDataDir -Destination $newDestFolder -recurse -force  
   }    
   catch {  
     throw "Copy failed for $cubeName $Error[0] - Attempting reattach on Source"  
     Exit 1  
   }  
   finally  
   {  
     Write-Info "Attaching Source Cube $srcFolderName"  
       
     Verify-SSASConnection $source $srcSSASName
     $source.Attach($srcDataDir + "\" + $srcFolderName)  
     $source.Disconnect();    
   }  
     
   Write-Info "Starting Destination Cube Detach Process $destSSASName and replace with $newDestFolder "  
     
   if(!(test-path -Path $newDestFolder)) {throw "New Folder Copy on $newDestFolder did not succeed - $($error[0])"}

At this point the copy of the updated cube has completed and the source (processing) cube is back online for the next process command; now we detach the presentation cube.
    

 try{
     Verify-SSASConnection $destination $destSSASName

     if($destination.Databases.Contains($cubeName))
     {
       $destination.Databases.GetByName($cubeName).Detach()
       Write-Info "Detach of $destSSASName completed"
     }
   }
   catch{
     throw "Destination Detach Failed for $cubeName - $($error[0])"
   }
     
   Write-Info "Attempting Folder Swap Main:$baseFolder Old:$oldDestFolder "  
    
Once the detach is complete, we rename the current destination cube folder (appending "_old" to the name).  Upon failure, the presentation cube is brought back online.


   try{  
     if(test-path -Path $baseFolder)  
     {  
       Write-Info "Renaming Current Destination Folder $destFolderName"  
       Rename-Item -Path $baseFolder -NewName $oldDestFolder -Force   
     }  
       
   }   
   catch {
     Write-Info "Failure of folder rename $destFolderName - attempting attach of old data"

     Verify-SSASConnection $destination $destSSASName
     $destination.Attach($destLocalFolder,$cubeMode) #1=readonly
     $destination.Disconnect()

     throw "Folder Swap Failed - $baseFolder. Attempting reAttach - $($error[0])"
   }
     
   Write-Info "Renaming $newDestFolder to $destFolderName"  


The folder rename can be problematic, so there is a rolling retry around it (in testing it has only failed once in a while, and usually only for one loop).  Note the finally block again brings the cube back online on failure or success.


   try{  
     [bool]$fileRenameComplete = $false  
     [int]$retries = 0  
     while($fileRenameComplete -eq $false)  
     {  
       try {  
         Rename-Item -Path $newDestFolder -NewName $destFolderName -Force  
         $fileRenameComplete = $true;  
       } catch {  
         Write-Info "Rename Failed for $newDestFolder"  
       }  
       $retries++;  
       IF($retries -ge $retryAttempts)   
       {  
         throw "Retries for $newDestFolder rename Failed!";  
       }     
       Start-Sleep -Milliseconds $retryDelay #very brief delay before retry  
     }  
   }   
   catch {  
     throw "Folder Replace with update failed $newDestFolder - $Error[0] "   
   }   
   finally{  
     Write-Info "Attaching Destination Cube $destLocalFolder"   
       
     Verify-SSASConnection $destination $destSSASName
     $destination.Attach($destLocalFolder,$cubeMode) #1=readonly  
     $destination.Disconnect()     
   }  
   
   Write-Info "Cube attach completed - Starting $oldDestFolder removal "   

So, here is a Powershell version of a cube sync.  The base set of code was borrowed/stolen from a co-worker's implementation; I was charged with adding the retry logic and fault tolerance.

Note:  SQL Agent can be a bit painful when working with Powershell; that stuff is well documented on the web.
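
For reference, an Operating System (CmdExec) job step along these lines does the trick - the script path and server names below are made up:

 powershell.exe -NoProfile -ExecutionPolicy Bypass -File "D:\Scripts\Sync-SSASCube.ps1" -SRC "PROCSSAS01" -DEST "PRESSSAS01" -CubeName "BeersOfWisconsin" -CubeModeParm "ReadOnly"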


Wednesday, January 2, 2013

A former DBA's guide to Agile.


"Ninety-nine percent of all software sucks, with Agile we can make it a hundred" (paraphrasing a tweet from Buck Woody some years back)

The past year brought me out of the waterfall software methodology and into working in an agile shop.  I used to make fun of agile, partly because I had never seen it implemented well and partly because I was a DBA in a waterfall shop.  (If for some reason you don't know a DBA: they proceed cautiously, probably have a black leather jacket, and generally don't like change.)

So here it is in former DBA writing style - a list.

0. Forget everything you knew about project management; it hardly exists anymore as a stand-alone job.  (And C# arrays are zero based, so my list is as well.)

1. It is better to be a developer in an Agile shop, a waterfall DBA would go insane.

2. As you get more comfortable with pushing barely tested code to production, be careful that it doesn't creep into your daily life; just the other day I walked out on the ice on the river behind my house to make sure it was safe (open water a few feet away) when my ax was right there to test it first.

3. You have to completely give in to the developers; since work that has been promised to the business needs to get delivered, there is no more "control".  In any corporate power struggle the business folks win, and if you hold up their new KPI because the code is insecure or not performant, you lose.

4. Each unit of work (a story) is generally small enough to be completed in a day or two, and it might not be the finished product.  "You mean this stored procedure is only half written! You are NOT going to production with that!" says a waterfall DBA the morning of a huge deployment every three months.

5. If you ever want to be a remote worker - this is a great way to show management that work is being completed despite the fact they don't get the face time they value so much.

6. Because deliverables are generally small, the screw-ups are usually smaller but more visible - making this useful if you have a vendetta and want to get someone axed, but have a big HR department. :)

7. There seem to be fewer meetings, because things need to get done.  A huge win for those of us who lived in meeting-driven development.

8. Everything you do has a "point" value, so if you are a competitor and believe the developer with the most points wins, you get to win many times a year.

9. You will get better at estimating, or you will work more than you want to in a given sprint.  The upside is the entire team gets to put their two cents in on how complex the work actually is, so the old days of some folks overestimating their work to leave time to play on Facebook are gone.

10. If a sprint falls during mushroom season, you can just lower your velocity a bit and spend a little more time in the woods :)




I have no classical training in agile, so I am sure the pedantic readers will note I don't know the terminology or really understand what I am talking about; my lone excuse is that I am too busy trying to keep up with all the cool new technology and projects I get to work on.