WIDBA

WIDBA - The busiest DBA in Wisconsin, it's all I do and I do it well

Tuesday, November 12, 2013

Getting run time for "child" SSIS packages out of SSISDB

Today I was looking at an SSIS package whose sole job is to call a bunch of other SSIS packages in a specific order (loading a bunch of dimension tables).  Given some recent data growth, I was looking for low-hanging fruit to help shorten our load times.  Interestingly enough, neither SSISDB nor the Integration Services Catalog has an obvious way to view these "inner" SSIS package statistics via the normal channels.  My Google fu was not working after several tries, so I whipped this up.  Hopefully I hit all the keywords for the next person trying to find something similar.

I am extrapolating the "duration" from the first message to the last, so keep that in the back of your mind.  It gave me what I needed, which was a place to focus based on run time.

 SELECT exe.folder_name, exe.project_name, exe.package_name as master_package,details.*  
 FROM ssisdb.catalog.executions exe  
     INNER JOIN (  
                 SELECT op.operation_id,created_time,object_name,evt.package_name,DATEDIFF(MINUTE,stDate,EndDate) as MinDur  
                 FROM SSISDB.[internal].[operations] op  
                     INNER JOIN (  
                                 SELECT package_name,operation_id,MIN(message_time) as stDate, MAX(message_time) AS endDate  
                                 FROM ssisdb.catalog.event_messages  
                                 GROUP BY package_name,operation_id  
                                 ) evt on op.operation_id = evt.operation_id  
                 ) details ON exe.execution_id = details.operation_id  
 WHERE  exe.project_name = 'SSISProjectName'   
     AND exe.start_time > '2013-11-05'   
     AND exe.package_name = 'TheSSISPackageParent.dtsx'  
 ORDER BY start_time  
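
If your catalog retains per-executable statistics, a companion query against catalog.executable_statistics can report durations directly instead of extrapolating from event messages.  This is a minimal sketch from memory - the view and column names (execution_duration is in milliseconds) are worth verifying against your SSISDB build.

 SELECT exe.folder_name, exe.project_name, exe.package_name AS master_package,
        es.execution_path, es.start_time, es.end_time,
        es.execution_duration / 60000.0 AS MinDur  -- execution_duration is reported in milliseconds
 FROM ssisdb.catalog.executions exe
     INNER JOIN ssisdb.catalog.executable_statistics es ON exe.execution_id = es.execution_id
 WHERE exe.project_name = 'SSISProjectName'
     AND exe.start_time > '2013-11-05'
     AND exe.package_name = 'TheSSISPackageParent.dtsx'
 ORDER BY es.execution_duration DESC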


Friday, October 25, 2013

A year in review.

A co-worker decided to assign individual blog topics to each of us recently, presumably to create peer pressure for blogging.

My topic:
  • 1 year recap at Trek

I will sum up this past year.  I do not hate Mondays.  #booyah

.................

Trek is a bike company.  I had not really ridden in 10 years, so I got a new bike and rode close to 430 miles this summer.  That is what some crazy coworkers do in a week, but I was happy with the miles for my first year getting back in the groove, especially with a whole lot of uncooperative weather to start the riding season.

The company is based out of Waterloo, WI (near Madison).  I live about 4 hours away and see the home office every other month.  The remainder of the time I commute to work in about 30 seconds, with a quiet office overlooking my garden, at least one yellow lab laying at my feet and my own coffee.  I built a stand-up desk (my posture is horrible sitting), so I work standing up most of the time and love it.  Comparing this to my previous employment is pointless; there is no comparison (1.5-2 hours on the road, mediocre coffee, small cubicles, low walls, noise, politics, wearing dress clothes, etc.).

Trek does Agile, in the sense that we have sprints and stories and deliver every 2 weeks.  For those who ask "how do I know this person is working at home?", you simply let them participate in planning the work and setting velocity; if the work is consistently not complete, it's pretty obvious they should go work in a big corporation that does waterfall, where they can hide for months or years at a time.

The main reason for me to switch jobs was boredom and opportunity: I was bored in my previous position, and BI seemed to have all the cool toys and new tech of the day (minus Hekaton, which is really interesting on many levels).  Trek is the antithesis of boredom and old technology; within 2 weeks of my first day I was working in SSIS on Azure IaaS.  The team has been on SQL 2012 for well over a year, and we have several tabular cubes in production on top of the several MDM cubes already there.  Sharepoint 2013 is running, and all the Power_* technologies are being used, abused or at least studied for feasibility right now.  This new SSIS feature is going to be lots of fun to work with as Power Query becomes more prevalent.

The last thing I want to brag about is my co-workers.  I work on a relatively small team of ~10 folks, most of whom have blogs, half of whom speak at user groups and SQL Saturdays, and all of whom are good at what they do.  I highly recommend working with people who are smarter than you; it's challenging, but your skills improve exponentially faster.  I should also mention that I got to attend the PASS Summit 2013 with several of my co-workers after only being with the company for a year.  It was my second Summit, and it is a great opportunity to really learn new skills and talk with the brightest minds in the business.

So in summation, I took a risk switching careers, took a few steps down the career ladder and got back to learning again.  It has been a great ride.


Might as well throw some goals for year two in.
  • Get my head around how SSAS cubes work - so I can logically solve MDX problems without resorting to guesswork.
  • Ride 1000 miles in 2014 and ride in the Trek 100, 36 mile route
  • Get better at testing and validating my own code.
  • Improve my performance tuning skills in the data warehouse arena
  • Speak at a technical event on a non-Powershell topic
  • Pick a technology and immerse myself in it; some neat possibilities are Power Query and "Q and A" (my bet is it's going to be called Power Bing or Power Google eventually).

Tuesday, August 6, 2013

Saving an Excel workbook using Excel Services and Powershell on Sharepoint 2013

Welcome, I can't imagine too many scenarios where someone would want to open and "save as" an Excel file in Sharepoint 2013 with Excel Services, but if you do, here is how I implemented it via Powershell.  This was strictly a trial attempt at work, so it's all pretty straight hard coding.

The long and short of it: SSRS was dropping an Excel file on Sharepoint, and Power[Pivot or whatever the new name is] didn't recognize the Excel file, so one thought was simply to use Excel Services to open it and do a "Save as" to make Sharepoint and Power_* realize it really was an Excel file.

This short script connects to the Excel Services web service, opens the workbook, does a SaveWorkbookCopy and closes.  (As of now, the documentation for SaveWorkbookCopy seems to differ from the implementation.)  One thing I noticed is how quick this is considering the size of some of the workbooks - so it must be doing the work completely on the web service.  I could be wrong, but that is how it "seems".


 $URI = "https://your.sharepoint.url/site/_vti_bin/excelservice.asmx"  
 $proxy = New-WebServiceProxy -Uri $URI -UseDefaultCredential  
 $culture = "en-US"  
 $wkbook = "https://your.sharepoint.url/site/UnrecognizeExcel.xlsx";  
 $newwkbook = "https://your.sharepoint.url/site/RecognizedExcel.xlsx";  
 $wsStatus = $null  
 $sessionId = $proxy.OpenWorkbook($wkbook,$culture,$culture,[ref]$wsStatus)  
 $sessionId  
 $proxy.SaveWorkbookCopy($sessionid,$newwkbook,"FullWorkbook","AllowOverwrite")  
 $proxy.CloseWorkbook($sessionId);  
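
If the save appears to do nothing, the status array that OpenWorkbook returns by reference is the first place to look.  Here is a minimal, hedged addition - it just dumps whatever the service reported, without assuming specific property names on the status objects:

 if ($wsStatus -ne $null -and @($wsStatus).Count -gt 0)
 {
   # Dump anything Excel Services reported back from OpenWorkbook
   $wsStatus | Format-List *
 }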



Sunday, April 7, 2013

Speaking

Just returned home from my sixth SQL Saturday speaking engagement on Powershell and SQL Server.  Had a great time in Madison this year, visiting with various people I have met over the years and visiting with lots of PASS MN folks that I rarely see now that I am telecommuting.

My session, A Powershell Driven Life (scripts available), was well attended again and follows the theme I have used the past three SQL Saturdays: using pictures of various beers, etc. as my lead-in to each script.  The demos all worked perfectly (something I noticed a few speakers really struggled with) and I found myself going much slower than I intended to - getting through even fewer demos than what I had figured was my minimum based on trial runs.  I don't know how the great speakers can predict timing so effectively.  One thing that surprised me was the sheer number of people new to Powershell that came to my "intermediate" session.  I tried to slow down and explain a few concepts that tripped me up when I was learning.  The written feedback was excellent, though my numerical scores were lower than usual.  Personally, the feedback is more useful than the arbitrary grading scale.  (Or that is what someone with lots of 3's and 4's would say :) )

The thing I thought about this morning is the sheer amount of commitment that is required from the organizers and speakers of these events.  My presentation code was basically already done, because it was developed "in the line of duty".  I spent an inordinate amount of time rehearsing - and hoping to develop some comedic relief as I went along.  Speaking of comedic relief, my coworker @PhoneticTalk did a fantastic job of intermixing humor and content in his Power View session.  In developing this presentation I did not track my hours, but if I were consulting it would be in the $1000's of billable time.  In talking with some fellow speakers, I still marvel at why people are willing to travel, finance their own hotels, etc. and spend inordinate amounts of time speaking.  So, I will share briefly what speaking did for my career.

My current boss and I met when I was speaking at a user group.  That occurred years before I got the opportunity to work with them, so I have written off the tons of hours spent preparing and giving presentations as worth it.  I am one of the lucky people with a challenging job on a great team of smart people who are fun to work with.  Not to mention, I am learning loads of new technologies and getting real-world experience working with Analysis Services after ~11 years on the OLTP side.  In addition, I know some really smart people scattered around the region and, if needed, could probably reach out to them for help when I am in a bind.

So, I may not speak again for a while.  I enjoy Powershell talks, but I am not sure where to take it from here - and it's time to let some other folks jump in.  Based on the number of folks that haven't used Powershell, I may be inclined to work on an introduction course if one is not offered next year.

If for some reason you read this and have thought about speaking but haven't, give it a try - a great opportunity might be sitting in the audience.







Tuesday, March 26, 2013

The eye pleasing data alert

There comes a time in any developer's life when they must write some code that emails folks about data problems.  Regardless of the system or the cause of the data problem, an alert must be sent.  In my days of being a DBA, the alert would generally be a text file or just plain text dumped into the body of an email.  This worked fine for me for several years, as I was generally the primary consumer of such alerts.  Those days are over.

Enter working in business intelligence.  Now the data matters, and so does the format it is delivered in.  The learning curve for me has been steep; I have always been more focused on content than aesthetics.  Now both are a requirement.  Initially I thought this was silly, but I am starting to understand the importance of things being right and looking good.  Alerts, reports, etc. will be used and paid attention to when they have the proper coloring and gradients to express what the data is saying.  Just dumping numbers to the screen will not suffice anymore.  In addition, the obvious mobile device usage in this day and age dictates that the alerts look good on a mobile phone as well.

The process is brute-force HTML construction, but it yields a pretty nice looking alert and does not require outright looping or much code.  Essentially, you construct a style sheet and then use a few TSQL tricks to generate the data set and create ladder striping with the ROW_NUMBER() function (yet another term I was unfamiliar with until BI).  I would not call the process graceful, but it passes the eye test with the very picky data consumers I have :)



-- NOTE: the HTML literals were stripped when this post was originally published;
-- the markup below is a reconstruction of the intent (style sheet, header row,
-- and r1/r2 ladder striping), not the exact original.
DECLARE @Alert NVARCHAR(MAX) =
N'<style type="text/css">
   table {border-collapse:collapse; font-family:Calibri,Arial,sans-serif;}
   th {background-color:#4f81bd; color:#ffffff; padding:4px;}
   td {padding:4px; border:1px solid #cccccc;}
   .r1 {background-color:#ffffff;}
   .r2 {background-color:#dce6f1;}
</style>' +
N'<h3>Data Alert Title</h3>' +
N'<table><tr><th>Order Number</th><th>Line</th><th>Business Unit</th><th>Document</th><th>Document Type</th><th>Company</th><th>Customer</th></tr>'

SELECT @Alert = COALESCE(@Alert,'') +
    '<tr class="' + CASE WHEN (ROW_NUMBER() OVER (ORDER BY [Date] DESC, Document DESC, OrderNumber DESC)) % 2 = 0 THEN 'r2' ELSE 'r1' END + '">' +
    '<td>' + ISNULL(OrderNumber,'') + '</td>' +
    '<td>' + ISNULL(CAST(LineNumber AS VARCHAR(10)),'') + '</td>' +
    '<td>' + ISNULL(BusinessUnit,'') + '</td>' +
    '<td>' + ISNULL(CAST(Document AS VARCHAR(24)),'') + '</td>' +
    '<td>' + ISNULL(CAST(DocType AS VARCHAR(24)),'') + '</td>' +
    '<td>' + ISNULL(Company,'') + '</td>' +
    '<td>' + ISNULL(CustomerNumber,'') + '</td></tr>'
FROM dbo.MyDataTable
WHERE alertProblem = 1

DECLARE @rc INT = @@ROWCOUNT

SELECT @Alert += N'</table>';

IF (@rc > 0 AND @Alert IS NOT NULL)
BEGIN
    DECLARE @MailProfile VARCHAR(255) = 'mailprofile'
    DECLARE @subj VARCHAR(128) = @@SERVERNAME + ': DataAlertName'
    DECLARE @FromAddress VARCHAR(128) = 'alerts@example.com'   -- placeholder address
    DECLARE @email VARCHAR(255) = 'datafolks@example.com'      -- placeholder recipient list

    EXEC msdb.dbo.sp_send_dbmail
        @from_address = @FromAddress
        ,@reply_to = @FromAddress
        ,@recipients = @email
        ,@body = @Alert
        ,@subject = @subj
        ,@profile_name = @MailProfile
        ,@body_format = 'HTML';
END
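
Before wiring it up to Database Mail, it helps to eyeball the generated markup; a tiny hedged addition - select the variable and paste the result into an .html file to preview it in a browser:

-- Quick visual check: copy the output of this SELECT into a .html file and open it in a browser
SELECT @Alert AS AlertHtml;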


Looking forward to trying one of these out for fixing fences and hauling rocks this spring.


Tuesday, March 19, 2013

Solving the Space Wars via Powershell

The constant battle as the consumption of data grows is storage space.  When a disk space alert goes off, you generally jump on the server over RDP and start hunting through the drives for whatever caused the growth to hit the "wake me up" limit.

This has been a recurring event in my career for a number of years, but it seems to be happening more and more often.  As my career has progressed, I have created more and more monitoring for the various things that ultimately cause outages, but this piece was never put together.  Given the simplicity of this script, I wish I had done it when my primary job was being a DBA.  It sure would have helped on our development and QA servers with finding which data file grew out of sight, or which log file had gotten carried away.

So here it is, a pretty basic thing that just dumps the information to the out-gridview control.  This could be shortened and condensed, but that would sacrifice readability, and that is more important :)

Note: You may need a second variable for the SQL instance name if it is not the default (see the tweak after the script).
 $Server = "DBServer"  
 [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")  
 
 #Load up the SMO for the server      
 $sqlServer = New-Object("Microsoft.SqlServer.Management.Smo.Server") $Server  
 
 # Grab list of drives - this should get us mount points as well as regular drives (not tested)  
 [Array]$Drives = Get-WmiObject -ComputerName $server -Class Win32_Volume |   
     Select -Property DriveLetter, Label, @{Name='FreeSpaceMB';Expression={$_.FreeSpace/1MB} } |
     Where-Object{$_.DriveLetter -ne $null}   

 # Grab the data/index files for each database  
 $dbFiles = $sqlServer.Databases |   
   Select-Object FileGroups -ExpandProperty FileGroups |  
   Select-Object Files -ExpandProperty Files |  
   Select FileName,Name,UsedSpace,Size  

 # Grab the log Files for each database    
 $dbFiles += $sqlServer.Databases |   
   Select-Object LogFiles -ExpandProperty LogFiles |  
   Select FileName,Name,UsedSpace,Size  

 #Container to hold the information in    
 [Array]$diskSummary=@()  

 #Loop through each drive ordered by free space on the drive  
 foreach($drive in $drives | Sort-Object FreeSpaceMB )  
 {  
   Write-Host $drive.DriveLetter  
   Write-Host $drive.Label   
   #Grab the size and used space for each database file on the current drive  
   $size = ($dbFiles | Where-Object{$_.FileName.StartsWith($drive.DriveLetter)} | Measure-Object Size -sum).Sum   
   $used = ($dbFiles | Where-Object{$_.FileName.StartsWith($drive.DriveLetter)} | Measure-Object UsedSpace -sum).Sum      
   #Add that information to the array for output later  
   $diskSummary += $drive | Select DriveLetter,Label,FreeSpaceMb,@{Label="DBFileSizeMB";Expression={$size/1024}},@{Label="DBFileUsedMB";Expression={$used/1024}}  
 }    
 #Show the files and disk information  
 $dbFiles | Sort-Object Size -Descending | Out-GridView -Title "DB Files"
 $diskSummary | out-gridview -Title "Disk Summary"  
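
One hedged tweak for the note above the script: SMO wants the full instance name while Get-WmiObject only wants the machine name, so a second variable keeps both calls happy (the instance name here is made up):

 $Server = "DBServer"          # machine name - still what the Get-WmiObject call needs
 $InstanceName = "INST01"      # hypothetical named instance
 $sqlServer = New-Object ("Microsoft.SqlServer.Management.Smo.Server") "$Server\$InstanceName"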

So those are the basics to get you started.

I will be speaking on this and a lot more bright and early on April 6th, 2013 at SQL Saturday 206 in Madison, WI.

Hope to see you there.

Thursday, March 14, 2013

Telecommuting, the beginning is ahead of us


Fascinating stuff has been written lately about telecommuting.  Yahoo is pulling in their telecommuters, and in the wake of it Scott Hanselman wrote a really accurate piece on the matter (my opinion, based on 4.5 months of actually doing it).  Thomas Larock wrote about it as well.  It is hard to know what the real motivation is at Yahoo, but the key public component comes down to accountability.

Or simply put.

Can the company see that you are doing your job and can you prove you are doing it well? 

In thinking about how it works for the team I am on, I can safely say that delivering cubes, reports, etc. every 2 weeks helps in terms of visibility to the business, and the stories completed/points burned show that individuals are doing their part.

Here is how the team setup unfolds:
  • The bulk of my team, including the leads, are remote.  The folks that live near the office work remote a fair amount of the time.  I think that addresses the workplace jealousy aspect (or at least I presume it does).
  • We use Yammer, Lync, IP phones and do Agile so everyone has at least 5 minutes of vocal contact per day and multiple ways to converse, badger each other, support, whatever. 
On the personal side:
  • I have been in IT for 12 years as a "senior" level SQL DBA, so I have been in the industry awhile.
  • My previous jobs have allowed some degree of telecommuting, but never any extensive amounts.
  • My career path and presentations prove to some degree that I am "driven".
  • I require very little social interaction, thus the need for daily office interaction is less important.
With all that said - here is what I have experienced/noticed in my first few months on the job in no real order.
  1. I work too much.  I am just plain not used to being the weak spot on a team, so it's hard to not keep going, especially during these miserable winter days and nights.  Not to mention, sometimes what I am working on is interesting enough that I just keep working on it.
  2. Communication can be hard.  Nearly every single time a phone call, IM or screen share will solve it, but I think sometimes being in person would be easier.
  3. I get monumentally more work done in a day working from home.  I don't have to listen to a loud salesman ("we can help align your vertical markets with value added synergies") or the bizarre lives of nearby cube mates.  I have become somewhat blissfully ignorant of the outside world - probably not a long term thing, but it has sure allowed me to think about things that do matter, like fishing trips with the kids, foraging for mushrooms and the impending maple syrup season :)
  4. Daylight - I have been able to spend more lunch hours hiking with my dog (w or w/o a gun) than I ever have during the week.  I live in Wisconsin, daylight is short in the winter, and commuting chewed up an hour and a half of my day.  Lunch hour hikes are the norm now, to the point my dog comes in and reminds me.  At this rate I might be in decent shape this year.
  5. Sunday nights used to be a source of dread for me: the long drive in on Monday, the traffic, snow storms, rain, crashes, etc.  There has not been a Sunday since I started telecommuting where I hated what Monday brought.  That may be because my job is challenging and interesting too, but I know I hated the commute more than the work all along.
  6. Kids.  I have been to 4 school functions in the past few months, which might total the amount I had seen the previous few years.  I even wait for the bus with my son occasionally.  Some mornings they will get up excited and I get to be there to talk about fishing, ninjas, bugs or video games.
  7. Internet speed.  I live in the country, and I have good internet speeds, but not great.  At times this has become a bit of an issue as doing screen sharing where I am the host can be problematic.  But that is about the only real problem I have had.
  8. Leaving your work in the home office.  There is always talk about never being able to leave your work when it's at home, but I don't recall that ever being the case in my neck of IT; we were working after we got home then too.  You just don't have to haul a laptop home with you.
  9. I make fun of the "green" movement, but my family actually lives it - meaning a huge garden, lots of raising, harvesting and foraging of our food, and all the other subtle things we can do to lessen our impact on the environment.  Even though I carpooled into work as much as I could, the number of miles I put on my car has gone down dramatically.  Not to mention I utilize a space in my house that would be heated regardless, and I found a really cool, highly efficient lamp that I use for light (when the sun isn't out).  These are all things my workplace would have to provide for someone who is there 9 hours out of 24 during the week.
  10. While I am thinking about it from the business side, I think a telecommuting worker is willing to work for less money.  Some might not, but for someone like me who lives out in the country, that commute was a killer in many senses of the word.  I was willing to give up money for life (and a really good opportunity).
There is more really, but that is a good place to stop.  I am already looking forward to a noon time hike to a nice waterfall with all this new snow on the ground and 34 degree temperatures.



My office, if I lived on a lake :)


On that note, I better get back to work.

Monday, February 25, 2013

Moving SSIS Projects in SQL Server 2012

How do you move SSIS packages from one server to another when using the SSIS Catalog on SQL Server 2012?

Before I start, here are the parameters of the post:
  1. We are using the "Integration Services Catalog" on SQL Server 2012
  2. We deploy from BIDS to staging and production - no "ispac" files
  3. We use the project deployment method

When confronted with this task recently, I immediately went off to www.google.com and started searching, but I was unable to find anything for my exact situation.  There were several posts that together helped get me started, including this one by Matt Masson.  The complexity was primarily around the fact that I could not figure out how to move SSIS packages without an "ispac" file.  This script is largely courtesy of the post mentioned, but I then utilized a stored procedure from the SSIS catalog, SSISDB.catalog.get_project, to get the project bytes and load them onto the destination server.

Hope this helps someone out in the future.  I would guess there is a method that doesn't require querying the catalog to get the project bytes, but here is the script I came up with.



 $SourceServer = "SourceServer"  
 $DestinationServer = "DestServer"  
 $ProjectFolder = "ProjectFolderName"  
 $ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"       
 [Reflection.Assembly]::Load("$ISNamespace,Version=11.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91") | out-null      
 Write-Host "Connecting to server ..."       
 # Create a connection to the server       
 $constr = "Data Source=$SourceServer;Initial Catalog=master;Integrated Security=SSPI;"       
 $destconstr = "Data Source=$DestinationServer;Initial Catalog=master;Integrated Security=SSPI;"  
 $con = New-Object System.Data.SqlClient.SqlConnection $constr       
 $destcon = New-Object System.Data.SqlClient.SqlConnection $destconstr       
 # Create the Integration Services object       
 $ssis = New-Object $ISNamespace".IntegrationServices" $con   
 $destssis = New-Object $ISNamespace".IntegrationServices" $destcon   
 #Grab the SSISDB catalog  
 $catalog = $ssis.Catalogs | Where-Object{$_.Name -eq "SSISDB"}  
 $destcatalog = $destssis.Catalogs | Where-Object{$_.Name -eq "SSISDB"}  
 #Grab the folder where the project lives on the source and destination
 $curFolder = $catalog.Folders | Where{$_.Name -eq $ProjectFolder}  
 $destFolder = $destcatalog.Folders | Where{$_.Name -eq $ProjectFolder}  
 # If the destination folder is missing - create it  
 if($destFolder -eq $null)  
 {  
   $destFolder = New-Object $ISNamespace".CatalogFolder" ($destCatalog, $curFolder.Name, $curFolder.Description)       
   $destFolder.Create()       
 }  
 #This deploys everything under the projects  
 Write-Host "Starting Project Deployments for " $curFolder.Name  
 foreach($proj in $curFolder.Projects)  
 {  
   if($con.State -eq "Closed") {$con.Open()};  
   $cmd = New-Object System.Data.SqlClient.SqlCommand  
   $cmd.CommandType = "StoredProcedure"  
   $cmd.connection = $con  
   $cmd.CommandText = "SSISDB.catalog.get_project"  
   $cmd.Parameters.Add("@folder_name",$curFolder.Name) | out-null;  
   $cmd.Parameters.Add("@project_name",$proj.Name) | out-null;  
   [byte[]]$results = $cmd.ExecuteScalar();  
   if($results -ne $null) {  
     $deployedProject = $destFolder.DeployProject($proj.Name,$results)  
     Write-Host $proj.Name $deployedProject.Status  
   }  
   else {  
     Write-Host $proj.Name " failed - no results from Get_Project"  
   }  
 }  
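
Since catalog.get_project hands back the same bytes that make up an .ispac, a small optional addition is to write them to disk as a backup before deploying.  This hedged sketch would sit inside the foreach loop, right after the ExecuteScalar call (the path is just an example):

   # Optional: keep a copy of each project as an .ispac before deploying it
   $backupPath = "C:\Temp\{0}_{1}.ispac" -f $curFolder.Name, $proj.Name  # example path
   [System.IO.File]::WriteAllBytes($backupPath, $results)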

If you are looking to learn more, I will be presenting on this and many other similar scripts at SQL Saturday 206 in Madison, WI.  Here is a map of the campus.

Thursday, February 14, 2013

Rye Whiskey and an SSRS Inventory Report

In my previous post, Moving SSRS 2012 with Powershell and Bourbon, I covered how to change a data source for any or all of the reports on the server.  That worked out so well, I poured myself a small taste of the brand new Rye Whiskey from http://www.45thparallelspirits.com/ and celebrated.



The next morning I was talking through the results with our SSRS maven @Equerystrian, and she requested that each report, when it had a shared data source, have the data source path listed next to the report.  I started thinking about modifying the script that changes data sources, but decided I would rather have a script that is just an inventory of SSRS reports and data sources.  The basics are the same for both, but this version is much cleaner for its purpose.

The CSV file created lists the report path and name, connection string, type of data source and the path to the shared data source.

 CLS  
 $SSRS2012Server = "SSRSServer"   #SSRS Server  
 $ReportPathMatch = "*" # To specify certain reports, * for all
 $file = "C:\Users\AleAsylumMakesGoodBeer\Production_SSRS_Inventory.csv"

 #URL to the web service  
 $SSRSurl = "http://{0}/ReportServer/ReportService2010.asmx" -f $SSRS2012Server  
 # Get a handle on the service  
 $SSRS = New-WebServiceProxy -uri $SSRSurl -UseDefaultCredential;  
 # Get a collection of the reports on the server  
 $itemlist = $SSRS.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "Report" -and $_.Path -like $ReportPathMatch -and $_.Path -notlike "/Users Folders*"} ;  
 Write-Host "Reports to review" $itemlist.Count  
 #Creating a report tracker  
 $Reports =@()  
 #Creates a report tracking object  
 function Add-Reports  
 {  
   param([string]$reportPath, [string]$connString, [string]$dataSourceType , [string]$dataSourcePath)  
   $report = New-Object System.Object  
   $report | Add-Member -type NoteProperty -name "Report" -value $reportPath  
   $report | Add-Member -type NoteProperty -name "Connection" -value $connString  
   $report | Add-Member -type NoteProperty -name "DataSourceType" -value $dataSourceType  
   $report | Add-Member -type NoteProperty -name "DataSourcePath" -value $dataSourcePath  
   return $report   
 }  
 #Loop Through Each Report  
 Foreach($item in $itemlist)  
 {  
   Write-Host "Checking " $item.Path  
   try  
   {  
     $dsList = $null #clearing it out explicitly  
     #Any given report may have multiple data sources, grab the collection  
     $dsList = $ssrs.GetItemDataSources($item.Path)  
     #Loop through each DS in the report  
     for($i=0;$i -le $dsList.Length-1;$i++)  
     {  
       $ds = $dsList[$i];  
       #If the DS has a reference it would appear to be a shared datasource  
       if($ds.Item.Reference -ne $null)  
       {  
         $dataSource = $ssrs.GetDataSourceContents($ds.Item.Reference);  
         $Reports += Add-Reports -reportPath $item.Path -connString $dataSource.ConnectString -dataSourceType "Shared" -dataSourcePath $ds.Item.Reference  
       }  
       else #embedded datasource  
       {  
         $dataSource = $ds.Item  
         $Reports += Add-Reports -reportPath $item.Path -connString $dataSource.ConnectString -dataSourceType "Embedded" -dataSourcePath ""  
       }  
     } #for each datasource   
   }  
   catch{  
     Write-Host "Error on " $item.Path $Error[0]  
     $Error.Clear();  
   }  
 } # Foreach Report  
 $Reports | Export-Csv $file -Force  
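
Once the CSV exists, it is easy to slice with Import-Csv.  A small hedged example (the server name is made up) that lists only reports whose embedded data sources still point at a particular server:

 # Find reports with embedded data sources pointing at a specific server (server name is hypothetical)
 Import-Csv $file |
   Where-Object { $_.DataSourceType -eq "Embedded" -and $_.Connection -like "*MyOldDataSource*" } |
   Out-GridView -Title "Embedded data sources to review"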

Monday, February 11, 2013

Moving SSRS 2012 with Powershell and Bourbon

Recently, talk came up of splitting a fairly large SSRS instance off onto its own box.  Given that the BI shop at my employer is aggressive and gets things done often, there are lots of reports, and they are used by people daily to make decisions.  The one issue, as I see it, with being aggressive and not having heavy-handed code reviews or dictatorial team leads is that things don't get done the same way each time.  So, in this case, what you have is some reports pointing to localhost, some pointing to the server name and others to the server alias.  In our case, we want to get everything pointing to the alias to make it easier when we move.

Naturally, Powershell came up as the magical elixir to this predicament, even though there may be other viable ways.  My thought was that I would be able to grab someone else's script easily enough and just modify it for our environment, but that didn't work out well.  It appears few people are using SQL Server 2012 and SSRS in native mode (or at least they choose not to blog about it, ask questions on it in forums, etc.).  For one reason or another, Sharepoint seems to be more prominent, and that is where we are going as well, piece by piece.

This script follows the easy road of getting collections into variables and then looping through them; there are much more condensed ways to write it.  I doubt this will work with any other Reporting Services version.

So here is the script - be darn careful, since I haven't kicked the tires on it completely yet.  I have commented out the main damaging part toward the bottom; as of my tests, this script will manage both shared and embedded data sources.  I have commented it pretty heavily, and as the careful eye of @Equerystrian pointed out, I even had lies in my comments.  Which explains why commenting code is bad :)

Note: I made a few edits after not liking how it handled the shared data sources; the process now collects them for each report and then updates a unique list at the end, which is a bit cleaner.

 CLS  
 [Boolean]$DoDamage = $false     #If you wish to update the DataSources  
 $SSRS2012Server = "MySSRSServer"   #SSRS Server  
 $OldSQLServer = "MyOldDataSource"    #Old DataSource Server Name you want to replace  
 $NewSQLServer = "MyNewDataSource"     #New DataSource Server Name   
 $ReportPathMatch = "/Development*" # To specify certain reports, * for all  
 #URL to the web service  
 $SSRSurl = "http://{0}/ReportServer/ReportService2010.asmx" -f $SSRS2012Server  
 # Get a handle on the service  
 $SSRS = New-WebServiceProxy -uri $SSRSurl -UseDefaultCredential;   
 # Get a collection of the reports on the server  
 $itemlist = $SSRS.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "Report" -and $_.Path -like $ReportPathMatch} ;  
 #Creating a report tracker  
 $ReportChanges =@()  
 $SharedDataSources = @(); #Track shared data sources for updating after checking all reports
 #Creates a report tracking object  
 function Add-ReportChanges  
 {  
   param([string]$reportPath, [string]$connString, [string]$dataSourceType , [string]$newConnString)  
   $report = New-Object System.Object  
   $report | Add-Member -type NoteProperty -name "Report" -value $reportPath  
   $report | Add-Member -type NoteProperty -name "Connection" -value $connString  
   $report | Add-Member -type NoteProperty -name "DataSourceType" -value $dataSourceType  
   $report | Add-Member -type NoteProperty -name "NewConnection" -value $newConnString  
   return $report    
 }  
 #Loop Through Each Report  
 Foreach($item in $itemlist)  
 {  
   [Boolean]$DataSourceChanged=$false   
   Write-Host "Checking " $item.Path  
   try  
   {   
     $dsList = $null #clearing it out explicitly  
     #Any given report may have multiple data sources, grab the collection  
     $dsList = $ssrs.GetItemDataSources($item.Path)  
     #Loop through each DS in the report  
     for($i=0;$i -le $dsList.Length-1;$i++)  
     {   
       $ds = $dsList[$i];  
       $newItem = $null  
       $dsType = $null    
       $dataSource = $null  
       $oldConnString = $null  
       #If the DS has a reference it would appear to be a shared datasource  
       if($ds.Item.Reference -ne $null)   
       {  
         $dataSource = $ssrs.GetDataSourceContents($ds.Item.Reference);  
         $dsType = "Shared"  
       }  
       else #embedded datasource  
       {  
         $dataSource = $ds.Item  
         $dsType = "Embedded"  
       }  
       $oldConnString = $dataSource.ConnectString  
        # Old Name Exist in Connection  
       if($oldConnString -ilike "*Data Source=$OldSQLServer*")  
       {  
         # If a Shared Datasource add to list for updating later, if embedded, update it.    
         If($dsType -eq "Shared")   
         {  
           $SharedDataSources += $ds.Item.Reference   
           $ReportChanges += Add-ReportChanges -reportPath $item.Path -connString $oldConnString -dataSourceType $dsType -newConnString "Shared"        
         }   
         else   
         {  
           # Make a Copy of the datasource  
           $newitem = $dataSource  
           # Simple replace of the server name  
           $newItem.ConnectString = $oldConnString -iReplace $OldSQLServer, $NewSQLServer  
           $dsList[$i] = $newItem  
           $DataSourceChanged = $true # Flip this bit so we know to save this DS collection back to SSRS   
           $ReportChanges += Add-ReportChanges -reportPath $item.Path -connString $oldConnString -dataSourceType $dsType -newConnString $newItem.ConnectString  
         }  
       }  
       else  
       {  
         $ReportChanges += Add-ReportChanges -reportPath $item.Path -connString $oldConnString -dataSourceType $dsType -newConnString ""        
       }  
     } #for each datasource    
     # Push the datasource back on the report  
     If($DataSourceChanged)  
     {  
       Write-Host "Updating Datasources for " $Item.Path  
       IF($DoDamage) {$SSRS.SetItemDataSources($Item.Path, $dsList)}  
     }    
   }  
   catch{  
     Write-Host "Error on " $item.Path $Error[0]  
     $Error.Clear();  
   }  
 }  
 #Now get a unique list of the shared data sources and update them  
 Foreach($sh in $SharedDataSources | Sort-Object | Get-Unique)  
 {  
   $ref = $ssrs.GetDataSourceContents($sh);  
   $oldConnString = $ref.ConnectString  
   $ref.ConnectString = $oldConnString -iReplace $OldSQLServer, $NewSQLServer  
   if($DoDamage)   
   {  
     Write-Host "Updating $oldConnString for $sh"  
     $ssrs.SetDataSourceContents($sh,$ref)  
   }  
   else  
   {  
     Write-Host "Would have updated $sh with " $ref.ConnectString  
   }   
 }  
 $ReportChanges | out-gridview  


To close out this post, near my home is a distillery that buys the bulk of their grains from local farmers.  If you are into Vodka, Bourbon or Rye, give http://www.45thparallelspirits.com/ a try.  Here is a picture of my favorite spirit they produce.

Tuesday, February 5, 2013

Slightly Intelligent SSAS Processing with Powershell

Recently, I had a story to do some partitioning in one of our cubes to help performance on data that was loaded once per month and had years of history.  The process is pretty straightforward, and part of the convention was to use the word "Current" at the end of the name of any partition we would want to process after the ETL was complete.  Once you get into partition-specific processing, the XMLA goes from simple to a whole bunch of commands - not complicated, but cumbersome.  The very next sprint brought another measure group and even more dimensions; the setup of the XMLA was not hard, but it was manual and prone to error if you are anything like me.

Quick interruption for a public service announcement: Maple syrup season will hopefully be starting in about one month.  It's a great time to be out in the woods, and the product is pretty good too.



Ok, back to it.  Having worked on some SSAS with Powershell, this ended up being a straightforward way to process the dimensions and any partition with the word "Current" in it.  This really is a template for me, as I want to utilize a naming convention to make this easier across the board, and if a measure group contains no "Current" partitions, I want to process any unprocessed partitions in that measure group.  It's a nice start to what will likely be many more "management" type scripts for automating the mundane parts of working with SSAS.

Here is the base script, free of fluff and error handling.  Remember SQL Agent is really specific about scripts throwing errors (see the small wrapper after the script)....

 $srcSSASName = "MySSASServer"
 $ssasDb = "Brewery"
 $partitionSpecificTerm = "Current";
 [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | out-null
 $source = New-Object Microsoft.AnalysisServices.Server
 $source.Connect($srcSSASName)

 #Grab the SSAS Database handle
 $Db = $source.Databases.GetByName($ssasDb)

 #For the DB, process any dimensions that need it
 foreach($dimension in $Db.Dimensions)
 {
   Write-Host "Processing dimension " $dimension.Name
   #$dimension.Process("ProcessFull");
 }

 #Loop through each cube in the SSAS database
 foreach($cube in $Db.Cubes)
 {
   foreach($measuregroup in $cube.MeasureGroups)
   {
     foreach($partition in $measuregroup.Partitions)
     {
       if($partition.Name -like "*$partitionSpecificTerm*")
       {
         Write-Host "Processing specific partition " $partition.Name
         #$partition.Process("ProcessFull");
       }
       if($partition.State -ne "Processed")
       {
         Write-Host "Processing unprocessed partition " $partition.Name
         #$partition.Process("ProcessFull");
       }
     }
     if($measuregroup.State -ne "Processed")
     {
       Write-Host "Problem with measuregroup " $measuregroup.Name
     }
   }
 }
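
Since SQL Agent only notices Powershell failures that actually surface as terminating errors, a minimal hedged wrapper around the script above (nothing here is SSAS specific) looks something like this:

 $ErrorActionPreference = "Stop"   # make non-terminating errors terminating
 try
 {
   # ... the processing script above goes here ...
 }
 catch
 {
   Write-Error $_    # surfaces the failure in the job step history
   exit 1            # non-zero exit code so the Agent step reports failure
 }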

Wednesday, January 30, 2013

Disabling SSIS properties quickly courtesy of SSISDB

Today I am going to tackle something that came up recently at work - finding the error in an SSIS package and disabling the problematic step so the rest of the processing could continue.  (In a perfect world, the package would compensate for the "unimportant" step, but things happen when you have 100's of packages and release software faster than Java releases updates.)

The problem came up when a small file could not be opened and the step was buried inside a much larger SSIS package, so simply skipping the package was not prudent.

A few solutions came to mind:

  1. Try to figure out the root problem and fix that
  2. Open up BIDS and simply disable the step and run the package in its entirety.
  3. Open up the Job and navigate to the configuration for the step and override the property on the step to disabled.
  4. Fix the issue (change a precedent constraint and a property) and just go to production without a second thought to creating more problems :)
I am a recovering DBA, so I went toward number 1 right out of the gate.  This proved not to be feasible given the importance of the step, the fact it involved people outside the team, etc.  Option 2 was a very real possibility, but I hate doing "out of band" things on production.  Number 3 ended up being the method I went with.  Option 4 was the last resort - a DBA's nightmare, and it turns out it is probably common in small Agile shops.

So, I needed to figure out what to disable.  In this case I found the error message using SSMS, navigating through Integration Services Catalogs -> Project -> Package -> Reports -> All Executions and seeing the error.  I then opened up the project in BIDS to find the path to the task that needed to be disabled (click on the task - there is a property that has the "path").  This tends to take a while to accomplish.  Here is a much quicker way to get there.

Here is the SQL to query SSISDB in SQL 2012.  I have no idea if this works on older SQL versions.

DECLARE @project_name varchar(128) = 'BeersOfWisconsin'
DECLARE @package_name varchar(128) = 'LoadBreweryDim.dtsx'
DECLARE @DaysLookBack int = -7;

SELECT start_time,end_time,package_path, message,execution_path
FROM catalog.executions ex
INNER JOIN catalog.event_messages evt on ex.execution_id = evt.operation_id
WHERE ex.project_name = @project_name
AND ex.package_name = @package_name
AND ex.status = 4
AND message_type >= 100 -- as of now 110 is warning, 120 is Error, 100 is QueryCancel, 130 is TaskFailed, 140 is Diagnostic
AND start_time > DATEADD(Day,@DaysLookBack,CURRENT_TIMESTAMP)
ORDER BY execution_id DESC
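
If you want the message_type codes from the comment above spelled out in the result set, a small hedged variation of the same query adds a CASE over that column (codes taken from the comment - verify against your catalog):

SELECT start_time, end_time, package_path, message, execution_path,
       CASE message_type
            WHEN 100 THEN 'QueryCancel'
            WHEN 110 THEN 'Warning'
            WHEN 120 THEN 'Error'
            WHEN 130 THEN 'TaskFailed'
            WHEN 140 THEN 'Diagnostic'
            ELSE CAST(message_type AS varchar(10))
       END AS message_type_desc
FROM catalog.executions ex
INNER JOIN catalog.event_messages evt ON ex.execution_id = evt.operation_id
WHERE ex.project_name = @project_name
AND ex.package_name = @package_name
AND ex.status = 4 -- failed executions only
AND message_type >= 100
AND start_time > DATEADD(DAY, @DaysLookBack, CURRENT_TIMESTAMP)
ORDER BY execution_id DESC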

Most of that is for information, but the last column is an execution_path that in my case yielded exactly what I needed (the container portion of the path below).

\BeersOfWisconsin\DimBreweryLoad\DimDistributor\DimDistributor - ReadFile
Doing a little surgery to this - I end up with what I need to change in the Configuration Property Override path.  

\Package\BeersOfWisconsin\DimBreweryLoad\DimDistributor.Properties[Disable]

Now simply set the Property Value = 1 and save the job and rerun.  

Note:  I have only done this as a back tracking exercise to solve the problem faster in the future, I am not sure if this method will work with all packages and container structures.

Remember to address the root cause and remove the property!

Saw this image the other day, some days they go hand in hand :)





Sunday, January 20, 2013

Powershell and a poor man's math tutor

Math.  It is an important thing - hard to realize when you are 6 and learning to add, subtract and multiply, but as parents we know it's important and want our kids to do better.  To keep math interesting, I have made it increasingly harder and tried to add complexity and different delivery methods all the time; sometimes it's adding coins or using prices in the grocery store with scenarios to keep them learning.  This weekend I whipped up an Excel spreadsheet where my son could type in answers and see right away whether they were correct.  He liked being able to use the computer for this, so today I decided to whip up a quick program that would let him choose a symbol and the size of the numbers if he wanted.  It is pretty basic and could use ShowUI, but it certainly might keep him interested for a few nights.



 [int]$lowNum = 1        # comment out these three defaults to be prompted for them instead
 [int]$highNum = 99
 [string]$symbol = "+"
 if($lowNum -eq $null){ $lowNum = Read-Host "Please enter low number for equations"}
 if($highNum -eq $null) { $highNum = Read-Host "Please enter high number for equations"}
 if($symbol -eq $null) {$symbol = Read-Host "Please enter the symbol for the equations"}
 do
 {
   $rand = New-Object system.random
   $first = $rand.next($lowNum,$highNum)
   $last = $rand.next($lowNum,$highNum)
   #write-host $first " " $last    # debug output
   [string]$ansString = "What is {0} {1} {2}?" -f $first,$symbol,$last
   [int]$answer = Read-Host $ansString
   [int]$correctAnswer = Switch($symbol)
   {
     "+" {$first + $last}
     "-" {$first - $last}
     "*" {$first * $last}
     "/" {$first / $last}   # note: the [int] cast rounds division results
     default{0}
   }
   #write-host "correct" $correctAnswer " " $answer    # debug output
   If($correctAnswer -eq $answer)
   {
     $cont = Read-Host "Correct. Do you want to continue? (y/n)"
   }
   else
   {
     $cont = Read-Host "Incorrect. The answer is $correctAnswer. Do you want to continue? (y/n)"
   }
 }
 while($cont -eq "y")


Once their math is in good order, you can head to the woods to work on their tracking skills.


Friday, January 11, 2013

Setting up a SSAS Sync with Powershell

In the SSAS world, one method to separate processing and presentation is a two server setup.  The "processing" server is where all the heavy lifting is done, and a secondary server is where the cubes are hit by end users and reports.  Given my newbie status in the BI space I am guessing, but I am pretty sure that XMLA and SSIS tasks are the common methods for syncing a cube from one server to another.  A side benefit of the script I am about to share is that it is really easy to move a cube from one server to another for any reason - plug in the values and hit the button.

So, a picture is worth a thousand words (this one probably doesn't qualify as I did it with Paint), but here is the general architecture.

Alright, first we will need a function to see if SSAS is actually running (with retry logic) - this is used throughout.


 function Verify-SSASConnection([Parameter(Mandatory=$True,Position=1)]$SSASConnection,
                 [Parameter(Mandatory=$True,Position=2)][string]$SSASName)
 {
   # Callers pass [ref]$server; unwrap it so the rest of the function works with the server object itself
   if($SSASConnection -is [ref]) { $SSASConnection = $SSASConnection.Value }

   if($SSASConnection.Connected) {return;}

   Write-Info "Attempting ReConnect for $SSASName"

   [int]$i = 0;
   while(!($SSASConnection.Connected))
   {
     if($i -gt $retryAttempts) {throw "Alert! $SSASName Unavailable, $cubeName Disconnected"}

     try{
       $SSASConnection.Connect($SSASName)
       }
     catch{
       Write-Info "retry loop for $SSASName - $i - Connected: $($SSASConnection.Connected)"
     }

     if($SSASConnection.Connected) {return;}
     $i++;
     Start-Sleep -Seconds $retryDelay

   } # looping retry
   return;
 }

The steps below can be summarized pretty nicely into this:

  1. Detach Processing Cube and copy the entire cube folder to the Presentation Server (with a _new appended on the name) and then reattach Processing Cube
  2. Detach the Presentation Cube and rename the folder (with a _old)
  3. Swap the Processing Folder to previous Presentation Cube folder name.
  4. Bring the Presentation Cube back online
  5. Party

The script should be saved as a file and simply called via SQL Agent (Operating System Command in this case).  The parameters are the source (processing server), the presentation server, the cube and the mode it should be restored in.


 Param(  
   [Parameter(Mandatory=$True,Position=1)]  
   [alias("SRC")]  
   [string]$srcSSASName ,  
   [Parameter(Mandatory=$True,Position=2)]  
   [alias("DEST")]    
   [string]$destSSASName ,  
   [Parameter(Mandatory=$True,Position=3)]  
   [string]$CubeName ,  
   [Parameter(Mandatory=$True,Position=4)]  
   [ValidateSet("ReadOnly","ReadWrite")]  
   [string]$CubeModeParm    
 )  
 $ErrorActionPreference = "STOP"  
 [int]$retryDelay = 30  
 [int]$retryAttempts = 10  
   
 [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | out-null  
   
 $source = New-Object Microsoft.AnalysisServices.Server  
 $destination = New-Object Microsoft.AnalysisServices.Server  
   
 if($cubeModeParm -eq "ReadOnly"){  
   $cubeMode = 1  
 } else {  
   $cubeMode = 0  
 }    
   
 function Write-Info([string]$msg)  
 {  
   $dt = (Get-Date)  
   if ($host.Name -eq "ConsoleHost"){Write-Host "$msg | $dt"} else {Write-Output "$msg | $dt"}  
 }  


Now grab the server names for the physical copy.  The Regex below was not my first choice, due to the fact I still don't get it completely, but this hombre advocates for it.


 $srcServerName = $srcSSASName -replace '^([\w\d]+).*$', '$1'  
 $destServerName = $destSSASName -replace '^([\w\d]+).*$', '$1'  
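
A quick sanity check of what that replace does, using made-up instance names - it keeps everything up to the first non-alphanumeric character, which is the machine name needed for the UNC paths:

 # Both of these return just the machine name: SSASPROD01
 "SSASPROD01\TABULAR" -replace '^([\w\d]+).*$', '$1'
 "SSASPROD01" -replace '^([\w\d]+).*$', '$1'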

You will notice the Write-Info function in use; it's my hacky way of having the function decide how to output the myriad of debugging steps I have in this script.  It is nice when things aren't working to find the problem.  Now it's time to connect and get the data directories for each instance.


 Write-Info "Starting process to move $cubeName from $srcServerName to $destServerName"  
   
   # Source Connect  
   try {  
      $source.Connect($srcSSASName)  
   }   
   catch {  
     throw "Source SSAS $srcSSASNAME Connect $Error "  
     Exit 1  
   }    
     
   # Destination Connect  
   try {  
      $destination.Connect($destSSASName)  
   }   
   catch {  
     throw "Destination SSAS $destSSASNAME Connect $Error "  
     Exit 1  
   }    
     
   #Fetch current ssas data directories and unc    
   try {  
     $srcDataDir = $source.ServerProperties.Item("DataDir").Value  
     $destDataDir = $destination.ServerProperties.Item("DataDir").Value  
     $uncSrcDataDir = "\\{0}\{1}\" -f $srcServerName, $srcDataDir.Replace(":","$")  
     $uncDestDataDir = "\\{0}\{1}\" -f $destServerName, $destDataDir.Replace(":","$")  
   
     If((test-path -Path $uncSrcDataDir) -eq $false) {throw "Source Data Directory Does Not Exist - $srcServerName , $srcDataDir"}  
     If((test-path -Path $uncDestDataDir) -eq $false) {throw "Destination Data Directory Does Not Exist - $uncdestDataDir"}   
   
   }  
   catch [system.exception] {  
     throw "Directory setup failed - $Error[0]"  
     Exit 1  
   }  


Ok, we have connected to the SSAS servers and gotten the data directories.  The next statement is a bit trickier, as SSAS database folders follow a <DatabaseName>.<revision>.db naming scheme and we need to find the most recent revision to sync.


   $srcFolderName = (gci $uncSrcDataDir | ?{$_.Name -match "$cubeName\." -and $_.psIsContainer}| sort -property LastWriteTime -des | select-object -first 1).Name  
    
If the cube exists on the destination, this code gets those pieces of data set up. 
   

   $destFolderName = (gci $uncDestDataDir | ?{$_.Name -match "$cubeName\." -and $_.psIsContainer}| sort -property LastWriteTime -des | select-object -first 1).Name  
   IF($destFolderName -eq $null)  
   {  
     $destFolderName = $srcFolderName  
     Write-Info -msg "Destination does not exists for cube - using $srcFolderName"  
   }   
   $destLocalFolder = "{0}\{1}" -f $destDataDir,$destFolderName #path to cube to attach for ssas (not UNC)  
     
   #setup vars  
   $uncSrcDataDir = "{0}{1}" -f $uncSrcDataDir,$srcFolderName  
   $baseFolder = "{0}{1}" -f $uncDestDataDir,$destFolderName  
   $newDestFolder = ($uncDestDataDir + $destFolderName + "_new")  
   $oldDestFolder = ($uncDestDataDir + $destFolderName + "_old")  
       
   $srcFolderReAttach = ($baseFolder -replace '\\\\.+?\\(\w)\$','$1:')  
     
   if(test-path -Path $oldDestFolder) # Remove old folder if exists
   {
     Remove-Item -Path $oldDestFolder -Force -Recurse
     Write-Info "Removing $oldDestFolder - still existed!"
   }
   # Double check that the "new copy of the cube" isn't already sitting on the destination
   if(test-path -Path $newDestFolder)   
   {  
     $newestFileDate = (gci -Path $newDestFolder | Sort-Object LastWriteTime -Descending | Select-Object -First 1).LastWriteTime   
     $dateFilter = (Get-Date).AddSeconds(($retryAttempts * $retryDelay) * -1)  
       
     if($newestFileDate -ge $dateFilter) {  
     throw "Existence of $newDestFolder with recent file dates - a Sync is already running"  
     Exit 1;  
     }  
     # wait for one delay - then double check destination connection.  
     Start-Sleep -Seconds $retryDelay   
       
     Verify-SSASConnection ([ref]$destination) ($destSSASName)  
     Remove-Item -Path $newDestFolder -Force -Recurse  
     Write-Info "$newDestFolder Exists! Likely failure on prior run."  
   }  
   
    
Now, separating the wheat from the chaff - detach the source cube so the files can be copied.


   try {  
     Write-Info "Starting Detach of $cubeName"   
     $source.Databases.GetByName($cubeName).Detach();  
   }    
   catch {  
     throw "Detach failed for $cubeName $Error[0]"  
     Exit 1  
   }  
Copy the folder (it will have lots of files, depending on the cube).  Note the finally block: if the copy fails, the source cube will still be brought back online.


   try {    
     Write-Info "Starting copy $newDestFolder "   
     Copy-Item -Path $uncSrcDataDir -Destination $newDestFolder -recurse -force  
   }    
   catch {  
     throw "Copy failed for $cubeName $Error[0] - Attempting reattach on Source"  
     Exit 1  
   }  
   finally  
   {  
     Write-Info "Attaching Source Cube $srcFolderName"  
       
     Verify-SSASConnection ([ref]$source) ($srcSSASName)  
     $source.Attach($srcDataDir + "\" + $srcFolderName)  
     $source.Disconnect();    
   }  
     
   Write-Info "Starting Destination Cube Detach Process $destSSASName and replace with $newDestFolder "  
     
   if(!(test-path -Path $newDestFolder)) {throw "New Folder Copy on $newDestFolder did not succeed - $($error[0])"}

At this point, the copy of the updated cube has been completed and the source (processing) cube is back online for the next process command; we now detach the presentation cube.
    

 try{  
     Verify-SSASConnection ([ref]$destination) ($destSSASName)  
       
     if($destination.Databases.Contains($cubeName))  
     {  
     $destination.Databases.GetByName($cubeName).Detach()  
       Write-Info "Detach of $destSSASName completed"  
     }   
   }    
   catch{  
     throw "Destination Detach Failed for $cubeName - $error[0]"  
   }    
     
   Write-Info "Attempting Folder Swap Main:$baseFolder Old:$oldDestFolder "  
    
Once the detach is complete, we now want to rename the current destination cube folder (append an _old to the name).  Upon failure, the presentation cube is brought back online.


   try{  
     if(test-path -Path $baseFolder)  
     {  
       Write-Info "Renaming Current Destination Folder $destFolderName"  
       Rename-Item -Path $baseFolder -NewName $oldDestFolder -Force   
     }  
       
   }   
   catch {  
     Write-Info "Failure of folder rename $destFolderName - attempting attach of old data"  
       
     Verify-SSASConnection ([ref]$destination) ($destSSASName)  
     $destination.Attach($destLocalFolder,$cubeMode) #1=readonly  
     $destination.Disconnect()     
       
     throw "Folder Swap Failed - $baseFolder. Attempting reAttach - $error[0]"  
   }  
     
   Write-Info "Renaming $newDestFolder to $destFolderName"  


The folder rename can be problematic, so there is a rolling retry around it (in testing, this has only failed once in a while, and usually only for one loop).  Note the finally block again brings the cube back online on failure or success.


   try{  
     [bool]$fileRenameComplete = $false  
     [int]$retries = 0  
     while($fileRenameComplete -eq $false)  
     {  
       try {  
         Rename-Item -Path $newDestFolder -NewName $destFolderName -Force  
         $fileRenameComplete = $true;  
       } catch {  
         Write-Info "Rename Failed for $newDestFolder"  
       }  
       $retries++;  
       IF($retries -ge $retryAttempts)   
       {  
         throw "Retries for $newDestFolder rename Failed!";  
       }     
       Start-Sleep -Milliseconds $retryDelay #very brief delay before retry  
     }  
   }   
   catch {  
     throw "Folder Replace with update failed $newDestFolder - $Error[0] "   
   }   
   finally{  
     Write-Info "Attaching Destination Cube $destLocalFolder"   
       
     Verify-SSASConnection ([ref]$destination) ($destSSASName)  
     $destination.Attach($destLocalFolder,$cubeMode) #1=readonly  
     $destination.Disconnect()     
   }  
   
   Write-Info "Cube attach completed - Starting $oldDestFolder removal "   

So, there is a Powershell version of a cube sync.  The base set of code was borrowed/stolen from a co-worker's implementation; I was charged with adding the retry logic and fault tolerance.

Note: SQL Agent can be a bit painful when working with Powershell, that stuff is documented on the web.  


Wednesday, January 2, 2013

A former DBA's guide to Agile.


"Ninety-nine percent of all software sucks, with Agile we can make it a hundred" (paraphrasing a tweet from Buck Woody some years back)

The past year brought me out of the waterfall software methodology and into working in an agile shop.  I used to make fun of agile, partly because I had never seen it implemented well and partly because I was a DBA in a waterfall shop.  (If for some reason you don't know a DBA: they proceed cautiously, probably have a black leather jacket and generally don't like change.)

So here it is in former DBA writing style - a list.

0. Forget everything you knew about project management, it hardly exists anymore as a stand alone job.  (And C# arrays are zero based, so my list is as well)

1. It is better to be a developer in an Agile shop, a waterfall DBA would go insane.

2. As you get more comfortable with pushing code that has hardly been tested to production, be careful as it starts to creep into your daily life; just the other day I walked out on the ice on the river behind my house to make sure it was safe (open water a few feet away), when my ax was right there to test it first.

3. You have to completely give in to the developers, especially since work that has been promised to the business needs to get delivered; there is no more "control".  In any corporate power struggle the business folks win, and if you hold up their new KPI because the code is insecure or non-performant, you lose.

4. Each unit of work (a story) is generally small enough to be completed in a day or two, and it might not be the finished product.  "You mean this stored procedure is only half written! You are NOT going to production with that!" says a waterfall DBA the morning of a huge deployment every three months.

5. If you ever want to be a remote worker - this is a great way to show management that work is being completed despite the fact they don't get the face time they value so much.

6. Because deliverables are generally small - the screw-ups are usually smaller but more visible; making this useful if you have a vendetta and want to get someone axed, but have a big HR department. :)

7. There seem to be fewer meetings because things need to get done.  A huge win for those of us who lived in meeting-driven development.

8. Everything you do has a "point" value, so if you are a competitor and believe the developer with the most points wins, you get to win many times a year.

9. You will get better at estimating, or you will work more than you want to in a given sprint.  The upside is the entire team gets to put their $.02 in on how complex the work actually is; so the old days of some folks overestimating their work in an effort to leave time to play on Facebook are gone.

10. If a sprint falls during mushroom season, you can just lower your velocity a bit and spend a little more time in the woods :)




I have no classical training in agile, so I am sure the pedantic readers will note I don't know the terminology or really understand what I am talking about; my lone excuse is I am too busy trying to keep up with all the cool new technology and projects I get to work on.