WIDBA

WIDBA - The busiest DBA in Wisconsin, it's all I do and I do it well

Tuesday, November 12, 2013

Getting run time for "child" SSIS packages out of SSISDB

Today I was looking at an SSIS package whose sole job is to call a bunch of other SSIS packages in a specific order (loading a bunch of dimension tables).  Given some recent data growth, I was looking for low-hanging fruit to help shorten our load times.  Interestingly enough, neither SSISDB nor the Integration Services Catalog has an obvious way to view these "inner" SSIS package statistics via the normal channels.  My Google-fu was not working after several tries, so I whipped this up.  Hopefully I hit all the keywords for the next person trying to find something similar.

I am extrapolating the "duration" from the first event message to the last, so keep that in the back of your mind.  It gave me what I needed, which was a place to focus based on run time.

 -- Roll up the run time of each child package from its event messages,
 -- then join back to the parent execution in the SSIS catalog
 SELECT exe.folder_name, exe.project_name, exe.package_name AS master_package, details.*
 FROM ssisdb.catalog.executions exe
     INNER JOIN (
                 SELECT op.operation_id, created_time, object_name, evt.package_name,
                        DATEDIFF(MINUTE, evt.stDate, evt.endDate) AS MinDur
                 FROM SSISDB.[internal].[operations] op
                     INNER JOIN (
                                 -- first and last message per package approximate its duration
                                 SELECT package_name, operation_id,
                                        MIN(message_time) AS stDate, MAX(message_time) AS endDate
                                 FROM ssisdb.catalog.event_messages
                                 GROUP BY package_name, operation_id
                                 ) evt ON op.operation_id = evt.operation_id
                 ) details ON exe.execution_id = details.operation_id
 WHERE exe.project_name = 'SSISProjectName'
     AND exe.start_time > '2013-11-05'
     AND exe.package_name = 'TheSSISPackageParent.dtsx'
 ORDER BY exe.start_time
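If you would rather pull this from the client side, here is a minimal PowerShell sketch along the same lines, assuming the SQLPS module's Invoke-Sqlcmd is available; the server name and filter values below are placeholders to swap for your own.

 # Minimal sketch - server name and filter values are placeholders
 $query = "SELECT evt.package_name, " +
          "       DATEDIFF(MINUTE, MIN(evt.message_time), MAX(evt.message_time)) AS MinDur " +
          "FROM ssisdb.catalog.event_messages evt " +
          "    INNER JOIN ssisdb.catalog.executions exe ON evt.operation_id = exe.execution_id " +
          "WHERE exe.project_name = 'SSISProjectName' " +
          "    AND exe.package_name = 'TheSSISPackageParent.dtsx' " +
          "    AND exe.start_time > '2013-11-05' " +
          "GROUP BY evt.package_name"

 # Slowest child packages float to the top
 Invoke-Sqlcmd -ServerInstance "DBServer" -Database "SSISDB" -Query $query |
     Sort-Object MinDur -Descending |
     Format-Table package_name, MinDur -AutoSize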


Friday, October 25, 2013

A year in review.

A co-worker decided to assign individual blog topics to each of us recently, presumably to create peer pressure for blogging.

My topic:
  • 1 year recap at Trek

I will sum up this past year.  I do not hate Mondays.  #booyah

.................

Trek is a bike company.  I had not really ridden in ten years, so I got a new bike and rode close to 430 miles this summer.  That is what some crazy coworkers do in a week, but I was happy with the miles for my first year getting back in the groove, especially with a whole lot of uncooperative weather to start the riding season.

The company is based out of Waterloo, WI (near Madison).  I live about 4 hours away and see the home office every other month.  The remainder of the time I commute to work in about 30 seconds, to a quiet office overlooking my garden, with at least one yellow lab lying at my feet and my own coffee.  I built a stand-up desk (my posture is horrible sitting), so I work standing up most of the time and love it.  Comparing this to my previous employment is pointless; there is no comparison (1.5-2 hours on the road, mediocre coffee, small cubicles, low walls, noise, politics, wearing dress clothes, etc.).

Trek does Agile, in the sense that we have sprints and stories and deliver every 2 weeks.  For those who ask "how do I know this person is working at home?", you simply allow them to participate in planning the work and setting velocity.  If the work is consistently not complete, it's pretty obvious, and they should go work in a big corporation that does waterfall, where they can hide for months or years at a time.

The main reason for me to switch jobs was boredom and opportunity.  I was bored in my previous position, and BI seemed to have all the cool toys and new tech of the day (minus Hekaton, which is really interesting on many levels).  Trek is the antithesis of boredom and old technology; within 2 weeks of my first day I was working in SSIS on Azure IaaS.  The team has been on SQL 2012 for well over a year, and we have several tabular cubes in production on top of the several multidimensional cubes already there.  SharePoint 2013 is running, and all the Power_* technologies are being used, abused, or at least studied for feasibility right now.  This new SSIS feature is going to be lots of fun to work with as Power Query becomes more prevalent.

The last thing I want to brag about is my co-workers.  I work on a relatively small team of ~10 folks, most of whom have blogs, half of whom speak at user groups and SQL Saturdays, and all of whom are good at what they do.  I highly recommend working with people who are smarter than you; it's challenging, but your skills improve exponentially faster.  I should also mention that I got to attend the PASS Summit 2013 with several of my co-workers after only being with the company for a year.  It was my second Summit, and it is a great opportunity to really learn new skills and talk with the brightest minds in the business.

So in summation, I took a risk switching careers, took a few steps down the career ladder and got back to learning again.  It has been a great ride.


Might as well throw some goals for year two in.
  • Get my head around how SSAS cubes work - so I can logically solve MDX problems without resorting to guesswork.
  • Ride 1,000 miles in 2014 and ride the 36-mile route in the Trek 100.
  • Get better at testing and validating my own code.
  • Improve my performance tuning skills in the data warehouse arena.
  • Speak at a technical event on a non-PowerShell topic.
  • Pick a technology and immerse myself in it; some neat possibilities are Power Query and "Q and A" (my bet is it's going to be called Power Bing or Power Google eventually).

Tuesday, August 6, 2013

Saving an Excel workbook using Excel Services and PowerShell on SharePoint 2013

Welcome.  I can't imagine too many scenarios where someone would want to open and "save as" an Excel file in SharePoint 2013 with Excel Services, but if you do, here is how I implemented it via PowerShell.  This was strictly a trial attempt at work, so it's all pretty straight hard-coding.

The long and short of it: SSRS was dropping an Excel file on SharePoint, and Power[Pivot or whatever the new name is] didn't recognize the Excel file, so one thought was to simply use Excel Services to open it and do a "Save as", to make SharePoint and Power_* realize it really was an Excel file.

This short script connects to the Excel Services web service, opens the workbook, does a SaveWorkbookCopy, and closes.  (As of now, the documentation for SaveWorkbookCopy seems to differ from the implementation.)  One thing I noticed is how quick this is considering the size of some of the workbooks, so it must be doing the work completely on the web service.  I could be wrong, but that is how it "seems".


 $URI = "https://your.sharepoint.url/site/_vti_bin/excelservice.asmx"  
 $proxy = New-WebServiceProxy -Uri $URI -UseDefaultCredential  
 $culture = "en-US"  
 $wkbook = "https://your.sharepoint.url/site/UnrecognizeExcel.xlsx";  
 $newwkbook = "https://your.sharepoint.url/site/RecognizedExcel.xlsx";  
 $wsStatus = $null  
 $sessionId = $proxy.OpenWorkbook($wkbook,$culture,$culture,[ref]$wsStatus)  
 $sessionId  
 $proxy.SaveWorkbookCopy($sessionid,$newwkbook,"FullWorkbook","AllowOverwrite")  
 $proxy.CloseWorkbook($sessionId);  
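Since the documentation for SaveWorkbookCopy did not quite match the implementation for me, the generated proxy itself is a handy source of truth.  A small sketch using the built-in Get-Member to dump the actual method signatures (nothing here is specific to Excel Services):

 # Ask the generated proxy what SaveWorkbookCopy actually expects
 $proxy | Get-Member -MemberType Method -Name SaveWorkbookCopy | Format-List

 # Or list every method the service exposes, with full signatures
 $proxy | Get-Member -MemberType Method | Select-Object Name, Definition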



Sunday, April 7, 2013

Speaking

Just returned home from my sixth SQL Saturday speaking engagement on PowerShell and SQL Server.  Had a great time in Madison this year, visiting with various people I have met over the years and with lots of PASS MN folks that I rarely see now that I am telecommuting.

My session, A Powershell Driven Life (scripts available), was well attended again and follows the theme I have used the past three SQL Saturdays: using pictures of various beers, etc. as my lead-in to each script.  The demos all worked perfectly (something I noticed a few speakers really struggled with), and I found myself going much slower than I intended to, getting through even fewer demos than I had figured was my minimum based on trial runs.  I don't know how the great speakers can predict timing so effectively.  One thing that surprised me was the sheer number of people new to PowerShell that came to my "intermediate" session.  I tried to slow down and explain a few concepts that tripped me up when I was learning.  The feedback was excellent, though my numerical scores were lower than usual.  Personally, the feedback is more useful than the arbitrary grading scale.  (Or that is what someone with lots of 3's and 4's would say :) )

The thing I thought about this morning is the sheer amount of commitment that is required from the organizers and speakers of these events.  My presentation code was basically already done, because it was developed "in the line of duty".  I spent an inordinate amount of time rehearsing, and hoping to develop some comedic relief as I went along.  Speaking of comedic relief, my coworker @PhoneticTalk did a fantastic job of intermixing humor and content in his Power View session.  In developing this presentation I did not track my hours, but if I were consulting, it would be thousands of dollars in billable time.  In talking with some fellow speakers, I still marvel at why people are willing to travel, finance their own hotels, and spend inordinate amounts of time speaking.  So, I will share in brief what speaking did for my career.

My current boss and I met when I was speaking at a user group.  That occurred years before I got the opportunity to work with them, so I have written off the tons of hours spent preparing and giving presentations as worth it.  I am one of the lucky people with a challenging job on a great team of smart people who are fun to work with.  Not to mention, I am learning loads of new technologies and getting real-world experience working with Analysis Services after ~11 years on the OLTP side.  In addition, I know some really smart people scattered around the region and, if needed, could probably reach out to them for help when I am in a bind.

So, I may not speak again for a while.  I enjoy PowerShell talks, but I am not sure where to take it from here, and it's time to let some other folks jump in.  Based on the number of folks that haven't used PowerShell, I may be inclined to work on an introduction course if one is not offered next year.

If for some reason you read this and have thought about speaking but haven't, give it a try; a great opportunity might be sitting in the audience.

Tuesday, March 26, 2013

The eye pleasing data alert

There comes a time in any developer's life when they must write some code that emails folks when there are data problems.  Regardless of the system or the cause of the data problem, an alert must be sent.  In my DBA days, the alert would generally be a text file, or just plain text dumped into the body of an email.  That worked fine for me for several years, as I was generally the primary consumer of such alerts.  Those days are over.

Enter working in business intelligence.  Now the data matters, and so does the format it is delivered in.  The learning curve for me has been steep; I have always been more focused on content than aesthetics.  Now both are a requirement.  Initially I thought this was silly, but I am starting to understand the importance of things being right and looking good.  Alerts, reports, etc. will be used and paid attention to when they have the proper coloring and gradients to express what the data is saying.  Just dumping numbers to the screen will not suffice anymore.  In addition, the obvious mobile device usage in this day and age dictates that the alerts look good on a mobile phone as well.

The process is brute-force HTML construction, but it yields a pretty nice looking alert and does not require outright looping or much code.  Essentially, you construct a style sheet, then use a few TSQL tricks to generate the data set and create ladder striping (alternating row colors) with the ROW_NUMBER() function - yet another term I was unfamiliar with until BI.  I would not call the process graceful, but it passes the eye test with the very picky data consumers I have :).



-- Style sheet for the alert; r1/r2 provide the alternating row colors
DECLARE @Alert NVARCHAR(MAX) = N'<style type="text/css">
    table { border-collapse: collapse; font-family: Calibri, Arial, sans-serif; }
    th    { background-color: #4f81bd; color: #ffffff; padding: 4px 8px; }
    td    { border: 1px solid #d0d0d0; padding: 4px 8px; }
    tr.r1 { background-color: #ffffff; }
    tr.r2 { background-color: #dce6f1; }
</style>' +
N'<h3>Data Alert Title</h3>' +
N'<table><tr><th>Order Number</th><th>Line</th><th>Business Unit</th><th>Document</th>' +
N'<th>Document Type</th><th>Company</th><th>Customer</th></tr>'

-- Build one <tr> per problem row; ROW_NUMBER() % 2 alternates the row class
-- to produce the ladder striping without any explicit looping
SELECT @Alert = COALESCE(@Alert, '') +
    '<tr class="' +
    CASE WHEN (ROW_NUMBER() OVER (ORDER BY [Date] DESC, Document DESC, OrderNumber DESC)) % 2 = 0
         THEN 'r2' ELSE 'r1' END + '">' +
    '<td>' + ISNULL(OrderNumber, '') + '</td>' +
    '<td>' + ISNULL(CAST(LineNumber AS VARCHAR(10)), '') + '</td>' +
    '<td>' + ISNULL(BusinessUnit, '') + '</td>' +
    '<td>' + ISNULL(CAST(Document AS VARCHAR(24)), '') + '</td>' +
    '<td>' + ISNULL(CAST(DocType AS VARCHAR(24)), '') + '</td>' +
    '<td>' + ISNULL(Company, '') + '</td>' +
    '<td>' + ISNULL(CustomerNumber, '') + '</td></tr>'
FROM dbo.MyDataTable
WHERE alertProblem = 1

DECLARE @rc INT = @@ROWCOUNT

SELECT @Alert += N'</table>';

IF (@rc > 0 AND @Alert IS NOT NULL)
BEGIN
    DECLARE @MailProfile VARCHAR(255) = 'mailprofile'
    DECLARE @subj VARCHAR(128) = @@SERVERNAME + ': DataAlertName'
    DECLARE @FromAddress VARCHAR(128) = 'alerts@example.com'       -- placeholder
    DECLARE @email VARCHAR(128) = 'recipients@example.com'         -- placeholder

    EXEC msdb.dbo.sp_send_dbmail
          @from_address = @FromAddress
        , @reply_to = @FromAddress
        , @recipients = @email
        , @body = @Alert
        , @subject = @subj
        , @profile_name = @MailProfile
        , @body_format = 'HTML' ;
END
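If you want to eyeball the HTML while tweaking the style sheet, one option is to dump the assembled @Alert string to a file and open it in a browser instead of mailing yourself over and over.  A rough PowerShell sketch, assuming you expose the HTML with a final SELECT @Alert AS AlertHtml; the server, database, and file paths are placeholders:

 # Pull the assembled HTML out of the alert script and preview it locally
 $html = (Invoke-Sqlcmd -ServerInstance "DBServer" -Database "YourDB" `
             -InputFile "C:\scripts\DataAlert.sql" -MaxCharLength 65535).AlertHtml

 # Write it out and pop it open in the default browser for a quick eye test
 Set-Content -Path "C:\temp\DataAlert.html" -Value $html
 Invoke-Item "C:\temp\DataAlert.html"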


Looking forward to trying one of these out for fixing fences and hauling rocks this spring.


Tuesday, March 19, 2013

Solving the Space Wars via PowerShell

The constant battle as data consumption grows is storage space.  When a disk space alert goes off, you generally jump on the server over RDP and start hunting through the drive for whatever growth tripped the "wake me up" limit.

This has been an occurrence throughout my career, but it seems to be happening more and more often.  As my career has progressed, I have created more and more monitoring for the things that ultimately cause outages, but this piece was never put together.  Given the simplicity of this script, I wish I had written it when my primary job was being a DBA.  It sure would have helped on our development and QA servers for finding which data file grew out of sight, or which log file had gotten carried away.

So here it is, a pretty basic script that just dumps the information to the Out-GridView control.  It could be shortened and condensed, but that would sacrifice readability, and that is more important :)

Note: You may need a second variable for the SQL instance name if it is not the default instance.
 $Server = "DBServer"  
 [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")  
 
 #Load up the SMO for the server      
 $sqlServer = New-Object("Microsoft.SqlServer.Management.Smo.Server") $Server  
 
 # Grab list of drives - this should get us mount points as well as regular drives (not tested)  
 [Array]$Drives = Get-WmiObject -ComputerName $server -Class Win32_Volume |   
     Select -Property DriveLetter, Label, @{Name='FreeSpaceMB';Expression={$_.FreeSpace/1MB} } |  
     Where-Object{$_.DriveLetter -ne $null}   

 # Grab the data/index files for each database  
 $dbFiles = $sqlServer.Databases |   
   Select-Object FileGroups -ExpandProperty FileGroups |  
   Select-Object Files -ExpandProperty Files |  
   Select FileName,Name,UsedSpace,Size  

 # Grab the log Files for each database    
 $dbFiles += $sqlServer.Databases |   
   Select-Object LogFiles -ExpandProperty LogFiles |  
   Select FileName,Name,UsedSpace,Size  

 #Container to hold the information in    
 [Array]$diskSummary=@()  

 #Loop through each drive ordered by free space on the drive  
 foreach($drive in $drives | Sort-Object FreeSpaceMB )  
 {  
   Write-Host $drive.DriveLetter  
   Write-Host $drive.Label   
   #Grab the size and used space for each database file on the current drive  
   $size = ($dbFiles | Where-Object{$_.FileName.StartsWith($drive.DriveLetter)} | Measure-Object Size -sum).Sum   
   $used = ($dbFiles | Where-Object{$_.FileName.StartsWith($drive.DriveLetter)} | Measure-Object UsedSpace -sum).Sum      
   #Add that information to the array for output later  
   $diskSummary += $drive | Select DriveLetter,Label,FreeSpaceMb,@{Label="DBFileSizeMB";Expression={$size/1024}},@{Label="DBFileUsedMB";Expression={$used/1024}}  
 }    
 #Show the files and disk information  
 $dbFiles | Sort-Object Size -Descending | Out-GridView -Title "DB Files"  
 $diskSummary | out-gridview -Title "Disk Summary"  

So there are the basics to get you started.
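A couple of easy extensions, sketched here with made-up instance and path names: pointing SMO at a named instance (per the note above), and swapping Out-GridView for Export-Csv so the same script can run unattended from a scheduled task.

 # Named instance - SMO takes "Server\Instance" as the name (per the note above)
 $sqlServer = New-Object "Microsoft.SqlServer.Management.Smo.Server" "DBServer\Inst01"

 # For unattended runs, write the summary to a dated CSV instead of a grid view
 $csvPath = "C:\temp\DiskSummary_{0:yyyyMMdd}.csv" -f (Get-Date)
 $diskSummary | Export-Csv -Path $csvPath -NoTypeInformation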

I will be speaking on this and a lot more, bright and early on April 6th, 2013 at SQL Saturday 206 in Madison, WI.


Hope to see you there.

Thursday, March 14, 2013

Telecommuting, the beginning is ahead of us


Fascinating stuff has been written lately about telecommuting.  Yahoo is pulling in their telecommuters, and in the wake of it, Scott Hanselman wrote a really accurate piece on the matter (my opinion, based on 4.5 months of actually doing it).  Thomas LaRock wrote about it as well.  It is hard to know what the real motivation is at Yahoo, but the key public component comes down to accountability.

Or simply put.

Can the company see that you are doing your job and can you prove you are doing it well? 

In thinking about how it works for the team I am on, I can safely say that delivering cubes, reports, etc. every 2 weeks helps in terms of visibility to the business, and the stories completed and points burned show that individuals are doing their part.

Here is how the team setup unfolds:
  • The bulk of my team, including the leads, are remote.  The folks that live near the office work remotely a fair amount of the time.  I think that addresses the workplace jealousy aspect (or at least I presume it does).
  • We use Yammer, Lync and IP phones and do Agile, so everyone has at least 5 minutes of vocal contact per day and multiple ways to converse, badger each other, support, whatever.
On the personal side:
  • I have been in IT for 12 years and am a "senior"-level SQL DBA, so I have been in the industry awhile.
  • My previous jobs allowed some degree of telecommuting, but never any extensive amount.
  • My career path and presentations prove to some degree that I am "driven".
  • I require very little social interaction, so the need for daily office interaction is less important.
With all that said, here is what I have experienced/noticed in my first few months on the job, in no real order.
  1. I work too much.  I am just plain not used to being the weak spot on a team, so it's hard not to keep going, especially during these miserable winter days and nights.  Not to mention, sometimes what I am working on is interesting enough that I just keep working on it.
  2. Communication can be hard.  Nearly every single time, a phone call, IM or screen share will solve it, but I think sometimes being in person would be easier.
  3. I get monumentally more work done in a day working from home.  I don't have to listen to a loud salesman ("we can help align your vertical markets with value-added synergies") or the bizarre lives of nearby cube mates.  I have become somewhat blissfully ignorant of the outside world - probably not a long-term thing, but it has sure allowed me to think about things that do matter, like fishing trips with the kids, foraging for mushrooms and the impending maple syrup season :)
  4. Daylight - I have been able to spend more lunch hours hiking with my dog (with or without a gun) than I ever have during the work week.  I live in Wisconsin; daylight is short in the winter, and commuting chewed up an hour and a half of my day.  Lunch hour hikes are the norm now, to the point that my dog comes in and reminds me.  At this rate I might be in decent shape this year.
  5. Sunday nights used to be a source of dread for me: the long drive in on Monday, the traffic, snow storms, rain, crashes, etc.  There has not been a Sunday since I started telecommuting where I hated what Monday brought.  That may be because my job is challenging and interesting too, but I know I hated the commute more than the work all along.
  6. Kids.  I have been to 4 school functions in the past few months, which might equal what I had seen in the previous few years.  I even wait for the bus with my son occasionally.  Some mornings they will get up excited, and I get to be there to talk about fishing, ninjas, bugs or video games.
  7. Internet speed.  I live in the country, and I have good internet speeds, but not great.  At times this has become a bit of an issue, as screen sharing where I am the host can be problematic.  But that is about the only real problem I have had.
  8. Leaving your work in the home office.  There is always talk about never being able to leave your work when it's at home, but I don't recall that ever being the case in my neck of IT; we were working after we got home then too.  You just don't have to haul a laptop home with you.
  9. I make fun of the "green" movement, but my family actually lives it: a huge garden, lots of raising, harvesting and foraging of our food, and all the other subtle things we can do to lessen our impact on the environment.  Even though I carpooled into work as much as I could, the number of miles I put on my car has gone down dramatically.  Not to mention, I utilize a space in my house that would be heated regardless, and I found a really cool, highly efficient lamp that I use for light (when the sun isn't out).  These are all things my workplace would have to provide for someone who is there 9 hours out of 24 during the week.
  10. While I am thinking about it from the business side, I think a telecommuting worker is willing to work for less money.  Some might not, but for someone like me who lives out in the country, that commute was a killer in many senses of the word.  I was willing to give up money for life (and a really good opportunity).
There is more, really, but that is a good place to stop.  I am already looking forward to a noontime hike to a nice waterfall with all this new snow on the ground and 34-degree temperatures.



My office, if I lived on a lake :)


On that note, I better get back to work.