Posted by: sqlswimmer | February 11, 2015

Automating SSAS Backups

Backing up databases is one of the most important jobs of a DBA. If your data is not safe, your job is not safe; data is the lifeblood of a DBA. There are plenty of products on the market that will help with backing up transactional databases in SQL Server, but when it comes to Analysis Services (SSAS), you are on your own. That’s what I discovered when I became responsible for an SSAS database.

The good thing is that there’s a very simple way to back up your SSAS databases. SQL Server Management Studio (SSMS) has a great feature that allows you to script just about anything you need to do, including backing up an SSAS database.

Here’s how:

  1. Open SSMS and select the Analysis Services server type in the Registered Servers window.

[Screenshot: Connect to Analysis Services]

  2. Double-click your server name so that it appears in Object Explorer, then expand the Databases folder. Right-click the database you want to back up and select Back Up…

[Screenshot: Right-click your database]

  3. The Backup Database dialog opens. Fill out the values appropriate for your environment. I highly recommend encrypting your backup files; just don’t forget the password, otherwise you will never be able to restore your database.

[Screenshot: Backup Database dialog]

  4. Instead of clicking the OK button when you are done, click the little arrow next to the Script button at the top of the screen and select Script Action to New Query Window. Then click the Cancel button to dismiss the Backup Database dialog.

[Screenshot: Script backup]

  5. You should now have an XMLA query window in SSMS that contains the commands to back up your database.

[Screenshot: XMLA Code]

Wow, that was easy. Now you can create a SQL Agent job, paste this XMLA query into a job step (be sure to select SQL Server Analysis Services Command as the job step type; there’s a sketch of this below), and call it a day. But you probably shouldn’t. As you will notice, I selected the Allow file overwrite option in the Backup Database dialog, and that is reflected in my XMLA script with the AllowOverwrite tag set to true. So if I created a SQL Agent job to run every day using this as my job step, I would never have any backup history, only the most current backup. For some shops this will be okay; for others it won’t. In my shop it wasn’t enough: policy dictated that I keep one week of backups, regardless of whether it was a transactional database or an OLAP database.
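
For reference, here’s a minimal T-SQL sketch of creating that job yourself instead of clicking through the Agent UI. The job, server, and path names are placeholders, and the XMLA in @command should be lifted from your own scripted backup:

USE msdb;
-- Create the job, then a step whose type is "SQL Server Analysis Services Command"
EXEC dbo.sp_add_job @job_name = N'Backup SSAS Database';
EXEC dbo.sp_add_jobstep
     @job_name = N'Backup SSAS Database',
     @step_name = N'Run XMLA backup',
     @subsystem = N'ANALYSISCOMMAND',   -- the SSAS command job step type
     @server = N'MySSASServer',         -- the SSAS instance the XMLA runs against
     @command = N'<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object><DatabaseID>Databasename</DatabaseID></Object>
  <File>\\uncfilepath\Databasename.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>';
EXEC dbo.sp_add_jobserver @job_name = N'Backup SSAS Database';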

Luckily, PowerShell and I have become good friends. I was able to quickly create two additional steps in my SQL Agent job that use PowerShell to achieve my goal of maintaining one week of backups: one step renames the backup file by appending the current date to the file name, and the other cleans up any old backup files so that I don’t fill up my hard drive. Here are my scripts.

Rename file:

# Start on a file system drive; SQL Agent PowerShell steps don't start there
# (see the note below about cd c:)
cd c:
$today = Get-Date -UFormat "%Y%m%d"
$oldname = "\\uncfilepath\Databasename.abf"
$filepath = "\\uncfilepath\"
# Append the current date, e.g. Databasename_20150211.abf
$newname = $filepath + "Databasename_" + $today + ".abf"
Rename-Item $oldname $newname


Clean up old files:

cd c:
# Keep one week of backups: remove anything older than six days
$RetentionDate = (Get-Date).AddDays(-6)
$FilePath = "\\uncfilepath"
Get-ChildItem $FilePath -Recurse -Include "*.abf" | Where-Object { $_.CreationTime -le $RetentionDate } | Remove-Item -Force


I won’t go into detail about these PowerShell scripts here; they’re mostly self-explanatory, with the exception of the first line in each: cd c:. I discovered that because I was using a UNC path, I needed this little tidbit at the beginning of each script, otherwise the steps would fail. The PowerShell that SQL Agent invokes is not exactly the same environment you get outside of SQL Server: a PowerShell job step starts out in the SQLSERVER: provider drive rather than a file system drive, so changing to c: first keeps the file cmdlets happy.

Posted by: sqlswimmer | February 10, 2015

Managing Security – TSQL2sday # 63

A big thank you goes out to Kenneth Fisher ( b | t ) for hosting this month’s TSQL2sday party. Security is a big deal. How many times have you opened the paper (I’m dating myself, I know – no one reads an actual newspaper anymore, it’s all online now) in the last 6 months and seen a story about another security breach, more records compromised or flat out stolen? Too many. While securing your data is probably the key to keeping your current employment status, there’s also a piece of security that is quite often overlooked and could be the reason for a resume-generating event: recovering from a failed server when you don’t use any of the HA features that are now available.


The scenario:
Your production server has failed and you don’t use any of those new fancy HA features like Always On Availability Groups, Log Shipping or even Database Mirroring. Your server hosts a standalone instance for the HR/Payroll department. Payroll must be processed in the next two hours or your company will be out of compliance with Federal Regulations and face heavy fines, not to mention all the really mad employees who won’t get their paychecks on time. I don’t know about you, but I do NOT want to be responsible for every employee not getting a paycheck, including myself.

You have a good backup plan in place, you take full, differential and log backups on a schedule that meets the minimum required data loss SLA and send those backups to a remote SAN data store. Your Sysadmin stands up a new standalone server for you in 30 minutes. You install and configure SQL Server in about 60 minutes (those pesky service packs and cumulative updates can take quite a bit of time). Now you are left with 30 minutes to get your databases restored and functioning. No sweat! Easy as 1..2..3, right? Wrong!
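
The restores themselves really are the easy part. With hypothetical file names, the chain is the full backup, then the latest differential, then any log backups, then recovery:

RESTORE DATABASE HRPayroll FROM DISK = '\\san\backups\HRPayroll_full.bak' WITH NORECOVERY;
RESTORE DATABASE HRPayroll FROM DISK = '\\san\backups\HRPayroll_diff.bak' WITH NORECOVERY;
RESTORE LOG HRPayroll FROM DISK = '\\san\backups\HRPayroll_log.trn' WITH NORECOVERY;
RESTORE DATABASE HRPayroll WITH RECOVERY;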

You restore your database only to discover that all your logins no longer exist on your brand new server. No problem, just recreate the logins and give them brand new passwords (SQL Authentication). All will be right with the world. You give your HR/Payroll department the okay to proceed and you catch your breath with 20 minutes to spare. The phone rings 5 minutes later; it’s HR/Payroll and it’s not working. They are getting invalid login errors. You have that momentary flashback to when you helped with the application install 4 years ago – the vendor hard-coded the password into their application code, so you can’t just change it or give it a new password. That’s when you remember that you created a job to script the logins with their passwords on a weekly basis and saved the results off to a file on that same remote SAN data store as the backups. Hallelujah! You find your script on the remote SAN data store, clean up the logins you created, then execute the script with the logins and their passwords. HR/Payroll is back up and running with 4 minutes to spare.

Paychecks for everyone!

While some of this may seem far-fetched, it’s based on an actual incident very early in my career. I may have embellished a little, but you get the point. You need to make sure you can recreate any login on your server at any time due to disaster/failure. If you can’t, you may just be looking for a new job.

To this day I still script the logins on all my servers on a weekly basis. I store that file in a secure location on a remote server. I’ve never had to use one since this original incident, but it’s nice to know that I can recreate the logins if I ever need to. Can you?
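
If you’re wondering what such a script can look like, here’s a rough sketch based on the well-known sp_help_revlogin technique: generate CREATE LOGIN statements from sys.sql_logins, carrying along each login’s password hash and SID so the recreated logins line up with the users in your restored databases.

-- Generate CREATE LOGIN statements for SQL Authentication logins,
-- preserving password hashes and SIDs; save the output to a secure file.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = ' + CONVERT(varchar(514), password_hash, 1) + ' HASHED'
     + ', SID = ' + CONVERT(varchar(200), sid, 1) + ';'
FROM sys.sql_logins
WHERE name NOT LIKE '##%';   -- skip the internal certificate-based logins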

Posted by: sqlswimmer | December 15, 2014

Transaction Isolation Level Blues

Have you ever had a mental block in one particular area when learning something? It might be the simplest thing, but for some reason your brain turns to Teflon when you try to store the information. For example, I have a degree in Math, so I am pretty good at arithmetic, but for the life of me I cannot remember what eight plus five is. I always have to break out my phalanges to get the answer.  Why the Hell can I remember what phalanges means and not a simple thing like eight plus five?!

I have this same problem when it comes to Transaction Isolation Levels in SQL Server. I can remember that there are five of them, Read Uncommitted, Read Committed, Repeatable Read, Snapshot & Serializable, but I cannot remember the little nuances that set them apart. It’s total Teflon. So I decided it was time to come up with a little song to help me remember. My older sister is a preschool teacher and she says that if you learn something as a song, it sticks with you for life. Here’s hoping that is true!
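
For reference (since my brain won’t hold it), here’s each level as you’d actually set it in T-SQL, with the nuance that sets it apart; MyDatabase is a placeholder:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;  -- dirty reads allowed; takes no shared locks
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;    -- the default; no dirty reads
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;   -- holds shared locks until the transaction ends
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;          -- row versioning; readers don't block writers
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;      -- key-range locks; no phantom reads

-- Snapshot also requires a database-level switch before you can use it:
ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;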

This is sung to the tune of George Thorogood’s Bad to the Bone.

At the time I am used
No Shared locks are issued
Not blocked by X locks
It is Loosey-Goosey
Just call me crazy
No restrictions abound
I could tell right away
It was Read Uncommitted

Bad to the bone
Bad to the bone
B-B-B-B-Bad
B-B-B-B-Bad
B-B-B-B-Bad
Bad to the bone

Not breakin’ any rules
Going by the book
Not readin’ any uncommitted
Transactions it’s true
I am the default baby
Transactions alone
I’m Read Committed
That’s what I do

Bad to the bone
B-B-B-Bad
B-B-B-Bad
B-B-B-Bad
Bad to the bone

No readin’ ’til committed
Can’t read dirty data either
I use shared locks baby
And hold ’til committed
I’m the repeatable read baby
Yours and yours alone
Data’s all yours honey
And I’m bad to the bone

B-B-B-B-Bad
B-B-B-B-Bad
B-B-B-B-Bad
Bad to the bone

When I query data
Kings and Queens step aside
Every bit I meet
It’s mine it’s all mine
Serializable baby
Range blocks on keys that’s me
HOLDLOCK does the same thing baby
Serializable oo-ee

Bad to the bone
B-B-B-B-Bad
B-B-B-Bad
B-B-B-Bad
Bad to the bone

(Extra verse)
There’s no write blockin’
While I’m readin’
No locks less I’m recoverin’
You can’t switch to me
But I can switch to you
I’m a snapshot baby
A photo just for you

Bad to the bone
B-B-B-B-Bad
B-B-B-Bad
B-B-B-Bad
Bad to the bone

If your brain is Teflon when it comes to Transaction Isolation Levels, then I hope this helps. If not, I hope you got a good laugh and please don’t tell George Thorogood what I did to one of his best songs (and one of my favorites).

By the way, eight plus five is .. thirteen.

Posted by: sqlswimmer | December 9, 2014

Giving Back T-SQL Tuesday #61

First off I wanted to thank Wayne Sheffield (Twitter | Blog) for hosting this month’s T-SQL Tuesday party and Adam Machanic (Twitter | Blog) for starting this party five years ago. I can’t believe it’s been five years.


This month’s theme is Giving Back to the SQL Community.

This is a great topic; it’s kind of like Dickens’ holiday classic, A Christmas Carol. It gives me an opportunity to reflect on what I have done in the past, what I am doing now, and what more I could be doing to give back to the community that has helped me so much in my career.

Past Giving

Member of the Abstract Review Committee (2010-2014)
Member of the Nomination Committee (2012)
“Speaker Wrangler” for Triad SQL, the local PASS chapter in Greensboro, NC (2012-2014)
PASS Ambassador at Summit (2010-2014)
24 Hours of PASS moderator (2012-2014)
PASS Summit Buddy (2013)
Chosen as a mentor in Steve Jones’ and Andy Warren’s The Mentoring Experiment (2012)

Present Giving

“Speaker/Sponsor/Swag Wrangler” for Triad SQL, the local PASS chapter in Greensboro, NC
Program Manager with the Program Committee for the PASS Summit.

Future Giving

I will continue my work (as long as they will have me) with the Program Committee. I absolutely love this volunteer position, it allows me to do something I enjoy while helping the community. It’s a win-win. I will also continue my position with Triad SQL. I have stepped it up a notch by wrangling not only speakers but sponsors and swag as well for 2015.

I enjoy moderating the 24 Hours of PASS too. It always gives me an opportunity to connect with some of the speakers that I’ve never had interaction with before. Just this year, I moderated for Gail Shaw (Twitter | Blog) and then got to meet her in person at Red Gate’s SQL in the City event in Seattle just before the Summit. How cool is that?!

I have a blog, in fact you’re reading it right now (wink wink), but I don’t write nearly enough. Part of me feels, “Surely someone has written about this before, so why should I clutter up cyberspace with my drivel?” and the other part of me feels, “Maybe my post can help someone who couldn’t quite make heads or tails of something they found via Google.” So for next year, I am setting a goal to write/blog at least once a month. I bet T-SQL Tuesday can provide the subject matter for those months where nothing interesting happens at work.

My favorite way, by far, to give back to the SQL Community is by being a PASS Ambassador during the PASS Summit. I get to see all the eager faces ready to stuff their brains to the point of overflowing. I get to help someone find the registration desk so their adventure can begin. I also get some of the first hugs of the Summit just by smiling and answering questions. It really is the best. So, as long as this program exists and I’m attending the Summit, I will continue to be a PASS Ambassador.

Lastly, I may even try speaking this year.  This is a huge deal for me, so I make no promises, other than to think about speaking.

How are you giving back?

Posted by: sqlswimmer | November 10, 2014

Summit 2014

It’s hard to believe it’s over.  It felt like a whirlwind while I was in Seattle for my 7th PASS Summit, but now that I’m back home it feels like it was ages ago.  I think time moves more quickly when you’re with friends and that’s where I was, with friends.

I got to reconnect with old friends and meet new ones.  I didn’t attend nearly as many sessions as I would have liked because, let’s face it, cloning technology isn’t quite where it needs to be, as Michael Keaton found out in Multiplicity.  With my luck, my “Number Four” would have attended one of Paul Randal’s sessions and I would have wound up doing God knows what to my servers when I got back.

I also got to meet people that I have “worked” with for quite a while virtually, but never met in person.  I must say it’s always refreshing when their “in person” exceeds your expectations.  There are so many genuinely nice people in our community, I am truly in awe.

In years past I have not been able to participate in most of the after-hours activities due to Summit happening right before a big annual swim meet, which meant I couldn’t take a break from training.  This year, my swim meet was the week before Summit so I didn’t need to get up at 4:30 a.m. every morning to make it to practice before breakfast.  I got to see how the “other half” lived at Summit this year.  I must say it was eye opening and entertaining.  They don’t have next year’s swim meet on the calendar yet, but I have the Summit dates, so next year’s meet just may have to go on without me.

If you’ve ever attended a PASS Summit, you know what I’m talking about when I say I’ve already started the count down until next year’s Summit.  If you’ve never attended a Summit, what are you waiting for?

Posted by: sqlswimmer | November 4, 2014

Data Driven Subscription On A Budget, Part 2


Yes, this is Part 2; you can find Part 1 here for the background information.

This post covers point 3: distributing a report to a fluctuating list of subscribers.

Distribute a report to a (fluctuating) list of subscribers

When using email as your method of delivery for a Data Driven Subscription, best practice is to use metadata to drive that process. Usually a simple table that contains the email address of the recipient and the report name does the trick. This part of the process is no different if you don’t have Data Driven subscriptions. I usually create a table similar to this:

CREATE TABLE dbo.SSRSEmailSubscribers
(
EmailAddress varchar(128) NOT NULL
,ReportName varchar(128) NOT NULL
)
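
To make the table concrete, here are a couple of made-up rows; one row per recipient per report:

INSERT INTO dbo.SSRSEmailSubscribers (EmailAddress, ReportName)
VALUES ('cfo@mycompany.com', 'Daily Sales Report')
     , ('salesmanager@mycompany.com', 'Daily Sales Report');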

Let’s say I have a sales report that needs to go out on a daily basis and the standard format for this report is Excel. Because we don’t have data driven subscriptions, we can’t just query the table and use the resulting list to email the report. Instead we need to create a File Share subscription that generates this report and saves it to a file share. From there we can “pick up” the newly generated Excel file and email it to the recipients.

  1. Create a subscription to your Daily Sales Report in Report Manager, schedule it for a one-time execution at a time just a few minutes in the future, and remember the execution time. (This creates the SQL Agent job in SQL Server.)
  2. Take a look at your SQL Agent jobs in SQL Server. If you have never seen or noticed a job created by SSRS, you will be wondering where your job is, because SSRS does not use human-readable names for its jobs; it uses those pesky GUIDs as names, ugh! If your server has lots of jobs, you may need to sort them by Category to get all the “Report Server” jobs together. Find the job that executed at the time you scheduled (this is why you needed to remember the time!); this is the job you will reference in the next step.
  3. Create a new SQL Agent job and add a Transact-SQL script step. In this new step you need to execute the SQL Agent job that you created back in step 1:

exec msdb.dbo.sp_start_job N'B514C05F-07D5-4C0B-9600-666E9980C7C3'

    where B514C05F-07D5-4C0B-9600-666E9980C7C3 is the GUID from the job that SSRS created.
  4. Next, add a new job step for PowerShell. In this step, write a PowerShell script that picks up the file generated by the previous step, retrieves your list of email addresses, and sends the email with the report attached. There are several ways to do each of these things, but I chose PowerShell. Since I wanted a double-quoted, semicolon-delimited list of email addresses for the SMTP call, I wrote my SQL query to return one; you could just as easily use PowerShell cmdlets to format the list. Here’s my PowerShell script:

cd c:

$FilePath = "c:\temp\"
$smtpServer = "10.0.0.4"
$smtpFrom = "noreply@email.com"

# Build a query that returns the recipients as a single
# double-quoted, semicolon-delimited string
$AddressQuery = "DECLARE @List varchar(MAX);"
$AddressQuery = $AddressQuery + "SELECT @List = COALESCE(@List + '"";""', '') + EmailAddress "
$AddressQuery = $AddressQuery + "FROM dbo.SSRSEmailSubscribers "
$AddressQuery = $AddressQuery + "WHERE ReportName = 'Daily Sales Report'; "
$AddressQuery = $AddressQuery + "SELECT '""' + @List + '""' AS EmailList;"

# Capture the query result as the recipient list
$smtpTo = (Invoke-Sqlcmd -Query $AddressQuery -ServerInstance "MyServer" -Database "MyDatabase").EmailList

$messageSubject = "Daily Sales Report was executed"

# Pick up the most recently generated Excel file from the file share
$latest = Get-ChildItem -Path $FilePath -Filter "*.xlsx" | Sort-Object CreationTime -Descending | Select-Object -First 1
$FullFileName = $latest.FullName

$body = "Attached is the Daily Sales Report"
Send-MailMessage -From $smtpFrom -To $smtpTo -Subject $messageSubject -Body $body -SmtpServer $smtpServer -Attachments $FullFileName

Now schedule this newly created SQL Agent job for the time you need your Daily Sales Report to be executed and voilà, you now have a data driven subscription that distributes a report to a fluctuating list of email addresses.

Posted by: sqlswimmer | August 28, 2014

I Didn’t Hyperventilate!

I gave what is officially my second presentation this week. I presented at the Triad SQL BI (Twitter | PASS) user group meeting and I didn’t hyperventilate! That’s a huge deal for someone like me, who is petrified of public speaking.

It started out a little rough though.

Timeline
Friday, 8/22 (just 4 days before presentation date) – I met Katherine Fraser (SQLSassy) for lunch and she mentioned that their scheduled speaker had cancelled on them just the day before. I asked her what she was going to do and she said that unless I wanted to present for her, she had no idea. I jokingly said, “Yeah, sure, I’ll present.” Do not EVER tell a chapter lead you will present, even if you’re joking around, because they will pounce on you! Lesson learned there. I agreed to present a very introductory session on SSIS. I then went home and started to panic.

Saturday, 8/23 – I woke up with a horrible sinus headache and thought I was in the beginning of a nasty sinus infection. Now I really started to panic. I sent Martin to the drugstore to buy every sinus medication they had on the shelf. There was no way I could be sick; I could not cancel on Katherine after I had just agreed to present the day before. I proceeded to pound down some Emergen-C and drink about a gallon of water an hour for the rest of the day.

Sunday, 8/24 – I woke up at 4:30 a.m. to take part in the upgrade of a major system at work. I felt about the same as Saturday. I pounded some more Emergen-C and worked until 11:30 a.m. After we got the green light from the testers at 3:30 p.m., I went to bed and collapsed.

Monday, 8/25 – Woke up feeling much better, but not great. Pounded more Emergen-C. Started to work on my presentation. Did I mention that I didn’t have anything prepped for a presentation? I’m not a speaker, why on earth would I have a presentation ready to go? Got a call from my boss that the system upgrade wasn’t going so smoothly and had to start firefighting in production.

Tuesday, 8/26 – Presentation day. Got the word from my boss that the system upgrade was still up in the air, but none of the pieces that were broken were anything I could help with or fix. I started to work on my presentation. Just before lunch time I was told I had two conference calls to participate in. Great, another two hours I don’t get to work on my presentation! I had finally finished the conference calls when I got a call from my boss: we were rolling back the upgrade and I needed to bring the old server back online. Luckily I had been able to create the content of the presentation and test it; I just didn’t have any time for a practice run-through. That was going to have to be enough; it was time to go to Winston-Salem.

I arrived in plenty of time, but I forgot the power supply for my laptop, my external mouse, speaker evaluation forms and my list of things I needed to take with me to the meeting. Luckily my laptop was fully charged and didn’t die during the presentation (in fact I could have gone on for another 2½ hours; thankfully no one wanted to stay that long!). A mouse was provided by our wonderful host, Matt Clepper of Inmar, but not before I had a slight mishap using that @!#$%^& embedded mouse on my laptop. Katherine was well prepared and brought speaker evaluation forms. As for my list of things I needed to bring with me, well, I just had to “adapt and overcome”.

The presentation went pretty well; I didn’t hyperventilate. Sometimes you have to have a very simple goal: just survive without a trip to the ER.

Wrap up
Overall it was a good experience. I think I did a good job of presenting and the feedback I got reinforced that. There were some great ideas on what I could have done better and some great comments on what I did well.

Will I speak again? Probably. I’m not sure I’m ready for a SQL Saturday quite yet, but maybe another couple of times at UG meetings and I’ll think about it. A huge “Thank you” goes out to Katherine for taking a chance and believing in me.

Of course I didn’t sleep at all Tuesday night. I kept thinking, “I forgot to tell them…”

Posted by: sqlswimmer | July 1, 2014

Data Driven Subscriptions On A Budget

Data Driven subscriptions in SQL Server Reporting Services (SSRS) are only available in the Enterprise or BI editions for 2012 and 2014, the Enterprise or Data Center editions for 2008 R2, or Enterprise for 2008. But what happens when the money is not in the budget for those editions? Can you still get Data Driven subscriptions? The answer is You Bet!

I have worked in large shops in the past where purchasing the Enterprise Edition of SQL Server was never an issue, in fact, it was the standard flavor of SQL Server. But when I switched to a smaller shop, where cost was an issue, I had to say good-bye to all those lovely Enterprise features that I have come to know and love. As the proverb goes, “Necessity is the mother of invention”. So when I was asked to essentially create a data driven subscription in Reporting Services, I paused ever so slightly, then said, “Yeah, I can do that”.

There are a few things you can do with a data driven subscription in SSRS:

  1. Trigger the execution of a report based on data
  2. Provide parameters to filter the report data at run time
  3. Distribute a report to a fluctuating list of subscribers
  4. Vary the output format and delivery options

In this post I will address point 1 only; hopefully at some point I will get around to writing about points 2, 3 & 4, but for now, it’s just 1.

Trigger the Execution of a report

Let’s say I have a sales report that needs to go out on a daily basis. This report contains sales for the previous day. But what happens when there are no sales? Our report shows up with no data on it. Now we, as data people, completely understand why this happens, but those in the C-Suite don’t always understand, and they think the report is “broken”. This initiates a call to the help desk saying simply, “The Daily Sales Report is broken”. We freak out, thinking, “Great, who promoted what?” We instantly go into troubleshooting mode. But after running the report, seeing no data and running the underlying query, we understand. The report is not “broken”; there were just no sales. I don’t know about anybody else, but I don’t like those in the C-Suite thinking that we in the dungeon are idiots. So instead of sending them a report with no data, we need to send them an email to let them know there were no sales the previous day. Problem solved, crisis averted, get back to work. Oh, but wait, I don’t have the appropriate edition of SQL Server, so how the heck can I do this? In three easy steps, that’s how:

  1. Create a subscription to your Daily Sales Report in Report Manager, schedule it for a one-time execution at a time just a few minutes in the future, and remember the execution time. (This creates the SQL Agent job in SQL Server.)
  2. Take a look at your SQL Agent jobs in SQL Server. If you have never seen or noticed a job created by SSRS, you will be wondering where your job is, because SSRS does not use human-readable names for its jobs; it uses those pesky GUIDs as names, ugh! If your server has lots of jobs, you may need to sort them by Category to get all the “Report Server” jobs together. Find the job that executed at the time you scheduled (this is why you needed to remember the time!); this is the job you will reference later.
  3. Create a new SQL Agent job and add a Transact-SQL script step. In this newly created step, write a query that checks for sales from the previous day; if sales exist, execute the job that SSRS created, otherwise send an email stating that there were no sales. There are several ways you can do either one of these things, but here’s my T-SQL script:

DECLARE @SalesTotal numeric(18,2)
      , @Yesterday date
      , @BodyMessage nvarchar(max)

SET @Yesterday = DATEADD(DAY, -1, GETDATE())

SELECT @BodyMessage = N'There were no sales for ' + CAST(@Yesterday AS varchar(10))

SELECT @SalesTotal = SUM(InvoicedAmount)
FROM dbo.Sales
WHERE SaleDate = @Yesterday

IF @SalesTotal > 0
BEGIN
    -- There were sales: start the job SSRS created so the report goes out
    EXEC msdb.dbo.sp_start_job @job_name = '4D3D4A1F-F007-4045-B5F6-3C86445D153B'
END
ELSE
BEGIN
    -- No sales: tell the recipients instead of sending an empty report
    EXEC msdb.dbo.sp_send_dbmail @recipients = 'recipient@servername.com',
        @subject = 'Daily Sales Report',
        @body = @BodyMessage
END

Now schedule this newly created SQL Agent job for the time you need your Daily Sales Report to be executed and voilà, you now have a data driven subscription for your SSRS report.

There is one really big assumption here: since it’s a small shop, the SSRS instance and the instance where your data live are one and the same. This is easily adaptable if they are not on the same instance by creating a linked server (yes, I feel dirty even suggesting it), but like I said, “Necessity is the mother of invention.” A rough sketch of that adaptation follows.
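
Here’s what that adaptation might look like, with placeholder server names: create the linked server once, enable RPC so remote procedure calls are allowed, then start the SSRS job with a four-part name.

-- One-time setup: link to the instance that hosts the SSRS jobs
EXEC master.dbo.sp_addlinkedserver @server = N'SSRSHOST', @srvproduct = N'SQL Server';
EXEC master.dbo.sp_serveroption @server = N'SSRSHOST', @optname = 'rpc out', @optvalue = 'true';

-- The job step then starts the remote job instead of a local one
EXEC [SSRSHOST].msdb.dbo.sp_start_job @job_name = '4D3D4A1F-F007-4045-B5F6-3C86445D153B';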

Posted by: sqlswimmer | June 2, 2014

2014 NomCom Slate is Amazing

Serving on the Nomination Committee (NomCom) is such an honor, one that I was lucky enough to experience two years ago. It allows me to give back to the community that has given me so much over the years and continues to give on a daily basis. This year’s NomCom will also be tasked with “streamlining the process for involving and evaluating candidates and with enhancing opportunities for community engagement in the elections.” I am very excited about this opportunity. When I served two years ago, I thought there was room for improvement in the process, and it looks like this will be my opportunity to see if I can make a difference.

This year’s slate is a great one. There are former Board members, chapter leaders, former NomCom members & some very outstanding volunteers. I am excited to see there are so many taking an interest in their community. Please do your research on each candidate and make the choice that is right for you, then get out there and VOTE! It’s your community. Voting opens June 3 and closes June 6.

Posted by: sqlswimmer | March 21, 2014

Disheartening Apples

Question:  How many Oracle DBAs can you fit in Madison Square Garden?

Answer:  None, their egos won’t fit through the door.

No, this post is not a slight towards Oracle DBAs. I have several friends who are Oracle DBAs and they are some of the nicest, most humble people you will ever meet, but in my experience, they are the exception. Early in my career I had to administer Oracle, and the person I was supposed to learn from was a hard-core Oracle DBA and had been for years. He was probably one of the smartest people I’ve ever worked with, but only barely tolerable to be around because his ego was larger than the Goodyear Blimp. When I was introduced to SQL Server way back in 1996 (yes, I am dating myself), there really weren’t any experts in the field readily available when questions came up. The internet was just starting to flourish and Google hadn’t even been invented yet. There were a few guys I had heard of who were willing to help: Brian Knight, Andy Warren & Steve Jones. I knew about them because Steve Jones was a local guy in Denver, where I was living at the time. They created this thing called SQLServerCentral.com. These few are the epitome of SQL Server stewards, for both knowledge and professionalism. Smart, humble, easy going, willing to share knowledge freely, and did I mention smart? Honestly, they have been my role models for “lifting as you climb” in my career. Okay, now I sound like some creepy stalker, so we’ll move on to the point of this post.

For those who know me personally, you know I am not the speaker type, so I give back to the SQL community by volunteering behind the scenes in any way I can. So when I was selected to work on the program committee for PASS Summit 2014, I was so excited I did a little happy dance in my cube at work. My co-workers are used to seeing my head bob to the music of my headphones, but seeing me do a happy dance was a little startling for some. This will be my 5th year on the committee, more specifically the abstract review team. I love being on the abstract review team; getting to read what people are passionate about teaching is always exciting, and it kick starts my love of SQL Server and learning. However, just because I had been on the committee in previous years, I never assumed I would be selected again. It is such a privilege to be chosen, and trusted, with this huge task.

Recently there has been a lot of chatter in social media about the selection process (or lack thereof, according to some). It saddens me to see some people’s true colors. Being chosen to speak at any engagement, not just the PASS Summit, is a privilege, not a <insert deity here> given right. The selection process has improved over the years and has become more streamlined thanks to the investment made by the PASS IT team. It still has room for improvement, but most things do. One of the things I like best about this year is the increase in the amount of time we have been given to review the abstracts. In years past, we had a very small window in which to review them, which made it very difficult to coordinate team members’ schedules to discuss final rankings and assign rejection reasons. I am hoping we can do a much better job this year of providing useful feedback to speakers. Another improvement, implemented last year, was the removal of the speaker’s name from the abstract. This is a huge deal. In years past, we could see who submitted the session, and I fear that it swayed team members’ opinions of abstracts, both good and bad. Some would be chosen because they were “well known” speakers and/or authors; it didn’t matter that their abstracts were poorly written, which in my experience often translated into poorly presented sessions. Some would be excluded for the exact same reason. As a speaker, you owe your audience the best-written abstract you can produce; it’s the surest sign of respect.

I congratulate all those who submitted sessions for the Summit; just submitting is a huge step. If you are selected to speak, I ask that you remember that it’s a privilege and an honor, and that you treat those attending your session(s) with the respect they are due. If you are not selected, I ask that you not give up. If you have questions about why your session was not selected, ask. I have been asked in the past and am always glad to provide additional feedback.

Question:  How many SQL Server DBAs can you fit in the Seattle Convention Center?

Answer:  Unlimited, as long as they remember why they attend and/or speak at the PASS Summit.
