Posted by: sqlswimmer | September 11, 2015

Fall Means Election Season

While it’s not officially Fall yet, it is definitely Election Season. Elections for the PASS Board of Directors open on September 17, 2015. Campaigning starts today with the announcement of the Official Slate of Candidates.

It’s a great slate and it will be hard to choose. I appreciate all the hard work the NomCom has done to provide us with such a great list of candidates. Please take the time to read each candidate’s application. This is the first year the NomCom scores have been made public. Members of the NomCom spend countless hours researching, interviewing and agonizing over how the scores are reached, so pay attention to them.

PASS has also come up with some community events for you to get to know your candidates. Take advantage of them to help you make your decision.

This is your community. Do your research then cast your vote. This is your opportunity to make a difference.

Posted by: sqlswimmer | July 14, 2015

TSQL2sday #68 – Defaults

A big thank you goes out to Andy Yun (b|t) for hosting this month’s TSQL2sday party. This month’s topic is Just Say No to Defaults.


TSQL2sday #68

I have a Bachelor’s degree in Mathematics with a minor in Computer Science, so when I say I’m a fairly smart person, it’s not bragging, it’s simply the truth. So when people say, “Any monkey can be a SQL Server DBA”, I find it offensive. While that statement may be close to true, it’s not the whole truth. Yes, Microsoft made SQL Server easy to use right out of the box, but if you want SQL Server to run well, you better have one of those really smart monkeys.

I can’t tell you how many times I have been approached by friends who are developers, DBAs for other RDBMSs or sysadmins and asked to “fix” their SQL Server environment. They are really smart people, but they made the mistake of thinking that they could simply install SQL Server, accept all the defaults, be off and running and never look back. The biggest complaint I hear from these same people is that “SQL Server is slow”. My response is usually something like, “Hmm, I can’t imagine why that would be”, in my most snicker-y voice.

There are so many things that can be tweaked in SQL Server to improve performance, but there are two things you can change right off the bat that will make a world of difference. They are the defaults for database file location and max memory. In fact, if you don’t change the default location for your database files and the max memory setting, a lot of the other performance tweaks won’t make that big of a difference.

Database File Location

When the SQL Server install runs, it asks where you want to put your database files. The default location is on the same drive where SQL Server is installed, which is typically the same drive as the OS installation. Do NOT accept this default, just say NO! If you have a high-volume transactional system, this will cause competition with the OS and guess who loses? You do. You should take this a step further and separate out your data files from your log files. And your tempdb should have a drive all to itself. (Note: When I say drive, I am referring to physically separate disks, not a single disk that has been partitioned into multiple drives. If you’re using a SAN, make sure you coordinate with your SAN administrator to get your drives created from the correct “LUN pools”.)
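
Even if you accepted the defaults at install time, you can still place files explicitly whenever you create a database, and relocate tempdb afterwards. Here’s a minimal T-SQL sketch of that layout; the database name, drive letters and paths are made up for illustration, so substitute whatever your server or SAN administrator gives you.

-- Hypothetical layout: D: for data, L: for logs, T: for tempdb.
CREATE DATABASE SalesDW
ON PRIMARY
    ( NAME = SalesDW_data,
      FILENAME = 'D:\SQLData\SalesDW_data.mdf',
      SIZE = 10GB, FILEGROWTH = 1GB )
LOG ON
    ( NAME = SalesDW_log,
      FILENAME = 'L:\SQLLogs\SalesDW_log.ldf',
      SIZE = 2GB, FILEGROWTH = 512MB );

-- Move tempdb to its own drive (the change takes effect after a service restart).
ALTER DATABASE tempdb MODIFY FILE ( NAME = tempdev, FILENAME = 'T:\TempDB\tempdb.mdf' );
ALTER DATABASE tempdb MODIFY FILE ( NAME = templog, FILENAME = 'T:\TempDB\templog.ldf' );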

Max Memory

After SQL Server has been installed, the default max memory setting basically allows SQL Server to use as much memory as it wants. Sounds like a good idea on the surface, but just say NO! SQL Server is very greedy when it comes to memory; it will take every byte your server has to offer, leaving your OS starved. My general rule of thumb is to allocate ¾ of the total memory to SQL Server, never leaving less than 2GB, but no more than 4GB, for the OS.
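
As an example of the math, on a 16GB server the ¾ rule gives SQL Server 12GB (12288 MB), leaving 4GB for the OS. Here’s a quick sketch of setting it with sp_configure; adjust the number for your own server.

-- 'max server memory' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 3/4 of 16GB = 12GB = 12288 MB for SQL Server, leaving 4GB for the OS.
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;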

These are not the only defaults you should change, but these two will get you the most bang for your buck. They are easy to change and the implications/consequences of these changes are easy to understand. If you are a beginner with SQL Server, start with these two things; they will make you look like a very smart monkey.

Posted by: sqlswimmer | June 26, 2015

The Whole Pie

Today should be such an exciting day, the day the speakers are announced for the 2015 PASS Summit. But as someone who has been involved with the process for the last 6 Summits, it’s a day when I usually turn my Twitter feed off, because there are always those one or two people who don’t get selected, or who don’t like the selections that were made, and who throw a very public tantrum. Then the insults and accusations start flying.

There has been a long history of criticism of the abstract review process for the PASS Summit. Speakers complain that they don’t get feedback, or that the feedback they get is not helpful, and that the process takes too long. Community members complain that certain topics weren’t included, that the process is a black box, or that there is a “white” and/or “black” list of who gets to present and who doesn’t.

When I started working on the Program Committee six years ago as an abstract review team member, it was a lot of work but very rewarding. When I was asked to step up my game two years ago and become a team lead, it was more work, but even more rewarding. Being part of the process that builds the program for the Summit is a great honor and a tremendous responsibility. So when I was asked to be a Program Manager this year, I had to consider what it would mean. It would mean even more work but I wasn’t sure if it would be more rewarding. But I decided I was up for the challenge and accepted the role.

In years past I was only responsible for a very small piece of the pie. I had my track that I had to think about and that was it. As a review team member, you read the abstracts for the track you are assigned to, you rate them and then you provide the team lead with your rankings. That was it. The rest of the process was really a black box to me.

As a team lead there are a few more pieces of the pie that you get to sample. Not only do you get to read all the abstracts in your track, but you also get to wrangle reviewers (somewhat akin to herding cats) and make tough decisions based on input from your team members. You have to take into account things like topic distribution, session level and speaker distribution (speakers are limited to only two general sessions). Being a team lead is a very time-consuming role. Last year, in an effort to provide better feedback to submitters, a report was introduced that allowed team leads to see the comments reviewers had made. This gave us a bit more insight into what the team members were thinking when they scored an abstract. The flow of the data wasn’t perfect, but it was tremendously helpful to me as a team lead.

As a Program Manager you have to look at the whole pie. You have to do all the things that a team lead does, but now you have to do it for ALL the tracks, not just the one that you were assigned to review. Then you have to set out to “build the program”. Building the program is like a super-mega-charged game of Jenga. You move one piece and it can cause such a ripple effect, you might spend thirty minutes trying to “fill the gap” you just created. I have a whole new appreciation for the process after being a Program Manager.

Whether you are a speaker getting an email, or you are a community member looking over the sessions that were selected, remember that feedback is a GIFT. Everyone loves gifts, so think twice before you speak/blog/tweet about the process. That was a giant pie we just made and we are very proud of it.

Posted by: sqlswimmer | May 28, 2015

Earning my MCSE: Business Intelligence Certification

I earned my MCSE: Business Intelligence Certification on May 27, 2015. It was a long road, but I did it. Back in May of 2013, I wrote about being Certifiable and wasn’t really interested in pursuing any certifications. What made me change my mind, you ask? The short answer is: being a speaker.

Last summer I was invited to speak for the Triad SQL BI User Group in Winston-Salem. I did a very introductory class on Integration Services. I was asked a question that seemed simple, but I didn’t know the answer. That got me thinking, if I don’t know the answer to that, what else don’t I know?

I started doing some research on the question and decided that if I was going to do this research, why not get something more than just an answer? There had to be other things that I didn’t know. I looked at the MCSA certification path again. I looked through the topics that the three exams covered and got really excited. There were so many pieces of the technology that I had never used or hadn’t used in years. This was a real learning opportunity. I decided I needed to get my SQL learnin’ on.

I did a little bit more research on the exams and what study guides were available and discovered the Microsoft Training Kit. It consists of three books, each dedicated to an exam and each book has its own practice exams. It seemed like the best candidate so I ordered it from Amazon and had it delivered in two short days (Thank you Amazon Prime!).

The MCSA certification consists of three exams: 70-461, 70-462 and 70-463. The first exam, 70-461, is all about querying SQL Server. I’ve been querying SQL Server for almost 20 years, so it didn’t take much effort for me to pass this exam. I read through the questions at the end of every lesson in each chapter and the case studies. For the questions I got wrong, I went back and read the lesson, re-answered the questions correctly and that was it. I passed exam 70-461 on December 24, 2014.

Exam 70-462 was a bit more involved for me. It is focused on administering SQL Server. I had never used Always On and it had been years since I worked with replication, so I figured the best place to start was by taking a practice exam to see where I needed to focus. I failed that first practice exam, but it provided me with a road map of what I actually needed to focus on. On January 30, 2015, I passed exam 70-462.

Exam 70-463 is about implementing a data warehouse. I followed the same approach for 70-463 as I did for exam 70-462. That approach paid off and on February 20, 2015, I passed the exam and earned my MCSA for SQL Server 2012.

I was going to stop at the MCSA, but after I completed that with relative ease, I decided I needed a bit more of a challenge. The question came down to MCSE: Database Professional or MCSE: Business Intelligence; since most of the work that I do now is BI related, I decided on the latter. I looked at the topics that were covered in the exams and realized there were going to be some huge gaps. I don’t use Reporting Services in SharePoint integrated mode, nor do I do any work with the Tabular model for Analysis Services. I’ve only been using Analysis Services on a regular basis for about 2 1/2 years now, so I am certainly no expert; I definitely needed some work there as well.

There are two exams needed to earn your MCSE: Business Intelligence after your MCSA: 70-466 and 70-467. Since there are no Training Kits for these last two exams, I decided to take Microsoft up on its Second Shot offer. For a limited time, it allowed you to retake a qualifying exam for free if you failed it the first time. I figured, what did I have to lose? At best, I’d pass the first time around. At worst, I’d fail the exam, but would gain valuable experience in how the exam is structured and what it covers, and learn where I needed to focus my studies. Then I could retake the exam for free. I failed exam 70-466 the first time I took it, as I expected I would. But I did much better than I thought I would, so I knew there was hope of earning my MCSE.

I went out to Microsoft Virtual Academy (MVA) and found the training video for 70-466. I also found the video for Tabular Model training. In addition to MVA, I also used PluralSight and various other books. I studied up on the stuff that I had never seen or worked with before. Then I went through a few refresher videos on the stuff I already knew (but had forgotten) and retook the exam, passing the second time around with flying colors on May 6, 2015.

The last exam, 70-467, was the most nerve-wracking. You basically have to take all your knowledge from the previous four exams and apply it to what seems like an endless barrage of case studies. If you were no good at story problems in school, then this exam is definitely going to challenge you. I passed the exam on my first try, but I really wish I hadn’t waited three weeks between it and 70-466. Since I do not use the Tabular data model or Reporting Services in SharePoint integrated mode, I forgot a lot of the material in the three weeks between the two exams. You are given 150 minutes to take the exam and I finished with only three minutes to spare because I had to rack my brain for those nuggets of information that I hadn’t had the opportunity to use out in the wild. I think that if I had taken the exam within a week of 70-466, I would have done much better and had more time remaining.

Overall it was a good experience. I plan on taking some of the things I learned (and “relearned”) and implementing them at work to provide a better experience for our users. I know they will be grateful and I will know that I’ve done the best possible job that I could for them.

The certification isn’t why I started this journey. I started this journey because there was something that I didn’t know. Don’t let certification be the only reason you take this journey; make it one of the many rewards when you reach the end.

Posted by: sqlswimmer | May 12, 2015

TSQL2sday #66 – Monitoring

A big thank you to Cathrine Wilhelmsen (blog | twitter) for hosting this month’s TSQL2sday party. Monitoring is this month’s topic and it’s a very important one. It could mean the difference between having a job and looking for a job.

When I started working with SQL Server (a long time ago, in a galaxy far far away) there were no commercial monitoring tools available and quite often I would get called or paged (yes, it was a long time ago) in the middle of the night by a very angry boss because there was something “wrong” with the database. Or worse yet, I would get no call at all and show up at work the next morning with a line of angry people waiting for me when I got off the elevator. It only took a couple of these encounters for me to realize that I needed to be much more proactive or change my line of work (I had heard that underwater basket weaving was an easy gig).

I started looking at the reasons I was being called and discovered most of them were things that could easily have been avoided if I had known about them earlier. Things like database and transaction log files filling up, running out of disk space, and processes/queries that were taking longer and longer. Since there were no commercial monitoring tools out there, I decided I needed to essentially roll my own.
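
To give you a flavor of the kind of checks I mean (this is a modern-DMV sketch, not the scripts I wrote back then), here’s a simple query that shows how much free space is left on every volume that hosts a database file, plus the classic log-usage check.

-- Free space on every volume that hosts a database file (SQL Server 2008 R2 SP1 and later).
SELECT DISTINCT
       vs.volume_mount_point,
       vs.total_bytes / 1048576 AS total_mb,
       vs.available_bytes / 1048576 AS free_mb,
       CAST(100.0 * vs.available_bytes / vs.total_bytes AS decimal(5, 1)) AS pct_free
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;

-- Transaction log usage for every database on the instance.
DBCC SQLPERF(LOGSPACE);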

I had to start looking under the covers to find what I was looking for. This gave me even greater insight into how SQL Server worked. Did I mention that this was before Google? I couldn’t just search for easy answers, I had to really dig into the system databases to find what I wanted. This was, in fact, one of the best things that could have happened to me so early in my career as a DBA. I was forced to learn how SQL Server worked on my own.

To this day, I still “carry” around my home-grown monitoring solution in my toolbox. I have updated and expanded it through the years to accommodate newer versions and functionality, and made it more efficient based on both of those things. Not all shops have the budget for monitoring tools, and even if they do, a lot of the time they are only willing to spend that money on production servers, not development or test (don’t get me started, that’s an entirely different blog post).

My little monitoring solution has come in handy over the years because it has afforded me the opportunity to discover what’s under the covers of the newest versions/features of SQL Server and to provide a no-cost, basic monitoring solution to my employers when the budget is tight or non-existent. If you don’t have your own monitoring solution, I would highly recommend you create one, if for nothing more than the reasons I stated above.

Don’t get me wrong, I love the commercial monitoring tools that I have access to now, but knowing the how and why of SQL Server will only make you a better DBA and could possibly mean the difference between having a job and looking for a job.

Posted by: sqlswimmer | April 14, 2015

Becoming a PASS Chapter Leader – Challenge Accepted!

My first real introduction to PASS was back in 1998. I was an “Accidental DBA” and needed some training. I heard about this conference in Chicago and my boss agreed it was the best place to start. Little did I know how that conference would influence the rest of my career.

Fast forward to 2010, after taking a slight hiatus from SQL Server, I was ready to get back into it. What better place than a local user group? After moving over 1,600 miles (a couple of years earlier) from where I started my career in SQL Server, I needed to make some new local connections. That’s when I was introduced to the Triad SQL Server User Group. At the time, Miguel Cebollero (B | T) was the chapter leader. Miguel was so friendly and welcoming and I remember Kevin Goode was presenting a session on CLR and how it really is NOT the devil’s work, but a great tool if used correctly. I was hooked.

In early 2011, Miguel decided to move back to his home state of Florida and handed the reins over to Kevin. Kevin moved the meeting location from Winston-Salem to Greensboro and I attended meetings on a semi-regular basis, but let’s face it, life just gets in the way. By 2012, our meeting attendance started to drop off and a new BI Group had started up in Winston-Salem. It became obvious that something had to change. Kevin had the foresight to work with the new BI group and asked for volunteers to help him out with the chapter. I decided to step up as “Speaker Wrangler” and we worked out a schedule with the new BI group that would be mutually beneficial.

Now it’s 2015 and Kevin has taken on a new role in his company which means he has less and less time to devote to the user group. When Kevin approached me about taking over the user group I was both honored and excited (and a little scared) that he considered handing the reins over to me. I have accepted the challenge and I hope that I can continue to lead the group as well as both Kevin and Miguel did and provide a much needed resource to local SQL Server professionals.

We will be a little more active on social media (we tweet at @TriadSQL and use the #TriadSQL hashtag), and we welcome any suggestions that will make your user group more useful to you.

I’ve been responsible for administering Analysis Services (SSAS) on a regular basis for the past 7-8 years via SQL Server Management Studio (SSMS), but developing is a whole different ball game (and set of tools). I only recently started doing development work with SSAS on a regular basis. Prior to that I dabbled once every two to three years, enough to be able to say, “Yes, I can do that, but I can’t remember exactly how.” Fortunately the tool used for SSAS development is a tool that I am very familiar with, Visual Studio (or as we data professionals now call it, SQL Server Data Tools).

I recently had to change some security for our cube and was tasked with modifying the role membership. Now, this most likely seems like a no-brainer, and it is, as long as you know where the buttons are. What buttons am I referring to? Those precious Add and Remove buttons that allow us to modify the members. When doing development work in SQL Server Data Tools (SSDT), there are all these nice menus and toolbars at the top or on the left-hand side. So when I went to do some work on the role membership, imagine my dismay when I could find no way to add/remove users to the role via the menu or toolbar.

I must have spent a good 10 minutes looking through all the menu options and toolbars, only to discover there is no way to add/remove members in any of those. So of course, freak-out mode ensued. I must have the wrong version of SSDT, I must not have the right permissions to be able to add/remove users, the list goes on and on. I took another look. I mean, look at all that screen real estate. You have that HUGE white space that is just dying to be used and no way to use it.


Then it happened: I saw the light (or buttons, as it were), hiding way down there at the bottom of my screen.


Like I said, this may seem like a no-brainer, especially for someone who does SSAS development work on a daily basis, but for those who are new or only use it every so often, it’s frustrating. After seeing someone ask where the buttons were via #sqlhelp on Twitter, I decided I needed to write this up.

I hope this saves some other unsuspecting soul the frustration that I experienced.

Posted by: sqlswimmer | March 19, 2015

Achievement Unlocked: 24 Hours of PASS Session Submitted

I have finally submitted a session for a speaking engagement. Those who know me know that this is a very BIG deal. I have a horrible fear of public speaking, but I am slowly starting to embrace that fear. Last year I presented at the Triad SQL BI User Group and I didn’t hyperventilate; no one was more surprised than me.

I submitted the same session that I did at the Triad SQL BI User Group, Introduction to Integration Services (SSIS), to the upcoming 24 Hours of PASS: Growing Our Community.

I wanted to be a teacher when I was in college until I discovered my fear of public speaking. Now, more than 20 years later, maybe I can finally reach that goal.

Fingers crossed that my session is accepted.

Posted by: sqlswimmer | February 11, 2015

Automating SSAS Backups

Backing up databases is one of the most important jobs of a DBA. If your data is not safe, your job is not safe. Data is the lifeblood of a DBA. That said, there are so many products out on the market that will help with backing up transactional databases in SQL Server, but when it comes to Analysis Services (SSAS), you are on your own. That’s what I discovered when I became responsible for an SSAS database.

The good thing is that there’s a very simple way to back up your SSAS databases. SQL Server Management Studio (SSMS) has this great feature that allows you to script just about anything you need to do, including backing up an SSAS database.

Here’s how:

  1. Open up SSMS and select the Analysis Services server type in the Registered Servers window.

Connect to Analysis Services

  2. Double-click your server name so that it appears in Object Explorer, then expand the Databases folder. Right-click the database you want to back up and select Back Up…

Right-click your database

  3. The Backup Database dialog opens. Fill out the values appropriate for your environment. I highly recommend encrypting your backup files, just don’t forget what the password is otherwise you will never be able to restore your database.

Backup Database dialog

  4. Instead of clicking the OK button when you are done, click the little arrow next to the Script button at the top of the screen and select Script Action to New Query Window. Click the Cancel button to cancel the Backup Database dialog.

Script backup

  5. You should now have an XMLA Query window in SSMS that contains the commands to back up your database.


Wow, that was easy. Now you can create a SQL Agent job, paste this XMLA query into the job step (be sure to select SQL Server Analysis Services Command as the job step type) and call it a day. But you probably shouldn’t. As you will notice, I selected the Allow file overwrite option in the Backup Database dialog and that is reflected in my XMLA script with the AllowOverwrite tag set to true. So, if I created a SQL Agent job to run every day and used this as my job step, I would never have any backup history; I would only have the most current backup. For some shops, this will be okay; for others, it won’t. In my shop it wasn’t enough. Policy dictated that I keep one week of backups, regardless of whether it was a transactional database or an OLAP database.
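
If you’d rather script the job than click through the SSMS job designer, the basic shell looks something like the sketch below. The job name and SSAS server name are placeholders, and the XMLA from step 5 gets pasted into @command; add a schedule with sp_add_jobschedule to taste.

USE msdb;

EXEC dbo.sp_add_job
     @job_name = N'SSAS - Backup Databasename';   -- hypothetical job name

EXEC dbo.sp_add_jobstep
     @job_name  = N'SSAS - Backup Databasename',
     @step_name = N'Backup SSAS database',
     @subsystem = N'ANALYSISCOMMAND',   -- the "SQL Server Analysis Services Command" step type
     @server    = N'MySSASServer',      -- hypothetical SSAS instance name
     @command   = N'<Backup ...>...</Backup>';   -- paste the XMLA generated in step 5 here

EXEC dbo.sp_add_jobserver @job_name = N'SSAS - Backup Databasename';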

Luckily, PowerShell and I have become good friends. I was able to quickly create two additional steps in my SQL Agent job that use PowerShell commands to achieve my goal of maintaining one week of backups. I created one step to rename the backup file by appending the current date to the file name, and another step to clean up any old backup files so that I didn’t fill up my hard drive with backup files. Here are my scripts.

Rename file:

# Switch to a local drive first; the SQL Agent PowerShell step fails if the
# current location is a UNC path (see the note below)
cd c:
# Append today's date (yyyymmdd) to the backup file name
$today = get-date -uformat "%Y%m%d"
$oldname = "\\uncfilepath\Databasename.abf"
$filepath = "\\uncfilepath\"
$newname = $filepath + "Databasename_" + $today + ".abf"
rename-item $oldname $newname


Clean up old files:

# Same local-drive switch as in the rename step
cd c:
# Remove any .abf backup file older than six days
$RetentionDate = (Get-Date).AddDays(-6)
$FilePath = "\\uncfilepath"
Get-ChildItem $FilePath -recurse -include "*.abf" | Where {($_.CreationTime -le $RetentionDate)} | Remove-Item -Force


I won’t go into detail about my PowerShell scripts here; they’re mostly self-explanatory, with the exception of the first line in each, cd c:. I discovered that since I was using a UNC path, I needed to add this little tidbit to the beginning of each script, otherwise the steps would fail. This is because the version of PowerShell that is invoked inside a SQL Agent job is not EXACTLY the same as the one invoked outside of SQL Server.

Posted by: sqlswimmer | February 10, 2015

Managing Security – TSQL2sday #63

A big thank you goes out to Kenneth Fisher ( b | t ) for hosting this month’s TSQL2sday party. Security is a big deal. How many times have you opened the paper (I’m dating myself, I know – no one reads an actual newspaper anymore, it’s all online now) in the last 6 months and seen a story about another security breach, more records compromised or flat-out stolen? Too many. While securing your data is probably the key to keeping your current employment status, there’s also a piece of security that is quite often overlooked and could be the reason for a resume-generating event: recovering from a failed server when you don’t use any of the HA features that are now available.


The scenario:
Your production server has failed and you don’t use any of those new fancy HA features like Always On Availability Groups, Log Shipping or even Database Mirroring. Your server hosts a standalone instance for the HR/Payroll department. Payroll must be processed in the next two hours or your company will be out of compliance with Federal Regulations and face heavy fines, not to mention all the really mad employees who won’t get their paychecks on time. I don’t know about you, but I do NOT want to be responsible for every employee not getting a paycheck, including myself.

You have a good backup plan in place: you take full, differential and log backups on a schedule that meets the minimum required data-loss SLA and send those backups to a remote SAN data store. Your sysadmin stands up a new standalone server for you in 30 minutes. You install and configure SQL Server in about 60 minutes (those pesky service packs and cumulative updates can take quite a bit of time). Now you are left with 30 minutes to get your databases restored and functioning. No sweat! Easy as 1..2..3, right? Wrong!
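
For reference, the restore itself is just the standard full/differential/log chain. A rough sketch, with a made-up database name and backup paths, restoring everything WITH NORECOVERY until the final log backup:

-- Most recent full backup first.
RESTORE DATABASE HRPayroll
FROM DISK = N'\\sanstore\backups\HRPayroll_full.bak'
WITH NORECOVERY, REPLACE;

-- Most recent differential.
RESTORE DATABASE HRPayroll
FROM DISK = N'\\sanstore\backups\HRPayroll_diff.bak'
WITH NORECOVERY;

-- Every log backup taken since that differential, in order; RECOVERY on the last one.
RESTORE LOG HRPayroll
FROM DISK = N'\\sanstore\backups\HRPayroll_log_01.trn'
WITH NORECOVERY;

RESTORE LOG HRPayroll
FROM DISK = N'\\sanstore\backups\HRPayroll_log_02.trn'
WITH RECOVERY;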

You restore your database, only to discover that all your logins no longer exist on your brand new server. No problem, just recreate the logins and give them brand new passwords (SQL Authentication). All will be right with the world. You give your HR/Payroll department the okay to proceed and you catch your breath with 20 minutes to spare. The phone rings 5 minutes later; it’s HR/Payroll and it’s not working. They are getting invalid login errors. You have that momentary flashback to when you helped with the application install 4 years ago – the vendor hard-coded the password into their application code, so you can’t just change it or give it a new password. That’s when you remember that you created a job to script the logins with their passwords on a weekly basis and saved the results to a file on that same remote SAN data store as the backups. Hallelujah! You find your script on the remote SAN data store, clean up the logins you created, then execute the script with the logins and their passwords. HR/Payroll is back up and running with 4 minutes to spare.

Paychecks for everyone!

While some of this may seem far-fetched, it’s based on an actual incident very early in my career. I may have embellished a little, but you get the point. You need to make sure you can recreate any login on your server at any time due to disaster/failure. If you can’t, you may just be looking for a new job.

To this day I still script the logins on all my servers on a weekly basis. I store that file in a secure location on a remote server. I’ve never had to use one since this original incident, but it’s nice to know that I can recreate the logins if I ever need to. Can you?
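
If you want to build something similar, here’s a minimal sketch of the idea for SQL-authenticated logins (it’s the same spirit as Microsoft’s old sp_help_revlogin script): generate CREATE LOGIN statements that preserve each login’s original SID and password hash, then save the output to a file on that secure remote server on a schedule.

-- Generate CREATE LOGIN statements for SQL logins, keeping the original
-- SID and password hash so hard-coded application passwords keep working.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = '
     + CONVERT(varchar(514), CONVERT(varbinary(256), LOGINPROPERTY(name, 'PasswordHash')), 1)
     + ' HASHED, SID = ' + CONVERT(varchar(514), sid, 1)
     + ', CHECK_POLICY = OFF;'
FROM sys.sql_logins
WHERE name NOT LIKE '##%';   -- skip the internal certificate-based logins

Windows logins are even easier since they carry no password; a plain CREATE LOGIN [DOMAIN\user] FROM WINDOWS is all you need to recreate them.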
