Posted by: sqlswimmer | June 30, 2016

SQL Server 2016, Database Mail and .Net 3.5 Framework

There were so many cheers when Microsoft announced that the .NET 3.5 Framework was no longer a prerequisite for installing SQL Server, starting with SQL Server 2016.  Folks were so excited, until they started testing certain pieces of the product, specifically Database Mail.  That's right, if you want to use Database Mail in SQL Server 2016, you have to install the .NET 3.5 Framework.

If you are installing SQL Server 2016 on Windows Server 2012 R2, this might be a problem for two reasons.  First, the .NET 3.5 Framework is no longer included by default when spinning up a new install.  Second, you will get no errors when testing Database Mail in SQL Server 2016.  Mail will be queued but never sent, and there will be no errors in the Database Mail log, the SQL Agent log or even the SQL Server error log.
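If you want to see the symptom for yourself, the standard Database Mail views in msdb tell the story.  Here's a minimal sketch, trimmed to the interesting columns:

-- Mail stuck in an 'unsent' state is the telltale symptom.
SELECT mailitem_id, recipients, subject, sent_status, send_request_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;

-- And the event log will typically be silent.
SELECT log_date, event_type, description
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC;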

So if you've gone through all the usual steps to troubleshoot Database Mail (or these steps) in SQL Server 2016 to no avail, be sure to double check that the .NET 3.5 Framework has been installed.  If not, you will need to install it, then apply ALL the patches for it.
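If you'd rather check from a query window than click through Server Manager, the registry value below is the standard marker for the .NET 3.5 Framework.  This sketch uses the undocumented xp_regread extended procedure, so treat it as a convenience, not a supported interface:

-- Returns 1 when the .NET 3.5 Framework is installed on the server.
DECLARE @installed INT;
EXEC master.dbo.xp_regread
    @rootkey    = N'HKEY_LOCAL_MACHINE',
    @key        = N'SOFTWARE\Microsoft\NET Framework Setup\NDP\v3.5',
    @value_name = N'Install',
    @value      = @installed OUTPUT;
SELECT Net35Installed = @installed;  -- NULL or 0 means go install it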

Posted by: sqlswimmer | June 23, 2016

Struggling With Due Diligence

It's been a while since my last post.  There are various reasons for my absence: work, illness, having our house rewired, etc., but one of the biggest is my struggle with how a few folks in the SQL community treat the PASS organization and those who volunteer their time to PASS.

As many of you know, I do a lot of volunteer work for PASS.  I do this because it's an organization that I truly believe in.  It was created by the community, for the community.  It's a place where data professionals can exchange knowledge freely, no strings attached.  To my knowledge there is no other community in the IT world quite like it; we even have our own hashtag on Twitter, #sqlfamily.  But lately I've been struggling with how a few community members have reacted to policies/procedures/contracts.  Essentially starting a fire, pouring gas on it and walking away.

I've been involved with the Program Committee (the folks that select the content for the annual Summit) since 2010.  I took over my local PASS chapter when the existing chapter leader stepped down.  I help out with local/regional SQL Saturdays when my schedule allows.  I moderate 24 Hours of PASS when my schedule allows.  I volunteer on site during the Summit as an Ambassador.  I've served on the NomCom (2012).  You get the idea: I am a true believer, and not just paying lip service, so when someone "attacks" an organization that I truly believe in, I get more than a little irritated.

One thing I have learned over the years of being a DBA is that you need to be able to prove a problem is NOT yours by exploring all the other areas that could be causing it.  You have to look at it from all angles, not just the DBA angle.

We've all been there.  A customer calls to say an application is slow, and a trouble ticket is automatically created and assigned to the DBA team because the application uses a database.  This is somewhat akin to blaming a car's performance problems on the gas: all cars use gas, so it must be the gas.  It's tiresome and frustrating, but we go through the motions to prove the issue is not ours.  In the financial world, it's called due diligence.

This kind of due diligence has proven useful in other areas of my life, both personal and professional.  Lately several community members seem to have forgotten the kind of influence they carry with the rest of the community.  I would ask that those in the community please do their own due diligence BEFORE posting a blog, sending a tweet or starting the good old-fashioned rumor mill.

You will notice that I did not name any names.  That would really defeat the purpose of this post.  I don't want to start a fire with this post, allow gasoline to be poured on it in the comments (not that there would be tons, because I don't carry a lot of influence in our community – not a slight, just stating a fact), and then walk away.

I want you ALL to think about what you post, tweet or say BEFORE you do it.

Posted by: sqlswimmer | February 15, 2016

Did you know?

Did you know that the call for speakers for PASS Summit 2016 opened on February 3, 2016?

Did you know that the call for speakers for PASS Summit 2016 ends on March 2, 2016?

Did you know that there’s a Speaker Resource Page?

Did you know that you can have your abstract reviewed confidentially BEFORE you submit it for the Summit?

Did you know that your abstract is being reviewed by members of the SQL community?

Did you know that this is the second year the Program team has offered this service?

Did you know that only 32 people took advantage of this service last year?

Did you know that even experienced speakers use this service?

Did you know that you only have until February 26, 2016, to submit an abstract for confidential coaching?

Did you know that you have nothing to lose?

So what are you waiting for?  You've been thinking about submitting for a while now but were unsure if your abstract was good enough.  Well, now you have no excuse: use the confidential coaching service and find out.

Posted by: sqlswimmer | January 25, 2016

What is Power BI?

One of the "benefits" of being a chapter leader is that sometimes it means doing a presentation yourself when you can't get a speaker.  I fell into this exact scenario for February's meeting of Triad SQL.  I was trying to figure out what to present when the planets aligned: I read the #EntryLevel post in this month's PASS Connector News, and my boss asked me about Power BI.  He wanted to know more about it and whether it was something we could use.

I decided to put a presentation together to answer those questions.  This post is basically the flattening out of my PowerPoint presentation.

The What/Who/Why/Flavors of Power BI

What is Power BI?

When I Googled (yes, I used that as a verb!) "What is Power BI", this is what I got: "Power BI is an amazing business analytics service that enables anyone to visualize and analyze data."  This sounds cool, but isn't all that helpful.  After further research, I found this definition, courtesy of powerbi.microsoft.com:

Power BI is a cloud-based business analytics service that enables anyone to visualize and analyze data with greater speed, efficiency, and understanding. It connects users to a broad range of data through easy-to-use dashboards, interactive reports, and compelling visualizations that bring data to life.

Why use Power BI?

There are lots of reasons to use Power BI other than "it's so cool."  For instance, Power BI makes it easy to see, at a glance, all the information needed to make decisions.  It also allows you to monitor the most important information about your business.  Power BI makes collaboration easy, and when I say easy I mean EZ!  You can also create customized Dashboards tailored to those C-Suite folks, or make a completely different dashboard based on the same data for those that actually do the work.

Who can use Power BI?

Anyone who has a work or school email address can use Power BI.  Sorry, no personal email addresses.  Also no government (.gov) or military addresses (.mil).

Flavors of Power BI

There are two flavors of Power BI, Free and Pro.  You can do everything with Pro that you can do with Free, plus a few other things.  Here's a little comparison of the two; there are more differences, but these are the big ones.

Free

- Data refresh frequency: Daily
- Data capacity limit: 1GB/user
- Streaming rate: 10K rows/hour
- Data sources: limited to content packs for services and importing files

Pro

- Data refresh frequency: Hourly
- Data capacity limit: 10GB/user
- Streaming rate: 10M rows/hour
- Data sources: the free ones, plus direct query datasets and on-premises data
- Collaboration with content packs

As of January 21, 2016, the Pro flavor goes for $9.99 USD per month per user.

Also, there is the previous flavor of Power BI, referred to as Power BI for Office 365, which will be deprecated on March 31, 2016, so I am not including it in this post.

The How of Power BI

The building blocks of Power BI are Dashboards, Reports & Datasets.

Dashboards

Dashboards are made of Tiles that contain a single visualization created from the data of one or more underlying Datasets.  When I first read this, all I heard was "blah blah blah Datasets".  What it means is simply this: a dashboard is a collection of reports that are all displayed together for a specific reason.  It could be that you want all your sales guys to see different views of how they are doing compared to budget/forecast, or it could be that you want to give your C-Suite people a quick overview of how the company is doing as a whole.  You can tailor these dashboards to whatever suits your purpose.  The only reference to a limit on the number of dashboards I could find was on the Office 365 site, which listed it as 100 per user or group.  I'm thinking the old adage "just because you can doesn't mean you should" applies here, though.

Reports

A report is one or more pages of visualizations.  Reports can be created from scratch within Power BI or Power BI Desktop.  They are very easy to create: you simply click on the type of visualization you want to display, then select the data to be used.  One caveat I will mention here is to be sure your data is formatted so that it can be easily consumed by Power BI.  See this link for tips and tricks on how to build a "proper dataset" for Power BI.  Just as with Dashboards, there is a limit on the maximum number of reports, and it is the same: 100 per user or group.

Datasets

A Dataset is something that you import or connect to.  It contains the actual data you want to translate into visualizations.  Right now the file types you can import into Power BI are limited to Excel, comma-separated values (.csv) and Power BI Desktop files (.pbix).  As for connecting to data sources, you can choose from the many content packs available via the Power BI site, like Google Analytics, Bing, MailChimp, Salesforce and GitHub, just to name a few, or you can connect to a database.  As with anything that sounds too good to be true, you are limited in the databases you can connect to.  Right now that list is short: Azure SQL Database, Azure SQL Data Warehouse and SQL Server Analysis Services (tabular model only).  There is a 250MB limit on the size of a dataset that you can import into Power BI and a limit of 100 Datasets.

References

https://powerbi.microsoft.com/en-us/documentation/powerbi-service-get-started/

https://powerbi.microsoft.com/en-us/documentation/powerbi-videos/

https://powerbi.microsoft.com/en-us/documentation/powerbi-webinars/

https://technet.microsoft.com/library/mt282164.aspx

That’s it.  I hope this post provided a little bit of insight into Power BI and whether it’s something that can be useful to you and/or your company.  Check out the following links if you want a deeper dive into Power BI.

Power BI Blog

Melissa Coates Blog

Reza Rad Blog

Chris Webb Blog

Posted by: sqlswimmer | January 22, 2016

What Do You Want?

It's that time of year: planning for PASS Summit 2016.  We've already put out the Call for Volunteers, which closes today, January 22, 2016, so get those applications in and be part of the team that helps determine content for the Summit.  Don't want to volunteer but still want to help determine content for the Summit?  Then take the survey to tell us what you want to see.  It's a quick survey (less than five minutes), and you only have until Wednesday, January 27, 2016, to tell us what you want.  What are you waiting for?  Get to it!  You may even win a USB drive of the session recordings.

What would you like to see at Summit 2016?

Posted by: sqlswimmer | January 8, 2016

We Want You!

It seems like PASS Summit 2015 was just yesterday, and here we are again, getting ready for Summit 2016.  This will be my seventh year as a member of the Program Committee and my second year as a Program Manager.  If you have ever thought about volunteering for PASS, this is a wonderful opportunity.  We need lots of volunteers to assist with everything from reading abstracts to special projects so that we can make Summit 2016 a great experience for the entire community.  Summit 2016 is still over nine months away, but the work starts now.

The call for volunteers just went out this afternoon and we want you.  Use the link below to fill out the volunteer application.

PASS Summit 2016 Call for Volunteers

Posted by: sqlswimmer | November 20, 2015

Scripts – A Living History

As a DBA, I have a collection of scripts that I use for anything from auto-fixing logins to seeing who has the DAC.  Since I've been a DBA for a while (yeah, a while, we'll go with that), I have quite the collection of scripts, and I am constantly adding to it.
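To make that concrete, here are minimal sketches of two such scripts.  The DAC check relies on the default endpoint name, and 'SomeUser' below is just a placeholder:

-- Who is holding the Dedicated Admin Connection (DAC)?
SELECT s.session_id, s.login_name, s.host_name, s.program_name
FROM sys.dm_exec_sessions AS s
JOIN sys.endpoints AS e ON s.endpoint_id = e.endpoint_id
WHERE e.name = N'Dedicated Admin Connection';

-- Re-map an orphaned database user to its same-named login.
EXEC sp_change_users_login @Action = 'Auto_Fix', @UserNamePattern = 'SomeUser';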

In the Prehistoric days of dinosaurs and floppy disks, I used to keep a backup copy of them on a 3 1/2" floppy.  This was convenient and portable, so if I ever changed jobs, I could take my scripts with me. 

Then we entered the Golden Age of writable CDs and I could burn them to a data CD.  A CD was still portable but a little more durable than a 3 1/2" floppy, and I didn't have to worry about keeping it away from magnets.

Carrying a CD around may have been more durable, but it certainly wasn't more convenient.  Enter the Renaissance Age of USB/thumb drives.  Holy cow, I could copy all my scripts to a USB drive, fit it in my pocket and take it with me everywhere I went.  Now that's convenient!

Enter the Industrial Age and we got smarter about how we did things.  Hello Google Drive.  No more having to carry around anything but lint in my pockets.  As long as I had access to the internet, I had access to my scripts.  Even if the internet were temporarily unavailable, I could still access the scripts on my local hard drive. 

But then a funny thing happened: I modified one of my scripts to accommodate a specific version of SQL Server and accidentally overwrote the original file.  We've all been there, that moment when you click the Save button instead of Save As, all the expletives rumbling around in your head because now you have to remember what the script was before you overwrote it.  Enter the Space Age, the days of redundancy checks and fail safes.  We in the development community call it source control.  When Microsoft announced its TFS Online offering three years ago, I couldn't put my scripts in the cloud fast enough.  Of course the name has changed, but the idea remains the same: source control in the cloud.  The great thing is that you can actually use it for free (for up to five people).

Will you learn from history and protect your scripts or are you doomed to repeat it? 

Posted by: sqlswimmer | November 19, 2015

Aggregation Design is Back!

If you use SQL Server Data Tools (SSDT) and SQL Server Data Tools – BI (SSDT-BI) for your SQL Server 2012 development, then you have no doubt been frustrated, like me, by the fact that if you have both installed, you no longer have the ability to create new Partitions and Aggregation Designs when working with the SSAS MOLAP model.  You can find others that have run into this issue here.

The solution I found was to install both SSDT and SSDT-BI on my laptop, then keep a VM with just SSDT-BI on it.  That way, when I needed to work on Partitions or Aggregation Designs (which is very infrequent), I could just fire up the VM and be off and running.

Well, with SQL Server 2016 development we get to use Visual Studio 2015, and SSDT is now included in that install (although you do not get the BI project types; more on that here), so no more having to keep separate machines.  I tested CTP 3, and Partitions and Aggregation Designs work once again.  Hooray!

[Screenshot: Aggregation Design]

Posted by: sqlswimmer | November 18, 2015

One Tool to Rule Them All – Almost

There were so many cool announcements at the PASS Summit this year, but one of my favorites was the "One Tool to Rule Them All".  The SQL Server Data Tools (SSDT) team and the Visual Studio (VS) team have finally teamed up to give us one tool to do all our development work for Databases, SSIS, SSAS & SSRS.  No more will we have to install Visual Studio Shell, SSDT, SSDT-BI and, for those source-control-minded folks (which should be everyone!) that use Team Foundation Server (TFS), Team Explorer.  For SQL Server 2016 development we can do one install of Visual Studio 2015 and call it a day.  Well, almost.

SSDT Install

I was so excited when I got back from Summit that I downloaded SSDT (CTP3) from here.  I was so happy to see the install screen.

[Screenshot: SSDT install screen]

There they were, in all their glory, all the SQL Server project types that I needed. No more having to download multiple install files. Oh happy day!

After the install completed, I was a bit dismayed to discover that it took 3GB of disk space, but I guess that's par for the course anymore.

Visual Studio Install

Next I wanted to see if you got all these same project types with an install of Visual Studio.  They announced at Summit that "SSDT" would now be "included" with Visual Studio.  So I went out and downloaded Visual Studio (CTP3, Community Edition, i.e., free) from here.  And look what shows up on the install features list, there it is in black and white: Microsoft SQL Server Data Tools.  Almost too good to be true.

[Screenshot: Visual Studio install features list]

Well, we all know that if something seems too good to be true, then it usually is. This is no exception.  Let’s see if you can pick out the reason for my disappointment in the picture below.

[Screenshot: Visual Studio project types]

That's right, the only SQL Server project types that are installed with Visual Studio are database projects.  No SSIS, no SSAS & no SSRS.  That was very disappointing.  Also note that it installed the templates for Visual C#, Visual Basic, etc., when the only feature I requested was SQL Server Data Tools.  I guess that's why this install took 5GB of disk space as opposed to the 3GB that SSDT required.

The good thing about the new Visual Studio is that if you use TFS as your source control, you no longer have to download the separate TFS Team Explorer; it is now built into Visual Studio.  No additional installs are required.

[Screenshot: Visual Studio Team menu]

Right “out of the box”, you get the Team menu item. However, this is NOT included in the SSDT install. I guess someone thinks we don’t really need to source control our SQL Server projects <sigh>.

Almost One Tool

Because I use TFS as my source control, I still have to do two installs: SSDT to get ALL the SQL Server project types, AND Visual Studio so I can add all my SQL Server project types to source control.

This is definitely better than what we have to do now if we are doing development work prior to SQL Server 2016, but it’s not “One Tool to Rule Them All” yet. I’m hoping that since this is a CTP, the final products will contain “all the things”, but I certainly won’t hold my breath.

Now I’m off to test if they’ve overcome the issue of database projects playing nicely with SSAS projects. For those that use the multidimensional model with partitioning, you know exactly what I’m talking about. I’ll keep you posted with my results.

Posted by: sqlswimmer | November 5, 2015

My Build and Deploy Process (as Requested by Bill Fellows)

Recently I attended RedGate's SQL in the City event in Seattle, WA.  I was in Seattle for the annual PASS Summit; you can read about my Summit adventures here.  While at RedGate's event, I attended a session aimed at SQL Server Data Tools (SSDT) users.  RedGate wanted to get a better handle on what pain points we had in SSDT with respect to source control.  I use Team Foundation Server (TFS) as my source control product, and it ties in very nicely with SSDT.

After this discussion, Bill Fellows (B | T), asked if I would be willing to blog or speak about my own build and deploy process for databases. Well, given that I am so NOT a speaker type, the seed for this blog post was planted.

I will not be diving into technical detail on how to do all these things (that would be a very large book), but rather giving an overview of the features of TFS, why I think they are important and how I use them.  Think of this as the 50,000-foot view of my build and deploy process.

Integration with SSDT (Visual Studio)

Since I don’t want to use umpteen gazillion tools for development, I want my source control to integrate seamlessly with Visual Studio. TFS does this better than any other product I’ve used, probably because they are both Microsoft products. This way I don’t have to use multiple IDEs to work on SSIS, SSRS, SSAS and database development projects. I have one tool with the same source control experience for all.

Design your Branches Accordingly

TFS uses the Branch paradigm to split code out for different development efforts.  I like this paradigm; it's easy to visualize and makes sense to me.  Designing your branches is probably the most important part of the source control process.  Think of this as your data model: if you get it wrong, you will pay dearly for it in the end.  Think about how your organization is structured and how your code moves through the development process.  What environments do you have: Development, QA, Staging, Hotfix, etc.?  How does your code move through those environments?  Is it strictly one-way, or can your code move in more than one direction?

Gated Check-ins

Because no matter how many times you tell your developers to do a build locally before checking in their changes, someone will inevitably forget.  The last thing you want is bad code getting into your code base.  Then you're left with all your developers sitting around while changes are backed out or corrected, and we all know what happens when developers sit around with idle hands.  Gives me nightmares just thinking about it.

Automated Builds

This is so important.  You most likely have more than one developer working on code, and you want to make sure that all the changes they are making are not stomping all over each other and breaking things.  Just because developers can get their code past the gated check-in doesn't mean it won't break something else.  You should actually be doing this for all your environments, not just development.  In a large shop I recently worked in, we scheduled our automated builds twice per day.  The first was at 3 a.m., which allowed enough time for correction before staff came in if a build failed.  The second was at lunch time, which gave us a "sneak peek" at the big picture before the nightly processes kicked off.  While TFS does provide some default build templates, many of us have such custom applications and database projects that you may have to learn how to write XAML.  I did.

Build Notifications

This is one of my favorite “tattle tale” features of TFS. You can set up notifications to find out when things are checked in successfully, when check-ins fail, when builds fail, all kinds of things. Use this feature. I can’t stress this enough, USE THIS FEATURE!

Power Tools

While TFS has some great features, some of them are a bit hard to navigate or use.  This is where Power Tools comes in.  It's freely available for download from MSDN.  It puts some great features just a click away, instead of requiring you to write obtrusive custom code to get what you want – like seeing who has what checked out in a branch, wildcard searching, copying a query, or cloning builds.  The list is quite extensive.

Default Settings

All of these things don't do a lot of good unless you change the default settings for source control in SSDT.  One of the biggest bang-for-your-buck settings is to automatically get the latest code when you open a solution.  By default this is not enabled (silly, I know, but it's not).  The other setting is to check out objects automatically when they are edited.  These two settings will make your source control life much easier.

Wrapping it up

I'm not going to lie, getting this all set up in TFS is no small effort.  But this is your livelihood; treat it as such.  Do your research into how your company's processes currently work, then compare that to how you want them to work.  Once you have all that, you can come up with a build and deploy process that works for you.

Good luck!
