Posted by: sqlswimmer | January 25, 2016

What is Power BI?

One of the “benefits” of being a chapter leader is that sometimes it means doing a presentation yourself when you can’t get a speaker.  I fell into this exact scenario for February’s meeting of Triad SQL.  I was trying to figure out what to present when the planets aligned: I read the #EntryLevel post in this month’s PASS Connector News, and my boss asked me about Power BI.  He wanted to know more about it and whether it was something we could use.

I decided to put a presentation together to answer those questions.  This post is basically the flattening out of my PowerPoint presentation.

The What/Who/Why/Flavors of Power BI

What is Power BI?

When I Googled (yes, I used that as a verb!) “What is Power BI”, this is what I got: “Power BI is an amazing business analytics service that enables anyone to visualize and analyze data.”  That sounds cool, but it isn’t all that helpful.  After further research, I found this definition, courtesy of powerbi.microsoft.com:

Power BI is a cloud-based business analytics service that enables anyone to visualize and analyze data with greater speed, efficiency, and understanding. It connects users to a broad range of data through easy-to-use dashboards, interactive reports, and compelling visualizations that bring data to life.

Why use Power BI?

There are lots of reasons to use Power BI other than “it’s so cool.”  For instance, Power BI makes it easy to see, at a glance, all the information needed to make decisions.  It also allows you to monitor the most important information about your business.  Power BI makes collaboration easy, and when I say easy I mean EZ!  You can also create customized dashboards tailored to those C-suite folks, or build a completely different dashboard from the same data for those who actually do the work.

Who can use Power BI?

Anyone who has a work or school email address can use Power BI.  Sorry, no personal email addresses, and no government (.gov) or military (.mil) addresses.

Flavors of Power BI

There are two flavors of Power BI, Free and Pro.  You can do everything with Pro that you can do with Free, plus a few other things.  Here’s a little comparison of the two; there are more differences, but these are the big ones.

Free

- Data refresh frequency: Daily
- Data capacity limit: 1 GB/user
- Streaming rate: 10K rows/hour
- Data sources: limited to content packs for services and importing files

Pro

- Data refresh frequency: Hourly
- Data capacity limit: 10 GB/user
- Streaming rate: 10M rows/hour
- Data sources: everything in Free, plus direct query datasets and on-premises data
- Collaboration with content packs

As of January 21, 2016, the Pro flavor goes for $9.99 USD per month per user.

There is also a previous version/flavor, Power BI for Office 365, which will be deprecated on March 31, 2016, so I am not including it in this post.

The How of Power BI

The building blocks of Power BI are Dashboards, Reports & Datasets.

Dashboards

Dashboards are made up of tiles, each of which contains a single visualization created from the data of one or more underlying Datasets.  When I first read this, all I heard was “blah blah blah Datasets”.  What it means is simply this: a dashboard is a collection of reports that are all displayed together for a specific reason.  It could be that you want all your sales guys to see different views of how they are doing compared to budget/forecast, or that you want to give your C-suite people a quick overview of how the company is doing as a whole.  You can tailor these dashboards to whatever suits your purpose.  The only reference to a limit on the number of dashboards I could find was on the Office 365 site, where it was listed as 100 per user or group.  I’m thinking the old adage “just because you can doesn’t mean you should” applies here, though.

Reports

A report is one or more pages of visualizations.  Reports can be created from scratch within Power BI or Power BI Desktop.  They are very easy to create: you simply click on the type of visualization you want to display, then select the data to be used.  One caveat I will mention here is to be sure your data is formatted so that it can be easily consumed by Power BI.  See this link for tips and tricks on how to build a “proper dataset” for Power BI.  Just as with Dashboards, there is a limit on the number of reports, and it is the same: 100 per user or group.
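To make that tip a bit more concrete, here is a minimal sketch, in T-SQL, of the kind of flat, tidy result set that Power BI consumes most easily: one row per fact, descriptive column names, and no merged headers or embedded subtotals.  The dbo.Sales table and its columns are hypothetical, purely for illustration.

    -- A flat, analysis-ready shape: one row per sale, clear column names,
    -- no pivoted month columns or subtotal rows mixed in with the data.
    -- dbo.Sales and its columns are hypothetical.
    SELECT
        s.OrderDate,
        s.Region,
        s.ProductName,
        s.Quantity,
        s.SalesAmount
    FROM dbo.Sales AS s
    WHERE s.OrderDate >= '2015-01-01';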

Datasets

A Dataset is something that you import or connect to.  It contains the actual data you want to translate into visualizations.  Right now the file types you can import into Power BI are limited to Excel, comma-separated values (.csv) and Power BI Desktop (.pbix) files.  As far as connecting to data sources goes, you can choose from many of the content packs available via the Power BI site, like Google Analytics, Bing, MailChimp, Salesforce and GitHub, just to name a few, or you can connect to a database.  As with anything that sounds too good to be true, you are limited in the databases you can connect to.  Right now the list is short: Azure SQL Database, Azure SQL Data Warehouse and SQL Server Analysis Services (tabular model only).  There is a 250MB limit on the size of a dataset that you can import into Power BI and a limit of 100 Datasets.

References

https://powerbi.microsoft.com/en-us/documentation/powerbi-service-get-started/

https://powerbi.microsoft.com/en-us/documentation/powerbi-videos/

https://powerbi.microsoft.com/en-us/documentation/powerbi-webinars/

https://technet.microsoft.com/library/mt282164.aspx

That’s it.  I hope this post provided a little bit of insight into Power BI and whether it’s something that can be useful to you and/or your company.  Check out the following links if you want a deeper dive into Power BI.

Power BI Blog

Melissa Coates Blog

Reza Rad Blog

Chris Webb Blog

Posted by: sqlswimmer | January 22, 2016

What Do You Want?

It’s that time of year: planning for PASS Summit 2016.  We’ve already put out the Call for Volunteers, which closes today, January 22, 2016, so get those applications in and be part of the team that helps determine content for the Summit.  Don’t want to volunteer but still want to help determine content?  Then take the survey to tell us what you want to see.  It’s a quick survey, less than five minutes, and you only have until Wednesday, January 27, 2016, to tell us what you want.  What are you waiting for?  Get to it!  You may even win a USB drive of the session recordings.

What would you like to see at Summit 2016?

Posted by: sqlswimmer | January 8, 2016

We Want You!

It seems like PASS Summit 2015 was just yesterday and here we are again, getting ready for Summit 2016 already.  This will be my seventh year of being a member of the Program Committee and my second year as a Program Manager.  If you have ever thought about volunteering for PASS this is a wonderful opportunity.  We need lots of volunteers to assist with everything from reading abstracts to special projects so that we can make Summit 2016 a great experience for the entire community.  Summit 2016 is still over nine months away but the work starts now.

The call for volunteers just went out this afternoon and we want you.  Use the link below to fill out the volunteer application.

PASS Summit 2016 Call for Volunteers

Posted by: sqlswimmer | November 20, 2015

Scripts – A Living History

As a DBA, I have a collection of scripts that I use for anything from auto-fixing logins to seeing who has the DAC.  Since I’ve been a DBA for a while (yeah, a while, we’ll go with that), I have quite the collection of scripts and I am constantly adding to it.
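For anyone curious what those look like, here is a minimal T-SQL sketch of the two examples I just mentioned.  It is only an illustration (the user and login names are hypothetical), not the actual scripts from my collection.

    -- Report orphaned database users, then remap one to its matching login.
    EXEC sys.sp_change_users_login @Action = 'Report';
    -- ALTER USER [SomeUser] WITH LOGIN = [SomeLogin];   -- hypothetical names
    -- Or auto-fix a user whose login of the same name already exists:
    -- EXEC sys.sp_change_users_login @Action = 'Auto_Fix', @UserNamePattern = N'SomeUser';

    -- See which session currently holds the Dedicated Admin Connection (DAC).
    SELECT s.session_id, s.login_name, s.status
    FROM sys.dm_exec_sessions AS s
    JOIN sys.endpoints AS e
        ON s.endpoint_id = e.endpoint_id
    WHERE e.name = N'Dedicated Admin Connection';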

In the Prehistoric days of dinosaurs and floppy disks, I used to keep a backup copy of them on a 3 1/2" floppy.  This was convenient and portable, so if I ever changed jobs, I could take my scripts with me. 

Then we entered the Golden Age of writable CDs and I could burn them to a data CD.  Still portable but a little more durable than a 3 1/2" floppy; at least I didn’t have to worry about keeping my CD away from magnets.

Carrying a CD around may have been more durable, but it certainly wasn’t more convenient.  Enter the Renaissance Age of USB/thumb drives.  Holy cow, I could copy all my scripts to a USB drive and fit it in my pocket; I could take it with me everywhere I went.  Now that’s convenient!

Enter the Industrial Age and we got smarter about how we did things.  Hello Google Drive.  No more having to carry around anything but lint in my pockets.  As long as I had access to the internet, I had access to my scripts.  Even if the internet were temporarily unavailable, I could still access the scripts on my local hard drive. 

But then a funny thing happened: I modified one of my scripts to accommodate a specific version of SQL Server and accidentally overwrote the original file.  We’ve all been there, that moment when you click the Save button instead of Save As, all the expletives rumbling around in your head because now you have to remember what the script was before you overwrote it.  Enter the Space Age, the days of redundancy checks and fail-safes.  We in the development community call it source control.  When Microsoft announced its TFS Online offering three years ago, I couldn’t put my scripts in the cloud fast enough.  Of course the name has changed, but the idea remains the same: source control in the cloud.  The great thing is that you can actually use it for free (for up to five people).

Will you learn from history and protect your scripts or are you doomed to repeat it? 

Posted by: sqlswimmer | November 19, 2015

Aggregation Design is Back!

If you use SQL Server Data Tools (SSDT) and SQL Server Data Tools – BI (SSDT-BI) for your SQL Server 2012 development, then you have no doubt been frustrated, like me, by the fact that with both of these installed you no longer have the ability to create new Partitions and Aggregation Designs when working with the SSAS MOLAP model.  You can find others that have run into this issue here.

The solution I found was to install both SSDT and SSDT-BI on my laptop and then keep a VM with just SSDT-BI on it.  That way, when I need to work on Partitions or Aggregation Designs (which is very infrequent), I just fire up the VM and I’m off and running.

Well, with SQL Server 2016 development we get to use Visual Studio 2015, and SSDT is now included in that install (although you do not get the BI project types; more on that here).  No more having separate machines.  I tested CTP 3, and Partitions and Aggregation Designs work once again.  Hooray!

Aggregation Design

Posted by: sqlswimmer | November 18, 2015

One Tool to Rule Them All – Almost

There were so many cool announcements at the PASS Summit this year, but one of my favorites was the “One Tool to Rule Them All”. The SQL Server Data Tools (SSDT) team and the Visual Studio (VS) team have finally teamed up to give us one tool to do all our development work for databases, SSIS, SSAS & SSRS. No more will we have to install Visual Studio Shell, SSDT, SSDT-BI and, for those source-control-minded folks (which should be everyone!) who use Team Foundation Server (TFS), Team Explorer. For SQL Server 2016 development we can do one install of Visual Studio 2015 and call it a day. Well, almost.

SSDT Install

I was so excited when I got back from Summit that I downloaded SSDT (CTP3) from here. I was so happy to see the install screen.

SSDT Install Screen

There they were, in all their glory, all the SQL Server project types that I needed. No more having to download multiple install files. Oh happy day!

After the install completed, I was a bit dismayed to discover that it took 3GB of disk space, but I guess that’s par for the course anymore.

Visual Studio Install

Next I wanted to see if you get all these same project types with an install of Visual Studio. They announced at Summit that “SSDT” would now be “included” with Visual Studio. So I went out and downloaded Visual Studio (CTP3, Community Edition, i.e., free) from here. And look what shows up on the install features list: there it is in black and white, Microsoft SQL Server Data Tools. Almost too good to be true.

Visual Studio Features

Well, we all know that if something seems too good to be true, then it usually is. This is no exception.  Let’s see if you can pick out the reason for my disappointment in the picture below.

Visual Studio Project Types

That’s right, the only SQL Server project types that are installed with Visual Studio are database projects. No SSIS, no SSAS & no SSRS. That was very disappointing. Also note that it installed the templates for Visual C#, Visual Basic, etc., when the only feature that I requested to be installed was SQL Server Data Tools. I guess that’s why this install took 5GB of disk space as opposed to the 3GB that SSDT required.

The good thing about the new Visual Studio is that if you use TFS as your source control, you no longer have to download the separate TFS Team Explorer; it is now built into Visual Studio. No additional installs are required.

Visual Studio Team Menu

Right “out of the box”, you get the Team menu item. However, this is NOT included in the SSDT install. I guess someone thinks we don’t really need to source control our SQL Server projects <sigh>.

Almost One Tool

Because I use TFS as my source control, I still have to do two installs: SSDT to get ALL the SQL Server project types AND Visual Studio so I can add all my SQL Server project types to source control.

This is definitely better than what we have to do now if we are doing development work prior to SQL Server 2016, but it’s not “One Tool to Rule Them All” yet. I’m hoping that since this is a CTP, the final products will contain “all the things”, but I certainly won’t hold my breath.

Now I’m off to test if they’ve overcome the issue of database projects playing nicely with SSAS projects. For those that use the multidimensional model with partitioning, you know exactly what I’m talking about. I’ll keep you posted with my results.

Posted by: sqlswimmer | November 5, 2015

My Build and Deploy Process (as Requested by Bill Fellows)

Recently I attended RedGate’s SQL in the City event in Seattle, WA. I was in Seattle for the annual PASS Summit; you can read about my Summit adventures here. While at RedGate’s event, I attended a session that called on SQL Server Data Tools (SSDT) users. RedGate wanted to get a better handle on what pain points we had in SSDT with respect to source control. I use Team Foundation Server (TFS) as my source control product, and it ties in very nicely with SSDT.

After this discussion, Bill Fellows (B | T), asked if I would be willing to blog or speak about my own build and deploy process for databases. Well, given that I am so NOT a speaker type, the seed for this blog post was planted.

I will not be diving into technical detail on how to do all these things (that would be a very large book), but rather giving an overview of the features of TFS, why I think they are important, and how I use them. Think of this as the 50,000-foot overview of my build and deploy process.

Integration with SSDT (Visual Studio)

Since I don’t want to use umpteen gazillion tools for development, I want my source control to integrate seamlessly with Visual Studio. TFS does this better than any other product I’ve used, probably because they are both Microsoft products. This way I don’t have to use multiple IDEs to work on SSIS, SSRS, SSAS and database development projects. I have one tool with the same source control experience for all.

Design your Branches Accordingly

TFS uses the branch paradigm to split code out for different development efforts. I like this paradigm; it’s easy to visualize and makes sense to me. Designing your branches is probably the most important part of the source control process. Think of this as your data model: if you get it wrong, you will pay dearly for it in the end. Think about how your organization is structured and how your code moves through the development process. What environments do you have: Development, QA, Staging, Hotfix, etc.? How does your code move through those environments? Is it strictly one-way, or can your code move in more than one direction? One common layout is sketched below.
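As a rough illustration only (the project and branch names here are hypothetical, not a recommendation for your shop), a simple TFS branch hierarchy that answers those questions might look something like this:

    $/MyProject
        Main             (stable code; release builds come from here)
            Development  (day-to-day work; merges up into Main)
            Hotfix       (branched from Main for emergency fixes; merged back to Main and Development)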

Gated Check-ins

Because no matter how many times you tell your developers to do a build locally before checking in their changes, someone will inevitably forget. The last thing you want is bad code getting into your code base. Then you’re left with all your developers sitting around while changes are backed out or corrected, and we all know what happens when developers sit around with idle hands. Gives me nightmares just thinking about it.

Automated Builds

This is so important. You most likely have more than one developer working on code. You want to make sure that all the changes they are making are not stomping all over each other and breaking things. Just because developers can get their code past the gated check-in doesn’t mean it won’t break something else. You should actually be doing this for all your environments, not just development. In a large shop I recently worked in, we scheduled our automated builds twice per day. The first was at 3 a.m., which allowed enough time for correction before staff came in if a build failed. The second was at lunch time, which gave us a “sneak peek” at the big picture before the nightly processes kicked off. While TFS does provide some default build templates, many of us have such custom applications and database projects that you may have to learn how to write XAML; I did.

Build Notifications

This is one of my favorite “tattle tale” features of TFS. You can set up notifications to find out when things are checked in successfully, when check-ins fail, when builds fail, all kinds of things. Use this feature. I can’t stress this enough, USE THIS FEATURE!

Power Tools

While TFS has some great features, some of them are a bit hard to navigate or use. This is where Power Tools comes in. It’s available as a free download from MSDN. It puts some great features just a click away, instead of having to write some intrusive custom code to get what you want – like who has what checked out in a branch, wildcard searching, copying a query, or cloning builds. The list is quite extensive.

Default Settings

All of these things don’t really do a lot of good unless you change the default settings for source control in SSDT. One of the biggest bang-for-your-buck settings is to automatically get the latest code when you open a solution. By default this is not enabled (silly, I know, but it’s not). The other setting is to check out objects automatically when they are edited. These two settings will make your source control life much easier.

Wrapping it up

I’m not going to lie, getting this all set up in TFS is no small effort. This is your livelihood; treat it as such. Do your research into how your company’s processes currently work and then compare them to how you want them to work. Once you have all that, you can come up with a build and deploy process that works for you.

Good luck!

Posted by: sqlswimmer | November 3, 2015

The Whirlwind That Is October – Part 3

October has been such a whirlwind of PASS activity for me. Two SQL Saturdays and the PASS Summit. This post is about the PASS Summit in Seattle, October 27 through 30. You can read about my SQL Saturdays here and here. Settle in, get a cup of <insert caffeinated beverage of choice>, this is going to be a long one.

I arrived in Seattle on Saturday, October 24th. Since I spent a lot of my formative years in the Pacific Northwest, I usually go early and have family and/or friends meet up with me for the weekend, but this year life just got in the way so I spent Saturday afternoon and Sunday wandering around Seattle alone doing touristy things and stocking up on souvenirs for those left at home.

Monday morning came bright and early and I headed over to RedGate’s SQL in the City event. This is the fourth year that I’ve attended this event. It mostly showcases how to use RedGate products, but there are some other useful sessions as well. One that I particularly liked was the workshop that called on SSDT users. They broke us up into two groups and had a RedGater leading the conversation. I got to meet some new folks like Phil Helmer (B | T) and learned that I wasn’t alone with some of my frustrations when using TFS in SSDT. Of course Bill Fellows (B | T) was there providing valuable insight as well. And yes, Bill, I will blog about my build and deploy process sometime in the near future. I also got to meet Andrea Allred (B | T) in person. We had connected over Twitter via our musical interests and really hit it off in person. Andrea, I can’t thank you enough for encouraging us to drive 4 hours to see The Struts (B | T); it truly was an experience I will never forget. I also got to officially meet Sheila Acker (T). She has been a familiar face for the last five years, but we officially met this year. So nice to finally meet you, Sheila.

I ended my Monday by catching up with my dear South African friend Martin Phelps (B | T) at Rock Bottom Brewery. He has a lot of work ahead of him, he and his teammates are trying to make it to the World Championships of sky diving in April 2016. Good luck Martin!

I got to sleep in a bit on Tuesday before I hit my favorite hole in the wall eatery, Blue Water Taco Grill (BWTG). Let me just say that I LOVE BWTG. I live in High Point, NC, where they think that a good breakfast burrito is what you get at Chik-Fil-A during their breakfast hours – NOT! I miss my breakfast burritos from Pete’s Kitchen in Denver and while the one that I get at BWTG is not smothered in green chile, it does have chorizo in it – food fit for a king (or queen as it were). But I digress, on with the adventures of Tuesday.

Tuesday was a day for meetings: the SQL Saturday Organizer and Chapter Leader meetings. These were fabulous; I got some great ideas for ways to advertise SQL Saturday and my local chapter. After my meetings I hung out with Andrea and her husband Ryan Allred (T) for a while talking music. We exchanged some of our favorite band names, which I am still going through. Then it was off to be a PASS Ambassador for the Welcome Reception. For those that don’t know me, this is my “Most favorite-est” (as my youngest niece would say) thing to do at Summit. I can’t stand up in front of a room of thirty people and present a session without almost hyperventilating, but I have absolutely no problem standing in a crowd of people and greeting them with smiles and assistance when needed.

If you couldn’t tell, I am a big music fan, so it was no contest when I found out that Florence + the Machine (B | T) was playing in Seattle on Tuesday night. After my PASS Ambassadorship ended, I skipped the volunteer party and headed straight to Key Arena. Florence did not disappoint, she performed barefoot (as usual) and was very “twirly”. After a very long day of nonstop action, I headed back to the hotel to get some much needed sleep.

Wednesday started off very early with being a PASS Ambassador once again. Did I mention that this is my favorite volunteer job at Summit? I was at the top of the escalators at 6:45 a.m. greeting attendees, speakers and sponsors. One thing that was new this year was the Ask Me! hand sign. I still haven’t found out whose brain child that was, but when I do, look out, you will be getting a serious #SQLHug from me. Most IT folks are such introverts that they seldom make eye contact with people, so the fact that I had a sign giving them permission to ask a question was AMAZING. I even had one attendee ask if he could have his picture taken with me and my sign (and if this was you, please share that pic, I didn’t get your name and would love to see how it turned out).

Since I was manning the top of the escalators until the start of the Keynote, I missed breakfast completely, so I headed over to BWTG for my morning burrito. I sat there eating my burrito and watching the Keynote – streaming live – thank you, PASS TV! After that I was able to attend the Microsoft Foundation Session on Business Intelligence. Man oh man, I can’t wait for SQL 2016; the enhancements to SSRS alone are enough to make me want to skip over Thanksgiving, Christmas and New Year’s.

Lunch time came around and it was time to say farewell to outgoing Director Amy Lewis (B | T). Amy has been the Director with the Program Portfolio for the last two years and prior to that she was heavily involved in the Program Committee, so I have worked with Amy directly or indirectly for five years. I was sad to see her not run for the Board again, but I understand that life just gets in the way. We have a new fearless leader in Ryan Adams (B | T) and I can’t wait to work with him. I was able to make it to two more sessions in the afternoon; then it was on to the Exhibitor Reception. It was nice to get a chance to chat with some of the vendors and see their products. I also ran into more #SQLFamily than I can name here. I was also “coerced” into giving an interview for PASS TV. If you were unfortunate enough to see that take place, you now understand why I am not a speaker. If you did not witness it, be thankful and leave it at that.

The night ended with SQL Karaoke hosted by Pragmatic Works at Hard Rock Cafe. This is always a good time and this year was no exception. I only wish I could have stayed longer. I retired early as I was to be a PASS Ambassador once again at 6:45 a.m. on Thursday.

The highlight of Summit came when Lance Harra (T) was presented with the PASSion award during Thursday’s keynote. This was long overdue; Lance has been on the Program Committee in some shape or form for eleven years, serving as a Program Manager for the last three or four. As a member of the Program Committee for the last five years and now a Program Manager, I see how hard Lance works. Next time you see Lance, be sure to congratulate him. We are very proud of him.

Unfortunately, this is the point during Summit when I came down with a nasty virus and missed Thursday afternoon and all of Friday. I ended up sleeping in my hotel room for the rest of the conference, missing out on some cool sessions and, most importantly, #SQLFamily time. I so wanted to catch up with Sebastian Meine (B | T) in the Community Zone to talk about tSQLt. I was also looking forward to hanging out with AZ (T) and so many others during the Community Appreciation party. But in true #SQLFamily fashion, AZ checked in on me every day until I made it home. Thank you, AZ!

I ended up at Urgent Care on Sunday morning after I got home.  Needless to say my poor excuse for a respiratory system was in dire need of medical attention.  Four prescriptions and one shot in the butt later, I was sent home to rest and recuperate.

While my experience at Summit ended way too early, I still had a great time. If you’ve never attended a Summit, what are you waiting for? If you’ve attended before, I am so glad you came back and I hope to see you next year.

One last reminder – you can still submit session evals online until November 6, 2015 via the Guidebook app. So do it now! The speakers and the Program Committee need your feedback so we can continue to make Summit a success.

Posted by: sqlswimmer | October 18, 2015

The Whirlwind That Is October – Part 2

October has been such a whirlwind of PASS activity for me. Two SQL Saturdays and the PASS Summit. This post is about the second SQL Saturday I attended in Charlotte, SQL Saturday #452, on October 17, 2015, where I was a member of the organizing team. You can read about my first SQL Saturday here.

I have been an attendee and volunteer for the SQL Saturday in Charlotte that is put on by CBIG for the last 3 years. This year I was approached by one of their fearless leaders, Rafael Salas (B | T), to help out as one of the organizers. How cool is that?!

Let me start by saying I love, absolutely love, helping out as an organizer.   But with that said, wow, it’s a whole lot of work! So the next time you attend a SQL Saturday, be sure to find the organizers and thank them for the event. They do this on their own time and don’t get paid for any of it.

Since I’ve volunteered for registration at several SQL Saturdays (in addition to Charlotte) over the past 4 years and I’ve been on the PASS Program Committee for six years, I decided I could be of most use helping out with speaker selection, session scheduling and registration on the day of the event.

We had a great team of organizers this year, Rafael Salas (B | T), Javier Guillen (B | T), Jason Thomas (B | T), Elizabeth Ricks (T), Don Sparks and me. In the past Melissa Coates (B | T) has been one of the organizers and even though she is not on the team this year she definitely deserves an honorable mention as she put a lot of the processes in place to help make sure things run smoothly. Thank you Melissa!

One of the reasons SQL Saturday was developed was to encourage local speakers. Well, the CBIG team believes in that mission wholeheartedly. We were able to accept sessions from all the local speakers plus some great regional/national speakers.

The event was held at Central Piedmont Community College (CPCC) again this year. It’s a great venue with lots of light and room. All the sponsors were wonderful and had great giveaways for the attendees. We only had one speaker cancel in the last week prior to the event and that was beyond his control – clients, what can you do?! Andy Leonard (B | T) was so gracious and stepped up to fill the empty slot without hesitation. Thank you, Andy!

We changed things up for the speaker dinner on Friday night. Instead of having just speakers, we included our sponsors and some of our very dedicated volunteers and called it our Appreciation Dinner. The food was delicious and the venue was perfect. I had the opportunity to talk with some speakers and sponsors I’d never met before. Thank you to both groups for all that you do for the community, we wouldn’t be able to put on such a great event without your support.

There were 588 people registered and 328 people actually attended. That’s roughly a 56% turnout, which isn’t great, but it is pretty good. We had some amazing volunteers there helping us out both the night before for setup and during the event on Saturday. This is the fourth year I’ve been involved with this SQL Saturday and it really does just keep getting better every year.

Needless to say the day of the event went by in the blink of an eye. One minute my alarm was going off at 5:45 a.m., the next we were headed out the doors to the after event party. Several speakers and volunteers joined us at the after event party at Grapes Bistro & Wine Bar. The food was spectacular and the setup of the venue allowed for an easy flow of conversation. Slava Murygin (B | T) was there to take lots of pictures too. BTW – You can view them here for the Appreciation Dinner and here for the Event and after event party. Thank you Slava for all the pics, you did a great job.

I’m kind of sad that it’s over, but I really do need to catch up on some sleep. I am looking forward to next year and hope that we will see you there.

Well done CBIG!

Posted by: sqlswimmer | October 16, 2015

The Whirlwind That Is October – Part 1

October has been such a whirlwind of PASS activity for me. Two SQL Saturdays and the PASS Summit. This post is about the first SQL Saturday I attended in Raleigh, SQL Saturday #445, on October 10, 2015, where I was both volunteer and attendee.

This event is put on by the Triangle Area SQL Server User Group (TriPASS). Brett Tomson (L | T) has long been the leader of this group but recently handed the reins over to Kevin Feasel (L | T). Kevin and Brett did a great job. They had awesome speakers, great sponsors and wonderful volunteers (yes, I am including myself in that).

This year’s event was at a new venue, William Peace University. It was a smaller venue than last year, but it worked. It’s a beautiful campus with very friendly staff who were willing to help out in any manner they could.

Being the morning person that I am, I got up at 4:30 a.m. on Saturday to make the 90+ mile drive to Raleigh. I helped out with registration in the morning, which is always one of my favorite volunteer activities. You get to see EVERYONE that’s attending; speaker, sponsor and attendee alike. It gives me a chance to give #sqlhugs to my #sqlfamily to start the day off.   What could be better than that?!

I was able to attend two sessions, which were fabulous. Then I finished off my volunteer duties by taking lunch tickets at the Dining Hall. That’s right, I said Dining Hall. Because we were at WPU, the organizers were able to take advantage of the campus Dining Hall and the people that staff it. There was quite the selection from the salad bar, sandwich bar and dessert bar. The only bar missing was the bar with alcohol, but we still had afternoon sessions, so I guess that was for the best. This is the first SQL Saturday that I’ve attended where lunch was not brought in. I’m sure it was nice for the organizers because they didn’t have to worry about getting volunteers to help with lunch setup, wrangling coolers of soda/water, or cleanup, and attendees had a very wide selection to choose from.

After lunch I was able to go visit the vendors and chat with them while attendees were in session. I even snagged one of the coveted bacon scented T-shirts from Micron (be forewarned, even after three washings it still smells like bacon).

After getting up so early, I was spent, and since I had to make the 90+ mile trip home, I decided to head home early. I missed the end of the day raffle and after event party, but I hear they were both entertaining, to say the least.

A huge thank you and congratulations go out to the organizers of SQL Saturday #445. Job well done.
