SQL Server with Mr. Denny


July 15, 2015  6:00 PM

Multi-threading Backups

Denny Cherry
Backup and Recovery, Database performance, SQL Server

In the modern world of huge amounts of data, backing up databases within a reasonable amount of time has become harder and harder for people with massive amounts of data. Several of my clients have so many databases, with so much data in them, that they simply can’t complete full backups of the databases within a 24 hour window. This creates some major problems for them, as backups end up running all through the business day. In these cases, where there are lots of databases which each hold large amounts of data, the best bet becomes to multi-thread the database backups so that multiple backups can be taken at once.

But doing that gets tricky, fast. If you create a job for each database you have to track that, and manually add new backup jobs when new databases are added. Then you need to track how long it takes to back up each database so that you aren’t running too many backups at once, but you also don’t want the system sitting idle during the maintenance window either.

To get around this for these clients I put together a process that uses 8 database backup jobs per server, allowing for 8 databases to be backed up at a time. As a part of the process, the databases are sorted based on database size so that the duration of the backups across the backup jobs is spread out as evenly as possible. Depending on the skew in your database sizes the jobs may or may not run for the same duration.
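
To make the idea concrete, here is a minimal T-SQL sketch (not the actual downloadable script) of assigning databases to eight backup threads in descending size order, so that each thread gets a roughly even share of the total data. The real process reads its cached sp_helpdb table; this sketch queries sys.master_files directly just to stay self-contained.

-- Round-robin the databases across 8 backup threads, largest first,
-- so the total size handled by each thread ends up roughly even.
;WITH DatabaseSizes AS
(
    SELECT  DB_NAME(database_id) AS database_name,
            SUM(CAST(size AS bigint)) * 8 / 1024 AS size_mb  -- size is in 8 KB pages
    FROM    sys.master_files
    WHERE   type_desc = 'ROWS'
    GROUP BY database_id
)
SELECT  database_name,
        size_mb,
        (ROW_NUMBER() OVER (ORDER BY size_mb DESC) - 1) % 8 + 1 AS backup_thread
FROM    DatabaseSizes
ORDER BY backup_thread, size_mb DESC;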

My deployment script creates a new database called “BackupData”. You can change the name of this if you want, but you’ll need to update all the stored procedures to use whatever database name you specify. This database contains a single table which is a cached copy of the output of sp_helpdb. This is done to reduce the number of times that sp_helpdb is called, as it is an expensive stored procedure to run and the exact values from it aren’t needed every time. There’s a separate job which runs this stored procedure nightly to get the updated values.
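
The table and procedure names below are placeholders rather than the ones the deployment script actually creates, but this is the general shape of the caching pattern: a table that mirrors sp_helpdb’s result set, plus a procedure the nightly SQL Agent job runs to refresh it.

-- Cache table shaped like sp_helpdb's result set (placeholder names).
CREATE TABLE dbo.DatabaseSizeCache
(
    [name]                sysname       NOT NULL,
    [db_size]             nvarchar(13)  NULL,
    [owner]               sysname       NULL,
    [dbid]                smallint      NULL,
    [created]             nvarchar(11)  NULL,
    [status]              nvarchar(600) NULL,
    [compatibility_level] tinyint       NULL
);
GO

-- Refresh procedure, run nightly from a SQL Agent job so the expensive
-- sp_helpdb call only happens once a day.
CREATE PROCEDURE dbo.RefreshDatabaseSizeCache
AS
BEGIN
    SET NOCOUNT ON;
    TRUNCATE TABLE dbo.DatabaseSizeCache;
    INSERT INTO dbo.DatabaseSizeCache
    EXEC sys.sp_helpdb;
END
GO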

When you run the below SQL script it’ll create a bunch of SQL Agent jobs as well as the needed database, table, and various stored procedures. If you don’t want differential backups you can simply delete the differential jobs and change the full backups to run daily instead of weekly.
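
For reference, the jobs end up issuing backups along these lines; the database name and path below are placeholders, and the real jobs build the statements dynamically.

-- Full backup flavor.
BACKUP DATABASE [YourDatabase]
TO DISK = N'\\BackupServer\SQLBackups\YourDatabase_Full.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;

-- Differential flavor (delete these jobs if you only want full backups).
BACKUP DATABASE [YourDatabase]
TO DISK = N'\\BackupServer\SQLBackups\YourDatabase_Diff.bak'
WITH DIFFERENTIAL, COMPRESSION, CHECKSUM, STATS = 10;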

You’ll want to change the network paths listed on lines 160-164 to the network share or folder that you want to back up to. In this case I’m using a network share. If you are going to use a network share you’ll want 10 Gig networking in place, as well as SMB3 on both the SQL Server and the target server (Windows 2012 or newer), and storage that is fast enough to support writing this quickly. If you are writing to a virtual tape library like a Data Domain you’ll want to ensure that it supports SMB3, as you’ll be writing a lot of data to it very quickly.

This stored procedure code is a little rough and the documentation isn’t all that great, but it gets the job done.

You can download the SQL Script to make all this happen here.

Thanks,
Denny

July 8, 2015  6:00 PM

Be Careful When Starting Up Azure VMs Running SQL Server

Denny Cherry
Firewall configuration, SQL Azure, SQL Server, Windows Azure

So Microsoft has done something pretty dumb with the Azure VMs which are running Microsoft SQL Server. By default the front end firewall (the one that allows or blocks traffic from the public Internet to the VMs) allows traffic to the default SQL Server port, 1433. At first this is fine, until you open that port in the Windows firewall so that the other VMs can connect to SQL Server. Now you’ve got a problem: the public firewall is open and your Windows firewall is open, so anyone who attempts to connect to SQL port 1433 from the outside will have direct access to the SQL Server instance.
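
If you want to confirm which ports your instance is actually listening on before touching anything, a quick check against the listener DMV (SQL Server 2012 and later) will show you:

-- Show the TCP ports this instance is listening on. If 1433 appears here
-- and both the Azure endpoint and the Windows firewall are open, the
-- instance is reachable from the public Internet.
SELECT  ip_address,
        port,
        state_desc
FROM    sys.dm_tcp_listener_states
WHERE   type_desc = 'TSQL';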

So when creating VMs that will run SQL Server from the default SQL Server template, you’ll need to go into the Azure portal and change the firewall endpoints. To do this, edit the properties of the VM and edit the settings, then edit the Endpoints.

If you see the “SQL Server” endpoint as shown below, and you’ve configured the Windows Firewall on the VM so that it no longer blocks TCP port 1433, then the entire public Internet has access to your SQL Server VM.

[Screenshot: VM_Settings – the VM’s endpoint settings showing the “SQL Server” endpoint]

To remove this, mouse over the SQL Server endpoint and click the menu button shown below, then click “Delete” from the context menu that appears.

[Screenshot: VM_Settings2 – the endpoint’s context menu with the Delete option]

Repeat this for each SQL Server VM that you’ve deployed using Microsoft’s SQL VM template.

If you’ve set up SQL Server VMs in Azure within the last couple of months, you’ll want to go check the Azure endpoints and make sure you don’t have a firewall hole that you weren’t expecting. I’ve spoken to the Azure team at Microsoft about this, and the default template is being fixed so that it isn’t set up this way any more, if it isn’t fixed already.

Denny


July 1, 2015  10:00 AM

Splitting Files for SQL DW

Denny Cherry
Azure, SQL Azure, SQL Server, Windows Azure

When doing data warehouse loads into SQL DW you’ll want to break the single large file that you extract into multiple smaller files in order to do loads into SQL DW (via Azure Blob Storage) as fast as possible. There are a few ways that you can do this. One way would be to handle it in the ETL that extracts the data from your source database. Depending on how you are extracting the data, this can be anywhere from painful to REALLY, REALLY painful.

A much easier way is to just extract the data to a single text file, then split that single text file using a command line tool. Now you are probably asking yourself where you might get this handy command line tool. Well, thankfully you can download it right here. While doing some work for a client on SQL DW we went ahead and put this tool together for you, and as a command line tool you can easily build it into your ETL process. The only requirement for using the tool is that you have .NET 4.0 installed on the machine running it. It runs very quickly and will create evenly sized files from your large source file. I used it recently on a ~16.7 Gig file and it was able to process the file in about 20 minutes (keep in mind I was using a slow 7200 RPM consumer hard drive). Smaller 2 Gig files processed in about a minute (again, same slow disk). As time permits we’ll work on making the software run even faster.
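
The post doesn’t cover the load itself, but for context, one common way to pull the split files out of blob storage into SQL DW is a PolyBase external table pointed at the folder that holds them, since PolyBase reads all of the files under that location in parallel. This is only a rough sketch under that assumption; every name, column, and secret below is a placeholder.

-- One-time setup: credential and external data source for the blob container.
CREATE MASTER KEY;

CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'user', SECRET = '<storage account key>';

CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH ( TYPE = HADOOP,
       LOCATION = 'wasbs://exports@yourstorageaccount.blob.core.windows.net',
       CREDENTIAL = BlobStorageCredential );

CREATE EXTERNAL FILE FORMAT PipeDelimitedText
WITH ( FORMAT_TYPE = DELIMITEDTEXT,
       FORMAT_OPTIONS ( FIELD_TERMINATOR = '|' ) );

-- Point the external table at the folder holding the split files.
CREATE EXTERNAL TABLE dbo.FactSales_Staging
(
    SaleId   bigint,
    SaleDate date,
    Amount   decimal(18, 2)
)
WITH ( LOCATION = '/factsales/',
       DATA_SOURCE = AzureBlobStore,
       FILE_FORMAT = PipeDelimitedText );

-- Land the data in a distributed internal table with CTAS.
CREATE TABLE dbo.FactSales
WITH ( DISTRIBUTION = HASH(SaleId) )
AS
SELECT SaleId, SaleDate, Amount
FROM   dbo.FactSales_Staging;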

Denny


June 27, 2015  5:36 AM

Recommended reading from mrdenny for June 26, 2015

Denny Cherry
Internet of Things, Net Neutrality, SQL Server, SQL Server transaction logs

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is Rad_Reza, also known as Reza Rad.

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


June 24, 2015  6:00 PM

Why I WILL Be Speaking at @VMworld

Denny Cherry
Conferences, IT conferences, IT conferences and events, SQL Server, VMware, VMworld

Back on June 17th, 2015 I posted about how I wouldn’t be speaking at VMworld 2015. Oh what a difference a week can make. Apparently that got the attention of some people at VMware, and I’m happy to report that now I will be speaking at VMworld 2015.

Now, sadly, I won’t be at VMworld for the entire conference, as just before VMworld I’ll be in India at the SQLServerGeeks Annual Summit. So VMworld is going to schedule my sessions for Thursday morning (thanks to the jetlag I’ll be up first thing in the morning without issue), as I won’t be able to get back to San Francisco until late Wednesday afternoon.

The two sessions that I’ll be giving will be a talk on SQL Server High Availability, and a second session on SQL Server Clustering and VMware VSAN. These should be some fun sessions, as there’s a lot of new information coming out on both of these topics between now and when VMworld comes around in September.

Personally I’m thrilled that I was able to work with the VMworld team and the fine folks at VMware to make this happen. I’ve attended VMworld several times over the years and have always loved attending, and I’m thrilled to say that I’m able to give something back to the event; I look forward to meeting some great people at the conference.

Denny


June 19, 2015  11:46 PM

Recommended reading from mrdenny for June 19, 2015

Denny Cherry
SQL Server

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is AlvinRamard, also known as Alvin Ramard.

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


June 17, 2015  6:00 PM

Why I Will Not Be Speaking at VMworld

Denny Cherry
Conferences, IT conferences, IT conferences and events, SQL Server, VMware, VMworld

I love speaking at conferences. I get to meet new people, I sometimes get to visit a new city. But I’m sad to say that I will not be speaking at the North America or Europe VMworld events this year, even though I was accepted.

My reasons are pretty basic. VMware (who puts on VMworld) has decided that not only will they not pay their speakers, but they expect their speakers to pay their own costs to get to the event. VMware expects me, as a community speaker, to pay for my own hotel and airfare to get to the conference. VMware doesn’t even give speakers a full conference pass; the speaker gets a day pass for the day on which they are speaking.

Now, in reality, for the North America conference my flight costs probably aren’t all that much, as it’s a short flight to San Francisco from San Diego (I’d actually be coming from another conference in India, so it’d basically be a flight change fee). The hotel for the conference will probably be in the $300-a-night range, I’m assuming. VMware will pay for one night in the hotel if I’m giving two sessions, which is what was selected for the North American conference.

For the European conference things get even worse. I’m expected to pay for my own international flight from California to Spain, and I have to pay for my own hotel as only one of my sessions was selected for VMworld Europe. So according to them, apparently I should just fly in, give my talk, and fly out the same day. That really doesn’t seem conducive to me giving a good presentation. There’s a 9 hour time difference between here and Spain, so I’d need at least a couple of days to get over the jetlag. Those jetlag days I’m used to paying for myself, as that’s pretty normal.

So on top of not billing clients for two weeks (assuming that I actually attended the full conference, and if I’m going why wouldn’t I want to attend), I’m expected to pay thousands in flight and hotel (not to mention a full conference badge so I can attend the rest of the days) for the privilege of presenting at a conference.

Now yes, I’m a consultant, so if you went to the SQL Server booth at Ignite or if you’ve been to the PASS Summit the last couple of years you’ve seen me in my booth, but at VMworld I’ve got zero chance to meet attendees unless it’s after my session. So there’s little time to really network and meet people and, yes, drum up business (here’s a little secret: this is why consultants go to conferences). So where’s the upside for me? I get to say that I spoke at VMworld? Big deal.

So needless to say, I’m not willing to drop $10k+ (between not billing, flights, hotels, and a conference pass) for the privilege of speaking at a conference that VMware makes a ton of money on (I’m guessing several million dollars at the least).

Denny

UPDATE (2015-06-18): After reviewing the speakers portal some more, I’ve learned that external speakers do in fact get a full conference pass. So I’ve updated the post to reflect that.


June 12, 2015  6:00 PM

Recommended reading from mrdenny for June 12, 2015

Denny Cherry
SQL Server

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is IT_Recruiter_FL, also known as Bianca Diosdado.

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


June 10, 2015  6:00 PM

Buffer Pool Extension File Location

Denny Cherry
SQL Server, SQL Server 2014

If you are planning on using Buffer Pool Extension (BPE) in SQL Server 2014 or SQL Server 2016, you need to think about where you want to put the BPE file on the SSD, especially as it relates to the Windows OS security on the file system.  The most important thing to remember when placing the BPE file on the drive is that you want to place it into a folder, not directly into the root of the drive.  The reason for this is that the Windows OS has User Account Control (UAC) enabled, which will prevent the SQL Server from writing to the root of the drive.  Even if you have UAC disabled today (which I don’t recommend), if it gets enabled in the future this could cause problems with access to the BPE file.  It is much easier to simply put the file into a folder and not have to worry about UAC biting you.
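
As an example, enabling BPE with the file inside a folder on the SSD looks like this (the path and size are examples only):

-- Enable Buffer Pool Extension with the file in a folder rather than the
-- root of the SSD.
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    ( FILENAME = N'E:\BPECache\BufferPoolExtension.BPE', SIZE = 64 GB );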

Denny


June 10, 2015  5:58 PM

Recommended reading from mrdenny for June 05, 2015

Denny Cherry
SQL Server

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is way0utwest, also known as Steve Jones.

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny

