SQL Server with Mr. Denny


April 16, 2014  4:00 PM

Naming Drives

Denny Cherry

I’m a DBA (shocking, I know), so I plan everything I possibly can about the servers I set up for clients. That includes thinking about what happens when the server (or VM) totally fails and the OS has to be rebuilt from scratch, with everything brought back up and running as quickly as possible. To help with that task, I make sure all the drives on these servers get some sort of descriptive name which includes the drive letter they’ll be using. That way, when I have to rebuild the OS on the server, I don’t have to think about which drives get which letters when I present them all back to the server. Something like this (which is for the tempdb drive of a server). Can you guess which drive letter I’ll be using?

[Screenshot: a drive with a descriptive label ending in the letter "T"]

If you said the “T” drive, you would be correct. It’s that last letter that gives it away. Granted, I have to assume that no one is going to change the drive label for this to work, but so far that hasn’t been an issue; it makes perfect sense to people as soon as I tell them why I do it.

I started doing this back in the pre-VM days, when all the servers were SAN attached and system migrations involved moving LUNs from server to server, and you needed to reassign drive letters to 10 LUNs quickly. This became a nice, easy way to figure out what was what, and it doesn’t require remembering what the server config is.

Denny

April 13, 2014  10:17 PM

Recommended reading from mrdenny for April 11, 2014

Denny Cherry


This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is: paultebraak also known as Paul te Braak

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


April 9, 2014  2:00 PM

Viewing Maintenance Plan Logs

Denny Cherry
SQL Server, SQL Server Maintenance Plan

Many of my customers are set up to use maintenance plans because they’re small shops without a DBA; the maintenance plans give them something visual which they can easily tweak later on if needed, without having to bring me in to make a small change to their reindexing, backups, etc.

One of the big failings of maintenance plans is the reporting in Management Studio. If you right-click on a maintenance plan you get an option to view reports, and it gives you a whole bunch of really nice info … sometimes. After the plan has been running for a while it just sits there, not giving you any useful information. And that’s because the code behind the maintenance plan report page … SUCKS!

The process does a bunch of row-by-row querying instead of just dumping a single large recordset like any normal application. So how do you get that useful information out of SQL Server? The query below will give you the exact same information that the GUI gives you, but without waiting forever for the GUI to open up. Just change the plan and subplan names to the ones you are looking for and run the query.

SELECT
    ld.line1 AS [Line1],
    ld.line2 AS [Line2],
    ld.line3 AS [Line3],
    ld.line4 AS [Line4],
    ld.line5 AS [Line5],
    ld.server_name AS [ServerName],
    ld.start_time AS [StartTime],
    ld.end_time AS [EndTime],
    ld.error_number AS [ErrorNo],
    ld.error_message AS [ErrorMessage],
    ld.command AS [Command],
    ld.succeeded AS [Succeeded]
FROM msdb.dbo.sysmaintplan_plans AS s
INNER JOIN msdb.dbo.sysmaintplan_subplans AS sp ON sp.plan_id = s.id
INNER JOIN msdb.dbo.sysmaintplan_log AS spl ON spl.subplan_id = sp.subplan_id
INNER JOIN msdb.dbo.sysmaintplan_logdetail AS ld ON ld.task_detail_id = spl.task_detail_id
WHERE sp.subplan_name = 'Full'
    AND s.name = 'Backups'
ORDER BY [StartTime] DESC

Denny


April 4, 2014  2:00 PM

Recommended reading from mrdenny for April 04, 2014

Denny Cherry

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is: tjaybelt also known as tjaybelt

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


April 2, 2014  5:00 PM

How I Set Up TempDB

Denny Cherry

How TempDB should be set up is a question that comes up a lot, mostly because there is a lot of bad information out there.  So here’s how I typically set up TempDB for the bulk of my clients, and these settings will probably work for you as well.

Number of Files

The number of files that you need for tempdb is, in most cases, 8.  There are plenty of websites and blogs out there which say that you should have one tempdb data file for each CPU core that you have.  And that was true for SQL Server 2000, because back then machines only had 4-8 cores.  For 99.9%+ of SQL Server instances, 8 tempdb data files will do the trick.

You’ll need to add more files to the tempdb database when you see waits on the PFS data pages (the easiest way to see this is by using sp_whoisactive, as it’ll say PFS if the waits are on a PFS page).
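If you don’t have sp_whoisactive installed, a rough equivalent using the standard DMVs looks something like this. This is a sketch, not a definitive script: it assumes the usual resource_description format of database_id:file_id:page_id, and relies on PFS pages living at page 1 and then every 8,088th page of a data file.

```sql
-- Sketch: find tasks currently latch-waiting on tempdb pages.
-- tempdb is always database_id 2, so its pages start with '2:'.
-- To confirm a wait is on a PFS page, parse the page_id out of
-- resource_description and check page_id = 1 or page_id % 8088 = 0.
SELECT
    owt.session_id,
    owt.wait_type,
    owt.wait_duration_ms,
    owt.resource_description
FROM sys.dm_os_waiting_tasks AS owt
WHERE owt.wait_type LIKE 'PAGELATCH%'
    AND owt.resource_description LIKE '2:%';
```

If this query regularly returns rows whose page_ids match the PFS pattern, that’s the signal to add more tempdb data files.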

File Size

The tempdb database should be whatever size it normally grows to.  If you’ve got one database file today and it gets up to 15 Gigs, then when you break that out into 8 files, make each one about 2 Gigs in size.  This way the space is already allocated when SQL Server starts up, and you don’t need to worry about the database growing.


Auto Growth Settings

Typically I set the auto-growth setting for the tempdb data files at 1 Gig.  For systems which are a lot larger I might go with a bigger number, but for most clients we set a 1 Gig auto-growth setting.

Transaction Log

For the transaction log, things are a little bit different.  I’ll typically start with an 8000 Meg file.  If I think it’s going to grow a lot (based on experience with the instance) I’ll set the growth to 8000 Megs.  If I don’t think it’ll grow much, or at all, I’ll set it to 1000 Megs.  These numbers look a little funky because of some auto-growth bugs which have existed in the past.
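Putting the numbers above together, the whole setup can be scripted with something like the following. This is a sketch only: the logical names tempdev and templog are the installer defaults, but the tempdev2 name and the file path are assumptions you’ll need to adapt to your instance.

```sql
-- Sketch: 8 x 2 GB data files (for a tempdb that normally grows to
-- ~15 GB), 1 GB data autogrowth, and an 8000 MB log.
-- Resize the two files that already exist:
ALTER DATABASE tempdb MODIFY FILE
    (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 1024MB);
ALTER DATABASE tempdb MODIFY FILE
    (NAME = templog, SIZE = 8000MB, FILEGROWTH = 8000MB);

-- Add the remaining data files (path and names are assumptions):
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev2, FILENAME = 'T:\MSSQL\DATA\tempdev2.ndf',
     SIZE = 2048MB, FILEGROWTH = 1024MB);
-- ...repeat the ADD FILE statement for tempdev3 through tempdev8.
```

The size changes take effect after the next service restart if the files can’t be resized online, so plan to run this during a maintenance window.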

So that’s it.  That’s how I’ll typically set up tempdb for my clients.

Denny


March 29, 2014  5:11 AM

Recommended reading from mrdenny for March 28, 2014

Denny Cherry


This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is: daveberm also known as David Bermingham

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


March 26, 2014  7:00 PM

Excel 2013 and Analysis Services 2000

Denny Cherry

So I got a strange question from a client the other day.  Can I get Excel 2013 connected to Analysis Services 2000?

Yes, you read that correctly: SSAS 2000.

The answer is “it depends”.  If you are using Excel 2013 x32 (the 32-bit version) then yes, you can connect to SQL Server Analysis Services 2000.  If however you are using Excel 2013 x64 (the 64-bit version) then no, you cannot connect to SQL Server Analysis Services 2000.  At least part of the reason is that there is no 64-bit driver for SQL Server Analysis Services 2000, which is a bit of a problem.  So if your users have Excel 2013 x64 and need to connect, they’ll need another machine (or a VM hosted somewhere) where they can run the 32-bit version of Excel 2013 and use it to export data from SSAS.  Then they can save the file, open it in Excel x64 on their main machine, and do whatever fancy stuff they need to do.

Denny


March 21, 2014  6:40 PM

Recommended reading from mrdenny for March 21, 2014

Denny Cherry

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is: SQLSatExeter also known as SQL Saturday Exeter

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny


March 19, 2014  5:00 PM

Is Your DR Plan Ready For The Zombie Apocalypse?

Denny Cherry

When setting up DR plans, are you covering all your potential options? Are you thinking about the entire location of your production site being unavailable? All too often when I talk to companies about their DR plans, it takes me all of about 5 seconds to poke some major holes in them. Maybe this is because I’m from California, and when we have a disaster (an earthquake) we really have a disaster.

So think about your DR plans for a minute. They are probably built around the assumption that the office and/or the data center (if the data center isn’t in the office) doesn’t exist any more. Right? So what kinds of disasters have you planned for?

  • Building fire
  • Power Outage
  • Roof Gives Way
  • Earthquake
  • Flood
  • Hurricane

OK, that’s a decent list. But I’m going to say that you’ve probably only actually planned for 3 of those 6. Can you figure out which ones I think you’ve planned for?

I’ll give you a minute.

Take your time.

If you guessed Building Fire, Power Outage and Roof Gives Way you are correct, but you only get half credit for Power Outage.

The reason I don’t think you’ve actually planned for an Earthquake, Flood or Hurricane is that you probably haven’t done any planning around people. Many companies have the bulk of the IT team at a single site, with maybe just a couple of help desk guys and a sysadmin at whichever office is the secondary site. When it comes time to execute the disaster recovery plan, which staff members are you planning on having available? Are you planning on having the IT staff from your primary site available? If the answer you gave was “yes”, then you answered incorrectly.

When a disaster hits that’s big enough to take out your primary site, the odds are that everyone who works at the primary site won’t be available either.

If there’s an earthquake, it’ll affect the entire region. Even if some staff members live far enough away that they aren’t impacted, how are they going to get to the office? The roads probably won’t be passable, the power will probably be out, the airports will be closed, trains won’t be running, etc.

The same applies to a flood. Think back to the flooding in New Orleans a few years ago. How many of the companies there expected their employees to be available to execute the DR plan? The employees were busy evacuating their families; odds are the company was the last thing on their minds. The same rules apply for a hurricane.

When planning for power outages, how large a power outage are you planning for? Here in the US the power grid is very unstable, and it’s pretty easy to crash the entire grid. Hell, it’s happened a couple of times in the last few years. That again means that airports are going to have problems, trains will be slow or down, roads will be a mess, users won’t have power to VPN into the office, etc. And again, staff members will be busy making sure their family members are secure, fed and warm, not worrying about getting the company failed over to the DR data center.

While you probably don’t need to plan for every possible disaster (the odds of the zombie apocalypse are pretty small), a lot of thought needs to go into the plans to account for when staff will and won’t be available. And the DR tests need to take this into account. Doing a DR test with the staff at the primary site who normally manage the systems isn’t the best test, as those people won’t be available in a real disaster.

The best plan is to have the DR test done by the people who don’t work at the primary site, and have them do it without contacting the people from the primary site. That’s a proper DR test. And I think you’ll be surprised just how quickly that sort of DR test fails. After all, do the people who don’t work at the primary site even have access to the DR run books without accessing the primary site?

Denny


March 14, 2014  5:14 PM

Recommended reading from mrdenny for March 14, 2014

Denny Cherry

This week I’ve found some great things for you to read. These are a few of my favorites that I’ve found this week.

This week’s SQL Server person to follow on Twitter is: sqlargentina also known as SQL Argentina

Hopefully you find these articles as useful as I did.

Don’t forget to follow me on Twitter where my username is @mrdenny.

Denny
