All too often when I give a SQL Service Broker presentation, or when I'm talking to people about it, they want to know what the odds are that SQL Service Broker will be removed from the platform, the way Notification Services was ripped from the product after just one release.
So here are just some of the SQL Server features that would need to be re-architected if SQL Service Broker were removed from the SQL Server product.
- Database Mail – Introduced in SQL Server 2005
- Database Mirroring – Introduced in SQL Server 2005
- Always On (HADR) – Introduced in SQL Server “Denali”
There is some other stuff which uses SQL Service Broker as well: BizTalk can use it, and I've heard rumors that Microsoft CRM will start using it at some point (if it doesn't already). Based on the direction the SQL Server product team has been going, I'm sure even more new features will be built on SQL Service Broker.
So if you were worried about using SQL Service Broker because you’ve heard that not many people are using it, don’t be. The feature isn’t going anywhere.
Apparently people still haven't figured out that taking backups of SQL Server databases is a requirement of having a SQL Server database. There was a conversation on Twitter recently about yet another forum question where the OP talked about the crashed database that they had without any backups whatsoever.
Some people don’t seem to understand that server hardware failing isn’t a question of if the hardware will fail, but rather when the hardware will fail. If you don’t have backups (that you have verified will work) then you haven’t done what needs to be done to protect yourself from WHEN that hardware fails. If you’ve been working with SQL Server for a while now and you haven’t had a hardware failure, then remember one simple thing… That hardware failure is coming, and when it comes to kick you in the junk, it is going to hurt.
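Verifying that a backup will actually work doesn't have to wait for the disaster; it can be scripted. A minimal sketch (the database name and file path below are examples, substitute your own):

```sql
-- Take a full backup with checksums, then check that the file is readable.
-- [YourDatabase] and the path are placeholders, not real names.
BACKUP DATABASE [YourDatabase]
    TO DISK = N'D:\Backups\YourDatabase_Full.bak'
    WITH CHECKSUM, INIT;

-- VERIFYONLY reads the entire backup and validates the page checksums
-- without restoring anything. It catches torn or truncated backup files.
RESTORE VERIFYONLY
    FROM DISK = N'D:\Backups\YourDatabase_Full.bak'
    WITH CHECKSUM;
```

Keep in mind VERIFYONLY is a sanity check, not a substitute for periodically doing an actual test restore onto another server.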
But I work for a small company, and we can’t afford to do backups.
If you work for a small company, then doing backups is even more important. If you are a small company, can you afford to lose all of the data within your SQL Server database? How much money will it cost the company to rebuild all that data? How much money will it cost the company if you lose all of the contacts that are stored within the database? Will the company that you work for go out of business if the data within the database is lost? Will you, as the systems admin or database administrator, be fired if the data within the database is lost?
Most of those questions I can't answer for you, except for the last one. If I were your manager, and you weren't backing up the database that ran our business, and that system failed, you would be out on the street in a heartbeat. Odds are that as your manager I'd be getting fired about an hour after you, since I didn't make sure that you were backing up the data which runs the company.
But I don't know how to do backups.
Setting up backups doesn't need to be hard or complex. For companies with just a couple of database servers, you don't need any sort of large, complex backup solution or hand-written scripts. Using the database maintenance plan wizard will work just fine for database backups. People in large companies don't like using it, and I and other MVPs complain about it, because it doesn't scale when you have a lot of servers. But as far as the actual backups themselves go, it'll work just fine for you.
Assuming that you need to be able to lose as little data as possible, you'll want to put your databases into the full recovery model (not the system databases, just your user databases). Then set up full backups to run either daily or weekly, with transaction log backups running at regular intervals. This is where you need to ask your manager or the company owner how much data the company can afford to lose. That will tell you how often you need to schedule the transaction log backups.
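Switching a database to the full recovery model is a one-line statement if you'd rather not click through the UI. A quick sketch (the database name is a placeholder):

```sql
-- Put a user database into the full recovery model so the transaction
-- log is retained between log backups. [YourDatabase] is an example name.
ALTER DATABASE [YourDatabase] SET RECOVERY FULL;

-- Confirm the setting took effect.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'YourDatabase';
```

One gotcha: after switching to full recovery you must take a full backup before log backups can begin, since that full backup is what starts the log chain.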
To set up all these backups you'll want to create two maintenance plan packages (I'm talking about SQL Server 2005 and newer here; I'll talk about SQL 7 and SQL 2000 in a minute): one to do the full backups and one to do the log backups. In both maintenance plans use the "Back Up Database Task". In the plan which will do the full backups, change the backup type to "Full" and change the Database(s) option to "All databases". Change the backup folder to a folder on the local server which doesn't have any databases on it. If there aren't any drives on the server without databases on them, change it to a network path on another server, like a file server. If that isn't an option, a large USB drive is better than nothing. Schedule the job to run daily.
For the transaction log backup maintenance plan do the same thing, but change the "Backup type" option to "Transaction Log" and change the Databases option to "All user databases (excluding master, model, msdb, tempdb)". Change the path to the same one as the full backups, or a similar one. Schedule the job to run every 15 minutes (or whatever schedule is needed based on your discussions with management). You've now got SQL Server backups, so that when the server fails you can restore.
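If you're curious what those two maintenance plan tasks are doing under the covers, they boil down to two BACKUP statements. A sketch, with placeholder names and paths (the maintenance plan generates timestamped file names for you):

```sql
-- What the full backup task runs daily, roughly. INIT overwrites the
-- prior file; the maintenance plan actually writes a new timestamped
-- file each run instead.
BACKUP DATABASE [YourDatabase]
    TO DISK = N'D:\Backups\YourDatabase_Full.bak'
    WITH CHECKSUM, INIT;

-- What the log backup task runs every 15 minutes. Only valid for
-- databases in the full or bulk-logged recovery model, and each log
-- backup should go to its own uniquely named file.
BACKUP LOG [YourDatabase]
    TO DISK = N'D:\Backups\YourDatabase_Log.trn'
    WITH CHECKSUM;
```

Seeing the raw statements also makes it obvious why the log backup plan excludes the system databases: master and msdb run in the simple recovery model by default, so BACKUP LOG against them would simply fail.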
You also need to clean up the backups, so that you aren't keeping 5 years of backups on disk. In the full backup maintenance plan add a "Maintenance Cleanup Task". Point this to the same folder, and have it delete files based on the file age (the bottom check box). I recommend keeping at least 1 week of data on disk.
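For what it's worth, the cleanup task itself calls an undocumented extended stored procedure, xp_delete_file. A rough sketch of the equivalent call is below (folder path is a placeholder, and the retention here assumes the one-week recommendation above); because the procedure is undocumented, configuring this through the maintenance plan UI is the safer route:

```sql
-- Delete .bak files older than 7 days from the backup folder.
-- xp_delete_file is undocumented; the parameters are: file type
-- (0 = backup files), folder path, file extension, cutoff date,
-- and whether to include subfolders (0 = no).
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(DAY, -7, GETDATE());

EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups', N'bak', @cutoff, 0;
```

Note that xp_delete_file only deletes files that it recognizes as SQL Server backups, which is a nice safety net if other files live in that folder.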
In SQL Server 2000 you just need to run through the maintenance plan wizard. This wizard will allow you to select which databases you want to back up via both full and log backups. Once the wizard has been completed there will be a couple of different jobs created to do these backups. In some respects it is easier to set up the backups via the maintenance plans in SQL 2000, as you get a nice, quick, easy wizard to run through; the wizard in SQL 2005 and up really sucks by comparison. When going through this wizard do NOT select the "automatically fix index problems" checkbox. You want to be notified of problems, not have them fixed automatically.
But I back up the server using Backup Exec or something that backs up all the files.
In other words, you aren't doing any sort of backups. Backing up the SQL Server database files from within Windows doesn't do anything useful, for a couple of reasons.
- SQL Server keeps the data files locked so that other applications can't access them and corrupt the databases. Because of this, when a backup application comes through to back up the files it can't, and you get corrupt backups, which are, in other words, no backups.
- Even if the files weren't locked, the backups would still be useless, because the data file and the log file wouldn't be backed up at the same time. With SQL Server the files must be in sync with each other. If you are backing up the data file and log file, the data file would probably be backed up first, and only once it finished would the log be backed up. So the timestamp of the backup of the data file would be several minutes or hours (depending on the size of the data file) before that of the log file. Depending on the amount of data change, the data file might not even be consistent with itself.
If I haven't convinced you yet that you should start doing backups (remember that part above about losing your job, and if they fire you for not backing up the databases I wouldn't count on a good recommendation when you are job hunting), what will it take to get you to start actually doing it? I'm being serious here: comment here, post a blog post, tell me on Twitter (@mrdenny), whatever; just let me know what it'll take to get you to start doing the backups.
So I dragged myself out of bed this morning to an email from Sean McCown (Blog | Twitter) saying that Amazon had just told him that they would be shipping my book. So I hopped over to the Amazon page for the book, and lo and behold it is no longer listed as pre-order. In fact there was another notice up there: they only have three copies left in stock.
This means that Amazon either ordered just enough copies to cover the pre-orders, or they ordered a heck of a lot of copies and the SQL Server community bought them like crazy. Personally I'm hoping for the second reason and not the first. And when I look at the sales graph that Amazon shows me, it might actually be the second.
If you look at the graph to the right (I'm a DBA, we love data) you can see how the book has been ranked on Amazon's best seller list over the last month. What I really like is that yesterday, on my birthday, the book broke the 100,000 mark for just the second time. The first time was back on October 22, 2010, when the book was ranked #99,971 overall. Well, the rank yesterday was #49,111. And unlike every prior spike, this wasn't a one-day spike; there were actually two days of positive climbing in the rankings. Does this tell me that the book has been selling like crazy? No, well, sort of no. The book rankings are based on daily sales, and as you get closer to being #1 it takes more and more sales to move up the rankings.
There's no specific sales data available for the book yet that I can see; that all comes from BookScan (a company that gathers sales info from all the major retailers), so it'll take a week or two for data to show up. But at least I can see some general info about the book, so I've got a general idea.
Now I get to move into another part of the book writing process that I like to call "Please tell me you liked it": sitting around and waiting for the reviews of the book to be posted. Personally I'm pretty proud of it, and I should be, I wrote the thing after all; but I hope that you, the SQL Server community at large, like it, and more importantly that you (or someone you know) find it useful. If just one (or maybe two) people are able to better protect their SQL Server data because of the book, I'm going to call it a success.
I think I’ve rambled enough for the day. I’ve got to get back to my MCM study.
As many of you know, I took the Microsoft Certified Master (MCM) knowledge exam back in November, and received notice that I passed back in December. The MCM knowledge exam is the first of two tests that you have to pass in order to receive the MCM certification, after you take and pass the MCITP database administrator and MCITP database developer exams, of course. The second of the exams is a 6-hour lab which is supposed to be just as hard as the knowledge exam, if not harder.
I've been asked a couple of times what my test taking experience was like, how much experience I have as a DBA, etc. So that's pretty much what I'm going to talk about here. I won't be talking about the kinds of questions, or what's on the test, as that would be an NDA violation, but this may give you some idea of the experience itself.
The testing center
When you go to take the MCM exam, you can't just go to any testing center. You have to go to a special high security testing center. Normally when you take an MSL (Microsoft Learning) exam you just turn off your phone and leave it in your car. At these facilities you take nothing with you except for your driver's license, which is to remain on the desk at all times, and your little whiteboard for notes that you want to take during the exam. Everything else (wallet, keys, etc.) goes into a locker, and you keep the key with you (also sitting on the desk). IDs are checked every time you enter and exit the testing room (for a bathroom break, etc.). When you go to take the test, expect an almost TSA-like inspection. OK, they don't actually pat you down, but you do have to pull your pockets out to show they are empty.
The test itself
The test itself is hard, very hard; as well it should be. The goal of the exam is to weed out the people that don't have the requisite knowledge and experience at the level expected. I've been told that this exam is actually harder than the prior one taken by the people that went through the classroom training. I've got no idea if this is true or not, but either way, the test was very tough. I had no idea whether I had passed until I got the email stating that I had (people that aren't in the beta program for the exam probably won't get an email, but will instead just get the notification from Prometric).
My SQL Server Experience
A lot of experience with the SQL Server database engine is definitely a requirement for passing the MCM knowledge exam. Personally I've been using Microsoft SQL Server since SQL Server 6.5, although most of my experience has been with SQL Server 7.0 and beyond. I really started getting into SQL Server back in 1998 or 1999, so I've got about 12-13 years of experience with the product as both a database administrator and a database developer. A long stretch of experience is, in my opinion, absolutely critical to passing the MCM exam. Without it you won't have the breadth of knowledge of all the various features which the MCM knowledge exam is going to test you on. The MCM readiness materials which Microsoft has released (and which were recorded by SQLskills) are a good starting point for people that don't have a similar level of experience; however, they are not all that you need. A working knowledge of the product from practical on-the-job experience using all the features for several years will greatly increase your odds of passing.
If you plan on taking the MCM exam, be very sure that you know your stuff, well … very well. Several people that know SQL Server well have taken the MCM knowledge exam and have not yet passed the exam. This should drive home the point that you need to know the database engine very well before taking it.
Syngress, my publisher, is doing a one day sale for my book Securing SQL Server. If you order from them directly and use discount code 50170 you’ll get 50% off the price, bringing the price down to just $24.98. Just click through to the Syngress order site and use the discount code 50170 during check out.
This 50% off code is only good today Feb 1, 2011 so ACT NOW.
UPDATE: To order the paperback, click the “Buy Now” link with the arrow under Paperback and select “Elsevier” from the list. On the new page add the book to your cart, click the cart in the upper left and use the code there.
UPDATE2: I've been told that if you use Chrome, it will say the site is not secure. This is apparently because some of the page content isn't served over HTTPS, even after you change the URL to HTTPS.
So this last weekend was the 13th SoCal Code Camp (the 6th at this location). I presented three different sessions at this code camp. My slide decks can be downloaded below.
If you attended the sessions please rate my presentations on SpeakerRate.com/mrdenny.
OK, everyone, it's time to come out west. There are two SQL Saturdays scheduled for April 9th, 2011 here on the west coast. Personally I'd love to see you all show up at our SQL Saturday event down here in Huntington Beach (Orange County, CA). The weather will be beautiful (OK, so I can't guarantee it, but there's like a 95% chance of great weather in April in Orange County).
The SQL Saturday site is just a few minutes' drive from the beach, if you want to bring your family and they like the beach. If the beach isn't their thing, Disneyland is only a 30 minute drive away (the multi-day tickets are a MUCH better deal than paying for single day tickets, and Costco usually has a pretty good deal on tickets as well), and what family doesn't like going to Disneyland while mom or dad is at SQL Saturday? There's plenty of great shopping in the area at South Coast Plaza (just 5-10 minutes down the freeway) as well as The Spectrum (about 20 minutes down the freeway). We've got a great after party in the works that some of the local MVPs are putting together.
Hopefully you can make it out here for the SQL Saturday. We had about 150 people last year (with 6 weeks' notice) and this time we are hoping for 250. If you can make it down, we'd love to have you, and we'll probably get a pretty good turnout.
If you'll be flying in, Orange County (SNA) is the closest airport, Long Beach (LGB) is the next closest, and Los Angeles (LAX) is next after that (but that airport is huge and a pain to fly through). Your next best choice is Ontario (ONT), but it is about an hour from the SQL Saturday site. Andrew and his team have gotten a great hotel set up for us at the Huntington Beach Marriott. If you are going to stay at a different hotel, check with Andrew or me first, as some of the hotels in the area really suck. I used to work down the street from the college where SQL Saturday will be held, so I know where the decent hotels are.
See you in April.
In case you missed me when I was in Tampa last week at the awesome SQL Saturday 62 event, all is not lost. I’ll be back at the Dev Connections March 27th-30th, 2011.
I’ve got three sessions scheduled for this event.
- Exploring the DAC and Everyone’s Favorite Feature, the DACPAC
- SQL Server Clustering 101
- Getting SQL Service Broker Up and Running
Hopefully I’ll see you at the event while I try and rip a few attendees away from Paul and Kimberly’s sessions.
So a couple of nights ago Paul Randal (Blog | Twitter), Wendy Pastrick (Blog | Twitter), Jonathan Kehayias (Blog | Twitter) and I were talking about how some of our demos have completely and totally failed, often in front of a live audience. As we were all chatting about our various demos crapping out, we came to the realization (OK, we had each already come to this realization separately) that when your demo fails, how you handle it is how you prove that you are a truly good speaker.
Speaking by itself isn't all that easy for a lot of people (oddly not me, but I'm kind of an attention whore), but when that demo blows up, and it will at some point, how you handle it is where you really show that you know the product. Since it is a little hard to tell a good story in 140 characters, I decided that a blog post would be a good way to tell mine.
Windows just won’t cluster
So my first mega fail was actually not in front of a live audience (unless you count the recording engineer). The first time I did a recording for the SSWUG conference (I think it was the first; I'm about to record my third or fourth set of sessions soon) one of the sessions that I did was on SQL clustering. During the session I walked through the Windows 2008 clustering process and got the SQL installation started. I would jump back to it throughout the session, because sitting there watching the wizard run while I do nothing isn't very exciting.
So I tested everything before I left home and everything was working perfectly. I decided to test everything from the hotel the night before (I always fly in the night before when I do the recordings), and when I ran through the cluster validation wizard it took 3 or 4 tries for me to get a successful validation. At this point it was late, so I called it a night. When we did the recording the next day, wouldn't you know it, the damn validation wizard just wouldn't work. Which meant that SQL Server wouldn't install on the cluster, as SQL won't install on a cluster that hasn't passed validation. So I just had to wing it and explain that this is what you should be seeing on this screen, etc.
Needless to say the session went great, and was a pretty good length (I think I got really close to the one hour limit on it), and the reviews were great. I even got some positive reviews for leaving in the problem demo and not refilming it to make it look correct.
OK, who set off the fire alarm?
My second story is more recent, and was in person. At the 2010 SQL PASS Summit I was doing a quick 45 minute storage session during Buck Woody's post-con on SharePoint. During my talk the fire alarm for the building started going off. It wasn't so loud that you couldn't hear me, but it was a bit annoying to listen to (I suppose that is the point). Someone came in to tell us that we didn't need to leave, which was good since I was still talking, and with occasional pauses to yell at the building, we worked right through it.
What’s happened to you when you present?
So what sort of fun would this telling of the nightmares be if I didn't open it up for you to tell us about the sorts of problems you've had with your demos or sessions (hopefully Buck Woody (Blog | Twitter) will post something; he's got some great stories)? If you aren't syndicated on SQLServerPedia.com or SSC, please link back to me so I can compile a master list later.
So it is time for another vote, this one for the second category that I have a SQL Rally session submitted in. This time the vote is for the "Enterprise Database Administration & Deployment" track. I've got a session up titled "Using SQL Server Denali's Always On", again in the Summit Spotlight section.
The abstract for my session is:
In this session we will look at the features provided with Microsoft SQL Server "Denali" as part of the "Always On" feature set, including site-to-site configurations that allow for a large scale high availability solution without the need for any high end SAN storage. Additionally, we will look at the ability to have redundant servers which can be used for reporting or for taking your backups, reducing the load on the production database.