SQL Server with Mr. Denny


June 28, 2011  2:00 PM

Let's learn about Virtualization at SQL Excursions

Denny Cherry

The next session from SQL Excursions that I want to talk about is my session on virtualization.  During this session, we will be talking about best practices for running SQL Server in a virtual machine so that you can get the best performance possible out of your virtual SQL Servers.  We will also look at when it is a good idea not to virtualize your SQL Server.  With VMworld happening just a few weeks before SQL Excursions, there should be some great new VMware (and probably Hyper-V as well) announcements by then, so you can be sure that the information we'll be presenting is very up to date, and we'll probably be talking about some futures for one or both platforms as well.

Being able to get SQL Server up and running within a virtual machine is key in today’s IT world.

If you haven’t signed up yet, now is the time.

See you in Napa September 22nd-24th,

Denny

June 27, 2011  7:39 PM

SQL Excursions Sessions

Denny Cherry

As SQL Excursions is getting closer, I wanted to talk a little about the sessions that I'll be presenting at the event, which I'll be doing over the course of this week.  The sessions that I'll be presenting are:

  • Storage
  • Virtualization
  • High Availability
  • Indexing Internals
  • Server Consolidation

All told, I'll be presenting for about 8 hours over the two days, with Tom LaRock (blog | @sqlrockstar) presenting the other 8 hours.  Because this isn't a traditional conference, not all the sessions will be locked in at an hour long.  Some may be longer, some shorter, depending on how the day moves along.  The goal with these sessions is to get the questions answered, not to get the session presented in the time allotted.

The first session on the list above is Storage, so I'll start with that one for today's blog post.  During this session I'll be talking about some of the more advanced options available for storage array tuning and data replication, and how these can be used to improve storage performance not just for your SQL Servers but for your entire environment, as well as to reduce your backup windows.

If you haven’t signed up yet, now is the time.

See you in Napa September 22nd-24th,

Denny


June 27, 2011  2:00 PM

#SoCalCodeCamp Slide Decks

Denny Cherry

This weekend at the Code Camp I gave three presentations. Here are the slide decks in case you wanted to download them.

What’s New in Manageability for Microsoft SQL Server Code-Named Denali
Indexing Internals
Using SQL Server Denali’s Always On

If you attended the sessions, don’t forget to rate the sessions on SpeakerRate.com/mrdenny.

Denny


June 23, 2011  1:02 PM

How to find out if someone is cheating on a job interview

Denny Cherry

If you are the person giving the technical interview, you've got a few jobs on your hands.  First, and most obvious, is to gauge the technical skills of the person sitting in front of you.  The second is to see if the technical skills that they are presenting to you are actually their own, or if someone has coached them to get past your technical interview.  The third is to see if they'll be a good fit for your team.

Gauging Their Technical Chops

This is the most obvious thing that needs to be done, as this is the whole reason that they are there sitting in front of you.  You need to make sure that what's written on their resume is actually what they know.  Before you start giving them the third degree, make sure that the resume that you have is actually the correct one.  Often recruiters will make some slight adjustments to the person's resume so that the experience listed more closely matches your job requirements.  They are doing this because they don't get paid unless you hire their client, and the only way that's going to happen is if their client matches the requirements on your job posting.

Once the person in front of you has said "yep, that's mine" or has given you a corrected copy (I always bring a couple of copies with me just in case), the grilling can start.  Start by asking some basic questions about everything listed on their resume.  It's OK if you jump around; it'll make them think more and you'll get more realistic answers from them.  For the things that they say they are at an expert level on, dig into those topics.  Ask crazy obscure stuff if needed; really see how well they know it.  I like asking people to rate themselves on a 10 point scale on various topics.  Also ask them how their co-workers would rate them on those topics.  If you are sitting in front of me, and you tell me that you are a 9 out of 10 on index tuning, you had better at least know who Kimberly Tripp is (being able to name her favorite food and the name of her company is worth some bonus points).  You had also better know some pretty good information about the B-Tree structure that SQL Server uses, how SQL navigates the tree, etc.

Some people believe that they need to ask the exact same questions of every person they interview, but this isn't true.  Every interview should be customized for the person that you are talking to.

Are they blowing smoke up your backside?

Depending on how well someone was coached this can be a bit harder.  The deeper you dig into a topic, the easier it'll be.  If they have to sit and think for a minute before each answer, that's probably a sign that they don't know the answer.  If they take a kind of wandering path to the answer, they probably know it on their own; they just need to find it in there.  If they can only answer the basic questions on anything you ask, they were probably coached (see my point above about asking them if the resume you have is correct).  Let them know that not knowing a few answers is OK.  Not everyone knows everything about everything in SQL Server.  If they say that they do, they are lying.

One thing to note is that if they just get up and leave, they probably figured out that you've figured out that they don't know squat.

Are they a good fit for your team?

Once you've figured out if they are a good technical fit for your company, you need to see if they will be a good social fit for your team.  Get the team (or some of the team) together with the person and just chat and see how it goes.  The talk can be about the weather, sports (yuck), food, SQL, whatever, as long as HR isn't going to fire anyone for the topic at hand (the smoking hot receptionist probably isn't the best topic).  Someone could be the best technical person ever, but if they aren't a good social fit with the team, there is going to be some major friction at some point, and that isn't going to serve anyone at all.  Now I'm not saying that you need to be best friends with this person from day one, but you need to be able to fight, argue and agree with each other without coming to blows and without anyone's ego getting too bruised.  This is especially true of junior and senior level people.  Junior level people need to be able to take criticism without wanting to quit or kill you, and senior level people need to be able to have people ask them questions without feeling the need to kill everyone in the room.

Hiring a specialist without having a specialist around

One of the problems that I've seen a lot of companies have when it comes to hiring new staff is that they don't have a specialist in that field who can conduct a really good interview.  For example, it takes a knowledgeable VMware person to interview a VMware admin candidate, just like it takes a knowledgeable Windows admin to interview Windows admin candidates.  For times like this it might be good to bring in a person from the outside who can be trusted to conduct the technical interview.  Paying for a few hours of a senior consultant's time to do the technical part of the interview can save you a lot of time and money compared to bringing the wrong person on.  You want someone who is going to conduct an actual interview; a pre-written set of questions isn't going to help you out much.  A proper interview needs to be a fluid conversation, not just a canned set of questions.

While this outside person won't necessarily know the specifics of your shop, you can give them the rundown pretty quickly about the technology that you have and the problems you've faced, and they can craft this into an interview when talking to the candidates.  For the record, I am more than happy to help out with this if you need a hand; just drop me an email.

Denny


June 20, 2011  1:00 PM

Why you shouldn’t cheat on a job interview

Denny Cherry

I see questions on various forums all the time from people trying to land that first job, asking for brain dumps of what they need to know to pass an interview.  Sometimes these are people who have been working in the field for a few years and want that next job with the next pay raise.

Getting someone to give you interview question answers before you take an interview isn't going to do much for you.  Most people who conduct technical interviews will be able to tell pretty quickly that you don't actually know the material.  Saying that you don't know something during an interview is a perfectly acceptable answer.  But follow it up with "I'd look on Website X, or Website Y, or in Books OnLine under 'this or that' to find that out."  The reason that they are asking these questions is that these are the technologies being used in their shop, and they need someone who knows how to support them.  If you do successfully fake your way through the interview and get hired, you probably won't be staying around very long once it is clear that you don't know the technologies that you said during the interview that you knew.  And seeing as how you would have just left your prior job at this point, you won't exactly have a lot of job options to fall back on.

When you interview, be honest about your experience.  This is doubly true when you are being interviewed by a senior level person like myself.  If, when I'm interviewing you, you tell me that you are a replication expert, you had better be able to handle every screwed up replication question that I ask you (and you will probably want to keep in mind that I know it pretty well, as I've rewritten SQL replication to increase performance at a prior company).  If you tell me that you know a lot about SQL Service Broker, be prepared to answer some serious performance tuning questions, and maybe even to set up Service Broker on a laptop (one that doesn't have internet access).

On the other hand, if you tell me that you don't know these technologies, but I feel that you've shown interest in learning them, that's probably good enough for me for a junior or mid level position.  If you are going for a senior level position, you should probably have a decent amount of experience with a decent number of SQL Server components.

If you want to read about some interview nightmares, check out Sean McCown's blog.

Denny


June 15, 2011  5:27 PM

It’s #sqlpass summit session announcement time

Denny Cherry

So I’ve managed to trick the kind folks of the SQL PASS program committee once again this year.  I’ve gotten two sessions accepted for the summit.

The first is a "regular session", the normal 75 minute community session, where I'll be presenting "Where should I be encrypting my data", talking about all the various ways to encrypt data within the SQL Server database.  These techniques will for the most part work on any version of SQL Server from SQL 6.5 all the way through SQL Server "Denali" (which we'll hopefully know the name of before the summit).

The second session is one of the new half day sessions, called "So how does the BI workload impact the database engine?", which I'll be presenting with the lovely, brilliant and highly talented Stacia Misner.  During this session we'll be looking at a variety of things, including how ETL extracts and loads actually impact the databases they touch, and why running queries against SSAS and a data warehouse is faster than running them against the OLTP application.  This session won't be so much a BI session on how to do BI tasks, but rather on how those BI tasks function under the hood of the core SQL Server engine.

See you at the summit.

Denny


June 14, 2011  2:21 AM

Auto Close, Contained Databases and DDoS

Denny Cherry

One of the features which has been announced as coming in SQL Server "Denali" is the "Contained Database" feature.  The part of Contained Databases that I'm looking forward to the most is the ability to create a user within the database without having to first create a login for the user.  This will make database consolidation and migration projects so much simpler in the future, as you won't have to first create the login on the destination instance with the same SID, or risk having to resync the logins and users using the sp_change_users_login system stored procedure.
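
As a rough sketch of how this is expected to look in "Denali" (the syntax is based on what has been announced and may change before release; the database name, user name, and password below are just placeholders), the user gets created with a password directly in the database, with no login anywhere in sight:

-- Enable contained database authentication at the instance level.
EXEC sp_configure 'contained database authentication', 1
RECONFIGURE
GO
-- Mark the database as contained.
ALTER DATABASE MyDatabase SET CONTAINMENT = PARTIAL
GO
-- Create a user with a password directly in the database; no login required.
USE MyDatabase
GO
CREATE USER AppUser WITH PASSWORD = 'UseAStr0ngPasswordHere!'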

However, there is a little catch with having a lot of contained databases using contained authentication on a single server, especially if the auto close flag is enabled, like it often is on hosting companies' servers.  If you are using contained databases and a user attempts to log into a contained database with the wrong password, the database must be opened, the password checked, and then the database closed.  If this were to begin happening to a large number of contained databases, the SQL Server could end up crashing itself as it tries to open and close all of these databases.  The reason that I see this happening on hosting company servers more than anywhere else is that hosting companies put lots, and I mean lots, of databases on a single SQL Server instance.  If that server is exposed to the Internet (which they often are, so that their customers can log into the server via Management Studio), then this becomes an even bigger problem.

Basically what I'm trying to say here is that if you have a lot of databases on the server, and you use the auto-close flag on those databases to keep databases that aren't being used from taking up any memory, you'll need to change this practice before you start deploying contained databases on SQL Server "Denali" when it releases.
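
If you want to see which databases on a server have auto close turned on today, the is_auto_close_on column in sys.databases will tell you, and turning it off is a single ALTER DATABASE per database (MyDatabase below is a placeholder name):

-- Find databases that currently have auto close enabled.
SELECT name
FROM sys.databases
WHERE is_auto_close_on = 1

-- Turn auto close off for a given database.
ALTER DATABASE MyDatabase SET AUTO_CLOSE OFF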

Denny


June 9, 2011  3:00 PM

Tech Ed Recap

Denny Cherry

I know that Microsoft’s Tech Ed 2011 ended a couple of weeks ago, and that my recap is way late, but better late than never.  It’s been a crazy few weeks with EMC World 2011, Tech Ed 2011 the next week, then a two day train trip to Seattle which was supposed to have WiFi but didn’t, then SQL Cruise, and as I’m writing this I’m in New York for a week working.

Anyway, back to Tech Ed 2011, which was a blast.  This was my first year (hopefully of many) speaking at Microsoft's Tech Ed conference.  If you haven't heard of Tech Ed before, it is Microsoft's premier IT Pro conference.  While there are some developer tools covered, the conference is mostly about the sysadmin side of the house, which is OK seeing as how the dev side of the house has MIX and PDC, to name just a couple.  Needless to say I was honored that Microsoft asked me to present a session on SQL Server "Denali", specifically on the new manageability features of "Denali".  The only problem was that there aren't a whole lot of earth shattering manageability changes being rolled out in "Denali" (or if there are, we don't know about them yet).  I think I made the session work OK, though, by going over what is coming out, as well as some other new features that, while not related to manageability, are very cool.

This was my first time being a booth babe (I looked damn hot in that SQL Server mini-skirt, you should have been there to see it), which was a requirement of getting a session.  I've got to say, it was pretty cool talking to all the attendees and working through their problems.  I know that we got a few people's problems solved right there in the middle of the show floor.  There were some interesting questions, some questions that I was surprised were asked, and some damn hard questions.  Fortunately there were always a ton of really smart people working the booth who were able to field them.  In true SQL Server community fashion, no one had a problem telling the person asking the question, "I've got no idea, let's get the guy that does.  He happens to be right over here just a few feet away."

Outside of the exhibit hall and convention center I had some great conversations about SQL Server, community, work, etc. with some really great people.  I had lots of fun hanging out and talking with Ed Hickey from the SQL Server product team, and I had a blast talking to Adrian Bethune, also from the SQL Server product team (he's the new Data Tier Applications PM, so we'll be having some fun with him later).  It's always a blast to see everyone at these large events.  Atlanta was a really great place for SQL Server to have Tech Ed because there are just so many people in the Atlanta area who are involved in the SQL Server community and happen to be friends of mine, like Geoff Hiten (twitter), Audrey Hammonds (blog | twitter), Julie Smith (blog | twitter), and Aaron Nelson (blog | twitter).  I know that I'm missing some locals from this list; everyone was a blast to see as always.

As for my session, I thought it went pretty well, especially when someone came down and found me at the SQL Server booth later that day to tell me that my session was the best one he had seen all week so far.  So to that random person: hopefully you read this, and thank you, because people doing that is why speakers are up there giving these presentations.  As long as just one person gets something useful from the session, I consider my job done.  I've looked at my scores a little, and I landed right in the middle of the pack with a 4 something out of 5.  All in all, not too shabby.  Hopefully Microsoft will see it the same way and invite me back next year to Tech Ed 2012 in Orlando, FL.

If you came to my session at Tech Ed, thanks; if not, I believe the sessions are all posted on the Tech Ed website for all to see.

See you at the next conference,

Denny


June 7, 2011  9:00 PM

Taking locks without doing SELECT WITH UPDLOCK

Denny Cherry

All too often developers need to force some locks on a table so that they can be sure the records aren't going to change between the time they first look at the records and when the transaction is completed.  The most common method that I've seen to do this involves running, at the top of the transaction, a select statement against the table with the UPDLOCK or XLOCK hint, which forces the database engine to take higher locks than it normally would against the table.  While this does have the desired end result of locking the table, it causes a lot of unneeded IO to be generated, and takes a lot more time than is needed.

For example, let's assume that we want to lock the Sales.Individual table in the AdventureWorks database so that we can do some processing on it without allowing anyone else to access the table.  If we were to issue a SELECT COUNT(*) FROM Sales.Individual WITH (TABLOCK, XLOCK) against the database, we lock the table as requested; however, it generates 3,106 reads from disk (physical reads plus read-ahead reads) against the database, as we can see below in the output from the Messages tab.

DBCC DROPCLEANBUFFERS
GO
SET STATISTICS IO ON
GO
BEGIN TRAN
    SELECT COUNT(*)
    FROM Sales.Individual WITH (TABLOCK, XLOCK)

    /*Business Logic*/
COMMIT
GO
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

(1 row(s) affected)

Table ‘Individual’. Scan count 1, logical reads 3090, physical reads 8, read-ahead reads 3098, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

If we look at the sys.dm_tran_locks DMV we'll now see that we have taken an exclusive lock against the table (don't forget that you have to query the DMV from within the transaction in order to see the lock).  That is a lot of IO to generate in order to take a single lock within the database engine.  You can imagine what would happen if this were a much larger table, say a fact table within a data warehouse.  A large multi-year fact table could end up generating millions of IOs just to lock the table.
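
If you want to check for yourself, a quick query like this against the DMV, run from the same session while the transaction is still open, will show the lock:

SELECT resource_type, request_mode, request_status
FROM sys.dm_tran_locks
WHERE request_session_id = @@SPID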

A better solution to this sort of problem is the sp_getapplock system stored procedure.  This procedure allows you to take table level locks without running queries against the table.  It can lock tables which are gigabytes in size in under a second.  When we run the command telling it to lock the Sales.Individual table, no IO is generated and yet we still see the object being locked.  In this case we would run the command below to generate the needed lock.

exec sp_getapplock @Resource = 'Sales.Individual', @LockMode = 'Exclusive'

The only difference we should see in the output of the sys.dm_tran_locks DMV is that the value in the resource_type column has changed from OBJECT to APPLICATION.  Once the lock has been taken against the database we can do all the processing that we want to against the table without having to worry about another user coming in and changing the base data of the table.

The sp_getapplock procedure must be run within an explicit transaction, and has several parameters so that you can control what it is doing. 
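
Putting that together, a minimal sketch of the basic pattern looks like this, with the /*Business Logic*/ placeholder standing in for whatever processing you need to do while the lock is held:

BEGIN TRAN
    EXEC sp_getapplock @Resource = 'Sales.Individual', @LockMode = 'Exclusive'

    /*Business Logic*/
COMMIT    -- committing (or rolling back) releases the lock
GO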

The first parameter is @Resource which we used above.  This parameter is how you tell the stored procedure what object you wish to lock.  It accepts the input as schema.object or just the object if the object is within your default schema.  It is recommended that you use the two part name to ensure that you are always locking the correct object.

The next parameter is @LockMode, which we also used above.  This parameter allows you to tell the database engine what locking level you want.  Your options are Shared, Update, IntentShared, IntentExclusive, and Exclusive.  Any other value specified will throw an error.

The third parameter is @LockOwner.  This parameter allows you to tell the stored procedure to take the lock for the duration of the transaction (the default) or the duration of the session.  To explicitly specify that you want to take the lock for the duration of the transaction specify the value of “Transaction”.  To specify that you want to take the lock for the duration of the session specify the value of “Session”.  When the value of “Session” is used the procedure does not need to be called within a transaction.  If a value of “Transaction” or no value is specified then the procedure does need to be called within an explicitly defined transaction.

The fourth parameter is @LockTimeout.  This parameter allows you to tell the procedure how many milliseconds to wait before returning an error when attempting to take the lock.  If you want the procedure to return immediately, specify a value of 0.  The default value for this parameter is the same as the value returned by querying the @@LOCK_TIMEOUT system function.
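
When you set a short @LockTimeout it's worth checking the procedure's return code rather than assuming the lock was granted; per Books OnLine, 0 and 1 mean the lock was taken, while negative values mean it was not.  A sketch of that:

DECLARE @result int

BEGIN TRAN
EXEC @result = sp_getapplock @Resource = 'Sales.Individual',
                             @LockMode = 'Exclusive',
                             @LockTimeout = 0    -- return right away instead of waiting

IF @result >= 0
BEGIN
    /*Business Logic*/
    COMMIT
END
ELSE
BEGIN
    -- -1 = timeout, -2 = canceled, -3 = deadlock victim, -999 = other error
    ROLLBACK
END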

The fifth and final parameter is @DbPrincipal.  This parameter allows you to tell the procedure the name of the user, role or application role which has rights to the object.  Honestly I haven't really figured out what this parameter is used for.  What I do know is that if you specify a user, role or application role which doesn't have rights to the object, the procedure call will fail.  This parameter defaults to the public role.  If you get an error when using the default value, create a role with no users in it, grant the role rights to the object, and then specify that role in the parameter.  No users need to be assigned to the role to make this work.

Releasing the lock that you've just taken can be done in a couple of different ways.  The first is the easiest: commit the transaction using COMMIT (ROLLBACK will also release the lock, but you'll lose everything that you've done).  You can also use the sp_releaseapplock system stored procedure.  The sp_releaseapplock procedure accepts three parameters: @Resource, @LockOwner and @DbPrincipal.  Simply set these to the same values which you used when taking the lock and the lock will be released.  The sp_releaseapplock procedure can only be used to release locks which were taken by using the sp_getapplock procedure; it cannot be used to release traditional locks that the database engine has taken naturally, and it can only be used to release locks which were created by the current session.
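
For a session owned lock, where no transaction is required, the take-and-release pair looks something like this:

-- Take a session scoped lock; no transaction is needed when @LockOwner = 'Session'.
EXEC sp_getapplock @Resource = 'Sales.Individual', @LockMode = 'Exclusive', @LockOwner = 'Session'

/*Business Logic*/

-- Release it using the same @Resource and @LockOwner values used to take it.
EXEC sp_releaseapplock @Resource = 'Sales.Individual', @LockOwner = 'Session'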

Hopefully some of this knowledge can help speed up your data processing times.

Denny


June 2, 2011  2:00 PM

Why do I do all this work for free?

Denny Cherry

Besides the insanity? It’s fun. If it wasn’t, I wouldn’t do it. God knows it isn’t for the money.

Yeah the presentations can be a pain to come up with, and coming up with topics to present on is probably my least favorite part of doing all this. But sharing the information that I know, and being able to learn more through it thanks to my NDA is just awesome.

What really makes all this work worthwhile is when I get emails from people saying that the information that they learned from an article or session helped them with their job, or to fix a problem that they were having.

And that right there is why I do it.

Denny

