The Multifunctioning DBA

October 5, 2011  1:52 PM

I need some help

Colin Smith

Yesterday I had an interesting issue that I had never run into before. I attempted to drop a database on SQL Server 2008 SP1; the drop failed and left the database in single user mode. I attempted the drop again, and it failed saying the database was in single user mode and a user was already in it. I ran sp_who and found that spid 12 was in my database with a command of CHECKPOINT. That never changed, and as you know, I cannot kill that spid since it is not a user session. I had to restart the instance to get that session out of my database, and then I was able to drop the database. Why did that spid stick in my database, though? How can I get it out without restarting if it happens again? If you have any clue, please help me out; I am stumped by this one. Please leave a comment here or hit me up on Twitter, @smithco32.
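For reference, this is roughly how to see what is sitting in a database before a drop (MyStuckDB is a placeholder name, not the real database):

```sql
-- MyStuckDB is a placeholder name.
-- List every session currently executing in the database:
SELECT session_id, status, command, wait_type
FROM sys.dm_exec_requests
WHERE database_id = DB_ID('MyStuckDB');

-- One thing that might be worth trying before a restart (I have not
-- confirmed it helps with a stuck checkpoint spid):
ALTER DATABASE MyStuckDB SET MULTI_USER WITH ROLLBACK IMMEDIATE;
```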


October 2, 2011  3:23 PM

Just a little Index

Colin Smith
I ran into a SQL Server that was performing poorly, and the only metric that looked like it was causing an issue was CPU, averaging about 95%. I ran the query below to see what the top CPU consumers on the system were. It also showed me that they were executed a lot: about 200 times per minute each.
select top 10
    sum(qs.total_worker_time) as total_worker_time,
    sum(qs.execution_count) as execution_count,
    count(*) as [#_statements],
    qt.dbid, qt.objectid
from sys.dm_exec_query_stats as qs
cross apply sys.dm_exec_sql_text(qs.sql_handle) as qt
group by qt.dbid, qt.objectid
order by sum(qs.total_worker_time) desc

This query showed me two queries that were causing the CPU to run so high. I took a look at them and they were very simple: one was a simple select with a where clause, and the second was an update of a single row in the same table the select was hitting. I grabbed plans for both and took a look. Both were doing clustered index scans, which is really a table scan. The table is not huge, at about 200K rows, but since both queries could be covered by the same index and the table did not have any other nonclustered indexes, I thought it was worth a shot.
I created the one index on one column of the table, and the CPU immediately dropped off a cliff to about 30-40%. Not only that, I also saw batch requests per second jump from around 20 to almost 1000. I then looked at the new plans: both queries were doing index seeks now, so they were much more efficient; CPU use for both dropped to next to nothing, and execution times dropped from around 200ms to less than 5ms. Now the queries are executing over 10K times per minute and the server is in much better shape. The application team that uses this database said the app went from processing one record every 1 to 2 minutes to one record every 2 to 3 seconds. That was some big bang for the buck. I hope that when the application support team takes this to the application vendor, they will take a look at indexing strategy and implement this index, and perhaps some others that will help that I just have not come across yet.
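The index itself was about as simple as it gets. With made-up table and column names (the real ones belong to the vendor's schema), it looked roughly like this:

```sql
-- Hypothetical names; the real table and column belong to the vendor app.
-- A single-column nonclustered index that covers both the select's WHERE
-- clause and the update's row lookup.
CREATE NONCLUSTERED INDEX IX_AppTable_KeyColumn
    ON dbo.AppTable (KeyColumn);
```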

September 30, 2011  4:13 AM

Powershell to add logins to SQL Server

Colin Smith
This is a script that I threw together for an app that needed 60 logins created on the SQL Server. We had all the AD groups created using a standard naming convention (app_DEV_*, app_QA_*, and app_PRD_*, as you can see in the code), so here is the code I put together to make this easy and very fast. For this you must have the QAD cmdlets installed.
Function Server
{
    $devserver  = "dev\qa servername"    ## this server houses both the DEV and QA databases
    $prodserver = "Prodservername"
    $server = Read-Host "What instance do you want to create the logins on? Dev, QA, or Prod?"
    if ($server -eq "Prod")
    {
        $groups = Get-QADGroup app_PRD_*
        foreach ($group in $groups)
        {
            $name = $group.Name
            Invoke-Sqlcmd -ServerInstance $prodserver -Database master -Query "create login [ssg\$name] from windows WITH DEFAULT_DATABASE=[master]"
        }
        Echo "All Groups have been created on $server. Please run SQL Scripts to create users and map to roles."
    }
    elseif (($server -eq "Dev") -or ($server -eq "QA"))
    {
        $QAgroups  = Get-QADGroup app_QA_*
        $DevGroups = Get-QADGroup app_DEV_*
        $groups = $QAgroups + $DevGroups
        foreach ($group in $groups)
        {
            $name = $group.Name
            Invoke-Sqlcmd -ServerInstance $devserver -Database master -Query "create login [domain\$name] from windows WITH DEFAULT_DATABASE=[master]"
        }
        Echo "All Groups have been created on $server. Please run SQL Scripts to create users and map to roles."
    }
    else
    {
        Echo "Incorrect selection made. Please try again."
    }
}

So this is pretty simple, and it is so much faster than manually creating all 60 or more groups on each instance.
Hope that this will help out.

September 28, 2011  1:27 PM

Dynamic SQL

Colin Smith

The other day I wanted to write a query that would generate my actual query in the results pane. I wanted to generate the SQL to delete all the logins on the instance that met a certain criteria. I had some trouble, but I worked it out, and once I did I thought, wow, that was way easier than I was making it. So I thought a post might help a few people.

So say I wanted to create the SQL to drop all logins with names like '%smi%'. I was doing something like this:

select 'drop login ' + (select name from sys.syslogins where name like '%smi%')

If you run the subselect on its own, it returns all the logins you are looking for; but written like this, you get an error that the subquery returned more than one result, which is not allowed. This was very frustrating, but then it hit me, and I did the following:

select 'drop login ' + name from sys.syslogins where name like '%smi%'

and that gives you the drop statements in the results pane.

So all you have to do is run your actual select statement to get the names and concatenate the text you want around them.
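One small refinement, if your login names can contain odd characters, is to let SQL Server bracket the names for you with QUOTENAME. This variant also uses sys.server_principals, the newer catalog view (sys.syslogins still works but is kept around for backward compatibility):

```sql
-- Generate bracketed drop statements; S/U/G = SQL login, Windows user,
-- Windows group.
SELECT 'drop login ' + QUOTENAME(name)
FROM sys.server_principals
WHERE name LIKE '%smi%'
  AND type IN ('S', 'U', 'G');
```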

Hope that helps.

September 27, 2011  10:00 AM

Some Powershell Help

Colin Smith

Since I started my new job, I let the guys I used to work with know that I would be happy to help them out if they have any questions about anything, especially PowerShell, since I took care of pretty much all of our PowerShell needs. The other day I got an email asking how to find out how many lines were in an array they had created by doing a get-content on a file. They were doing:

$var | measure-object

This will get you what you want, but it does not give you just the value; it outputs more information than they were looking for. They also tried:

$var| measure-object | select-object count

This works as well, but it also gives the header, and they did not want that. So the easiest way is to run get-member on $var and see if it gives you what you want as a property on that object.

You will see a length property, so if you say $var.length, that will tell you how many entries are in the array.
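A quick sketch of all three approaches side by side (the file path is made up):

```powershell
# Made-up path; each line of the file becomes one array entry.
$var = Get-Content "C:\temp\somefile.txt"

$var | Measure-Object                  # full object: Count plus other properties
($var | Measure-Object).Count          # just the number, no header
$var.Length                            # same number, via the array's own property
```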

Hope that helps.

September 26, 2011  1:37 PM

On to Week Two

Colin Smith

Today I start my second week at my new job. The first week was alright, but a little slow; that is to be expected, as I need to learn the new environment and get up to speed with the way they do things. I did have a few fun things come up, like the memory testing, and I also got to write some code for dynamic SQL generation of dropping users, creating users, and adding them to groups. This is new for them, and they need it in order to do refreshes from the production servers to Dev and QA. All in all, it was not a bad week. How bad can it really be when I get to bring one of my dogs to work on Friday? That was strange, but my dog actually had a great time at work with me.

Today will not be as fun, as I have to go to an orientation, and those are really never fun. But I am sure it will not be as bad as I think. I just really want to dig in and get some actual work done. That is when I am having the most fun, so that is what I want to get back to.

September 26, 2011  1:27 PM

Another DR Test

Colin Smith

So we have what sounds like a great idea for our SQL Server DR. We are doing block-level replication between datacenters, so we thought it might be a good idea to just mount our disk on another server in the DR datacenter in case of a disaster. We want to do that with the disk that holds all databases, including the system databases. The thought is that if we build out the DR servers to look for the system DBs in the same location as the prod server, then once we bring that disk up and mount it to the DR servers, we should be good and not have to worry about anything like logins, for example, the way we would with mirroring.

So that is the thought, and I am going to build out a couple of servers in VirtualBox and try to test it. It should be fun and simple enough to POC this idea. I will let you know more details on the test and whether it works out.

Hope this might help

September 23, 2011  1:32 PM

DR SQL Server Memory Test

Colin Smith

My new company has a DR test coming up, and we came up with a great idea on how to make the SQL Server part of the test pretty simple. We use SAN block-level replication, so we should be able to mount our shared disk on any server that has the correct version of SQL Server installed. One question we had, though: if SQL Server is configured to take, say, a minimum of 1024 MB of memory, but the server only has 4 GB total, would it start up? The answer is yes, it will. I tested this using SQL Server 2008 running in VirtualBox. I had SQL Server set up to grab 1024 MB minimum at startup. I shut down the VM, configured the machine to only have 52 MB, and then started it back up. SQL Server came up just fine. So this tells me it will start up, and then we can modify the settings to be what we would like.
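For reference, the memory settings in play here can be adjusted after startup with sp_configure; the values below are examples only, not recommendations for the real DR hardware:

```sql
-- Make the memory options visible.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Example values only; size these for the actual DR server.
EXEC sp_configure 'min server memory (MB)', 1024;
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;
```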

Hope this helps.

September 20, 2011  3:59 AM

Day One

Colin Smith

Day one at the new job was a bit slow, but I guess that is to be expected. I had to do a lot of paperwork, then get situated at my new desk, log into my new machine, and start to figure out the new environment. While I was doing that, I found my own blog very useful: I downloaded some of my own scripts from here to use at the new job. One was my SQL Server Health Check and another was my Register Servers script. I was able to use these to check out all the instances they have and see what kind of shape they were in, for the most part. Then I registered all the servers so that I could dig in a bit deeper by running queries against multiple instances. Overall I would say it was a good first day, though a bit slow, as I am sure the next week or two will be as well. I am excited for this new opportunity, but it was strange being at a new desk after so long at my old one.

September 19, 2011  10:00 AM

SQL Server Health Check Follow Up

Colin Smith

A long while ago I wrote a script that does a SQL Server health check, and I posted it here. Recently I have gotten some comments on it, and I want to help some people out, so I thought I would review it. It is a script that connects to SQL Servers defined in a list (actually a CSV file), gathers some information, and outputs that data to an Excel sheet. You can modify the script to gather any data you want. The script is available in that post.

It seems the questions are related to reading in the file that holds the instance information. This is the line that reads in the file:

$servers = Import-Csv "\\ent-pocpacapcx01\d$\monitoring\sqltab.txt"

So you need to make sure you change that path and file name to point at your server file. Let's go over the block that processes what has been read in, so we can define what the CSV file needs to look like.

foreach ($entry in $servers)
{
    $torp     = $entry.TorP
    $mon      = $entry.monitor
    $machine  = $entry.server
    $errorlog = $entry.errorlog
    $os       = $entry.os2000
    $iname    = $entry.Instance
    if ($iname -eq "Null")
    {
        $instance = "$machine"
    }
    else
    {
        $instance = "$machine\$iname"
    }
    $instance = $instance.ToUpper()
}

What we can see is that I am looking at everything import-csv brought into the $servers variable, and then defining some variables; you may not have all of these in your CSV file. But if you look at $entry._____, that ______ is my header information. You need to make sure your CSV file has the header information you want in it. You might not have an errorlog or a TorP header, because you may not need them. Just make sure you modify either your file or this code so they all match up; then you should not have issues with the file not reading in correctly. Please let me know if you have any questions: leave me a comment and I will try to respond, or hit me up on Twitter @smithco32 and I will get back to you.
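For example, a CSV that matches the headers used above would look something like this (server and instance names are placeholders):

```
TorP,monitor,server,errorlog,os2000,Instance
P,Y,SQLBOX01,Y,N,Null
T,Y,SQLBOX02,N,N,INST1
```

The Instance column uses the literal string "Null" for a default instance, which is what the if check in the loop is keying on.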

Thanks and I hope that this helps.
