The Multifunctioning DBA


August 23, 2012  3:17 PM

Index Fragmenting on Purpose



Posted by: Colin Smith
SQL Admin, SQL Administration, MS SQL, SQL Server

In most cases we want to reduce fragmentation, and we come up with ways, like lowering the fill factor, to minimize the fragmentation that occurs during inserts and updates. I, however, need to create fragmentation. I am sure you are asking yourself why I might need to do this. That is a great question, and the answer is that I need to test some re-indexing scripts and get timings on how long they are going to run. To accomplish this, I am going to use a feature of SQL Server that I try to NEVER use: the dreaded Shrink Database. The reason I am using Shrink Database is that it, by its nature, causes fragmentation. So today is just a day of worst practices for me. I am doing this in a sandbox testing environment, and I would not recommend ever using Shrink Database unless you absolutely have to. If you do run Shrink Database for any reason, just know that you are going to need to clean up your indexes afterward, and that is going to grow your database again.
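A minimal T-SQL sketch of the idea, for a sandbox only; the database name `SandboxDB` is a placeholder, not a real system of mine:

```sql
-- WORST PRACTICE on purpose: shrink the database to generate index
-- fragmentation for re-index script testing. Sandbox only!
USE SandboxDB;
GO

-- Page movement during the shrink fragments the indexes.
DBCC SHRINKDATABASE (SandboxDB);
GO

-- Check how fragmented the indexes ended up.
SELECT  OBJECT_NAME(ips.object_id)       AS table_name,
        i.name                           AS index_name,
        ips.avg_fragmentation_in_percent
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
        ON i.object_id = ips.object_id
       AND i.index_id  = ips.index_id
WHERE   ips.avg_fragmentation_in_percent > 10;
GO
```

Run the fragmentation query before and after the shrink and you will have realistic input for timing the re-index scripts.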

August 21, 2012  9:10 PM

Partition Table Load Mistake



Posted by: Colin Smith
SQL Admin, SQL Administration, SQL Server

I have a table with 54 partitions on a range-right partition function. Ideally I like to have 52 partitions with data in them, with the first and the last partitions empty. Well, the other day our ETL team ran a load job that is meant to run after our weekly load process. However, they had not run the weekly load process for the three weeks prior to running this other job. This means I did not have partitions created for the dates they were inserting data for. I looked at my table and saw that my last partition, which I like to keep at 0 rows, had about 15 million rows in it. I contacted them, and they told me that since they had not run the weekly job for so long, they thought they would just catch up by loading all of it at the same time. Well, this is an issue, because now if I split my partition on the weekly date that I need to, SQL Server will have to re-sort all that data into the correct partitions. This is not a quick operation. So I had to do some cleanup.

This is fairly easy to do since I did not split my partition yet.

1. Create a stage out table with the same structure, indexes included, as the base table and make sure it is on the same filegroup.

2. Switch that last partition out to the stage table that you just built.

3. Drop the stage table.
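The three steps above can be sketched in T-SQL; the table, column, filegroup, and partition-number names here are placeholders, not my actual objects:

```sql
-- 1. Stage table: same structure and indexes as the base table, and on the
--    same filegroup as the partition being switched out (names are examples).
CREATE TABLE dbo.FactSales_Stage
(
    SaleDate DATETIME      NOT NULL,
    Amount   DECIMAL(18,2) NOT NULL
) ON [FG_Last];

CREATE CLUSTERED INDEX CIX_FactSales_Stage
    ON dbo.FactSales_Stage (SaleDate) ON [FG_Last];

-- 2. Switch the last partition (54 in my case) out to the stage table.
--    This is a metadata-only operation, so it completes almost instantly.
ALTER TABLE dbo.FactSales
    SWITCH PARTITION 54 TO dbo.FactSales_Stage;

-- 3. Drop the stage table and the unwanted rows go with it.
DROP TABLE dbo.FactSales_Stage;
```

Because the switch is metadata-only, this avoids the expensive data movement that splitting the overloaded partition would have caused.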

Now all that data is gone and my partitions look good again. I then asked the ETL team to run our standard weekly process to catch up, and they did. Now all is well with my table.


August 21, 2012  4:29 PM

Powershell to get DB Role Members Report into Excel



Posted by: Colin Smith
Database, Excel, Powershell

Recently I had an application support group ask if I could get a report of all the members of the database roles that the application uses. I thought about it and, you know me, I thought, POWERSHELL!! Of course I can do that, I will get you a spreadsheet with that data as soon as I can. Off to Powershell I go and start thinking about how to do this.

  1. Connect to the SQL Server instance
  2. Connect to the database of interest
  3. Gather all the database roles
  4. Enumerate the members of the roles
  5. Output to Excel in a readable format

That is the task and here is how I did it.


#Create a new Excel object using COM
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$Workbook = $Excel.Workbooks.Add()
$Sheet = $Workbook.Worksheets.Item(1)
#Counter variable for columns (one column per role)
$intcol = 1
connect-sql "Servername" ### See function below
$db = $s.Databases | where{$_.Name -eq "Databasename"}
$roles = $db.Roles | where{$_.Name -notlike "DB_*"}
foreach($role in $roles)
{
    #Role name as a bold column header in row 1
    $rname = $role.Name
    $Sheet.Cells.Item(1,$intcol) = $rname
    $Sheet.Cells.Item(1,$intcol).Font.Bold = $True
    $members = $role.EnumMembers()
    $introw = 2
    foreach($member in $members)
    {
        $Sheet.Cells.Item($introw,$intcol) = $member
        $introw = $introw + 1
    }
    $intcol = $intcol + 1
}
$Sheet.UsedRange.EntireColumn.AutoFit()

And here is the connect-sql function that I used.


function connect-sql ($arg)
{
    if($arg -eq $null)
    {
        $server = Read-Host "enter instance name to connect to"
    }
    else
    {
        $server = $arg
    }
    #Set $s at script scope so it is still available after the function returns
    $script:s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server
}

Hope that helps.


August 21, 2012  4:18 PM

MSSQL Server and BCP



Posted by: Colin Smith
SQL Admin, SQL Administration, MS SQL, SQL Server

I recently had a need to BCP a very large data set in and out of a database. The out is no issue.

bcp [databasename].[schema].[table] OUT path_to_output_file.txt -Sservername -c -T

is the format that I was using, and it worked fantastically. I was able to BCP out my 934 million rows.

Now I had to push it back into a different table. I built the table and started my BCP.

bcp [databasename].[schema].[table] IN path_to_output_file.txt -Sservername -c -T

Well, that got me about 800 million records before I blew up my log file. Ouch… So I had to change the BCP command and tell it to commit after so many rows had been inserted. This is an important step so that you do not blow up your log. It is as simple as this:

bcp [databasename].[schema].[table] IN path_to_output_file.txt -Sservername -c -T -b2000000

That will push the data in, and every 2 million rows inserted it will commit and keep your log under control.

 

Have fun using BCP from the command line.


August 18, 2012  10:36 PM

Windows 8 Enterprise Activation



Posted by: Colin Smith
Database

I mentioned that I downloaded Windows 8 from my TechNet subscription, but I had one heck of a time getting it to activate. When installing the Enterprise version the installer never asks you for your product key. I thought this was strange but I went with it until I could not customize my desktop with an un-activated copy. So I went and got my legit key from TechNet and attempted to go change my key. I went into the activation wizard and found no option to change the key. I did some research and found a command line option to set the key. Here you go.

slmgr.vbs /ipk <Product Key>

to change your key, then

slmgr.vbs /ato

to activate Windows.

 

I hope this will help someone.


August 17, 2012  12:00 PM

Powershell to Drop Procs



Posted by: Colin Smith
POSH, Powershell Tips, Scripting, Scripts, SQL Admin, SQL Administration, MS SQL, SQL Server

I was asked by a developer if I could drop all the procs in a database except the ones that he knew were being used. I thought: Powershell can do that. So I told him that I could if he provided me a list of the ones that he did not want dropped. He sent me a list and I got to work. I thought it would be a good idea to script out all the procs before I dropped them, just in case. Below is the code that I came up with.

 

 

connect-sql "name"
sqlscript-options

$db = $s.Databases | where{$_.Name -like "*DEV"}
$db.Name
$procs = $db.StoredProcedures | where{$_.IsSystemObject -eq 0}
$procs | select Name
"#####################################"
"#####################################"
foreach($proc in $procs)
{
    $pname = $proc.Name
    $pname
    #Script the proc out to a file before dropping it, just in case
    $proc.Script($so) | Out-File C:\Procs\$pname.sql -Force
    if(($pname -notlike "dba*") -and ($pname -ne "procname"))
    {
        $proc.Drop()
    }
}

 

So let's take a look at this. First you will notice that I am calling connect-sql, and you probably do not have this cmdlet. That is because I wrote it as a function that I import in my profile so that it is always available. Here is what that function does.
function connect-sql ($arg)
{
    if($arg -eq $null)
    {
        $server = Read-Host "enter instance name to connect to"
    }
    else
    {
        $server = $arg
    }
    #Set $s at script scope so it is still available after the function returns
    $script:s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server
}

 

So really it is just a shortcut for me to instantiate an SMO connection.

 

You also see sqlscript-options, and that is not a real cmdlet either. It is also a function, and here is what it does.

 

function sqlscript-options
{
    #Set $so at script scope so it can be passed to Script() later
    $script:so = new-object ('Microsoft.SqlServer.Management.Smo.ScriptingOptions')
    $so.DriIndexes = $TRUE
    $so.IncludeIfNotExists = $TRUE
    $so.SchemaQualify = $TRUE
    $so.AllowSystemObjects = $FALSE
    $so.DriPrimaryKey = $TRUE
    $so.DriForeignKeys = $TRUE
    $so.DriAllConstraints = $TRUE
    $so.IncludeDatabaseContext = $TRUE
    $so.Indexes = $TRUE
    $so.Triggers = $TRUE
}

 

So it sets up some options for when I script out objects. Give this a shot and see what you think.


August 17, 2012  4:00 AM

Windows 8



Posted by: Colin Smith
OS

I have downloaded the RTM Windows 8 bits from my TechNet account and installed them on a laptop that I use for testing. I must say that, so far, I am actually somewhat impressed. I have not played with it for long, so I still have a lot of things to do before I give it a thumbs up or down, but compared to the CTP release that I installed last, they did some good things.

I am still getting used to it, and I am attempting to keep an open mind about the Metro interface. I am not sure that I will ever fully come around to it, especially since, as an IT Professional, most of the apps that I am going to be using are not written to be Metro apps, and they will have to switch out of Metro and into the Windows 7 style desktop in order to run anyhow. So I think that for me and others like me, we just need to change to always run with the Windows 7 style desktop. I have not figured that out as of yet, but I will. For standard users, the Metro interface is actually very nice. It will take some getting used to, but I actually like it way more than I thought I would. I can definitely see the appeal of this for tablets, phones, and even the Xbox. I have to give MS some props on this, as I thought I would hate it through and through. I will be playing with it more and installing some more things on it to see what it can do, and I will let you know.


August 16, 2012  3:45 PM

Database Virtualization



Posted by: Colin Smith
Database Administration, Database Virtualization, MS SQL, MS SQL Admin, MS SQL Server

Some of you may see the name of this post and think that I am going to talk about running SQL Server on VMware or Hyper-V. Well, that is sort of correct, but not really. I am involved in the very early beta testing of a new product that will let you provision VDBs (virtual databases). I am not going to mention names just yet, but so far the product is very nice. There is no GUI yet, so I have to interact and do pretty much everything via the command line. But let me share this with you.

I picked a database that is about 180 GB, not very big, as my virtualization candidate. After doing the pre-work that is needed, I was able to provision my VDB to a SQL Server. If you look at it on SQL Server, you will see that the database is 180 GB, and you are able to do anything, so far, that you would with a real database. I am also running all this on VMware guests. Now if you look at the size of the VDB via the virtualization stack, you see that the database is only about 3 MB. That is a very big savings. That comes with one pretty big caveat, but it is worth it: you do have to have a separate SQL Server instance running where the virtualization software houses a compressed copy of the database, which is kept in a restoring state at all times. This is what the VDB is actually provisioned from. So how big is that database? Well, I am getting about a 3:1 compression ratio, so the restoring copy that is the VDB source is about 40 GB. So 180 GB to just over 40 GB, and now I have a second copy of that database. Now the really cool thing: say I need a third copy or more. Well, all of those can be taken off the same source, and they actually only take a few MB to start. That is a huge savings on disk. Think about if I were working with a database that was 1 TB or more.

Now another really great thing about this: provisioning the VDB takes a couple of minutes. Even for my 180 GB database, a restore to another instance for developers used to take over an hour; now it is literally less than 5 minutes, after the initial backup of the actual source is taken, compressed, and presented as a restoring database. But that is a one-time process, and now I can provision as many VDBs as I want, and it will only take me about 5 minutes or less per copy that I need. Plus, I am only going to be taking up megabytes and not gigabytes or terabytes. The possibilities of this are just awesome.

There is a lot more to talk about around this, and I am just scratching the surface here. But keep an eye out, as I will be posting more and more about this as I do more testing with the product. I am working with this company's engineers and providing feedback on features that would be nice to have, as well as letting them know about bugs that I run into.

 

 

 


August 16, 2012  3:08 PM

Easy way to find bytes per cluster on your disk



Posted by: Colin Smith
Administration, Data Files, Database Administration, Disk Performance

I recently learned a new Windows command that is not the most intuitive, but it is nice. I know that when I get a server and am told that all my volumes are ready to go for SQL Server, I want to make sure that I have a 64K cluster (allocation unit) size. This command makes it easy to verify.

 

fsutil fsinfo ntfsinfo <path>

 

That is it. Really simple but not the most intuitive. Here are the results.

 

[Screenshot: fsutil fsinfo ntfsinfo example output showing only 4K bytes per cluster on my C: drive]


June 29, 2012  10:05 PM

Weight Loss Contest Complete Part 2



Posted by: Colin Smith
Weight Loss, Work Life, Work out

So all the results from my team are in, and we did very well. I am pretty sure that we will win the competition, but time will tell. My team had 8 guys that I work with, including me. At the start, our combined weight was literally over a ton. Crazy, right? The total starting weight for the team was 2,005.6 lbs. As a team we lost 165.7 lbs, which equates to 8.3%. Between the 8 of us, we lost the equivalent of the body weight of one teammate's wife. We had one guy who was a slacker and brought our numbers down. This person was assigned to our team and actually works in another department. Over the 8 weeks he only lost 2.1 lbs, which dragged our group number down, but hey, he still lost, and that is what matters most.

I challenge you to do this at your office. I am sure that my team is fairly typical of most IT departments. Good luck and let me know how you do.

