The Multifunctioning DBA


April 28, 2009  10:15 PM

SQL Version Script



Posted by: Colin Smith
Database, Database Administration, MS SQL, SQL Server

I needed a quick audit of some of the SQL Servers in my environment. We wanted to know what version and what service pack each instance was running. I threw this together and it is pretty simple. It only checks for SQL Server 2000 SP4 and SQL Server 2005 RTM through SP3.

New-PSDrive i -PSProvider filesystem -Root "Path to where input file is"

Clear-Content "Path to results file"

$OUTFILE = "path to outfile"
$servers = Import-Csv -Path "i:\sqltab.txt"
$sqlquery1 = "select @@version"
$sqlcatalog = "master"

foreach ($server in $servers)
{
    $machine = $server.server
    $instance = $server.instance
    $torp = $server.torp
    $ping = $server.ping

    if ($instance -eq "NULL")
    {
        $SqlServer = $server.server
        $folder = $server.server
    }
    else
    {
        $SqlServer = "$machine\$instance"
        $folder = "$machine-$instance"
    }

    $sqlcommandpath = "c:\Program Files\Microsoft SQL Server\90\Tools\binn\"
    cd $sqlcommandpath

    $version = ./sqlcmd.exe -Q $sqlquery1 -S $SqlServer -d $sqlcatalog -Y 255

    if ($version -like "*1399*") {$sqlversion = "SQL Server 2005 No Service Pack"}
    if ($version -like "*2047*") {$sqlversion = "SQL Server 2005 SP1"}
    if ($version -like "*3042*") {$sqlversion = "SQL Server 2005 SP2"}
    if ($version -like "*4035*") {$sqlversion = "SQL Server 2005 SP3"}
    if ($version -like "*2050*") {$sqlversion = "SQL Server 2000 SP4"}

    echo "`n######################################################################" >> $OUTFILE
    echo "         Server = $SqlServer" >> $OUTFILE
    echo "            $sqlversion" >> $OUTFILE
    echo "######################################################################`n" >> $OUTFILE
}
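As an aside, if you would rather not pattern-match on build numbers in the @@version text, SERVERPROPERTY (available in SQL Server 2000 and later) reports the service-pack level directly; something like this should work in place of the @@version query:

```sql
-- ProductVersion returns e.g. 9.00.3042.00; ProductLevel returns e.g. RTM or SP2
SELECT SERVERPROPERTY('ProductVersion') AS Version,
       SERVERPROPERTY('ProductLevel') AS ServicePack
```

That removes the need for a new set of -like checks every time a service pack ships.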

If you have read my other posts then you know that sqltab is a CSV file with all the information about my servers. If you have any questions about this or anything else, head over to http://sysadminsmith.com and click on 'Submit a Question'.


April 24, 2009  6:53 PM

MS Employee Comments on Blog



Posted by: Colin Smith
Database Administration, DBA, Installation, Microsoft, SQL Server

Yesterday I posted a blog about installing the SQL Server 2008 SSIS DTS Package Migration Wizard. It can be found at http://itknowledgeexchange.techtarget.com/dba/dts-migration-to-sql-server-2008-ssis/. I received a comment about the post from Andrew Fryer, who is a Microsoft employee. His blog, which has some pretty cool stuff you might want to check out, can be found at http://blogs.technet.com/andrew/default.aspx.

First off, I think it is very cool that an MS employee read my post and actually commented on it. Here is Andrew's comment:

The backward compatibility stuff is part of the SQL Server feature pack,
http://www.microsoft.com/downloads/details.aspx?FamilyId=C6C3E9EF-BA29-4A43-8D69-A2BED18FE73C&displaylang=en

and we don't put all this in by default to keep the size of the install down and reduce its
complexity as it's not always needed. My other comment would be that DTSXChange is the killer
tool for migrating DTS to SSIS.

Andrew

I really appreciate the comment, and I responded to him saying that I appreciate that MS
wants to keep the install as lean as possible, but I would like to have the option in the installer
to install this feature instead of having to drill down to the MSI file on the installation media.
Andrew replied again, saying that I should let the product team know via Connect (http://connect.microsoft.com). I ask that all of you, if you agree with me, make this known to the product team.

Thanks again Andrew.

If you have any questions, please let me know by heading to http://sysadminsmith.com and clicking the 'Submit a Question' link.


April 24, 2009  4:15 PM

SQL Ping Servers Improved



Posted by: Colin Smith
Database Administration, DBA, Monitoring, Powershell, SQL Server, Windows Administration

Not long ago I did a post about pinging the interface of a server since you may encounter a problem connecting to WMI even if a server is up. The article is at http://itknowledgeexchange.techtarget.com/dba/powershell-ping/.

So, like I described in that post, I did not want our DBA team to be paged about a SQL Server being down when it was really an issue with the host server. In that case I would like the Windows administration team to be notified so that they can take care of the issue. I also wanted to check on some other things about the server: the physical interface, that the agent service is running, and that connections to the database can be made. I have finished the script to do this, and I will go over it with you one function at a time. First we will look at the Main function. This is where everything gets set up.

#################################
####### Start Script ############
#################################

$ErrorActionPreference = "SilentlyContinue"
$time = Get-Date -DisplayHint Time
$date = Get-Date
New-PSDrive i -PSProvider filesystem -Root \\SomeServer\SomeShare

if (Test-Path "i:\OUT\ping_with_service_check.txt")
{
    Clear-Content -Path "i:\OUT\ping_with_service_check.txt"
}

## Read in list of SQL Servers and set up all needed variables for the script
$NTusers = "phx.it.systems.nt@pni.com"
$servers = Import-Csv -Path "i:\inputs\sqltab.txt"

foreach ($server in $servers)
{
    $machine = $server.server
    $instance = $server.instance
    $torp = $server.torp
    $ping = $server.ping

    if ($instance -eq "NULL")
    {
        $SqlServer = $server.server
        $folder = $server.server
        $sqlagent = "sqlserveragent"
        $sqlserverservice = "mssqlserver"
    }
    else
    {
        $SqlServer = "$machine\$instance"
        $folder = "$machine-$instance"
        $sqlagent = "sqlagent`$$instance"
        $sqlserverservice = "mssql`$$instance"
    }

    echo "`n############################"
    echo "`n############################" >> "i:\OUT\ping_with_service_check.txt"
    $machine >> "i:\OUT\ping_with_service_check.txt"
    $machine

    if ($ping -eq "Y")
    {
        if (Test-Path "i:\$folder\wmi_Failure.txt")
        {
            $wmierror = 1
            $wminoemail = 1
            $wmifile = Get-ChildItem "i:\$folder" | where {$_.Name -eq "wmi_failure.txt"}
            $filetime = $wmifile.LastWriteTime

            if ($filetime -le $date.AddDays(-1))
            {
                del "i:\$folder\wmi_Failure.txt"
            }
        }

        if (!(Test-Path "i:\$folder\wmi_Failure.txt"))
        {
            $wmierror = 0
            $wminoemail = 0
        }

        if (Test-Path "i:\$folder\ping_Failure.txt")
        {
            $pingfail = 1
            $pingnoemail = 1
            $pingfile = Get-ChildItem "i:\$folder" | where {$_.Name -eq "ping_failure.txt"}
            $pingfiletime = $pingfile.LastWriteTime
            $pingfiletime

            if ($pingfiletime -le $date.AddHours(-6))
            {
                echo "File was created more than 6 hours ago. $pingfiletime"
                del "i:\$folder\ping_failure.txt"
            }
        }

        if (!(Test-Path "i:\$folder\ping_Failure.txt"))
        {
            $pingfail = 0
            $pingnoemail = 0
        }

        if (Test-Path "i:\$folder\agent_Failure.txt")
        {
            $agentfail = 1
        }

        if (!(Test-Path "i:\$folder\agent_Failure.txt"))
        {
            $agentfail = 0
        }

        Ping_Interface
    }

    if ($ping -eq "N")
    {
        echo "$sqlserver - is not being pinged at this time. Check SQLTAB" >> "i:\OUT\ping_with_service_check.txt"
        echo "############################" >> "i:\OUT\ping_with_service_check.txt"
    }
}

OK, so you can see that this is where the script starts, even though, when you see the entire script, this block actually comes at the end. That is because in PowerShell you must define all of your functions before they are called.

As you can see, I start out by setting some things up and clearing out my log file. You could just keep appending to the file, but I chose not to. You can also see that I set up a variable called $servers. This is populated from a .csv file that has all the information about my SQL Servers that I am going to need for this script and for my other monitoring script. The heading of that file is as follows.

Monitor,Server,Instance,TorP,ErrorLog,Ping,OS2000

This way I know whether I want to monitor the server (I may choose not to if it is out of commission for a while), the hostname, the instance name, whether it is Test or Production, whether I want to run this Ping Servers script against it, and whether it is running Windows Server 2000, since I cannot run PowerShell locally on those servers.
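For illustration, a couple of rows of sqltab.txt might look like this (the server and instance names here are made up; only the column layout matches my file):

```text
Monitor,Server,Instance,TorP,ErrorLog,Ping,OS2000
Y,SQLBOX01,NULL,P,Y,Y,N
Y,SQLBOX02,INST1,T,Y,Y,Y
```

Import-Csv turns each of these rows into an object, which is why the script can refer to properties like $server.server and $server.instance.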

Now that I have that variable populated I can start my loop. I begin by setting up each individual variable that I am going to need later on. Then I look for some touch files called wmi_failure.txt and ping_failure.txt. These let me know whether a WMI call or a physical interface ping failed on the last run of the script. For both of these I do not want to send out notification every 10 minutes; for WMI failures I notify just once a day, and for interface failures once every 6 hours.

Now you may be asking why only every 6 hours if a server's interface is down, since that means the server is basically useless. I would agree with you, but remember that I wanted to notify our Windows team about this issue, and this was their call: no pages, and only email them once every 6 hours about the issue. I was shocked at this response.

Anyway, I look at the file if it is there and then check how old it is against the current time of the script run. If it is past the allotted amount of time, I remove the file and set the failure variables to 0 so the script will know to send out notifications about the failures should they still occur. Once all that is set up, I can call my first function, provided this script should be run against the server at all. That is noted by $ping being equal to Y or N. Next time I will show you my Ping_Interface function.

If you have any questions or comments about this script please let me know at http://sysadminsmith.com and click the ‘Submit a Question’ link to the right.


April 23, 2009  9:31 PM

DTS Migration to SQL Server 2008 SSIS



Posted by: Colin Smith
Database Administration, DBA, Installation, SQL Server

Today I had a developer ask about moving old DTS packages into SQL Server 2008 SSIS. This is our first real test SQL Server 2008 instance and we had never done this. He was attempting to use the Package Migration Wizard to move the package into SSIS. We installed the BI Visual Studio Components and also the Backward Compatibility Components during the SQL Server 2008 installation. We verified all this and asked if he could try again. He did and sent us the same error. I found the following article that explains how to resolve the issue.

http://msdn.microsoft.com/en-us/library/ms143755.aspx

Apparently, in MS's infinite wisdom, they decided that you will also need to install the following.

SQLServer2005_BC*.msi

This is on the install media; you just need to dig down and find it. Once we installed it, everything was fine and the developer was able to move the DTS package over using the wizard. I am not sure why they make us do an additional install that is not offered during the install of the product.
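If you would rather script it than click through the MSI, it can also be installed with msiexec once you have located it on the media (the path and exact file name here are just examples; the name varies by architecture):

```text
msiexec /i "D:\Setup\SQLServer2005_BC.msi" /qb
```

The /qb switch runs a basic, mostly unattended install, which is handy if you are standing up several 2008 instances.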


April 23, 2009  2:25 AM

Oracle Upgrade Complete



Posted by: Colin Smith
Database Administration, DBA, Oracle, Upgrade

I was able to finish up the Oracle upgrade that I wanted to do today. First I removed the previous installation of Oracle 10 on the server. I then installed the software and all of the patches that I needed, and changed my variables to reflect the new Oracle home directory that I wanted to use. After that I launched the DBUA and walked through each step. I did not move the files during the upgrade, so it was an in-place upgrade. I did have a good backup that I could fall back to if needed; this is very important. After a couple of hours the upgrade had finished. I still had to copy over the tns folder and change the listener.ora file to reflect the new Oracle home. I also needed to change some parameters in the spfile.ora. The application that uses this DB provides a tool to edit the spfile directly, so I edited the file and started the DB with no issue. This took me just under 4 hours, which was my target time. I think that I am ready to upgrade our production servers, and I feel confident that I will not hit any issues that I cannot overcome.


April 22, 2009  3:34 PM

Oracle Upgrade Testing Once Again



Posted by: Colin Smith
Database Administration, DBA, Oracle, Upgrade

Today I will again embark on the journey that is upgrading Oracle. This will be the third and final time that I do this on our test system in preparation for upgrading our production cluster, which also has a Data Guard backup solution. I expect this process to take about 4 hours, including the removal of the current Oracle 10 software that I have installed on the server. I am running through this from the very beginning and expect to hit no road bumps this time around. If all goes well, I will start to plan how I am going to handle the production environment and how much downtime I am going to require. I will require downtime since we are doing an in-place upgrade. It would be nice to have enough disk to install 10 on one cluster node and upgrade the database files while moving them to a new disk location, and then fail over the cluster once that is complete. I, however, will have to upgrade the files in place and use only the DR server as a backup. That should not be a problem, though. I think I will upgrade one node, bring the DB up, and then install Oracle on the second node. Once that is complete, the cluster should be done. Then in a day I will upgrade the Data Guard server as well and re-enable the replication.


April 21, 2009  9:06 PM

SQL Server Education



Posted by: Colin Smith
Database Administration, DBA, Education, SQL, SQL Server

In an earlier post I said that I was about to start reading some books from the advanced Sybase training course. I have changed course since then and decided to pursue some SQL Server education instead. I have decided that I would like to get my MCITP: Database Administrator certification. I never really felt that certifications were very meaningful while I was doing workstation and Windows administration, but I have changed my view of them when it comes to the database world. I have decided to tackle SQL Server first for a couple of reasons.

  1. It is a MS Product and has a nice GUI. I come from a Windows Administration background and feel that this is the logical course.
  2. My company has roughly 50 production SQL servers that we really do not do much with. I have written a Powershell script to monitor the SQL Servers for errors but we need to do more.

I have already started my journey and have finished reading my first MS SQL Server book. I read Beginning SQL Server 2005 Administration by Wrox publishing. I enjoyed the book and I ordered the next in the series called Professional SQL Server 2005 Administration. I like the way the books are laid out and have hands on examples that you can try if you have installed the AdventureWorks Database.

I have also ordered the study material for the three tests that I will have to take in order to get the certification. I am hoping to get these books in the mail in the next day or two. I would like to try to take one test a month and have my certification in about three months. I am looking forward to doing more in SQL Server and have learned a lot just from the first book. I will continue to keep you updated on this and hopefully you can learn along with me.

If you have any questions, please head over to http://sysadminsmith.com and click the 'Submit a Question' link to the right.


April 21, 2009  7:04 PM

The Security Battle



Posted by: Colin Smith
Database Administration, DBA, MS SQL, SQL Server

I just had a battle with some application support team members about securing our data. We have a system that connects to a SQL Server backend; it has been in place for over a year now and is about to get a major application update. When this system was first implemented, management let security go by the wayside and did not really care what we did as long as users were able to do what they needed to do, and fast. Because of this management mandate the system was left essentially wide open to the world. This has been an annoyance to us DBAs, as we do not want to be responsible for any issues that may arise from the concerns we voiced when this was put in place.

Since a big upgrade is about to occur, I again brought up all of our concerns, and everyone again agreed that we need to make these changes but tried to give us the runaround. They say things like: it is changing too much at once; we will not know whether to look at the application or at the database security changes if we have an issue. To be honest, all I hear is "blah blah blah". I was involved in the initial setup of all this, but at that time I was not a DBA and had little understanding of the issues that can arise from the lack of security that was in place. Now that I am a DBA and have a much better understanding, I am unwilling to move forward without making the changes. I talked to the application support team manager, expressed my concern, and told him that I do not think this is something we can do without. The worry about what to look at when a problem arises should not be an issue; security should be the number one priority when setting up a system that contains sensitive data.

I finally got my point across and I have them on board for now. I have learned that no matter who says what, I am responsible for this data being safe, sound, and ready for users to use. I will not take the risk of losing this data for any preventable reason. If you are new to the DBA world, just know that your data is your data and you need to do whatever is possible to keep it safe and confidential. Do not give up on that battle. You can win, and if you do not, then I would update your resume and get ready for when your data gets stolen or corrupted because you did not stand your ground on the issue of security.


April 7, 2009  5:14 PM

Powershell Ping



Posted by: Colin Smith
Database Administration, DBA, Monitoring, Powershell, Windows Administration

I have a script that goes out and pings all of my SQL Servers. This script, right now, simply makes a connection to the SQL Server instance. If I am able to connect I call it good and if not I send out notifications saying that the SQL Server is down. I found that more often than not I was getting these alerts because of some issue with the OS. This could be that the network interface was down, the server was powered off, or a number of other issues that could impact connecting to the SQL Server instance. My team usually ended up looking into it and having to call one of our Windows Systems Administrators in order to get the issue corrected.

The DBA team did not really like that we got notified of OS issues before the Windows team. The Windows team was happy that we were letting them know of issues, but they did not really like that the DBA team saw the problem before they did either. Because of this I decided to redo the SQL Ping Servers script and incorporate some more steps in it that will better determine what the cause of the issue is and notify the appropriate team. The first thing I am going to do is ping the server's interface and make sure that the server is up and responsive. If it is not, then I as a DBA will not really be able to do anything anyway, so if the ping fails I will just notify the Windows team directly. To do this, PowerShell needs to connect to WMI and look at win32_pingstatus like so.

$pingresult = Get-WmiObject win32_pingstatus -Filter "address='$machine'"

This works well, but I ran into another issue. What if WMI on a server is not working for some reason? I have had this come up, and then it can look like I cannot ping the physical interface on a server when in fact I can. I decided I needed a more reliable way to test just the ping. Why not use good old ping? So I did the following and it works great.

$pingresult = ping pni-vmdbasql -n 1
$pingresult

if ($pingresult -like "*(0% loss*")
{
    echo "Ping Success"
}

if (($pingresult -like "*100% loss*") -or ($pingresult -like "*could not find host*"))
{
    echo "Ping Failed notify Systems Team"
}

This way I can test the actual ping of the server. If the hostname cannot be resolved or if my loss is 100% then I consider this a failed test and I will notify the Windows Team. If I have 0% loss then I can move to my next function and continue with my basic health check of the SQL Server.
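As a side note, if your servers have a recent enough version of PowerShell, the built-in Test-Connection cmdlet wraps the same Win32_PingStatus class and can simplify this kind of check. A minimal sketch, using the same placeholder server name as above:

```powershell
# -Count 1 sends a single echo request, like "ping -n 1"
# -Quiet makes the cmdlet return $true or $false instead of status objects
if (Test-Connection -ComputerName pni-vmdbasql -Count 1 -Quiet)
{
    echo "Ping Success"
}
else
{
    echo "Ping Failed notify Systems Team"
}
```

Note that Test-Connection still goes through WMI under the covers, so the plain ping approach above remains the safer test when WMI itself is suspect.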

I will put more about this SQL Server ping script out when I have completed it. I just thought this was a very important distinction to make you aware of.

Any questions please head over to http://www.sysadminsmith.com and click on the ‘Submit a Question’ link to the right.



March 31, 2009  4:36 PM

Perfstat Cron Jobs



Posted by: Colin Smith
Database, Database Administration, Oracle, Perfstat, Statspack

Here is the update on changing the perfstat statspack job that I spoke of earlier. The question was: how can I get statspack snapshots to run every 15 minutes during OLTP and every 2 hours during batch? My thought is a cron job that will run the SQL for you. Just make a connection to the DB using SQL*Plus and call the SQL file with the appropriate SQL to change the job. Since the user asking the question had the SQL done, here is a sample of how I would write the script to run in cron.

#!/bin/sh
sqlplus -S User/Password@$instance @/full/path/to/file/filename.sql
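As for the schedule itself, one approach is a pair of crontab entries along these lines, each pointing at a wrapper script like the one above (assuming, purely for illustration, an OLTP window of 06:00-17:59 with batch the rest of the day; the script paths are placeholders):

```text
# Every 15 minutes during the OLTP window
*/15 6-17 * * * /full/path/to/snap_oltp.sh
# Every 2 hours during the batch window
0 18-23/2,0-5/2 * * * /full/path/to/snap_batch.sh
```

Adjust the hour ranges to match your actual OLTP and batch windows.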

That is it. Pretty straightforward, really. Hope that helps. If you have any questions about this or anything else, just head over to http://sysadminsmith.com and click the 'Submit a Question' link and I will get back to you.

