SQL Server with Mr. Denny


March 14, 2011  2:00 PM

Free online classes “Microsoft Virtualization for VMware Professionals”



Posted by: Denny Cherry
Hyper-V, SQL Server, Training, VMware, Webcast

Microsoft is putting on a great free three-day online class for VMware professionals who need to learn more about Hyper-V.

Day 1 will focus on “Platform” (Hyper-V, virtualization architecture, high availability & clustering)

Day 2 will focus on “Management” (System Center Suite, SCVMM 2012 Beta, Opalis, Private Cloud solutions)

Day 3 will focus on “VDI” (VDI Infrastructure/architecture, v-Alliance, application delivery via VDI)

The class is being taught by two top-notch presenters: Microsoft Technical Evangelist Symon Perriman (a good friend of mine, who knows his stuff) and leading Hyper-V, VMware, and Xen infrastructure consultant Corey Hynes.

Each of the days is a separate event, so you need to sign up for each one separately.  I’ve made each of the days a separate link to make your life easier; just click through each link and register.  The only downside to this amazing training opportunity is that it is happening during the Dev Connections conference, so if you are attending Dev Connections in April 2011 you won’t be able to take advantage of this amazing FREE (did I mention FREE) training event.  As I’ll be at Dev Connections I won’t be able to make it, which is something I’m really bummed about.

Denny

March 10, 2011  2:05 PM

SQL Excursions launched yesterday.



Posted by: Denny Cherry
In Person Events, SQL Server

In case you missed all the talk about SQL Excursions yesterday, I’ll recap here since a lot of people read blogs but aren’t on Twitter.

Yesterday, with great fanfare, I announced the launch of SQL Excursions.  You can read all about it in the announcement blog post.  The important parts are:

SQL Excursions is designed to provide fun, educational SQL Server training events.  The excursions will be held in beautiful locations across the US (and eventually worldwide). Each excursion will also include social events for session attendees as well as spouses, partners, guests, etc., so that they can come along without being bored by the technical sessions.

Each excursion will have top-notch speakers, typically Microsoft MVPs and/or Microsoft Certified Master recipients.  As each excursion is announced, the speaker and session outline will be posted as well.  What will make this training different from other kinds of training is that while a basic outline will be posted with what the speakers are planning on speaking about, the exact content will be voted on by you, the session attendees.  This will ensure that you receive the sessions that will be of the most use to you in your day-to-day work life.

If you are on Facebook I’ve got a page there, and if you are on Twitter there’s the official SQL Excursions account (@SQLExcursions).  Once the first excursion is ready to be announced (there’s a bunch of legal stuff which I need to get out of the way), it’ll be announced on Facebook and Twitter as well as on the SQL Excursions web page.  There’s also a newsletter which you can sign up for on the home page of the SQL Excursions site so that new information will be sent directly to your inbox.

Denny


March 10, 2011  2:00 PM

Thinking about attending the SSWUG vConference? Sign up now and save $30.00



Posted by: Denny Cherry
In Person Events, SQL Server, SSWUG

If you were planning on attending the SSWUG virtual conference “DB Tech Con,” which is happening April 20-22, 2011, and you’d like to save $30.00 off the cost of the conference, have I got a deal for you.  When you sign up using the discount code SP11DBTechDC you’ll be given a $30 discount off of the current price.

You can sign up on the normal registration page and enter the code SP11DBTechDC into the VIP Code box then press the “Update Registration” button.

Denny


March 7, 2011  2:00 PM

Join me and Marathon Technologies as I talk about Consolidation and Virtualization



Posted by: Denny Cherry
Consolidation, SQL Server, Virtualization, Webcast

Join me and Marathon Technologies on Wednesday, March 9th at 8am Pacific (11am Eastern) as I present a webcast titled “Controlling SQL Server Sprawl: The Consolidation Conundrum and Availability Imperative”.  During this session I’ll be talking about some of the benefits and risks of consolidating SQL Server databases and instances.

Denny


March 3, 2011  2:00 PM

If you have a SAN RAID 5 LUN for your data and a SAN RAID 10 LUN for your logs, and say a local RAID 1 array for the C drive in the server itself, where do you usually install the SQL Server binaries to?



Posted by: Denny Cherry
SQL Server, Storage

In this sort of configuration, which is pretty normal actually, I would put the SQL Server binaries on the C drive.

Denny


February 28, 2011  2:00 PM

I’m having a sale on SQL Saturday slide decks, two for the price of one.



Posted by: Denny Cherry
SQL Saturday, SQL Server

The last two weekends I’ve presented at two different SQL Saturday events, so as I’m pretty lazy I’m posting the decks for both weekends in a single blog post.

SQL Saturday 47 – Phoenix

SQL Saturday 65 – Vancouver, BC

Now, the slide decks for my encryption session are the same, just with different templates, but I wanted to upload what people actually saw.

Denny


February 24, 2011  2:00 PM

ANSI settings can make all the difference in the world.



Posted by: Denny Cherry
Session State, SQL Server

So a little while back we noticed that we had some high CPU load coming from a single stored procedure in the Session State database.  This single procedure (TempUpdateStateItemLong) was taking up 80% of the CPU time that the database was using (we used Spotlight for SQL Server Enterprise from Quest Software).  But in another session state database that same procedure was down in the single digits.  So something must have been different between them.

I opened them both up in SSMS and the code for the procedures was identical (as you would expect), but there was something different.  The procedure that had the really high CPU % times was compiled with the SET QUOTED_IDENTIFIER ON setting, while the procedure that had the really low CPU % times was compiled with the SET QUOTED_IDENTIFIER OFF setting.

I have no idea why there was a difference, but I changed the one which was ON to OFF and pushed the procedure into the database.  As soon as I did the CPU % for that procedure dropped down into the single digit range where it should have been.
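If you want to check this on your own systems, the compile-time SET options for each module are exposed in the sys.sql_modules catalog view, so there’s no need to eyeball the scripts in SSMS. A minimal sketch (ASPState is an assumed database name here; substitute your own session state database):

```sql
-- Check which SET options each procedure was compiled with.
-- ASPState is an assumed name; use your session state database.
USE ASPState;
GO
SELECT o.name,
       m.uses_quoted_identifier,   -- 1 = compiled with QUOTED_IDENTIFIER ON
       m.uses_ansi_nulls           -- 1 = compiled with ANSI_NULLS ON
FROM sys.sql_modules AS m
JOIN sys.objects AS o
    ON o.object_id = m.object_id
WHERE o.name = N'TempUpdateStateItemLong';
```

Running this against both databases would have shown the mismatch immediately.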

Let this be a lesson: those settings definitely matter.  And don’t trust that they are correct, even in Microsoft-provided code like Session State.

Denny


February 21, 2011  2:00 PM

If you see your replication log reader slow down for no reason, here’s some stuff to look at.



Posted by: Denny Cherry
Replication, SQL Server, SQL Server 2008

So a while back we were seeing some very strange behavior with our SQL replication.  Every once in a while, for no apparent reason, the log reader would just slow down pulling records from a single publisher database.  Our replication was set up with a single publisher and a single distributor, with over a dozen publications all being sent to a couple of different subscribers.

At random times we would see the latency for all the publications for a single database start to climb, eventually being a few hours behind for no apparent reason.  Looking in the normal places didn’t lead me to much.  I looked at some execution plans, and saw a couple of performance issues there (with the Microsoft code) so I threw a couple of new indexes onto the MSlogreader_agents and MSsubscriptions tables (see below) and I also made a couple of tweaks to the sp_MSset_syncstate procedure to fix some of the pathetic code which I found within the procedure (you’ll also find this below).

This helped a little, but it didn’t solve the problem.  What did was querying the sys.dm_os_waiting_tasks dynamic management view.  This showed a large number of processes with a wait_type of TRACEWRITE, and these were waiting long enough that blocking was actually starting to pop up (very sporadically, making it very hard to see).  A quick look at sys.traces told me that there were three traces running against the server. I knew that I didn’t have one running, so I took the session_id values shown in sys.traces and looked in sys.dm_exec_sessions for those session IDs to find out who needed to be kicked in the junk.  Turns out that the traces were being run by Quest Software’s Spotlight for SQL Server Enterprise’s Diagnostic Server (the program_name column read “Quest Diagnostic Server (Trace)”).
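For reference, here’s a sketch of the kind of query I’m describing. Note that the reader_spid column in sys.traces is only populated for rowset traces (ones being consumed live by a client such as Profiler or a monitoring tool), which is exactly what these were:

```sql
-- Find active traces and the session consuming each rowset trace.
SELECT t.id,
       t.path,                 -- NULL for rowset (client-consumed) traces
       s.session_id,
       s.login_name,
       s.host_name,
       s.program_name          -- e.g. "Quest Diagnostic Server (Trace)"
FROM sys.traces AS t
LEFT JOIN sys.dm_exec_sessions AS s
    ON s.session_id = t.reader_spid
WHERE t.status = 1;            -- running traces only
```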

So I logged into the diagnostic server via RDP and opened Spotlight.  I then edited the properties for the server which is our distributor, opened the SQL Analysis window, and disabled SQL Analysis for this server.  Pretty much as soon as I clicked OK through the windows the TRACEWRITE waits went away, and the latency went from 2 hours down to 0.

This just goes to show just how careful you have to be when using SQL Profiler (or any sort of tracing) against your database server.

Denny

P.S. If you decide to make these changes to your distributor keep in mind that they may cause anything or everything to break, including patches, etc. that you try and install against the SQL Server engine.  These changes were made for a distributor running SQL Server 2008 R1 build 10.0.1600, use against another build at your own risk. That said, here’s the code.

USE distribution
GO
CREATE INDEX IX_sp_MSget_new_errorid ON dbo.MSrepl_errors
(id)
WITH (FILLFACTOR=100)
GO
CREATE INDEX IX_sp_MSadd_logreader_history
ON dbo.MSlogreader_agents
(id)
INCLUDE (name, publication)
GO
CREATE NONCLUSTERED INDEX IX_sp_MSset_syncstate
ON dbo.MSsubscriptions
(publisher_id, publisher_db, article_id, subscription_seqno)
INCLUDE (publication_id)
WITH (FILLFACTOR=80)
GO
CREATE NONCLUSTERED INDEX IX_sp_MSset_syncstate2
ON dbo.MSsubscriptions
(publisher_id, publication_id, sync_type, status, ss_cplt_seqno, publisher_db)
INCLUDE (article_id, agent_id)
WITH (FILLFACTOR=90, DROP_EXISTING=ON)
GO

ALTER procedure sp_MSset_syncstate
@publisher_id smallint, 
@publisher_db sysname, 
@article_id int, 
@sync_state int,  
@xact_seqno varbinary(16)
as
set nocount on 
declare @publication_id int

select top 1 @publication_id = s.publication_id 
from MSsubscriptions s
where 
s.publisher_id = @publisher_id and
s.publisher_db = @publisher_db and
s.article_id = @article_id     and
s.subscription_seqno < @xact_seqno


if @publication_id is not null
begin
	if( @sync_state = 1 )
	begin
		if not exists( select * from MSsync_states 
		               where publisher_id = @publisher_id and
					   publisher_db = @publisher_db and
					   publication_id = @publication_id )
		begin
			insert into MSsync_states( publisher_id, publisher_db, publication_id )
			values( @publisher_id, @publisher_db, @publication_id )
		end
	end
	else if @sync_state = 0 
	begin
		
		delete MSsync_states 
		where 
		publisher_id = @publisher_id and
		publisher_db = @publisher_db and
		publication_id = @publication_id 

		-- activate the subscription(s) so the distribution agent can start processing
		declare @automatic int
		declare @active int	
		declare @initiated int

		select @automatic = 1
		select @active = 2
		select @initiated = 3

		-- set status to active, ss_cplt_seqno = commit LSN of xact containing
		-- syncdone token.  
		--
		-- VERY IMPORTANT:  We can only do this because we know that the publisher
		-- tables are locked in the same transaction that writes the SYNCDONE token.
		-- If the tables were NOT locked, we could get into a situation where data
		-- in the table was changed and committed between the time the SYNCDONE token was
		-- written and the time the SYNCDONE xact was committed.  This would cause the
		-- logreader to replicate the xact with no compensation records, but the advance
		-- of the ss_cplt_seqno would cause the dist to skip that command since only commands
		-- with the snapshot bit set will be processed if they are <= ss_cplt_seqno.
		--
		update MSsubscriptions
		set status = @active,
			subscription_time = getdate(),
			ss_cplt_seqno = @xact_seqno		
		where
			publisher_id = @publisher_id and
			publisher_db = @publisher_db and
			publication_id = @publication_id and
			sync_type = @automatic and
			status = @initiated and
			ss_cplt_seqno <= @xact_seqno	
		OPTION (OPTIMIZE FOR (@automatic=1, @initiated=3, @publisher_id UNKNOWN, @publisher_db UNKNOWN, @xact_seqno UNKNOWN))
	end
end
GO


February 17, 2011  2:00 PM

For the love of god people, quit screwing around with the base permissions within SQL Server.



Posted by: Denny Cherry
Database security, SQL Server, SQL Server 2000, SQL Server 2005

I know that security people like to remove permissions from everything before certifying that a server is ready to go into production.  And like 10+ years ago that was something that you might have wanted to do (I’m just talking about SQL Server here).  However in today’s world of SQL Server 2005 and newer that isn’t needed.  These newer versions are designed to take security much more seriously than before.  The rights that are granted to public in the master and msdb databases should be left the hell alone.

If you are going to go around revoking database permissions when you don’t understand what they do, don’t come to me complaining that your SQL Server isn’t working correctly.  Guess what: those permissions were there for a reason, and should be left alone.  If you have some outdated security mandate that says that all database permissions must be revoked from public before the server can be used, then you had damn well better understand what that means.  And you should probably update your stupid security policy so that it reflects the changes that have been made in the product over the last 10 or so years.  Even under SQL Server 2000 I didn’t ever recommend that people remove all the rights from public, ever.  Not if you wanted the SQL Server to work as expected.

If you have decided to go and remove all the permissions, then you will probably want to install a new SQL Server, find all the permissions there, and grant them back.  That, or restore your master database to a state from before you screwed it up, which is the same thing that I recommended the person in the forum thread above do.  If you intentionally break your SQL Server, don’t expect much sympathy from me.
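If you do go the clean-install route, this is roughly the query I’d run against master on the fresh install to see what public is granted out of the box (a sketch; database-level grants will show a NULL object_name since they aren’t tied to a specific object):

```sql
-- List everything granted to public in master on a clean install.
SELECT pe.class_desc,
       OBJECT_NAME(pe.major_id) AS object_name,  -- NULL for database-level grants
       pe.permission_name,
       pe.state_desc
FROM master.sys.database_permissions AS pe
JOIN master.sys.database_principals AS pr
    ON pr.principal_id = pe.grantee_principal_id
WHERE pr.name = N'public'
ORDER BY pe.class_desc, object_name;
```

Diff that output against the broken server and you’ll have your list of GRANT statements to run.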

Denny


February 14, 2011  2:00 PM

My MCM Lab experience.



Posted by: Denny Cherry
Certifications, MCM, SQL Server, SQL Server 2008

So just in case you missed it, I took and passed the Microsoft MCM lab last Thursday.  Since I’m the first person to go through the test in the new format I wanted to put together some comments (as best I can thanks to NDAs).

The MCM lab gives you 6 hours to complete it, and when you take it, you’ll need most if not all of that time.

I basically finished in a little over 4 hours, then went back and reviewed a couple of the scenarios which took me until about the 5 hour mark.  The exam is made up of several smaller scenarios which are independent and each of which will test your knowledge of one or more parts of the SQL Server engine.

As I mentioned in my prior blog post, you will need to know all aspects of the SQL Server engine very well.  After taking the lab I can say that I feel this even more now.  With the time limitation you need to know the product very well, as well as where to find useful information in Books Online.  The lab is a closed-book exam, and the only reference that you have access to is Books Online.

When you do decide to take the exam, don’t expect to get your results back as quickly as I did.  Currently Microsoft only has one lab exam environment to run the tests on, and that environment had to be reset for the next person after I took it, so my exam had to be scored very quickly.  Once the lab is released for general use, scoring will take about 30 business days, as the exam will be scored by hand to account for creative solutions to the scenarios as well as creative ways that people may try to cheat on the exam.

I didn’t realize just how worried about taking the exam I was until I was done, and logged off of the environment.  I could actually feel the stress just leave once I finished the exam.  And this was before I knew if I had passed.  I knew that I would get my result quickly because of the next person taking it so quickly, and I knew that if I passed I would be thrilled, but I didn’t realize just how happy I would be to have passed.

When you take the exam you can’t take any water or food into the exam room, but you can leave some in a locker outside it.  You’ll want to bring food and drink with you; getting hungry during the process sure won’t help you pass the exam, and it’ll just distract you.  Just keep in mind that snack breaks do count against your 6-hour window.

Personally I didn’t do a whole lot of studying (I talked about my professional experience in my last post). I watched most of the MCM videos that SQLskills put out.  I skipped the ones on topics that I can speak on in my sleep, like storage, Service Broker, etc., just because I felt that I knew that material the best.  I focused on indexing internals and the security pieces, as those are the most complex parts of the database engine to me.

Hopefully this helps you on your trip through the MCM process.

Denny
