SQL Server with Mr. Denny


April 21, 2010  4:12 PM

Limited SQL Server 2008 R2 Virtualization Support

Denny Cherry

As part of the SQL Server 2008 R2 launch it was announced that SQL Server 2008 R2 now supports Hyper-V Live Migration.  However, according to MSDN, only the Standard, Enterprise and Datacenter editions support Live Migration.

I’m not sure why those are the only editions which support Live Migration, but hopefully Microsoft will extend this to the Workgroup, Web and Express editions, as those editions probably get virtualized more often than Enterprise and Standard edition databases do.

Denny

April 21, 2010  3:09 PM

SQL Server 2008 R2 has been Released to Manufacturing

Denny Cherry

Microsoft’s new database platform, SQL Server 2008 R2, has gone RTM.  This means that the code has been locked and shipped to the DVD pressing company so that they can begin pressing the DVDs.

For those looking on MSDN or TechNet to download the RTM version, hold off.  The final version will be released to MSDN and TechNet on May 3rd, 2010 and to Microsoft Volume License customers on May 13th, 2010.

Some of the features you can look forward to in SQL Server 2008 R2 include:

  • StreamInsight provides a low-latency event processing system designed to take in data in near real time.  StreamInsight is perfect for processing log files or telephone call records which are constantly being fed into the platform.
  • Master Data Services allows you to consolidate business data from multiple applications into a single master repository.
  • PowerPivot is probably the most talked-about part of SQL Server 2008 R2, as it brings the power of the SQL Server Analysis Services engine to end users’ desktops so they can build dynamic data applications and slice and dice the data any way they need to, without having to stay connected to an SSAS instance.
  • SQL Server 2008 R2 now fully supports Live Migration and Virtualization under Hyper-V.
  • You can run SQL Server 2008 R2 on some of the largest hardware available, as SQL Server 2008 R2 supports 256 logical CPUs in a single server (with the Datacenter edition).

If you are wondering whether you have the RTM release of SQL Server 2008 R2, the build number for the RTM release is 10.50.1600.1.  If you have another build then you don’t have the RTM release.
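If you want to check from a query window, SERVERPROPERTY will report the build and edition you are running (this is standard T-SQL, nothing R2 specific):

SELECT SERVERPROPERTY('ProductVersion') AS BuildNumber, -- should be 10.50.1600.1 for RTM
SERVERPROPERTY('ProductLevel') AS ProductLevel, -- RTM, SPn, etc.
SERVERPROPERTY('Edition') AS Edition
GO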

As part of the release announcement, Dave Jones from the Mississippi Department of Transportation gave an overview of how MDoT, as an early adopter, uses SQL Server 2008 R2 with PowerPivot, Report Builder 3.0 and SSRS to meet internal needs on a highly available clustered installation.

We also heard from Ron Van Vanten, the Directing Officer of Business Intelligence at Premier Bank Card.  They manage over 40 TB of data under SQL Server, with a data warehouse which is over 20 TB in size, and there are over 150 instances of SQL Server hosting this data. Premier Bank Card is using the self-service features of SQL Server 2008 R2 such as SSRS, Report Builder and PowerPivot to empower its business users to make better decisions while reducing overall costs.

The third customer that we heard from was Ayad Shammout of Care Group Systems.  Ayad talked a little bit about how PowerPivot was able to help their users quickly and easily learn how to analyze data without having to wait for IT resources.

The Parallel Data Warehouse edition has not gone RTM yet.  It has gone into a second RC build and will be released later this year.

There are some major changes in what each edition supports with SQL Server 2008 R2.  Among them, Express now supports databases up to 10 GB, and the SQL Server 2008 R2 Standard edition now includes backup compression, saving customers from needing to purchase a third-party solution to get the backup compression which was introduced in SQL Server 2008.
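If you want to try out backup compression once you are on an edition that supports it, a minimal example would look something like this (the database name and backup path below are just placeholders):

-- Make compression the default for all backups on the instance
EXEC sp_configure 'backup compression default', 1
RECONFIGURE
GO
-- Or request it for a single backup
BACKUP DATABASE YourDatabase
TO DISK = N'D:\Backups\YourDatabase.bak'
WITH COMPRESSION
GO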

Be sure to check out the SQL Server 2008 R2 official launch site http://www.sqlserverlaunch.com/.

Denny


April 19, 2010  11:00 AM

Hey sysadmins, PowerPivot is coming, and it’ll take a lot of disk space.

Denny Cherry

So if you haven’t heard of PowerPivot yet, it’s going to be the next best thing to sliced bread for your business users.  While it’ll take a while for most IT shops to get the full SharePoint & PowerPivot integration set up, PowerPivot will be available to your users within the next few weeks as Office 2010 is released.

The basic gist of PowerPivot is that you can download a huge amount of data from your data warehouse, then slice and dice the data, analyzing it with the power of the SQL Server Analysis Services engine.  The downside of all this you ask?  Really, really big Excel workbooks.  And when I say big, I mean big.  The user can download 30+ Gigs of data into the Excel workbook, which Excel will compress into an Excel workbook of up to 2 Gigs. Continued »


April 15, 2010  11:00 AM

Powering up and down a storage array may not be done in the order you think

Denny Cherry

Powering up a single server is easy. Turn on the direct attached storage, then turn on the server.  Done.  When powering your storage array down or back up there is quite a bit more to it than that.

The proper procedure for powering down your array is:

  1. Power off the servers.
  2. Power down the storage processors.
  3. Once the storage processors have powered down completely you can power down the disk drive shelves.
  4. Power down the switches.

The proper procedure for powering up your array is:

  1. Power on the switches
  2. Power on the disk shelves
  3. Power on the storage processors
  4. Power on the hosts

Now the good news on the power-up is that you can do steps 1, 2 and 3 together.  The switches will take 1-2 minutes to boot up and become operational.  The disk shelves will come online within a few seconds.  The storage processors will take 8-10 minutes to come online (I’m talking about an EMC CX array here; different arrays will have different power-up times).

On the power-down process there aren’t a whole lot of shortcuts you can take, which is usually a problem, because if you have to power down the array there is usually a major power problem going on which involves no power coming into the array.

Hopefully you never need to know this information, but if you do, it’s useful information to have.

Denny


April 12, 2010  9:00 AM

Man, I really hope this is my last post about plagiarism.

Denny Cherry

So there has been a lot of talk recently about plagiarism in our little corner of the Internet.   If you haven’t read up on it you can read up here, here, and here.  Sadly this wasn’t the first time that plagiarism has reared its head recently.  The prior incidents can be found here, here, here and here (among others).

While I was up at the SQL 2008 R2 Tweet Up, my friends Sean and Jen from MidnightDBA (@MidnightDBA) and Jorge Segarra (@SQLChicken) and I got together to talk a bit about plagiarism in general and this recent incident.  The talk was recorded and they published it over the weekend as part of the DBAs @ Midnight series of videos.  The video was broken into four sections when they uploaded it (Part 1, Part 2, Part 3, Part 4).

Hopefully you’ll have as much fun watching the video as we had making it.  During the videos we are pretty critical of some people within the community (if you’ve read the blog posts I link to above you can probably guess who).

Oh, and some of the language isn’t kid/boss safe.  Consider yourself warned.

Denny


April 5, 2010  12:00 PM

Events, events, events. When the hell am I supposed to get work done?

Denny Cherry

This week I’ve got a few speaking events to let everyone know about.

On Tuesday, April 6th, at 6pm I’ll be speaking to the LA C# User Group.  The meeting is held at the Manhattan Beach Library at 1320 Highland Ave in Manhattan Beach, CA 90266.  The group asks for $7 (or $5 if you RSVP by 3pm).

On Wednesday, April 7th, at 6pm I’ll be speaking to the SoCalDotNet User Group.  The meeting is held at Partners Consulting at 7101 Village Dr, Buena Park, CA. The group asks for $7 (or $5 if you RSVP by 3pm).

At both user group meetings I’ll be giving the same presentation “Back To Basics; Getting Back To The Basics of SQL Server“.

Also on Wednesday I’ll be attending and answering questions at the SSWUG vConference.  My sessions are from 9:30a-10:30a, 10:30a-11:30a and 12:00p-1:00p (all times are Pacific).  If you haven’t registered yet, don’t forget to use my discount code DCSPVC10 to save $30.00.

Now on the 6th and 7th I’ll be working at Starbucks in the afternoons so that I don’t have to sit in traffic forever.  On the 6th I’ll be at the Starbucks at 233 Manhattan Beach Blvd, Manhattan Beach, CA.  On the 7th I’ll be at the Starbucks at 7711 Beach Blvd, Buena Park, CA.  I’ll probably get there about 2pm or so on each day if anyone wants to come and hang out before the meeting.  Keep an eye on my Twitter stream (@mrdenny) as I’ll be posting where I’m at as I get there.

On Thursday I’ll be up in Redmond, WA at a Tweet Up with Donald Farmer on the Microsoft Campus.  Sadly it’s a private event so I can’t invite everyone to show up and hang out.

Sometime on Friday (or on Thursday after the Tweet Up) my friends Jorge Segarra (@SQLChicken), Sean and Jen McCown (@MidnightDBA) and I will be recording a podcast.  The topic of our DBAs@Midnight recording will be plagiarism, a topic which has seen a lot of discussion on Twitter recently.  It should be a good time recording this.  Once I know when Sean and Jen get it posted I’ll be sure to send out the link.

Hopefully I don’t have to get a whole lot of my normal work done this week.  If I do, I’m not sure when I’m actually going to be able to.

Denny


April 1, 2010  11:00 AM

Getting started with Master Data Services in SQL Server 2008 R2

Denny Cherry

A couple of days ago a new article went live over on SearchSQLServer.com where I dig into setting up Master Data Services.  This is the first in a series of two or three articles in which I go into MDS in some detail.

The first article, “Getting started with Master Data Services in SQL Server 2008 R2”, is the one that just went up, with the others coming in the following months.

Denny


March 29, 2010  11:00 AM

So I was installing SQL 2008 CU6 for SP1 the other day…

Denny Cherry

God, that sounds boring, doesn’t it?

Anyway, I was installing SQL 2008 CU6 for SP1 the other day and I was greeted with the dreaded “some process is running which will cause your SQL Server to reboot after install” screen.  The app which was running was fshost.exe, which is part of the Full Text Search process.

So I naturally stopped the SQL Full Text service and gave it a few minutes, but the process was still running.  As this was a production server I didn’t want to just kill the process (the only thing which uses Full Text on the server is the MS CRM database, and we don’t even use MS CRM yet).

So I moved forward and opted for the reboot.  After the installer finished, however, there was no reboot prompt.  Apparently when the SQL Server service went down for patching, fshost.exe also went down, so there was no reboot needed.

So if you are installing a CU and it tells you that you’ll need to reboot because of fshost.exe, you probably won’t need to reboot.

Denny


March 25, 2010  11:00 AM

Using Service Broker instead of Replication

Denny Cherry

During my SQL Service Broker sessions I talk about using SQL Service Broker as a replacement for SQL Replication.  But I’ve never really documented how to do this other than bits and pieces here and there.  So here I’m going to try and spell out how I put a system like this together for an auto finance company that I used to work for.

Why Service Broker

Before I get into the code, it’s probably worth talking about why we went with SQL Service Broker over SQL Replication.

The main reason is that we wanted to be able to do ETL on the data as it moved from the production OLTP database to the reporting database.  We also wanted the ability to easily scale the feed from one to many reporting databases, with the reporting servers being in different sites.

Tables

Below you’ll find some SQL code that we’ll use to create some sample tables.  In our OLTP database we have two tables, LoanApplication and Customer, and in our reporting database we have a single table, LoanReporting.  As data is inserted or updated in the LoanApplication and Customer tables, that data will be packaged up into an XML document and sent to the reporting database.  In my sample code here everything is on one server, but the databases could easily enough be moved to separate servers.

IF EXISTS (SELECT * FROM sys.databases WHERE name = 'Sample_OLTP')
DROP DATABASE Sample_OLTP
GO
IF EXISTS (SELECT * FROM sys.databases WHERE name = 'Sample_Reporting')
DROP DATABASE Sample_Reporting
GO

CREATE DATABASE Sample_OLTP
CREATE DATABASE Sample_Reporting
GO

ALTER DATABASE Sample_OLTP SET NEW_BROKER
ALTER DATABASE Sample_Reporting SET NEW_BROKER
GO
ALTER DATABASE Sample_OLTP SET TRUSTWORTHY ON
ALTER DATABASE Sample_Reporting SET TRUSTWORTHY ON
GO

USE Sample_OLTP
GO
CREATE TABLE LoanApplication
(ApplicationId INT IDENTITY(1,1),
CreateTimestamp DATETIME,
LoanAmount MONEY,
SubmittedOn DATETIME,
ApprovedOn DATETIME,
LoanStatusId INT,
PrimaryCustomerId INT,
CoSignerCustomerId INT)
GO
CREATE TABLE Customer
(CustomerId INT IDENTITY(1,1),
FirstName VARCHAR(50),
LastName VARCHAR(50),
EmailAddress VARCHAR(255))
GO
USE Sample_Reporting
GO
CREATE TABLE LoanReporting
(ApplicationId INT,
CreateTimestamp DATETIME,
LoanAmount MONEY,
SubmittedOn DATETIME,
ApprovedOn DATETIME,
LoanStatusId INT,
PrimaryCustomerId INT,
PrimaryFirstName VARCHAR(50),
PrimaryLastName VARCHAR(50),
PrimaryEmailAddress VARCHAR(255),
CoSignerCustomerId INT,
CoSignerFirstName VARCHAR(50),
CoSignerLastName VARCHAR(50),
CoSignerEmailAddress VARCHAR(255))
GO

Service Broker Objects

With this system I use a single pair of Service Broker queues to handle all the data transfer. This way transactional consistency can be maintained as the data flows in.  These SQL Service Broker objects should be created in both the Sample_OLTP and the Sample_Reporting database.

CREATE MESSAGE TYPE ReplData_MT
GO
CREATE CONTRACT ReplData_Ct
(ReplData_MT SENT BY ANY)
GO
CREATE QUEUE ReplData_Source_Queue
GO
CREATE QUEUE ReplData_Destination_Queue
GO
CREATE SERVICE ReplData_Source_Service
ON QUEUE ReplData_Source_Queue
(ReplData_Ct)
GO
CREATE SERVICE ReplData_Destination_Service
ON QUEUE ReplData_Destination_Queue
(ReplData_Ct)
GO

The Routes

In the OLTP database you create a route like this (just change the BROKER_INSTANCE to match your server).

CREATE ROUTE ReplData_Route
WITH SERVICE_NAME='ReplData_Destination_Service',
BROKER_INSTANCE='566C7F7A-9373-460A-8BCC-5C1FD4BF49C9',
ADDRESS='LOCAL'

In the reporting database you create a route like this (just change the BROKER_INSTANCE to match your server).

CREATE ROUTE ReplData_Route
WITH SERVICE_NAME='ReplData_Source_Service',
BROKER_INSTANCE='A4EC5E44-60AF-4CD3-AAAD-C3D467AC682E',
ADDRESS='LOCAL'
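
If you aren’t sure what GUID to use for BROKER_INSTANCE, you can pull it from sys.databases.  This little helper query (not part of the original setup) returns the broker GUID for each of the sample databases:

SELECT name, service_broker_guid
FROM sys.databases
WHERE name IN ('Sample_OLTP', 'Sample_Reporting')
GO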

Stored Procedures on the OLTP Database

In the OLTP database we need just a single stored procedure.  This stored procedure handles the sending of the message so that we don’t have to repeat that code in each table’s trigger.

CREATE PROCEDURE SendTriggerData
@XMLData XML
AS
BEGIN
DECLARE @handle UNIQUEIDENTIFIER

BEGIN DIALOG CONVERSATION @handle
FROM SERVICE ReplData_Source_Service
TO SERVICE 'ReplData_Destination_Service'
ON CONTRACT ReplData_Ct
WITH ENCRYPTION=OFF;

SEND ON CONVERSATION @handle
MESSAGE TYPE ReplData_MT
(@XMLData)
END
GO

OLTP database Triggers

The triggers that are on each table on the OLTP database are kept as small as possible so that we put as little additional load on the OLTP server as possible.  Obviously there will be some additional load on the OLTP database, but we want to keep that to a minimum.

CREATE TRIGGER t_LoanApplication ON LoanApplication
FOR INSERT, UPDATE
AS
BEGIN
DECLARE @xml XML

SET @xml = (SELECT *
FROM inserted as LoanApplication
FOR XML AUTO, ROOT('root'))

EXEC SendTriggerData @xml
END
GO

CREATE TRIGGER t_Customer ON Customer
FOR INSERT, UPDATE
AS
BEGIN
DECLARE @xml XML

SET @xml = (SELECT *
FROM inserted as Customer
FOR XML AUTO, ROOT('root'))

EXEC SendTriggerData @xml
END
GO

Procedures on the Reporting Database

The reporting database is where the real work happens.  Here we take the XML document, identify which table the data is from, then pass the XML document to a child procedure which processes the data and updates the table.

CREATE PROCEDURE ProcessOLTPData_LoanApplication
@xml XML
AS
DECLARE  @hDoc INT
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

UPDATE LoanReporting
SET ApplicationId=a.ApplicationId,
CreateTimestamp = a.CreateTimestamp,
LoanAmount=a.LoanAmount,
SubmittedOn=a.SubmittedOn,
ApprovedOn=a.ApprovedOn,
LoanStatusId=a.LoanStatusId
FROM OPENXML (@hDoc, '/root/LoanApplication')
WITH (ApplicationId INT '@ApplicationId',
CreateTimestamp DATETIME '@CreateTimestamp',
LoanAmount MONEY '@LoanAmount',
SubmittedOn DATETIME '@SubmittedOn',
ApprovedOn DATETIME '@ApprovedOn',
LoanStatusId INT '@LoanStatusId',
PrimaryCustomerId INT '@PrimaryCustomerId',
CoSignerCustomerId INT '@CoSignerCustomerId') a
WHERE a.ApplicationId = LoanReporting.ApplicationId

INSERT INTO LoanReporting
(ApplicationId, CreateTimestamp, LoanAmount, SubmittedOn, ApprovedOn, LoanStatusId, PrimaryCustomerId, CoSignerCustomerId)
SELECT ApplicationId, CreateTimestamp, LoanAmount, SubmittedOn, ApprovedOn, LoanStatusId, PrimaryCustomerId, CoSignerCustomerId
FROM OPENXML (@hDoc, '/root/LoanApplication')
WITH (ApplicationId INT '@ApplicationId',
CreateTimestamp DATETIME '@CreateTimestamp',
LoanAmount MONEY '@LoanAmount',
SubmittedOn DATETIME '@SubmittedOn',
ApprovedOn DATETIME '@ApprovedOn',
LoanStatusId INT '@LoanStatusId',
PrimaryCustomerId INT '@PrimaryCustomerId',
CoSignerCustomerId INT '@CoSignerCustomerId') a
WHERE NOT EXISTS (SELECT * FROM LoanReporting WHERE a.ApplicationId = LoanReporting.ApplicationId)

EXEC sp_xml_removedocument @hDoc
GO
CREATE PROCEDURE ProcessOLTPData_Customer
@xml XML
AS
DECLARE  @hDoc INT
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

UPDATE LoanReporting
SET PrimaryEmailAddress = EmailAddress,
PrimaryFirstName = FirstName,
PrimaryLastName = LastName
FROM OPENXML(@hDoc, '/root/Customer')
WITH (CustomerId INT '@CustomerId',
FirstName VARCHAR(50) '@FirstName',
LastName VARCHAR(50) '@LastName',
EmailAddress VARCHAR(255) '@EmailAddress') a
WHERE PrimaryCustomerId = a.CustomerId

UPDATE LoanReporting
SET CoSignerEmailAddress = EmailAddress,
CoSignerFirstName = FirstName,
CoSignerLastName = LastName
FROM OPENXML(@hDoc, '/root/Customer')
WITH (CustomerId INT '@CustomerId',
FirstName VARCHAR(50) '@FirstName',
LastName VARCHAR(50) '@LastName',
EmailAddress VARCHAR(255) '@EmailAddress') a
WHERE CoSignerCustomerId = a.CustomerId

EXEC sp_xml_removedocument @hDoc
GO

CREATE PROCEDURE ProcessOLTPData
AS
DECLARE @xml XML
DECLARE @handle UNIQUEIDENTIFIER, @hDoc INT

WHILE 1=1
BEGIN
SELECT @xml = NULL, @handle = NULL

WAITFOR (RECEIVE TOP (1) @handle=conversation_handle,
@xml=CAST(message_body AS XML)
FROM ReplData_Destination_Queue), TIMEOUT 1000

IF @handle IS NULL
BREAK

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

IF EXISTS (SELECT *
FROM OPENXML(@hDoc, '/root/LoanApplication')
)
BEGIN
EXEC ProcessOLTPData_LoanApplication @xml
END

IF EXISTS (SELECT *
FROM OPENXML (@hDoc, '/root/Customer')
)
BEGIN
EXEC ProcessOLTPData_Customer @xml
END

EXEC sp_xml_removedocument @hDoc

END CONVERSATION @handle
END
GO

You can now set the queue to run the ProcessOLTPData stored procedure as an activation procedure and it will process the data as it comes in, as shown below.
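
Assuming the object names from the scripts above, turning on activation in the Sample_Reporting database would look something like this, with a quick smoke test from the OLTP side afterwards (the values in the INSERT are just placeholders):

USE Sample_Reporting
GO
ALTER QUEUE ReplData_Destination_Queue
WITH ACTIVATION
(STATUS = ON,
PROCEDURE_NAME = ProcessOLTPData,
MAX_QUEUE_READERS = 1,
EXECUTE AS OWNER)
GO

-- Insert a test row on the OLTP side; the trigger packages it up and sends it
USE Sample_OLTP
GO
INSERT INTO LoanApplication
(CreateTimestamp, LoanAmount, LoanStatusId, PrimaryCustomerId)
VALUES (GETDATE(), 15000, 1, 1)
GO
-- Give activation a moment to fire, then the row should show up here
USE Sample_Reporting
GO
SELECT * FROM LoanReporting
GO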

Hopefully someone finds this useful as a framework in their shop.  If you have any questions please feel free to post them here.

Denny


March 23, 2010  9:44 PM

Plagiarism, Inspiration, and John Dunleavy (Madder and More Pissed Off)

Denny Cherry

So until just a few minutes ago, the problem that everyone had with the blog written by John Dunleavy, the owner of SQLTech Consulting, was that he was taking the first part of whitepapers and posting them with links to the original work, a practice which John defended.

However, Paul Randal then pointed out John’s entry into this month’s T-SQL Tuesday.  It would appear that a good portion of John’s post was taken directly from an older article which I wrote.  Below are screenshots from John’s blog.  I’ve underlined the parts which were taken directly from an article I wrote a couple of years ago titled Optimize Disk Configurations in SQL Server.  You’ll find the matching content about halfway down.

Now if you compare those sections to the RAID level descriptions that I posted in my article, you’ll find that they match way too closely to be an accident.  Hell, the RAID 5 one matches word for word, and the RAID 10 one is pretty damn close.

The best part is that John apparently didn’t finish my article; he just got to the parts he needed to swipe and stopped, because in his RAID 10 description he says that if you are attached to a SAN you’ll be using RAID 10 for everything, and he even says that RAID 0 is fine for ultra-high-write databases.  Whereas in my article I say that RAID 5 or RAID 6 will work fine for most databases, and to move up to RAID 10 if you need the higher write throughput.

Now, yes, you could say that it’s RAID, how many ways are there to describe it, so it could just be chance.  However, note that the punctuation matches, including the placement of the semicolons.  It really isn’t very likely that this was an accident.

Even if posting unattributed copies of others’ work with links back were OK, flat-out stealing content from their site isn’t.

Needless to say, I was less than impressed when I saw this.  In case you were wondering, stealing someone’s content is not flattery.  At all.

Denny

P.S. In case you are looking for the unchopped version of his blog post, I’ve included screenshots below in case the post is removed from his site.

