Probably one of the least fun things about being a DBA is having to deal with the licensing of SQL Server.
One of my developers came to me asking me to add more RAM to the C++ build server. However, the machine was already at 2 GB, so I wasn't sure that adding more RAM would help. It turns out that VS 2005 doesn't support AWE, so adding more RAM wasn't going to be of much use on its own. Then I found a post by Steve Harman entitled Hacking Visual Studio to Use More Than 2 Gigabytes of Memory.
After making those changes it seemed to work.
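For anyone who can't reach the post, here's a sketch of the standard large-address-aware technique it's based on (the exact path is an assumption; adjust for your install). On 32-bit Windows you enable the /3GB boot switch so user processes can address 3 GB instead of 2 GB, then flip the LARGEADDRESSAWARE header flag on devenv.exe so Visual Studio will actually use the extra space:

```
rem Run from the Visual Studio 2005 command prompt, after enabling the
rem /3GB switch in boot.ini (Windows XP/2003) and rebooting.
rem Path below is the default VS 2005 install location (assumption).
editbin /LARGEADDRESSAWARE "C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\devenv.exe"

rem Verify the flag took effect; look for
rem "Application can handle large (>2GB) addresses" in the output.
dumpbin /headers "C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\devenv.exe" | findstr /i "large"
```

Note that editbin modifies the executable in place, so take a copy of devenv.exe first in case an update or repair install needs to see the original.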
For those of you who know me, or have heard me talk at a Code Camp in the last year, you've heard me talk about a data center migration that I want to do from Rackspace in Texas to our own equipment in the LA area. Well, that day has finally come.
Our current environment has served us well, but we have outgrown the services that Rackspace can offer us, and we have purchased our own production environment. This isn't any rinky-dink environment either. We are starting out with a fully redundant, highly available environment which can be scaled by simply deploying more VMs. If the VMware hardware ever becomes overtaxed, we can simply plug another VMware server into the mix and shift VMs from one node of the cluster to another.
We are very proud of our new environment, so I figured that I’d give you some of the tech specs of it (yeah, I’m totally bragging here).
On the storage side of things we’ve got an EMC CX4-240 with ~35TB of storage laid out in three tiers. This is connected via multiple 4 Gig fibre cables to a pair of Cisco fibre switches. Each fibre switch is connected to each of the SAN attached servers.
We went with Dell servers (I would have preferred HP servers, but I was overruled).
The SQL Servers and the VMware servers are identical: quad-chip, quad-core servers, each with 64 GB of RAM. Each pair will be clustered for high availability. The VMware servers will look a little like they puked cables out of the back. Because of all the various subnets, and to ensure that each subnet is connected to each of the redundant network switches, each of the VMware ESX servers will have 11 Ethernet cables and 2 fibre cables coming out of the back.
The VMware vCenter services are running on a little single-chip, quad-core server. This is the only part of the system which isn't redundant, but ESX can run fine for up to 14 days without the license server running, and since this machine has a 4-hour turnaround on parts, we'll be fine if the machine dies.
The file servers, which host the screenshots, emails, etc. that have been captured by our application and are served to the website upon request, are a pair of dual-chip, quad-core servers, also clustered for high availability.
All the servers are SAN attached via the fibre and all data will be saved on the SAN.
Our current environment is much smaller: a single SQL Server, three web servers, and two file servers. The only redundant pieces are the fibre cables from the SQL Server to the SAN, and the fact that we have three web servers. However, if the newest web server goes down in the middle of the day, the other two will choke under the load.
Rackspace has been pretty good to us over the years. It just wasn't cost effective for us to purchase this level of hardware before now, and Rackspace was able to provide us with a good service level for a reasonable price. But at this point, because of the amount of hardware we were looking to move and the amount of bandwidth we are going to be using, it simply became more cost effective for us to host the systems at a local colo.
The main reason that I'm telling everyone this is that if you have been trying to find me for the last two weeks or so, this is why I can't be found. I've been spending pretty much every waking moment putting this together and getting it all set up so that we can migrate over to it.
Needless to say, it's an awesome project. How many people get the chance to build a new data center and design it the way they want to from scratch? Pretty much no one. Data centers usually grow in a sporadic way from a small config of a server or two, and they are inherited from one person to the next. But this time I get to design everything the way I want to from the ground up. It's going to be a blast.
I haven't made a decision on putting databases in the cloud. I think Amazon, and now Microsoft, have the right idea for cloud databases: give people lots of options, use name/value pairs (AmazonDB or the old-school Azure database) when it makes sense, and use a full-blown RDBMS when it makes sense.
But the big question that I have (besides pricing) is how does all this fit into the overall picture for my company or my client.
Do I see a lot of large enterprises moving large parts of their environment into the Cloud? Probably not.
Do I see the small/medium business moving customer facing applications to the Cloud? Possibly, it’s going to depend on the application and the business model.
Do I see the cloud being a stepping stone on an eventual path to building your own data center? Very much so.
Why don't I see large enterprises moving data into the cloud? Mainly control and compliance. Large companies (and even larger medium-sized companies) want to control everything about their data. They also need to be able to ensure that no one who isn't authorized to view the data can view it. The easiest way to do this is to own the machines that hold the data. Large companies also have to have DR plans, and those plans usually can't depend on some other company saying "Yes, it will be back up."
I said above that I see the cloud being a stepping stone to getting your own servers and data center. The path that I see in my mind is for the small to medium business which can't afford to set up its own servers onsite or at a colo. For them, cloud computing is a great first step to let them get started and see where the application goes. If nothing happens, then there isn't much capital lost. If it grows like crazy, then everything scales nicely (I'm not yet sure how well and how automatically the databases scale). This gives the SMB the ability to judge where the business is going to go and how fast it's going to get there.
Some applications may be able to stay in the cloud forever: either they persist a lot of the data at the client, or they simply never outgrow the cloud. On the other hand, I see a lot of applications going from running in the cloud to an MSP (Managed Services Provider) such as Rackspace, MaximumASP, etc. These guys offer the benefits of dedicated hardware without having to shell out massive amounts of cash up front. Over time, however, it becomes cost ineffective to continue at an MSP, and buying your own hardware simply becomes the correct thing to do. The trick is knowing when this is the case, so that you aren't spending too much money at the MSP.
Now, for those that were paying attention, you'll notice that I skipped the point above about the SMB moving some things into the cloud. I think this falls into both answers above: some things will make sense to host up in the cloud, other things won't.
What do you think will happen to the cloud? Where do you see it being really useful? Will you be moving applications into the cloud at your current company, at a future company?
These are questions that you’ll need to ask yourself at some point, so why not now? In these times of rapid change to the IT world (and the world in general) don’t be afraid to change your answers to these questions.
Personally I don't get most of the social networking sites / products / whatever you want to call them.
If your company does business in the state of California then don’t forget to change the sales tax rate in your shopping carts tomorrow.
That's right, tomorrow the state of California will be playing the world's worst April Fools' Day prank. We will be increasing our state sales tax by 1% tomorrow, April 1, 2009. Depending on the county you live in, sales tax could now be as high as 10%, as some counties are also increasing the county sales tax.
At this point I'm betting my boss is pretty happy that we finished the purchase of our new SAN and servers in March (a blog post about that project will come later). The increased rate on a $700k (US) purchase would be another $7,000 (US) to the state government.
If you don’t collect enough sales tax, you still have to pay the correct amount to the state, so that extra money that you don’t collect could eat into the bottom line pretty quickly.
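The math is worth making concrete. Here's a minimal sketch (function names are my own, and the 0.01 default is just the 1% increase described above) of both the extra tax on a big purchase and the bottom-line hit if your cart keeps collecting at the old rate:

```python
def extra_tax(purchase_total: float, rate_increase: float = 0.01) -> float:
    """Additional sales tax owed because of a rate increase.

    rate_increase defaults to 0.01, the 1% California increase
    effective April 1, 2009.
    """
    return purchase_total * rate_increase


def uncollected_shortfall(sale_total: float, collected_rate: float,
                          owed_rate: float) -> float:
    """Tax you must still remit to the state but failed to collect
    from the customer; this comes straight out of the bottom line.
    """
    return sale_total * (owed_rate - collected_rate)


# The $700k hardware purchase mentioned above, bought after the increase:
print(extra_tax(700_000))  # 7000.0
```

So a cart left at the old rate quietly costs you 1% of every sale until someone remembers to update it.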
If you don't already read the CSS team's blog, I suggest that you do. A little while ago they posted about how the SQL Dumper utility wouldn't work in a SQL Server 2008 clustered installation.
This should be considered mandatory reading for anyone with a clustered SQL Server 2008 install.
I just published another article up on SearchSQLServer.com. This one is entitled Testing a SQL Server environment before an upgrade. In it I talk about some successes and some failures that I’ve seen when upgrading SQL Server versions and how the failures could have been avoided.
Hopefully you have seen one of the "Did you know" videos that are on the web. If you haven't, I highly recommend that you watch them. I've included the four versions which I've found below.
I've found these videos quite interesting to watch, especially as you go through the different versions and see the numbers being updated and changing over time. After you watch these, I invite you to talk with people about the message that you got from them. I look forward to your comments either here, on Twitter, via email, or in person.
For those that don’t know the first of these videos was created by Karl Fisch for use at a beginning of the school year meeting. You can read up more about Karl on his site and more about Shift Happens.
The original (which I uploaded to YouTube to make it easier to view). In this video he refers to AHS, which is the school he teaches at, or at least did at that time.
[kml_flashembed movie="http://www.youtube.com/v/4FplchaOeuQ" width="425" height="350" wmode="transparent" /]
To this 4 minute and 55 second version.
[kml_flashembed movie="http://www.youtube.com/v/UIDLIwlzkgY" width="425" height="350" wmode="transparent" /]
To this 2007 version which is 8 minutes and 19 seconds long.
[kml_flashembed movie="http://www.youtube.com/v/pMcfrLYDm2U" width="425" height="350" wmode="transparent" /]
To the most recent 2008 version which is 5 minutes and 16 seconds long.
[kml_flashembed movie="http://www.youtube.com/v/jpEnFwiqdx8" width="425" height="350" wmode="transparent" /]
UPDATE: (Yeah, I know, 4 minutes after I published the original.)
Some other videos by the same group of teachers. The first is What If.
[kml_flashembed movie="http://video.google.com/googleplayer.swf?docid=-2855786550703993653" width="400" height="326" wmode="transparent" /]
And the second is 2020 Vision
[kml_flashembed movie="http://video.google.com/googleplayer.swf?docid=7281108124087435381" width="400" height="326" wmode="transparent" /]
Last week Linchi Shea posted a blog entry entitled “How does that AD user account get access to the database?”. In it, Linchi shows a method of finding out which domain groups a SQL Server user uses to access the database.
There is, however, an easier method you can use.
There is a system extended stored procedure called xp_logininfo. Microsoft was even kind enough to document the procedure for us. You can use this procedure to see what groups a user belongs to, or what users are in a domain group, all from T-SQL.
For example, on my sandbox instance if I run
EXEC xp_logininfo 'CORP\dcherry'
I get a result set which says that I gained my access to the SQL Server via the “BUILTIN\Administrators” group.
This is a nice quick and easy method to see what domain group a user used to access the SQL Server.
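For reference, xp_logininfo has a couple of other documented call shapes worth knowing (the group name below is hypothetical, so substitute one of your own):

```
-- With no parameters: list every Windows login and group
-- that has been granted access to the instance.
EXEC xp_logininfo;

-- 'all' shows every permission path a Windows account
-- has to the instance, not just the first one found.
EXEC xp_logininfo 'CORP\dcherry', 'all';

-- 'members' lists the members of a domain group that
-- has a login on the instance.
EXEC xp_logininfo 'CORP\SQLAdmins', 'members';
```

Keep in mind the procedure queries the domain, so the account running it needs to be able to read the relevant Active Directory information.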