During the ’90s life was good: there weren’t many regulations to deal with, and you followed best practices to the extent that the company you worked for could afford them. Change control usually consisted of sending out an email saying “Hey, we’re going to change a bunch of stuff, nothing should break.” Today, however, things are different; very different depending on how large the company you work for is, and on whether it is public or private.
Today I work for a private company, so we are able to run things much like the “good old days,” but most people are not so lucky. Change control processes are cumbersome at best, and the number of legal compliance issues that we have to not only be aware of, but actually understand, is quite daunting. You’ve got everyone’s favorite, SOX, which says that lots of stuff needs to be controlled and duties must be separated, but doesn’t give any sort of guidelines as to how to do this, or which duties should be done by whom. If you take or process credit cards over the web, then you’ve got PCI (which I’m dealing with now on our shopping cart server). If you work for a company which stores medical records, then God help you when it comes to HIPAA. For those that aren’t aware, part of HIPAA basically says that every lookup against medical records has to be logged. Within an application that’s easy. On SQL Server 2008 that’s easy. But what about the legacy SQL Server 2000 medical application? The law provides no guidance other than to say “do it.”
I remember that when Windows 2003 SP1 was released (it may have been SP2) there was a thread on a forum somewhere (probably tek-tips.com) where we were discussing SP1 and HIPAA. Somewhere in HIPAA it says that you have to keep your systems patched. Somewhere else it also says that systems which store medical information on them cannot report data back to a vendor. Well, SP1 introduced code that would allow a sysadmin to have the server report usage and error data back to Microsoft. So which part of HIPAA should you violate?
On top of all the federal regulations, states are now passing data encryption laws which have to be dealt with. Here in California we’ve had data encryption regulations in place since 2003 or so, yet at several companies that I’ve worked at, the IT managers didn’t know anything about the law or what it meant. The law here in California is so vague that it is almost meaningless. It says (in layman’s terms) that if your data is breached, and the data isn’t encrypted, then you have to tell your customers either directly or via the media. But it doesn’t define encryption, or how strong that encryption has to be. It at least defines which data items it covers (name, address, username, password, Social Security number, etc.), but if you take the law at face value, doing a simple character replacement is sufficient to comply. While this satisfies the letter of the law it obviously doesn’t satisfy the spirit of the law; but the letter of the law is what matters in court.
Having to keep track of all the laws which apply to us is mind boggling at best, and impossible at worst. And reading the laws is amazingly painful. The California law I cited above, which I’ve read several times, still confuses me to no end; and I’ve already reviewed it with the legal team at one company due to a data breach. And consider that there are data encryption laws in several states, all of which you have to comply with if you have customers in that state, or if you do business in that state. I have no idea which states have these laws, or even how many states have them. Even if I did, I’d then need to find the overlaps and the exceptions, then figure out how to build our database to meet all of them. Beyond that, I’d have to anticipate the laws that could be coming in the other states and account for those potential laws at the same time.
At this point handling the database design is just getting more complex.
Now the new Transparent Data Encryption is great for handling data at rest. It keeps your backups encrypted and safe. But what happens when the bad guy breaks in and swipes the data by logging into the database? Yes, the data is encrypted on disk, so technically we are covered, but the data is still out there and usable, because the bad guy was able to log in as a database user with SELECT access to the tables and dump the data to his system with a simple SELECT statement. What has to happen now?
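For reference, getting TDE turned on only takes a few statements. This is a minimal sketch against SQL Server 2008; the certificate name, database name, and password are made up for the example, and in the real world you’d back up the certificate and its private key immediately after creating them.

```sql
-- Minimal TDE setup sketch (SQL Server 2008 Enterprise).
-- All names and the password below are placeholders for the example.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'ChangeMe!Str0ngPassword';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO
USE SalesDB;  -- hypothetical database to encrypt
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE SalesDB SET ENCRYPTION ON;
GO
```

Note that none of this restricts what an authenticated login with SELECT rights can read, which is exactly the gap described above.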
I don’t know about you, but I got into this field so that I wouldn’t have to worry about stuff like this. I guess those times are over with.
This rant is now complete. See what happens when I get on a plane at 6:30am and the nice lady starts pouring coffee down my throat for the entire flight.
This setting is the Large Pages setting. It allows the SQL Server to allocate memory from the host in 2 MB pages instead of 4 KB pages. This has been shown to provide up to a 6% performance improvement, however your mileage may vary.
This setting can be found by editing the host’s Advanced Settings on the Configuration tab. The setting is Mem.AllocGuestLargePage.
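If you prefer the command line over the vSphere client, the same setting can be changed from the host’s console. This is a sketch only; the exact tool depends on your ESX version (esxcfg-advcfg on classic ESX, esxcli on later ESXi builds), so verify the syntax on your own host first.

```shell
# Classic ESX service console: set Mem.AllocGuestLargePage to 1 (enabled)
esxcfg-advcfg -s 1 /Mem/AllocGuestLargePage

# Later ESXi builds expose the same advanced option through esxcli
esxcli system settings advanced set -o /Mem/AllocGuestLargePage -i 1
```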
The Sessions are:
IT Pro Panel: Q&A for Developers
SQL Server Service Broker
SQL Service Broker Advanced Performance Tips and Tricks
Virtual SQL Servers: Should I or Shouldn’t I?
I’ll be posting the slide decks and sample code shortly before Code Camp.
So You’re On A Deserted Island With WiFi, and you’re still on the clock at work. Okay, so not a very good situational exercise here, but let’s roll with it; we’ll call it a virtual deserted island. Perhaps what I should simply ask is this: if you had a month with no walk-up work, no projects due, and no performance issues, and could devote your time to nothing but the wishlist of items you’ve been wanting to accomplish at work but keep getting pulled away from, what would be the top items that would get your attention?
Learn More About vSphere
A couple of weeks ago VMware released their next version of ESX, complete with new branding. The new product is vSphere 4.0, and it has a ton of new features. I’d love to be able to spend some time learning more about the new features, as well as the best way to upgrade an ESX 3.5 cluster to vSphere 4.0. Then I would be able to upgrade our production cluster to vSphere 4.0.
Learn more about SSAS
SQL Server Analysis Services is a part of SQL Server that I don’t know much about. I’d love to be able to sit down and spend some time learning more about it, and how I can leverage it to find ways to make more sales.
Get more work done on my open source projects
For a while now my work has been stalled on the two open source projects that I’ve started (Standalone SQL Agent and Outside Queue to SQL Service Broker Adapter). The Standalone SQL Agent is designed for SQL Servers running SQL Express, so that they can have some job scheduling options, as the SQL Express engine doesn’t include the SQL Agent. The Outside Queue to SQL Service Broker Adapter allows you to route messages from another message queueing system directly into the SQL Service Broker without having to write custom software to sit between the two and process the messages.
I’m a little late in getting this posted, so I’m going to hold off on tagging anyone this time around.
I submitted a bunch of sessions, and one was picked. I’ll be presenting “Storage for the DBA”. In this session I’ll be showing the best ways to talk to your storage admins and sysadmins so that you can be sure to get the correct storage that you need.
And unlike Brent Ozar I’m actually going to Disneyland afterwards.
Obviously you can set up a SQL Profiler trace to capture this information, but that requires the overhead of running SQL Profiler, and who wants that?
All versions of SQL Server (from 2000 and up, at least) provide some level of logging about who has tried to log into the SQL Server. Within Enterprise Manager or SQL Server Management Studio’s Object Explorer, right-click on the server and select Properties (if using Enterprise Manager, select Properties, not Connection Properties).
Select the Security tab and find the Login Auditing section. By default SQL Server only logs failed logins, which is good, as it tells you who hasn’t been able to log into the server. However, it doesn’t tell you if they have eventually been successful, which is why you may want to change this to both failed and successful logins.
Now changing this setting has an upside and a downside. The upside is that you know when someone has successfully broken into your database using a brute force attack, and when it happened. The downside is that every client that successfully connects to the SQL Server will also log an entry, making it very hard to find the specific entry you are looking for.
Where do these entries get logged, you ask? That’s the other downside. They get logged to the SQL Server ERRORLOG file and to the Windows Security log, which means that these files will fill up fast; and if you have a large enough client base logging into the database, VERY fast.
In a perfect world, I’d set this screen to both failed and successful logins. In reality failed is probably all I can do.
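If you’d rather not click through the UI on every server, the same setting can be flipped from T-SQL. On a default instance it lives in the registry under the MSSQLServer key, and the undocumented xp_instance_regwrite procedure can change it. The AuditLevel values shown here (0 = none, 1 = successful only, 2 = failed only, 3 = both) are my understanding of the mapping, so double-check against your instance before relying on it; a restart of the SQL service is needed for the change to take effect.

```sql
-- Set login auditing to both failed and successful logins (AuditLevel = 3).
-- xp_instance_regwrite is undocumented; test on a non-production box first.
EXEC master..xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'AuditLevel',
    N'REG_DWORD',
    3;
```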
SQL Server 2000 didn’t provide a whole lot of information about what was happening, as it only says that login n tried to connect and failed. Not exactly helpful, as you don’t know who was trying to log into the SQL Server using the sa account over and over again. SQL Server 2005 and up include one little piece of helpful information: the IP address of the machine that tried to connect. This will help tell you who is connecting to the SQL Server so that you can smack them around.
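Rather than scrolling through the whole ERRORLOG by hand looking for these entries, you can search it from T-SQL. sp_readerrorlog is undocumented but has shipped with SQL Server for years; this sketch assumes the SQL Server 2005+ version, which accepts a search string.

```sql
-- Search the current ERRORLOG (file 0) for failed login entries.
-- Parameters: log file number, log type (1 = SQL Server error log), search string.
EXEC master..sp_readerrorlog 0, 1, N'Login failed';
```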
That’s right, I have succeeded where Gargamel has always failed. I have taken on the mighty Smurf, and have not only survived the battle, but I have won it.
In other words, I got bored this weekend and decided to dye my hair blue. From the front it’s pretty boring, but from the back you can really see the color. (No worries folks, no smurfs were harmed in the dyeing of my hair.)
(Click to enlarge.)
My hair shall remain blue (probably) until after code camp, so my blue self will see you there.
Most of our servers are VMs, including a Windows 2008 cluster (yeah, I know, it’s not supported; I’m a rebel, damn it, and I want a Windows 2008 cluster installed under VMware ESX).
When I was setting up my VMware templates I decided to save myself a step and added the failover clustering feature to the template, as all VMs running Windows 2008 Enterprise would be clustered. Apparently this causes a problem: when failover clustering is installed it creates a few hidden network adapters, each with its own MAC address. When you then deploy that template as a VM, those MAC addresses aren’t changed, so every VM deployed from the template shares the same hidden MACs. That causes the Validate wizard to fail, as unique MAC addresses are a requirement for creating a cluster.
Every time a database is backed up, records about the backup are written to the msdb database. Over time this can add up to a lot of useless data floating around the SQL Server.
If you like to use the UI to restore your databases, this can also lead to the UI stalling when the restore database window comes up.
Fortunately Microsoft has provided a system stored procedure which you can use to clean up this old data: sp_delete_backuphistory. Its usage is very simple; it takes a single parameter, @oldest_date, which is the oldest date of data you want to keep. As an example:
EXEC sp_delete_backuphistory '1/1/2009' would delete backup history older than January 1, 2009.
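In practice you’d run this on a schedule rather than hard-coding a date. Here’s a sketch of a rolling cleanup; the 90-day window is an arbitrary example, not a recommendation, so pick whatever retention your shop actually needs.

```sql
-- Keep only the last 90 days of backup history in msdb.
-- Run this from a SQL Agent job (or the Windows task scheduler on SQL Express).
USE msdb;
DECLARE @oldest_date datetime;
SET @oldest_date = DATEADD(day, -90, GETDATE());
EXEC dbo.sp_delete_backuphistory @oldest_date = @oldest_date;
```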