Woohoo! I am shocked, but after sending my new boss the info for IE2 from sqlskills.com and the news of the super early bird discount, she had me book the class. I have booked the class in Chicago for May 6 – 10, along with my flights and my room. I am shocked but very excited. After attending the 2011 IE1 class in Washington, I cannot wait to get into class and get my learn on again. I have been wanting to attend this class for over a year now. And I get to go to Chicago. I have never been to Chicago, so I am very excited about that. I am going to take some days after the class ends for a mini vacation. My wife and son will meet me in Chicago, and we will go see all the cool sights and have a great time. But man, I am super excited for another great week of deep SQL Server training. I cannot wait to get back and start putting some of the things that I learn into practice in order to make our environment even better. I know it is a long way out, but if you can, book this training now so you can save some money, as outlined on Paul's blog. I hope to see some of you at the event, and I am so excited for my brain to hurt like only the sqlskills group can make it.
After changing jobs in late 2011, I was happy to get an opportunity for training in 2012. I was, however, disappointed that I did not get to attend the training that I really wanted to go to. Instead I went to the DevConnections conference in Las Vegas. It was a good conference and I attended some great sessions. I was even lucky enough to get a pre-con on PowerShell by Allen White. It was a great time. I met some cool people, had fun doing some drinking and gambling, and all that good stuff. But for me, the conferences just do not really give me what I really desire. I want to present at the conferences and try to share some of my knowledge, but I really want a deeper dive into SQL Server than a 60 to 75 minute session can give me. That is why I am going to submit for what I really want in 2013, and if I do not get that, I will ask the company to save my portion of the budget so I can go to what I really want in 2014. I will be making my play today, so I will keep you updated.
I wanted to shed a bit more light on the bug that I talked about in my last post. This is something that should really never happen. When you add a data file to SQL Server, it gets a file_id, and that ID is stored in sys.master_files. So if you add a bunch of data files to a database, you might have IDs 1 (primary), 2 (log), 3 (secondary), 4 (secondary), 5 (secondary). This is the situation that my database was in at some point, as shown below.
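You can see the file layout with a quick query against sys.master_files (the database name here is a placeholder):

```sql
-- List the files and their IDs for a database (MyDb is a placeholder name)
SELECT file_id, type_desc, name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID(N'MyDb')
ORDER BY file_id;
```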
So you can see that file_ids 1 through 5 make up my database. But now if I drop file_id 4, that will leave a gap.
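Dropping a data file looks something like this (the database and logical file names are placeholders, and the file has to be emptied before it can be removed):

```sql
-- Move the file's data onto the remaining files in the filegroup
DBCC SHRINKFILE (N'Secondary2', EMPTYFILE);

-- Remove the file, leaving a gap at its file_id
ALTER DATABASE MyDb REMOVE FILE Secondary2;
```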
This ID also has an entry in the backup file, so when you back up and restore the database, the file_ids will match up. The issue was that the virtualization software just assumed that the file_ids would be sequential, with no gaps in the order. Since this was not the case, the VDB was not able to come online. In my last post, I talked about a hacky way to resolve that by modifying sys.master_files (NOT RECOMMENDED).
The other way is to add as many data files back to the source database as you have gaps in your ID sequence. SQL Server will give these files the IDs that fill in the gaps, as shown below.
Notice that file_id 4 is back, but the file name actually says 6, since it was the sixth file created. Once you add these filler files, you will no longer run into the bug.
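Adding a filler file is a plain ALTER DATABASE (the names and path here are placeholders); SQL Server hands the new file the lowest unused file_id:

```sql
-- SQL Server will assign this file the lowest free file_id (4 in this case)
ALTER DATABASE MyDb
ADD FILE (
    NAME = N'Filler1',
    FILENAME = N'C:\SQLData\MyDb_Filler1.ndf',
    SIZE = 8MB
);
```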
So in my testing of the database virtualization product, I ran into a most interesting bug that really caused me some pain to figure out. When I provisioned the VDB, the database showed up on the target instance but was stuck in a restoring state. Checking the error log, I found that the database was unable to start due to a file_id in the file not matching what SQL Server expected based on what was in sys.master_files. The ID of one of the data files was incorrect in sys.master_files. To resolve this, I had to update the sys.master_files catalog directly. I do not recommend doing this, and I did it only because I knew that if something catastrophic happened… well, I just did not care about this database at all. First I went and found Paul Randal talking about how to update system catalogs. Please read it, and you will notice that Paul also cautions you against doing this.
Here are the steps that I took:
- Take the SQL Server instance offline.
- Restart the instance in single-user mode:
  sqlservr -m
- Connect to the instance using the DAC:
  sqlcmd -A
- Allow ad hoc updates to system catalogs:
  sp_configure 'allow updates', 1;
  RECONFIGURE WITH OVERRIDE;
- Update the catalog with the correct file_id:
  UPDATE sys.master_files SET file_id = 6 WHERE database_id = 8 AND file_id = 5;
- Shut down the SQL Server instance.
- Restart the instance in regular mode.
And just like that, my VDB was back.
More on the virtualization product to come.
We notified the vendor, and they are working on a fix, since what I did is just not acceptable for a fully baked product.
I had a need to write some scripts that interface with a Unix environment for some of the new database virtualization beta testing that I am doing. I found plink.exe, a great little command-line SSH tool. I am able to create input files on the fly in my script, call plink, pass it the file, and get the results into a variable. Then I am able to parse the results and act on what I get back.
This virtualization product applies log backup files to the database as they are created on my source. I can plink into the box and find the backup UUID of the last applied log, then query msdb on my source instance to find all the logs that have been applied based on that UUID that I am getting from Unix. Once I have that, I can use PowerShell to go out to the Windows filesystem and remove the log backup files that have already been applied to the VDB.
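The msdb side of that lookup can be sketched like this (the database name and the @LastAppliedUuid value are placeholders; in practice the UUID comes back from the plink call):

```sql
-- The backup_set_uuid of the last log the VDB applied (placeholder value)
DECLARE @LastAppliedUuid uniqueidentifier
    = '00000000-0000-0000-0000-000000000000';

-- Every log backup file taken up to and including that backup set
SELECT bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name = N'MySourceDb'
  AND bs.type = 'L'  -- log backups only
  AND bs.backup_finish_date <=
      (SELECT backup_finish_date
       FROM msdb.dbo.backupset
       WHERE backup_set_uuid = @LastAppliedUuid);
```

From PowerShell, the physical_device_name values returned here are the files that are safe to delete from the filesystem.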
Good stuff. More to come on virtualization. I should be getting an updated set of bits to test late this week.
Today is my one year anniversary working for my new company. When I made the decision to change jobs, I was a bit nervous. The new place really sounded too good to be true, and I am not a grass-is-greener type of guy. I am very loyal as well, and my old company, while on the decline, actually treated me very well. But after a year I have found that this place really is as great as it seemed, and I am very happy here. I miss the people that I used to work with, but I still keep in touch with them. So overall I am very glad that I made the move, and I think it was the right move for my career. It is nice working for a company that is doing well and staffs the DBA position appropriately. I am busy but not overwhelmed, and I get the chance to play with some emerging technology as well as be a lot more proactive than I have ever been before. So I guess what I am saying is: if the opportunity comes up and it looks like a good move, do not let loyalty to a company hold you back. I almost did, but I know that the company would not have been as loyal to me if they needed to make cuts.
I have a need for a linked server that can connect to multiple catalogs, but I need a proc to determine which catalog to connect to. So I need to be able to do something like this.
set @SQL = 'select * from linkname.' + @RemoteDBName + '.dbo.table';
exec (@SQL);
I do not want to do this with dynamic SQL, though. It works great as long as you define @RemoteDBName. The issue with doing it this way is that SQL Server cannot reuse a plan, so you incur the overhead of generating a plan each time the query runs. In my case it runs hundreds of times a minute, so that overhead adds up quickly.
After some searching on Google, I found a few sites that mention using synonyms. So I created a synonym for every table that my queries hit, for each of my linked servers. A bit tedious, but the payoff was good.
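A sketch of the idea (the server, catalog, and table names here are all placeholders): create a synonym for the four-part name, and have the queries reference the synonym so they keep a fixed object name and their plans can be reused.

```sql
-- Point a local name at the remote table (all names are placeholders)
CREATE SYNONYM dbo.RemoteOrders
    FOR LINKSRV.SalesDb.dbo.Orders;

-- Queries now use a fixed object name instead of dynamic SQL
SELECT * FROM dbo.RemoteOrders;

-- Retargeting to a different catalog means dropping and recreating it
DROP SYNONYM dbo.RemoteOrders;
CREATE SYNONYM dbo.RemoteOrders
    FOR LINKSRV.OtherDb.dbo.Orders;
```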
If you are like me, you are getting away for the long weekend. I am heading out to San Diego to take Mason to the beach for the first time. I hope that all of you will be having a good time as well. Get away from work and just have a good time. Drink some beer, and enjoy your family and friends. No matter what you do, though, please be safe.
This one is really for application support teams. All I ask is that you understand how your app works before you blame the database. For instance, if a user is unable to save a project to a database in your application, you should know and understand the workflow and the security model that is in place. You should know if your app handles security for accessing the database or if that is done at the database level. You should know if you are using Active Directory, some other LDAP, or local accounts for access. You should know if the application interfaces with the database with a service account, or if every user needs to have database-level access. You should know what the prerequisites are, in the app, for saving a project: what fields does the end user have to fill out or complete before that can occur? You should know if your app is dependent on or integrated with another app, and if so, you should understand all of the above for the second application as well. I do not think that is too much to ask, and I certainly do not think it is too much to ask that you have your app configured to connect to the correct database before coming down and asking the DBA to take a very granular look at the permissions of users that are and are not able to save these projects.
And yes, I did have an application person come down and ask me to look at object-level permissions for some users and compare them; they were the same. And yes, he had the app pointed at a database other than the one that he had me investigating. So please, do not waste my time. Know your app, know the workflow, know how to check your database connection, and know the security model.
Eight months ago I had my first child, and I thought I would introduce him. His name is Mason, and he is my future DBA. He loves to bang on the keyboard when I sit at the computer holding him, so I just know that he will be a natural. He hears me talking about work all the time, so I am sure he is soaking in all things SQL Server. I bet by the time he is 12 he will be a better DBA than I am. Anyway, here he is.