Recently I was working with a client who wanted to send me a batch of SQL Server execution plans (about 300 of them) in a single file. Normally I would just connect to the SQL Server and view the queries from there, but the client hadn’t gotten me VPN access to their network yet, so that wasn’t an option. I didn’t really want to copy and paste the plans one at a time from the text file that they sent me, so I threw a little app together and figured I should post it online in case anyone else runs into this problem. Just give it the big text file, point it at an empty folder, and click go.
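The original utility isn’t shown here, but the core of it is just a text-splitting loop. Here’s a minimal sketch in Python, under the assumption that the file is a series of XML plans each starting with a `<ShowPlanXML>` root element (the file naming and the regex are my own choices, not necessarily how the app does it):

```python
import re
from pathlib import Path

def split_plans(source_file, output_dir):
    """Split one big text file of concatenated execution plans into
    individual .sqlplan files. Assumes each plan is an XML document
    whose root element is <ShowPlanXML ...>."""
    text = Path(source_file).read_text(encoding="utf-8")
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Find where each plan starts, then slice between consecutive starts.
    starts = [m.start() for m in re.finditer(r"<ShowPlanXML\b", text)]
    for i, start in enumerate(starts):
        end = starts[i + 1] if i + 1 < len(starts) else len(text)
        name = f"plan_{i + 1:03d}.sqlplan"
        (out / name).write_text(text[start:end].strip(), encoding="utf-8")
    return len(starts)
```

Saved with a `.sqlplan` extension, each extracted plan opens graphically in Management Studio just like one captured locally.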
RedGate’s SQL In The City is coming to the amazing Skirball Center in Los Angeles just a couple of weeks after the PASS summit. This is a great chance to see some great presentations given by some of the top speakers in the SQL Server community (who all happen to be great friends of mine).
The speakers (including myself) are:
- Steve Jones, SQL Server MVP
- Brad McGehee, SQL Server MVP
- Grant Fritchey, SQL Server MVP
- Denny Cherry, SQL Server MVP
- Kalen Delaney, SQL Server MVP
- Aaron Nelson, SQL Server MVP
- Ike Ellis, SQL Server MVP
All this for the amazing low price of zero, zip, nada, nothing. If you’ll be in town, or can be in town, this is a great opportunity to get some absolutely amazing SQL Server training. Now is the time to register for the event and reserve yourself a seat before they are all gone.
See you at the Skirball Center, and at the party after.
As you may or may not know, the SQL Server MVPs put out a book last year called the SQL Server MVP Deep Dives in which the proceeds of the book went to charity. The MVPs have done a second volume of the book with once again the proceeds of the book going to charity. This time around we as a group chose Operation Smile as the charity. Now this isn’t an update to the first book, but a totally different book done under the same premise.
A large number of the MVPs submitted topic ideas, and those whose ideas were accepted (they didn’t want 10 chapters on PowerShell, for example) were asked to write a chapter on that topic. My chapter was on Storage (shocking, I know) and got put in as Chapter 5. You can see the chapter list over on Manning’s site (they are the publisher). Currently you can’t pre-order from Manning, but you can from Amazon. If you wait until you can get the book from Manning directly the charity will get a few extra bucks, but if you don’t want to wait we all totally understand.
Rumor has it that there will be another SQL Server Deep Dives book signing at the PASS summit this year. If you are interested in getting your book signed by a large number of the authors, this is your chance. There aren’t that many times when this many of the authors are in the same room at the same time. There hasn’t been any announcement about where or when it will be, but you should be able to pick up a copy of the book at the PASS summit book store so you don’t have to lug a copy from home and back home again. I know that when the MVPs did the last MVP Deep Dives book the book store did sell out, and they had to do a rush overnight order to get more copies for everyone.
I’ll see you at the PASS summit, and hopefully you’ll be carrying around a copy of the new SQL Server MVP Deep Dives Volume 2.
OK, so that’s a total lie. I can barely ask for a beer correctly in Spanish. However, for my Spanish-speaking readers, you will now be able to find my blog posts in Spanish. My good friend Felipe Zuñiga (blog | @felipezuniga_) has begun translating this blog into Spanish, which he is then posting up on his blog.
I’m not sure how far back Felipe will be translating (there are currently over 600 posts on this site, so I’m guessing that not all of them will be done), but many of the new posts that I write he’ll be translating. I think it’s awesome of Felipe to do this, as it will increase the amount of information available to people in their native language, all done in his spare time. If you see Felipe at a conference or user group meeting, be sure to buy the man a beer. Translating technical content is probably about the hardest thing that you can do, and he is doing this all for free.
This last weekend was SQL Saturday 95, and it was a great event. I was speaking during 5 sessions. Three were my own sessions, and two were panels (one panel I did solo as my co-presenter got sick and couldn’t make it). For my three sessions you’ll find my slide decks below.
I hope that everyone had as good a time at SQL Saturday as I did.
P.S. Something to note: don’t leave early just because your raffle ticket has already been drawn. We had so many great prizes that we ran out of raffle tickets and had to start drawing evals from the box for prize winners, so you might have won something else if you’d stuck around till the end.
I’ve managed to put together a nice two city European tour this November thanks to the SQL Server community and some good friends over there.
My first stop will be November 12th, 2011 for SQL Zaterdag IV in the Netherlands (the site is in Dutch and hasn’t been set up for the next event yet). This is a good old fashioned SQL Saturday, except that it isn’t listed on the SQL Saturday website. Something about the SQL Saturday site only being in English, and everyone over there speaking Dutch. Hopefully they don’t expect a presentation in Dutch; if they do, it’s going to be a really long hour. I suppose I’ll have to learn how to say “Hello, where can I get a beer” at the very least. Good thing there’s a 12-hour flight to get over there.
My second stop will be SQL Server Days over in Belgium, where the organizers have asked me to come and speak this November 14th and 15th; the event is presented by SQLUG.be and Microsoft Belux. I’m not on the schedule yet, and I don’t even know which sessions I’ll be presenting (they haven’t been picked), but I’m going to be there and having a great time with all the attendees. As soon as the sessions are picked, I’ll post them here. I can’t wait to see everyone there. SQL Server Days is a two-day conference with some Microsoft speakers on the 14th and community speakers on the 15th.
This is a great opportunity for me because I’ll get to meet some new people who hopefully have read some of my stuff before, so I can get some new feedback and make some new friends. (Meeting new people is one of the things I love about going to new places to present.)
There are some great speakers that I’m really looking forward to seeing again, or for the first time, such as Kevin Kline, Chris Webb, Dandy Weyn and Jennifer Stirrup. Cost for SQL Server Days is VERY reasonable. If you register in September it’s €79, after that it is €99.
So if you are in Europe and you’ve got the time, you should come check out these two great events. As soon as I’ve got more information about the sessions that I’ll be presenting I’ll be sure to get that information published.
Hopefully I’ll see you at one or both events,
So this year for the first time ever we have some sponsors for SQL Karaoke at the SQL PASS summit. A few fine companies have offered to open up the bar for our enjoyment. There will be a limited number of wristbands available for drinks, so be sure to find me Wednesday during the summit to get yours (more details about when and where I’ll be handing these out to follow as we get closer to the summit). Many thanks to our sponsors NEC and Genesis Hosting Solutions. Their support of our event is nothing short of awesome.
NEC is a regular at the SQL PASS summit, so be sure to go to their booth and give them some love at the summit. NEC offers a wide range of products, including great storage arrays and some of the most highly available servers around, which are great for running SQL Server.
Genesis sadly won’t be at the SQL PASS summit this year, but they were kind enough to sponsor us anyway. If you see Genesis Hosting at another conference, be sure to show them the love in the booth. Genesis Hosting Solutions offers a wide variety of hosting and virtualization solutions designed to meet the needs of any company, from those needing just a few VMs to those hosting an entire virtual production or DR facility with hundreds or thousands of virtual machines.
Thanks again to both sponsors for helping us out.
See you at the summit,
Back on July 19th, there was a blog post that I was pointed to which talked about tossing your backup solution and using the cloud for your backups instead. Basically, the argument is that because someone else is now holding your data, you don’t have to worry about DR plans, keeping multiple copies, etc., because someone else is worrying about this stuff now.
On paper this all sounds great, but I work in reality. In reality, as the admin I can’t just trust that someone else is going to manage my DR solution. When things break and we lose the site and have to restore to DR, as the admin I’m the one on the hook with management to get the company up and running again, not whoever I’ve outsourced the backups to.
When it comes to my backups (and pretty much any other data at all), I trust no one with them. If I sent them out to some cloud provider, how do I know that no one is going to look at the data, change it, sell it, etc.? If I don’t control everything from end to end, I can’t be sure that my data is secure. I can encrypt it before I send it up to the cloud, but that’s only giving me so much protection. Encryption can be broken; it just requires having enough machines working on the problem.
There’s also another little problem with using the cloud for backups. Large companies (and even small companies) have lots of data, and I mean lots of data. These days it isn’t crazy for a 10-20 person company to have a couple of terabytes of data. If you are backing all that data up to the cloud on a regular basis, you need a lot of bandwidth to get your backup uploaded to the cloud in a timely fashion. Bandwidth sure isn’t free, not even here in the US, much less in other countries. Many other countries have bandwidth caps in place where you may pay by the meg to upload data. If you have to upload 100 Gigs of data a week (a 10% total data change rate is pretty standard), that could take 10-12 hours to upload on a fast connection, and could cost hundreds or thousands in bandwidth charges if you are bandwidth capped.
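To put a rough number on that claim, here’s the back-of-the-envelope math. The 20 Mbps uplink is my assumed figure for a “fast connection,” and the calculation ignores protocol overhead and retries, but it lands 100 GB right in that 10-12 hour window:

```python
def upload_hours(gigabytes: float, mbps: float) -> float:
    """Rough time to push a backup over the wire: convert GB to
    megabits, divide by link speed, convert seconds to hours.
    Ignores protocol overhead, retries, and throttling."""
    megabits = gigabytes * 1024 * 8
    return megabits / mbps / 3600

# A 100 GB weekly upload on an assumed 20 Mbps uplink:
print(round(upload_hours(100, 20), 1))  # roughly 11.4 hours
```

Double the link speed and you halve the time, which is exactly why the bandwidth bill, not the storage bill, tends to be the surprise on cloud backup invoices.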
Running your app in the cloud is a totally different thing. When you do this you have control of the setup, and can control how many sites your data is located in. With the cloud backup solutions that I’ve looked at so far, you don’t have this sort of control. You just have to trust that the company you are paying is doing the right thing. After all, what happens if they store your data close to you for quicker access, and then your site loses power because of a natural disaster and theirs is down for the same reason? Who do you call? You can’t fire anyone, because the plan was to let them handle it. You can’t get your site back up, because you have to wait for them to get their site back up.
In my world, that’s just not a reasonable solution.
This is a total repost of Stacia’s blog post from this morning so that hopefully everyone will see it. So pretend that Stacia wrote this and that I didn’t.
Yesterday, Denny Cherry (blog|twitter) and I co-presented a 24HOP session for the Fall 2011 lineup, “So How Does the BI Workload Impact the Database Engine?” 24HOP stands for 24 Hours of PASS and is a semiannual roundup of speakers from the SQL Server community. Initially, this event consisted of 24 consecutive sessions, each lasting an hour, but later it became a two-day event with 12 consecutive sessions each day. The sessions are free to attend and feature many great topics covering the spectrum of SQL Server things to know. Even if you missed previous 24HOP events, you can always go back and view recordings of sessions that interest you at the 24HOP site for Spring 2011 and Fall 2010.
And if you missed Denny and me yesterday, a recording will be available in a couple of weeks and I’ll update this post with a link. Our hour-long session for 24HOP was a sneak preview of our upcoming half-day session of the same name that we’ll be presenting at the PASS Summit in Seattle on Thursday, October 13, 2011 from 1:30 pm to 4:30 PM. In our half-day session, we’ll dig into the details and spend more time on database engine analysis, whereas in our 24HOP session, we focused on reviewing the architecture and highlighting the connection between BI components and the database engine.
We were able to answer a few questions at the end, but one question in particular could not easily be answered in a sentence or two in the time allotted: How much RAM do I need to plan for Integration Services (SSIS)? Andy Leonard (blog|twitter) did manage a succinct response: All of it! I, on the other hand, am not known for being succinct, so I deferred the question for this post.
Andy is right that SSIS wants as much memory as you can give it, which can be problematic if you’re executing an SSIS package on the same box as SQL Server. On the other hand, there are benefits to executing the package on the same box as well, so there is no one-size-fits-all solution. And the solution for one data integration scenario might not be the right solution for another data integration scenario. A lot depends on what CPU and RAM resources a given server has and how much data is involved. In order to know how much horsepower you need, you’re going to have to do some benchmark testing with packages. Here are some good resources for SSIS if you’re concerned about memory:
- Top 10 SQL Server Integration Services Best Practices from the SQL Customer Advisory Team (blog | twitter): This article provides an overview of best practices (as the name implies!) and includes links to information about using performance counters to monitor resource usage and about optimizing the Lookup transformation, which is one of the big memory consumers in SSIS.
- SQL Server 2005 Integration Services: A Strategy for Performance, a whitepaper by my friend, former colleague, and co-author of my first book, Elizabeth Vitt. Although it was written for SSIS 2005, the principles related to tuning packages and how to benchmark still apply. The significant changes between SSIS 2005 and SSIS 2008 with regard to performance were improvements in thread management and in the Lookup transformation.
Is there a rule of thumb for deciding how much memory you’ll need for SSIS? Well, no less than 4 GB per CPU core is a good place to start. But if that’s not possible, you certainly want to have memory that’s at least two or three times the size of data that you expect to be processing at a given time. So if you’re processing 1 GB of data, you’ll want at least 2-3 GB of memory and, of course, more memory is even better!
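Those two rules of thumb are easy to put side by side. Here’s a tiny sketch of the sizing logic; the function name and the choice to take the larger of the two numbers are mine, not an official formula, so treat it as a starting point for your own benchmarking rather than a sizing guarantee:

```python
def ssis_memory_gb(cpu_cores: int, working_data_gb: float) -> float:
    """Rule-of-thumb SSIS memory sizing from the discussion above:
    start from 4 GB per CPU core, but never go below 2x the data
    you expect to process at one time (3x is more comfortable)."""
    starting_point = 4 * cpu_cores
    floor = 2 * working_data_gb  # absolute minimum per the 2-3x guideline
    return max(starting_point, floor)

# An 8-core box working on 1 GB batches:
print(ssis_memory_gb(8, 1))   # 32
# A small 2-core box crunching 20 GB at a time:
print(ssis_memory_gb(2, 20))  # 40
```

Note how the data-volume floor takes over on small boxes with big batches, which is the case where people most often get burned.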
For many years now the SQL PASS summit has been a place where the database loving folks of the world come together to learn a thing or two and have a little fun. For the last couple of years the braver among us have buried our shame and bared our legs for all to see, for our own personal amusement.
So if you are crazy like we are order away for your kilt, or if you would prefer to see what you are buying hit up one of the kilt shops in downtown Seattle, and join us on Thursday at the PASS summit.
Just don’t be like Allen and forget your belt at home, because then you are just wearing a skirt and that would just be silly.