SQL Server with Mr. Denny


May 2, 2011  2:00 PM

Next week is EMC World, and I’m getting there a little differently than usual.



Posted by: Denny Cherry
EMC World, EMC World 2011

So this year EMC World is back in Las Vegas, and I’m damn happy that I’m going again.  Instead of taking the quick flight from Ontario to Las Vegas like I normally would, I’ll be getting there a little differently: I’ll be riding my motorcycle.

As I can only go about 130 miles on a tank of gas, I’ll have to stop a couple of times to fill up each way.  I’ve created a Google map to show my route.  I’ll be driving from home to Barstow, then to Stateline, NV, then on to Vegas.  If you see a pretty blue motorcycle with a leather-clad rider along this route and a license plate that says “mrdenny”, give me a wave.

If you aren’t driving in from California and are attending EMC World, I’ll see you in Vegas at the welcome reception.

Denny

April 28, 2011  2:00 PM

If I have a 3PAR SAN, are there any specific recommendations, since it does its own custom striping across the drives? Or are there any other specifics related to 3PAR?



Posted by: Denny Cherry
Storage

The biggest recommendation that I have (pretty much the only one since I haven’t worked with a 3PAR in a while) is to ensure that your LUNs are sized correctly so that they are evenly spread out across all the disks in the disk set that you are using.

For example, if you have 100 disks in the disk set (I don’t remember the 3PAR term here), then you have 50 disks available (as the other 50 are the mirror).  If you have a 1 Gig chunk size, you want to ensure that all of your LUN sizes are divisible by 50 Gigs.  The basic math is ChunkSize * ActiveDisks; then multiply the number you get until you reach the first size larger than the size that you need.  You will end up wasting a little bit of space, but you will have a faster array.
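
Here’s that math as a quick T-SQL sketch.  The chunk size, disk count, and needed size below are made-up numbers; plug in whatever your array actually uses.

  DECLARE @ChunkSizeGB int = 1,      -- chunk size on the array, in GB
          @ActiveDisks int = 50,     -- disks actually holding data (not the mirror copies)
          @NeededGB    int = 480;    -- the space you actually need

  DECLARE @StripeGB int = @ChunkSizeGB * @ActiveDisks;  -- 50 GB per full stripe

  -- round the needed size up to the next full stripe: 480 GB becomes 500 GB
  SELECT CEILING(@NeededGB * 1.0 / @StripeGB) * @StripeGB AS LunSizeGB;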

Denny


April 25, 2011  2:00 PM

We get many warnings in our SQL error log about I/O requests taking longer than 15 seconds to complete. What do we do in such a scenario?



Posted by: Denny Cherry
Index, Index Scan, Index Seek, SQL Server, Storage

There are a few things which you will want to look at.  First, make sure that you have enough RAM for the database to use.  Without enough RAM in the SQL Server, you’ll begin putting extra load on the disks because the same data is being pulled into the buffer cache from the disk over and over.

The second thing to check is whether you need to add any additional indexes.  If your queries are doing scans that they shouldn’t be, this will put extra load on the disks.
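
If you want a rough starting point for spotting those scans, something like the query below against sys.dm_db_index_usage_stats works.  Keep in mind that the numbers reset when the instance restarts, so treat it as a hint rather than proof.

  -- tables and indexes in the current database, sorted by how often they are scanned
  SELECT OBJECT_NAME(s.object_id) AS table_name,
         i.name                   AS index_name,
         s.user_seeks, s.user_scans, s.user_lookups
  FROM sys.dm_db_index_usage_stats s
  JOIN sys.indexes i
      ON  i.object_id = s.object_id
      AND i.index_id  = s.index_id
  WHERE s.database_id = DB_ID()
  ORDER BY s.user_scans DESC;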

If everything else looks good, it’s time to get the boss to open their checkbook and buy more disks, or faster disks.

Denny


April 21, 2011  11:00 AM

Is there a best practice to advise a storage admin on how to best set up a SAN for SQL Server OLTP & OLAP? Or is this entirely company dependent?



Posted by: Denny Cherry
SQL Server, Storage

There are some general best practices, such as separating the data and log LUNs and separating TempDB from everything else.  For data warehouses, make sure that the data is spread over as many disks as possible.  Anything more specific, however, is very company specific and would require a good understanding of what the databases look like as well as what the storage array looks like.
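
As a minimal sketch of what that separation looks like from the SQL Server side, assuming the storage admin has presented separate LUNs as the D: (data) and L: (log) drives (the database name, paths, and sizes are all made up):

  CREATE DATABASE SalesDB
  ON PRIMARY
      (NAME = SalesDB_data, FILENAME = 'D:\SQLData\SalesDB.mdf', SIZE = 50GB)
  LOG ON
      (NAME = SalesDB_log, FILENAME = 'L:\SQLLogs\SalesDB_log.ldf', SIZE = 10GB);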

Denny


April 18, 2011  2:00 PM

For SQL, best practice is to have data on one array, logs on another array, and tempdb on another array. How does this work in the SAN world? SAN people don’t really give you all these options when assigning storage for a database server.



Posted by: Denny Cherry
EMC, SQL Server, Storage

Even when using a storage array it is typically best practice to have the data on one LUN, logs on another LUN, and tempdb on another LUN, especially if you have the option of putting each LUN on different physical disks.  I talked a little more about this a couple of weeks ago, so check there for some more detail.
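
If you aren’t sure what the SAN team actually gave you, a quick check from the SQL Server side (SQL Server 2005 and up) is to look at where every database file currently lives:

  SELECT DB_NAME(database_id) AS database_name,
         type_desc,           -- ROWS (data) or LOG
         name AS logical_name,
         physical_name        -- the drive or mount point shows which LUN the file landed on
  FROM sys.master_files
  ORDER BY database_name, type_desc;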

Denny


April 14, 2011  2:00 PM

So you’ve changed the text display settings, and SSMS still won’t show more than 256 characters.



Posted by: Denny Cherry
SQL Server, SQL Server 11, SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, SQL Server Management Studio

By default SQL Server Management Studio will only show you 256 characters per column.  The fix for this is pretty easy.  You click on Tools > Options to bring up the options menu.  From there navigate to Query Results > SQL Server > Results To Text.  Change the “Maximum number of characters displayed in each column” value from 256 to whatever number you want up to 8192.  After you click OK the setting will be saved.

Here’s where the trick comes in.  The setting doesn’t actually take effect until you restart SQL Server Management Studio.  There is no prompt to restart, but a restart is required because apparently SQL Server Management Studio only reads this setting on start up.
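
A quick way to verify it after the restart is to run something like the query below with Results to Text (Ctrl+T).  Before the change only the first 256 characters of the column are displayed, even though LEN reports the full length.

  SELECT REPLICATE('x', 1000)      AS long_column,
         LEN(REPLICATE('x', 1000)) AS actual_length;  -- 1000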

Thanks Manageability Team.

Denny


April 13, 2011  1:19 AM

Some job descriptions just have to be shared



Posted by: Denny Cherry
Interview, Job Opening, Social Commentary, SQL Server

So I’m going along doing my work thing, shrinking databases, rebooting servers at random, etc., and I get an email, part of which I’ve included below, asking if I’d be interested in this job opening.  I’ve cut out the names, etc. to protect the stupid.

Below is the job description that was sent to me, followed by the response that I sent back.

Industry Confidential
Location Palo Alto, CA
Job Title SQL Server DBA
Duration 03 month(s)(Possible Extension)
Profile/ Job description
  • The SQL Database Administrator is part of a data center team providing application and infrastructure support for a large healthcare engagement. The SQL Database Administrator is responsible for managing and monitoring database platforms which support multiple OLTP and OLAP applications. THE SQL Database Administrator will also be responsible for installing and configuring a high availability, and disaster recovery environments and planning upgrades and patches as required to maintain each environment.

Specific responsibilities will include:

  • Working experience with SLA’s
  • Monitor software database license process
  • Monitor performance activity
  • Familiar with Clustered environment and load balancing when required
  • Hands on experience with Mirroring, replication and recovery models
  • Hands on experience on performance monitoring and database counters
  • Manage technical action reports to resolve bugs/issues with DBMS software
  • Support capacity planning
  • Participate in implementing database strategies and solutions
  • Provide database maintenance activities
  • Monitor database logs for potential problems
  • Participate in planning software upgrades and patches
  • Provide outage resolution support and perform analysis to resolve recurring database issues
  • Install and configure DBMS
  • Provide database support for software development/application projects
  • Verify database backups have completed successfully
Basic Qualification:
  • 2+ years Database Administrator experience
  • 2+ years SQL 2000, 2005, 2010
  • 2+ years MCITP – Microsoft SQL 2008 Database Administrator
  • 2+ years MCDBA 2003/2005
  • Adaptive Server Enterprise Administrator professional certification
  • MCITP: SQL Server 2008 Database Administrator
  • MCDBA 2003/2005 Certification
Professional Skill Requirements
  • Operations Management
  • Project Management
  • Service Level Agreement (SLA) Management
  • Problem Solving and Escalation Management
  • Service Performance Management and Metrics
  • Estimation and Planning
Travel/Shift Hours/OT Requirements Minimal travel.

  • Work location is Palo Alto and candidates must live within a 100 mile radius of client site locations. Will require shift work and on-call duty and off-hours and weekend work as needed, as this project is a 24 x 7 support team.
Preferred Skills:
  • Detail Oriented
  • Quality and Process Improvement
  • Microsoft Office Proficient

If you would like to pursue this position, please reply to this e-mail and provide us details in following areas. Also, please attach a copy of your updated resume, in word format, elaborating your projects in tune with the client’s requirement. This will assist us in placing you with one of our clients.

Here’s what I sent back.

You should probably work with your client to get some actual requirements.

If you are going to list the Industry as “Confidential” you probably shouldn’t mention it in the job description.

There is no such thing as SQL 2010.
There is no certification as the MCDBA 2003.
There is no certification as the MCDBA 2005.

You want someone with way more than 2 years’ experience installing high availability and disaster recovery servers.

People with 2 years’ experience won’t have experience with mirroring or replication.  They probably won’t have much experience with server monitoring or looking at performance monitor counters (there is no such thing as database counters).  Someone with 2 years’ experience will have no idea how to open tickets with Microsoft, or how to work with the Microsoft CSS support team to get problems resolved.  The odds of someone with 2 years’ experience being able to correctly identify a new bug in the database engine are next to 0 (if not lower than 0).

It is barely possible for people to have the SQL Server 2008 certifications for two years.  They were only released a little over two years ago.  If someone only has two years’ experience with SQL Server, they probably aren’t ready to take the certification exams, much less the MCITP.

You are asking for a Sybase (Adaptive Server Enterprise) certification, yet you never mention Sybase anywhere in the job description.

Please learn at least something about the position you send out so that you don’t look like a total moron. Based on this job description I would NEVER allow you to represent me on any job posting. Open job positions are not high priority. Production servers being offline, that’s a high priority email. The government closing, that’s a high priority email. Family member is sick and in the hospital, that’s a high priority email. This is not.

And yes, this email was sent as a high priority email.

Denny


April 11, 2011  2:00 PM

Join me at “SQL in the City” presented by Red Gate Software



Posted by: Denny Cherry
In Person Events

The fine folks at Red Gate Software have made the critical mistake of asking me to be a presenter at the Los Angeles “SQL in the City” event at the Skirball Center on October 28th, 2011.

I’ll be joined by a few familiar faces including Steve Jones (blog | twitter), Brad McGehee (blog | twitter) and Grant Fritchey (blog | twitter).  This is a free event including what I’m sure will be a great lunch provided by Red Gate.  So be sure to register today before all the seats are taken.

The session that I’ll be presenting will be on data security, and data encryption.  Knowing Steve, Brad and Grant I’m sure that they all have some excellent sessions lined up for the day as well.

Denny


April 7, 2011  2:00 PM

Slide decks for Desert Code Camp (#dcc11)



Posted by: Denny Cherry
In Person Events

I presented three sessions at Desert Code Camp 2011.1, and I had a great time doing so.  In case you wanted to grab a copy of the slide decks, here they are.

Getting SQL Service Broker Up and Running

SQL Server Indexing for the .NET Developer

Where should I be encrypting my data

Thanks,
Denny


April 4, 2011  2:00 PM

Can tempdb data and log be put on one RAID 1 physical disk, or should they be kept on separate physical disks?



Posted by: Denny Cherry
SQL Server, Storage, TempDB

It really depends on the amount of load that you put on the tempdb database.  Everything that is done within the tempdb database is going to be logged in the transaction log just like in any other database, so if the tempdb database has very high IO it may make sense to put the tempdb log on a separate disk.  If, however, the tempdb database isn’t used very much by your databases, then putting the log on the same disk as the tempdb data file should be fine.

If you present multiple disks to the tempdb data files, with a subset of the tempdb data files on each disk, then present a separate disk for the tempdb log file just to ensure that the log isn’t causing uneven IO load on those disks.  Remember that creating multiple log files won’t do anything for you the way multiple data files do, as the log files are filled sequentially while the data files are filled in a round robin.
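
As a rough sketch of that layout, assuming the extra data disks show up as the E: and F: drives and the log disk as G: (the drive letters, logical names, and sizes are all made up):

  -- extra tempdb data files, one per additional data disk; these are usable right away
  ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = 'E:\TempDB\tempdb2.ndf', SIZE = 8GB);
  ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = 'F:\TempDB\tempdb3.ndf', SIZE = 8GB);

  -- keep the single log file on its own disk; the new path is used after the next service restart
  ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'G:\TempDBLog\templog.ldf');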

Denny

