SQL Server with Mr. Denny


October 14, 2019  4:00 PM

What is Page Life Expectancy?

Denny Cherry

If you've been a DBA for a while, you've heard people talking about PLE. PLE stands for Page Life Expectancy; most DBAs (and really anyone that works with SQL Server for more than 5 seconds) call it PLE. I think this is for two reasons: 1) Expectancy is hard for a lot of people to say; 2) we're lazy... I mean, efficient.

PLE

PLE is reported by SQL Server as a number, somewhere between 0 and roughly 2 billion. The number that it's showing you is a number of seconds: the amount of time that SQL Server estimates it will be able to keep a page loaded in memory before that page is swapped out. So if a page of data was loaded into memory right now, and PLE was 100 seconds, then in roughly 100 seconds we could expect that page to be removed from the buffer pool to make room for another page.

I want to be really clear here: 100 is not a target number (and neither is 300). It's just a number that I'm using for my explanation here.

When you look at PLE on a larger server, you may see a few numbers. That's because each NUMA node has a PLE value, and those values are combined to get the server's PLE. The actual calculation isn't an average or anything quite that simple: you take the PLE value of each node and multiply it by 1000, do this for all the NUMA nodes, sum those values up, divide the total by the number of NUMA nodes, and then divide by 1000.

This means that PLE for one NUMA node can drop while PLE for another NUMA node stays stable, and the server-wide PLE number may not change (or may not change much).
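
If you want to see both sets of numbers for yourself, the per-node and server-wide PLE values are exposed through sys.dm_os_performance_counters (the "Buffer Node" and "Buffer Manager" objects). Here's a minimal Python (pyodbc) sketch of reading them; the connection string is a placeholder, and the script itself is just an illustration, not something the post ships.

import pyodbc

# Placeholder connection string; point this at your own instance.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=YourServer;"
    "DATABASE=master;Trusted_Connection=yes;"
)

# 'Buffer Node' rows give one PLE per NUMA node; 'Buffer Manager' is the server-wide value.
rows = conn.execute("""
    SELECT RTRIM([object_name]) AS object_name,
           RTRIM(instance_name) AS node,
           cntr_value            AS ple_seconds
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy';
""").fetchall()

for object_name, node, ple_seconds in rows:
    print(object_name, node, ple_seconds)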

The reason that each NUMA node has its own PLE value is that SQL Server manages each NUMA node independently. Understanding why PLE for each node can vary requires an understanding of how SQL Server works, so let's dive a little deeper. (For this article we're going to assume a 2-CPU server, which has 2 NUMA nodes. That keeps things easier, but all of this applies if you have 4 or 8 nodes as well.)

So let’s calculate PLE for our sample server. If both NUMA nodes have a PLE of 100, then the math looks like this.

((100 * 1000) + (100 * 1000)) / 2 / 1000 = 100

If one node has a PLE of 100 and the other has a PLE of 60, then the math looks like this.

((100 * 1000) + (60 * 1000)) / 2 / 1000 = 80

If you have a four-node box, then the value gets even odder looking. Let’s assume we have three NUMA nodes with a PLE of 100 and one node with a PLE of 60 and see how the math looks.

((100 * 1000) + (100 * 1000) + (100 * 1000) + (60 * 1000)) / 4 / 1000 = 90
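
To make that arithmetic concrete, here's a tiny Python sketch of the calculation exactly as described above; the function name is mine, not anything SQL Server exposes.

def server_ple(node_ple_values):
    # Multiply each node's PLE by 1000, sum them, divide by the node count, then by 1000.
    weighted_sum = sum(ple * 1000 for ple in node_ple_values)
    return weighted_sum / len(node_ple_values) / 1000

print(server_ple([100, 100]))           # 100.0
print(server_ple([100, 60]))            # 80.0
print(server_ple([100, 100, 100, 60]))  # 90.0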

NUMA assignments

When a table is loaded into memory, that table is assigned to a NUMA node, and that's the only NUMA node that the table will be assigned to. The reason for this is that SQL Server only wants to assign CPUs from that NUMA node to work on that table. Because of this, if there's a report against that table (or a bad query that causes a scan of the table), that'll impact the PLE of that NUMA node, because the table needs to be loaded into memory so that the CPUs can process it. (All sorts of things can cause PLE issues to crop up; this is just one example.)

So if we've got one really large table in our database and lots of small tables (and everything in SQL Server and the OS set correctly), we should expect to see two very different numbers for PLE: one for the NUMA node that is working on the large table, and one for the NUMA node that's hosting all the smaller tables.

You may be asking yourself how you can assign tables to a specific NUMA node, and the answer is that you can't. Even if you could, there's pretty much no reason to. SQL Server does a really good job of assigning tables to NUMA nodes all by itself. If you think you're having a problem with SQL Server because it's assigning too many tables to a NUMA node, you've probably got other problems that need to be dealt with (hint: we can help you with that).

PLE Recommendations

There are lots of PLE recommendations out there. Most of them are built on the number 300, which you'll find in Books Online. Any advice about PLE that has a specific number as a target is worth the amount of money you paid to read the blog post (and if you did pay to read that number, I'm sorry, you got screwed). The target PLE is very system-specific, and it should be treated as such. Some systems should have a really low PLE because they churn data so often. Other systems need to have a really high PLE so that they never have to hit the disk.

One of our clients has a PLE of somewhere around 35,000. If it gets below 28,800 (8 hours), we start to worry. When we were designing this system, we decided that when customers ran their end-of-day reports for their office (the customers are only open for 8 hours a day), we didn't want to hit the hard drive. That means that all the data from the last 8 hours needed to be loaded into memory, and it needed to stay there until the end of the day. Is this a guarantee that no queries will hit the disk at the end of the client's day? No, of course not, but we've done everything we can to make this a reality for the client. And given that this client has thousands of customers and tens of thousands of unique users per day (if not more), and the system has been stable for years, we've done a pretty good job getting to that point.

So what number should you shoot for when looking at PLE?  As high a value as you can afford to get. Increasing PLE is a matter of index tuning and buying more RAM.  If you can afford to get more RAM, get it.  The more data that SQL Server can cache, the higher PLE is going to go, and the happier SQL Server will be.

Wrap-Up

While you probably don't need to know what PLE is in the long run, it's one of those things that you'll want to keep an eye on, because if it drops you want to know why. PLE dropping isn't a problem, or even a symptom of a problem; unless it is. It's a potential warning that some investigation on the server needs to be done, and from that investigation you might need to do more, or you might not.

Denny

October 7, 2019  4:00 PM

Doing something in a loop in Microsoft Flow, and something else if there's no match

Denny Cherry

Recently I wanted to make a Microsoft Flow process to create channels in Microsoft Teams once a project is ready to start. If the channel was already there, then I wanted a message posted to the Teams channel saying that the project was ready, but without creating the channel again. So in programming terms, I wanted a loop with an IF statement inside the loop; then, if the loop got to the end of the downloaded set and there was no match, it should do something else.

Now Microsoft Flow doesn’t include anything to do this, so I had to get a little creative.

As I was looking through Microsoft Flow I found that you could use Variables within a Microsoft Flow application, and this made things suddenly click. At the beginning of the Flow I initialized a variable, in this case called "FoundClientInTeams", set to false.

From here I set up a pull from Teams that would pull down all the channels for a team. That was passed to an "Apply to Each" object, which is basically a "FOR" loop.

Within that “Apply to Each” object there’s a Condition that looks to see if the client name that we’re looking for exists or not.  If the channel exists (according to the condition) then the flow sends a message and changes the variable from false to true.  (There’s a “No” block of this condition that doesn’t do anything.)

After the "Apply to Each" there's another Condition block, which checks whether the variable was changed inside the loop. If the value is still false, that means there was no match found, so we need to create the channel. It also posts a message saying that it created the channel, so there's some debugging going on as well. The "No" side of the condition is left blank: if the channel is found, the variable is changed, this condition fails, and nothing is done at this point.
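
If it helps to see the pattern in code, here's a rough Python sketch of what the Flow is doing. The function names are made up for illustration; they aren't real Flow or Teams API calls.

def post_message(channel, text):
    # Stand-in for the "post a message to Teams" action.
    print(f"[{channel}] {text}")

def create_channel(client_name):
    # Stand-in for the "create a channel" action.
    print(f"Created channel: {client_name}")

def ensure_client_channel(existing_channels, client_name):
    found_client_in_teams = False              # the "Initialize variable" step
    for channel in existing_channels:          # the "Apply to Each" loop
        if channel == client_name:             # the Condition inside the loop
            post_message(channel, "Project is ready; the channel already exists.")
            found_client_in_teams = True       # flip the variable from false to true
        # the "No" block of the condition does nothing
    if not found_client_in_teams:              # the Condition after the loop
        create_channel(client_name)
        post_message(client_name, "Created this channel for the new project.")

ensure_client_channel(["Client A", "Client B"], "Client C")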

As I was working on this, I was shocked to find that Microsoft Flow is as powerful as it is. What we're doing in this case is that when a document is updated on our SharePoint O365 site (in our case the SOW status), Microsoft Flow sees the change, and then either creates a new channel for the client if they are a new client, or just posts a message if they are an existing client. Because we were able to automate this task, it's one less thing that we need to do when we sign an SOW for a client.

Denny

 


September 30, 2019  4:00 PM

How big can you make a SQL Server transaction log file?

Denny Cherry

While the GUI (SSMS) has UNLIMITED in there, that's actually not the right answer here. If you set the transaction log file to UNLIMITED and click save, SSMS will reset the maximum file size to 2TB, since that's the max supported size of a transaction log file.

So how do you create a single transaction bigger than 2 TB? You have multiple transaction log files. Also, don't create 2TB transactions; if one of those fails, the rollback is going to take a LONG time.
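
As a rough illustration (not a recommendation to actually build multi-terabyte logs), here's a hedged Python/pyodbc sketch that checks the current log file's max size and adds a second log file. The database name, logical file name, path, and sizes are all placeholders.

import pyodbc

# Placeholder connection string, database name, and file path.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=YourServer;"
    "DATABASE=YourDatabase;Trusted_Connection=yes;",
    autocommit=True,  # ALTER DATABASE can't run inside a user transaction
)

# max_size is stored in 8 KB pages; 268435456 pages is the 2 TB cap behind "UNLIMITED".
for name, type_desc, max_size in conn.execute(
    "SELECT name, type_desc, max_size FROM sys.database_files WHERE type_desc = 'LOG';"
):
    print(name, type_desc, max_size)

# Add a second log file so total log space can exceed the 2 TB per-file cap.
conn.execute("""
    ALTER DATABASE [YourDatabase]
    ADD LOG FILE (NAME = N'YourDatabase_log2',
                  FILENAME = N'L:\\SQLLogs\\YourDatabase_log2.ldf',
                  SIZE = 64GB, FILEGROWTH = 8GB);
""")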

Denny


September 28, 2019  4:00 PM

Can I setup replication with an availability group as the subscriber?

Denny Cherry

Yes, absolutely. In order to do this, you need to configure replication to use the Availability Group listener as the subscriber. That way, the database that you're using as the subscriber will follow the AG. After replication creates the database, you will need to add it to the AG, as replication won't do that for you. After that, it's just normal SQL Server replication.
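
Here's a hedged sketch of that manual step; the listener, AG name, and database name are placeholders, and the secondaries still need to be seeded separately (automatic seeding, or backup/restore WITH NORECOVERY) as part of joining.

import pyodbc

# Placeholder: connect through the listener, which routes you to the current primary.
primary = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=YourAGListener;"
    "DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,
)

# The subscriber database must be in FULL recovery and have a full backup taken
# before it can join the availability group.
primary.execute("ALTER AVAILABILITY GROUP [YourAG] ADD DATABASE [YourSubscriberDB];")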

If you want to, you can replicate from the same availability group to another database on that same availability group.

Denny


September 23, 2019  4:00 PM

PowerShell for SQL Server Administration

Denny Cherry

If you're a DBA, especially if you've got multiple SQL Servers to manage (or you work in Azure, or Exchange, or VMware, or a bunch of other things), then you'll want to learn some PowerShell. I recently released a course on LinkedIn Lynda where you learn about PowerShell and how to manage multiple servers all at once.

So click on through and check out the course.

Denny


September 17, 2019  3:00 PM

Heading to the PASS Summit 2019? There’s still time to sign up for the SQL Karaoke Party!

Denny Cherry

Are you planning on heading to the PASS Summit in November 2019 (or will you just be in Seattle that week because you live there, work there, etc.)? There's still time to sign up for the SQL Karaoke party on Tuesday, November 5th, 2019 at 9:30pm. It's going to be a great night of hanging out with friends, watching some friends sing some great songs (they'll have a great backup band), and having an altogether great time.

Go to the Event website, and register. Show up and have a great time.

Denny


September 9, 2019  3:00 PM

Cloud Migration and Data Warehousing

Denny Cherry

Over the last year, Denny Cherry & Associates Consulting has been working on creating a new data warehouse for the renowned non-profit, the Elizabeth Glaser Pediatric AIDS Foundation. The data warehouse centralizes patient data from all the countries that the foundation serves to improve reporting and patient care, enabling them to respond faster to patient needs and thereby save lives.

As an added bonus, the foundation informs us they’ve been able to save a whopping 75% on annual IT administrative costs.  This cost reduction allows the Elizabeth Glaser Pediatric AIDS Foundation to provide more care to more patients, as well as give donors more information about where their donations are being used, which encourages additional funding.

For DCAC, this solution led to us winning three American Business Awards, including "Technical Innovation of the Year" and "Most Innovative Tech Company of the Year."

If you are contemplating a data warehousing challenge, we encourage you to read the full case study on our website.

Denny


September 2, 2019  4:00 PM

Stop telling DocuSign to send the Documents when they are signed

Denny Cherry

I love DocuSign. As a person that sends out contracts to clients, I have no idea how I'd ever function without DocuSign. I know that it's used by a huge number of companies to sign contracts, transmit tax information between companies, etc. There's one flaw that we've found in it: you can tell DocuSign to email you the documents after they are signed. This means that whatever is in those documents gets sent. So if your tax info is in there, it gets sent to everyone over email. This is a bit of a problem.

Now I'm guessing that DocuSign won't remove this setting (they should, but I'm guessing that it was added because people wanted it), but companies should turn this setting off (which is the default).

If your DocuSign is sending out emails when you are done signing and you’d like to stop it, it’s a simple settings change. Go to DocuSign and log in.   Click on your picture in the upper right corner, and select “Go To Admin”.  Select “Signing Settings” from the menu on the right.  Under envelope delivery there’s an option which says “Attach documents to completion email”. Uncheck that option and click save.

By turning off this checkbox, you’ll still send out an email upon completion of the document, but the document won’t be attached. You’ll have to click the link to download the document that you signed from the DocuSign portal.

Denny


August 26, 2019  4:00 PM

We’ve moved our VMs to the cloud, so now we’re cloud ready, right?

Denny Cherry

No, not really. In my mind, you've completed the first step on your journey to the cloud. The end state that you want to get to eventually is a Platform as a Service (PaaS) offering. Being able to move from Infrastructure as a Service (IaaS) to PaaS is the next step that you'll want to take. To be fair, there are some applications that aren't going to be ready for PaaS; these are going to be vendor apps that you've purchased. Unless the vendor makes these applications cloud-ready, they probably won't be, so you'll need to keep running those in VMs until the applications are either replaced or the vendor releases a cloud-ready version of the application.

For those home-grown applications, PaaS is where you want those applications to end up. The cost of running applications on PaaS is typically cheaper than running virtual machines; high availability is built into most PaaS platforms, and disaster recovery is easy to set up and typically looks like a scale-out, active/active configuration instead of an active/passive offering with servers sitting there doing nothing.

Just because you've completed that move from on-prem VMs into a cloud platform, your journey is not complete; you're just starting down the cloud path. If the cloud is something that you've started, or are thinking about starting, we can help you with that. Just reach out to our team and we'll help with your cloud journey, whether you're just beginning it or well into it. And if you are just beginning, we might be able to get Microsoft to help pay for the expenses around getting your cloud journey started.

Denny


August 19, 2019  4:00 PM

VMware on Azure

Denny Cherry

The recent news that VMware will be available within Azure makes for a really interesting announcement. It brings a great on-prem solution (VMware) that admin teams already know and use extensively into the Azure cloud. In a perfect world, apps that are moving into the cloud would be moved to PaaS services, but that isn't always possible. There are legacy apps, or services that have to be run, that can't be moved into Azure Functions and just need to be run as Windows servers.

Moving from an on-prem world into Azure can be a daunting experience for admins looking at Azure for the first time. By being able to simply put VMware hosts in their Azure environment, we can combine the power of the Azure cloud platform with the existing knowledge of the VMware platform to make a migration (or expansion) to the cloud an extremely easy proposition.

By using technology that the sysadmin team is already familiar with, they don't have to learn the entire Azure platform. They only need to familiarize themselves with the basics of Azure and how VMware interfaces with it. The rest of the platform is just VMware, which the systems team should already know and be familiar with.

Is this something that can be used today? No, not yet. It was just recently announced, so it'll take time before this is ready, but it'll be an interesting path for companies that are cloud-averse but need an option for burstable capacity without having to learn an entirely new cloud platform.

Denny

