SQL Server with Mr. Denny


February 11, 2019  4:00 PM

Should SQL Servers be rebooted weekly?

Denny Cherry

There are people out there who reboot SQL Servers (or all of their servers) weekly. The argument that I usually hear for doing this is that SQL Server has memory leaks which can only be resolved by rebooting Windows.

My argument against this is that, no, SQL Server doesn’t have memory leaks which cause it to need to be rebooted weekly. SQL Server is designed to use all the RAM that’s allocated to it. By default SQL Server is configured to use 2,147,483,647 MB of RAM, which in practice means all of the RAM in the server. You can and should change this setting to a lower number so that there is memory left for Windows and anything else that’s installed on the box (anti-virus, SSIS packages, SSAS, etc.). The people that I typically see doing these server reboots have been doing them for 15-20 years, typically because back then rebooting a server really would fix a problem. It won’t anymore.
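For reference, here’s a minimal sketch of capping max server memory with sp_configure via Invoke-Sqlcmd (this assumes the SqlServer PowerShell module is installed, and 28672 MB is purely an illustrative value for a 32 GB box; size it for your own server):

# Cap SQL Server's memory so Windows and other services have room to breathe.
# 28672 MB (28 GB) is an example value only; pick a number appropriate for your server.
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 28672;
RECONFIGURE;
"@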

There are servers running SQL Server that have been up for months or years without issue (I’m ignoring the fact that you aren’t patching SQL Server and Windows here). I’ve personally seen boxes with pretty heavy workloads on them that have run for years without having to be rebooted.

SQL Server doesn’t have any memory leaks. It’s a world-class Microsoft product that runs millions of databases (if not more) and makes Microsoft billions of dollars in revenue each year. Whatever memory leak you think there is in SQL Server, there probably isn’t one. If you think there is, then contact CSS at Microsoft to report the issue so that Microsoft can patch the problem.

The biggest problem that people will see is that the buffer pool is flushed when the SQL Server restarts, which forces SQL Server to read all the data it needs from disk because nothing is cached in RAM. This causes performance issues right after the restart of SQL Server.
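If you’re curious how much cached data a restart would throw away, a quick sketch (again assuming the SqlServer module for Invoke-Sqlcmd) shows roughly how much of the buffer pool each database is using:

# Roughly how much data each database currently has cached in the buffer pool
Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
"@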

If you’re restarting SQL Server because there’s some performance issue that “goes away” after a SQL Server restart, we’re going to be better off dealing with the root cause of what’s causing the problem to come up in the first place. It’s probably indexing or statistics, but that’s just a guess since every server is different (though it’s a fairly safe guess). Looking at the server will tell us what the problem is for sure, so we can solve it (and hopefully make the server faster in the process).
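If statistics are the suspect, a hedged starting point is checking how stale they are. Something along these lines, run against the database in question (“YourDatabase” is just a placeholder), shows when each statistic was last updated and how many modifications have happened since:

# When were statistics last updated, and how much has changed since?
Invoke-Sqlcmd -ServerInstance "localhost" -Database "YourDatabase" -Query @"
SELECT OBJECT_NAME(s.object_id) AS table_name,
       s.name                   AS stats_name,
       sp.last_updated,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY sp.last_updated;
"@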

Denny

February 5, 2019  3:49 PM

DCAC Grows Again, Adding Meagan Longoria to our team of Excellent Consultants

Denny Cherry

Today we are proud to announce that Denny Cherry & Associates Consulting is growing our team of consultants again.  Our newest team member is Meagan Longoria from Denver, CO.

Meagan is a Microsoft Data Platform MVP living with her adorable dog Buster in Denver, Colorado. She is an experienced consultant who has worked in business intelligence, data warehousing, and database development for over ten years. She enjoys creating solutions in Azure and SQL Server that make data useful for decision makers. Her areas of expertise include data visualization, dimensional modeling, and data integration design patterns.

Meagan enjoys sharing her knowledge with the technical community by speaking at conferences, blogging at DataSavvy.me, and sharing tips and helpful links on Twitter (@mmarie).

With the addition of Meagan our team of consultants now includes 5 Microsoft Data Platform MVPs, which is more MVPs than most large consulting companies have.

We’re thrilled to have Meagan joining our team.

Denny


January 28, 2019  4:00 PM

Connecting to Office365 with PowerShell

Denny Cherry

When looking at support documents for Office365, you’ll often see instructions like “connect PowerShell to Office365,” but there are no instructions on how to actually do that. I was going through this recently and thought, this is a problem I can fix. It only takes a few lines of PowerShell to connect your PowerShell session to an Office365 environment. Basically what you are doing is PowerShell Remoting. The underlying code that you’ll want to run is this:

# Allow locally created scripts to run (only needs to be set once per machine)
Set-ExecutionPolicy RemoteSigned
# Prompt for the credentials used to manage the Office365 environment
$UserCredential = Get-Credential
# Open a remoting session against Exchange Online
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
# Load the Exchange Online cmdlets into the local session
Import-PSSession $Session -DisableNameChecking

When you run this code, it’ll prompt you for a username and password (that’s what the second line does). These are the credentials that you use to manage the Office365 environment. After that, you can fire off whatever commands you need to, and they’ll be sent to the Office365 farm, and the response will be sent back to your PowerShell window.

It should be noted that you’ll get an error on the line that starts with “$Session” if you have MFA on your account. You’ll either need to disable MFA or put in an exception for your IP address so that you don’t get an MFA prompt. Once you do that, you can authenticate without MFA, and the third line of the code will work, as connections to Office 365 via PowerShell don’t appear to support MFA currently.
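Once the session is imported, the Exchange Online cmdlets run right from your local window. Here’s a quick sketch of using the session and cleaning it up when you’re done (Get-Mailbox is just an example cmdlet):

# Run whatever Exchange Online cmdlets you need; for example, list a few mailboxes
Get-Mailbox -ResultSize 10

# Tear down the remote session when you're finished
Remove-PSSession $Session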

Denny


January 21, 2019  4:00 PM

Exchange Public Folder migration code

Denny Cherry

This last week John Morehouse and I did a significant office migration with one of our clients. As part of the migration, we decided to move their public folders from Exchange 2016 to Office 365 so that their public folders were hosted in the same place as their mailboxes, allowing us to decommission Exchange, as the public folders were the last thing left on the servers.

Photo: https://www.flickr.com/photos/spidere/5101386622/in/

Microsoft has some documentation on the migration, but the Microsoft documentation is rather lengthy and takes a long time to go through. We only had about 400 Megs of data to move, most of which was contacts and calendars. Instead of going through the rather lengthy process to move such a small amount of data, we instead opted to export the data through Outlook and reimport it into Office 365. This process required that we reset the permissions on the data after the migration.

The first thing we needed to do was export the current permissions from the on-prem Exchange 2016 server. The export of the permissions was done via the Get-PublicFolderClientPermission PowerShell CmdLet, combined with Format-Table and Out-File like so.

Get-PublicFolderClientPermission "\" | Select-Object Identity, User, AccessRights | Format-Table -AutoSize | Out-File c:\temp\rights.csv -Width 500

This produced a fixed width text file that we could work with. We then opened the file in Notepad++ and did some Find/Replace operations to turn our fixed width file into a file we could more easily work with.

First, we searched for anything that had three spaces and replaced it with three {pipe} values (we had a couple of values with two spaces in them). We then did a search for {pipe}{space}{space} and replaced that value with {pipe}. We then did a search for {pipe}{space} and replaced that with {pipe}. We then ran a search and replace for {pipe}{pipe} and replaced it with {pipe}. We repeated this a bunch of times until there were zero results found in the file. This gave us a pipe-delimited file that we could work with.
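If you’d rather script that cleanup than do it in Notepad++, a rough PowerShell equivalent of those find/replace steps (assuming the export landed in c:\temp\rights.csv and the cleaned file goes to c:\temp\rights2.csv) would look something like this:

# Collapse the fixed-width spacing into single pipe delimiters
Get-Content C:\temp\rights.csv | ForEach-Object {
    $line = $_ -replace '   ', '|||'     # three spaces become three pipes
    $line = $line -replace '\|  ', '|'   # pipe plus two spaces becomes a pipe
    $line = $line -replace '\| ', '|'    # pipe plus one space becomes a pipe
    while ($line -match '\|\|') {        # keep collapsing doubled pipes until none remain
        $line = $line -replace '\|\|', '|'
    }
    $line
} | Set-Content C:\temp\rights2.csv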

We then told Exchange to use the Office 365 version of the public folders instead of the local version. We did this with the Set-OrganizationConfig CmdLet. Keep in mind, this CmdLet will delete all the data in your Exchange server’s Public Folder database, so make sure that you export your data from the public folders and that you pull down your permissions BEFORE you run this.

Set-OrganizationConfig -RemotePublicFolderMailboxes $Null -PublicFoldersEnabled Local

After this was all done, we connected a PowerShell window to Office 365 and processed the text file to put the permissions back. This was done using a short PowerShell script.

$a = Get-Content C:\temp\rights2.csv
$a | foreach {
    $data = $_

    # Split the pipe-delimited line into its fields
    $arr = $data.Split("|")

    # Strip the braces from the access rights column and split it into individual rights
    $perm = $arr[2].replace("{", "").replace("}", "")
    $perma = $perm.split(",")

    # Add-PublicFolderClientPermission wants a collection, so cast the array to an ArrayList
    $permb = [collections.arraylist]$perma

    # Echo the current line so we know which line we're working on if an error occurs
    $data

    Add-PublicFolderClientPermission -Identity $arr[0] -user $arr[1] -AccessRights $permb
}

This script told us what it was working on (in case there was an error), then it did the work (using the Add-PublicFolderClientPermission CmdLet). The biggest trick with this script was converting the “normal” array named $perma into the “collections.arraylist” named $permb and then using that.

This script ran for a few hours, as Add-PublicFolderClientPermission isn’t very fast, but the result was that the permissions were back where they should have been, the Public Folders were hosted in Office 365 instead of Exchange, and we were one step closer to being able to get rid of the Exchange server. And most importantly, the users didn’t report a single problem after the migration was done.

Denny


January 14, 2019  4:00 PM

When is the Right Time to Look at New Services?

Denny Cherry

Microsoft Azure is rolling out new features at a fantastic speed. But when is the right time to evaluate those new features? It might be right as the feature is released, or it might be later in the lifecycle. The basic answer to when to look at using new services is: it depends.

If the company has some problem that needs to be solved, that we can’t address today using available technology, or the solution we have built today could be made better, then a new feature or new service might be worth looking into.

If there’s nothing that needs to be solved, then the new feature isn’t helping you do anything. The feature is just shiny and new.

What it comes down to is: can the new feature potentially solve a problem that you or the company is having? If it can’t solve the problem, then the new feature isn’t going to help you. Now, this is going to mean some research into the new feature to see if it’s the right solution or not. But if the feature being researched turns out not to be a solution to the problem (or a better solution than what you have today), then it’s time to move on to another solution.

All too often, companies decide that they are going to use a particular solution to a problem, no matter what the right solution might be. Even if the solution that they have decided must be used costs thousands (if not millions) of dollars a year.

Selecting a solution to a problem gets more complicated when there’s politics in the office involved. All too often someone from upper management will decide that some product needs to be included in the solution. This isn’t a new problem, either.

Back in the early 2000s, I was tasked with building a new knowledge base for a department at the company I worked at. XML was getting popular and was being mentioned in all the magazines. The knowledge base was supposed to be a normal relational database so that people could look up information. A Senior Manager wanted to use XML instead of a relational database. I refused, because XML wouldn’t perform well, XML wouldn’t scale well, and it made no sense to build a database as a single XML file (which is what he wanted). He insisted we use XML, and I asked him whether he wanted to use XML or he wanted the application to scale and perform well. It took a while, but eventually he saw reason and we used SQL Server for the relational data of the application. And, shockingly, the application was used successfully by thousands of employees on a daily basis.

What this goes to show is that applications should be built to be successful, not to use some shiny bit of new technology.

Denny


January 7, 2019  4:08 AM

My 2018 Blogging By The Numbers

Denny Cherry

2018 was a great year of blogging for me. A decent number of people read the articles I posted this year. My numbers are a bit off, as there were some issues I didn’t notice when I upgraded the WordPress plugin that counts all the views. The plugin recorded about 100k views of my posts; based on the stats that were recorded, my estimated page views are about 150k for the year. Given the recovery this year, the lower number of page views makes sense.

The most popular post that people were looking at was Difference between an Index and a Primary Key. The next most popular was my post on what MSDTC is, titled What exactly is MSDTC, and when do I need it?

Other popular posts were about SQL Server Replication, Microsoft Ignite Announcements, SQL Server NUMA nodes, SQL Server NOLOCK, and how important Disaster Recovery is.

All of this is a pretty huge spread of topics, but it gives me some idea of what people are interested in reading about.

Here’s to a great 2019.

Denny


January 2, 2019  3:00 PM

DCAC Welcomes Our Newest Team Member

Denny Cherry

Today we start the new year (yes, I know, yesterday was Jan 1, but it was a holiday) with a new team member at Denny Cherry & Associates Consulting. Peter Shire is joining our team as our new Director of Sales.

Peter and Bill

Peter is a successful sales and marketing professional with vast experience and connections throughout the SQL Server community. He began his career with an eight-year stint at Microsoft. Peter enjoys sharing that he “knew Bill when he wasn’t even a Billionaire. Here we are when he was a meager 470-Millionaire!”

Of course, many of you probably recognize Peter from his eleven years at SentryOne and/or his long tenure as the President of the Charlotte SQL Server Users Group (CSSUG). While serving as SentryOne’s Director of Partner Relationships, he designed, built, and managed their successful partner program as well as their relationships with Microsoft and the SQL Server MVPs.

He helped resurrect CSSUG and grow it into one of the nation’s leading user groups. The success of Charlotte’s SQLSaturday #33, the first of the large multi-tiered events and one featuring 23 SQL Server MVPs, was pivotal to bringing the 2013 PASS Summit to Charlotte. Peter’s influence on the community extends beyond Charlotte, as he created unique events that brought renowned SQL Server experts Grant Fritchey and Kevin Kline to multiple PASS chapters in the South. Kevin Kline interviewed Peter in July 2017.

Peter is a husband, father to three daughters, and wrangler of two golden retrievers. He loves college basketball, but his favorite player of all time is still in high school. You can follow Peter via Twitter and Instagram.

We are extremely happy to be able to welcome Peter to our team.

Denny


December 24, 2018  4:00 PM

Last Chance To Click Thru To Make A Donation

Denny Cherry

Today (December 24th, 2018) is the last day that you can click through to DCAC’s Happy Holidays page; just visiting the page will trigger us to donate $1 to the Elizabeth Glaser Pediatric AIDS Foundation. If you come to the page after December 24th, 2018, you can still help by clicking through on the page to make your own donation.

If you aren’t familiar with the Elizabeth Glaser Pediatric AIDS Foundation: they have assisted 27 million pregnant women at more than 5,000 clinics in 19 African countries, and they continue to seek an end to global pediatric HIV/AIDS through prevention and treatment programs, research, and advocacy.

Needless to say, EGPAF is a great foundation for Denny Cherry & Associates Consulting to team up with. So click through, and we’ll make a donation.

Denny


December 18, 2018  9:01 PM

View a website, make a donation

Denny Cherry

’Tis the season, and Denny Cherry & Associates Consulting has a quick way for you to make a difference. All you have to do is click through to our donation webpage, and that’s it. For clicking through, DCAC will donate $1 to the Elizabeth Glaser Pediatric AIDS Foundation.

Every dollar generated by your participation matters. Having assisted 27 million pregnant women at more than 5,000 clinics in 19 African countries, the Elizabeth Glaser Pediatric AIDS Foundation continues to seek an end to global pediatric HIV/AIDS through prevention and treatment programs, research, and advocacy.

Want to DONATE MORE? EXCELLENT! We’re matching funds! Click through here to donate to EGPAF!

Denny


December 17, 2018  4:00 PM

IO Performance in Windows 2016

Denny Cherry

If you’re using Windows 2016 in its default config, you may not be getting the IO performance that you were expecting. If you try the same thing on a Windows 2012 server, the problem magically goes away. The question is why.

The answer is shockingly straightforward: Windows 2016 ships with Windows Defender installed by default, where Windows 2012 R2 didn’t. Windows Defender, if not disabled by GPO, can have a significant impact on your server. On a client machine in Azure that was having the issue, DiskSpd on a disk which should have 7,500 IOPs available was getting just 1,100. The same applied to a stripe of 1TB disks which should have given us 20,000 IOPs; it was also seeing only 1,100. As soon as we disabled Windows Defender, we got the speed we were expecting.
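For reference, the measurements came from DiskSpd runs along these lines (the parameters and test file path here are illustrative, not the exact test we ran):

# 60-second random 8K test, 25% writes, 8 threads, 32 outstanding IOs per thread,
# caching disabled, against a 50 GB test file on the disk being measured
diskspd.exe -d60 -b8K -r -w25 -t8 -o32 -Sh -L -c50G E:\diskspd\testfile.dat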

Now, I’m not going to recommend that you automatically disable Windows Defender, but you’ll want to set exclusions for it to ignore MDF and LDF files as well as BAK files (and any folders that you’re using for FILESTREAM data) so that Windows Defender isn’t making your SQL Servers extremely slow.
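A sketch of setting those exclusions with the Windows Defender PowerShell cmdlets follows; the extensions and the FILESTREAM path are examples, so adjust them to your environment:

# Exclude SQL Server data, log, and backup file types from Defender scanning
Add-MpPreference -ExclusionExtension ".mdf", ".ndf", ".ldf", ".bak"

# Exclude any FILESTREAM folders as well (example path, use your own)
Add-MpPreference -ExclusionPath "D:\SQLData\FileStreamData"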

Denny

