SQL Server with Mr. Denny


August 19, 2019  4:00 PM

VMware on Azure

Denny Cherry

The recent announcement that VMware will be available within Azure is a really interesting one. It brings a great on-prem solution (VMware) that admin teams already know and use extensively into the Azure cloud. In a perfect world, apps moving into the cloud would be moved to PaaS services, but that isn't always possible. There are legacy apps and services that can't be moved into Azure Functions and just need to run as Windows Servers.

Moving from an on-prem world into Azure can be a daunting experience for admins looking at Azure for the first time. By simply putting VMware hosts in their Azure environment, they can combine the power of the Azure cloud platform with their existing knowledge of the VMware platform, making a migration (or expansion) to the cloud a much easier process.

By using technology the sysadmin team is already familiar with, they don't have to learn the entire Azure platform. They only need to learn the basics of Azure and how VMware interfaces with it. The rest of the platform is just VMware, which the systems team should already know well.

Is this something that can be used today? No, not yet. It was just recently announced, so it'll take time before it's ready, but it'll be an interesting path for companies that are cloud averse but need an option for burstable capacity without having to learn an entirely new cloud platform.

Denny

August 14, 2019  11:17 AM

Denny Cherry & Associates Consulting is a member of the Inc 5000 list for 2019

Denny Cherry

I'm thrilled to announce that Denny Cherry & Associates Consulting has placed on the Inc. 5000 list for 2019. The Inc. 5000 list is the most prestigious ranking of the nation's fastest-growing private companies, and DCAC was named #2056 on the list, which is a fantastic feat for our company.

I'm super proud of everyone at the company because we couldn't have done this without our fantastic team members, so thank you all (Kris, Joey, Kerry, Monica, John, Peter, Meagan) for making this happen. Everyone at DCAC is an expert in the field, and our placement on the list reflects that.

When I first spoke with our team about making the Inc. list, the assumption was that we’d be towards the bottom of the list because we’re a smaller company. But the fact that we were able to make the top half of the Inc. 5000 list shows how hard everyone has worked to make DCAC stand out from the other firms in our space.

A big thanks to our customers as well; we couldn't be as successful as we have been without them. We've been able to help them grow in each of their areas, and their success shows in our success.

Denny


August 12, 2019  4:00 PM

SQL Karaoke 2019 Is Coming to Seattle

Denny Cherry

I'm thrilled to announce that this year's SQL Karaoke party is on as planned. The party will be at Cowgirls, Inc. We have two fantastic returning sponsors, and we'll be joined by the great folks at SentryOne and SIOS. Registration for the event will be similar to years past: fill out the registration form, bring it with you (this is very important if you have an open bar ticket, as we'll have no other way to check whether you registered), show your ID at the door, and have a great time.

As usual, we'll have the great folks at Rockaraoke there to be our live band and make us all look awesome.

Since we're back at Cowgirls, the patio "should" be open, and the mechanical bull is there to abuse you. 🙂

The party will be on November 5th, 2019 at 9:30 PM, and it'll go until you drop (or 2 am happens), so come and have a great time with your fellow PASS attendees before the three days of excellent sessions starting the following day.

Keep in mind that IDs will be checked at the door, and only those aged 21 and over will be admitted (state liquor laws). Also, be aware that the PASS Code of Conduct will be enforced.

Wristbands for the open bar will get you beer, wine, and well drinks.

Register, show up, have a great time. We have pretty simple rules.

Hopefully, we’ll see everyone there.

Denny


August 5, 2019  11:00 AM

Moving a disk in Azure that’s in Managed Storage

Denny Cherry

We've been using Azure for several years now at DCAC. Back when we started in Azure, there was no PaaS service for MySQL (well, there was, but it was stupid expensive and from a third party). When we created VMs, we put some in US West and some in US Central. Now that we've been able to move all our web properties from IaaS to PaaS, we wanted to consolidate the VMs into one region so we could kill the site-to-site VPN (and save the $50 a month we were spending on keeping the VPN Gateway up). This required moving some VMs from US Central to US West, as there are a couple of VMs that we need to keep and I'm too lazy to set back up.

We ended up needing to move two VMs from US Central to US West. One has standard (unmanaged) disks sitting in a storage account, and one has a managed disk. The question became how to move them. The answer was Start-AzStorageBlobCopy. The command is fairly straightforward.

# Storage account keys for the source and destination storage accounts (truncated here).
$sourcekey = "17SZrnd…Q=="
$destkey = "UUSkMu…A=="
# Build a storage context for each account.
$sourceContext = New-AzStorageContext -StorageAccountKey $sourcekey -StorageAccountName SourceName
$destinationContext = New-AzStorageContext -StorageAccountKey $destkey -StorageAccountName DestinationAccountName
# Kick off a server-side copy of the VHD blob from the source account to the destination account.
Start-AzStorageBlobCopy -Context $sourceContext -DestContext $destinationContext -SrcContainer vhds -DestContainer vhds -SrcBlob "VMdisk0.vhd" -DestBlob "VMdisk0.vhd"

Now this will nicely handle the VM whose disks are in a storage account. So how do we handle the VM whose disk isn't in a storage account (the managed disk)? Pretty much the same way, with Start-AzStorageBlobCopy; there's just a different parameter to use. Let's look at that code.

# We only need a storage context for the destination account; the source is referenced by a SAS URL.
$destkey = "UUSkMu…A=="
$destinationContext = New-AzStorageContext -StorageAccountKey $destkey -StorageAccountName DestinationAccountName
# Copy the managed disk into the destination account using the SAS URL from the Disk Export blade.
Start-AzStorageBlobCopy -AbsoluteUri "https://md-h1fl3bgpqvq3.blob.core.windows.net/f1mx3lrkhl1q/abcd?sv=2017-04-17&sr=b&si=994de0ad-04eb-46a7-8d91-62e827064bf4&sig=Q…3D" -DestContext $destinationContext -DestContainer vhds -DestBlob "VMdisk0.vhd"

Now, the first question is how we get that URI that we need to pass in. First, you'll want to stop and delete the VM (don't worry, this won't delete the disk for the VM). Then, in the Azure Portal, find Disks and select the disk that you want to copy. In the menu that opens when you select the disk, you'll see a "Disk Export" option; select that. It'll ask you how long the SAS key should be valid for. It defaults to 3600 seconds; I changed it to 7200 seconds to give it plenty of time, then clicked the Generate URL button. When that's done, it'll give you a SAS URL (don't close the window until you've copied the URL, as the portal will only show it to you once). Take that URL and drop it into the AbsoluteUri parameter of Start-AzStorageBlobCopy.
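
If you'd rather script that step than click through the portal, the same SAS URL can be generated with PowerShell. This is just a sketch using the Az module's Grant-AzDiskAccess and Revoke-AzDiskAccess cmdlets; the resource group and disk names below are placeholders for whatever your disk is actually called.

# Generate a read-only SAS URL for the managed disk (the scripted equivalent of the portal's "Disk Export" blade).
# "MyResourceGroup" and "MyVM_OsDisk" are placeholder names - substitute your own.
$sas = Grant-AzDiskAccess -ResourceGroupName "MyResourceGroup" -DiskName "MyVM_OsDisk" -Access Read -DurationInSecond 7200
# The AccessSAS property is the URL to hand to Start-AzStorageBlobCopy's -AbsoluteUri parameter.
Start-AzStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContext $destinationContext -DestContainer vhds -DestBlob "VMdisk0.vhd"
# Once the copy has finished, revoke the SAS so the disk isn't left exported.
Revoke-AzDiskAccess -ResourceGroupName "MyResourceGroup" -DiskName "MyVM_OsDisk"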

Now these commands are going to take a while to run, and we want to see how far along they are. We can do that with another simple PowerShell cmdlet.
Get-AzStorageBlobCopyState -Blob "VMdisk0.vhd" -Context $destinationContext -Container vhds -WaitForComplete

This command throws a nice status message up in my PowerShell window, and the window waits for the copy of the file to complete. The really nice thing about Start-AzStorageBlobCopy is that it doesn't download the blob; all the copying happens within Azure, so the copy is actually pretty quick.
Once this is done, you have your VHD sitting in a storage account, so you need to move it back into managed storage. This can be done in the GUI (or in PowerShell if you really want to).

Simply go into the Disks section of the Azure Portal and create a new disk (the plus sign in the upper left). When the next screen opens, one of the questions will be what the source of the new disk should be. One of the answers will be Storage Blob; you'll want to select that option.
After you set the blob (there's a GUI, use it, love it), make sure you set the OS of the disk, the correct size, and the performance tier (HDD, standard SSD, or premium SSD), then click OK. Once that's done, select your new disk and you can create a new VM based off of the disk. Once the VM is created, it's up and running in the new region. (Don't forget to delete the old disk after you've ensured the new VM is working, as you don't want to keep getting charged for it.) If you'd rather script the disk creation, there's a PowerShell sketch of that below.
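
If you do want to script that last step, here's a minimal sketch using the Az module. The resource group, location, disk size, and SKU are placeholder values you'd swap for your own, and it assumes the VHD was copied into the vhds container of the destination storage account as shown above.

# Look up the destination storage account so the import can reference it ("DestRG" is a placeholder resource group).
$storageAccount = Get-AzStorageAccount -ResourceGroupName "DestRG" -Name "DestinationAccountName"
# Define a managed disk that imports the copied VHD blob; adjust OS type, size, and SKU to match the source disk.
$diskConfig = New-AzDiskConfig -Location "WestUS" -CreateOption Import -StorageAccountId $storageAccount.Id -SourceUri "https://destinationaccountname.blob.core.windows.net/vhds/VMdisk0.vhd" -OsType Windows -DiskSizeGB 128 -SkuName Premium_LRS
# Create the managed disk; once it exists you can build the new VM on top of it just like in the portal.
New-AzDisk -ResourceGroupName "DestRG" -DiskName "VMdisk0" -Disk $diskConfig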

Denny


July 29, 2019  1:00 AM

Data Platform Summit 2019 Pre-Cons

Denny Cherry

I’m thrilled to say that this year I’ll be delivering two Pre-Cons at the Data Platform Summit in August.  I’ll be delivering a session on August 19th, 2019 titled “Azure Infrastructure” as well as a course on August 21st, 2019 titled “HADR – SQL Server & Azure”.  

On those same two days, Joey D'Antoni will also be delivering two Pre-Cons: his session on August 19th, 2019 titled "Architecting a Modern Analytics Solution – Start To Finish" and his session on August 21st, 2019 titled "Deploying & Maintaining SQL Server on Linux Platform."

I know that we’re both thrilled by this chance to visit the great attendees at the conference, and hopefully, we’ll see you there.

Denny


July 23, 2019  5:44 PM

Only use CLR when you need to

Denny Cherry

There are people out there who love SQL CLR, and there are people who hate SQL CLR. I have a slightly different opinion, which is that I hate how most people use SQL CLR. The trick to SQL CLR is that you should only use it for things that CLR does better than T-SQL, which is mainly text manipulation such as RegEx. T-SQL can't do things like RegEx, so SQL CLR is the way to do that. If you're having SQL CLR do things like calling out to a web server, running SSIS packages, or pulling in files from a file server, all from T-SQL, then you probably need to take a step back and rethink the design of what you're trying to do.

Denny


July 15, 2019  9:04 PM

Azure Bastion

Denny Cherry

The recently announced Azure Bastion service looks pretty slick. It provides a secure way into your VMs without the need to VPN in, and it gives you the same authentication you'd expect from the Azure Portal (MFA, AAD credentials, etc.), all while giving you an easy-to-manage way to get into VMs in Azure. Now, this bastion service isn't the right fit for every situation, so don't try to use it for everything. But if you need a secure, logged way to connect to VMs in your Azure environment, it looks like a pretty good solution.

What the bastion service does is allow users to log in to the Azure portal, then select the VM that they want to connect to. From there they get an RDP session within their browser that lets them log into the VM that’s running in Azure.  From a security perspective, the cool thing about this is that you don’t have to give your VMs public IPs. Because the Azure Bastion service is the bridge between the public internet and your internal VMs, nothing needs a public IP address as nothing is going directly to the Internet.

If you're in an environment where you need a way to give users RDP access to servers, this is going to give you a nice, secure way of doing so.

Like I mentioned earlier, this isn't going to solve every problem. If you work from home and you need SQL access to VMs, then Azure Bastion isn't going to help you, as it doesn't pass through arbitrary traffic such as SQL Server traffic. You'd need to RDP into a machine and then run the SQL tools from there, so if you want to run something locally that can log into SQL Server, you'll still need a VPN. But for situations where you need to RDP into machines, for example remote users logging into a terminal server where you don't want to require that they install VPN software, this could be a good solution.

Currently, the Azure Bastion service is in preview, so you'll need to sign up for it, which you can do from the Microsoft Docs. That doc will also tell you how to use the Azure Bastion service, as you can't access it from the normal portal URL (yet).

There are a couple of items to know about Azure Bastion.

  1. It isn't available in all regions yet. Because it's a preview service, it's only in a few Azure regions. That will change, but while it's in preview, it's going to be a limited release. The regions the service is in today are:
  • West US
  • East US
  • West Europe
  • South Central US
  • Australia East
  • Japan East
  2. Today Azure Bastion can't span vNets. So if you have VMs in two different vNets, you'll need two Bastion services, one in each vNet. Hopefully, this will change by release.

Denny

 


July 8, 2019  4:00 PM

For the Love of God, Stop Exposing Company Information

Denny Cherry

Companies (and the employees at them) need to stop posting private company information on the Internet, and they really need to stop posting private information in public spots with no password. Just last week, yet another company was found to be doing something stupid. In this case, they had tons of information posted to an S3 bucket with no password on it: backups from systems, OneDrive backups from employees, credentials for customer environments, keys for their production environments, etc.

“System credentials can be found in a number of places in the Attunity data set and serve as a useful reminder of how that information might be stored in many places across an organization’s digital assets,” UpGuard researchers said in a report published yesterday.

This information should never have been posted to a publicly accessible location, much less one without a password. There's no good reason why things like system credentials would be posted online.

As IT workers, we have to do better than this. We just have to. There are too many people out there who would do bad things with this information if they got their hands on it.

Do I have a solution? No, I don't. But this really isn't a problem that needs a technical solution. Whoever did this simply shouldn't have done it. There is no excuse for exposing anything, much less this much information.

Denny


July 1, 2019  4:55 PM

Today is MVP Day, and it was a good day for some, not so much for others

Denny Cherry

Today is "MVP Day," when Microsoft MVPs find out whether they've been awarded for another year. For some people, it was not that great of a day, as they weren't renewed as Microsoft MVPs. For others, there was a really good email in their inbox today. I'm happy to say that all five of the folks at DCAC who were Microsoft MVPs yesterday are still Microsoft MVPs today.

Congrats to Joey, Monica, John and Meagan (and myself to make 5) on another year of being Microsoft MVPs.

Denny


June 25, 2019  2:44 PM

Thanks for the Invite, Data Grillen; See You Next Year

Denny Cherry

To everyone who attended Data Grillen 2019, thanks so much for the invite this year. I had a great time visiting with everyone, both those I knew from prior years and those I met for the first time this year.

Hopefully I’ll get the chance to see everyone again next year.

Look for an email from Ben and William with the link to all the presentations. The slide deck that Joey and I presented will be up there.

If we can be of any assistance in your journey to Azure, please let us know.

Denny

