Over the last year, Denny Cherry & Associates Consulting has been working on creating a new data warehouse for a renowned non-profit, the Elizabeth Glaser Pediatric AIDS Foundation. The data warehouse centralizes patient data from all the counties that the foundation serves to improve reporting and patient care, enabling them to respond faster to patient needs and thereby save lives.
As an added bonus, the foundation informs us they’ve been able to save a whopping 75% on annual IT administrative costs. This cost reduction allows the Elizabeth Glaser Pediatric AIDS Foundation to provide more care to more patients, as well as give donors more information about where their donations are being used, which encourages additional funding.
For DCAC, this solution led to our winning three American Business Awards, including “Technical Innovation of the Year” and “Most Innovative Tech Company of the Year.”
If you are contemplating a data warehousing challenge, we encourage you to read the full case study on our website.
I love DocuSign. As a person who sends out contracts to clients, I have no idea how I’d ever function without it. I know that it’s used by a huge number of companies to sign contracts, transmit tax information between companies, etc. There’s one flaw that we’ve found in it: you can tell DocuSign to email you the documents after they are signed. This means that whatever is in those documents gets sent. So if your tax info is in there, it gets sent to everyone over email. This is a bit of a problem.
Now I’m guessing that DocuSign won’t remove this setting (they should, but I’m guessing it was added because people wanted it), but companies should turn this setting off (off is the default).
If your DocuSign is sending out emails when you are done signing and you’d like to stop it, it’s a simple settings change. Go to DocuSign and log in. Click on your picture in the upper right corner, and select “Go To Admin”. Select “Signing Settings” from the menu on the right. Under envelope delivery there’s an option which says “Attach documents to completion email”. Uncheck that option and click save.
With this checkbox turned off, you’ll still send out an email upon completion of the document, but the document won’t be attached. You’ll have to click the link in the email and download the document that you signed from the DocuSign portal.
No, not really. In my mind, you’ve completed the first step on your journey to the cloud. The end state that you eventually want to get to is a Platform as a Service (PaaS) offering, so moving from Infrastructure as a Service (IaaS) to PaaS is the next step you’ll want to take. To be fair, some applications aren’t going to be ready for PaaS; these are typically vendor applications that you’ve purchased. Unless the vendor makes these applications cloud-ready, they probably won’t be, so you’ll need to keep running them in VMs until the applications are either replaced or the vendor releases a cloud-ready version.
For those home-grown applications, PaaS is where you want those applications to end up. Running applications in PaaS is typically cheaper than running virtual machines; high availability is built into most PaaS platforms, and disaster recovery is easy to set up and typically looks like a scale-out, active/active configuration instead of an active/passive one with servers sitting there doing nothing.
Just because you’ve completed that move from on-prem VMs into a cloud platform, your journey is not complete; you’re just starting down the path of cloud. Whether you’re just starting your cloud journey or well into it, we can help, so just reach out to our team. And if you’re just getting started, we might even be able to get Microsoft to help pay for the expenses around getting your cloud journey off the ground.
The recent announcement that VMware will be available within Azure is a really interesting one. It brings a great on-prem solution (VMware) that admin teams already know and use extensively into the Azure cloud. In a perfect world, apps that are moving into the cloud would be moved to PaaS services, but that isn’t always possible. There are legacy apps, or services that can’t be moved into Azure Functions and just need to run as Windows Servers.
Moving from an on-prem world into Azure can be a daunting experience for admins who are looking at Azure for the first time. By being able to simply put VMware hosts in their Azure environment, we can combine the power of the Azure cloud platform with the existing knowledge of the VMware platform to make a migration (or expansion) to the cloud much easier.
By using technology that the sysadmin team is already familiar with, they don’t have to learn the entire Azure platform. They only need to familiarize themselves with the basics of Azure and how VMware interfaces with it. The rest of the platform is just VMware, which the systems team should already know.
Is this something that can be used today? No, not yet. It was just recently announced, so it’ll take time before it’s ready, but it’ll be an interesting path for companies that are cloud-averse but need an option for burstable capacity without having to learn an entirely new cloud platform.
I’m thrilled to announce that Denny Cherry & Associates Consulting has placed on the Inc 5000 list for 2019. The Inc 5000 list is the most prestigious ranking of the nation’s fastest-growing private companies, and DCAC was named #2056 on this list which is a fantastic feat for our company.
I’m super proud of everyone at the company because we couldn’t have made this happen without our fantastic team members; so thank you all (Kris, Joey, Kerry, Monica, John, Peter, Meagan) for making this happen. Everyone at DCAC is an expert in the field, and our placement on the list reflects that.
When I first spoke with our team about making the Inc. list, the assumption was that we’d be towards the bottom of the list because we’re a smaller company. But the fact that we were able to make the top half of the Inc. 5000 list shows how hard everyone has worked to make DCAC stand out from the other firms in our space.
A big thanks to our customers as well. We couldn’t be as successful as we have been without our customers. We’ve been able to help them grow in each of their areas, and their success shows in our success.
I’m thrilled to announce that this year the SQL Karaoke party is on as planned. The party this year will be at Cowgirls, Inc. We have two fantastic returning sponsors this year, and we’ll be joined by the great folks at SentryOne and SIOS. Registration for the event will be similar to years past: fill out the registration form, bring it with you (this is very important if you have an open bar ticket, as we’ll have no way to check at the event whether you registered), show your ID at the door, and have a great time.
As usual, we’ll have the great folks at Rockaraoke there to be our live band and make us all look awesome.
Since we’re back at Cowgirls, the patio “should” be open, and the mechanical bull is there to abuse you. 🙂
The party will be on November 5th, 2019 at 9:30 pm, and it’ll go until you drop (or 2 am happens), so come and have a great time with your fellow PASS attendees, as there are three days of excellent sessions starting the following day.
Keep in mind that IDs will be checked at the door, and only those aged 21 and over will be admitted (state liquor laws). Also, be aware that the PASS Code of Conduct will be enforced.
Wristbands for the open bar will get you beer, wine, and well drinks.
Register, show up, have a great time. We have pretty simple rules.
Hopefully, we’ll see everyone there.
We’ve been using Azure for several years now at DCAC. Back when we started in Azure, there were no PaaS services for MySQL (there was one, but it was stupid expensive, and from a third party). When we created VMs, we put some in US West and some in US Central. Now that we’ve been able to move all our web properties from IaaS to PaaS, we wanted to consolidate the VMs into one region so we could kill the site-to-site VPN (and save the $50 a month we were spending on keeping the VPN Gateway up). This required moving some VMs from US Central to US West, as there are a couple of VMs that we need to keep and I’m too lazy to set back up.
We ended up needing to move two VMs from US Central to US West. One has unmanaged disks in a storage account, and one has a managed disk. The question became how to move these. The answer was Start-AzStorageBlobCopy. The command is fairly straightforward.
$sourcekey = "17SZrnd…Q=="
$destkey = "UUSkMu…A=="
$sourceContext = New-AzStorageContext -StorageAccountKey $sourcekey -StorageAccountName SourceName
$destinationContext = New-AzStorageContext -StorageAccountKey $destkey -StorageAccountName DestinationAccountName
Start-AzStorageBlobCopy -Context $sourceContext -DestContext $destinationContext -SrcContainer vhds -DestContainer vhds -SrcBlob "VMdisk0.vhd" -DestBlob "VMDisk0.vhd"
Now, this nicely handles the VM whose disks are in a storage account. So how do we handle the VM whose disk is not in a storage account (the managed disk)? Pretty much the same way with Start-AzStorageBlobCopy; there’s just a different parameter to use. Let’s look at that code.
$destkey = "UUSkMu…A=="
$destinationContext = New-AzStorageContext -StorageAccountKey $destkey -StorageAccountName DestinationAccountName
Start-AzStorageBlobCopy -AbsoluteUri "https://md-h1fl3bgpqvq3.blob.core.windows.net/f1mx3lrkhl1q/abcd?sv=2017-04-17&sr=b&si=994de0ad-04eb-46a7-8d91-62e827064bf4&sig=Q…3D" -DestContext $destinationContext -DestContainer vhds -DestBlob "VMdisk0.vhd"
Now, the first question is how do we get that URI that we need to pass in. First, you’ll want to stop and delete the VM (don’t worry, this won’t delete the disk for the VM). Then, in the Azure Portal, find Disks and select the disk that you want to copy. On the menu that opens when you select the disk, you’ll see a “Disk Export” option. Select that. It’ll ask you how long the SAS key should be valid for. It defaults to 3600 seconds; I changed it to 7200 seconds to give it plenty of time. Then click the Generate URL button. When that’s done, it’ll give you a SAS URL (don’t close the window until you’ve copied the URL, as the portal will only show it to you once). Take that URL and drop it into the AbsoluteUri parameter of Start-AzStorageBlobCopy.
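If you’d rather skip the portal clicking, the same SAS URL can be generated with the Az PowerShell module’s Grant-AzDiskAccess cmdlet. This is just a sketch; the resource group and disk names below are placeholders for your own:

```powershell
# Generate a read-only SAS URL for the managed disk, valid for 7200 seconds.
# 'MyResourceGroup' and 'MyVMDisk' are placeholder names.
$sas = Grant-AzDiskAccess -ResourceGroupName 'MyResourceGroup' -DiskName 'MyVMDisk' `
    -Access 'Read' -DurationInSecond 7200

# The AccessSAS property holds the URL to hand to -AbsoluteUri.
Start-AzStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContext $destinationContext `
    -DestContainer vhds -DestBlob "VMdisk0.vhd"

# Once the copy completes, revoke the SAS.
Revoke-AzDiskAccess -ResourceGroupName 'MyResourceGroup' -DiskName 'MyVMDisk'
```

Revoking the SAS when you’re done keeps the disk from being readable by anyone who happens to have the URL.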
Now these commands are going to take a while to run, and we want to see how far along they are. We can do that with another simple PowerShell cmdlet.
Get-AzStorageBlobCopyState -Blob "VMdisk0.vhd" -Context $destinationContext -Container vhds -WaitForComplete
This command throws a nice message up in my PowerShell window, and the window waits for the copy of the file to complete. The really nice thing about Start-AzStorageBlobCopy is that it doesn’t download the blob; all the copying happens within Azure, so the copy is actually pretty quick.
Once this is done, you have your disk file sitting in a storage account, so you need to move it back into managed storage. This can be done in the GUI, unless you really want to do it in PowerShell.
Simply go into the Disks section of the Azure Portal and create a new disk (the plus sign in the upper left). When the next screen opens, one of the questions will be what the source of the new disk should be. One of the answers will be Storage Blob; you’ll want to select this option.
After you set the blob (there’s a GUI, use it, love it), make sure you set the OS of the disk, the correct size, and the performance tier (HDD, standard SSD, premium SSD), and click OK. Once that’s done, select your new disk, and you can create a new VM based off of the disk. Once that’s done, your VM is up and running in the new site. (Don’t forget to delete the old disk after you ensure the new VM is working, as you don’t want to keep getting charged for the old disk.)
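For anyone who does want the PowerShell route for this last step, a sketch using New-AzDiskConfig with the Import create option would look something like the following. The resource names, location, size, and blob URL are all placeholders:

```powershell
# Placeholder names; substitute your own resource group, account, and blob URL.
$storageAccountId = (Get-AzStorageAccount -ResourceGroupName 'MyResourceGroup' `
    -Name 'destinationaccountname').Id

# Build a managed disk config that imports the copied VHD from the storage account.
$diskConfig = New-AzDiskConfig -Location 'WestUS' -CreateOption Import `
    -StorageAccountId $storageAccountId `
    -SourceUri 'https://destinationaccountname.blob.core.windows.net/vhds/VMdisk0.vhd' `
    -OsType Windows -DiskSizeGB 128 -SkuName Premium_LRS

# Create the managed disk; a new VM can then be created from it.
New-AzDisk -ResourceGroupName 'MyResourceGroup' -DiskName 'VMdisk0' -Disk $diskConfig
```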
I’m thrilled to say that this year I’ll be delivering two Pre-Cons at the Data Platform Summit in August. I’ll be delivering a session on August 19th, 2019 titled “Azure Infrastructure” as well as a course on August 21st, 2019 titled “HADR – SQL Server & Azure”.
On those same two days, Joey D’Antoni will also be delivering two Pre-Cons, with his session on August 19th, 2019 titled “Architecting a Modern Analytics Solution – Start To Finish” and his session on August 21st, 2019 titled “Deploying & Maintaining SQL Server on Linux Platform.”
I know that we’re both thrilled by this chance to visit the great attendees at the conference, and hopefully, we’ll see you there.
There are people out there who love SQL CLR, and there are people who hate it. I have a slightly different opinion, which is that I hate how most people use SQL CLR. The trick to SQL CLR is that you should only be using it for things that CLR does better than T-SQL, which is mainly text manipulation such as RegEx. T-SQL can’t do things like RegEx, so SQL CLR is the way to do that. If you’re having SQL CLR do things like calling out to a web server, running SSIS packages, or pulling in files from a file server all from T-SQL, then you probably need to take a step back, look at what you’re trying to do, and possibly redesign it.
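As an example of the kind of thing SQL CLR is actually good for, a minimal RegEx scalar function might look like the sketch below (the class and function names are my own, not from any particular library):

```csharp
using System.Data.SqlTypes;
using System.Text.RegularExpressions;
using Microsoft.SqlServer.Server;

public static class RegexFunctions
{
    // Exposes Regex.IsMatch to T-SQL as a scalar function.
    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlBoolean RegexIsMatch(SqlString input, SqlString pattern)
    {
        // Mirror T-SQL NULL semantics: NULL in, NULL out.
        if (input.IsNull || pattern.IsNull)
            return SqlBoolean.Null;

        return Regex.IsMatch(input.Value, pattern.Value);
    }
}
```

Once the assembly is deployed, T-SQL can call it like any other scalar function, e.g. SELECT dbo.RegexIsMatch(PostalCode, '^\d{5}$') (with a hypothetical PostalCode column).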
The recently announced Azure Bastion service looks pretty slick. It provides a secure way into your VMs without the need to VPN in. It gives you the same authentication that you’d expect from the Azure Portal (MFA, AAD credentials, etc.), all while giving you a pretty easy-to-manage way to get into VMs in Azure. Now, this bastion service isn’t going to fit every situation, so it shouldn’t be forced into ones it doesn’t. But if you need a secure, logged way to connect to VMs in your Azure environment, this looks like a pretty good solution.
What the bastion service does is allow users to log in to the Azure portal, then select the VM that they want to connect to. From there they get an RDP session within their browser that lets them log into the VM that’s running in Azure. From a security perspective, the cool thing about this is that you don’t have to give your VMs public IPs. Because the Azure Bastion service is the bridge between the public internet and your internal VMs, nothing needs a public IP address as nothing is going directly to the Internet.
If you’re in an environment where you need a way to give users RDP access to servers, this is going to give you a nice, secure way of doing so.
Like I mentioned earlier, this isn’t going to solve all problems. If you work from home and you need SQL access to VMs, then Azure Bastion isn’t going to help you, as it doesn’t pass through traffic such as SQL traffic. You’d need to RDP into a machine, then run the SQL tools from there. So if you wanted to run something locally that could log into SQL Server, you’ll still need a VPN in that case. But for situations where you need to RDP into machines, for example, remote users logging into a terminal server where you don’t want to require that they install VPN software, this could be a good solution.
Currently, the Azure Bastion service is in Preview, so you’ll need to sign up for it which you can do from the Microsoft Docs. That doc will also tell you how to use the Azure Bastion service, as you can’t access it from the normal portal URL (yet).
There are a couple of items to know about Azure Bastion.
- It isn’t available in all regions yet. Because it’s a preview service, it’s only in a few Azure regions. That will change, but while it’s in preview, it’s going to be a limited release. The regions the service is in today are:
- West US
- East US
- West Europe
- South Central US
- Australia East
- Japan East
- Today, Azure Bastion can’t span vNets. So if you have VMs in two different vNets, you’ll need two Bastion services, one in each vNet. Hopefully, this will change by release.