SQL Server with Mr. Denny


January 29, 2018  4:00 PM

Denny Cherry & Associates Consulting named Consulting Firm of the Year for 2017

Denny Cherry

I’m pleased to say that Denny Cherry & Associates Consulting was named by Technology Headlines as the top consulting firm in 2017. Technology Headlines did a nice writeup on Denny Cherry & Associates Consulting to boot.  I’m really proud of the group that we’ve put together at DCAC, and we’re starting 2018 off with a bang.

DCAC was selected as the consulting firm of the year for our work with our clients in both Microsoft Azure and Microsoft SQL Server.  Our clients have been pleased with our work, and that’s the number one thing for us: making sure that our clients are satisfied with the work that we do for them.  That’s the sort of thing that has separated us from some of the other firms in this space and elevated us to this award-winning position.

Be sure to check out our website www.dcac.co as well.

Denny

January 22, 2018  4:00 PM

Where Denny went during PASS and Live360 Events

Denny Cherry


If you were at the PASS Summit, Live360 in November, or SQL Saturday in Slovenia in December, you may have noticed that I couldn’t make it to these events. It turns out that I had a medical issue that I needed to deal with as soon as possible.

Some Background

Over the six months or so before the PASS Summit, I had a headache that I couldn’t shake.  Between trips, I made a doctor’s appointment with a general practitioner I had seen before, who sent me out for an MRI.  The insurance company didn’t want to pay for the MRI, but the doctor was able to get them to pay for a CT scan instead.

Do what the doctor says

I went in for an outpatient CT scan at about 7 pm and figured that I’d get the results from my doctor in a few days.  Eight minutes after I left the outpatient CT scan, the doctor who reviewed it called my cell phone and told me that he usually doesn’t call patients directly, but that he needed me to turn around and go to my nearest ER.  Between the two of us, we decided that the ER attached to his facility was the easiest to get to, and he could call them and tell them I was coming.  So I turned around and drove back to the hospital.

That facility checked me in and transferred me to the neurology unit, which was at another one of their hospitals.  All of this was about a week and a half before the PASS Summit started.

Staying there

The trip to the emergency room and the CT scan were the start of 35+ days in the hospital, with two surgeries and one procedure. It was not a fun month.  After the first of many MRIs, the doctors determined that we were dealing with a 4.3 cm (about 1.7 inches for Americans) tumor on my brain stem.  To give a visual reference, that’s bigger than a golf ball.  It took about two days before we were able to get into the OR and have the tumor removed.

After the initial surgery I did a lot of physical therapy, first at the hospital, then at an acute physical therapy facility.  Sadly, while there I picked up a nasty infection, which required a transfer back to the hospital and another surgery to resolve.  Thankfully I’ve been home since the day after US Thanksgiving.  The bad news is that I was on IV antibiotics from before I left the hospital until December 29th.

The tumor was not cancerous; that was the most important part of all this.  That piece of information from the doctors alone made this much easier to deal with and get through.  It also meant not needing chemo to finish my recovery.

Getting Back To Normal

Things are slowly getting back to normal.  This was an issue affecting the brain stem, not the brain; that means it affected my fine motor skills, not my memory or my ability to work with SQL Server. The most significant skill that was impacted is walking. I moved to a walker while I was at the hospital, which was pretty slow going.  I’ve since moved mostly to a cane, and I’m improving every day.

Speech is where I’m going to need some therapy as well.  Right now, when I speak at my regular speed (which is pretty quick), I have a noticeable speech impediment which I need to improve before I can give presentations again.

In short, I’ll be back doing presentations; it’ll just take some time before I’m back on stage.

Denny


January 15, 2018  4:00 PM

Welcome to Security Week at DCAC

Denny Cherry
Security, SQL

With the announcement of the CPU “issues” in the last week or so, this week has quickly become Security Week at DCAC on our blogs. The week will be capped off with our security webcast this Friday.  If you follow the DCAC blog, you’ll see different security topics from everyone on the team this week, with a new one coming out each day.

SQL Injection

I wanted to take this time to talk about our old, and poorly named, friend SQL Injection. To date, this is still the most common way for data to be exposed.  As applications get older and more corporate applications get abandoned, the risk they pose gets worse and worse.  As I’ve written about time and time and time again, SQL Injection is a problem that needs to be solved on the application side. However, with enterprise applications that get abandoned, this becomes hard for a business to deal with, as some business unit needs to pay for those changes.
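
To make the application-side fix concrete, here’s a minimal sketch in PowerShell (the table, column names, and connection string are made up for illustration) contrasting a query built by string concatenation with a parameterized query:

# A value that would break, or worse subvert, a concatenated query.
$userInput = "O'Brien"

# UNSAFE: the input becomes part of the SQL text, so input like
# ' OR 1=1 -- changes the meaning of the statement entirely.
$unsafeQuery = "SELECT CustomerId, LastName FROM dbo.Customers WHERE LastName = '$userInput'"

# SAFER: the input travels as a typed parameter and is never parsed as SQL.
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=MyServer;Database=MyDb;Integrated Security=SSPI")
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT CustomerId, LastName FROM dbo.Customers WHERE LastName = @LastName"
$null = $cmd.Parameters.AddWithValue("@LastName", $userInput)
$conn.Open()
$reader = $cmd.ExecuteReader()
while ($reader.Read()) { $reader["LastName"] }
$conn.Close()

Whatever language the application is written in, the shape of the fix is the same: never concatenate user input into SQL text.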

And that need to pay for development time to fix security issues is why SQL Injection issues come up. For old applications, business units don’t see the value in fixing them (or at least in verifying that there’s no issue with the application), so the applications just sit there until an issue comes up. And by the time it does, those problems aren’t going to go away; they’re just going to get worse, as you now have customer data (or employee, or vendor, etc.) out there in the wild. Now you have a public relations issue on top of your security issue.

Issues like the ones we saw this month get pretty logos and flashy names, but for the most part these kinds of issues require some sort of server access (yes, I know there are proofs of concept out there).  But with SQL Injection, as long as the application is exposed to users, you have the potential for a problem.

We’re not just talking about external users here, but internal ones as well. Most breaches companies have where data is taken are internal. In other words, you let the person into your office, gave them credentials to your network, and let them go nuts on it. I couldn’t tell you the number of unauthorized routers, Wi-Fi access points, or network-scanning applications I’ve found over the last 20 years.

So to recap: your biggest threats are employees inside your firewall, attacking old applications that haven’t been updated in years but still have access to information worth stealing.

It’s time to secure all those old applications.

Denny


January 8, 2018  4:00 PM

Database Security Webcast on January 19th, 2018

Denny Cherry

On January 19th, meet with the crew from Denny Cherry & Associates Consulting at 11 am Pacific Time / 2 pm Eastern Time. During this webcast, we’ll talk about database security in general, and specifically about how Spectre and Meltdown impact database workloads within the enterprise.

With Spectre and Meltdown taking over the IT news this month, now is the time to review applications and databases to ensure that those applications are properly secured and that the data within them is kept safe from prying eyes.

We look forward to seeing you on the 19th, so get signed up now.

Denny


January 1, 2018  4:00 PM

Happy New Year, Welcome to 2018

Denny Cherry

It’s 2018, and welcome to it.  Here’s hoping that 2018 is an easier year than 2017 was.

For me, that means no brain tumor this year.  Hopefully that’s an easy bar to hit.

Denny


December 26, 2017  8:00 PM

When does it make sense to upgrade?

Denny Cherry

There are a lot of versions of SQL Server available today.  I’ve seen clients deploying new services on SQL Server 2014, SQL Server 2016, SQL Server 2017 (yes, we have a client on SQL Server 2017 already), and Azure SQL DB.  But if you’re deploying a new SQL Server, what’s the right version to deploy?


I’d love to tell you that the answer is always the newest version, but it isn’t, and no one should tell you that it is.

The first thing to look at is which features of the database platform we need.  Do they require SQL Server 2017, or do they work with older versions of SQL Server?  The next decision point is which versions the DBA is ready to support.  Our customer that is running SQL Server 2017 is willing to be on the bleeding edge of technology and take risks with new versions of software within days of their release.  Not everyone is willing to take those risks; many feel more comfortable on SQL Server 2014 or SQL Server 2016.  While I don’t always agree with the idea of running older versions for this reason, I do understand it.  I may not agree with it, but I do understand it.
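
If you’re not sure what an existing instance is actually running today, a quick check makes that conversation easier. This is just a sketch; “MyServer” is a placeholder for your instance name:

# Report the version, patch level, and edition of an instance.
Invoke-Sqlcmd -ServerInstance "MyServer" -Query @"
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;
"@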

After that, it becomes a political decision within your company as to which version of the database to run.  I can’t help much with political problems.

If possible, I’d vote for a newer version, but my vote isn’t usually the important one.

Denny


December 19, 2017  4:00 PM

What does SQL encrypt by default?

Denny Cherry

For the purposes of this post, we’re ignoring SQL Server 7.0 and below.

Out of the box, SQL Server encrypts some things by default to protect you and your data.  Specifically, it encrypts the passwords that are sent from the client up to the SQL Server.  This keeps the password from being sniffed on the network when logging in to the SQL Server instance.

SQL Server does this encryption using a self-signed certificate, which ensures that a certificate is always there.  If you have selected a different certificate for encryption, then SQL Server will use that certificate to encrypt the login data instead.
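
If you want to see whether a given connection is encrypted beyond the login packet, the sys.dm_exec_connections DMV will tell you. A quick sketch (“MyServer” is a placeholder for your instance name):

# encrypt_option reports whether the whole connection is encrypted;
# the login packet itself is protected either way.
Invoke-Sqlcmd -ServerInstance "MyServer" -Query @"
SELECT session_id, encrypt_option, auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
"@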

Denny


December 11, 2017  4:00 PM

Be sure to thank the sponsors

Denny Cherry

Conferences, no matter their size, all have one thing in common: they all require the same thing to run, from breakfast until dinner, for the day (or multiple days) that the event lasts. That thing is cash. The less expensive the event is for the attendee, the more the team running the event needs to make up that cash from somewhere else, usually from sponsors.

Can an event run without sponsors? Sure. But say goodbye to coffee, snacks, sodas in the afternoon, and possibly lunch. And in most places, say goodbye to the venue. These are all things that sponsors are paying for by showing up and being there, among potentially others.

What it boils down to is this: at events, especially smaller events, thank the vendors. They gave up their time and their company’s cash to talk to you.

Denny


December 4, 2017  4:00 PM

Managing VMs via Azure Active Directory just got a lot easier

Denny Cherry
Active Directory, Azure, Virtual Machines

Back when Azure and Azure Active Directory first got Windows Intune, pushing down settings, and specifically oddball settings changes, was complex. In the newest release of Intune, which is accessible via Azure and Office 365, things have gotten much easier. There used to be a major gap, in that you couldn’t run PowerShell. You had to convert it into an EXE, then package it via an MSI and upload the MSI to Azure. Short story: it wasn’t easy.

Now, however, you just need to sign your PowerShell script (which was much easier than I was expecting) and upload it to the Azure portal. Then tell Azure which users are assigned to use the script. After that, give the system some time to push it to your users, and the PowerShell will be run against the users as needed.
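
Signing the script is a one-liner once you have a code-signing certificate. This sketch assumes the certificate is already in your personal store, and the script file name is illustrative:

# Grab a code-signing certificate from the current user's store
# and sign the deployment script with it.
$cert = (Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert)[0]
Set-AuthenticodeSignature -FilePath .\Add-VpnSetup.ps1 -Certificate $cert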

In our case, we’ve got a non-standard VPN configuration, but using PowerShell, I was able to create the VPN connection on users’ computers easily enough. Let’s look at how it was done. The first step in Azure is to bring up “Intune” from the service list.

After opening the Intune menu, select the Device Configuration option. This will give you access to the place where you’ll upload your PowerShell scripts.


The next step is to set up a Certificate Authority internally. While this isn’t required, it’s recommended so that all the users get the CA configuration. From what I’ve been able to tell, with a CA in place (and duly registered and synced with Azure), multiple users can sign code and make it available for download and execution by users. For a larger IT shop this is going to be critical. For smaller shops it may not be needed, but it will make life easier.

If you opt not to set up a CA within the network and sync it to Azure, then you’ll need to upload the certificate being used to sign code, and you can only upload a single certificate.

Once the CA is set up and Azure AD sees it (via AD sync, I assume), the menu changes so you can download the sync software. This took about 10 minutes for me when setting it up.

These changes are all done using the “Certification Authority” menu option that you see under “Device Configuration.”

Once the Certificate Authority is set up, you can go into the PowerShell scripts section of the screen. From there, just click the “Add” button to add a PowerShell script to Intune.

Once you’ve added a PowerShell script, you can give it a name and point Azure to the signed script so it can be run by users. There’s not much under “Settings” to work with.

The first setting is whether this is a user-level script or a system-level script. By default, scripts are run by the system account, but there are a lot of cases where you want things to run at the user level instead, so you’ve got both options available. My script was written as a user-level script, so I set this to “Yes.”

The second setting allows you to force the system to check whether the code is signed by a known code publisher. When I was working with this, I left it at “No,” and everything worked exactly according to plan (I also had a CA set up and synced with Azure and Intune).

After creating the script, the portal should take you to the details of that specific script. The next step is to change to the “Assignments” page. This is where you configure which domain groups will have access to download and run the script.

When you select “Assignments,” you can select as many groups as needed to assign to the script. Groups can be synced from on-premises, can be AAD/O365-only, or can even be dynamic groups, where users are added automatically based on how their settings are configured.

It may seem like there are a bunch of steps to get this completed, but realistically, once the PowerShell script was written, it took about 5 minutes to set it up to be pushed out. After that, it was just a matter of waiting for users’ systems to refresh and pick up the change.
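
For reference, a user-level script for creating a VPN connection can be as simple as the following sketch. The connection name, server address, tunnel type, and authentication method are all placeholders; substitute your own settings:

# Runs in the user's context, so the connection lands in the user's profile.
# All values below are illustrative.
if (-not (Get-VpnConnection -Name "DCAC VPN" -ErrorAction SilentlyContinue)) {
    Add-VpnConnection -Name "DCAC VPN" `
                      -ServerAddress "vpn.example.com" `
                      -TunnelType L2tp `
                      -AuthenticationMethod MSChapv2 `
                      -RememberCredential
}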

Denny


November 27, 2017  4:00 PM

Should I be blocking outbound ports in Azure by default?

Denny Cherry
Azure

The short answer is that yes, there are ports that you’ll want to block outbound by default.  There’s a variety of amplification attacks that your machines could be made a part of. These attacks aren’t against your systems, but you run the risk of your machines being used to amplify attacks against others. These could be DNS-based, NTP-based, or other kinds of amplification attacks.

Occasionally I get notifications from Azure that they see these ports open, and that I should use Network Security Groups to close the unneeded ports.

Two of the ports that I’ve needed to deal with recently are UDP 123 (NTP) and UDP 389 (LDAP).  Blocking these was a minor change, but it’s a best practice.

Blocking these in Azure is super low risk and easy to implement.
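
As a sketch of what that looks like with the AzureRM PowerShell module (the NSG name, resource group, rule names, and priorities are all placeholders), denying outbound UDP 123 and 389 is just a couple of rules:

# Fetch an existing NSG, add outbound deny rules for UDP 123 and 389,
# then save the change back to Azure. Names and priorities are illustrative.
$nsg = Get-AzureRmNetworkSecurityGroup -Name "MyNsg" -ResourceGroupName "MyRg"

$nsg | Add-AzureRmNetworkSecurityRuleConfig -Name "Deny-Outbound-NTP" `
        -Direction Outbound -Access Deny -Protocol Udp -Priority 100 `
        -SourceAddressPrefix "*" -SourcePortRange "*" `
        -DestinationAddressPrefix "*" -DestinationPortRange 123 |
      Add-AzureRmNetworkSecurityRuleConfig -Name "Deny-Outbound-LDAP" `
        -Direction Outbound -Access Deny -Protocol Udp -Priority 110 `
        -SourceAddressPrefix "*" -SourcePortRange "*" `
        -DestinationAddressPrefix "*" -DestinationPortRange 389 |
      Set-AzureRmNetworkSecurityGroup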

To be clear, there is no inherent risk to being in Azure compared to other platforms.   These sorts of amplification issues can come up in any environment. The beautiful thing about Azure is that Microsoft monitors for these outbound issues and reports back to the end user on what blocking needs to be done for a successful implementation.

Denny

