For me, that means no brain tumor this year. Hopefully that’s an easy bar to hit.
There are a lot of versions of SQL Server available today. I’ve seen clients deploying new services on SQL Server 2014, SQL Server 2016, SQL Server 2017 (yes, we have a client on SQL Server 2017 already), and Azure SQL Database. But if you’re deploying a new SQL Server, what’s the right version to deploy?
I’d love to tell you that the answer is always to use the newest version, but it isn’t, and blindly picking the newest release isn’t something anyone should do.
The first thing to look at is which features of the database platform we need. Do they require SQL Server 2017, or do they work with older versions of SQL Server? The next decision point is which versions the DBA is ready to support. Our customer that is running SQL Server 2017 is willing to be on the bleeding edge of technology and take risks with new versions of software within days of their release. Not everyone is willing to take those risks, and some people feel more comfortable on SQL Server 2014 or SQL Server 2016. While I don’t always agree with the idea of running older versions for this reason, I do understand it. I may not agree with it, but I do understand it.
After that, it becomes a political decision within your company as to what version of the database to run. I can’t help much with political problems.
If possible, I’d vote for a newer version, but my vote isn’t usually the important one.
Out of the box, SQL Server will encrypt some things by default to protect you and your data. Specifically, it encrypts the passwords which are sent up from the client to the SQL Server. This keeps the password from being sniffed on the network when logging in to the SQL Server instance.
SQL Server does this encryption using a self-signed certificate, which ensures that a certificate is always there. If you have selected a different certificate for encryption, then SQL Server will use that certificate to encrypt the login data instead.
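If you want to see whether a given session is encrypted beyond just the login packet, the sys.dm_exec_connections DMV will tell you. A quick check, run from the connection you’re curious about, might look like this:

```sql
-- Check whether the current connection is encrypted.
-- encrypt_option is TRUE when the whole session is encrypted,
-- FALSE when only the login packet was (the default behavior).
SELECT session_id,
       encrypt_option,
       auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```

If encrypt_option comes back FALSE, only the login credentials were protected, which is the out-of-the-box behavior described above.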
Conferences, no matter the size, all have one thing in common: they all require the same thing to run from breakfast until dinner for the day (or multiple days) that the event runs. That thing they need to run on is cash. The less expensive the event is for the attendee, the more the team running the event will need to make up this cash from somewhere else, usually from sponsors.
Can an event run without sponsors? Sure. But say goodbye to coffee, snacks, sodas in the afternoon, and possibly lunch. And in most places, say goodbye to the venue. These are all things that sponsors are paying for by showing up and being there, among potentially others.
What it boils down to is this: at events, especially smaller events, thank the vendors. They gave up their time and their company’s cash to talk to you.
Back when Azure and Azure Active Directory first got Windows Intune, pushing down settings, and specifically oddball settings changes, was complex. In the newest release of Intune that is accessible via Azure and Office 365, things have gotten much easier. There used to be a major gap, in that you couldn’t run PowerShell directly. You had to convert the script into an EXE, then package it via an MSI and upload the MSI to Azure. Short story, it wasn’t easy.
Now, however, you just need to sign your PowerShell script (which was much easier than I was expecting) and upload it to the Azure portal. Then tell Azure which users are assigned to the script. After that, give the system some time to push to your users, and the PowerShell will be run for those users as needed.
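The signing itself is a one-liner once you have a code-signing certificate. A minimal sketch, assuming a code-signing certificate has already been issued into your personal certificate store (the script name here is a made-up example):

```powershell
# Grab the first code-signing certificate from the current user's store
# (assumes one has already been issued, e.g. by your internal CA).
$cert = (Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert)[0]

# Sign the script in place; the signed .ps1 is what gets uploaded to Intune.
Set-AuthenticodeSignature -FilePath .\New-CorpVpn.ps1 -Certificate $cert

# Confirm the signature is valid before uploading to the portal.
Get-AuthenticodeSignature .\New-CorpVpn.ps1 |
    Select-Object Status, SignerCertificate
```

A Status of "Valid" means the machine trusts the signing certificate, which is exactly what the CA sync described below buys you on end-user machines.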
In our case, we’ve got a non-standard VPN configuration, but using PowerShell, I was able to create the VPN connection on users’ computers easily enough. Let’s look at how it was done. The first step in Azure is to bring up “Intune” from the service list.
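As a rough sketch of what a script like this can do (the connection name and server address below are placeholders, not our actual configuration), the built-in VPN client cmdlets are enough for a basic user-level profile:

```powershell
# Create a VPN profile for the current user if it doesn't already exist.
# 'Corp VPN' and vpn.example.com are placeholder values.
$name = 'Corp VPN'

if (-not (Get-VpnConnection -Name $name -ErrorAction SilentlyContinue)) {
    Add-VpnConnection -Name $name `
        -ServerAddress 'vpn.example.com' `
        -TunnelType Ikev2 `
        -AuthenticationMethod Eap `
        -EncryptionLevel Required `
        -RememberCredential
}
```

Because the profile is created for the current user, this pairs with the user-level script setting covered later; a non-standard configuration would layer its extra settings on top of this.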
After opening up the Intune menu, select the Device Configuration option. This will give you access to where you’ll upload your PowerShell scripts.
The next step will be to set up a Certificate Authority internally. While this isn’t required, it’s recommended so that all the users get the CA configuration. From what I’ve been able to tell, with a CA in place (and duly registered and synced with Azure), multiple users can sign code and make it available for download and execution by users. For a larger IT shop, this is going to be critical. For smaller shops, this may not be needed, but it will make life easier.
If you opt not to set up a CA within the network and sync it to Azure, then you’ll need to upload the certificate being used to sign code, and you can only upload a single certificate.
Once the CA is set up and Azure AD sees it (via AD Sync, I assume), the menu changes so you can download the sync software. This took about 10 minutes for me when setting this up.
These changes are all done using the “Certification Authority” menu option that you see under “Device Configuration.”
Once the Certificate Authority is set up, you can go into the PowerShell scripts section of the screen. From there, just click the “Add” button to add a PowerShell script to Intune.
Once you’ve added a PowerShell script, you can give it a name and point Azure to the signed PowerShell script so it can be run by users. There’s not much under “Settings” to work with.
The first setting is whether this is a user-level script or a system-level script. By default, scripts are run by the system account, but there are a lot of cases where you want things to run at the user level instead, so you’ve got both options available. My script was written as a user-level script, so I set this to “Yes.”
The second setting allows you to force the system to check whether the code is signed by a known code publisher before running it. When I was working with this, I left it at “No,” and everything worked exactly according to plan (I also had a CA set up and synced with Azure and Intune).
After creating the script, the Portal should take you to the details of that specific script. The next step would be to change to the “Assignments” page. This is where you configure which domain groups will have access to download and run the script.
When you select “Assignments,” you can select as many groups as are needed to assign to this specific script. Groups can be synced from on-premises, groups which are AAD/O365 only, or even dynamic groups, so users are added automatically based on how settings for the users are configured.
It may seem like there are a bunch of steps to get this completed, but realistically, once the PowerShell script is written, it took about 5 minutes to set up the script to be pushed out. After that, it was just a matter of waiting for users’ systems to refresh and pick up the change.
The short answer is that yes, there are ports that you’ll want to block outbound by default. There’s a variety of amplification attacks that your systems could unwittingly participate in. These attacks aren’t against your systems, but you run the risk of your machines being used to amplify attacks against others. These could be DNS-based, NTP-based, or other kinds of amplification attacks.
Occasionally I get notifications from Azure that they see these ports open, recommending that I use Network Security Groups to close the unneeded ports.
Two of the ports that I’ve needed to deal with recently are UDP 123 (NTP) and UDP 389 (LDAP/CLDAP). Blocking these was a minor change, but a best practice one.
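As a sketch of what the blocking looks like (the NSG and resource group names here are placeholders), the Az PowerShell module can add an outbound deny rule per port and then save the NSG:

```powershell
# Placeholder names; substitute your own resource group and NSG.
$nsg = Get-AzNetworkSecurityGroup -Name 'prod-nsg' -ResourceGroupName 'prod-rg'

# Deny outbound UDP 123 (NTP) and UDP 389 (LDAP/CLDAP) to the internet.
$rules = @{ 123 = 'Deny-Outbound-NTP'; 389 = 'Deny-Outbound-LDAP' }
$priority = 200
foreach ($port in $rules.Keys) {
    $nsg | Add-AzNetworkSecurityRuleConfig -Name $rules[$port] `
        -Access Deny -Direction Outbound -Protocol Udp -Priority $priority `
        -SourceAddressPrefix '*' -SourcePortRange '*' `
        -DestinationAddressPrefix 'Internet' -DestinationPortRange $port |
        Out-Null
    $priority += 10
}

# Commit the rule changes back to Azure.
$nsg | Set-AzNetworkSecurityGroup
```

Using the 'Internet' service tag as the destination keeps legitimate in-VNet NTP and LDAP traffic (domain controllers, time sync) working while cutting off the amplification path.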
To be clear, there is no inherent risk in being in Azure compared to other platforms. These sorts of amplification issues can come up in any environment. The beautiful thing about Azure is that they monitor these outbound issues and report back to the end user on what blocking needs to be done for a successful implementation.
Tweets, Facebook posts, and blog posts can be powerful things. They have the ability to sway people’s opinions of others, to drive people to buy software, to sell stock, and to make bad decisions.
Posting cranky posts just to get clicks, views, and retweets does nothing useful; it just shows that all you care about is stirring the pot.
There are lots of ways to be constructive without fanning the flames. In the above tweet, the author just craps all over someone, I assume the people who made the service pack, with no context or any follow-up at all. I get that it’s only a tweet with 140 characters, but there are ways to provide context. In our next example, we see exactly how. We have a thank you to Microsoft for the lovely lapel pin/magnet, but a warning to people who aren’t used to handling rare earth magnets that they need to be kept away from kids. As it’s a longer post (from Instagram), there’s a link through to the original, where the rest of the post finishes with “These are dangerous.” The warning is still given, but without crapping all over the fact that someone went through the trouble of sending these out to the MVPs.
I think my message here is, think before you post. Think about how it’s going to impact others. Not just those you want to have read it, but those who did the thing you’re writing about. Maybe rephrase that snarky post, and it’ll have more of the desired impact. I can almost guarantee that the first tweet had no useful impact on the SQL Server product team, whereas the second post would have had much more impact on the MVP team when designing the next round of awards.
With the 8TB SSD drives that Azure now has, which makes more sense: multiple 1TB SSDs or a single 8TB SSD? Well, that depends. The 8TB SSD gives you 7,500 IOPs and 250 MB/sec, but if I take eight 1TB SSD drives, I can get 1,600 MB/sec of throughput and 40,000 IOPs in the same amount of space.
Of course, I need to stripe the eight disks together in Windows, but there’s no cost for that. The cost of eight 1TB drives is slightly higher than one 8TB drive, by 114 pounds in the case of this screenshot. But given the performance difference, it’s a cost worth paying.
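Striping the disks inside the VM is only a few lines of PowerShell with Storage Spaces; a sketch, with made-up pool and volume names, might look like this:

```powershell
# Pool every data disk that isn't already claimed, stripe them
# ('Simple' resiliency = striping with no redundancy; Azure storage
# already provides the redundancy underneath), and format one volume.
$disks = Get-PhysicalDisk -CanPool $true

New-StoragePool -FriendlyName 'DataPool' `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks $disks

New-VirtualDisk -StoragePoolFriendlyName 'DataPool' -FriendlyName 'DataDisk' `
    -ResiliencySettingName Simple -UseMaximumSize |
    Get-Disk |
    Initialize-Disk -PassThru |
    New-Partition -AssignDriveLetter -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel 'Data'
```

With the simple (striped) layout, reads and writes spread across all eight disks, which is where the combined 40,000 IOPs and 1,600 MB/sec come from.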
So why would I want the 8TB drive? Because I have a GS5 that needs half a petabyte of storage. There’s no “easy” way to do that with 1TB drives. If/when we get P70+ drives, things will get really interesting.
Everyone takes shortcuts. It’s normal, but we shouldn’t be doing it. Shortcuts come with disadvantages. Sometimes the result doesn’t look pretty, sometimes the shortcuts cause performance problems, sometimes they cause bugs in software. Sometimes they cause applications to fail. Our job as IT professionals isn’t to do what’s easy. It’s to do what’s in the best interest of the system or the company.
Stop putting staples in plants. Stop taking shortcuts.
Today is Day 1 at the PASS Summit, and there’s going to be all sorts of blog posts all about what’s being announced during the keynote today (I assume). I’ll leave those announcements for others to blog about.
Denny Cherry & Associates Consulting has a big announcement to make today as well. Starting today, DCAC is expanding by adding another fantastic consultant to our ranks. This time we’re adding John Morehouse to our growing family. Like the rest of us, John will be working from home which means a co-worker in Kentucky (yea, another set of state paperwork to fill out every month, thanks, John).
John flew in this morning to join the rest of the team for his first day at the “office,” which we appreciate. Leaving the two little ones for a very early flight the morning after Halloween couldn’t have been the most fun thing ever.
John has 20 years of IT experience, with over ten years of dedicated SQL Server experience, making him an excellent addition to the DCAC organization and bringing our in-house team up to about 100 years of IT experience.
When John isn’t traveling to SQL Saturday events, his hobbies include spending time with his kids, reading, and vacationing.
We welcome John to DCAC. Come to the exhibit hall and to booth 316 to get some great swag and say hi to John.