The first step was to back up the SQL Server database and restore it to the new SQL Server. No problems there, just a normal SQL Server backup and restore operation (just make sure that you don’t restore over an existing database). The next thing to do was to get the data from the SQL Server database that SharePoint didn’t know anything about into a SQL Server database that SharePoint did know about. This is done by going into Central Administration and selecting the “Backup and Restore” option, then within the “Granular Backup” section selecting “Recover data from an unattached content database” as shown below.
On the next screen give SharePoint the server, database and authentication information for the database you just restored. I then selected the “Export site or list” radio button at the bottom and clicked “Next”. The next screen allows you to select specific sites to export. In my case I wanted everything, so I changed the “Site” drop-down menu to “No selection” and entered a filename. I didn’t bother with security as we were changing Active Directory domains anyway. I left the default option of exporting all versions so that any version history would be maintained, and clicked “Start Export”.
Depending on how large the content database is you may have time for coffee, dinner or to take the weekend off.
When it was finished I discovered some really big annoyances. The biggest was that the template used for the old SharePoint farm was different from the template for the new SharePoint farm, which meant that I wasn’t able to import the data directly like I wanted to. The new farm was created using the template “SPSMSITEHOST#0” while the old farm used the template “ENTERWIKI#0” (I found this out when I tried to use the Import-SPWeb PowerShell cmdlet to import the data).
So in order to get the data imported as quickly as I could, the solution that I came up with was to create new Site Collections for the two sites that needed to be restored. To do this you go into Central Administration and select “Application Management” from the menu on the left. Then under “Site Collections” select “Create Site Collection”. Now on the left half of the screen, in the “Web Site Address” section, you can click “Define Managed Paths”; do that and create a managed path for the URL you want, in this case /hr. The type of path should be Explicit inclusion as shown below.
After clicking Add Path then OK go back into the “Create Site Collection” screen. On this screen give the site collection a name, in this case “Human Resources” (you can imagine that HR wanted this back up quickly). Select /hr (or whatever you just created) in the URL field. For the template either select the template if you know what it is, or select the custom tab and choose “< Select template later… >”. Set the primary and secondary admins and the quota template as needed and click OK.
You’ve now prepped the production SharePoint site to accept the data. Now comes the really annoying part. There’s no way (that I could find) to import just part of the site from the backup file, and I didn’t want a site collection but rather a subsite. In order to get just a subsite out of the backup file I had to create a temporary web application (another TCP listener) for SharePoint that was separate from the main site. This is also done from Central Administration by clicking on “Application Management” in the left hand menu. Then on the right under “Web Applications” click the “Manage web applications” link. From here you can create a new Web Application by clicking the New button at the top left of the screen. Give it the TCP port and the IIS web site name as needed, with the security information required by your company’s policies. You’ll then need to go into Application Management and, under Databases, use the “Manage content databases” link to add a content database to the new Web Application. You can then restore the entire old SharePoint site to this new temporary Web Application by using the Import-SPWeb PowerShell cmdlet as shown below.
Import-SPWeb http://site:port -Path 'c:\backups\YourBackupFile.cmp' -Force
Once that is done we can now backup just the specific sites that we want to move. Go back into Central Administration and select “Backup and Restore”. Under “Granular Backup” click the “Export a site or list” link (you can see it in the first screenshot above). In the backup screen select the Site Collection and Site that you want to copy to the new farm, specify a filename to backup to (I used c:\denny_backups\hr\hr.cmp) and click “Start Export”. Once it’s finished use import-spweb to import the data into the production site as shown below.
Import-SPWeb http://site/hr -Path 'c:\denny_backups\hr\hr.cmp' -Force
Repeat these last steps for each site that you need to move (and that you have already created site collections for).
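If there are more than a couple of sites to move, the repetition can be scripted. This is just a hypothetical sketch; the site names, URLs and backup paths are examples, and it assumes each export file is named after its managed path and that the matching site collections already exist.

```powershell
# Import each exported site into its pre-created site collection.
# Site names, URLs and paths below are examples only.
$sites = @('hr', 'finance')
foreach ($site in $sites) {
    Import-SPWeb "http://production/$site" `
        -Path "c:\denny_backups\$site\$site.cmp" -Force
}
```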
Needless to say, when this was all done the client was happy because everything was back up and running (after doing a little bit of DNS changing), so all in all it was a successful day.
Officially the answer is no, however with a little bit of creative configuration you sure can.
The Overall Environment
To set up SQL Server Express in a Windows cluster, I’m building this on a two node Windows Server 2012 cluster, using a file share hosted on my domain controller to host the actual databases. To ensure that the domain controller is rebooted as little as possible, it is installed in core mode. The cluster nodes are Windows Server 2012 Standard Edition (which now supports clustering), as is the domain controller.
As SQL Server 2012 Express Edition doesn’t support Windows clustering out of the box, the installation will be a little different from a normal clustered install under Standard or Enterprise Edition. To install, I did a normal SQL Express install on node1. The only change from a normal install was that I configured the SQL Server instance to start under a domain account. When I got to the data directories part I pointed the data folder at a network share on the domain controller.
Once the installation on node1 was completed I stopped the SQL Server services. Then I renamed the folder that I installed the SQL Server database files into. The reason for this is that I need to configure the second instance to put the database files into the same location. I can then install SQL Server 2012 express edition onto the second node.
The installation on node2 is done exactly like it was done on node1.
Once the installation is done on both nodes, configure the SQL Server service with a startup type of “Manual” instead of disabled or automatic. Leave the SQL Agent service disabled; even though SQL Express installs the SQL Agent, the SQL Agent isn’t supported on SQL Express.
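One way to script those service changes on each node might look like this. This is a sketch that assumes the default SQLEXPRESS named instance; adjust the service names for your instance.

```powershell
# Set the SQL Server service to Manual and keep the Agent disabled
# (assumes the default SQLEXPRESS named instance).
Set-Service -Name 'MSSQL$SQLEXPRESS' -StartupType Manual
Set-Service -Name 'SQLAgent$SQLEXPRESS' -StartupType Disabled
```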
Once the installation on node2 is done, the cluster can be configured. To do this bring up Failover Cluster Manager on one of the nodes and connect to the cluster. If the cluster hasn’t been configured yet, run through the normal cluster configuration wizard.
We’ll now configure a new cluster role on the cluster. To do this right click on “Role” then select “Configure Role” from the context menu as shown below.
When the wizard opens click next to get to the list of services. Then select the Generic Service item from the list as shown below.
On the next screen you’ll be asked what service you wish to cluster. From this list select the SQL Server service as shown below.
On the next screen you’ll be asked to name the resource group. Give the group a name which is unique on the domain and click Next. The next screen will ask you to select the needed storage; simply click Next, as we aren’t using any local shared storage. The screen after that asks if any registry settings need to be replicated between the machines. SQL Server doesn’t make much use of the registry for the actual SQL Server service, so nothing needs to be replicated and you can click Next here as well. The next screen is a review of the changes which will be made; click Next after reviewing the information. When the summary screen displays, click Finish.
Post Clustering SQL Config Changes
The first change that you’ll need to make is to enable the TCP network protocol on both nodes. By default SQL Express has the TCP network protocol disabled, which needs to be corrected before users will be able to connect to the SQL Server service.
The next change that you’ll need to make is to change the local server name in the master database from the name of the last node which was installed to the cluster name using a script similar to the one shown below. In the case of this script the nodes are named node1 and node2 and the cluster name is clustersql. Once this script has been run the SQL Server instance should be restarted or failed over to the other node.
exec sp_dropserver 'node2'
exec sp_addserver 'clustersql', local
At this point the cluster is up and running and applications can have their databases configured on the SQL Server Instance.
With SQL Server 2012’s AlwaysOn feature we have an active server called the Primary Replica and up to four secondary replicas. Even if none of those four secondary replicas are in use for anything, you will still need to license two of them to be properly licensed. This is because when licensing SQL Server, each licensed server gets you only a single free passive server. So for a five instance AlwaysOn Availability Group deployment you’ll need to license at least three of those instances, which covers the remaining two passive instances. As long as those two passive instances aren’t being used for read access they are free.
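The arithmetic behind that rule can be sketched as follows. This is a simplification that ignores readable secondaries (which would need their own licenses); the function name is mine, not an official licensing formula.

```python
import math

def licenses_needed(total_instances):
    """Each license covers one active instance plus one free passive
    instance, so (ignoring readable secondaries) the licenses needed
    is the ceiling of half the total instance count."""
    return math.ceil(total_instances / 2)

# A five-instance AlwaysOn Availability Group deployment:
print(licenses_needed(5))  # 3 licensed instances, 2 free passives
```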
The podcast episodes will be coming out Tuesday mornings (Pacific Time). Once I get the first ones out I’ll be working on getting it submitted to iTunes so that you can pick it up there on your favorite fruity device.
The topics will vary depending on who my victim guest is that week. We could be talking about Windows, SQL Server, Xbox, Surface, .NET, etc. or pretty much anything else that comes to mind.
These are totally un-scripted and for the most part un-edited. So come check them out. I promise mild entertainment at the least.
While the site went live today the first episode will go live on September 11th. My first guest is Karen Lopez. You can find out a bit more about the show over on the People Talking Tech website (www.peopletalkingtech.com).
Do keep in mind as you read this, I’m not a PC gamer, so I’m not pushing my systems to 100% all the time. I’m a normal IT worker so I’ve got a bunch of pretty random apps installed but I don’t push the systems to their limits at all. Also I’ve got SSD drives in all the machines so the disk speed isn’t a bottleneck.
The first thing that you’ll notice with Windows 8 when you log in is that the interface has changed a bit from the beta versions and Windows 7. The default theme is more “metro styled” with white borders around all the applications instead of the more translucent borders that there were in the Windows 8 beta and preview releases.
From a day to day usability perspective I haven’t really had any problems with Windows 8. I got really lucky on the driver side of things as both my laptops and my desktop were fully supported. I’m still waiting for HP to get around to releasing updated drivers, but that isn’t surprising. As Hyper-V is now a feature of Windows 8 you can simply create a Windows XP virtual machine (you’ll need a Windows XP license and install media for this) to install the HP scanner software (for example) so that things like your HP scanner work. Windows 8 did a great job of finding my network printer and installing and configuring it automatically for me.
All of the things that I need to do on the machine I can do easily. There are a few annoying things to get used to; for example, you can’t hit the Windows key on the keyboard and then click Control Panel, as it isn’t there any more. The easiest way to get into the Control Panel is to open My Computer, click the Computer tab at the top, then click the “Open Control Panel” icon.
Most of the old keyboard shortcuts from Windows 7 and earlier still exist. In fact there’s a list of keyboard shortcuts available here.
So far I haven’t really had any major application compatibility issues to speak of. The big annoying one was the Cisco VPN installer (big shock, I know) which would either crash, or crash the machine (usually the machine). The fix was pretty easy: just run the installer in Windows 7 compatibility mode. After that there’s a registry value that needs to be changed manually to get it to work (talked about here). The value that needs to be fixed can be found at HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\CVirtA. Change the DisplayName value to either “Cisco Systems VPN Adapter” or “Cisco Systems VPN Adapter for 64-bit Windows” (depending on whether you have a 32-bit or a 64-bit install).
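That registry edit can also be scripted rather than done by hand in regedit. A sketch of the same change via PowerShell (the value data shown is for a 64-bit install; use “Cisco Systems VPN Adapter” for 32-bit):

```powershell
# Fix the CVirtA DisplayName value described above (64-bit data shown).
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\CVirtA' `
    -Name 'DisplayName' -Value 'Cisco Systems VPN Adapter for 64-bit Windows'
```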
The only other issue that I’ve run into so far is that the VMware vSphere management tool doesn’t allow me to view the desktop of the virtual machines; there’s some conflict, probably with Visual Studio (which is a requirement of the full SQL Server 2012 tools). So far I haven’t been able to find any long-term solution to this problem. Some have reported that reinstalling vSphere fixes it, but that hasn’t worked for me yet.
The other applications which I use on a daily basis have been working pretty smoothly. Things like the Windows VPN, IE, FireFox, SQL Server Management Studio, Office 2010 and 2013 beta, QuickBooks, Skype and VMware Workstation all appear to be working together without an issue at all.
One thing that I have noticed is that copying files between machines is much faster when going Windows 8 to Windows 8. So doing things like backing up my VMs from one machine to another is very quick.
When it comes to battery life I’ve had a pretty good experience. It seems like Windows 8 is less CPU intensive than Windows 7 was so my battery lasts a little bit longer than it did with Windows 7.
Windows 8 has a really nice file transfer dialog box (shown below) which makes it a lot easier to figure out what’s going on with file transfers. If you start up multiple transfers they will all be stacked into one window, instead of having lots of different windows one for each transfer.
Overall I’ve been pretty happy with Windows 8 (I must be, as I upgraded to it on day 1). Give it a try. I admit that the Start menu and other changes will take a little getting used to, but there’s no way around them, and I’m guessing that they aren’t going anywhere any time soon.
Find the n:\sources\cversion.ini file on the media and open it with notepad. Change the two numbers from 8508 to 7100 and save and close the file.
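For reference, the edited file would end up looking something like this (the exact original values depend on your media; the section and value names below are as they appear on Windows 8 install media):

```ini
[HostBuild]
MinClient=7100.0
MinServer=7100.0
```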
Now run the installer from within Windows 8 Beta/CTP/RC/RP (or whatever) and you’ll be nicely upgraded to Windows 8. I would imagine that this works for Windows 2012 when that is released as well.
P.S. Thanks to Jason Fay (blog | @jfay_dba) for reminding me that this worked in Windows 7 and testing it faster than I could for Windows 8.
I’m afraid that I’ve got some bad news. You can no longer pre-order Securing SQL Server 2nd Edition from Amazon.
Instead you have to settle for ordering the book outright and having it shipped to you. That’s right, it’s no longer a pre-order; the book is published and available to be shipped directly to you. Currently Amazon is selling the book at full price, which is $49.95, but if you have Amazon Prime it is available for Amazon Prime shipping. Because it is considered to be a textbook you get a $5 Amazon MP3 credit (whatever terms and conditions Amazon chooses do apply).
This is a totally updated edition of the book including all sorts of new information about security within SQL Server 2012. I of course cover things like how to secure AlwaysOn Availability Groups, how to use user defined server roles, contained users, etc. I also dive into how to properly secure SQL Server Reporting Services and SQL Server Analysis Services so they can’t be used to access data that people shouldn’t have access to.
All in all this book is much larger, with Amazon showing it at 408 pages compared to just 272 pages for the 1st edition. If you find it cheaper somewhere else, make sure that you are in fact ordering the second edition. The ISBN is 1597499471.
I hope that you pick up a copy of the book and that it is useful for you in securing your SQL Server environment.
There is, however, some good news to this little problem. The good news in this case is that you can create a SQL Server availability group, which has an availability group listener, without putting any databases in it. This is done by creating the availability group without using the wizard that is available within SQL Server Management Studio. Instead of starting the wizard, select the “New Availability Group…” option from the Availability Group context menu as shown below.
This will allow you to create an Availability Group without any databases and with only a single replica. Once the availability group is created, the listener can be created for the availability group. The 3rd party application can then use the listener to connect to the database engine and create the database. Once created, the database can be added to the availability group, as can the additional replicas.
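The same empty availability group can also be created in T-SQL rather than through the dialog. A rough sketch (the group name, replica name, endpoint and listener IP below are all examples, not values from this environment):

```sql
-- Create an availability group with no databases and one replica,
-- then add the listener the application will connect to.
CREATE AVAILABILITY GROUP [AppAG]
    FOR  -- note: no DATABASE list, so the group starts empty
    REPLICA ON N'NODE1' WITH (
        ENDPOINT_URL = N'TCP://node1.contoso.local:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE = MANUAL);

ALTER AVAILABILITY GROUP [AppAG]
    ADD LISTENER N'AppListener' (
        WITH IP ((N'10.0.0.50', N'255.255.255.0')), PORT = 1433);
```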
While using this technique is a lot harder than going through the wizard, as it requires that the database backups and restores be done manually and the configuration all be done by hand, it meets the application’s requirement of not changing the connection string.
While I love using #1 as it’s the easiest and usually the quickest way to move huge amounts of information it’s the most expensive, and it isn’t very repeatable. If there’s a problem with the data transfer or you need to redo the data transfer then you need to purchase another USB drive from the first provider and have them ship it up to the new site. As this all needs to be done pretty quickly that means that every time you do it you need to pay for overnight shipping which gets expensive, fast. Not to mention that either you need to be at the destination site or you have to pay for remote hands to open the box and connect the hard drive. In this case that means paying a few hundred bucks to have a guy at the remote site unbox the drive and connect it as the data center is in New Jersey and I’m in California a short 5.5 hour plane flight away.
Option #2 that I give here is a decent option as well, except that it’s only single-threaded unless you do some really interesting stuff to spin up multiple copies of robocopy. The reason that you want multiple threads running is that most Managed Service Providers have some sort of Quality of Service settings configured on their routers so that one connection isn’t able to take all the bandwidth available. In this case each connection is limited to about 500 KB of bandwidth, so if I run several threads I get more throughput than if I run just a single thread.
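The “interesting stuff” might look something like this: launch one robocopy process per top-level folder so each copy gets its own connection. The paths are examples only, and this is a sketch rather than a production script (no throttling or error handling).

```powershell
# Start one robocopy instance per top-level folder so each gets its
# own connection through the provider's per-connection QoS limit.
Get-ChildItem 'D:\Backups' -Directory | ForEach-Object {
    Start-Process robocopy -ArgumentList @(
        $_.FullName, "\\newsite\backups\$($_.Name)", '/E', '/Z', '/R:5')
}
```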
Which leads me to option #3. As I’m moving lots of database files it’s easy enough for me to do multi-threaded FTP, as I can send each file separately, getting better bandwidth (currently I’m pushing 1,662.90 KB per second). I do this not with the native command line FTP or with the web browser, but by using a little FTP application, long abandoned by its developer, called LeechFTP. While it hasn’t been upgraded in years, and some of the error messages aren’t in English, it’s a great app for moving massive amounts of data in a multi-threaded process.
Now because FTP totally sucks when it comes to resuming from a failed upload, I add an additional step into the process: I take whatever data I’m planning on transferring and use either WinRAR or 7-Zip to break the files into smaller chunks. Typically I’ll just take the entire folder that has all the backups and make one massive upload set out of it. I usually break the files into 100 meg segments, as those will usually upload without any sort of interruption, and if there is a problem, re-uploading 100 megs’ worth of data usually won’t take all that long. Now I don’t bother to compress the data; I just put it into a single large rar or 7z fileset. The reason that I don’t bother trying to compress the data is that it’ll take hours to compress and the time saved is usually pretty small if any (especially as these backups are already compressed). Both WinRAR and 7-Zip have store-only options which usually run pretty quickly. The example 581 gigs of data that I’m working with here was able to be stored by 7-Zip in about an hour and a half.
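The store-only split described above is a one-liner with the 7-Zip command line tool (the paths here are examples; -mx0 means store only, no compression, and -v100m produces 100 MB volumes):

```powershell
# Build a store-only, 100 MB-per-volume 7z fileset ready for upload.
& 'C:\Program Files\7-Zip\7z.exe' a -mx0 -v100m `
    C:\upload\full_backups.7z D:\Backups\Full\
```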
I’ve begun using 7-Zip instead of WinRAR for these projects, as I’ve found something very annoying about WinRAR when using it to copy files up to a network share (like, say, the shared network drive that the SQL Server is going to restore from). When WinRAR decompresses all the data it wants to put it into the local %temp% folder, which ends up filling the C drive of whatever server you are doing the work on, while 7-Zip doesn’t have this annoying “feature”.
Once the full backups are copied up (~4 days in this case) I just unpack them and get the restore process started (keep in mind that I’ve got log backups being copied across the same network link as well). I’ll talk about how I get them across later on.
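As a sanity check, the throughput figure quoted earlier lines up with that four-day copy:

```python
# Back-of-the-envelope check: 581 GB at the ~1,663 KB/s aggregate FTP
# throughput quoted above works out to roughly four days.
size_kb = 581 * 1024 * 1024      # 581 GB expressed in KB
throughput_kb_s = 1663           # multi-threaded FTP rate from above
days = size_kb / throughput_kb_s / 86400
print(round(days, 1))  # ~4.2 days
```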