Replicating a test / lab environment hosted on Microsoft Azure with Azure PowerShell

In my last post, I covered how to stand up a small test / lab environment in Azure using Azure PowerShell.  If you have not seen that post, you can access it by clicking here.

In this post I want to cover another script I’ve created, one that automates replicating a lab environment into a new, isolated lab environment.  The replica will have its own affinity group, its own virtual network, and its own dedicated storage account.

The client I built the initial set of these scripts for wanted to offer their customers a trial of their enterprise software in a hosted Azure lab environment.  So, this script is designed to take what we call the “gold lab” and clone it to a new lab that can be handed over to a customer for testing purposes.

Just like the previous script, this script requires that you manually create the virtual network in Azure first.  The pre-deployment notes in the script should explain this in sufficient detail.  Next, you need to choose the variables that the script will run with by editing the script.

  • $labName is the name you wish to use for the destination (new) lab.  As with the first script, this name will either become the name of the components, or be prepended to them.
  • $azureLocation is the Azure region where the destination (new) lab will be deployed.  I’ve not yet tested replication to a new region.  So, for my purposes this region has been the same region that the original “gold” lab was in.
  • $instanceSize is the instance size you wish for your destination (new) machines to be assigned when they are provisioned.
  • $sourceStorageAccount is the source storage account where the source “gold lab” vhds live.  If you used my previous script to create your gold lab, this will be the $labName you used in that script.
  • $sourceVM1Disk and $sourceVM2Disk are the source VHDs that you want to copy.  Unfortunately, at this time you will need to enter these manually.  I’d like to improve this in the future.  You can get the names by browsing your source storage container.
  • $sourceContainer is the source container where the VHDs live.  $destinationContainer is the destination container where the new VHDs will live.  Normally this will be set to “vhds” but I wanted to offer these as variables so that if you prefer to keep your VHDs in another container, you can still use this script easily.
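If you would rather not browse the source storage container in the portal to find the blob names for $sourceVM1Disk and $sourceVM2Disk, something like this should work.  This is a sketch using the same cmdlets the script itself uses; the "lab102" and "vhds" values are just this post's example names.

```powershell
# List the VHD blobs in the source container so the names can be pasted
# into $sourceVM1Disk / $sourceVM2Disk.
$key = (Get-AzureStorageKey -StorageAccountName "lab102").Primary
$ctx = New-AzureStorageContext -StorageAccountName "lab102" -StorageAccountKey $key
Get-AzureStorageBlob -Container "vhds" -Context $ctx | Select-Object Name
```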

Once you have those variables set, your source VMs powered off, and the destination virtual network manually created, you should be ready to roll.
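If you want to script the source shutdown too, here is a hedged sketch.  It assumes the source lab was built with my previous script, so the service and VM names follow that script's convention (the source lab's $labName plus "vm1" / "vm2"); adjust the names to match your lab.

```powershell
# Hypothetical names based on the previous script's naming convention.
# -StayProvisioned keeps the VMs' internal IP assignments (at the cost of
# continued compute billing while they are stopped).
Stop-AzureVM -ServiceName "lab102vm1" -Name "lab102vm1" -StayProvisioned
Stop-AzureVM -ServiceName "lab102vm2" -Name "lab102vm2" -StayProvisioned
```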

Oh, before you run the script… Please remember that I am in no way responsible for what this script might do. I believe it to be useful, and generally safe. You should make sure you are 100% comfortable with what this script is doing before you begin. Ok – now that the lawyers can sleep again tonight, here we go. When you run the script you should see output something like this.

lab-replica

You now have a duplicate of your initial gold lab running in an isolated environment on Azure.  In my case, this includes an Active Directory (AD) Domain Controller (DC) as well as some other software that generally does not respond well to being cloned.  However, because the script makes an exact copy of the disks and spins the new VMs up in a new, identically configured, isolated lab (where it cannot communicate with the source lab), things work fine for testing purposes.  Keeping the internal IP address assignments identical is key to making this work.  In my case, I go back in and edit my virtual network after creation to make sure the DNS server it hands out via DHCP points at the IP reserved for my AD DC.  This is obviously required for AD to behave.  This may or may not be required in your situation.

I hope this is helpful to you.  I am sure this script could be improved.  However, I also hope it is in good enough shape to be helpful to some of you.  If you see ways it could be made better, please let me know.  Keep an eye on my blog for my lab destruction script that I’ll be posting soon.  Until then, be sure to turn off / remove things you don’t want Azure to bill you for.

#region Notes
# Lab Cloner - V2 Built on 2/7/2015
# Built by David Winslow - wdw.org
#
# Azure PowerShell General Notes / Commands
#		[Console]::CursorSize = 25 (Makes the cursor blink so you don't go insane.)
#		Add-AzureAccount (Connects Azure PowerShell up to your Azure account)
# 
# Pre-Deployment Notes:
#    Run Add-AzureAccount (Connects Azure PowerShell up to your Azure account) before running this script.
#    The destination network for the clone must be built manually in Azure portal before running this script.  
#      The network name must match $labName below.
#      Address space: 10.20.0.0 /16
#      Subnet-1 10.20.15.0 /24
#      Subnet name must be Subnet-1
# You need to choose values for the variables below.
#endregion

#region Pre-Deployment Variables
$labName = "lab104"
# This name will be used entirely for, or prepended to, most components.
# Must be all lower case letters and numbers only.  
# Must be Azure globally unique.
# Must not exceed 11 characters in length.

$azureLocation = "East US 2"
# This is the location that resources will be created in.

$instanceSize = "Standard_D2"
# This determines what size the destination VM instances will be when they launch.

$sourceStorageAccount = "lab102"
# This sets the source storage account.

$sourceVM1Disk = "lab102vm1-lab102vm1-2015-2-13-10-55-45-799-0.vhd"
# This sets the 1st source disk.

$sourceVM2Disk = "lab102vm2-lab102vm2-2015-2-13-10-53-38-795-0.vhd"
# This sets the 2nd source disk.

$sourceContainer = "vhds"
# This is the container the source disks are in.

$destinationStorageAccount = $labName
# This sets the destination storage account to the $labName

$destinationContainer = "vhds"
# This is the container the destination disks will be placed in.
#endregion


#region Write Pre-Deployment Variables to screen
Write-Output "============================"
Write-Output "Cloning Lab."
Write-Output "============================"
Write-Output " " 
Write-Output "Source storage account set to:" $sourceStorageAccount
Write-Output " " 
Write-Output "Source VM Disk 1 set to:" $sourceVM1Disk
Write-Output " " 
Write-Output "Source VM Disk 2 set to:" $sourceVM2Disk
Write-Output " " 
Write-Output "Source container set to:" $sourceContainer
Write-Output " " 
Write-Output "Destination storage account $labName will be created."
Write-Output " " 
Write-Output "Destination container $destinationContainer will be created."
Write-Output " " 
Write-Output "Destination instance size set to:" $instanceSize
Write-Output " " 
Write-Output "Destination lab name set to:" $labName
Write-Output " " 
Write-Output "Destination Azure lab location set to:" $azureLocation
Write-Output " " 
#endregion

#region Assigning Azure Subscription
Write-Output " "
Write-Output "=================================="
Write-Output "Assigning Azure Subscription"
Write-Output "=================================="
Select-AzureSubscription "Pay-As-You-Go"
#endregion

#region Provisioning Affinity Group
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Affinity Group"
Write-Output "=================================="
New-AzureAffinityGroup -Name $labName -Location $azureLocation
#endregion

#region Provisioning Destination Storage Account
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Destination Storage Account"
Write-Output "=================================="
New-AzureStorageAccount $labName -AffinityGroup $labName
#endregion

#region VHD Blob Copy
Write-Output " " 
Write-Output "============================"
Write-Output "Starting VHD Blob Copy"
Write-Output "============================"
# Get Source Keys
$sourceStorageAccountKey = (Get-AzureStorageKey -StorageAccountName $sourceStorageAccount).Primary
$destinationStorageAccountKey = (Get-AzureStorageKey -StorageAccountName $destinationStorageAccount).Primary
$sourceContext = New-AzureStorageContext -StorageAccountName $sourceStorageAccount -StorageAccountKey $sourceStorageAccountKey
$destinationContext = New-AzureStorageContext -StorageAccountName $destinationStorageAccount -StorageAccountKey $destinationStorageAccountKey

#create the destination container
New-AzureStorageContainer -Name $destinationContainer -Context $destinationContext

# VM1Disk
$blobCopy = Start-AzureStorageBlobCopy -DestContainer $destinationContainer `
                        -DestContext $destinationContext `
                        -SrcBlob $sourceVM1Disk `
                        -Context $sourceContext `
                        -SrcContainer $sourceContainer

# VM2Disk
$blobCopy = Start-AzureStorageBlobCopy -DestContainer $destinationContainer `
                        -DestContext $destinationContext `
                        -SrcBlob $sourceVM2Disk `
                        -Context $sourceContext `
                        -SrcContainer $sourceContainer
#endregion
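One thing to be aware of: Start-AzureStorageBlobCopy is asynchronous, so for large VHDs the copies may still be running when the script reaches the Add-AzureDisk calls below.  A hedged addition (not in the original script) that blocks until both copies have finished:

```powershell
# Wait for the async blob copies to complete before registering the disks.
# -WaitForComplete polls the copy status until it reports Success or failure.
Get-AzureStorageBlobCopyState -Blob $sourceVM1Disk -Container $destinationContainer `
    -Context $destinationContext -WaitForComplete
Get-AzureStorageBlobCopyState -Blob $sourceVM2Disk -Container $destinationContainer `
    -Context $destinationContext -WaitForComplete
```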

#region Disk Additions
Write-Output " " 
Write-Output "============================"
Write-Output "Starting Disk Additions"
Write-Output "============================"
# Add disks from the just cloned blobs
$VM1disk = $labName + "VM1.vhd"
$VM2disk = $labName + "VM2.vhd"
$VM1diskLocation = "https://" + $labName + ".blob.core.windows.net/" + $destinationContainer + "/" + $sourceVM1Disk
$VM2diskLocation = "https://" + $labName + ".blob.core.windows.net/" + $destinationContainer + "/" + $sourceVM2Disk
Write-Output $VM1disk
Write-Output $VM2disk
Write-Output $VM1diskLocation
Write-Output $VM2diskLocation
Add-AzureDisk -DiskName $VM1disk -OS Windows -MediaLocation $VM1diskLocation -Verbose
Add-AzureDisk -DiskName $VM2disk -OS Windows -MediaLocation $VM2diskLocation -Verbose
#endregion

#region Assigning Azure Subscription
Write-Output " "
Write-Output "=================================="
Write-Output "Assigning Destination Azure Subscription and Storage Account"
Write-Output "=================================="
Set-AzureSubscription -SubscriptionName "Pay-As-You-Go" -CurrentStorageAccount $labName
#endregion

#region Provisioning Reserved IP Addresses
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Reserved IP Addresses"
Write-Output "=================================="
$VM1IP = $labName + "VM1IP"
$VM2IP = $labName + "VM2IP"
New-AzureReservedIP -ReservedIPName $VM1IP -Label $VM1IP -Location $azureLocation
New-AzureReservedIP -ReservedIPName $VM2IP -Label $VM2IP -Location $azureLocation
#endregion


#region Starting VM Creation
Write-Output " " 
Write-Output "============================"
Write-Output "Starting VM Creation"
Write-Output "============================"
$VM1 = $labName + "VM1"
$VM2 = $labName + "VM2"
New-AzureVMConfig -Name $VM1 -InstanceSize $instanceSize -DiskName $VM1disk | Set-AzureSubnet 'Subnet-1' | Set-AzureStaticVNetIP -IPAddress 10.20.15.5 | Add-AzureEndpoint -Name "RemoteDesktop" -Protocol "tcp" -PublicPort 33899 -LocalPort 3389 | Add-AzureEndpoint -Name "PowerShell" -Protocol "tcp" -PublicPort 60208 -LocalPort 5986 | New-AzureVM -ServiceName $VM1 -AffinityGroup $labName -ReservedIPName $VM1IP -VNetName $labName 
New-AzureVMConfig -Name $VM2 -InstanceSize $instanceSize -DiskName $VM2disk | Set-AzureSubnet 'Subnet-1' | Set-AzureStaticVNetIP -IPAddress 10.20.15.6 | Add-AzureEndpoint -Name "RemoteDesktop" -Protocol "tcp" -PublicPort 33900 -LocalPort 3389 | Add-AzureEndpoint -Name "PowerShell" -Protocol "tcp" -PublicPort 60209 -LocalPort 5986 | New-AzureVM -ServiceName $VM2 -AffinityGroup $labName -ReservedIPName $VM2IP -VNetName $labName
#endregion

Automating the creation of a test / lab environment hosted on Microsoft Azure with Azure PowerShell

A couple of weeks ago a client who is a Microsoft Gold ISV partner approached me about building out a test lab environment in Microsoft Azure.  For those of you who may not know, Microsoft Azure is Microsoft’s public cloud offering (IaaS / PaaS / etc).

Ultimately, my client wanted an environment they could prepare once, then replicate.  The client’s goal is to be able to provide some of their potential customers with a live trial of the enterprise software they create.  So, a big public cloud provider like Azure was a perfect fit.  Obviously, you could also use AWS / Google Cloud or any one of a number of big IaaS providers for this purpose.  However, since this client is a Microsoft Partner, and mostly a Microsoft shop overall, Azure was the direction they wanted to go.

Since the client ultimately wanted to automate the replication of these environments, I set out to find the best way to do that.  It did not take long to figure out that Microsoft Azure PowerShell offered the exact functionality I needed.  So, I decided to dive in and learn a bit while doing this.

I decided that if I was ultimately going to replicate these environments using PowerShell, that I should go ahead and build the initial “gold” environment using PowerShell as well.  The result of that effort is the script below.

In the next few days (once I have the bugs worked out) I’ll be posting scripts that can be used to replicate a lab, as well as destroy a lab once you are finished.  If I have time, I’m also going to create some simple scripts that can be used to power a lab down and power it back up as well.

If you ever wanted to build a lab setup quickly in PowerShell perhaps this will help you too.  It’s unlikely my script is perfect for your project, but perhaps it will serve as a starting point for something you can customize to fit your needs.  Here are a few things to mention before you get started.

  1. Azure costs money.  So, if you are going to test this be sure you know what you are doing and you know how to turn things off / remove them when you are finished.
  2. You can sign up for a free $200 trial which is a great way to start.
  3. You will need Azure PowerShell which you can download here.
  4. Once you have Azure PowerShell downloaded and installed you will need to connect it to your Azure account.  To do that simply run:
     Add-AzureAccount
    

    This will pop up a box that will ask you to authenticate to your Azure account.

  5. If this is your first time using Azure PowerShell you may also need to run the command below to correctly target your Azure PowerShell session at the correct subscription:
    Select-AzureSubscription -SubscriptionName "Pay-As-You-Go"
    
  6. My script relies on you manually building out a virtual network in Azure for these VMs to live in first.  What is needed should be clear from the Pre-Deployment Notes section of my script.  At this time, Microsoft does not seem to offer a great way to automate the creation of this virtual network (if they do, I am missing it).
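For what it's worth, the classic virtual network can technically be scripted with Set-AzureVNetConfig.  The reason I hesitate to automate it is that the cmdlet replaces the subscription's entire network configuration, which is risky in a shared subscription.  A sketch of the approach, with a hypothetical file path:

```powershell
# CAUTION: Set-AzureVNetConfig replaces the subscription's ENTIRE network
# configuration.  Export the current config first, edit it, then re-import.
Get-AzureVNetConfig -ExportToFile "C:\temp\netcfg.xml"
# ...edit netcfg.xml to add a network named after $labName with the
#    10.20.0.0/16 address space and a Subnet-1 subnet of 10.20.15.0/24...
Set-AzureVNetConfig -ConfigurationPath "C:\temp\netcfg.xml"
```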

Once you have all that done, you should be ready to run the script.  The sample script below will build two Azure VMs.  You can choose the following variables in the Pre-Deployment Variables section of the script before running it.  Those variables are:

  • $labName – This is essentially the unique ID of this lab.  That way, when you use this tool over and over again to create hundreds of labs that are infinitely useful to you and your organization, you will be able to tell all of the components apart.  Sorry – I got a bit carried away there.  This $labName will either be the name that is used for components (Affinity Group / Storage Account etc.) or be prepended to the name of components (reserved IPs, VMs etc.).
  • $azureLocation – This is the Azure region that everything this script creates will be deployed in.  You can find a list of Azure Regions here.  It is important that the location you choose match the location of your virtual network.
  • $instanceSize – This is the instance size of the VMs that you will deploy.  You can find a list of Azure Instance sizes here.
  • $baseImage – This is the base operating system image your VMs will start with.  To get a list of potential images, you can run this command from your Azure PowerShell window.  Be sure to filter for what you are looking for by replacing the “Windows Server 2012 R2 Datacenter*” in the example below.
    Get-AzureVMImage | where-object { $_.Label -like "Windows Server 2012 R2 Datacenter*" }
    
  • $adminUser – This is the guest VM administrator username for the VMs that you will deploy.
  • $adminPassword – This is the password for the guest VM $adminUser username you chose above.

Once you have those chosen variables, you should be ready to roll.  Oh, before you run the script… Please remember that I am in no way responsible for what this script might do.  I believe it to be useful, and generally safe.  You should make sure you are 100% comfortable with what this script is doing before you begin.  Ok – now that the lawyers can sleep again at night, here we go.  When you run the script you should see output something like this.

lab-builder-output

Here is what is going on in the background.

  • An Azure Affinity Group named with your $labName variable is being created.  Affinity groups are critically important. When I started this process, I did not understand that.  I was getting terrible network performance between my VMs even though all my VMs were in the same region.  Basically, the affinity group makes sure that all of the other resources are located as close together as possible from a storage / network standpoint.  You can read more about affinity groups here.
  • An Azure Storage Account named with your $labName variable is being created.  This is where all of your VMs disks will be stored.
  • Two Azure reserved IP addresses are reserved.  Reserved IP addresses are IP addresses that will stay with a VM even if it is powered off / deprovisioned.  You can read more about them here.  There are costs associated with these which you can better understand by reading more about that topic here.
  • Two Azure VMs are built out in your affinity group, based on the $instanceSize and $baseImage you selected in the script.  These VMs have your $labName prepended to them.  In addition, the admin username and password that you chose are configured on these VMs.  Be sure to change that password after the VMs are deployed.  Having your admin password lying around in a plain-text PowerShell script is a terrible idea!
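One way to avoid keeping the password in the script at all (a sketch, not part of the script below) is to prompt for it at run time.  Add-AzureProvisioningConfig expects a plain string, so the secure string has to be converted after prompting:

```powershell
# Prompt at run time instead of hardcoding $adminPassword in the script.
$securePassword = Read-Host "Enter the lab admin password" -AsSecureString
$adminPassword = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto(
    [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($securePassword))
```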

In summary, I am sure this script could be improved.  However, I also hope it is in good enough shape to be helpful to some of you.  If you see ways it could be made better, please let me know.  Keep an eye on my blog for my lab replication / lab destruction scripts that I’ll be posting soon.

#region Notes
# Lab Builder - V2 Built on 2/9/2015
# Built by David Winslow - wdw.org
#
# Azure PowerShell General Notes / Commands
#		[Console]::CursorSize = 25 (Makes the cursor blink so you don't go insane.)
#		Add-AzureAccount (Connects Azure PowerShell up to your Azure account.)
#
# V2 improvements:
#    -Proper affinity group provisioning
#    -Added variable for regions
#    -Added variable for base images
#    -Added variable for instance sizes
#
# Pre-Deployment Notes:
#    Run Add-AzureAccount (Connects Azure PowerShell up to your Azure account) before running this script.
#    The virtual network must be built manually in the Azure portal before running this script.  I hope to improve this in a future version.
#		-The virtual network name must match the $labName variable below.
#       -The virtual network location must match the $azureLocation variable below.
#		 -Address space: 10.20.0.0 /16
#		 -Subnet-1 10.20.15.0 /24
#		 -Subnet name must be Subnet-1
# You need to choose values for the variables below.
#endregion


#region Pre-Deployment Variables
$labName = "lab51"
# This name will be used entirely for, or prepended to, most components.
# Must be all lower case letters and numbers only.  
# Must be Azure globally unique.
# Must not exceed 11 characters in length.
$azureLocation = "East US 2"
# This is the location that resources will be created in.
$instanceSize = "Standard_D2"
# This determines what size instances you launch.
$baseImage = "a699494373c04fc0bc8f2bb1389d6106__Windows-Server-2012-R2-201412.01-en.us-127GB.vhd"
# This determines what image these VMs are built from.
# "a699494373c04fc0bc8f2bb1389d6106__Windows-Server-2012-R2-201412.01-en.us-127GB.vhd" - Windows 2012 R2 - Datacenter - December 2014
# reference this to get image names if needed. (Get-AzureVMImage | where-object { $_.Label -like "Windows Server 2012 R2 Datacenter*" } )
$adminUser = "labadmin"
# This is the initial default admin username for each VM.
$adminPassword = "@@labAdminPassword2015!!"
# This is the initial $adminUser account password for each VM.
#endregion


#region Write Pre-Deployment Variables to screen
Write-Output " "
Write-Output "=================================="
Write-Output "Building Lab."
Write-Output "=================================="
Write-Output " " 
Write-Output "Lab name set to:" $labName
Write-Output " " 
Write-Output "Azure location set to:" $azureLocation
Write-Output " " 
Write-Output "Instance size set to:" $instanceSize
Write-Output " " 
Write-Output "Base Image set to:" $baseImage
Write-Output " " 
Write-Output "Admin username set to:" $adminUser
Write-Output " " 
Write-Output "Admin password set to:" $adminPassword
Write-Output " " 
#endregion


#region Provisioning Affinity Group
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Affinity Group"
Write-Output "=================================="
New-AzureAffinityGroup -Name $labName -Location $azureLocation
#endregion


#region Provisioning Storage Account
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Storage Account"
Write-Output "=================================="
New-AzureStorageAccount $labName -AffinityGroup $labName
#endregion


#region Assigning Azure Subscription
Write-Output " "
Write-Output "=================================="
Write-Output "Assigning Azure Subscription"
Write-Output "=================================="
Set-AzureSubscription -SubscriptionName "Pay-As-You-Go" -CurrentStorageAccount $labName
#endregion


#region Provisioning Reserved IP Addresses
Write-Output " "
Write-Output "=================================="
Write-Output "Provisioning Reserved IP Addresses"
Write-Output "=================================="
$vm1IP = $labName + "vm1IP"
$vm2IP = $labName + "vm2IP"
New-AzureReservedIP -ReservedIPName $vm1IP -Label $vm1IP -Location $azureLocation
New-AzureReservedIP -ReservedIPName $vm2IP -Label $vm2IP -Location $azureLocation
#endregion


#region Building VMs
Write-Output " "
Write-Output "=================================="
Write-Output "Building VMs"
Write-Output "=================================="
$vm1vm = $labName + "vm1"
$vm2vm = $labName + "vm2"
# Note: New-AzureVM accepts either -Location or -AffinityGroup, not both; the affinity group already determines the location.
New-AzureVMConfig -Name $vm1vm -InstanceSize $instanceSize -ImageName $baseImage | Add-AzureProvisioningConfig -Windows -AdminUsername $adminUser -Password $adminPassword | Set-AzureSubnet 'Subnet-1' | Set-AzureStaticVNetIP -IPAddress 10.20.15.5 | New-AzureVM -AffinityGroup $labName -ServiceName $vm1vm -ReservedIPName $vm1IP -VNetName $labName
New-AzureVMConfig -Name $vm2vm -InstanceSize $instanceSize -ImageName $baseImage | Add-AzureProvisioningConfig -Windows -AdminUsername $adminUser -Password $adminPassword | Set-AzureSubnet 'Subnet-1' | Set-AzureStaticVNetIP -IPAddress 10.20.15.6 | New-AzureVM -AffinityGroup $labName -ServiceName $vm2vm -ReservedIPName $vm2IP -VNetName $labName
#endregion


#region Script Complete
Write-Output " "
Write-Output "=================================="
Write-Output "Lab Builder Script Complete!"
Write-Output "=================================="
#endregion

The future of Microsoft

Who am I to even write a blog post entitled  “The future of Microsoft”?  Well, admittedly my field of view on this topic is limited.  I’m just a regular guy.  However, perhaps I have an interesting perspective after working with their tech for the last 20 years.

History (me and Microsoft):

  • I’m a guy that got interested in computers in my late teens.
  • A few times in high school, when the computer science teacher was not feeling well (she was pregnant at the time) I got to help facilitate teaching my fellow students BASIC – yeah that BASIC.
  • My dad gave me his old PCs as he upgraded.  He was an accountant, and fortunately for me accountants adopted computers quickly.  I used them to write (play with) HTML in notepad, and render it in Netscape Navigator.  All the while I was buying $50 two inch thick books from Barnes and Noble to learn more using the money I earned mowing yards and washing cars at a local car dealership on the weekends.
  • My friend Rick got me an internship in the Information Resources department at a great company in our area.  My job was to help support the field sales laptops that ran Windows 3.1, and assist with the migration effort to get them all migrated to Windows 95.
  • After high school I attended a community college for one quarter in a plastics engineering program, then quickly dropped out to start a web hosting company with Rick (who got me the internship).  We became the first Microsoft Authorized Web Presence Provider for Microsoft FrontPage in NC, and one of the first 50 in the nation.  We did it all in the beginning with a 64k ISDN line and a Gateway 2000 server (not really a server at all by today’s standards).
  • I earned my MCSE (Microsoft Certified Systems Engineer) certification when I was 19 years old.  Back then, this was a big deal.

We will pause there for a moment.  These were the glory days for Microsoft from my perspective.  Microsoft was killing it and I was lucky enough to be along for the ride.  For lots of firms, we were installing their first networks and first internet connections.  When we were not putting new stuff in we were ripping out Novell powered networks and replacing them with Windows NT like crazy.  We helped companies register their first domain names, and configure their first email addresses over and over and over again.

It turns out, for a nerd (especially a Microsoft infrastructure focused nerd) I entered this career at exactly the right time.  By the end of 1999, Microsoft stock was up over 800% from where it was at the start of 1996.

msft-5yr-chart-1996

After the dot-com bubble burst in the early 2000s, things started to settle down.  The hype was over, and businesses now depended heavily on these systems.  I kept busy for years taking care of the systems we built.  We upgraded servers running Windows NT to Windows 2000, then Windows 2003, then Windows 2008 R2 for clients along the way, all the while upgrading the hardware, applications, and other infrastructure systems around them.

However, somewhere a few years ago things started to shift.  Remember the I’m a Mac and I’m a PC commercials?  I hate to say it but in some ways they mirrored this shift and were well timed.  Slowly, PCs and the Microsoft server infrastructure that ran them started to feel more boring.

Lots of us old NT4 MCSEs moved on up the food chain.  Some took the management route; others of us who stayed hands-on shifted to areas where more innovation was happening from a technical perspective.  For me, those areas were virtualization (VMware), Linux (Red Hat), security (SANS GPEN, etc.) and countless other areas where cool stuff was going on.  I still did plenty of Microsoft-focused consulting to pay the bills, but my attention and real interest shifted to where the more interesting work was.

Lately, you upgrade your server OS because you have to, not because you want to.  The old one was going out of support, or some app the business needed required the new one, so you upgraded.  You did not really want to, but you did.  It’s not that Microsoft did anything so horribly wrong, well other than Vista and Windows 2008.  They just were not doing anything particularly interesting.  In most cases, there was no compelling reason for me to tell clients they should upgrade their desktops from Windows XP or their servers from Windows Server 2003.  Those tools got the job done, and overall they were solid.  That trend has essentially continued.  Today, I have no particular burning reason to tell a client to upgrade Windows 7, Windows 2008 R2, or Exchange 2010.  Run it until support expires, or until you have a real need for something else.

Boring.  Operationally important even critical to most businesses, yes.  Interesting, not all that much.  Surely not as interesting as it once was.

The subtle recent shift:

Recently, it feels like things have started to shift.  Microsoft has actually started to innovate again!  They have also started to make good decisions and gain ground where they had fallen way behind.  Here are a few areas I can think of where they have shown recent positive improvement.

  • The Kinect was actually a really cool and innovative product.  I’m still not crazy about the Xbox UI, but the Kinect itself is cool tech.
  • Microsoft entered the cloud space with Office 365 (SaaS) and Microsoft Azure (IaaS / PaaS).  Both of those are maturing nicely and offering some very credible competition to the folks who really innovated in these spaces.
  • The Microsoft Band is awesome!  I’ve got a friend who has one, and he is crazy about it.  They are still sold out online, and only available in the Microsoft stores if you can get one when they are in stock.
  • The Microsoft Surface is a really solid piece of tech.  Take an average business person and replace their existing tablet with one of these and in my experience they will thank you.  For some of them, you can also use it to replace their laptop and desktop (docking station and external monitor required).
  • Microsoft development tools continue to offer developers some of the very best tools to work with.  I’m a C# beginner so I’m not qualified to say this, but this is what some of the best developers I work with and respect tell me.
  • Microsoft open sourced .Net and embraced Linux running on Azure.  Thank goodness the holy war against Linux and Open Source in general seems to be over.
  • Hyper-V seems to be catching up with VMware.  If you hire me today, I’ll likely still suggest VMware, but I’m less certain I’ll be doing that forever, especially if you are considering eventually building a hybrid cloud with Azure.

So, the big ship Microsoft seems to have gotten turned, and it feels like it is headed in a more positive direction.

The future:

The future is an awfully hard thing to predict, especially in the tech space.  However, I think HoloLens has real potential to have a huge positive impact on the entire Microsoft ecosystem.  Go ahead – click here and watch the two minute video.  It’s worth it.  I’ll wait…

Ok – so is that awesome or what?  It’s a fundamentally different way to interact with a computer than anything any of us use today.  It’s unbelievably innovative.

Imagine choosing to upgrade Microsoft Office because you want to do 3D data visualization with Excel, or you want to present HoloLens content with PowerPoint.  Imagine wanting to upgrade to Windows 10 so you can use the HoloLens. Imagine your users begging you to upgrade your Remote Desktop servers so that HoloLens enabled apps will work remotely.  I think the potential is huge.  Never mind just the examples of extending the apps we have today with this technology.  Imagine the apps this technology will enable that simply can’t exist today.

I might look back on this post a year or two from now and think it is the craziest thing I have ever written but I’m going to say it anyway.  I think now is the time to double down on Microsoft.

How to quickly fix most of what’s wrong with your default IIS implementation of SSL/TLS.

Background:

Today, after announcing some planned downtime for a couple of servers at a client site, another engineer sent me a link to the Qualys SSL Labs – SSL Server Test tool output for one of the client’s servers that I would be working on during the downtime.  In essence, he was saying: “Hey – while you are at it…can you fix this too?”

Here is the top of the report he sent at first.  Note the “F” grade.  While this server did have a functioning SSL certificate, the server was not optimally configured to support the best and most secure use of that certificate.

Qualys SSL Labs Test - F

So, I started fixing these issues one by one, using manual registry entries from a couple of Microsoft KB articles (listed below).  Along the way, I found a great tool called IIS Crypto from Nartac Software that automates making these changes.

How I fixed this quickly (and you can too):

As always, I’d suggest that you test on non-production systems first, and make backups so that if you break things, you can fix them quickly.  I believe your biggest risk is breaking connectivity with old clients that don’t support newer cipher suites (think: those pesky Windows XP boxes still running IE 6 that should be in a recycling pile somewhere anyway).  If you care about connectivity from those boxes, you will want to stop at this point and work through this in a more methodical manner.  In order to maintain connectivity with those ancient boxes, you would need to leave some non-best-practice settings in place.

I downloaded a tool called IIS Crypto from Nartac Software.  Naturally, I scanned the tool with VirusTotal.  It came back clean, so I proceeded.  Before running the tool, I went and read the FAQ.  The FAQ indicates that the tool only automates adjustments to registry settings in the following registry locations.

  • HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL
  • HKLM\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002

When it comes to software, I have trust issues.   If you are a good I.T. person – you probably do too!  So, I fired up regedit, and made a quick export of those keys.  These exports provide a reference / baseline as well as a quick way to revert if things go wrong.
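If you prefer the command line to regedit, the same backup can be sketched from an elevated PowerShell or Command Prompt session.  The file names and backup folder below are just examples – adjust them to taste:

```powershell
# Export the two registry keys IIS Crypto modifies (file names are examples)
reg export "HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL" C:\Backup\schannel.reg /y
reg export "HKLM\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002" C:\Backup\ssl-00010002.reg /y

# If something breaks later, revert and reboot:
# reg import C:\Backup\schannel.reg
# reg import C:\Backup\ssl-00010002.reg
```

Either way, keep the exports somewhere safe until you have confirmed everything still works after the reboot.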

After doing this, I ran the IISCrypto.exe file, clicked ok on the license, and was then presented with the following screen.

IIS Crypto

I chose the “Best Practices” button which loads a great group of settings.  I then clicked “Apply” and rebooted manually.   Yes – a reboot is necessary in order for some of these changes to happen, so plan your maintenance window accordingly.  The tool also offers buttons that would configure settings to attempt to satisfy PCI, FIPS 140-2, or allow you to choose settings manually.

Post reboot I ran the Qualys SSL Labs – SSL Server Test again, and got a much better score.   There are still a couple of things that I might tweak later but overall, this represents a huge improvement.

Qualys SSL Labs Test - A

Obviously, the goal is not a better score in itself but a better, more secure SSL / TLS implementation.  In this case, the better score represents a more secure implementation thanks to the following changes that the tool facilitated.

  • SSL2 – disabled
  • SSL3 – disabled
  • TLS 1.2 – enabled
  • RC4 cipher (old / weak) – disabled
  • Optimization of cipher suite preferred order to enable forward secrecy
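For reference, the protocol toggles in that list boil down to a few DWORD values under the SCHANNEL key mentioned earlier.  A rough .reg sketch of two of them (server-side SSL 3.0 off, TLS 1.2 on), per the Microsoft KB guidance – IIS Crypto sets these, plus the cipher suite order, for you:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server]
"Enabled"=dword:00000000
"DisabledByDefault"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"Enabled"=dword:ffffffff
"DisabledByDefault"=dword:00000000
```

This is exactly why a reboot is required: Schannel reads these values at startup, not on the fly.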

Huge thanks to the folks at NARTAC Software who created this tool, and the fine folks at Qualys who created the best SSL / TLS test tool I know of. Both of these will be in my toolbox for a long time to come!

Important IT Security Reminders

I recently sent this reminder out to my consulting clients.  I thought I would post this here as well in case it might be helpful for any of you.

=========

Recently, I have seen an uptick in the number of machines infected by a category of malware known as ransomware, particularly CryptoWall 2.0.  New IT security threats are constantly emerging.  In many ways, none of this is really anything new.  However, as your trusted IT consultant, I wanted to pass along some information that I hope might be helpful to both you and your users.  It is more important than ever to make sure you and your businesses are well protected.  With a few exceptions, we are sending this reminder to our primary contact for each account.  Feel free to distribute this as you see fit.

A note to IT professionals:

I’m providing this information to you to serve as a reminder of things to consider.  You have my permission upfront to modify this information as you deem appropriate and send it out to others who might find it helpful.

A note to those of you who are not IT professionals:

Many of these suggestions are aimed at individual computer users.  If you are not an IT person, and you work on a computer network maintained by an IT professional, you should discuss these issues with your IT pro and NOT take action on your own.  In many cases, these same problems need to be handled differently on larger corporate networks.  If I happen to be the IT pro who helps you maintain your network, I’ll be happy to discuss these issues with you further and provide specific suggestions for your environment if you would like.

Suggestions:

1) Make sure you have good/successful backups of the data you care about.  This is important for many reasons, but simply having good backups of your data provides you the foundation to be able to recover should you be impacted by any event that destroys your data.

  • Think about how often your data is backed up.  You need to be able to handle the loss of the data since your last known good backup.  If you can’t do this, you need to consider backing up more often.
  • Make sure you are monitoring your backup systems on a daily basis.  All too often backup systems fail silently.
  • Your backups need to be isolated from your production systems.  For example, if your PC was infected by ransomware, it is important that your backups are located in a location that the ransomware can’t reach (isolated cloud storage / disconnected external hard disk drives, etc.).
  • Run a periodic test restore from your backup system.  This tests the system and will help you discover any issues with the system itself before you are relying on it to save your data.

2) Use an alternate web browser.  We believe that currently Google Chrome is the most secure option.  However, Firefox also has an excellent track record.

3) Run Adblock Plus.  Adblock Plus is a browser add-in that blocks most ads embedded in web pages.  Running a tool like this will help prevent you from becoming infected by a malicious ad, potentially served up on a seemingly legitimate web site.

4) Uninstall Java.  If you can’t uninstall it entirely (because you use a program that requires Java), consider disabling it in your browser.

  • If you must keep Java, be sure to keep up to date with the latest versions and security patches as they are released.  You can check the version you have now, and update if needed here.  Be careful when updating Java to make sure to de-select any additional bundled downloads you may be offered.

5) Make sure you have the latest versions of Adobe Flash and Adobe Reader installed.

  • Be careful when updating these tools.  Make sure to de-select any additional bundled downloads you may be offered.
  • Also make sure you have both of these set to update automatically when Adobe releases updates.
  • Out-of-date, vulnerable versions of these tools are among the most attacked pieces of software because both are often exposed through your web browser.

6) Run up-to-date antivirus software.  By now, some of you are thinking: “Hey, wait a minute, isn’t my antivirus software supposed to protect me from this stuff?”  The sad truth is this: it simply is not capable of doing that in the comprehensive way you may have been led to hope and expect.  Anti-virus / anti-malware software does protect you from some threats, but, regrettably, it is woefully inadequate on its own to protect you from many of these emerging threats.  Even with that being the case, we still recommend that you run anti-virus / anti-malware software.

7) Make sure you are current on all security updates for your operating system, and that your computer is configured to install these updates automatically as they are released.

Microsoft, Apple and other operating system vendors are constantly releasing security updates to correct problems they have found.  It is important that you make sure these updates are installed on your computer.

8) Be extremely careful when clicking links in email.  Deceptive links in email messages are often used as part of phishing and spyware scams.  Clicking a deceptive link can take you to a webpage that attempts to download malicious software onto your computer.  If you have any doubt, don’t click the link.

  • If you feel you must click a link embedded in an email, and you have any question in your mind about its legitimacy, I would suggest copying the link and pasting it into VirusTotal here.  VirusTotal will scan the site that the link goes to for known malware.
  • Please be aware that VirusTotal may not identify potential phishing sites that attempt to get you to enter credentials.  Most of those sites contain no actual malware at all.  They are simply configured to trick you into giving them your personal information.

9) Check your PC for other outdated software that might be vulnerable.  While the applications mentioned above are the most targeted, if you would like to check your PC for other applications that might also be out of date and vulnerable, feel free to download and run this tool on your system.

10) Use encryption software to protect sensitive data.  Encryption software like Microsoft BitLocker can protect the data on your hard disk drive in the event that your PC is lost or stolen.  In addition, many newer hard disk drives support hardware-based AES encryption that provides a similar level of protection implemented at the hardware level.

11) Use unique passwords for each site or service that you use.  This can be very hard to do.  However, using a password manager can help you use complex and unique passwords for each site.  Our current favorite password managers are LastPass and Roboform.

While sadly nothing can guarantee you 100% protection, doing the things listed above will go a long way toward making your data more secure.