Azure VMs need Internet Access

When customers move into the cloud, they tend to mimic their on-premises setup.  Not a bad thing in itself, but when it comes to blocking internet access for servers, this can create some unusual problems.

If you are using network security groups (NSGs), user-defined routing (UDR), or forced tunneling, be sure to put in an exception for your Azure data center IP ranges, as lack of connectivity will impact many services, including these:

  1. VM Extensions – see https://blogs.msdn.microsoft.com/mast/2016/04/27/vm-stuck-in-updating-when-nsg-rule-restricts-outbound-internet-connectivity/
  2. Azure Backup – see https://azure.microsoft.com/en-us/documentation/articles/backup-azure-vms-prepare/#network-connectivity
  3. Monitoring Agent/Extension – see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-proxy-firewall#configure-settings-with-the-microsoft-monitoring-agent
  4. KMS – see https://docs.microsoft.com/en-us/azure/virtual-machines/troubleshooting/custom-routes-enable-kms-activation
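If you script those exceptions, a quick membership check against the published ranges can help verify that a given address is covered.  A minimal sketch in Python (the CIDR blocks below are placeholders – parse the real values from the downloadable Azure IP ranges XML file):

```python
import ipaddress

# Illustrative subset of datacenter ranges - replace with values
# parsed from the downloaded Azure IP ranges XML file.
azure_ranges = ["13.64.0.0/16", "40.112.0.0/13", "52.224.0.0/11"]

def in_azure_ranges(ip, ranges):
    """Return True if the IP falls inside any of the given CIDR ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(r) for r in ranges)

print(in_azure_ranges("40.112.10.25", azure_ranges))  # True
print(in_azure_ranges("8.8.8.8", azure_ranges))       # False
```

Run this against each NSG/UDR exception before you apply it, so a typo in a range does not silently break a platform service.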

Update 16 Aug 2018 – The use of service endpoints will limit the damage of blocking internet access.  Ensure all services you use/require are covered by service endpoints before blocking internet access.  https://docs.microsoft.com/en-us/azure/virtual-network/virtual-network-service-endpoints-overview

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2016/08/azure-vms-need-internet-access/

Azure Classic Portal | How to Add a Data Disk Using a Different Storage Account than the OS Disk

When you add a new disk to an existing VM, you can only modify the disk name and size, which means you are stuck using the current/default storage account (see https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-windows-classic-attach-disk/)

Using PowerShell, though, you can specify a different storage location for your new data disk (see https://msdn.microsoft.com/library/azure/jj152837.aspx)

o Make sure you use the -MediaLocation parameter; per the documentation, “[i]f no location is specified, the data disk will be stored in the VHDs container within the default storage account for the current subscription.”

o The disk label you enter in the command shows in the storage view, but under the VM dashboard the disk title will look like this: [CloudService]-[VMName]-[LUN #]-[Date/time stamp].  If you expand the VHD column (copying it into Notepad may be easier), you will be able to see the full VHD name, and that WILL match what you specified in the -MediaLocation parameter.
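For illustration only, here is a hypothetical helper that mimics that auto-generated title format (the exact timestamp format Azure uses may differ):

```python
from datetime import datetime

def portal_disk_title(cloud_service, vm_name, lun, stamp=None):
    """Mimic the auto-generated disk title shown on the VM dashboard:
    [CloudService]-[VMName]-[LUN #]-[Date/time stamp].
    The timestamp format here is an assumption for illustration."""
    stamp = stamp or datetime.utcnow().strftime("%Y%m%d%H%M%S")
    return f"{cloud_service}-{vm_name}-{lun}-{stamp}"

print(portal_disk_title("NLW-MAGBox", "NLW-MAGBox", 0, "201609011200"))
# NLW-MAGBox-NLW-MAGBox-0-201609011200
```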

Example: Get-AzureVM -ServiceName "NLW-MAGBox" -Name "NLW-MAGBox" | Add-AzureDataDisk -CreateNew -DiskSizeInGB 128 -DiskLabel "NLWTest" -LUN 0 -MediaLocation "https://storage2.blob.core.usgovcloudapi.net/mycontainer/MyNewDisk.vhd" | Update-AzureVM

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2016/09/azure-classic-portal-how-to-add-a-data-disk-using-a-different-storage-account-than-the-os-disk/

Connecting to MySQL for Azure Site Recovery (ASR)

When using ASR to replicate VMware or physical machines into Azure, two roles are required – the configuration and process servers (often combined on a single server) – to help coordinate and facilitate the data replication (https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-vmware-to-azure#run-site-recovery-unified-setup).  On the configuration server, configuration data is stored in a MySQL database.  *At this time MySQL is required; other database types are not supported.

There are several scenarios when you may need to verify or modify data stored in this database.  Below are samples for your reference.

Note – database modifications will impact ASR and should be done with care

Log in to MySQL and Connect to the ASR Database

from a command prompt:

mysql -u root -p  (you will be prompted to enter the password specified during installation)

show databases;  (will list all databases for your reference)

use svsdb1;  (selects the ASR database so future queries will run against it)


 

To list all machines registered with the configuration server (CS)

from https://social.technet.microsoft.com/wiki/contents/articles/32026.how-do-we-cleanup-duplicatestale-entries-in-asr-vmware-to-azure-scenario.aspx

select id as hostid, name, ipaddress, ostype as operatingsystem, from_unixtime(lasthostupdatetime) as heartbeat from hosts where name!='InMageProfiler'\G
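The from_unixtime() call converts the stored Unix timestamp into a readable heartbeat.  If you are inspecting raw lasthostupdatetime values outside of MySQL, the equivalent conversion in Python (shown in UTC; note that MySQL uses the session time zone) is:

```python
from datetime import datetime, timezone

def heartbeat(last_host_update_time):
    """Convert a Unix timestamp (as stored in hosts.lasthostupdatetime)
    to a readable UTC string, similar to MySQL's from_unixtime()."""
    return datetime.fromtimestamp(last_host_update_time, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

print(heartbeat(1487000000))  # 2017-02-13 15:33:20
```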


 

To Clean Up Duplicate/Stale Entries

see https://social.technet.microsoft.com/wiki/contents/articles/32026.how-do-we-cleanup-duplicatestale-entries-in-asr-vmware-to-azure-scenario.aspx

 

To Update the IP of a Machine

update hosts set ipaddress='[new address]' where ipaddress='[old address]';

Example: update hosts set ipaddress='192.168.0.4' where ipaddress='11.0.0.10';

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/02/connecting-to-mysql-for-azure-site-recovery-asr/

Azure IP Ranges

*Updated June 15, 2018*

For a myriad of reasons, it’s nice to know which IPs you can expect to see coming to/from your Azure space.  Below is a quick cheat sheet.

Microsoft Azure Datacenter IP Ranges

updated 20 Aug 2018, thanks to Michael Ketchum of Microsoft for the additional information

The XML file now breaks down the IP ranges as follows:

  • “<SERVICE>” = Includes all IPs for that service across all regions in the applicable cloud
  • “<SERVICE>.<REGION>” = Includes all IPs for that service in a specific region
  • “AzureCloud.<REGION>” = Includes all IPs and services for that region
  • “AzureCloud” = Includes all IPs and services for that cloud (Gov, commercial, etc.)
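A sketch of pulling the ranges for one region out of a downloaded XML file (the element and attribute names below are assumptions for illustration – verify them against the file you actually download):

```python
import xml.etree.ElementTree as ET

# Inline sample - element/attribute names and values are illustrative only;
# check them against the real Azure IP ranges XML file.
sample = """
<AzurePublicIpAddresses>
  <Region Name="usgovvirginia">
    <IpRange Subnet="13.72.0.0/18"/>
    <IpRange Subnet="23.97.0.0/26"/>
  </Region>
  <Region Name="usgovtexas">
    <IpRange Subnet="52.238.0.0/18"/>
  </Region>
</AzurePublicIpAddresses>
"""

def ranges_for_region(xml_text, region):
    """Return the list of CIDR subnets published for one region."""
    root = ET.fromstring(xml_text)
    return [r.get("Subnet")
            for reg in root.findall("Region") if reg.get("Name") == region
            for r in reg.findall("IpRange")]

print(ranges_for_region(sample, "usgovvirginia"))
# ['13.72.0.0/18', '23.97.0.0/26']
```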

The “Secret Azure IPs” you MUST Include – 168.63.129.16 and 169.254.169.254

Office 365 URLs and IP address ranges

Office 365 US Government: Endpoints for US Federal and US Defense Clouds (preview)

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/02/azure-ip-ranges/

Using OMS to Alert on Azure Service Outages

If you are unable to alert from the Azure Portal, or simply wish to have all your alerting from one source, consider leveraging OMS (Operations Management Suite).  With the free tier option (7 days of data retained) there is no additional cost!

Azure Service events are logged automatically in the Azure Portal –> Monitoring –> Activity Log (only incidents believed to impact your subscription(s) will be listed).  This article will show you how to use OMS to review and alert on these events.


 

Setup OMS (if you do not already have an OMS workspace)

1. Create a new workspace – https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-get-started#2-create-a-workspace

  • select the free pricing tier unless you have further plans for OMS

 

Configure OMS to Pull the Azure Activity Logs

1. Add the Activity Logs Analytics solution – https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-get-started#3-add-solutions-and-solution-offerings (only steps 1-4 are required)

 

Setup Alerting

1. Open the OMS portal (URL varies based on your cloud)

2. Click on Log Search

3. In the query window, enter: Type=AzureActivity Category=ServiceHealth

    • This looks for events from the Azure Activity logs of type Service Health, which is how Azure Service outages are categorized in the Azure Activity Logs.
    • It is OK if no results are returned; that just means no Azure Service incidents impacted your subscription in the selected time range.


4. Click Alert in the top left

5. Configure the alerting options (see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-alerts-creating#create-an-alert-rule for more details)


*The alert runs every 15 minutes (alert frequency), looking for events matching the query that were raised in the past 15 minutes (time window).  If more than 0 are found (number of results), an email is sent to all recipients listed.  These emails do NOT need to be associated with an Azure logon; any publicly routable email address will work.
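The evaluation the alert service performs can be sketched as follows (illustrative only – OMS does this server-side, and the event structure here is hypothetical):

```python
def should_alert(events, window_start, window_end, threshold=0):
    """Fire when the number of matching events raised inside the
    time window exceeds the 'number of results' threshold."""
    in_window = [e for e in events if window_start <= e["time"] < window_end]
    return len(in_window) > threshold

# One ServiceHealth event raised at minute 10.
events = [{"time": 10, "category": "ServiceHealth"}]
print(should_alert(events, 0, 15))   # True  -> email sent
print(should_alert(events, 15, 30))  # False -> no email
```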

Your recipients will now receive an email for each Azure Service incident.

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/03/using-oms-to-alert-on-azure-service-outages/

The OMS Agent for Azure Government – A Cheat Sheet

Below are the quick and dirty details you need to connect your Windows servers to OMS hosted in Microsoft Azure Government (MAG).  Any server with internet access can report to an OMS workspace (including but not limited to servers located on-premises, in the Azure Commercial cloud, hosted by other cloud providers, etc.).

Initial Install

  1. Azure Extension – Note: Azure VMs only; the VM must be in the same subscription as the OMS Workspace.  In portal.azure.us go to Log Analytics –> Your Workspace –> Workspace Data Sources –> Virtual Machines –> Connect the desired VM (click on the VM, then click Connect in the new blade).  The extension installs the full OMS agent on your VM.  For details see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-azure-vm-extension
  2. OMS Agent (MSI) – the MSI can be installed interactively or via command line.  Download the agent from the OMS Portal (settings –> connected sources –> Windows Servers).  For full details, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#download-the-agent-setup-file-from-oms
    1. If installing the agent interactively, be sure you specify the cloud as Azure Government
    2. If installing the agent via the command line, you’ll need the OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 parameter to point to Azure Government.  For example:
      1. run: extract MMASetup-AMD64.exe
      2. then run: setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 OPINSIGHTS_WORKSPACE_ID=yourid OPINSIGHTS_WORKSPACE_KEY=yourkey AcceptEndUserLicenseAgreement=1

Adding an OMS Workspace to an Existing Installation

To update an existing OMS or SCOM agent to point to a new/additional OMS workspace you can either manually configure the new workspace via the GUI or leverage PowerShell.

1.  Interactively via the GUI, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#configure-an-agent-manually-or-add-additional-workspaces

2.  Programmatically via PowerShell.  Note: the 1 at the end of the AddCloudWorkspace cmdlet indicates the workspace is in Azure Government.

$workspaceId = "yourworkspaceID"
$workspaceKey = "yourkey"

$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.AddCloudWorkspace($workspaceId, $workspaceKey, 1)
$mma.ReloadConfiguration()

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/the-oms-agent-for-azure-government-a-cheat-sheet/

 

Using Azure CLI 1.0 to Copy a File between Subscriptions

note: examples are from the Azure Government cloud but the command used will work in all clouds

 

Goal: Use Azure CLI 1.0 to copy a blob file between two different subscriptions

Syntax:  azure storage blob copy start "sourceSAS" destContainer -a destStorageAccount -k destStorageKey

Example: azure storage blob copy start "https://mystorage.blob.core.usgovcloudapi.net/vhds/OSDisk.vhd?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB701qWL0ubiD66wo%3D" vhds -a secondstorage -k zcSRtkJO9LzJuiYbOgoRW6Fgr3lS7lIFIEvIb3hbzJ62XBmZl5Igg1zfogNee8FtwGNGoJ6ADr7kAls6b+wJNQ==

note: the SAS and storage account key are used for access to the storage accounts, subscription access is not required to execute this command.

 

Now let’s break it down….here’s how you gather each of the required inputs.

1.  SourceSAS – the sourceSAS is the source file URL + the Shared Access Signature (SAS) token.  For example, if your source URL is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt and your SAS token is ?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D then your sourceSAS is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D.  Note: there are no spaces when you join the two items, and you’ll need to put the string in quotes when used in Azure CLI.
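To make the concatenation explicit, here is a small sketch (Python, with a shortened hypothetical token – real tokens are much longer):

```python
def build_source_sas(source_url, sas_token):
    """Join the blob URL and the SAS token. A portal-generated token
    already starts with '?', so no extra separator is needed."""
    if not sas_token.startswith("?"):
        sas_token = "?" + sas_token
    return source_url + sas_token

# Hypothetical values for illustration.
url = "https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt"
token = "?sv=2016-05-31&sig=abc123"
print(build_source_sas(url, token))
```

Remember to wrap the resulting string in quotes on the command line, since the token contains `&` characters the shell would otherwise interpret.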

  • Source URL – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –> blob –> your container –> your file.  A new blade opens and the URL is the second item listed.


  • SAS – again the simplest way to generate the SAS token is via the portal.  Browse to your storage account –> Shared Access Signature, update the values (the default will work, but it’s more secure to restrict the SAS Token to only the time frame and resources needed), and then click “Generate SAS”


2.  Destination container – this is the name of the container only (not a URL) that already exists in the destination storage account

3.  Destination Storage Account – this is the name of the storage account only (not a URL) that already exists in the destination subscription

4.  Destination Storage Access Key – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –>Access Keys and copy either key1 or key2.

…and a special thanks to Madan Nagarajan for his sourceSAS breakthrough!

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/using-azure-cli-1-0-to-copy-a-file-between-subscriptions/

Working with Azure ARM VMs, Images, and Unmanaged Disk (Storage Accounts)

A lot of the pre-managed disk documentation has become hard to find.  Below is a cheat-sheet on where to find the documents you need to work with Azure ARM VMs, images, and storage accounts.

Create an Azure VM from custom image (VHD) – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-upload-generalized

Create an Azure VM from existing VHD – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-create-vm-specialized

Create an Azure VM Image (and VM) from existing Azure VM – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-copy-generalized

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/07/working-with-vms-images-and-unmanaged-disk-storage-accounts/

Moving files between Azure Storage and RHEL

There are several options when you want to move files between an Azure Storage account and a Red Hat Enterprise Linux (RHEL) server.  Below is a quick breakdown of the most commonly used options.

Azure CLI

Azure CLI is designed to run on Linux or Windows, so it is the ideal tool when a Linux machine is involved.  Azure CLI 2.0 (https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest) is the latest and preferred version.  See https://docs.microsoft.com/en-us/azure/storage/common/storage-azure-cli#create-and-manage-blobs for syntax.

Example:

#Start the copy of the blob

az storage blob copy start --account-name ${storage_account} --account-key ${key} --source-account-name ${source_account} --source-account-key ${source_key} --source-container ${blob_container} --source-blob ${blob_name} --destination-container ${target_container} --destination-blob ${target_blob_name}

#Now wait while this copy is happening. This could take a while.

while [ $(az storage blob show --account-name ${storage_account} --account-key ${key} --container-name ${target_container} --name ${target_blob_name} | jq .properties.copy.status | tr -d '"') == "pending" ]; do
    progress=$(az storage blob show --account-name ${storage_account} --account-key ${key} --container-name ${target_container} --name ${target_blob_name} | jq -r .properties.copy.progress)
    done_count=$(echo $progress | sed 's,\([0-9]*\)/\([0-9]*\),\1,g')
    total_count=$(echo $progress | sed 's,\([0-9]*\)/\([0-9]*\),\2,g')
    progress_percent=$((100 * $done_count / $total_count))
    echo "Copying: $progress ($progress_percent %)" && sleep 5
done

echo "Copying done"
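The sed expressions in that loop split the bytesCopied/totalBytes progress string into its two halves before computing the percentage.  The same arithmetic, as a quick sanity check in Python:

```python
def progress_percent(progress):
    """Parse Azure's 'bytesCopied/totalBytes' copy-progress string and
    return the whole-number percent complete, mirroring the sed and
    shell arithmetic in the loop above."""
    done, total = (int(part) for part in progress.split("/"))
    return 100 * done // total

print(progress_percent("512/1024"))   # 50
print(progress_percent("1024/1024"))  # 100
```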

AzCopy

AzCopy on Linux is a command-line utility designed for copying data to/from Azure Blob and File storage using simple commands.  See https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-linux?toc=%2fazure%2fstorage%2ffiles%2ftoc.json.  Specific steps for the file copy are here.

To install AzCopy on RHEL, you first need to install .NET Core 1.1.1:

  1. sudo yum install rh-dotnetcore11
  2. scl enable rh-dotnetcore11 bash
  3. source scl_source enable rh-dotnetcore11

Then you install AzCopy per the article.

  1. wget -O azcopy.tar.gz https://aka.ms/downloadazcopyprlinux
  2. tar -xf azcopy.tar.gz
  3. sudo ./install.sh


Wget

The final approach is to generate a SAS token, append it to the blob URL, and simply use wget in Linux.

See https://blogs.msdn.microsoft.com/nicole_welch/2017/05/using-azure-cli-1-0-to-copy-a-file-between-subscriptions/ for details on how to generate the SAS token and append it to the blob URL (aka, “sourceSAS”).

 

*previously posed on https://blogs.msdn.microsoft.com/nicole_welch/2017/09/moving-files-between-azure-storage-and-rhel/

Azure Government – Missing Features or Services?

If you’ve experimented with the general Azure cloud (often called Microsoft Azure Commercial or MAC), it can be a shock when you move into Microsoft Azure Government (MAG) and notice not everything is the same.  Today I’m briefly going to discuss why there are feature differences, how to track them down, and how to lobby for the features you need.

Note: in this blog I use the terms service, offering, and feature interchangeably.  There is debate (even internally!) on the difference between the terms, but my goal here is to cover both new offerings and extensions to existing offerings.

Is the feature available in MAG at all?

Many features are turned on by region (a physical data center).  The first thing to do is check whether the offering (say, a DS-series VM) is available anywhere in MAG.

  1. Go to https://azure.microsoft.com/en-us/regions/services/
  2. Select only the Azure Government region checkbox
  3. You can now see that this VM is only available in the Virginia data center.


If possible, deploy in that region so you can use the feature/service.  However, we understand that is not always possible.  Before any new large-scale deployment, you should review the list of services per region and choose the region that best offers the services you need.  You don’t want to end up in a situation where you have to wait on required services to be deployed by Microsoft!

To stay current on new releases (and they happen at least once a week!) I recommend subscribing to the Azure Government blog at https://blogs.msdn.microsoft.com/azuregov/ 

The feature/service is not available where I need it!

Why not?  This is the big question I get from customers and of course there is no one easy answer.  These are the main reasons cited:

  • Features often depend on physical hardware, so they deploy to each region at slightly different times depending on the hardware rollout (e.g., a new VM size/series).
  • Some features are delayed as we work on the required compliances or attestations.
  • Engineering wants to release features as quickly as possible.  If the product can be deployed/engineered the same in MAC and MAG, the two releases often happen at the same time.  However, if significant changes are required for MAG, engineering does not want to delay the MAC release; in those cases the feature will most likely be deployed to MAC first and then to MAG after the required modifications are made.  Features are often delayed 6-9 months in MAG.
  • There is no/limited demand expected in MAG.

But I need it!

First, contact Microsoft.  You can talk with your sales team, account team, or even open a case via the portal (portal.azure.us).  We may have an estimate that can be shared if you are under NDA.

Second, vote it up!  At https://feedback.azure.com/forums/558487-azure-government you can nominate new features, add your vote to requests made by others, and generally make your voice heard.  This is a VERY powerful tool, even though it looks innocuous.  Cloud computing is about consumption, and by offering the services you want, we increase consumption.  However, knowing exactly what customers want is complicated and can come down to a “gut feel.”  By assigning metrics (used for good this time!) we can see exactly how important a feature or change is to the community and react accordingly.  Azure Engineering does review this data and uses it in their planning.  While a highly “voted” item isn’t guaranteed to be implemented quickly, it will increase the visibility and urgency around it.

Here’s a few tips for submitting/voting on improvements.

  1. Include hard facts.  We want numbers, timelines, and project implications.  E.g., if the M series were available in Texas, we would deploy an additional 10 VMs by the end of 1Q 2018.  This gives Microsoft an idea of the urgency/timeline, the infrastructure required to support the request, and the impact to our accounts.
  2. Include clear details.  E.g., if the feature is available in MAC, include the blog announcement or documentation too, so we know exactly what you are requesting.  Features often have similar names, vary between cloud providers, etc., resulting in confusion.
  3. Include the consequences.  If this request isn’t met, what is the result?  E.g., if Service Map is not made available in MAG OMS, we will not migrate our monitoring of 8000 servers into OMS.
  4. Share with your friends!  Votes count, so if you add a request, have all your peers vote for it too.
  5. And be realistic.  Asking for a feature not even planned for MAC is unlikely to be approved for MAG.  Asking for a feature that fulfills a clear business need is much more likely to get backing.

If you get stuck or have questions, feel free to comment!

*previously posted on https://blogs.msdn.microsoft.com/nicole_welch/2017/12/azure-government-missing-features-or-services/