Azure Classic Portal | How to Add a Data Disk Using a Different Storage Account than the OS Disk

When you add a new disk to an existing VM through the classic portal, you can only modify the disk name and size, which means you are stuck using the current/default storage account (see https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-windows-classic-attach-disk/).

Using PowerShell, however, you can specify a different storage location for your new data disk (see https://msdn.microsoft.com/library/azure/jj152837.aspx).

• Make sure you use the -MediaLocation parameter; per the documentation, “[i]f no location is specified, the data disk will be stored in the VHDs container within the default storage account for the current subscription.”

• The disk label you enter in the command shows in the storage view, but under the VM dashboard the disk title will look like this: [CloudService]-[VMName]-[LUN #]-[Date/time stamp].  If you expand the VHD column (copying it into Notepad may be easier) you will see the full VHD name, and that WILL match what you specified in the -MediaLocation parameter.

Example: Get-AzureVM -ServiceName "NLW-MAGBox" -Name "NLW-MAGBox" | Add-AzureDataDisk -CreateNew -DiskSizeInGB 128 -DiskLabel "NLWTest" -LUN 0 -MediaLocation "https://storage2.blob.core.usgovcloudapi.net/mycontainer/MyNewDisk.vhd" | Update-AzureVM
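
To double-check where the new disk actually landed (rather than decoding the VM dashboard title), you can read the disk’s media link back with the same classic (Service Management) PowerShell module.  This is just a quick verification sketch reusing the service/VM names from the example above:

# The MediaLink column should match the URL passed to -MediaLocation
Get-AzureVM -ServiceName "NLW-MAGBox" -Name "NLW-MAGBox" | Get-AzureDataDisk | Select-Object DiskLabel, Lun, MediaLink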

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2016/09/azure-classic-portal-how-to-add-a-data-disk-using-a-different-storage-account-than-the-os-disk/

Connecting to MySQL for Azure Site Recovery (ASR)

When using ASR to replicate VMware or physical machines into Azure, two roles are required – the configuration and process servers (often combined on a single server) – to help coordinate and facilitate the data replication (https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-vmware-to-azure#run-site-recovery-unified-setup).  On the configuration server, configuration data is stored in a MySQL database.  *At this time MySQL is required; other database types are not supported.

There are several scenarios when you may need to verify or modify data stored in this database.  Below are samples for your reference.

Note – database modifications will impact ASR and should be done with care

Log in to MySQL and Connect to the ASR Database

From a command prompt:

mysql -u root -p  (you will be prompted to enter the password specified during installation)

show databases;  (lists all databases for your reference)

use svsdb1;  (selects the ASR database so future queries will run against it)


 

To list all machines registered with the configuration server (CS)

from https://social.technet.microsoft.com/wiki/contents/articles/32026.how-do-we-cleanup-duplicatestale-entries-in-asr-vmware-to-azure-scenario.aspx

select id as hostid, name, ipaddress, ostype as operatingsystem, from_unixtime(lasthostupdatetime) as heartbeat from hosts where name!='InMageProfiler'\G


 

To Clean Up Duplicate/Stale Entries

see https://social.technet.microsoft.com/wiki/contents/articles/32026.how-do-we-cleanup-duplicatestale-entries-in-asr-vmware-to-azure-scenario.aspx

 

To Update the IP of a Machine

update hosts set ipaddress='[new address]' where ipaddress='[old address]';

Example: update hosts set ipaddress='192.168.0.4' where ipaddress='11.0.0.10';
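
To confirm the change took effect, you can re-run a trimmed-down version of the listing query above against the same hosts table (substitute your machine’s name):

select name, ipaddress from hosts where name='[machine name]';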

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/02/connecting-to-mysql-for-azure-site-recovery-asr/

Azure IP Ranges

*Updated June 15, 2018*

For a myriad of reasons it’s nice to know what IPs you can expect to see coming to/from your Azure space.  Below is a quick cheat sheet.

Microsoft Azure Datacenter IP Ranges

Updated 20 Aug 2018 – thanks to Michael Ketchum of Microsoft for the additional information.

The XML file now breaks down the IP ranges as follows:

  • “<SERVICE>” = Includes all IPs for that service across all regions in the applicable cloud
  • “<SERVICE>.<REGION>” = Includes all IPs for that service in a specific region
  • “AzureCloud.<REGION>” = Includes all IPs and services for that region
  • “AzureCloud” = Includes all IPs and services for that cloud (Gov, commercial, etc.)

The “Secret Azure IPs” you MUST include – 168.63.129.16 (the Azure platform virtual IP used for DNS, DHCP, and health probes) and 169.254.169.254 (the instance metadata service endpoint)
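
Once you have the ranges downloaded, a quick way to check whether a specific address (such as the required IPs above) falls inside one of the published CIDR blocks is a small PowerShell helper.  This is only a minimal sketch; the function name and the sample range are illustrative and not part of the published file:

function Test-IpInCidr {
    param([string]$IpAddress, [string]$Cidr)
    # Split the CIDR block (e.g. 168.63.129.0/24) into network address and prefix length
    $network, $prefixLength = $Cidr.Split('/')
    # Convert both IPv4 addresses to 32-character bit strings
    $ipBits  = -join ([System.Net.IPAddress]::Parse($IpAddress).GetAddressBytes() |
                      ForEach-Object { [Convert]::ToString($_, 2).PadLeft(8, '0') })
    $netBits = -join ([System.Net.IPAddress]::Parse($network).GetAddressBytes() |
                      ForEach-Object { [Convert]::ToString($_, 2).PadLeft(8, '0') })
    # The address is in the range if the first <prefix length> bits match
    return $ipBits.Substring(0, [int]$prefixLength) -eq $netBits.Substring(0, [int]$prefixLength)
}

# Example (the range here is made up for illustration) – returns True
Test-IpInCidr -IpAddress '168.63.129.16' -Cidr '168.63.129.0/24'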

Office 365 URLs and IP address ranges

Office 365 US Government: Endpoints for US Federal and US Defense Clouds (preview)

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/02/azure-ip-ranges/

OMS – Azure Activity Logs Solution – Access to the Subscription was Lost

Error

Seen in the Azure Activity Logs solution view.

Detail: Access to the subscription was lost. Ensure that the XXX subscription is in the XXX Azure Active Directory tenant. If the subscription is transferred to another tenant, there is no impact to the services, but information for the tenant could take up to an hour to propagate.
SourceSystem: OpsManager
SourceComputerId: all zeros
OperationStatus: Failed
OperationCategory: Azure Activity Log Collection
Solution: AzureActivity

 

Possible Causes

  1. OMS is attempting to reach a subscription that has been deleted or has expired
  2. The AAD tenant associated with the subscription has changed

 

Solution

Method 1:

Disconnect the “problem” subscription from the Azure Activity Logs via the Azure ARM Portal.

  1. Open the ARM Portal
  2. Go to Log Analytics -> [your workspace] -> Azure Activity Logs (under Workspace Data Sources)
  3. From the listing of subscriptions, select the one you wish to disconnect
  4. In the new blade, click “Disconnect” from the blade header

Method 2:

Remove the OMS Azure Activity Logs solution (also known as Activity Log Analytics) either via the OMS portal or the Azure ARM Portal.  Then re-add the solution to your workspace.

Notes:

  • Do NOT delete Log Analytics from the ARM portal.  This is NOT a solution and will delete your OMS workspace
  • Removing the Azure Activity Logs solution will NOT delete existing data.  Once you re-add the solution, all events will reappear
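
If you would rather script Method 2 than click through the portals, the AzureRM.OperationalInsights PowerShell module can toggle the solution on the workspace.  This is a sketch only – the resource group and workspace names are placeholders, and the solution name used here matches the “AzureActivity” value shown in the error above (confirm the exact pack name with Get-AzureRmOperationalInsightsIntelligencePacks):

# Remove, then re-add, the Activity Log Analytics solution (placeholder names)
Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName "MyResourceGroup" -WorkspaceName "MyWorkspace" -IntelligencePackName "AzureActivity" -Enabled $false
Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName "MyResourceGroup" -WorkspaceName "MyWorkspace" -IntelligencePackName "AzureActivity" -Enabled $true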

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/03/oms-azure-activity-log-error-access-to-the-subscription-was-lost/

Using OMS to Alert on Azure Service Outages

If you are unable to alert from the Azure Portal, or simply wish to have all your alerting from one source, consider leveraging OMS (Operations Management Suite).  With the free tier option (7 days of data retained) there is no additional cost!

Azure Service events are logged automatically in the Azure Portal –> Monitoring –> Activity Log (only incidents believed to impact your subscription(s) will be listed).  This article will show you how to use OMS to review and alert on these events.


 

Setup OMS (if you do not already have an OMS workspace)

1. Create a new workspace – https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-get-started#2-create-a-workspace

  • select the free pricing tier unless you have further plans for OMS

 

Configure OMS to Pull the Azure Activity Logs

1. Add the Activity Logs Analytics solution – https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-get-started#3-add-solutions-and-solution-offerings (only steps 1-4 are required)

 

Setup Alerting

1. Open the OMS portal (URL varies based on your cloud)

2. Click on Log Search

3. In the query window, enter: Type=AzureActivity Category=ServiceHealth

    • This looks for events from the Azure Activity Logs with the Service Health category.  This is how Azure Service outages are categorized in the Azure Activity Logs
    • It is OK if no results are returned.  That just means there were no Azure Service incidents that impacted your subscription in the selected time range.
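
If events are returned and you want to narrow them further, the same legacy search syntax accepts additional space-separated field filters.  For example (Level is a standard field on AzureActivity records; the value shown is only an illustration):

Type=AzureActivity Category=ServiceHealth Level=Warning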


4. Click Alert in the top left

5. Configure the alerting options (see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-alerts-creating#create-an-alert-rule for more details)


*The alert checks every 15 minutes (alert frequency) for events matching the query that were raised in the past 15 minutes (time window).  If more than 0 are found (number of results), an email is sent to all recipients listed.  These email addresses do NOT need to be associated with an Azure logon; any publicly routable email address will work.

Your recipients will now receive an email for Azure Service incidents.  It will look something like this:


*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/03/using-oms-to-alert-on-azure-service-outages/

The OMS Agent for Azure Government – A Cheat Sheet

Below are the quick and dirty details you need to connect your Windows servers to OMS hosted in Microsoft Azure Government (MAG).  Any server with internet access can report to an OMS workspace (including but not limited to servers located on-premises, in the Azure Commercial cloud, hosted by other cloud providers, etc.).

Initial Install

  1. Azure Extension –  Note: Azure VMs only, and the VM must be in the same subscription as the OMS workspace.  In portal.azure.us go to Log Analytics –> Your Workspace –> Workspace Data Sources –> Virtual Machines –> Connect the desired VM (click on the VM, then click Connect in the new blade).  The extension installs the full OMS agent on your VM.  For details see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-azure-vm-extension
  2. OMS Agent (MSI) – the MSI can be installed interactively or via command line.  Download the agent from the OMS Portal (settings –> connected sources –> Windows Servers).  For full details, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#download-the-agent-setup-file-from-oms
    1. If installing the agent interactively, be sure you specify the cloud as Azure Government
    2. If installing the agent via the command line, you’ll need to use the OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 parameter to point to Azure Government.  For example:
      1. run: MMASetup-AMD64.exe /c to extract the setup files (you will be prompted for an extraction path)
      2. then run: setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 OPINSIGHTS_WORKSPACE_ID=yourid OPINSIGHTS_WORKSPACE_KEY=yourkey AcceptEndUserLicenseAgreement=1

Adding an OMS Workspace to an Existing Installation

To update an existing OMS or SCOM agent to point to a new/additional OMS workspace you can either manually configure the new workspace via the GUI or leverage PowerShell.

1.  Interactively via the GUI, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#configure-an-agent-manually-or-add-additional-workspaces

2.  Programmatically via PowerShell.  Note: the 1 at the end of the AddCloudWorkspace call indicates the workspace is in Azure Government.

$workspaceId = "yourworkspaceID"
$workspaceKey = "yourkey"

$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.AddCloudWorkspace($workspaceId, $workspaceKey, 1)
$mma.ReloadConfiguration()
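
To verify the agent picked up the new workspace, you can list the workspaces currently configured on the agent using the same COM object – assuming $mma is still in scope from the snippet above and your agent version exposes the GetCloudWorkspaces() method:

# Lists the workspace entries currently configured on this agent
$mma.GetCloudWorkspaces()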

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/the-oms-agent-for-azure-government-a-cheat-sheet/

 

Using Azure CLI 1.0 to Copy a File between Subscriptions

Note: the examples are from the Azure Government cloud, but the command will work in all clouds.

 

Goal: Use Azure CLI 1.0 to copy a blob file between two different subscriptions

Syntax:  azure storage blob copy start "sourceSAS" destContainer -a destStorageAccount -k destStorageKey

Example: azure storage blob copy start "https://mystorage.blob.core.usgovcloudapi.net/vhds/OSDisk.vhd?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB701qWL0ubiD66wo%3D" vhds -a secondstorage -k zcSRtkJO9LzJuiYbOgoRW6Fgr3lS7lIFIEvIb3hbzJ62XBmZl5Igg1zfogNee8FtwGNGoJ6ADr7kAls6b+wJNQ==

Note: the SAS and storage account key provide access to the storage accounts; subscription access is not required to execute this command.

 

Now let’s break it down.  Here’s how you gather each of the required inputs.

1.  SourceSAS – the SAS is the source file URL + the Shared Access Signature (SAS) token.  For example, if your source URL is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt and your SAS token is ?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D, your SourceSAS is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D.  Note: there are no spaces when you join the two items, and you’ll need to put the string in quotes when used in Azure CLI.

  • Source URL – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –> blob –> your container –> your file.  A new blade opens and the URL is the second item listed.


  • SAS – again the simplest way to generate the SAS token is via the portal.  Browse to your storage account –> Shared Access Signature, update the values (the default will work, but it’s more secure to restrict the SAS Token to only the time frame and resources needed), and then click “Generate SAS”


2.  Destination container – this is the name of the container only (not a URL) that already exists in the destination storage account

3.  Destination Storage Account – this is the name of the storage account only (not a URL) that already exists in the destination subscription

4.  Destination Storage Access Key – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –> Access Keys and copy either key1 or key2.
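
After the copy has been started, you can monitor its progress with the matching Azure CLI 1.0 command.  The values below reuse the placeholders from the syntax line above; destBlobName is a placeholder for whatever name the destination blob was given:

azure storage blob copy show destContainer destBlobName -a destStorageAccount -k destStorageKey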

…and a special thanks to Madan Nagarajan for his sourceSAS breakthrough!

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/using-azure-cli-1-0-to-copy-a-file-between-subscriptions/