The OMS Agent for Azure Government – A Cheat Sheet

Below are the quick and dirty details you need to connect your Windows servers to OMS hosted in Microsoft Azure Government (MAG).  Any server with internet access can report to an OMS workspace, including servers located on-premises, in the Azure Commercial cloud, or hosted by other cloud providers.

Initial Install

  1. Azure Extension –  Note: Azure VMs only; the VM must be in the same subscription as the OMS workspace.  In portal.azure.us go to Log Analytics –> Your Workspace –> Workspace Data Sources –> Virtual Machines, then connect the desired VM (click the VM, then click Connect in the new blade).  The extension installs the full OMS agent on your VM.  For details see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-azure-vm-extension
  2. OMS Agent (MSI) – the MSI can be installed interactively or via command line.  Download the agent from the OMS Portal (settings –> connected sources –> Windows Servers).  For full details, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#download-the-agent-setup-file-from-oms
    1. If installing the agent interactively, be sure you specify the cloud as Azure Government
    2. If installing the agent via the command line, you’ll need to use the OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 parameter to point to Azure Government.  For example (a combined scripted sketch follows these steps):
      1. First, extract MMASetup-AMD64.exe
      2. Then run: setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 OPINSIGHTS_WORKSPACE_ID=yourid OPINSIGHTS_WORKSPACE_KEY=yourkey AcceptEndUserLicenseAgreement=1
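
Putting those steps together, here is a minimal PowerShell sketch.  Two assumptions to verify first: that the fwlink below is still the 64-bit agent download, and that the /c /t: switches extract the package silently.

# Download the 64-bit agent (the fwlink is an assumption; confirm the link in the OMS portal)
Invoke-WebRequest -Uri 'https://go.microsoft.com/fwlink/?LinkId=828603' -OutFile "$env:TEMP\MMASetup-AMD64.exe"
# Extract the package (assumed /c /t: switches), then run a silent install pointed at Azure Government
Start-Process -Wait -FilePath "$env:TEMP\MMASetup-AMD64.exe" -ArgumentList '/c', "/t:$env:TEMP\MMA"
Start-Process -Wait -FilePath "$env:TEMP\MMA\setup.exe" -ArgumentList '/qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=1 OPINSIGHTS_WORKSPACE_ID=yourid OPINSIGHTS_WORKSPACE_KEY=yourkey AcceptEndUserLicenseAgreement=1'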

Adding an OMS Workspace to an Existing Installation

To update an existing OMS or SCOM agent to point to a new or additional OMS workspace, you can either manually configure the new workspace via the GUI or use PowerShell.

1.  Interactively via the GUI, see https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-windows-agents#configure-an-agent-manually-or-add-additional-workspaces

2.  Programmatically via PowerShell.  Note: the 1 passed as the final argument to the AddCloudWorkspace method indicates the workspace is in Azure Government.

$workspaceId = "yourworkspaceID"
$workspaceKey = "yourkey"

$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.AddCloudWorkspace($workspaceId, $workspaceKey, 1)
$mma.ReloadConfiguration()
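
To confirm the agent picked up the workspace, you can enumerate all configured workspaces with the same COM object; a quick sketch (GetCloudWorkspaces is the documented enumeration method, but verify it against your agent version):

# List every workspace this agent reports to, with IDs and connection status
$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.GetCloudWorkspaces()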

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/the-oms-agent-for-azure-government-a-cheat-sheet/

 

Using Azure CLI 1.0 to Copy a File between Subscriptions

Note: the examples are from the Azure Government cloud, but the commands work in all Azure clouds.

 

Goal: Use Azure CLI 1.0 to copy a blob file between two different subscriptions

Syntax:  azure storage blob copy start "sourceSAS" destContainer -a destStorageAccount -k destStorageKey

Example: azure storage blob copy start "https://mystorage.blob.core.usgovcloudapi.net/vhds/OSDisk.vhd?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB701qWL0ubiD66wo%3D" vhds -a secondstorage -k zcSRtkJO9LzJuiYbOgoRW6Fgr3lS7lIFIEvIb3hbzJ62XBmZl5Igg1zfogNee8FtwGNGoJ6ADr7kAls6b+wJNQ==

Note: the SAS token and storage account key grant access to the two storage accounts, so subscription access is not required to execute this command.
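
The copy itself runs server-side, so the start command returns before the data has finished moving.  Here is a sketch for checking progress with Azure CLI 1.0’s copy show subcommand, reusing the destination values from the example above (syntax assumed from the CLI 1.0 help; verify with azure storage blob copy show --help):

azure storage blob copy show vhds OSDisk.vhd -a secondstorage -k destStorageKey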

 

Now let’s break it down.  Here’s how you gather each of the required inputs.

1.  SourceSAS – the source file URL plus the Shared Access Signature (SAS) token, joined with no spaces between them.  For example, if your source URL is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt and your SAS token is ?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D, your SourceSAS is https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2017-05-16T06:59:23Z&st=2017-05-15T22:59:23Z&spr=https&sig=lyzka%2F3ID1qFhLeoxlDcjBpNuHDB001qWL0ubiD66wo%3D.  Note: you’ll need to put the string in quotes when used in Azure CLI (see the sketch after this list).

  • Source URL – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –> blob –> your container –> your file.  A new blade opens and the URL is the second item listed.


  • SAS – again, the simplest way to generate the SAS token is via the portal.  Browse to your storage account –> Shared Access Signature, update the values (the defaults will work, but it’s more secure to restrict the SAS token to only the time frame and resources needed), and then click “Generate SAS”.


2.  Destination container – the name of the container only (not a URL); it must already exist in the destination storage account

3.  Destination Storage Account – the name of the storage account only (not a URL); it must already exist in the destination subscription

4.  Destination Storage Access Key – there are a few ways to get this, but the simplest is via the portal.  Browse to your storage account –> Access Keys and copy either key1 or key2.
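
Putting the four inputs together, here is a minimal bash sketch.  All values are placeholders, and the SAS token shown is truncated; paste the complete token from the portal.

# Join the URL and SAS token with no characters in between, then start the copy
source_url="https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt"
sas_token="?sv=2016-05-31&...&sig=..."          # truncated; use the full token
dest_storage_key="<key1 or key2 from Access Keys>"
azure storage blob copy start "${source_url}${sas_token}" mycontainer -a deststorage -k "$dest_storage_key"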

…and a special thanks to Madan Nagarajan for his sourceSAS breakthrough!

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/05/using-azure-cli-1-0-to-copy-a-file-between-subscriptions/

Working with Azure ARM VMs, Images, and Unmanaged Disk (Storage Accounts)

A lot of the documentation written before managed disks has become hard to find.  Below is a cheat sheet on where to find the documents you need to work with Azure ARM VMs, images, and storage accounts.

Create an Azure VM from custom image (VHD) – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-upload-generalized

Create an Azure VM from existing VHD – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-create-vm-specialized

Create an Azure VM Image (and VM) from existing Azure VM – https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sa-copy-generalized

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/07/working-with-vms-images-and-unmanaged-disk-storage-accounts/

Moving files between Azure Storage and RHEL

There are several options when you want to move files in or out of an Azure Storage account to a Red Hat Linux (RHEL) server.  Below is a quick breakdown of the most commonly used options.

Azure CLI

Azure CLI is designed to run on Linux or Windows, so it is the ideal tool when a Linux machine is involved.  Azure CLI 2.0 (https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest) is the latest and preferred version.  See https://docs.microsoft.com/en-us/azure/storage/common/storage-azure-cli#create-and-manage-blobs for syntax.

Example:

#Start the copy of the blob
az storage blob copy start --account-name ${storage_account} --account-key ${key} --source-account-name ${source_account} --source-account-key ${source_key} --source-container ${blob_container} --source-blob ${blob_name} --destination-container ${target_container} --destination-blob ${target_blob_name}

#Now wait while the copy happens. This could take a while.
while [ "$(az storage blob show --account-name ${storage_account} --account-key ${key} --container-name ${target_container} --name ${target_blob_name} | jq .properties.copy.status | tr -d '"')" == "pending" ]; do
  progress=$(az storage blob show --account-name ${storage_account} --account-key ${key} --container-name ${target_container} --name ${target_blob_name} | jq -r .properties.copy.progress)
  done_count=$(echo $progress | sed 's,\([0-9]*\)/\([0-9]*\),\1,g')
  total_count=$(echo $progress | sed 's,\([0-9]*\)/\([0-9]*\),\2,g')
  progress_percent=$((100 * $done_count / $total_count))
  echo "Copying: $progress ($progress_percent %)" && sleep 5
done

echo "Copying done"

AzCopy

AzCopy on Linux is a command-line utility designed for copying data to and from Azure Blob and File storage using simple commands.  See https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-linux?toc=%2fazure%2fstorage%2ffiles%2ftoc.json, which also covers the specific file-copy steps.

To install on RHEL, you first need to install .NET Core 1.1.1:

  1. sudo yum install rh-dotnetcore11
  2. scl enable rh-dotnetcore11 bash
  3. source scl_source enable rh-dotnetcore11

Then install AzCopy per the article:

  1. wget -O azcopy.tar.gz https://aka.ms/downloadazcopyprlinux
  2. tar -xf azcopy.tar.gz
  3. sudo ./install.sh

Example from RHEL 7.4:

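A minimal sketch of such a download (the account, container, and key are placeholders), using the Linux AzCopy --source/--destination syntax:

# Download every blob in the container to /tmp/downloads
azcopy \
  --source https://mystorage.blob.core.usgovcloudapi.net/mycontainer \
  --destination /tmp/downloads \
  --source-key "<storage-account-key>" \
  --recursive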

Wget

The final approach is to generate a SAS token, append it to the blob URL, and simply use wget on Linux.

See https://blogs.msdn.microsoft.com/nicole_welch/2017/05/using-azure-cli-1-0-to-copy-a-file-between-subscriptions/ for details on how to generate the SAS token and append it to the blob URL (aka, “sourceSAS”).
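
A minimal sketch, assuming a placeholder blob URL with a truncated SAS token appended:

# Quote the URL so the shell does not interpret the & characters in the token
wget -O myfile.txt "https://mystorage.blob.core.usgovcloudapi.net/mycontainer/myfile.txt?sv=2016-05-31&...&sig=..."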

 

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2017/09/moving-files-between-azure-storage-and-rhel/

Azure Government – Missing Features or Services?

If you’ve experimented with the general Azure cloud (often called Microsoft Azure Commercial or MAC), it can be a shock when you move into Microsoft Azure Government (MAG) and notice not everything is the same.  Today I’m briefly going to discuss why there are feature differences, how to track them down, and how to lobby for the features you need.

Note: in this blog I use the terms service, offering, and feature interchangeably.  There is debate (even internally!) on the difference between the terms, but my goal here is to cover both new offerings and extensions to existing offerings.

Is the feature available in MAG at all?

Many features are turned on by region (a physical data center).  The first step is to check whether the offering (say, a DS-series VM) is available anywhere in MAG.

  1. Go to https://azure.microsoft.com/en-us/regions/services/
  2. Select only the Azure Government region checkbox
  3. You can now see that this VM series is only available in the Virginia data center.


If possible, deploy in that region so you can use the feature/service.  However, we understand that is not always possible.  Before any new large-scale deployment, you should review the list of services per region and choose the region that best offers the services you need.  You don’t want to end up in a situation where you have to wait on required services to be deployed by Microsoft!

To stay current on new releases (and they happen at least once a week!) I recommend subscribing to the Azure Government blog at https://blogs.msdn.microsoft.com/azuregov/ 

The feature/service is not available where I need it!

Why not?  This is the big question I get from customers, and of course there is no one easy answer.  These are the main reasons cited:

  • Features often depend on physical hardware, so they deploy to each region at slightly different times depending on the hardware rollout (e.g., a new VM size/series).
  • Some features are delayed as we work on the required compliances or attestations.
  • Engineering wants to release features as quickly as possible.  If the product can be deployed/engineered the same in MAC and MAG, the two releases often happen at the same time.  However, if significant changes are required for MAG, they don’t want to delay the MAC release; in those cases the feature is typically deployed to MAC first and then to MAG after the required modifications are made.  Often features are delayed 6-9 months in MAG.
  • There is no/limited demand expected in MAG.

But I need it!

First, contact Microsoft.  You can talk with your sales team, account team, or even open a case via the portal (portal.azure.us).  We may have an estimate that can be shared if you are under NDA.

Second, vote it up!  At https://feedback.azure.com/forums/558487-azure-government you can nominate new features, add your vote to requests made by others, and generally make your voice heard.  This is a VERY powerful tool, even though it looks innocuous.  Cloud computing is about consumption and by offering the services you want, we increase consumption.  However knowing exactly what customers want is complicated and can come down to a “gut feel.”  By assigning metrics (used for good this time!) we can see exactly how important a feature or change is to the community and react accordingly.  Azure Engineering does review this and uses this data in their planning.  While a highly “voted” item isn’t guaranteed to be implemented quickly, it will increase the visibility and urgency around it.

Here are a few tips for submitting and voting on improvements.

  1. Include hard facts.  We want numbers, timelines, and project implications.  E.g., “if the M series were available in Texas we would deploy an additional 10 VMs by the end of 1Q 2018.”  This gives Microsoft an idea of the urgency/timeline, the infrastructure required to support the request, and the impact to our accounts.
  2. Include clear details.  E.g., if the feature is available in MAC, include the blog announcement or documentation so we know exactly what you are requesting.  Features often have similar names, vary between cloud providers, etc., resulting in confusion.
  3. Include the consequences.  If this request isn’t met, what is the result?  E.g., “if Service Map is not made available in MAG OMS, we will not migrate our monitoring of 8000 servers into OMS.”
  4. Share with your friends!  Votes count, so if you add a request have all your peers vote for it too.
  5. And be realistic.  Asking for a feature not even planned for MAC is unlikely to be approved for MAG.  Asking for a feature fulfilling a clear business need is much more likely to get backing.

If you get stuck or have questions, feel free to comment!

*previously posted on https://blogs.msdn.microsoft.com/nicole_welch/2017/12/azure-government-missing-features-or-services/

The Inevitability of the Cloud

Overview

After working on Microsoft Azure for over three years I think I understand the cloud pretty well (as well as I can with a constantly changing technology!). I understand the benefits (cost, scalability, etc.), the migration drivers, and the goal of a digital transformation, but I hadn’t really thought much about how the cloud came to be. However, something clicked while reading one paragraph in Satya Nadella’s book Hit Refresh (Employee edition), and now I see the cloud as an inevitable step in the evolution of computing. The cloud is more than a new technology; it is the platform that will enable the future of computing.

What exactly is the cloud?

Let’s start by defining what the cloud is. The cloud is a concept which has been implemented independently by multiple providers including Microsoft, Amazon, and Google. Each cloud is separate, but often you can span them with VPNs (virtual private networks). To quote Microsoft, “Simply put, cloud computing is the delivery of computing services—servers, storage, databases, networking, software, analytics, and more—over the Internet (“the cloud”). Companies offering these computing services, cloud providers, typically charge for cloud computing services based on usage, similar to how you’re billed for water or electricity at home.” (see https://azure.microsoft.com/en-us/overview/what-is-cloud-computing/)

Below I break down the major phases in the history of computing and hope to show why the cloud was the logical next step after on-premises data centers. Note: these phases are strictly my own organizational scheme; they often overlap and aren’t strictly consecutive.

Evolution of Digital Computing

Before Digital Computing

Computation, mathematical calculation, has been around since the discovery of numbers. Early analog computers include the abacus, astrolabe, and slide rule followed by mechanical computers including Pascal’s calculator or Pascaline, the Thomas Arithmometer, and the landmark programmable looms in the early 1700s. While generally single purpose, these devices were huge achievements in engineering and laid the foundation for modern computing.

The Rise of the Modern Computer

In 1936 Alan Turing wrote a ground-breaking paper detailing what we now know as the modern computer, and shortly thereafter the first computers were built. These computers were programmable, able to perform tasks detailed in specific algorithms or programs, giving them the flexibility to do anything that could be expressed via a computer program. This opened the door for computers to be used in nearly every industry, but the extreme cost and extensive training requirements led to limited market adoption. Only the “big players” had computers, so programmers, highly trained computer specialists, would often share a computer by reserving time or submitting jobs in batches and waiting for the results.

The Democratization of Computing

Multiple advancements in computer hardware, software, and human-computer interaction (HCI) combined for the birth of the personal computer (PC). Simultaneously computers got smaller, cheaper, and more user friendly. Things we take for granted today, like an operating system, monitor, and mouse, made computers more accessible to non-techies. With the release of standard operating systems came a flood of applications aimed at both businesses (word processing, accounting, etc.) and consumers (games). As the use of computers expanded, the global data volume began to grow rapidly.

Connecting Devices

No man is an island and that goes for computers too. The solution to this isolation was computer networking. At first these were small, private, physical networks connecting devices within a single location (client-server model). Over time they evolved to span geographic locations, allowed for secure connections, removed the physical connection requirement, and culminated in the internet which allows us to share data instantly on a global scale. This mesh of omnipresent communication networks changed how we share, store, and access data. This is where the cloud began…

The Internet of Things (IoT)

Devices have been talking to computers for decades (SCADA has been around since the 1950s), but the release of smart phones in 2007 accelerated the path to near-constant connection of both people and devices. Beds, light bulbs, cars, etc. now regularly gather, transmit, and receive information with the goal of improving our lives. This unprecedented surge in data made it impractical for most organizations with on-premises storage to keep up. This is where the cloud becomes a necessity – allowing you to store vast amounts of data, access it from anywhere, analyze it quickly, and do it all on-demand.

Leveraging our Data

The scientific method involves careful observation, data gathering, and analysis, and in this way scientists have reached conclusions that revolutionized how we live, e.g. disinfecting hands and medical implements reduces the spread of disease. The cloud provides data, compute, and analytics resources at an unprecedented scale. Since massive data sets can now be composed from a variety of sources not typically combined, we should see more insights, right? Unfortunately, there are two big blockers I see.

1. Trust – For example, do we trust that our employer, health insurance provider, and medical team can share data in a way that won’t impact our career or health care premiums? I believe Microsoft Azure itself is secure, as demonstrated by the countless certifications, attestations, and compliances, and the use of blockchain technologies like Microsoft Coco could be the answer to trusted exchanges. Combined, these technologies enable data sharing in new and exciting ways.

2. Data mining – This is the difficult process of finding patterns in large data sets – the insights that can be used for prediction, etc. Finding these patterns is so resource intensive that we are increasingly turning to Artificial Intelligence (AI) to assist.

Artificial Intelligence (AI)

AI allows computing to move into new roles that require adaptability and creativity, previously the purview of humans who have limited lifespans and workdays. I believe new AI technologies like machine learning will provide the breakthroughs needed to harness the power of our data. The Microsoft Azure Cloud already offers AI services (Cognitive Services and Machine Learning) and examples to show just how easy it is to use. This will be a major shift in how humanity functions and will be disruptive in the short term. However, I don’t foresee a dystopian future, but rather a future where humans use AI to solve “the big problems.” Humanity always depends on the tools we invent – AI is simply a little fancier than the wheel and alphabet.

And then?

If our tech-neck, poor eyesight, and carpal tunnel health issues are any indication, humans still conform to the computer instead of it conforming to us. I expect advancements that make interacting with computers more natural and less physically demanding, with special focus on assisting those with disabilities. Turning to hardware, quantum computing is poised to break Moore’s Law and we can now encode massive volumes of data within DNA. Walls we perceive now as technical limits will disappear as humans prove yet again that we are ingenious and can innovate our way past any blocker.

Closing

In short, I see an exciting future of possibilities powered by the cloud. Transport devices that drive themselves, education tailored to each student, and medical breakthroughs that extend the human lifespan. Our lives thirty years in the future will be just as different from today as our lives were thirty years ago, and likely in ways we can’t even imagine. Better? Hopefully. Different? Definitely!

Special thanks go to my father Joel Luedeman for his help with this article. After nearly 50 years in computing, he was the perfect partner to help me put the past in perspective.

*previously posted on https://blogs.msdn.microsoft.com/nicole_welch/2018/01/the-inevitability-of-the-cloud/

 

The Importance of a Growth Mindset in a Technology First World

This week I attended an internal Microsoft conference. The little group I spent my week with was fairly diverse – largely American but with Finnish, Israeli, and Indian participants; male and female; some from large cities, others from small towns. This group broke the cardinal rule of polite conversation and discussed religion, politics, and diversity at length…without antagonism! How did we manage this? We give credit to the growth mindset fostered at Microsoft and essential to survival in a cloud-first world.

Growth Mindset

For those of you not familiar with the concept of a growth mindset, psychologist Carol Dweck describes it as follows:

In a fixed mindset students believe their basic abilities, their intelligence, their talents, are just fixed traits. They have a certain amount and that’s that, and then their goal becomes to look smart all the time and never look dumb. In a growth mindset students understand that their talents and abilities can be developed through effort, good teaching and persistence. They don’t necessarily think everyone’s the same or anyone can be Einstein, but they believe everyone can get smarter if they work at it.

As babies we are born with a growth mindset. We are constantly trying new things, right or wrong, eager to learn and improve. The result is important (i.e. learning to walk and talk) but so is the journey – think of parents applauding every attempt to stand, crawl, and walk for months in addition to the final accomplishment of walking. Over the years though, we are retrained by society to focus on results or accomplishments. Kids won’t raise their hand in class because having the wrong answer is worse than no answer. Kids don’t try out for sports because they are worried they won’t make the team. Without correction, these kids become adults with a fixed mindset – more focused on how they are perceived than actually being intelligent. Often these individuals come to believe they have all the required answers and are unwilling to accept additional information that challenges their views.

My Personal Journey

For me, three key items helped me transition to a growth mindset. This was a very natural journey for me, but also difficult. So much of my self-worth and identity was wrapped up in “being right” and I had to sacrifice my ego to grow.

1. Having conclusive evidence that I wasn’t the smartest. When one friend scored a 1600 on the SAT and another’s science fair project on chaos theory placed at state (in 7th grade!), I had to accept there is often more than one “smart person” in a room. If your ego prevents you from learning from others you can’t have a growth mindset; you must decouple your ego from your “smart person” status.

2. Dating a person who challenged me. When I began dating my now-husband, my ego took a huge hit. He kept beating me at games and trivia, and even remembering key data points better than I did. I was used to being “always right” and was forced to realize that maybe I didn’t know as much as I thought.

3. Talking with children. Kids are endlessly curious and it’s infectious. When my son was a preschooler he asked me, “what is the difference between a seed and a nut?” After a brief pause I replied, “I don’t know. Let’s look that up.” Children are a constant reminder that adults still have so much to learn about our world and how things work. Just because we are out of school does not mean we are done learning.

How does this impact you?

All of us live in a technology-driven society that constantly changes. And it’s not just the technology that changes, it’s anything that technology touches…which is EVERYTHING. Doctors, mechanics, parents, teachers all take continuing education. Having a fixed mindset will at best hold you back and at worst lead to poor decisions that could have deadly consequences. Having a growth mindset allows you to take advantage of the latest discoveries, innovations, and all the benefits they offer.

A growth mindset is key to succeeding in information technology. Standards from five years ago are fading, those from ten years ago are archaic, and those from 15+ years ago are almost completely gone. We must build on the past as we reach toward the future. We must be willing to try new technologies and IT strategies. Do new paradigms and technologies scare you? If so, take time to evaluate why. One of the common reasons is fear: fear of change, fear of uncertainty, fear of becoming obsolete. With a growth mindset, these concerns are opportunities to grow, not possibilities to fear.

Summary

My advice? Keep trying new things and keep learning!

  • Learn a new skill (for work or fun)
  • Read
  • Travel or watch programming that exposes you to different cultures and ways of life
  • Listen and learn from those around you
  • Surround yourself with smart people
  • Go outside your comfort zone

Only by learning and taking risks will you stay afloat. By learning continually, you can thrive.

previously posted on https://blogs.msdn.microsoft.com/nicole_welch/2018/02/the-importance-of-a-growth-mindset-in-a-technology-first-world/

Azure Automation–Using Sample Runbooks in Azure Government

If you hit issues using the Gallery runbooks in Azure Government (like StopAzureV2Vm), you may need to add a value for the environment name.  Just like when connecting to Azure Government via Azure PowerShell or CLI, you need to specify the environment name (AzureUSGovernment).

Add that value to the runbook’s connection step and you should be fine!  A sketch of the typical pattern follows.

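For PowerShell-based runbooks, the equivalent fix is adding the environment name to the authentication call.  A hedged sketch of the common Run As connection pattern (assumes the standard AzureRunAsConnection asset exists in your Automation account and the AzureRM modules are loaded):

$connection = Get-AutomationConnection -Name 'AzureRunAsConnection'
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint `
    -EnvironmentName AzureUSGovernment    # the Azure Government environment name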

previously posted on https://blogs.msdn.microsoft.com/nicole_welch/2018/02/azure-automation-using-sample-runbooks-in-azure-government/

Free Azure Training Resources

updated January 7, 2019

When you’re getting started with Azure there is so much to learn and so little time!  Below is a quick summary of the resources I recommend to my customers as they start ramping up.

And for those of you already familiar with AWS:

*previously posted at https://blogs.msdn.microsoft.com/nicole_welch/2018/06/free-azure-training-resources/