Tuesday, March 7, 2017

How to download Azure blob storage contents in Azure Linux VM using Azure CLI

Abstract

I always get this question – how can I download Azure blob storage files in an Azure Linux VM? When I say "use the Azure CLI (Command Line Interface)", the next question is – do you have a step by step guide?
Well, this blog post is the answer to both questions.

The high level approach is outlined below –
  1. Provision a Linux Azure VM in a subnet. [Of course this step is out of scope of this article. For detailed steps refer to this guide.]
  2. Install Azure CLI in the Linux VM.
  3. Upload sample files to Azure storage and then download them to a folder in the Linux VM.

This article assumes you understand Azure storage and related concepts.

Wow, this is my first blog post on Linux and Azure.

Prepare your Azure Linux VM

I have provisioned an Azure Linux VM running CentOS 7.2, added it to a subnet, and attached an NSG that allows only port 22 inbound. I have also attached a public IP to this VM so that I can SSH into it from anywhere over port 22. The step by step guide link is already shared above.
If you are running an OS other than CentOS then the commands in the steps below will change, however the high level approach remains the same.

Install CLI in Azure CentOS VM

First SSH into your Linux VM and run the command "sudo su" [without double quotes]. This way, in subsequent steps we will not face the usual "access denied" or "permission denied" errors, and we don't have to prefix every command with "sudo".

There are two versions of Azure CLI –
1.0 – Written in node.js and supports both Classic [the old way of doing things on Azure] and ARM [the new, fancy way of doing things on Azure].
2.0 – Microsoft calls this one the "next generation CLI"; it is written in Python and supports only ARM mode.

I will be using version 2.0, hence I also need Python installed on the Linux Azure VM. So let's first install a recent Python version on the Azure CentOS VM.
First make sure that yum is up to date by running the command below –

sudo yum -y update

The -y flag tells the system "relax, we know we are making changes, so do not prompt for confirmation and save our valuable time". This command will take a good amount of time to run.
Next install yum-utils using the command below –

sudo yum -y install yum-utils

Now we need to install IUS (Inline with Upstream Stable). Don't get scared by the name. This is a community project which ensures that whatever Python 3 version we install, we get the most stable build of it. Run the command below to install IUS –
sudo yum -y install https://centos7.iuscommunity.org/ius-release.rpm

With IUS in place we can now install a recent version of Python. As of this writing the latest version is 3.6, but I will install 3.5 to be on the safer side. Within the 3.5 line, 3.5.3 is the latest, so let's install that.
sudo yum -y install python35u-3.5.3 python35u-pip

To verify, simply run the command below; its output should be 3.5.3.
python3.5 -V

Now install the required prerequisites on CentOS using the command below –
sudo yum check-update; sudo yum install -y gcc libffi-devel python-devel openssl-devel

Finally, back to the installation of CLI 2.0 –


curl -L https://aka.ms/InstallAzureCli | bash
This may prompt you for the directory in which to install the CLI. Press enter to keep the default installation path, which would be "/root/lib/azure-cli". Similarly, keep pressing enter if more prompts are displayed.
Restart the command shell for the changes to take effect –
exec -l $SHELL
Just type "az" [without quotes] and it should show you the Azure CLI command help in CentOS. This means the installation of Azure CLI 2.0 on Linux is successful.
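You can also double check which version got installed –
az --version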

Run the command below to list the storage related commands.
az storage -h

Upload sample files to Azure Storage

This step is straightforward. Use the Azure portal to create one standard [not premium] ARM based storage account. Create a container and upload 4 sample files into it. It would look like below –






Add Azure account to CLI

Run the login command as shown below. It will prompt you with a code and a link where you enter that code. After this you will be asked to log in using your existing Azure credentials. A successful login will show you the subscriptions associated with your account, as below –
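For reference, the login flow from the shell looks roughly like this (the subscription name below is just a placeholder) –

# start the device login flow and follow the code/link it prints
az login

# if more than one subscription is listed, pick the one you want to work with
az account set --subscription "Your Subscription Name"

# confirm the currently selected subscription
az account show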



Set storage account and download the blob

Now set the credentials for the storage account as environment variables.
export AZURE_STORAGE_ACCOUNT=YourStorageAccount
export AZURE_STORAGE_ACCESS_KEY=YourStorageAccountKey
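If you don't have the account key handy, you can also pull it with the CLI itself; a quick sketch (the resource group name is a placeholder) –

az storage account keys list --resource-group YourResourceGroup --account-name YourStorageAccount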

Create a directory named test1 using the command below. This is the directory into which we will download the blob contents –
mkdir test1

After this, run the command below to download the blob into the test1 folder –
az storage blob download -c sample -n File1.txt -f test1/File1.txt

Change directory to test1 with cd test1 and run ls -l. This should list File1.txt as shown below.

Limitation

Using the Azure CLI you can't download all the blobs from a container in one command. You have to download each and every blob individually; bulk download of Azure blobs is not supported.
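That said, you can script your way around it. Below is a minimal bash sketch that lists the blob names and downloads them one at a time; it assumes the container is named sample, the storage environment variables are exported as shown earlier, and the test1 directory already exists –

# list blob names in the container and download each one into test1
for blob in $(az storage blob list -c sample --query "[].name" -o tsv)
do
    az storage blob download -c sample -n "$blob" -f "test1/$blob"
done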

Resolution to Bulk download

To download all blobs from a container we will need to use the Azure XPlat CLI instead of the Azure CLI. We could also use PowerShell, as it is open source now [although I have not tried that yet]. It's common to find many approaches to achieve one task when you are in the open source world.
The Azure XPlat CLI is a project that provides a cross platform command line interface to manage Azure. Refer to the documentation here - https://github.com/Azure/azure-xplat-cli. But that is another blog for another day.

Conclusion

So I hope you now understand how easy it is to download Azure blob storage contents in a Linux virtual machine.
Please provide your valuable comments. Good news is, it's free!!
Keep Downloading!!

Thursday, January 5, 2017

Domain join Azure VM using Azure Automation DSC

Abstract

Azure Automation has changed a lot since I wrote my last blog about AutoShutdown of Azure VMs using Azure Automation. Looking at the phenomenal rate at which the Azure platform evolves, it makes perfect sense to revisit the same services and write a new blog about completely new features and tasks.
This article is a step by step guide to making an Azure VM join a domain automatically using the Automation DSC feature. This guide does not cover –
-        Step by step flow for creating an Azure Automation account in the Azure Portal
-        Azure VM provisioning
-        Domain configuration on the domain controller

What is DSC?

DSC stands for Desired State Configuration. It is Microsoft's configuration management tool; other popular configuration management tools on the market include Chef and Puppet. Basically, it helps automate tasks which would otherwise be very boring to do manually.
An example of such a boring task is joining an Azure VM to a domain when it is provisioned. I am working with a customer where almost every month they provision 100+ VMs on Azure and then remove them. To satisfy the organization's compliance and security policies, all VMs must be domain joined. The poor IT team had to do this repetitive domain joining task manually almost every day. There was a dedicated team member for this, and he was about to need psychiatric treatment. Thanks to Azure Automation DSC, he is back to normal now.
If you are interested in knowing more about DSC, the link is here - https://msdn.microsoft.com/en-us/powershell/dsc/overview.
Note -
As of today Azure supports Classic (ASM) and ARM (Azure Resource Manager) deployments of resources. ARM is the future and this article talks about ARM based resources only. Provisioning an Azure ARM VM and configuring the domain controller are out of scope of this article. Refer to this article - http://www.dotnetcurry.com/windows-azure/1145/active-directory-adfs-azure-virtual-machine-authentication-aspnet-mvc - for quick steps on domain controller provisioning. That article talks about classic VM provisioning, which you can ignore; directly follow the steps from the section "Configure Active Directory" to promote the VM to a domain controller.

Provision Azure Automation Account

The link below specifies the steps to provision an Azure Automation account – Create Azure Automation account. I am using the values below for the same –



In the above screenshot the subscription name is blurred, because your subscription name will be different from mine and I want to keep mine secret for security purposes. sssssshhhh…
The new automation account will look as below -



To learn what the various options in an Automation account mean – Runbooks, Assets, Hybrid Worker Groups and so on – refer to https://mva.microsoft.com/en-US/training-courses/automating-the-cloud-with-azure-automation-8323?l=C6mIpCay_4804984382.
As our focus is specifically on writing a DSC script to make VMs join the domain automatically, I will not spend time on general Azure Automation concepts.
With this, let's move forward to the actual implementation.

Import xDSCDomainJoin module

xComputerManagement is the DSC module that can be used to make a computer domain joined; xDSCDomainjoin is a stripped down version of the same. The module is available on the PowerShell Gallery, the central repository for PowerShell modules. To know more refer to https://www.powershellgallery.com/.
So the PowerShell Gallery has the xDSCDomainjoin module and we must first import it into our automation account before we use it in our script. The easiest way to import a module into an Automation account is from the Azure Portal.
On the Azure Portal, select your Automation account, then click Assets -> Modules. All existing modules will be shown as below –



Click on the "Browse Gallery" option. Search for xDSCDomainjoin in the search box and it will appear as shown below. Then click "Import" and click Ok to complete the import of the module into the automation account –



A message saying "Activities being extracted" will appear. Let this procedure continue. After a successful import, the assets count on the main page of the Automation account will increase by 1.

Installing Azure PowerShell on local machine

On your local machine/laptop, open PowerShell ISE. You need all the Azure PowerShell commands available on your local machine. Working on Azure without PowerShell is like Superman without powers (or underwear…). Therefore, first install Azure PowerShell as per the guide given here - https://docs.microsoft.com/en-us/powershell/azureps-cmdlets-docs/#install-and-configure.

Writing DSC script for domain join

After the installation, we must first provide the authentication information of our Azure account to the currently open PowerShell ISE window. For this run the command –
Add-AzureRmAccount

This will prompt for login. Go ahead and log in to complete the authentication.
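If the account you logged in with has access to more than one subscription, it can help to pin the right one before going further (the subscription name is a placeholder) –

Select-AzureRmSubscription -SubscriptionName "Your Subscription Name"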
Create a new file in PowerShell ISE and save it as DomainJoinConfiguration.ps1. Write the PowerShell below in that file –
# the xDSCDomainjoin module must already be imported into the Azure Automation account (see the previous section)

Configuration DomainJoinConfiguration
{
    Import-DscResource -ModuleName 'xDSCDomainjoin'

    # domain credentials to be given here
    $secdomainpasswd = ConvertTo-SecureString "YourDomainPassword" -AsPlainText -Force
    $mydomaincreds = New-Object System.Management.Automation.PSCredential ("UserName@Domain", $secdomainpasswd)

    node $AllNodes.NodeName
    {
        xDSCDomainjoin JoinDomain
        {
            Domain = 'YourDomain'
            Credential = $mydomaincreds
        }
    }
}

In the above script replace the YourDomainPassword, UserName@Domain and 'YourDomain' values with your own values. This is your final script to make an Azure VM domain joined. Now we must upload this file to the Azure Automation account. Therefore, click on DSC Configuration -> Add Configuration as shown below –



On the next window upload the file we created in the above step and then click Ok to complete the domain join DSC configuration. And yes, please provide a meaningful description as shown below -




Why password in plain text?

In the above script, you must have observed the line below -
$secdomainpasswd = ConvertTo-SecureString "YourDomainPassword" -AsPlainText -Force

This forces us to keep the password in plain text. As you have guessed, this is not good practice. But I am not going to leave it at that. Please read the next sections to understand why we are keeping the password in plain text. So, hold on to your emotions.

Adding Configuration Data to DSC script of Domain Join

Configuration data allows you to separate structural configuration from any environment specific configuration while using PowerShell DSC.

In other words, we want to separate the "WHAT" from the "WHERE". The DSC script we have written above specifies the structural configuration (what). This is where we define what is needed, and it does not change based on environment – irrespective of whether it is development or production, we want VMs to be domain joined. Environmental configuration specifies the environment in which the configuration is deployed (where); for example, common settings for all nodes and specific settings for specific nodes.

To specify the environment configuration we use configuration data, and then we compile the DSC script together with that configuration data. It should contain an "AllNodes" key where you specify the common configuration for all nodes that should get domain joined automatically, and it can then contain other node specific keys. By the way, the Azure VMs we add to the DSC configuration are termed "nodes".
For all nodes I want to allow plain text passwords and domain user credentials, and the specific nodes I want domain joined. Therefore, we will write the config data as –

$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
           
        }
        @{
            NodeName = "DomainJoined"
        }
    )
}

I will need to use this configuration data to compile my DSC script.

Compile the DSC configuration

The domain join DSC script has been added to the Azure Automation account and now it is time to compile it, so that a .MOF file is generated on the Azure pull server. Once the .MOF is generated, all DSC nodes added to the automation account receive their configuration from that .MOF file. If you open the DSC configuration, you will observe that a "Compile" button is available at the top in the portal itself.
However, as of today, if you have a DSC script and you wish to compile it with configuration data, PowerShell is the only option. If you compile a DSC script with config data using the portal, you will receive errors. So, we will write a compilation script and pass the above configuration data to it, to compile the DSC script present in Azure Automation from the local machine.
To compile the DSC script present in the Azure Automation account from your laptop, you will need credentials. This is where a service principal helps. When we create an Azure Automation account, an Azure Active Directory service principal is automatically added to the Azure AD tenant under which the current subscription lives. To verify, just go to Azure AD on the Azure Portal and select "App Registrations". You will see the automation account added as a web app.



Every app registered in Azure AD has an application Id – this is your username.



Every app registered in Azure AD has a secret key – this is your password.



Specify the duration of your choice, give the key a logical name such as "KeyForDSC", and then click Save. This automatically generates the key. The secret key is visible only once, after which it won't be shown on the portal again, so make sure you store it somewhere safe (not on your desktop) for future use.
The Azure AD tenant has a unique id – this is your tenant id.



We need to use the Login-AzureRmAccount command with the above credentials to start a compilation job for the DSC script in the Azure Automation account from the local machine. The complete compilation job script, with the config data and service principal credentials, is below –
$ConfigData = @{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $True
            PSDscAllowDomainUser = $true
           
        }
        @{
            NodeName = "DomainJoined"
        }
    )
}

$secpasswd = ConvertTo-SecureString "YourSecretKey" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("YourApplicationId", $secpasswd)
Login-AzureRmAccount -ServicePrincipal -Tenant "YourADTenantId" -Credential $mycreds

$compilationJob = Start-AzureRmAutomationDscCompilationJob -ResourceGroupName 'YourResourceGroupName' -AutomationAccountName 'YourAutomationAccountName' -ConfigurationName 'DomainJoinConfiguration' -ConfigurationData $ConfigData
$compilationJob

Run the above script and the compilation job will start. Once completed, you can view the job status as below –
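If you prefer watching the job from PowerShell instead of the portal, here is a rough sketch that polls the compilation job until it finishes (it reuses the $compilationJob variable from the script above) –

# keep refreshing the job object until it has finished or failed
while ($compilationJob.EndTime -eq $null -and $compilationJob.Exception -eq $null)
{
    Start-Sleep -Seconds 5
    $compilationJob = $compilationJob | Get-AzureRmAutomationDscCompilationJob
}

# final status of the compilation job
$compilationJob.Status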



This completes the domain join DSC configuration in the Azure Automation account.

Password in Plain Text

In both the DSC script and the compilation script above, we used the password in plain text.

The best way to protect the password is to use certificate based encryption. If you need details on that, refer to - https://blogs.technet.microsoft.com/ashleymcglone/2015/12/18/using-credentials-with-psdscallowplaintextpassword-and-psdscallowdomainuser-in-powershell-dsc-configuration-data/.
However, even if you keep the password in plain text in the script it is reasonably safe, because the .MOF file generated from compiling the DSC script in Azure is always stored encrypted. The DSC compiler doesn't know that Azure is going to keep the entire .MOF file encrypted, so it throws an error unless we explicitly allow plain text passwords (the PSDscAllowPlainTextPassword setting in the configuration data, together with the -Force switch in the script). So, if you are sure that no untrusted admin will access the automation account from the Azure portal, you can keep the credentials in plain text. The choice is yours!

Add DSC node to make domain join – finally!

On the Azure portal, under the automation account click on DSC Nodes -> Add Azure VM -> Virtual Machines. This will list all VMs present in the Azure subscription. Select the VMs of your choice to get them domain joined.
Under the Registration tab, select the node configuration name "DomainJoinConfiguration.DomainJoined" and provide the other information as shown below.



After a while, the selected node will display its status as Compliant and the VM will show as domain joined.
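If you have many VMs to onboard, the same registration can be scripted instead of clicking through the portal. A hedged sketch using Register-AzureRmAutomationDscNode (all names are placeholders; run it from the same authenticated PowerShell session) –

# registers an existing Azure VM as a DSC node and assigns it the compiled node configuration
# add -AzureVMResourceGroup / -AzureVMLocation if the VM lives in a different resource group or region
Register-AzureRmAutomationDscNode -ResourceGroupName 'YourResourceGroupName' `
    -AutomationAccountName 'YourAutomationAccountName' `
    -AzureVMName 'YourVMName' `
    -NodeConfigurationName 'DomainJoinConfiguration.DomainJoined' `
    -ConfigurationMode 'ApplyAndAutocorrect'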

Bonus Tip - To join VM to specific OU

This is simple. In the main DSC script you will need to provide your OU information if you want all VMs to be joined to a specific OU. Example below -
xDSCDomainjoin JoinDomain
{
    Domain = $Domain
    Credential = $Credential # credential used to join the domain
    JoinOU = "CN=Computers,DC=someplace,DC=kunal,DC=gov,DC=au"
}
Sometimes you may see errors related to domain credentials in the log generated by the DSC job. In that case, instead of specifying user@domain you can specify the credentials in domain\user format in the DSC script.


Conclusion

I hope this post has helped save you from the very repetitive task of domain joining machines.
Please provide your valuable comments. Good news is, it's free!!

Keep Automating!!

Saturday, May 30, 2015

Export to Excel Microsoft Azure VM List


Introduction


As of today there is no way on the full management portal (http://manage.windowsazure.com) or on the new Azure portal (http://portal.azure.com) to export the list of Azure virtual machines to Excel. This blog talks about exactly that. It is actually a very common requirement and I have seen a lot of people searching for this functionality. This blog is an attempt to simplify their life a bit.
The advantage of exporting the Azure VM list to Excel is that you can maintain this Excel sheet in TFS or SharePoint to track the Azure environment resources. This drastically simplifies Azure resource management when the number of VMs in an Azure subscription runs into the hundreds. I have seen a client's Azure subscription with 120 VMs that did not have user friendly names. Every day they faced the challenge of understanding the purpose of each VM, who owns it, how long it should be kept, which ones need to be shut down late every night and started every morning, and so on. So I suggested they track all this in an Excel sheet. They liked the idea but came back to me saying that writing out so many VMs manually in an Excel sheet is boring – can you please export the Azure VM list to Excel and provide it to us?
I was thinking there has to be an Export to Excel button on the Azure portal, but unfortunately there is none. So, if you need to export your Azure VM list to Excel, the only answer is to write Azure PowerShell commands and export it as a .csv file.
Below are the PowerShell commands that can give you the Azure VM list exported as CSV, which you can then manually save as an Excel sheet. So here we go!

Applicable Technology Stack

  1. Azure subscription with at least one Azure VM provisioned in it.

Setting up the connection

Open PowerShell ISE as an administrator as shown below –

First run the command -
Add-AzureAccount

This opens a pop-up that asks you to enter the credentials of your Azure subscription. Enter the subscription's Live ID and password. This connects the current PowerShell window to Azure.

Setting up Subscription of Azure


One Live ID account may have multiple Azure subscriptions associated with it. Therefore we first need to select the Azure subscription from which we wish to export the VM list, using the command below –
Select-AzureSubscription -SubscriptionName "Your Azure Subscription Name Goes Here"

Enter your subscription name in the above command. The subscription name can be found in the management portal as shown below –
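Alternatively, you can list the subscription names from the same PowerShell window without opening the portal –

Get-AzureSubscription | Select-Object SubscriptionName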


Get Azure VMs and export the list


First let's declare an array to hold the list of Azure VMs.
$results = @() 

Then retrieve the list of VMs using the command below –
$vms = Get-AzureVM 

The variable $vms consists of all VMs present in the selected subscription. We therefore need to iterate over this list, retrieve the important information from each Azure VM, and append it to the array we defined above.
Once everything is added to the array, simply export it to CSV with a pipe as below –
$results | export-csv -Path "Path to CSV goes here. Example, c:\AzureVMInfo.csv"

So the entire source code I have written in a function as below –
Function Export-AzureVMsToCSV
{
    # array to hold the details of each VM
    $results = @()

    # get all VMs in the currently selected subscription
    $vms = Get-AzureVM

    foreach ($vm in $vms)
    {
        # pick the properties we care about for the export
        $details = @{
            HostName           = $vm.HostName
            CloudServiceName   = $vm.DNSName
            Name               = $vm.Name
            VirtualNetworkName = $vm.VirtualNetworkName
            IPAddress          = $vm.IpAddress
            PublicIPAddress    = $vm.PublicIPAddress
        }
        $results += New-Object PSObject -Property $details
    }

    $results | Export-Csv -Path "Path to CSV goes here. Example, c:\AzureVMInfo.csv"
    Write-Output "export succeeded"
}
Export-AzureVMsToCSV

This way the .csv file gets created at the specified location with all the details. You can then save it as an Excel workbook manually by opening the .csv file in Excel.
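If you want to automate even that last manual step, below is a rough sketch that converts the generated .csv into a .xlsx using the Excel COM object. It assumes Excel is installed on the machine and that the file paths match the placeholder used above –

# open the CSV in Excel and save it as a proper workbook (51 = xlOpenXMLWorkbook, i.e. .xlsx)
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false
$workbook = $excel.Workbooks.Open("c:\AzureVMInfo.csv")
$workbook.SaveAs("c:\AzureVMInfo.xlsx", 51)
$workbook.Close($false)
$excel.Quit()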

Source Code

The entire source code for the Azure VM export to Excel is available at the location below –
https://gallery.technet.microsoft.com/Export-to-excel-Azure-19930e31
Hope this helps.
Cheers…