Accelerate resource tagging with PowerShell — Microsoft Azure Tag Report

Some time ago, I prepared these scripts to quickly tag many resources deployed on the Microsoft Azure platform. There are many ways to do this quickly and easily, but I aimed for a universal solution that is easy and safe to use. So I split the job in two: one script exports all resources to a CSV file, and a second script reads the resource data back from that CSV file and writes it to the platform.

The scripts are available on GitHub:

Description of the scripts:

  • GetAllTags.ps1 — generates two CSV files: one for Resource Groups and one for Resources. This is important because the groups are kept separate, so we can tag groups differently from resources without filtering by resource type.
  • SetTagsResourceGroups.ps1 — a script that takes data from a CSV file containing the Resource Group inventory with the tags to be deployed.
  • SetTagsResources.ps1 — a script that reads data from a CSV file containing the saved resources with the tags to be applied.

This division gives us:

  1. A snapshot of the current tag state saved to a CSV file. This is the tag report, i.e. the current status of the tags in place.
  2. The split into resource groups and resources.
  3. The ability to fix tags only on resources or only on resource groups.

How do the scripts work?

GetAllTags

  1. The files are saved automatically to the location where you execute the script, or you can use the -path parameter to choose where the files are saved.
  2. The file name includes the date; if a file already exists, it is overwritten.
  3. The script checks if you are logged in to Azure. When executing the script, you can specify the -tenantId parameter to make sure that you log in to the appropriate Azure directory, and -subId to select the correct subscription (a minimal sketch of this check follows below).
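
As a minimal sketch (the parameter names mirror the ones above, but this is not the script's exact code), such a login check can look like this:

# Sketch: connect only when no Azure context is cached yet
param(
    [string]$tenantId,
    [string]$subId
)

if (-not (Get-AzContext)) {
    if ($tenantId) { Connect-AzAccount -Tenant $tenantId -Subscription $subId }
    else { Connect-AzAccount }
}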

SetTagsResourceGroups and SetTagsResources

  1. Data from the files is imported automatically based on the naming convention of the previously generated files. The script imports the latest CSV file in the directory given via the -path parameter.
  2. Tags should be separated by commas.
  3. The tags saved in the CSV file work as “Key: Value” pairs in separate columns.
  4. The script reads all the items from the CSV file, removes the existing tags in Microsoft Azure, and then applies the tags from the CSV file (see the sketch after this list).
  5. If you do not want to touch a given resource, just remove it from the CSV file.
  6. The script runs in parallel mode, which makes the changes faster. The throttle for writing tags is 5; keep in mind that this is the optimal value.
  7. The script checks if you are logged in to Azure. You can specify the -tenantId parameter when executing the script.
  8. If you want to clear the tags on a resource, just leave an empty cell or enter “empty”.
  9. You can test the tagging by keeping only the resources you want to change in the CSV file. The remaining resources will not be considered when the tags are applied.
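
To make the mechanics concrete, here is a minimal sketch of the apply step. It assumes PowerShell 7+ (for ForEach-Object -Parallel) and an Az context available to the parallel runspaces; the column and file names are illustrative, not the script's actual ones:

# Sketch: apply tags from a CSV in parallel with a throttle of 5.
# Assumed CSV layout: a ResourceId column plus tag columns holding "Key: Value" strings.
$rows = Import-Csv ./Resources-2021-01-01.csv

$rows | ForEach-Object -Parallel {
    $row  = $_
    $tags = @{}
    foreach ($prop in $row.PSObject.Properties | Where-Object Name -ne 'ResourceId') {
        # Skip empty cells and the "empty" marker (step 8 above)
        if ($prop.Value -and $prop.Value -ne 'empty') {
            $key, $value = $prop.Value -split ':\s*', 2   # split "Key: Value"
            $tags[$key] = $value
        }
    }
    # Replace removes the existing tags and writes the new set (step 4 above)
    Update-AzTag -ResourceId $row.ResourceId -Tag $tags -Operation Replace
} -ThrottleLimit 5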

Below is an example of how it works on my subscription.

Running the script:

  • GetAllTags.ps1

Command: ./GetAllTags.ps1 -path /tags

On the screenshot, you can see the output with the information generated by the script. It shows:

  • Whether you are logged in; if not, it will ask you to log in.
  • The number of resource groups and resources found.
  • Information about where and which files are created.
  • The resource list.

The CSV files look like this:

  • Resource Group
  • Resources

Below I will show how to edit the tags in the resources file.

Attention! If you want to keep a copy of the current state, protect the generated file or rename it to something completely different.

The corrected file looks like this. Note that I have filled in the tags for all objects and added a new tag, Test: Key, to show what assigning multiple tags looks like.

The script output with the changes made:

Command: ./SetTagsResources.ps1 -path /tags

The effect of changes in the Azure Portal:

You can customize the script to suit your purpose, for example so that it runs across multiple subscriptions, as sketched below. There are many customization options that I have intentionally left out. The scripts work modularly, so they can easily be combined with another script, and the rest will be prepared and applied based on the input data from the file.
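
For example, a hypothetical wrapper that runs the report across every subscription you can see might look like this (the per-subscription output folder is just an illustration):

# Run GetAllTags.ps1 once per visible subscription
Get-AzSubscription | ForEach-Object {
    Set-AzContext -Subscription $_.Id | Out-Null
    ./GetAllTags.ps1 -path "/tags/$($_.Name)"
}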

Article available in Polish: https://justcloud.azurewebsites.net/blog/tags-in-azure/

Follow Piotr Rogala on Medium!










    How to manage resources in the external tenant in Azure?


    The case

    You want to manage resources in other tenants without the additional workload around that, such as creating accounts for you and your teammates in the destination Azure tenant, or disabling those accounts when someone leaves the team. Is there any solution for that?

    The answer is: yes! And it is called Azure Lighthouse.

    From the docs:

    Azure Lighthouse enables cross- and multi-tenant management, allowing for higher automation, scalability, and enhanced governance across resources and tenants. ~What is Azure Lighthouse?

    How does it work?

    Azure Lighthouse enables managing resources in another tenant directly from your ‘home’ tenant. With it, you can define which users and/or Active Directory security groups should have access to which resources, and with which roles.

    Requirements

    To set up the Azure Lighthouse solution, you have to fulfill some requirements and gather a bunch of information. You can find the full list here:

    • Your tenant ID
    • Customer’s tenant ID
    • List of subscription IDs and/or resource group IDs to which you need access
    • List of users and/or Active Directory groups which you want to assign to manage customer environments – IMPORTANT: the AD groups have to be Security groups!
    • List of roles and permissions (with their IDs) which you want to assign
    • Someone with a non-guest account in the customer’s tenant and the Microsoft.Authorization/roleAssignments/write permission – this is required to create the assignment (deploy the Azure Resource Manager templates)
    • Something called mspOfferName and mspOfferDescription – the first one defines the connection between both tenants and their resources. IMPORTANT: you can’t have multiple assignments at the same scope with the same mspOfferName.

    How to onboard the customer?

    Once you have gathered all of the above, you can find example ARM templates on GitHub that handle the onboarding. You have to deploy the prepared and filled-in templates to each subscription separately, as sketched below.
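
    As a rough sketch, the per-subscription deployment with Az PowerShell could look like this (the example further below uses the Azure CLI instead; the file names here are placeholders):

    # Run as a non-guest account in the customer tenant that holds
    # the Microsoft.Authorization/roleAssignments/write permission
    Connect-AzAccount -Tenant '<customer-tenant-id>'
    Set-AzContext -Subscription '<customer-subscription-id>'

    New-AzSubscriptionDeployment `
        -Name lighthouseOnboarding `
        -Location westeurope `
        -TemplateFile ./subscription.json `
        -TemplateParameterFile ./subscription.parameters.json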

    Example

    To prepare the example, I will use the template from the official docs:

    with parameters file:

    I translated the above template to Bicep – it’s much easier to read and edit for me:

    The Bicep file can be used with the same parameters file as the classic ARM template.

    For test purposes, I used two different Azure tenants. In the ‘home’ tenant I created an AD Security group called ‘PW MC’. I wanted to give the ‘PW MC’ AD group Contributor rights on the chosen subscription, so I deployed the above template with the Azure CLI:

    Beforehand, I had logged into the tenant with the az login --tenant command.

    Effect

    The effect should be visible in two places, in both tenants. In the customer tenant, when we go to the Service providers page, we are able to see something like the screenshot below:

    Service providers in Customer’s tenant

    and in the ‘home’ tenant, when we visit the My customers page:

    My customers page in home tenant

    If we click on ‘Default directory’:

    Subscription details

    Then you can click on the subscription name and do whatever you want with the resources.

    Summary

    You can see that implementing Azure Lighthouse is quite easy and reduces management overhead. And what is most important – it is free!

    To stay up to date about our Azure Cloud Engineer Piotr Wachulec’s latest posts, make sure to check his private blog and subscribe for his newsletter!









      Passwordless ARM templates using Azure Key Vault

      We at Nordcloud deploy ARM templates on the Microsoft Azure platform regularly. Their parameters sometimes contain confidential data, and we store the templates in private repositories such as Azure DevOps Repos, GitHub or others. To keep security at a high level, we should use solutions designed for storing passwords (secrets).

      Below I will describe how to deploy a sample Azure Key Vault to store passwords, and then deploy a virtual machine that uses a password from the Key Vault during deployment.

      Required for this task

      1. ARM template Azure Key Vault:
        1. Link: https://github.com/Azure/azure-quickstart-templates/tree/master/101-key-vault-create
      2. ARM template virtual machine:
        1. Link: https://github.com/Azure/azure-quickstart-templates/tree/master/101-vm-simple-linux

      Prerequisites

      1. PowerShell Core with the Az module, or the Azure CLI
        1. PowerShell Core: https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell-core-on-windows?view=powershell-7.1
        2. Az Module: https://www.powershellgallery.com/packages/Az/5.5.0
        3. Azure CLI: https://docs.microsoft.com/en-us/cli/azure/install-azure-cli

      If you’ve never deployed a template before, you can check how to do it on this page: https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/deploy-powershell

      Let’s get started!

      First, we create a resource group for Azure Key Vault with the command:

      New-AzResourceGroup -Name my-test-keyvault -Location westeurope

      Then we deploy Azure Key Vault from a ready template using the command:

      New-AzResourceGroupDeployment -ResourceGroupName 'my-test-keyvault' -TemplateUri 'https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-key-vault-create/azuredeploy.json' -keyVaultName myTestKeyVaultNC -objectId 'YOUR-OBJECT-ID' -secretName 'secret1' -secretValue $(ConvertTo-SecureString 'PASSWORD' -AsPlainText -Force) -enabledForTemplateDeployment $true

      Hints

      • objectId – the object ID of the user or SPN that should have access to this password in Azure Key Vault (see the sketch after this list for one way to look it up)
      • enabledForTemplateDeployment – set to true because it allows us to retrieve the password during deployment
      • secretsPermissions – this will allow us to get and list the password
      • secretValue – in the password field, enter the password you want to store in the Key Vault. Here you could also use a password generator, so the stored password is one that nobody knows
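
      If you don’t know your objectId, here is one way to look it up (a sketch assuming you are signed in as a user, not a service principal):

      $upn = (Get-AzContext).Account.Id
      (Get-AzADUser -UserPrincipalName $upn).Id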

      Screen from Azure Key Vault – secrets:

      Screen from Azure Key Vault – Access policy:

      We start deploying a virtual machine by creating a new resource group:

      New-AzResourceGroup -Name my-test-vm -Location westeurope

      Then we need to create our own parameters file on disk to refer to Azure Key Vault. Save the parameters file locally: https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-vm-simple-linux/azuredeploy.parameters.json
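
      For example, you can download it to the current directory with:

      Invoke-WebRequest -Uri 'https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-vm-simple-linux/azuredeploy.parameters.json' -OutFile ./azuredeploy.parameters.json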

      Then change the values as below:

      {
        "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
          "adminUsername": {
            "value": "USER-NAME"
          },
          "adminPasswordOrKey": {
            "reference": {
              "keyVault": {
                "id": "/subscriptions/ID-SUBSCRIPTION/resourceGroups/my-test-keyvault/providers/Microsoft.KeyVault/vaults/myTestKeyVaultNC"
              },
              "secretName": "secret1"
            }
          },
          "dnsLabelPrefix": {
            "value": "UNIQ-DNS-NAME"
          }
        }
      }
      

      If you don’t know what your Key Vault Resource ID is, use the command:

      (Get-AzKeyVault -ResourceGroupName my-test-keyvault -VaultName myTestKeyVaultNC).ResourceId

      To run deployment with reference to Azure Key Vault, execute the command:

      New-AzResourceGroupDeployment -ResourceGroupName 'my-test-vm' -TemplateUri 'https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/101-vm-simple-linux/azuredeploy.json' -TemplateParameterFile ./azuredeploy.parameters.json

      Summary

      We deployed the Azure Key Vault template with a password and access for your user ID. We then used the reference to Azure Key Vault in the parameters file to pull the password from the Key Vault during the virtual machine deployment.

      It is a solution for password management during deployments and for restricting confidential data to selected users. The above solution can be implemented with Azure DevOps and fully automated, so that all confidential parameters stay in the Key Vault and up-to-date values are retrieved from it during each deployment.

      If you liked the post, share it!

      Read more cloud blog texts on our Community & Culture pages.

      We at Nordcloud are constantly hiring Azure experts – check the open positions here, apply and join our learning community!









        Monitoring your home temperature – Part 3: Visualizing your data with Power BI

        Finally, we have our infrastructure up and running, and we can start visualizing our data with Power BI.

        No.. not like this 🙂

        Just log in to your Power BI account:

        https://app.powerbi.com/

        Select My Workspace and click Datasets + dataflows. Click the three dots next to your dataset and choose Create report.

        Let’s create line charts for temperature, humidity and air pressure, each on a separate page. I’ll show how to configure the temperature first.

        Go to Visualizations pane and choose Line chart:

        Stretch the chart to fill your page, then go to Fields and expand your table. Drag ‘EventEnqueuedUtcTime’ to Axis and ‘temperature’ to Values:

        You should already see the graph of your temperature. You can rename Axis and Values to friendlier names like Time and Temperature (Celsius), change the color of your graph, and so on.

        Add a filter next to the graph to show data from the past 7 days:

        The end result should be something like this:

        Temperature

        Mine has been customised for the Finnish language. It also has an average temperature line. Just for demo purposes, I placed my RuuviTag outside, so there are some changes in temperature (in case you wonder why my floor temperature is 7 degrees at night). You can add the humidity and the air pressure with the same method, either to the same page or on separate pages. To make it clearer, I have a separate page for each of them:

        Air pressure
        Humidity

        I also added one page for minimum, maximum and average values:

        So go ahead and customize it however you like, and leave a comment if you have good suggestions for customization. Remember to save your report via File and Save. If you want to share it, you have the option to publish it to anyone with Publish to web.

        After this, you can scale up your environment by adding more RuuviTags. This is the end of the series; thanks for reading, and let me know if you have any questions!


        Get to know the whole project by reading parts 1 and 2 of the series here and here!

        This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog.









          Monitoring your home temperature – Part 2: Setting up Azure

          Now we need to set up the Azure side, and for that you need an Azure subscription. If you don’t have one yet, you can get a new subscription with 170€ of credits for 30 days with this link:

          https://azure.microsoft.com/en-us/free/

          As this is a demo environment, we do not focus heavily on security. You need some basic understanding of how to deploy resources; I’ll guide you on the configuration side. By the way, if you want to learn the basics of Azure, there is a nice learning path at Microsoft Learn:

          https://docs.microsoft.com/en-us/learn/paths/azure-fundamentals/

          Here you can see my demo environment:

          The IoT Hub is for receiving temperature messages from the Raspberry Pi Zero. The Stream Analytics job is for pushing those messages to Power BI. The Automation Account is for starting and shutting down the Stream Analytics job, so it won’t consume too many credits.

          Normally, we would deploy this with ARM templates, but to make it more convenient to present and follow, let’s do it from the Portal. And if you like, you can follow the naming convention from the CAF:

          https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging

          Start by deploying the IoT Hub. Use a globally unique name and public endpoints (you can restrict access to your home IP address if you like) and choose F1: Free Tier. With the Free Tier you can send 8000 messages per day, so if you have five Raspberry Pis, each of them can send one message per minute. Usually enough for home use.

          After you have created the IoT Hub, you need to create an IoT Device under it. I used the same hostname as my Raspberry Pi:

          Then you need to copy the Primary Connection String of your IoT Device:
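
          If you prefer the command line over the Portal, the Az.IotHub PowerShell module can fetch the connection string too (the resource names below are placeholders):

          Get-AzIotHubDeviceConnectionString -ResourceGroupName 'YourRG' -IotHubName 'YourIoTHub' -DeviceId 'YourIotDevice' -KeyType primary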

          After copying that string, you need to create an environment variable for it on your Raspberry Pi. You can use a script to add it automatically after every boot.

          Here is an example of how to create the environment variable:

          export IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"

          The last thing for the IoT Hub is to add a consumer group. You can add it under Built-in endpoints and Events. Just add a custom name under $Default. Here you can see that I added ‘ruuvitagcg’:

          Next, you want to create a Stream Analytics job. For that, you need only one streaming unit, and the hosting environment should be Cloud. There is no free tier for this, and it costs some money to keep it running. Luckily, we can turn it off whenever we don’t use it. I used the Automation Account to start it a few minutes before I receive a message and to turn it off a few minutes after. There is a minor cost for the Automation Account too, but without it the total cost would be much higher. I receive a message only once per hour, so Stream Analytics runs only 96 minutes every day instead of 1440. The total monthly cost is something like 4€; normally it would be almost 70€.

          Here are my Automation Account scripts – two runbooks, one to start the Stream Analytics job and one to stop it:

          # Runbook 1: start the Stream Analytics job
          $connectionName = "RuuvitagConnection"
          $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

          # Sign in with the Automation Account's Run As service principal
          Connect-AzAccount `
              -ServicePrincipal `
              -TenantId $servicePrincipalConnection.TenantId `
              -ApplicationId $servicePrincipalConnection.ApplicationId `
              -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

          Start-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics" -OutputStartMode "JobStartTime"

          # Runbook 2: stop the Stream Analytics job
          $connectionName = "RuuvitagConnection"
          $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

          Connect-AzAccount `
              -ServicePrincipal `
              -TenantId $servicePrincipalConnection.TenantId `
              -ApplicationId $servicePrincipalConnection.ApplicationId `
              -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

          Stop-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics"
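
          To run these runbooks on a timer, you can attach them to hourly schedules. Here is a hedged sketch with the Az.Automation cmdlets (account, runbook and schedule names are placeholders; the minute offsets just mimic the timing described above):

          # One schedule starts the job shortly before the hourly message arrives...
          New-AzAutomationSchedule -ResourceGroupName 'YourRG' -AutomationAccountName 'YourAutomation' `
              -Name 'StartBeforeMessage' -StartTime (Get-Date).AddMinutes(57) -HourInterval 1
          Register-AzAutomationScheduledRunbook -ResourceGroupName 'YourRG' -AutomationAccountName 'YourAutomation' `
              -RunbookName 'Start-StreamAnalytics' -ScheduleName 'StartBeforeMessage'

          # ...and another stops it a few minutes after
          New-AzAutomationSchedule -ResourceGroupName 'YourRG' -AutomationAccountName 'YourAutomation' `
              -Name 'StopAfterMessage' -StartTime (Get-Date).AddMinutes(63) -HourInterval 1
          Register-AzAutomationScheduledRunbook -ResourceGroupName 'YourRG' -AutomationAccountName 'YourAutomation' `
              -RunbookName 'Stop-StreamAnalytics' -ScheduleName 'StopAfterMessage'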

          Next, head to Stream Analytics Job and click Inputs.

          Click Add stream input. The screenshot above shows my configuration; you just need to add your own consumer group configured earlier. For Endpoint, choose Messaging. Use service for the Shared access policy name (it is created by default with a new IoT Hub).

          Now, move on to Outputs and click Add. Choose Power BI and click Authorize; if you don’t have Power BI yet, you can sign up.

          Fill in the Dataset name and Table name. For the Authentication mode, we need to use User token if you have the free Power BI version (the v2 upgrade is not yet possible in the free tier).

          Then create the following query and click Test query; you should see some results:

          Now the only thing left to do is to visualize our data with Power BI. We will cover that part in the next post, but the infrastructure side is ready to rock. Grab a cup of coffee and pat yourself on the back 🙂


          Get to know the whole project by reading parts 1 and 3 of the series here and here!

          This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!









            Monitoring your home temperature – Part 1: Setting up RuuviTags and Raspberry Pi

            We moved to a new house three years ago, and since then we have had issues with floor temperatures. It’s hard to maintain a steady temperature in all rooms, and I wanted to build a solution to keep track of it and see the temperature trend for a whole week. That way, I know exactly what’s happening.

            I will guide you through setting up your own environment using the same method.

            First, you need temperature sensors, and for those I recommend RuuviTags. You can find detailed information on their website, but in short they are very handy Bluetooth beacons with a battery that lasts multiple years. You can measure temperature, movement, humidity and air pressure. There is also a mobile app for them, but it only shows the current status, so it didn’t fit my purpose.

            So I needed to push the data somewhere, and the obvious choice was Azure. I will write more about the Azure side of things in the next part of the blog. In this part, we will set up a single RuuviTag and a Raspberry Pi for sending data. You can add more RuuviTags later, like I did, once everything is working as expected.

            First, I recommend updating the RuuviTag to the latest firmware. This page has instructions for it:

            https://lab.ruuvi.com/dfu/

            Updating firmware to latest with nRF Connect

            I used an iOS app called nRF Connect for it, and it went quite smoothly. You can check that your RuuviTag still works after updating with the Ruuvi Station app:

            iOS or Android

            You will also need a Raspberry Pi for sending data to the cloud. I recommend the Raspberry Pi Zero W, because we only need WiFi and Bluetooth for this. I have mine plugged in in my kitchen at the moment; it just needs to be in range of the RuuviTag’s Bluetooth signal (and of course WiFi).

            Raspberry Pi Zero W with clear plastic case

            Mine has a clear acrylic case for protection, a power supply and a memory card. Data is not saved to the Raspberry Pi’s memory card, so there is no need for a bigger memory card than usual. I installed Raspbian Buster because, at the time, there were issues with the Azure IoT Hub Python modules on the latest Raspbian image.

            Here is a page with instructions on how to install a Raspbian image to an SD card:

            https://www.raspberrypi.org/documentation/installation/installing-images/

            After you have installed the image, boot up your Raspberry Pi and do the basic stuff: update it, change passwords, etc.

            You can find how to configure your Raspberry here:

            https://www.raspberrypi.org/documentation/configuration/

            After you have done all that, you need to set up the Python script and modules. Two modules are needed: RuuviTag (ruuvitag_sensor) and the Azure IoT Hub client (azure-iot-device). You also need the MAC address of your RuuviTag; you can find it in the Ruuvi Station app under your RuuviTag’s settings.

            Then you need to create a Python script to get the temperature data. Here is my script:

            import asyncio
            import os
            from azure.iot.device.aio import IoTHubDeviceClient
            from ruuvitag_sensor.ruuvitag import RuuviTag

            # Replace 'xx:xx:xx:xx:xx:xx' with your RuuviTag's MAC
            sensor = RuuviTag('xx:xx:xx:xx:xx:xx')
            sensor.update()            # poll the tag for fresh sensor data
            state = str(sensor.state)  # full state (temperature, humidity, pressure...) as a string


            async def main():
                # Fetch the connection string from an environment variable
                conn_str = os.getenv("IOTHUB_DEVICE_CONNECTION_STRING")

                # Create an instance of the device client using the connection string
                device_client = IoTHubDeviceClient.create_from_connection_string(conn_str)

                # Send a single message
                try:
                    print("Sending message...")
                    await device_client.send_message(state)
                    print("Message successfully sent!")
                except Exception:
                    print("Message sending failed!")

                # Finally, disconnect
                await device_client.disconnect()


            if __name__ == "__main__":
                asyncio.run(main())

            The code gets the current temperature from the RuuviTag and sends it to Azure IoT Hub. You can see that the code needs the IOTHUB_DEVICE_CONNECTION_STRING environment variable; you don’t have that yet, so we will set it up later.

            You can put this script in crontab and run it, for example, every 15 minutes or whatever suits your needs.

            Next time, we will set up the Azure side…

            Links:

            https://lab.ruuvi.com/dfu/ (RuuviTag firmware)

            https://github.com/ttu/ruuvitag-sensor (RuuviTag module)

            https://github.com/Azure/azure-iot-sdk-python (Azure IoT module)

            https://www.raspberrypi.org/ (Raspberry Pi)


            Get to know the whole project by reading parts 2 and 3 of the series here and here.

            This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!









              Multi-Cloud: Why Stop At One Platform?

              We deal with a number of major providers, each with its own outstanding features and strengths. A business just has to identify its needs and pick the service that best meets those.

              Everyone knows the importance of picking the right tool for the job. Keen woodworkers have a selection of hammers, chisels and saws, golfers carry bags full of clubs, and even the most ardent sports car enthusiasts can see the limitations of a Ferrari when it’s time to take five kids to the beach.

              The cloud is the same. We deal with a number of major providers, each with its own outstanding features and strengths. A business just has to identify its needs and pick the service that best meets those.

              Except Why Stop At One Cloud Provider?

              Let’s say, for the sake of an example, that a substantial portion of your compute needs is predictable and stable, and that latency isn’t an overriding issue. You might consider a provider that offers a relatively inflexible service and doesn’t necessarily have data centres located close to your users, but that is highly cost-effective. The lack of flexibility isn’t an issue because of the predictability of your requirements, while the low cost makes it highly attractive.

              However, let us also suppose that you offer applications where latency is an issue and where it’s important to be able to scale usage up and down to meet spikes in demand. A second cloud provider, one that has data centres close to your main users and offers a flexible deal on capacity, is an attractive option even though its charges are higher than the first’s.

              So, does it have to be either/or? Of course not. We live in a world where it’s possible to choose both.

              But Which Cloud Provider Excels In Which Areas?

              However, as the psychologist Barry Schwartz has argued, choice can complicate matters. You have to understand which cloud provider excels in which areas and the likely impact of their terms and conditions. You also need a breadth of expertise to take advantage of multiple platforms, both to develop applications within the different environments and to create the architecture needed so that data can flow easily between platforms where required.

              This is very much one of Nordcloud’s roles: to act as an expert facilitator between customer and cloud providers. It’s our job to know how to match a particular offering to a particular requirement, and to understand the implications of each provider’s terms of business for our customers. One of our great strengths is that we have the resources to supplement our customers’ in-house technical expertise with our own. So, if your team’s proficiencies allow you to manage one provider’s platform but not another, we can help you clear that hurdle. Our expertise in building businesses’ security and governance models and core infrastructure, as well as delivering data centre migrations and optimised cloud environments consistently across the major cloud platforms, has made us one of the most trusted providers.

              Benefits Of Microsoft Azure

              Though we were already working with a number of excellent cloud providers, we have partnered with Microsoft to offer Azure cloud services to our customers. Azure offers particular advantages that make it an attractive option for businesses looking to locate some or all of their computing needs in the cloud.

              For starters, there’s the familiarity of the Microsoft environment, though it should be pointed out that Azure is equally adept at hosting Linux-based applications. Windows is ubiquitous, and Microsoft’s range of tools and apps is beyond comprehensive.

              Azure has put special emphasis on simplicity and speed. If you need to spin up a project quickly, you should consider Azure. The human resources are easy to come by – most businesses have no shortage of people skilled in Microsoft-related development – and the tools are easy to use.

              Azure has also addressed concerns about server stability with a comprehensive outage protection plan that mirrors users’ data to secure virtual environments. If the possibility of outages and lost data is a worry, then Azure is a good answer. Microsoft has an impressive data centre network with global coverage and is moving into Southern Europe, Africa and South America ahead of the competition. We’re confident that, as providers expand their infrastructure, Azure users won’t find themselves left behind. Microsoft also offers great means of analysing and mining your data for business intelligence through its managed SQL and NoSQL data services.

              Of course, the other cloud services that Nordcloud offers come with their own strengths, but a growing number of businesses, perhaps a majority, are now looking to mix and match with cloud providers to get the best of each to suit their specific needs. It’s a trend we only expect to keep growing.









                Leading Azure team in the UK by Harry Azariah


                Harry Azariah joined the UK Azure team almost 1.5 years ago as a Senior Cloud Architect and was quickly promoted to lead the local Azure team. Here’s his Nordcloudian story. Enjoy!

                 

                1. Where are you from and how did you end up at Nordcloud?

                 

                I’ve lived in South London my whole life. I had been following Nordcloud for a couple of years, and then an ex-colleague of mine who had joined Nordcloud talked about how great it was and invited me for a chat with the team. After the meeting, I was really interested in joining and contributing to the success and the vision of the company.

                 

                2. What is your role and core competence?

                 

                I am the Azure team lead in the UK. I originally come from an infrastructure background but have been working with Azure for the past four years, doing everything from solution design down to implementation. My role in the team is to ensure quality in all UK-led Azure engagements and to build a strong team culture through social events and group activities.

                 

                3. What do you like most about working at Nordcloud?

                 

                I like our dynamic, young, modern thinking – we build processes in a new way compared to big consultancies, system integrators and the like. I also like our general flexibility and that we are not managed top-down; everybody can contribute and influence.

                 

                4. What’s your favourite thing with public cloud?

                 

                Constant change, always things to learn and new stuff to play with.

                 

                5. What do you do outside work?

                 

                Socialising, watching sports, occasionally playing sports. I’m also a massive foodie, so naturally cooking and eating are two of my favourite pastimes.

                 

                6. Best Nordcloudian memory?

                 

                I have a lot of good (and fun) memories from our social team evenings (Christmas parties, leaving or welcoming parties, etc.).

                 

                Harry is one of our UK technical interviewers, and if you like his story and can imagine working closely with his Azure team, have a look at our openings here!









                  Meet us at Microsoft Ignite 2019


                  MICROSOFT IGNITE

                  Ignite is Microsoft’s annual gathering of technology leaders and practitioners: four days of visionary discussions and hands-on learning about the latest tools and insights that are driving tomorrow’s innovations.

                  • Learn innovative ways to build solutions and migrate and manage your infrastructure.
                  • Connect with over 25,000 individuals focused on software development, security, architecture, and IT.
                  • Explore new hands-on experiences that will help you innovate in areas such as security, cloud, and hybrid infrastructure and development.

                  With 1000+ sessions and 200+ hands-on experiences, Microsoft Ignite is quite the ultimate tech conference. See the complete agenda on the official Microsoft Ignite website.

                  Heading to Orlando?

                  We would love to meet you at Ignite to share insights over coffee or drinks.

                  Let us know you’ll be there!

                   









                    A Series of Fortunate Events


                    Meet Mika Haapsaari who is working as a Senior Azure Cloud Architect at Nordcloud! Mika is one of our senior pros and his responsibilities include leading one of our Azure teams in Helsinki. Here’s his story.

                    1. Where are you from and how did you end up at Nordcloud?

                    I’m originally from a small town in Western Finland. After I decided to study IT, I wound up in Tampere, which I still like to call my “spiritual home” – now I’m part of the Nordcloud HQ right here in Helsinki! 

                    Me ending up at Nordcloud is actually a pretty strange series of fortunate events, which could be a whole separate story on its own. I was never “supposed to” end up working with the cloud to begin with – I studied networking and software development in college – but, long story short, after meeting the right people almost a decade ago, getting into Azure from the very beginning, and then Reeta from our recruitment calling me at a really opportune time, here I am!

                    2. What is your core competence? On top of that, please also tell about your role and projects shortly.

                    I like the big-picture stuff. Designing the architecture from the ground up at a detailed level, alongside the customer, is what I like to think is my strong suit. My role currently is to work as a technical lead for one of our major accounts. Not to forget the getting-your-hands-dirty technical stuff as well.

                    3. What’s your favourite thing technically with public cloud?

                    This is a tough question, actually; there are so many things to choose from. I love the different options you have for automation: you can pick and choose the tools you’d like to use for building the automation workflow, and it will work just as well as the one next to it. Public cloud in general is a really interesting field because there’s a lot of new stuff coming out all the time. There are new possibilities every week to do things differently that you didn’t know existed the week before!

                    4. What do you like most about working at Nordcloud?

                    There could be a lot of things listed here as well, but the #1 thing I would bring up is the level of support you get while working here. In my career I’ve mostly been working as a one-man wolf pack (yes, that’s a timely pop culture reference), and working with such skilled colleagues is a little new to me. Here, whenever you need help with anything, there is always a person ready to help or point you in the right direction.

                     5. What do you do outside work?

                    I read that Paul is building space suits in his spare time, and now I wish I had something as cool to share with you.. 

                    I love to travel as much as I can. While I’m waiting for my next trip, I like to participate in some social activism. These days I am on the board of my labor union, plus I try to be active in a certain political party. Elections are the best! If I want to just de-stress, I play video games 😎

                    6. How would you describe Nordcloud’s culture in 3 words?

                    Open.

                    Trusting.

                    Fun.
