Accelerate resource tagging with PowerShell — Microsoft Azure Tag Report

Tech Community • 3 min read

Some time ago, I prepared these scripts to quickly tag many resources deployed on the Microsoft Azure platform. There are many ways to do this quickly and easily, but I wanted a universal solution that is easy and safe to use. I therefore split it into two parts: one script exports all resources to a CSV file, and a second script reads the resource data back from that CSV file and writes the tags to the platform.

The scripts are available on GitHub:

Description of the scripts:

  • GetAllTags.ps1 — generates two CSV files: one for resource groups and one for resources. This separation is important because resource groups can be tagged differently from resources, without having to filter by resource type.
  • SetTagsResourceGroups.ps1 — a script that reads the resource group inventory, together with the tags to be applied, from a CSV file.
  • SetTagsResources.ps1 — a script that reads the resources, together with the tags to be applied, from a CSV file.
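To give an idea of how such an export could work, here is a minimal sketch. This is not the actual GetAllTags.ps1 from GitHub; the file-naming pattern, column names, and helper function are my own illustrative assumptions. It assumes the Az PowerShell module is installed and you are already logged in.

```powershell
# Minimal sketch: export resource groups and resources with their tags to CSV.
# Assumes the Az module is installed and Connect-AzAccount has been run.
param(
    [string]$Path = "."
)

$date = Get-Date -Format "yyyy-MM-dd"

# Flatten a tag hashtable into comma-separated "Key: Value" pairs
function Format-Tags($tags) {
    if (-not $tags) { return "" }
    ($tags.GetEnumerator() | ForEach-Object { "$($_.Key): $($_.Value)" }) -join ","
}

# One file for resource groups...
Get-AzResourceGroup | ForEach-Object {
    [pscustomobject]@{
        Name = $_.ResourceGroupName
        Tags = Format-Tags $_.Tags
    }
} | Export-Csv -Path (Join-Path $Path "ResourceGroups-$date.csv") -NoTypeInformation

# ...and one file for resources.
Get-AzResource | ForEach-Object {
    [pscustomobject]@{
        Name       = $_.Name
        ResourceId = $_.ResourceId
        Tags       = Format-Tags $_.Tags
    }
} | Export-Csv -Path (Join-Path $Path "Resources-$date.csv") -NoTypeInformation
```

Keeping the resource ID in the export is what later lets the apply script address each resource unambiguously.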

This division gives us:

  1. A snapshot of the current tag state saved to a CSV file; this doubles as a tag report, i.e. the current status of the tags already implemented.
  2. A clear division between resource groups and resources.
  3. The ability to fix tags only on resources, or only on resource groups.

How do the scripts work?

GetAllTags
  1. The files are saved automatically to the directory where the script is executed, or you can specify a target directory with the -path parameter.
  2. The file name includes the date; if a file with that name already exists, it will be overwritten.
  3. The script checks whether you are logged in to Azure. When executing the script, you can specify the -tenantId parameter to make sure you log into the appropriate Azure directory, and -subId to select the correct subscription.
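The login check described in step 3 can be sketched roughly as follows. The parameter names mirror the ones mentioned above, but the body is my own illustrative assumption, not the script's actual code:

```powershell
# Sketch of the login check: connect only when no Azure context exists,
# optionally targeting a specific tenant and subscription.
param(
    [string]$TenantId,
    [string]$SubId
)

$context = Get-AzContext
if (-not $context) {
    if ($TenantId) {
        # Make sure we log into the intended Azure directory
        Connect-AzAccount -TenantId $TenantId | Out-Null
    } else {
        Connect-AzAccount | Out-Null
    }
}

if ($SubId) {
    # Switch to the requested subscription before exporting anything
    Set-AzContext -SubscriptionId $SubId | Out-Null
}
```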

SetTagsResourceGroups and SetTagsResources

  1. Data is imported automatically based on the naming convention of the previously generated files: the script imports the latest CSV file in the directory specified via the -path parameter.
  2. Tags should be separated by commas.
  3. The tags saved in the CSV file work as “Key: Value” pairs in separate columns.
  4. The script reads all the items from the CSV file, removes the existing tags in Microsoft Azure, and then applies the tags from the CSV file.
  5. If you do not want to modify a given resource, simply remove it from the CSV file.
  6. The script runs in parallel mode, which makes the changes faster. The throttle limit for writing tags is 5, which I found to be the optimal value.
  7. The script checks if you are logged in to Azure. You can specify the -tenantId parameter when executing a script.
  8. If you want to clear tags on resources, just leave an empty cell or enter “empty”.
  9. You can test the tagging by keeping only the resource you want to change in the CSV file; the remaining resources will not be considered when the tags are applied.
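The apply steps above can be sketched like this. Again, this is a hypothetical reconstruction rather than the published SetTagsResources.ps1: the CSV file-name filter and column names are assumptions, and it requires PowerShell 7+ (for ForEach-Object -Parallel) with the Az.Resources module:

```powershell
# Sketch of the apply step: pick the newest resources CSV in -Path,
# parse the "Key: Value" tag pairs, and replace the tags in parallel.
param(
    [string]$Path = "."
)

# Import the latest CSV file matching the assumed naming convention
$csv = Get-ChildItem -Path $Path -Filter "Resources-*.csv" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

Import-Csv $csv.FullName | ForEach-Object -ThrottleLimit 5 -Parallel {
    # Build a hashtable from the comma-separated "Key: Value" pairs;
    # an empty cell or the word "empty" clears the tags
    $tags = @{}
    foreach ($pair in ($_.Tags -split ",")) {
        if ($pair -and $pair -ne "empty") {
            $key, $value = ($pair -split ":", 2).Trim()
            $tags[$key] = $value
        }
    }
    # "Replace" removes the existing tags and applies the new set in one call
    Update-AzTag -ResourceId $_.ResourceId -Tag $tags -Operation Replace
}
```

Using Update-AzTag with -Operation Replace matches the remove-then-apply behaviour described in step 4, and the -ThrottleLimit of 5 matches the throttle mentioned in step 6.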

Below is an example of how it works on my subscription.

Running the script:

  • GetAllTags.ps1

Command: ./GetAllTags.ps1 -path /tags

On the screenshot, you can see the output with the information generated by the script. It shows:

  • Whether you are logged in; if not, it prompts you to log in.
  • The number of resource groups and resources found.
  • Information where and what files are created.
  • Resource List.

The CSV files look like this:

  • Resource Group
  • Resources

Below, I will show how to edit the tags in the resources file.

Attention! If you want to keep a copy of the current state, protect the generated file or rename it before making changes.

The corrected file looks like this. Note that I have filled in the tags for all objects and added a new tag, Test: Key, to show what assigning multiple tags looks like.
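As an illustration, an edited resources CSV might look like the fragment below. The exact column headers come from my sketch above, not from the published script, and the resource IDs are placeholders; check your own generated file for the real layout:

```csv
"Name","ResourceId","Tags"
"vm-demo","/subscriptions/<sub-id>/resourceGroups/rg-demo/providers/Microsoft.Compute/virtualMachines/vm-demo","Environment: Test,Test: Key"
"stdemodata","/subscriptions/<sub-id>/resourceGroups/rg-demo/providers/Microsoft.Storage/storageAccounts/stdemodata","Environment: Prod"
```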

The script output with the changes made:

Command: ./SetTagsResources.ps1 -path /tags

The effect of changes in the Azure Portal:

You can customize the script to suit your purposes, for example so that it runs across multiple subscriptions. There are many customization options that I have intentionally left out. The scripts work modularly, so they can easily be combined with other scripts, with the tagging driven entirely by the input data from the CSV file.


Article available in Polish:

Follow Piotr Rogala on Medium!

Piotr Rogala, Azure Cloud Architect Lead