Monitoring your home temperature – Part 3: Visualizing your data with Power BI

Finally, we have our infrastructure up and running and we can start visualizing the data with Power BI.


Just log in to your Power BI account:

https://app.powerbi.com/

Select My Workspace and click Datasets + dataflows. Click the three dots next to your dataset and choose Create report.

Let’s create line charts for temperature, humidity and air pressure, each of them on a separate page. I’ll show how to configure the temperature chart first.

Go to the Visualizations pane and choose Line chart:

Stretch the chart to fill the page, then go to the Fields pane and expand your table. Drag ‘EventEnqueuedUtcTime’ to Axis and ‘temperature’ to Values:

You should already see a graph of your temperature. You can rename Axis and Values to friendlier names like Time and Temperature (Celsius), change the color of the graph, and so on.

Add a filter next to the graph to show data from the past 7 days:

The end result should be something like this:

Temperature

Mine has been customised for the Finnish language, and it also has an average temperature line. Just for demo purposes, I placed my RuuviTag outside, so there are some changes in temperature (in case you wonder why my floor temperature is 7 degrees at night). You can add humidity and air pressure to the same page with the same method, or you can create separate pages for them. To keep things clear, I have a separate page for each:

Air pressure
Humidity

I also added one page for minimum, maximum and average values:
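One way to build such a page (not necessarily how it was done here) is to drop the temperature field onto Card visuals and switch their aggregation to Minimum, Maximum and Average. If you prefer explicit measures instead, a DAX sketch could look like this, where ‘RuuviTelemetry’ is only a placeholder for your own table name:

Min Temperature = MIN ( 'RuuviTelemetry'[temperature] )
Max Temperature = MAX ( 'RuuviTelemetry'[temperature] )
Avg Temperature = AVERAGE ( 'RuuviTelemetry'[temperature] )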

So go ahead and customize it however you like, and leave a comment if you have good suggestions. Remember to save your report via File and Save. If you want to share it, you can publish it to anyone with Publish to web.

After this, you can scale up your environment by adding more RuuviTags. This is the end of the series. Thanks for reading, and let me know if you have any questions!


Get to know the whole project by reading parts 1 and 2 of the series here and here!

This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog.

    Monitoring your home temperature – Part 2: Setting up Azure

    Now we need to set up the Azure side, and for that you need an Azure subscription. If you don’t have one yet, you can get a new subscription with 170€ of credits for 30 days with this link:

    https://azure.microsoft.com/en-us/free/

    As this is a demo environment, we do not focus heavily on security. You need a basic understanding of how to deploy resources; I’ll guide you on the configuration side. By the way, if you want to learn the basics of Azure, there is a nice learning path at Microsoft Learn:

    https://docs.microsoft.com/en-us/learn/paths/azure-fundamentals/

    Here you can see my demo environment:

    The IoT Hub receives temperature messages from the Raspberry Pi Zero. The Stream Analytics job pushes those messages to Power BI. The Automation Account starts and shuts down the Stream Analytics job, so it won’t consume too many credits.

    Normally we would deploy this with ARM templates, but to make it easier to present and follow, let’s do it from the Portal. If you like, you can follow the naming convention from the Cloud Adoption Framework (CAF):

    https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging

    Start by deploying the IoT Hub. Use a globally unique name and public endpoints (you can restrict access to your home IP address if you like) and choose F1: Free Tier. With the Free Tier you can send 8000 messages per day, so if you have five Raspberry Pis, each of them can send one message per minute (5 × 1440 = 7200 messages per day). That is usually enough for home use.

    After you have created the IoT Hub, you need to create an IoT device under it. I used the same hostname as my Raspberry Pi:

    Then you need to copy your Primary Connection String from your IoT Device:

    After copying that string, you need to create an environment variable for your Raspberry Pi. You can use a script to automatically add it after every boot.

    Here is an example of how you create the environment variable:

    export IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub.azure-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"
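    The export above only lasts for the current shell session. One way to implement the “add it after every boot” idea is to append the same line to ~/.profile; this is just a sketch, and the file name and connection string values are placeholders for your own:

    echo 'export IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub.azure-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"' >> ~/.profile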

    The last thing to do in the IoT Hub is to add a consumer group. You can add it under Built-in endpoints and Events. Just add a custom name under $Default. Here you can see that I added ‘ruuvitagcg’:
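    For reference, the same IoT Hub steps can also be done from the Azure CLI instead of the Portal. This is only a sketch with placeholder names; it needs the azure-iot extension, and exact command names can vary between CLI versions:

    # Create the resource group, a free-tier IoT Hub, a device and a consumer group
    az extension add --name azure-iot
    az group create --name YourRG --location westeurope
    az iot hub create --resource-group YourRG --name YourIoTHub --sku F1 --partition-count 2
    az iot hub device-identity create --hub-name YourIoTHub --device-id raspberrypi
    az iot hub consumer-group create --hub-name YourIoTHub --name ruuvitagcg

    # Print the device connection string used by the Raspberry Pi
    az iot hub device-identity connection-string show --hub-name YourIoTHub --device-id raspberrypi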

    Next, you want to create a Stream Analytics job. For that, you only need one streaming unit, and the environment should be Cloud. There is no free tier for this, and it costs some money to keep it running. Luckily, we can turn it off whenever we don’t use it. I used the Automation Account to start it a few minutes before I receive a message and to turn it off a few minutes after. There is a minor cost for the Automation Account too, but without it the total cost would be much higher. I receive a message only once per hour, so Stream Analytics runs only 96 minutes every day (24 messages × 4 minutes) instead of 1440 minutes. The total monthly cost is something like 4€; normally it would be almost 70€.

    Here are my Automation Account scripts, one runbook to start the Stream Analytics job and one to stop it:

    # Runbook 1: start the Stream Analytics job before a message is expected
    $connectionName = "RuuvitagConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    # Log in with the Automation Account's Run As service principal
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

    Start-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics" -OutputStartMode "JobStartTime"

    # Runbook 2: stop the Stream Analytics job after the message has been processed
    $connectionName = "RuuvitagConnection"
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

    # Log in with the Automation Account's Run As service principal
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

    Stop-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics"

    Next, head to your Stream Analytics job and click Inputs.

    Click Add stream input. Above is my configuration; you just need to use the consumer group you configured earlier. For Endpoint, choose Messaging, and use service as the Shared access policy name (it is created by default with a new IoT Hub).

    Now, move on to Outputs and click Add. Choose Power BI and click Authorize; if you don’t have Power BI yet, you can sign up.

    Fill in the Dataset name and Table name. For the Authentication mode, use User token if you have the free Power BI version (the v2 upgrade is not yet possible in the free tier).

    Then create your query and click Test query; you should see some results:
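    The exact query isn’t reproduced here, but a minimal pass-through query like the one below should work. The bracketed input and output names are placeholders for the aliases you gave your own input and output, and the field names assume the default RuuviTag payload:

    SELECT
        temperature,
        humidity,
        pressure,
        EventEnqueuedUtcTime
    INTO
        [powerbioutput]
    FROM
        [ruuvitaginput]

    EventEnqueuedUtcTime is a timestamp that IoT Hub adds to every message; it is the field used as the time axis in the next part.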

    Now the only thing left to do is to visualize our data with Power BI. We will cover that part in the next post, but the infrastructure side is ready to rock. Grab a cup of coffee and pat yourself on the back 🙂


    Get to know the whole project by reading parts 1 and 3 of the series here and here!

    This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!

      Monitoring your home temperature – Part 1: Setting up RuuviTags and Raspberry Pi

      We moved to a new house three years ago and since then we have had issues with floor temperatures. It’s hard to maintain a steady temperature in all rooms, so I wanted to build a solution to keep track of it and see the temperature trend for a whole week. That way, I know exactly what’s happening.

      I will guide you through setting up your own environment using the same method.

      First, you need temperature sensors, and for those I recommend RuuviTags. You can find detailed information on their website, but in short they are very handy Bluetooth beacons with a battery that lasts multiple years. They measure temperature, movement, humidity and air pressure. There is also a mobile app for them, but it only shows the current status, so it didn’t fit my purpose.

      So, I needed to push the data somewhere, and the obvious choice was Azure. I will write more about the Azure side of things in the next part of the blog. In this part, we will set up a single RuuviTag and a Raspberry Pi for sending data. You can add more RuuviTags later, like I did, once everything is working as expected.

      First, I recommend updating the RuuviTag to the latest firmware. This page has instructions for it:

      https://lab.ruuvi.com/dfu/

      Updating firmware to latest with nRF Connect

      I used an iOS app called nRF Connect for it, and it went quite smoothly. You can check that your RuuviTag still works after updating with the Ruuvi Station app:

      iOS or Android

      You will also need a Raspberry Pi for sending data to the cloud. I recommend using a Raspberry Pi Zero W, because we only need WiFi and Bluetooth for this. I have mine plugged in in my kitchen at the moment, but it just needs to be in range of the RuuviTag’s Bluetooth signal (and of course WiFi).

      Raspberry Pi Zero W with clear plastic case

      Mine has a clear acrylic case for protection, a power supply and a memory card. Data is not saved to the Raspberry Pi’s memory card, so there is no need for a bigger memory card than usual. I installed Raspbian Buster, because at the time there were issues with the Azure IoT Hub Python modules on the latest Raspbian image.

      Here is a page with instructions on how to install a Raspbian image to an SD card:

      https://www.raspberrypi.org/documentation/installation/installing-images/

      After you have installed the image, boot up your Raspberry Pi and do the basic stuff: update it, change passwords, etc.

      You can find instructions on how to configure your Raspberry Pi here:

      https://www.raspberrypi.org/documentation/configuration/

      After you have done all that, you need to set up the Python script and modules. Two modules are needed: the RuuviTag sensor module (ruuvitag_sensor) and the Azure IoT Hub device client (azure-iot-device). You also need the MAC address of your RuuviTag; you can find it with the Ruuvi Station app under your RuuviTag’s settings. You can install both modules with pip, as shown below.
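      A minimal install sketch (assuming Python 3 and pip3 are already present; the ruuvitag_sensor module also relies on the BlueZ tools for Bluetooth scanning):

      # Bluetooth tools needed by ruuvitag_sensor for BLE scanning
      sudo apt-get install -y bluez bluez-hcidump

      # Python modules used by the script below
      pip3 install ruuvitag_sensor azure-iot-device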

      Then you need to create a Python script to get the temperature data. Here is my script:

      import asyncio
      import os
      from azure.iot.device.aio import IoTHubDeviceClient
      from ruuvitag_sensor.ruuvitag import RuuviTag

      # Replace 'xx:xx:xx:xx:xx:xx' with your RuuviTag's MAC address
      sensor = RuuviTag('xx:xx:xx:xx:xx:xx')

      # Read the latest values from the tag and turn them into a string payload
      sensor.update()
      state = str(sensor.state)


      async def main():
          # Fetch the connection string from an environment variable
          conn_str = os.getenv("IOTHUB_DEVICE_CONNECTION_STRING")

          # Create an instance of the device client using the connection string
          device_client = IoTHubDeviceClient.create_from_connection_string(conn_str)

          # Send a single message
          try:
              print("Sending message...")
              await device_client.send_message(state)
              print("Message successfully sent!")
          except Exception:
              print("Message sending failed!")

          # Finally, disconnect
          await device_client.disconnect()


      if __name__ == "__main__":
          asyncio.run(main())

      The code gets the current temperature from the RuuviTag and sends it to Azure IoT Hub. You can see that this code needs the IOTHUB_DEVICE_CONNECTION_STRING environment variable; you don’t have that yet, so we will set it up later.

      You can put this script into crontab and run it, say, every 15 minutes, or whatever suits your needs; see the example entry below.
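      Here is a sketch of such a crontab entry (edit with crontab -e), assuming the script was saved as /home/pi/ruuvitag_iothub.py; note that cron does not read your shell profile, so the connection string has to be defined in the crontab itself or in a wrapper script:

      # Send a RuuviTag reading to IoT Hub every 15 minutes
      IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub.azure-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"
      */15 * * * * python3 /home/pi/ruuvitag_iothub.py >> /home/pi/ruuvitag.log 2>&1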

      Next time, we will set up the Azure side…

      Links:

      https://lab.ruuvi.com/dfu/ (RuuviTag firmware)

      https://github.com/ttu/ruuvitag-sensor (RuuviTag module)

      https://github.com/Azure/azure-iot-sdk-python (Azure IoT module)

      https://www.raspberrypi.org/ (Raspberry Pi)


      Get to know the whole project by reading parts 2 and 3 of the series here and here.

      This blog text is originally published in Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!
