Monitoring your home temperature – Part 3: Visualizing your data with Power BI

Finally, we have our infrastructure up and running, and we can start visualizing our data with Power BI.


Just log in to your Power BI account:

https://app.powerbi.com/

Select My Workspace and click Datasets + dataflows. Click the three dots next to your dataset and choose Create report.

Let’s create line charts for temperature, humidity and air pressure, each on its own page. I’ll show how to configure the temperature chart first.

Go to the Visualizations pane and choose Line chart:

Stretch the chart to fill the page, then go to the Fields pane and expand your table. Drag ‘EventEnqueuedUtcTime’ to Axis and ‘temperature’ to Values:

You should already see a graph of your temperature. You can rename Axis and Values to friendlier names like Time and Temperature (Celsius), change the color of the graph and so on.

Add a filter next to the graph to show data from the past 7 days:

The end result should be something like this:

Temperature

Mine has been customised for the Finnish language, and it also has an average temperature line. Just for demo purposes, I placed my RuuviTag outside, so there is some variation in the temperature (in case you wonder why my floor temperature is 7 degrees at night). You can add the humidity and air pressure charts to the same page with the same method, or you can create separate pages for them. To keep things clear, I have a separate page for each:

Air pressure
Humidity

I also added one page for minimum, maximum and average values:

So go ahead and customize however you like, and leave a comment if you have good suggestions for customization. Remember to save your report via File and Save. If you want to share it, you have the option to publish it to anyone with Publish to web.

After this, you can scale up your environment by adding more RuuviTags. This is the end of the series. Thanks for reading, and let me know if you have any questions!


Get to know the whole project by reading parts 1 and 2 of the series here and here!

This blog text was originally published on Senior Cloud Architect Sami Lahti’s personal Tech Life blog.


    Monitoring your home temperature – Part 2: Setting up Azure

Now we need to set up the Azure side, and for that you need an Azure subscription. If you don’t have one yet, you can get a new subscription with 170€ of credits for 30 days with this link:

    https://azure.microsoft.com/en-us/free/

As this is a demo environment, we do not focus heavily on security. You need some basic understanding of how to deploy resources; I’ll guide you through the configuration side. By the way, if you want to learn the basics of Azure, there is a nice learning path at Microsoft Learn:

    https://docs.microsoft.com/en-us/learn/paths/azure-fundamentals/

    Here you can see my demo environment:

The IoT Hub is for receiving temperature messages from the Raspberry Zero. The Stream Analytics Job is for pushing those messages to Power BI. The Automation Account is for starting and shutting down the Stream Analytics Job, so it won’t consume too many credits.

Normally we would deploy this with ARM templates, but to make it more convenient to present and follow, let’s do it from the Portal. If you like, you can follow the naming conventions from the Cloud Adoption Framework (CAF):

    https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging

Start by deploying the IoT Hub. Use a globally unique name and public endpoints (you can restrict access to your home IP address if you like) and choose F1: Free Tier. With the Free Tier you can send 8,000 messages per day, so if you have five Raspberry Pis, each of them can send one message per minute. That’s usually enough for home use.

After you have created the IoT Hub, you need to create an IoT Device under it. I used the same hostname as my Raspberry Pi:

Then you need to copy the Primary Connection String from your IoT Device:

After copying that string, you need to create an environment variable on your Raspberry Pi. You can use a script to add it automatically after every boot; a sketch follows the example below.

Here is an example of how to create the environment variable:

export IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub.azure-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"
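One way to make the variable survive reboots is to keep it in a small environment file and load it from there (a minimal sketch, assuming bash and the default pi user; all values are placeholders):

# /home/pi/ruuvi.env -- holds the IoT Hub connection string
export IOTHUB_DEVICE_CONNECTION_STRING="HostName=YourIoTHub.azure-devices.net;DeviceId=YourIotDevice;SharedAccessKey=YourKey"

# Load it in login shells too (run once):
echo 'source /home/pi/ruuvi.env' >> /home/pi/.profile

The same file can be sourced from the crontab entry that runs the script from Part 1.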

The last thing to do on the IoT Hub side is to add a consumer group. You can add it under Built-in endpoints. Just add a custom name under $Default. Here you can see that I added ‘ruuvitagcg’:

Next, you want to create a Stream Analytics Job. For that you need only one streaming unit, and the environment should be Cloud. There is no free tier for this, and it costs some money to keep it running. Luckily, we can turn it off whenever we don’t use it. I used the Automation Account to start it a few minutes before I receive a message and to turn it off a few minutes after. There is a minor cost for the Automation Account as well, but without it, the total cost would be much higher. I receive a message only once per hour, so Stream Analytics runs only 96 minutes every day instead of 1,440. The total monthly cost is something like 4€; normally it would be almost 70€.

Here are my Automation Account runbooks, one for starting the job and one for stopping it:

# Runbook 1: start the Stream Analytics Job
$connectionName = "RuuvitagConnection"
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

# Log in to Azure with the Automation Account's Run As service principal
Connect-AzAccount `
    -ServicePrincipal `
    -TenantId $servicePrincipalConnection.TenantId `
    -ApplicationId $servicePrincipalConnection.ApplicationId `
    -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

Start-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics" -OutputStartMode "JobStartTime"

# Runbook 2: stop the Stream Analytics Job
$connectionName = "RuuvitagConnection"
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName

# Log in the same way as above
Connect-AzAccount `
    -ServicePrincipal `
    -TenantId $servicePrincipalConnection.TenantId `
    -ApplicationId $servicePrincipalConnection.ApplicationId `
    -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

Stop-AzStreamAnalyticsJob -ResourceGroupName "YourRG" -Name "YourStreamAnalytics"

Next, head to the Stream Analytics Job and click Inputs.

Click Add stream input and choose IoT Hub. In the configuration, you just need to add the consumer group you configured earlier. For Endpoint, choose Messaging. Use service for the Shared access policy name (it is created by default with a new IoT Hub).

Now, move on to Outputs and click Add. Choose Power BI and click Authorize; if you don’t have Power BI yet, you can sign up.

Fill in the Dataset name and Table name. For the Authentication mode, we need to use User token if you have the free Power BI version (the v2 upgrade is not yet possible in the free version).

Then create a query that passes the measurements to the output and click Test query; you should see some results.
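A minimal pass-through version of such a query would look something like this sketch (the input/output aliases and the field names are assumptions, based on the RuuviTag payload from Part 1):

SELECT
    temperature,
    humidity,
    pressure,
    EventEnqueuedUtcTime
INTO
    [YourPowerBIOutput]
FROM
    [YourIoTHubInput]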

Now the only thing left to do is to visualize our data with Power BI. We will cover that part in the next post, but the infrastructure side is ready to rock. Grab a cup of coffee and pat yourself on the back 🙂


Get to know the whole project by reading parts 1 and 3 of the series here and here!

This blog text was originally published on Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!


      Monitoring your home temperature – Part 1: Setting up RuuviTags and Raspberry Pi

We moved to a new house three years ago, and since then we have had issues with floor temperatures. It’s hard to maintain a steady temperature in all rooms, so I wanted to build a solution that keeps track of them and shows the temperature trend for a whole week. That way, I know exactly what’s happening.

I will show you how to set up your own environment using the same method.

First, you need temperature sensors, and for those I recommend RuuviTags. You can find detailed information on their website; in short, they are very handy Bluetooth beacons with a battery that lasts multiple years. They measure temperature, movement, humidity and air pressure. There is also a mobile app for them, but it only shows the current status, so it didn’t fit my purpose.

So I needed to push the data somewhere, and the obvious choice was Azure. I will write more about the Azure side of things in the next part of this blog, but in this part we will set up a single RuuviTag and a Raspberry Pi for sending data. You can add more RuuviTags later, like I did, once everything is working as expected.

First, I recommend updating the RuuviTag to the latest firmware. This page has instructions for it:

      https://lab.ruuvi.com/dfu/

Updating the firmware to the latest version with nRF Connect

I used an iOS app called nRF Connect for it, and it went quite smoothly. After updating, you can check that your RuuviTag still works with the Ruuvi Station app:

      iOS or Android

You will also need a Raspberry Pi for sending data to the cloud. I recommend the Raspberry Pi Zero W, because we only need WiFi and Bluetooth for this. I have mine plugged in in my kitchen at the moment; it just needs to be in range of the RuuviTags’ Bluetooth signal (and of course WiFi).

      Raspberry Pi Zero W with clear plastic case

Mine has a clear acrylic case for protection, a power supply and a memory card. Data is not saved to the memory card, so there is no need for a bigger card than usual. I installed Raspbian Buster because, at the time, there were issues with the Azure IoT Hub Python modules on the latest Raspbian image.

Here is a page with instructions on how to install a Raspbian image to an SD card:

      https://www.raspberrypi.org/documentation/installation/installing-images/

After you have installed the image, boot up your Raspberry Pi and do the basic stuff: update it, change the password and so on.

      You can find how to configure your Raspberry here:

      https://www.raspberrypi.org/documentation/configuration/

After you have done all that, you need to set up the Python script and modules. Two modules are needed: the RuuviTag sensor module (ruuvitag_sensor) and the Azure IoT Hub client (azure-iot-device). You also need the MAC address of your RuuviTag; you can find it with the Ruuvi Station app under your RuuviTag’s settings.
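Assuming Python 3 and pip are already installed on the Raspberry Pi, the modules can be installed like this:

pip3 install ruuvitag_sensor azure-iot-device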

Then you need to create a Python script that reads the temperature data; here is my script:

import asyncio
import json
import os

from azure.iot.device.aio import IoTHubDeviceClient
from ruuvitag_sensor.ruuvitag import RuuviTag

# Replace 'xx:xx:xx:xx:xx:xx' with your RuuviTag's MAC address
sensor = RuuviTag('xx:xx:xx:xx:xx:xx')

# Read the latest measurements (temperature, humidity, pressure etc.) and
# serialize them as JSON so downstream services can parse the fields
sensor.update()
state = json.dumps(sensor.state)


async def main():
    # Fetch the connection string from an environment variable
    conn_str = os.getenv("IOTHUB_DEVICE_CONNECTION_STRING")

    # Create an instance of the device client using the connection string
    device_client = IoTHubDeviceClient.create_from_connection_string(conn_str)

    # Send a single message
    try:
        print("Sending message...")
        await device_client.send_message(state)
        print("Message successfully sent!")
    except Exception:
        print("Message sending failed!")

    # Finally, disconnect
    await device_client.disconnect()


if __name__ == "__main__":
    asyncio.run(main())

The code reads the current measurements from the RuuviTag and sends them to Azure IoT Hub. You can see that it needs the IOTHUB_DEVICE_CONNECTION_STRING environment variable; you don’t have that yet, so we will set it up in the next part.

You can put this script in crontab and run it, say, every 15 minutes, or whatever suits your needs; an example entry is below.
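A sketch of a matching crontab entry (the paths are placeholders; note that cron does not read your shell profile, so the connection string is sourced explicitly, for example from an environment file like the one set up in Part 2):

# m h dom mon dow command
*/15 * * * * . /home/pi/ruuvi.env && /usr/bin/python3 /home/pi/ruuvi_iothub.py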

Next time, we will set up the Azure side…

      Links:

      https://lab.ruuvi.com/dfu/ (RuuviTag firmware)

      https://github.com/ttu/ruuvitag-sensor (RuuviTag module)

      https://github.com/Azure/azure-iot-sdk-python (Azure IoT module)

      https://www.raspberrypi.org/ (Raspberry Pi)


Get to know the whole project by reading parts 2 and 3 of the series here and here.

This blog text was originally published on Senior Cloud Architect Sami Lahti’s personal Tech Life blog. Follow the blog to stay up to date about Sami’s writings!


        Controlling lights with Ikea Trådfri, Raspberry Pi and AWS


A few months back we purchased Ikea Trådfri smart lights for our home. However, after the initial hype they were almost forgotten, as controlling them was just too complicated with the ready-made tools. For example, the Ikea Home app works only on mobile, and it’s only possible to use it when connected to wifi. Luckily, Trådfri offers an API for controlling the lights, so it was possible to build our own customized UI.

        Connecting to Trådfri

With some quick googling I found a tutorial by Pimoroni that helped me get started. I already had a Raspberry Pi running as a home server, so all it took was to download and install the required packages listed in the tutorial. However, at the time of writing this article (and implementing my solution), the Pimoroni tutorial was a bit outdated. Because of that I just couldn’t get the communication working, but after banging my head against the wall for a while I found out that Ikea had changed the authentication method in 2017. I’ve contacted Pimoroni and asked them to update the article.

After getting the communication between the Raspberry Pi and the Trådfri Gateway working, I started writing the middleware server on the Raspberry Pi. As I’m a JavaScript developer, I chose to build this with NodeJS. Luckily, there is a node-tradfri-client package available that made the integration simple. Here’s a code example of how to connect to Trådfri and store the devices in application memory.
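A minimal sketch of that connection using node-tradfri-client’s documented API (the gateway address and security code are placeholders):

const { TradfriClient } = require("node-tradfri-client");

// Address (or IP) of the Trådfri gateway, and a map of devices kept in memory
const tradfri = new TradfriClient("gw-hostname-or-ip");
const devices = {};

async function connect() {
  // Exchange the security code printed under the gateway for an identity + PSK,
  // then connect with them (the identity/PSK pair can be stored and reused)
  const { identity, psk } = await tradfri.authenticate("SECURITY_CODE");
  await tradfri.connect(identity, psk);

  // Keep the device map up to date as devices report in or disappear
  tradfri
    .on("device updated", (device) => { devices[device.instanceId] = device; })
    .on("device removed", (instanceId) => { delete devices[instanceId]; });
  await tradfri.observeDevices();
}

connect();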

I also added ExpressJS to handle requests. With just a few lines of code, I had API endpoints (sketched after the list) for

• Listing all devices in our house
• Toggling a lightbulb or socket
• Turning a lightbulb or socket on
• Turning a lightbulb or socket off
• Setting the brightness of a lightbulb
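A sketch of what those endpoints can look like, reusing the tradfri client and devices map from the previous sketch (the route shapes are my assumptions; operateLight is node-tradfri-client’s documented method):

const express = require("express");
const app = express();

// List all devices in the house
app.get("/devices", (req, res) => res.json(Object.values(devices)));

// Turn a lightbulb on or off
app.post("/devices/:id/:state(on|off)", async (req, res) => {
  const device = devices[req.params.id];
  await tradfri.operateLight(device, { onOff: req.params.state === "on" });
  res.sendStatus(204);
});

// Set the brightness (0-100) of a lightbulb
app.post("/devices/:id/brightness/:value", async (req, res) => {
  const device = devices[req.params.id];
  await tradfri.operateLight(device, { dimmer: Number(req.params.value) });
  res.sendStatus(204);
});

app.listen(3000);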

        Writing the client application

As we wanted to control the lights from anywhere with any device, we chose to build a web app that can be used on a laptop, mobile and tablet without installing anything. After the first POC, we decided on the most common use cases, and Eija designed the UI in Sketch. The actual implementation was done using ReactJS with the help of Ant Design and react-draggable.

The source code for the client app is available on my Github.

        Making the app accessible from anywhere

In Finland we have fast and unlimited mobile data plans, and because of that we rarely have wifi enabled on our phones (nothing is more irritating than a bad wifi connection in the yard). To solve this, we chose to publish the app on the public web. As the UI is built as a single-page app, it’s basically free to host it with AWS S3 and Cloudfront. Since Cloudfront domains are random strings, we decided that this is enough security for now. It means that anyone who knows the Cloudfront domain can control our lights. If this becomes a problem, it’s quite simple to integrate some authentication method too.

        The app is also hosted on the Raspberry Pi on our local network, so guests can control the lights if they are connected to our wifi.

The bridge between the physical & digital worlds is not yet seamless

Even with this accessible application, we quickly figured out that we still need physical control buttons for the lights. For example, when going upstairs without bringing your phone, you might end up in a dark room with no way to turn on the lights. Luckily, Ikea provides physical switches for the Trådfri lights, so we had to make one more Ikea trip to get an extra controller for upstairs.

Another way to reduce the need for physical switches would be a smart speaker with voice recognition. Unfortunately, the Apple HomePod is the only speaker that currently understands Finnish, and it’s a tad out of our budget and probably not possible to integrate into our system either. Once Amazon adds Finnish support for Alexa, we’ll definitely try that.

…and while writing the previous chapter, I realised that since Apple supports Finnish, it’s possible to create a Siri Shortcut to control our lights. With a few more lines of code in the web app, it now supports anchor links from Shortcuts to trigger a preset lighting mode.

        Conclusion

It’s great that companies like Ikea provide open access to their smart lights, since at least for us the ready-made tooling was not enough. Also, with the help of the AWS serverless offering, we can host this solution securely in the cloud for free. If you have any questions about our solution, please feel free to get in touch.

For more tech content, follow Arto and Nordcloud Engineering on Medium.


          Look ma, I created a home IoT setup with AWS, Raspberry Pi, Telegram and RuuviTags


Hobby projects are a fun way to try and learn new things. This time, I decided to build a simple IoT setup for our home to collect and visualise information like temperature, humidity and pressure. While learning by doing was definitely one of the reasons I decided to embark on the project, I also wanted, for example, to control the radiators located in the attic: not necessarily by switching the power on/off, but by getting alarms if I’m heating too much or too little, so that I can tune the power manually. Saving some money, in practice. Also, it is nice to get reminders from the humidor that the cigars are drying out 😉

          I personally learned several things while working on it, and via this blog post, hopefully you can too!

          Overview

The idea of the project is relatively simple: place a few RuuviTag sensors around the house, collect the data and push it into the AWS cloud for permanent storage and additional processing. From there, several solutions can be built around the data, visualisation and alarms being only a few of them.

          Overview of the setup

The solution is built using AWS serverless technologies, which keep the running expenses low while requiring almost no maintenance. The following code samples are only snippets from the complete solution, but I’ve tried to collect the relevant parts.

          Collect data with RuuviTags and Raspberry Pi

The tag sensors periodically broadcast their data (humidity, temperature, pressure etc.) via Bluetooth LE. Because Ruuvi is an open-source-friendly product, there are already several ready-made solutions and libraries to utilise. I went with node-ruuvitag, which is a Node.js module (note: I found that the module works best with Linux and Node 8.x, but you may be successful with other combinations too).

The Raspberry Pi runs a small Node.js application that both listens to the incoming messages from the RuuviTags and forwards them to the AWS IoT service. The app communicates with the AWS cloud using the thingShadow client found in the AWS IoT Device SDK module, and it authenticates using X.509 certificates generated by you or by AWS IoT Core.

The script runs as a Linux service. While the tags broadcast data every second or so, the app on the Raspberry Pi forwards the data only once per 10 minutes for each tag, which is more than sufficient for the purpose. This is also an easy way to keep processing and storage costs very low in AWS. A sketch of the listener follows.
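A minimal sketch of such a listener, assuming the node-ruuvitag and aws-iot-device-sdk modules (the certificate paths, endpoint, thing name and intervals are placeholders matching the description above):

const ruuvi = require('node-ruuvitag');
const awsIot = require('aws-iot-device-sdk');

const FORWARD_INTERVAL_MS = 10 * 60 * 1000; // forward once per 10 minutes per tag
const lastSent = {};

// Authenticate with the X.509 certificates; host is your own IoT Core endpoint
const shadow = awsIot.thingShadow({
  keyPath: '/home/pi/certs/private.pem.key',
  certPath: '/home/pi/certs/certificate.pem.crt',
  caPath: '/home/pi/certs/AmazonRootCA1.pem',
  clientId: 'ruuvi-gateway',
  host: 'xxxxxxxxxxxxxx-ats.iot.eu-west-1.amazonaws.com'
});

shadow.on('connect', () => shadow.register('ruuvi-gateway', {}, () => {
  console.log('Connected to AWS IoT');
}));

ruuvi.on('found', (tag) => {
  tag.on('updated', (data) => {
    // Tags broadcast every second or so; throttle to one update per interval
    const now = Date.now();
    if (now - (lastSent[tag.id] || 0) < FORWARD_INTERVAL_MS) return;
    lastSent[tag.id] = now;
    shadow.update('ruuvi-gateway', { state: { reported: { [tag.id]: data } } });
  });
});

// All the tags may not appear on the first scan: re-run discovery every 30 minutes
setInterval(() => ruuvi.findTags(), 30 * 60 * 1000);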

When building an IoT or big data solution, one may initially aim for near real-time data transfers and high data resolution, while the solution built on top of it may not really require them. For example, sending data in batches once an hour at a 10-minute resolution may be sufficient, and it is also cheaper to execute.

When running the broadcast-listening script on the Raspberry Pi, there are a couple of things to consider:

• All the tags may not appear at the first reading: (re)run ruuvi.findTags() every 30 minutes or so to ensure all the tags get collected
• The Raspberry Pi can drop from the WLAN: set up a script to automatically reconnect in case that happens

With these in place, the setup has been working without issues so far.

          Process data in AWS using IoT Core and friends

          AWS processing overview

Once the data hits AWS IoT Core, there can be several rules for handling the incoming messages. In this case, I set up a lambda to be triggered for each message. AWS IoT also provides a way to do the DynamoDB inserts directly from the messages, but I found it a more versatile and development-friendly approach to put a lambda in between.

          AWS IoT Core act rule
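A hedged sketch of such a lambda (the table name, key schema and field names are my assumptions; the IoT rule delivers the published JSON payload as the event object):

// Hypothetical lambda: store each incoming measurement in DynamoDB
const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  await db.put({
    TableName: 'ruuviMeasurements',
    Item: {
      tagId: event.id,          // partition key
      timestamp: Date.now(),    // sort key
      temperature: event.temperature,
      humidity: event.humidity,
      pressure: event.pressure
    }
  }).promise();
};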

DynamoDB works well as permanent storage in this case: the data structure is simple, and the service provides on-demand scalability and billing. Just pay attention when designing the table structure and make sure it fits your use cases, as changes done afterwards may be laborious. For more information about the topic, I recommend watching a talk about Advanced Design Patterns for DynamoDB.

DynamoDB structure I ended up using

          Visualise data with React and Highcharts

Once we have the data stored in a semi-structured format in the AWS cloud, it can be visualised or processed further. I set up a periodic lambda to retrieve the data from DynamoDB and generate CSV files into a public S3 bucket for the React clients to pick up. The CSV format was preferred over, for example, JSON to decrease the file size. At some point, I may also try out the Parquet format and see if it suits the purpose even better.

          Overview visualisations for each tag

The React application fetches the CSV file from S3 using a custom hook and passes it to a Highcharts component, sketched below.
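A sketch of that wiring, assuming the highcharts-react-official wrapper and CSV rows of timestamp,value pairs (the bucket URL and the parsing are placeholders):

import { useEffect, useState } from 'react';
import Highcharts from 'highcharts';
import HighchartsReact from 'highcharts-react-official';

// Hypothetical hook: fetch and parse a CSV file from the public bucket
function useMeasurements(url) {
  const [rows, setRows] = useState([]);
  useEffect(() => {
    fetch(url)
      .then((res) => res.text())
      .then((csv) => setRows(csv.trim().split('\n').map((line) => {
        const [ts, value] = line.split(',');
        return [Number(ts), Number(value)];
      })));
  }, [url]);
  return rows;
}

export function TemperatureChart() {
  const data = useMeasurements('https://your-bucket.s3.amazonaws.com/temperature.csv');
  return (
    <HighchartsReact
      highcharts={Highcharts}
      options={{ title: { text: 'Temperature' }, series: [{ data }] }}
    />
  );
}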

During my professional career, I’ve learnt that data visualisations often cause various challenges due to limitations and/or bugs in the implementations. After using several chart components, I personally prefer Highcharts over other libraries, if possible.

          Snapshot from the tag placed outside

          Send notifications with Telegram bots

Visualisations work well for seeing the status and how the values vary over time. However, in case something drastic happens, like the humidor humidity dropping below the preferred level, I’d like to get an immediate notification about it. This can be done, for example, using Telegram bots:

1. Define the limits for each tag, for example in a DynamoDB table
2. Compare the limits with the actual measurements in a custom lambda whenever data arrives
3. If a value exceeds the limit, trigger an SNS message (so that we can subscribe several actions to it)
4. Listen to the SNS topic and send a Telegram message to a message group you’re participating in (sketched below)
5. Profit!
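A hedged sketch of step 4 as a lambda subscribed to the SNS topic (the bot token and chat id come from environment variables here; sendMessage is the Telegram Bot API’s documented method):

// Hypothetical lambda: forward SNS alarm messages to a Telegram group
const https = require('https');

exports.handler = async (event) => {
  const text = event.Records[0].Sns.Message;
  const url = `https://api.telegram.org/bot${process.env.BOT_TOKEN}/sendMessage` +
    `?chat_id=${process.env.CHAT_ID}&text=${encodeURIComponent(text)}`;
  await new Promise((resolve, reject) => {
    https.get(url, (res) => { res.resume(); res.on('end', resolve); }).on('error', reject);
  });
};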

Limits in DynamoDB

          Summary

By now, you should have some kind of understanding of how one can combine IoT sensors, AWS services and outputs like web apps and Telegram nicely together using serverless technologies. If you’ve built something similar or taken a very different approach, I’d be happy to hear about it!

          Price tag

Building and running your own IoT solution using RuuviTags, a Raspberry Pi and the AWS cloud does not require big investments. Here are some approximate expenses from the setup:

• 3-pack of RuuviTags: 90€ (ok, I wish these were a little bit cheaper so I’d buy more of them, since the product is nice)
• Raspberry Pi with accessories: 50€
• Energy used by the RPi: http://www.pidramble.com/wiki/benchmarks/power-consumption
• Lambda executions: $0.30/month
• SNS notifications: $0.01/month
• S3 storage: $0.01/month
• DynamoDB: $0.01/month

And after looking into the numbers, there are several places to optimise as well. For example, some lambdas are executed more often than really needed.

          Next steps

I’m happy to say this hobby project has reached that certain level of readiness where it runs smoothly day after day and is valuable to me. As a next step, I’m planning to add some kind of time range selection. As the amount of data increases, it will be interesting to see how the values vary in the long term. It would also be a good exercise to integrate some additional AWS services and detect drastic changes or communication failures between the device and the cloud when they happen. This or that, at least now I have a good base to continue from, or to build something totally different next time 🙂

          References, credits and derivative work

This project is by no means a snowflake and has been inspired by existing projects and work.


          For more content follow Juha and Nordcloud Engineering on Medium.

At Nordcloud we are always looking for talented people. If you enjoyed reading this post and would like to work with public cloud projects on a daily basis, check out our open positions here.
