Look ma, I created a home IoT setup with AWS, Raspberry Pi, Telegram and RuuviTags

CATEGORIES

Blog, Tech Community

Hobby projects are a fun way to try and learn new things. This time, I decided to build a simple IoT setup for home, to collect and visualise information like temperature, humidity and pressure. While learning by doing was definitely one of the reasons I embarked on the project, I also had practical goals: for example, I wanted to keep an eye on the radiators located in the attic. Not necessarily by switching power on/off, but by getting alarms if I’m heating too much or too little, so that I can tune the power manually and save some money in practice. It is also nice to get reminders from the humidor that the cigars are drying out 😉

I personally learned several things while working on it, and via this blog post, hopefully you can too!

Overview

The idea of the project is relatively simple: place a few RuuviTag sensors around the house, collect the data and push it into the AWS cloud for permanent storage and additional processing. From there, several solutions can be built around the data, visualisation and alarms being only a few of them.

Overview of the setup

The solution is built using AWS serverless technologies, which keeps the running expenses low while requiring almost no maintenance. The following code samples are only snippets from the complete solution, but I’ve tried to collect the relevant parts.

Collect data with RuuviTags and Raspberry Pi

The tag sensors broadcast their data (humidity, temperature, pressure etc.) periodically via Bluetooth LE. Because Ruuvi is an open source friendly product, there are already several ready-made solutions and libraries to utilise. I went with node-ruuvitag, which is a Node.js module. (Note: I found that the module works best with Linux and Node 8.x, but you may be successful with other combinations, too.)

The Raspberry Pi runs a small Node.js application that both listens for the incoming messages from the RuuviTags and forwards them to the AWS IoT service. The app communicates with the AWS cloud using the thingShadow client, found in the AWS IoT Device SDK module. The application authenticates using X.509 certificates generated by you or by AWS IoT Core.

The script runs as a Linux service. While the tags broadcast data every second or so, the app on the Raspberry Pi forwards the data only once every 10 minutes for each tag, which is more than sufficient for the purpose. This is also an easy way to keep the processing and storage costs in AWS very low.
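To illustrate, a simplified sketch of the listener and the 10-minute throttle could look like this. The certificate paths, endpoint and client id are placeholders for your own AWS IoT configuration, and the shadow wiring is reduced to the essentials:

```javascript
// Listen for RuuviTag broadcasts and forward each tag's data to
// AWS IoT at most once every 10 minutes.
const FORWARD_INTERVAL_MS = 10 * 60 * 1000;

// Pure helper: is it time to forward this tag's reading again?
function shouldForward(tagId, now, lastSent, intervalMs) {
  return !(tagId in lastSent) || now - lastSent[tagId] >= intervalMs;
}

function startGateway() {
  const ruuvi = require('node-ruuvitag');        // BLE listener
  const awsIot = require('aws-iot-device-sdk');  // thingShadow client

  const shadow = awsIot.thingShadow({
    keyPath: './certs/private.pem.key',          // placeholder paths
    certPath: './certs/certificate.pem.crt',
    caPath: './certs/AmazonRootCA1.pem',
    clientId: 'ruuvi-gateway',                   // placeholder id
    host: 'example.iot.eu-west-1.amazonaws.com'  // placeholder endpoint
  });
  shadow.register('ruuvi-gateway', {}, () => { /* shadow ready */ });

  const lastSent = {};
  ruuvi.on('found', tag => {
    tag.on('updated', data => {
      const now = Date.now();
      if (shouldForward(tag.id, now, lastSent, FORWARD_INTERVAL_MS)) {
        lastSent[tag.id] = now;
        shadow.update('ruuvi-gateway', {
          state: { reported: { tagId: tag.id, ...data } }
        });
      }
    });
  });
}

// startGateway() is only invoked on the Pi itself.
```

Keeping the throttle check as a separate pure function makes the timing logic easy to test without any Bluetooth hardware.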

When building an IoT or big data solution, one may initially aim for near real-time data transfers and high data resolution, even though the solution built on top of it may not really require them. Sending data in batches once an hour at a 10-minute resolution may be perfectly sufficient, and it is also cheaper to execute.

When running the broadcast listening script on the Raspberry Pi, there are a couple of things to consider:

  • All the tags may not appear at the first reading: (re)run ruuvi.findTags() every 30 minutes or so to ensure all the tags get collected
  • The Raspberry Pi can drop from the WLAN: set up a script to automatically reconnect in case that happens
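The periodic re-scan from the first point can be a simple timer around ruuvi.findTags(); the helper and callback names here are my own illustration:

```javascript
// Re-scan for RuuviTags every 30 minutes so late-appearing tags
// are eventually picked up.
const RESCAN_INTERVAL_MS = 30 * 60 * 1000;

// Pure helper: merge newly found tag ids into the known list.
function mergeTagIds(known, found) {
  return [...new Set([...known, ...found])];
}

function startRescan(ruuvi, knownIds, onNewTags) {
  setInterval(() => {
    ruuvi.findTags()
      .then(tags => {
        const merged = mergeTagIds(knownIds, tags.map(t => t.id));
        const fresh = merged.filter(id => !knownIds.includes(id));
        knownIds.splice(0, knownIds.length, ...merged);
        if (fresh.length > 0) {
          onNewTags(fresh);  // e.g. attach 'updated' listeners
        }
      })
      .catch(() => { /* no tags found this round; try again later */ });
  }, RESCAN_INTERVAL_MS);
}
```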

With these in place, the setup has been working without issues so far.

Process data in AWS using IoT Core and friends

AWS processing overview

Once the data hits AWS IoT Core, several rules can handle the incoming messages. In this case, I set up a Lambda function to be triggered for each message. AWS IoT also provides a way to do the DynamoDB inserts directly from the messages, but I found it a more versatile and development-friendly approach to put a Lambda in between instead.

AWS IoT Core act rule

DynamoDB works well as permanent storage in this case: the data structure is simple, and the service provides on-demand scalability and billing. Just pay attention when designing the table structure and make sure it fits your use cases, as changes done afterwards may be laborious. For more information about the topic, I recommend watching a talk about Advanced Design Patterns for DynamoDB.

DynamoDB structure I ended up using

Visualise data with React and Highcharts

Once we have the data stored in a semi-structured format in the AWS cloud, it can be visualised or processed further. I set up a periodic Lambda to retrieve the data from DynamoDB and generate CSV files into a public S3 bucket for the React clients to pick up. The CSV format was preferred over, for example, JSON to decrease the file size. At some point, I may also try out the Parquet format and see if it suits the purpose even better.

Overview visualisations for each tag

The React application fetches the CSV file from S3 using a custom hook and passes it to a Highcharts component.
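As a rough sketch (the hook and parser names are my own, and the real component file also needs the usual React imports):

```javascript
// Parse the CSV produced in the backend into Highcharts-friendly
// [timestamp, value] pairs. The column layout is assumed to match
// the backend: timestamp first, then one column per metric.
function parseCsvColumn(csv, column) {
  const [header, ...lines] = csv.trim().split('\n');
  const index = header.split(',').indexOf(column);
  return lines.map(line => {
    const cells = line.split(',');
    return [Number(cells[0]), Number(cells[index])];
  });
}

// Custom hook (assumes useState/useEffect are imported from React
// in the actual component file).
function useCsvSeries(url, column) {
  const [series, setSeries] = useState([]);
  useEffect(() => {
    fetch(url)
      .then(res => res.text())
      .then(csv => setSeries(parseCsvColumn(csv, column)));
  }, [url, column]);
  return series;
}
```

The component can then pass the returned series straight into the Highcharts `series` option.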

During my professional career, I’ve learnt that data visualisations often cause various challenges due to limitations and/or bugs in the implementation. Having used several chart components, I personally prefer Highcharts over other libraries whenever possible.

Snapshot from the tag placed outside

Send notifications with Telegram bots

Visualisations work well for seeing the status and how the values vary over time. However, in case something drastic happens, like the humidor humidity dropping below the preferred level, I’d like to get an immediate notification about it. This can be done, for example, using Telegram bots:

  1. Define the limits for each tag, for example in a DynamoDB table
  2. Compare the limits with the actual measurements in a custom Lambda whenever data arrives
  3. If a value exceeds its limit, trigger an SNS message (so that several actions can be subscribed to it)
  4. Listen to the SNS topic and send a Telegram message to a message group you’re participating in
  5. Profit!
Limits in DynamoDB

 

Summary

By now, you should have some kind of understanding of how one can nicely combine IoT sensors, AWS services and outputs like web apps and Telegram using serverless technologies. If you’ve built something similar or taken a very different approach, I’d be happy to hear about it!

Price tag

Building and running your own IoT solution using RuuviTags, a Raspberry Pi and the AWS cloud does not require big investments. Here are some approximate expenses from the setup:

  • 3-pack of RuuviTags: €90 (ok, I wish these were a little bit cheaper so I’d buy more of them, since the product is nice)
  • Raspberry Pi with accessories: €50
  • Energy used by the RPi: http://www.pidramble.com/wiki/benchmarks/power-consumption
  • Lambda executions: $0.30/month
  • SNS notifications: $0.01/month
  • S3 storage: $0.01/month
  • DynamoDB: $0.01/month

And after looking into the numbers, there are several places to optimise as well. For example, some Lambdas are executed more often than really needed.

Next steps

I’m happy to say this hobby project has reached that certain level of readiness where it runs smoothly day after day and is valuable to me. As a next step, I’m planning to add some kind of time range selection: as the amount of data increases, it will be interesting to see how the values vary in the long term. It would also be a good exercise to integrate some additional AWS services to detect drastic changes or communication failures between the device and the cloud when they happen. This or that, at least now I have a good base to continue from, or to build something totally different next time 🙂

References, credits and derivative work

This project is by no means a snowflake and has been inspired by existing projects and work:

 


For more content follow Juha and Nordcloud Engineering on Medium.

At Nordcloud we are always looking for talented people. If you enjoyed reading this post and would like to work with public cloud projects on a daily basis — check out our open positions here.

Get in Touch.

Let’s discuss how we can help with your cloud journey. Our experts are standing by to talk about your migration, modernisation, development and skills challenges.









    Building Cloud-Based IoT Solutions and Serverless Web-Apps

    Our Cloud Application Architect Afaque Hussain has been on his cloud journey for some years already. At Nordcloud he builds cloud-native IoT solutions in our Data-Driven team. Here’s his story!

    aws cloud IoT internet of things web applications development

    1. Where are you from and how did you end up at Nordcloud?

    I’ve been living in Finland for the past 7 years and I’m originally from India. Prior to Nordcloud, I was working at Helvar, developing cloud-based, IoT-enabled lighting solutions. I’ve been excited about public cloud services ever since I got to know them, and I generally attend cloud conferences and meetups. During one such conference, I met the Nordcloud team, who introduced me to the company and invited me for an interview, and that’s how my Nordcloud journey began.

    2. What is your core competence? On top of that, please also tell us briefly about your role and projects.

    My core competence is building cloud-based web services that act as an IoT platform to which IoT devices connect and exchange data. I generally prefer serverless computing and Infrastructure as Code, and primarily use AWS and JavaScript (Node.js) in our projects.

    My current role is Cloud Application Architect, where I’m involved in our customer projects, designing and implementing end-to-end IoT solutions. In our current project, we’re building a web service with which our customer can connect, monitor and manage their large fleet of sensors and gateways. The CI/CD pipelines for our project have been built using AWS Developer Tools such as CodePipeline, CodeBuild & CodeDeploy. Our CI/CD pipelines have been implemented as Infrastructure as Code, which enables us to deploy another instance of them in a short period of time. Cool!

    3. What sets you on fire / what’s your favourite thing technically with public cloud?

    The ever increasing serverless service offerings by public cloud vendors, which enables us to rapidly build web-applications & services.

    4. What do you like most about working at Nordcloud?

    Apart from the opportunity to work on interesting projects, I like my peers. They’re very talented, knowledgeable and ready to offer help when needed.

    5. What is the most useful thing you have learned at Nordcloud?

    Although I’ve learnt a lot at Nordcloud, I believe the knowledge of the toolkit and best practices for cloud-based web application development has been the most useful thing I’ve learnt.

    6. What do you do outside work?

    I like doing sports and I generally play cricket, tennis or hit the gym. During the weekends, I generally spend time with my family, exploring the beautiful Finnish nature, people and different cuisines.

    7. How would you describe Nordcloud’s culture in 3 words?

    Nurturing, collaborative & rewarding.

    8. Best Nordcloudian memory?

    Breakfast @ Nordcloud every Thursday. I always look forward to this day. I get to meet other Nordcloudians, exchange ideas or just catch up over a delicious breakfast!

     










      Counting Faces with AWS DeepLens and IoT Analytics

      CATEGORIES

      Blog, Tech Community

      It’s pretty easy to detect faces with AWS DeepLens. Amazon provides a pre-trained machine learning model for face detection so you won’t have to deal with any low-level algorithms or training data. You just deploy the ML model and a Lambda function to your DeepLens device and it starts automatically sending data to the cloud.

      In the cloud you can leverage AWS IoT and IoT Analytics to collect and process the data received from DeepLens. No programming is needed. All you need to do is orchestrate the services to work together and enter one SQL query that calculates daily averages of the faces seen.

      Connecting DeepLens to the cloud

      We’ll assume that you have been able to obtain a DeepLens device. They are currently only being sold in the US, so if you live in another country, you may need to get creative.

      Before you can do anything with your DeepLens, you must connect it to the Amazon cloud. You can do this by opening the DeepLens service in AWS Console and following the instructions to register your device. We won’t go through the details here since AWS already provides pretty good setup instructions.

      Deploying a DeepLens project

      To deploy a machine learning application on DeepLens, you need to create a project. Amazon provides a sample project template for face detection. When you create a DeepLens project based on this template, AWS automatically creates a Lambda function and attaches the pre-trained face detection machine learning model to the project.

      The default face detection model is based on MXNet. You can also import your own machine learning models developed with TensorFlow, Caffe and other deep learning frameworks. You’ll be able to train these models with the AWS SageMaker service or using a custom solution. For now, you can just stick with the pre-trained model to get your first application running.

      Once the project has been created, you can deploy it to your DeepLens device. DeepLens can run only one project at a time, so your device will be dedicated to running just one machine learning model and Lambda function continuously.

      After a successful deployment, you will start receiving AWS IoT MQTT messages from the device. The sample application sends messages continuously, even if no faces are detected.

      You probably want to optimize the Lambda function by adding an “if” clause to only send messages when one or more faces are actually detected. Otherwise you’ll be sending empty data every second. This is fairly easy to change in the Python code, so we’ll leave it as an exercise for the reader.

      At this point, take note of your DeepLens infer topic. You can find the topic by going to the DeepLens Console and finding the Project Output view under your Device. Use the Copy button to copy it to your clipboard.

      Setting up AWS IoT Analytics

      You can now set up AWS IoT Analytics to process your application data. Keep in mind that because DeepLens currently only works in the North Virginia region (us-east-1), you also need to create your AWS IoT Analytics resources in this region.

      First you’ll need to create a Channel. You can choose any Channel ID and keep most of the settings at their defaults.

      When you’re asked for the IoT Core topic filter, paste the topic you copied earlier from the Project Output view. Also, use the Create new IAM role button to automatically create the necessary role for this application.

      Next you’ll create a Pipeline. Select the previously created Channel and choose Actions / Create a pipeline from this channel.

      AWS Console will ask you to select some Attributes for the pipeline, but you can ignore them for now and leave the Pipeline activities empty. These activities can be used to preprocess messages before they enter the Data Store. For now, we just want the messages to be passed through as they are.

      At the end of the pipeline creation, you’ll be asked to create a Data Store to use as the pipeline’s output. Go ahead and create it with the default settings and choose any name for it.

      Once the Pipeline and the Data Store have been created, you will have a fully functional AWS IoT Analytics application. The Channel will start receiving incoming DeepLens messages from the IoT topic and sending them through the Pipeline to the Data Store.

      The Data Store is basically a database that you can query using SQL. We will get back to that in a moment.

      Reviewing the auto-created AWS IoT Rule

      At this point it’s a good idea to take a look at the AWS IoT Rule that AWS IoT Analytics created automatically for the Channel you created.

      You will find IoT Rules in the AWS IoT Core Console under the Act tab. The rule will have one automatically created IoT Action, which forwards all messages to the IoT Analytics Channel you created.

      Querying data with AWS IoT Analytics

      You can now proceed to create a Data Set in IoT Analytics. The Data Set will execute a SQL query over the data in the Data Store you created earlier.

      Find your way to the Analyze / Data sets section in the IoT Analytics Console. Select Create and then Create SQL.

      The console will ask you to enter an ID for the Data Set. You’ll also need to select the Data Store you created earlier to use as the data source.

      The console will then ask you to enter this SQL query:

      SELECT DATE_TRUNC('day', __dt) AS Day, COUNT(*) AS Faces
      FROM deeplensfaces
      GROUP BY DATE_TRUNC('day', __dt)
      ORDER BY DATE_TRUNC('day', __dt) DESC

      Note that “deeplensfaces” is the ID of the Data Source you created earlier. Make sure you use the same name consistently. Our screenshots may have different identifiers.

      The Data selection window can be left to None.

      Use the Frequency setting to set up a schedule for your SQL query. Select Daily so that the SQL query will run automatically every day and replace the previous results in the Data Set.

      Finally, use Actions / Run Now to execute the query. You will see a preview of the current face count results, aggregated as daily total sums. These results will be automatically updated every day according to the schedule you defined.

      Accessing the Data Set from applications

      Congratulations! You now have IoT Analytics all set up and it will automatically refresh the face counts every day.

      To access the face counts from your own applications, you can write a Lambda function and use the AWS SDK to retrieve the current Data Set content. This example uses Node.js:

      const AWS = require('aws-sdk')
      const https = require('https')

      const iotanalytics = new AWS.IoTAnalytics()
      iotanalytics.getDatasetContent({
        datasetName: 'deeplensfaces',
      }).promise().then(function (response) {
        // The entries array contains a signed URI to the CSV results
        https.get(response.entries[0].dataURI, function (res) {
          let csv = ''
          res.on('data', function (chunk) { csv += chunk })
          res.on('end', function () { console.log(csv) })
        })
      })

      The response contains a signed dataURI which points to the S3 bucket with the actual results in CSV format. Once you download the content, you can do whatever you wish with the CSV data.

      Conclusion

      This has been a brief look at how to use DeepLens and IoT Analytics to count the number of faces detected by the DeepLens camera.

      There’s still room for improvement. Amazon’s default face detection model detects faces in every video frame, but it doesn’t keep track of whether the same face has already been seen in previous frames.

      It gets a little more complicated to enhance the system to detect individual persons, or to keep track of faces entering and exiting frames. We’ll leave all that as an exercise for now.

      If you’d like some help in developing machine learning applications, please feel free to contact us.










        Looking ahead: what’s next for AI in manufacturing?

        CATEGORIES

        Blog, Tech Community

        AI and manufacturing have been on an exciting journey together. It’s a combination that is fast changing the world of manufacturing: 92 percent of senior manufacturing executives believe that the ‘Smart Factory’ will empower their staff to work smarter and increase productivity.

        How does AI benefit manufacturers?

        Some of the biggest companies are already adopting AI. Why? A big reason is increased uptime and productivity through predictive maintenance. AI enables industrial technology to track its own performance and spot trends and looming problems that humans might miss. This gives the operator a better chance of planning critical downtime and avoiding surprises.

        But what’s the next big thing? Let’s look to the immediate future, to what is on the horizon and a very real possibility for manufacturers.

        Digital twinning

        According to Deloitte, ‘a digital twin is an evolving digital profile of the historical and current behaviour of a physical object or process that helps optimize business performance.’

        Digital twinning will be effective in the manufacturing industry because it could replace computer-aided design (CAD). CAD is highly effective in computer-simulated environments and has shown some success in modelling complex environments, yet its limitations lie in the interactions between the components and the full lifecycle processes.

        The power of a digital twin is in its ability to provide a real-time link between the digital and physical world of any given product or system. A digital twin is capable of providing more realistic measurements of unpredictability. The first steps in this direction have been taken by cloud-based building information modelling (BIM), within the AEC industry. It enables a manufacturer to make huge design and process changes ahead of real-life occurrences.

        Predictive maintenance

        Take a wind farm. You’re manufacturing the turbines that will stand in a wind farm for hundreds of years. With the help of a CAD design you might be able to ‘guesstimate’ the long-term wear, tear and stress that those turbines might encounter in different weather conditions. But a digital twin will use predictive machine learning to show the likely effects of varying environmental events, and what impact that will have on the machinery.

        This will then affect future designs and real-time manufacturing changes. The really futuristic aspect will be the incredible increases in accuracy as the AI is ‘trained.’

        Smart factories

        An example of a digital twin in a smart factory setting would be to create a virtual replica of what is happening on the factory floor in (almost) real-time. Using thousands or even millions of sensors to capture real-time performance and data, artificial intelligence can assess (over a period of time) the performance of a process, machine or even a person. Cloud-based AI, such as the technologies offered by Microsoft Azure, has the flexibility and capacity to process this volume of data.

        This would enable the user to uncover unacceptable trends in performance. Decision-making around changes and training will be based on data, not gut feeling. This will enhance productivity and profitability.

        The uses of AI in future manufacturing technologies are varied. Contact us to discuss the possibilities and see how we can help you take the next steps towards the future.










          What is Amazon FreeRTOS and why should you care?

          CATEGORIES

          Blog, Tech Community

          At Nordcloud, we’ve been working with AWS IoT since Amazon released it

          We’ve enabled some great customer success stories by leveraging the high-level features of AWS IoT. We combine those features with our Serverless development expertise to create awesome cloud applications. Our projects have ranged from simple data collection and device management to large-scale data lakes and advanced edge computing solutions.

           

          In this article we’ll take a look at what Amazon FreeRTOS can offer for IoT solutions

          First released in November 2017, Amazon FreeRTOS is a microcontroller (MCU) operating system. It’s designed for connecting lightweight microcontroller-based devices to AWS IoT and AWS Greengrass. This means you can have your sensor and actuator devices connect directly to the cloud, without having smart gateways acting as intermediaries.


          What are microcontrollers?

          If you’re unfamiliar with microcontrollers, you can think of them as a category of smart devices that are too lightweight to run a full Linux operating system. Instead, they run a single application customized for some particular purpose. We usually call these applications firmware. Developers combine various operating system components and application components into a firmware image and “burn” it on the flash memory of the device. The device then keeps performing its task until a new firmware is installed.

          Firmware developers have long used the original FreeRTOS operating system to develop applications on various hardware platforms. Amazon has extended FreeRTOS with a number of features to make it easy for applications to connect to AWS IoT and AWS Greengrass, which are Amazon’s solutions for cloud based and edge based IoT. Amazon FreeRTOS currently includes components for basic MQTT communication, Shadow updates, AWS Greengrass endpoint discovery and Over-The-Air (OTA) firmware updates. You get these features out-of-the-box when you build your application on top of Amazon FreeRTOS.

          Amazon also runs a FreeRTOS qualification program for hardware partners. Qualified products have certain minimum requirements to ensure that they support Amazon FreeRTOS cloud features properly.

          Use cases and scenarios

          Why should you use Amazon FreeRTOS instead of Linux? Perhaps your current IoT solution depends on a separate Linux based gateway device, which you could eliminate to cut costs and simplify the solution. If your ARM-based sensor devices already support WiFi and are capable of running Amazon FreeRTOS, they could connect directly to AWS IoT without requiring a separate gateway.

          Edge computing scenarios might require a more powerful, Linux based smart gateway that runs AWS Greengrass. In such cases you can use Amazon FreeRTOS to implement additional lightweight devices such as sensors and actuators. These devices will use MQTT to talk to the Greengrass core, which means you don’t need to worry about integrating other communications protocols to your system.

          In general, microcontroller based applications have the benefit of being much simpler than Linux based systems. You don’t need to deal with operating system updates, dependency conflicts and other moving parts. Your own firmware code might introduce its own bugs and security issues, but the attack surface is radically smaller than with a full operating system installation.

          How to try it out

          If you are interested in Amazon FreeRTOS, you might want to order one of the many compatible microcontroller boards. They all sell for less than $100 online. Each board comes with its own set of features and a toolchain for building applications. Make sure to pick one that fits your purpose and requirements. In particular, not all of the compatible boards include support for Over-The-Air (OTA) firmware upgrades.

          At Nordcloud we have tried out two Amazon-qualified boards at the time of writing:

          • STM32L4 Discovery Kit
          • Espressif ESP-WROVER-KIT (with Over-The-Air update support)

          ST provides their own SystemWorkBench Ac6 IDE for developing applications on STM32 boards. You may need to navigate the websites a bit, but you’ll find versions for Mac, Linux and Windows. Amazon provides instructions for setting everything up and downloading a preconfigured Amazon FreeRTOS distribution suitable for the device. You’ll be able to open it in the IDE, customize it and deploy it.

          Espressif provides a command line based toolchain for developing applications on ESP32 boards which works on Mac, Linux and Windows. Amazon provides instructions on how to set it up for Amazon FreeRTOS. Once the basic setup is working and you are able to flash your device, there are more instructions for setting up Over-The-Air updates.

          Both of these devices are development boards that will let you get started easily with any USB-equipped computer. For actual IoT deployments you’ll probably want to look into more customized hardware.

          Conclusion

          We hope you’ll find Amazon FreeRTOS useful in your IoT applications.

          If you need any help in planning and implementing your IoT solutions, feel free to contact us.










            Nordcloud @ Smart Factory 2018 in Jyväskylä – 20-22.11.2018

            CATEGORIES

            Blog

            Nordcloud at Smart Factory 2018 Jyväskylä

            Make sure to visit Nordcloud’s booth (C430) at the ‘Smart Factory 2018’ event, which is held in The Congress and Trade Fair Centre Jyväskylän Paviljonki, Jyväskylä between the 20-22.11.2018.

             

            Smart Factory 2018 is an event focused on how to utilise opportunities offered by digitalisation

            The event gathers together the themes of Industry 4.0 and the related technology, service and expertise offering. Smart Factory 2018 is targeted at all operators who are involved with changes associated with digital transformation in production activities and related new services and concepts. It strongly emphasizes the already known future-building themes, such as automation, machine vision, robotics, industrial internet and cybersecurity.

            You can register for the event here:

            Register to Smart Factory 2018

            See you there!

            Nordcloud at Smart Factory 2018










              Leveraging AWS Greengrass for Edge IoT Solutions

              CATEGORIES

              Blog

              There is a growing demand for intelligent edge solutions that not only collect data, but also control on-premise equipment at industrial customer sites. Historically such solutions have often been based on low-level custom firmware that has required technical specialists to develop and maintain.

              AWS Greengrass has significantly lowered the barrier for edge IoT development by extending familiar cloud technologies to the edge. Cloud architects and cloud application developers can use their existing knowledge of serverless development and programming languages they already master. In many cases the same exact code can be run both in the cloud and at the edge as a Greengrass Lambda application. This has proven very useful for use cases like KPI algorithms and diagnostic logic that need to be executed both centrally in the cloud and in a distributed fashion on the equipment located at the edge.

              Building blocks for IoT

              It’s important to keep in mind that Amazon usually offers the building blocks for making applications, not the actual end-user applications. This also applies to Greengrass and AWS IoT in general. You get an extensive set of features for building IoT applications, but you still need to put them together into an application that solves the business case requirements. Amazon calls this eliminating the “undifferentiated heavy lifting”. Application developers don’t have to deal with low level issues like scaling databases or designing communication protocols which have already been solved in general. Instead they can focus on implementing the business-specific features and logic relevant to the use case.

              In fact, as the AWS IoT platform has evolved in recent years, the need for custom databases has been almost completely eliminated. AWS IoT Device Management provides a flexible way to organize IoT devices into groups and hierarchies. Custom metadata can be attached to the devices, enabling indexing and searching. You no longer start a project by designing database tables from scratch, but instead you first look at what AWS IoT already offers you out-of-the-box.

              The same principle applies to business logic. In many cases there is no need to write custom code, because AWS IoT’s MQTT based messaging platform offers simpler ways to filter, route and process data. This is particularly important for datalake solutions, because the amount of data processed can be quite large. If you can completely omit custom code, you don’t have to worry about scaling it. The best datalake solutions simply connect a few services like AWS IoT, Kinesis Firehose and Amazon S3 together, and the data is automatically collected into S3 buckets regardless of its size and bandwidth.
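              To give a feel for how little glue such a pipeline needs, here is a sketch of the kind of topic rule that forwards matching MQTT messages straight into a Kinesis Firehose delivery stream. The rule name, topic pattern, stream name and role ARN below are hypothetical placeholders, not from any specific solution:

```python
# Hypothetical AWS IoT topic rule payload that forwards sensor messages
# to a Kinesis Firehose delivery stream -- no custom processing code.
# All names (topic, stream, role ARN) are illustrative placeholders.
rule_payload = {
    # Select every message published under sensors/+/data
    "sql": "SELECT * FROM 'sensors/+/data'",
    "awsIotSqlVersion": "2016-03-23",
    "actions": [
        {
            "firehose": {
                "deliveryStreamName": "sensor-datalake-stream",
                "roleArn": "arn:aws:iam::123456789012:role/iot-firehose-role",
                "separator": "\n",  # newline-delimit records in the S3 objects
            }
        }
    ],
}

# With valid AWS credentials, this payload could be registered via boto3:
# import boto3
# boto3.client("iot").create_topic_rule(
#     ruleName="SensorDataToFirehose", topicRulePayload=rule_payload
# )
print(rule_payload["actions"][0]["firehose"]["deliveryStreamName"])
```

Once a rule like this is in place, Firehose batches the messages into S3 objects on its own; there is no code to scale or maintain.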

              Business logic at the Edge

              In the case of Greengrass edge solutions you still usually need Lambda functions to implement business logic. Greengrass contains functionality for topic-based MQTT routing, but to process the contents of MQTT messages, some code is needed. However, the implementation can be just a few lines of code to execute the required algorithm as a Lambda function. Developers don’t have to worry about building containers, opening network connections or configuring security settings. Greengrass takes care of all the details of deploying the Lambda function.
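              As a minimal sketch of such a function, the handler below checks an incoming sensor reading against a threshold. The message fields, the threshold and the alarm topic are hypothetical examples, not taken from any particular customer solution:

```python
import json

# Sketch of a minimal Greengrass-style Lambda handler. The payload shape
# ({"device": ..., "temperature": ...}) and the limit are hypothetical.
TEMP_LIMIT_C = 75.0

def handler(event, context):
    """Invoked once per MQTT message routed to this function."""
    reading = event if isinstance(event, dict) else json.loads(event)
    alarm = reading.get("temperature", 0.0) > TEMP_LIMIT_C
    result = {"device": reading.get("device"), "over_temperature": alarm}
    # On a real Greengrass core the result would be published back over
    # MQTT with the Greengrass Core SDK, along the lines of:
    #   import greengrasssdk
    #   greengrasssdk.client("iot-data").publish(
    #       topic="alarms/temperature", payload=json.dumps(result))
    return result

# Local smoke test with a sample message:
print(handler({"device": "pump-1", "temperature": 82.5}, None))
```

The business logic is the whole function; packaging, deployment and the MQTT plumbing are handled by Greengrass.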

              It’s worth noting though that larger customers usually prefer to build a customized management system on top of AWS IoT and Greengrass. There are lots of exposed details and moving parts when dealing with “raw” AWS IoT devices and Greengrass deployments. When a lightweight business-specific management layer is built on top of them, end-users can deal with familiar concepts and ignore most unnecessary details. Power users can still access the underlying technologies simply by using the AWS Console.

              Get in Touch.

              Let’s discuss how we can help with your cloud journey. Our experts are standing by to talk about your migration, modernisation, development and skills challenges.









                Cloud Computing News #5: AI, IoT and cloud in manufacturing

                CATEGORIES

                Blog

                This week we focus on how AI, IoT and cloud computing are transforming manufacturing.

                Cloud Computing Will Drive Manufacturing Growth

                Manufacturing.net lists 10 ways cloud computing will drive manufacturing growth during this year:

                1. Quality gains greater value company-wide when a cloud-based application is used to track, analyse and report quality status by center and product.
                2. Manufacturing cycle times are accelerated through the greater insights available with cloud-based manufacturing intelligence systems.
                3. Insights into overall equipment effectiveness (OEE) get stronger using cloud-based platforms to capture, track and analyse the health of the equipment.
                4. Automating compliance and reporting saves valuable time.
                5. Real-time tracking and traceability become easier to achieve with cloud based applications.
                6. APIs help scale manufacturing strategies faster than ever.
                7. Cloud-based systems enable higher supply chain performance. 
                8. Order cycle times and rework are reduced.
                9. Integrating teams’ functions increases new product introduction success. 
                10. Perfect order performance is tracked across multiple production centers for the first time.

                Read more in manufacturing.net

                Machine learning in manufacturing

                According to CIO Review, the challenge with machine learning in manufacturing is not always just the machines. Machine learning in IoT has focused on optimizing at the machine level, but to unlock the true potential of machine learning, it is now time for manufacturers to start looking at network-wide efficiency.

                By opening up the entire network’s worth of data to these network-based algorithms, we can unlock countless previously unattainable opportunities:

                1. With the move to network-based machine learning algorithms, engineers will have the ability to determine the optimal workflow based on the next stage of the manufacturing process.
                2. Machine-learning algorithms can reduce labor costs and improve the work-life balance of plant employees. 
                3. Manufacturers will be able to more effectively move to a multi-modal facility production model where the capacity of each plant is optimized to increase the efficiency of the entire network.
                4. By sharing data across the network, manufacturing plants can optimize capacity.
                5. In the future, the algorithms will be able to schedule production to purpose, optimizing cost and delivery while meeting demand.

                Read more in CIO Review

                Introducing IoT into manufacturing

                According to Global Manufacturing, IoT offers manufacturers many potential benefits in product innovation, but it also brings challenges, particularly around the increased dependency on software:

                1. Compliance: Manufacturers developing IoT-based products must demonstrate compliance due to critical safety and security demands. In order to do this, development organisations must be able to trace and even have an audit trail for all the changes involved in a product lifecycle.
                2. Diversity and number of contributors, who may be spread across different locations or time zones and working with different platforms or systems. Similarly, over-the-air updates exacerbate the need for control and for managing complex dependency issues at scale and over long periods of time.
                3. The need to balance speed to market, innovation and flexibility against the need for reliability, software quality and compliance, all in an environment that is more complex and involves many more components.

                Because of these challenges, an increasing number of manufacturing companies are revising how they approach development projects. More of them are moving away from traditional processes like Waterfall towards Agile, Continuous Delivery and DevOps, or hybrids of more than one. These new ways of working also help empower internal teams, while simultaneously providing the rigour and control that management requires.

                In addition to new methodology, this change requires the right supporting tools. Many existing tools may no longer be fit for purpose, though equally many have evolved to meet the specific requirements of IoT. Putting the right foundation of tools, methodologies and corporate thinking in place is essential to success.

                Read more in Global Manufacturing

                Data driven solutions and devops at Nordcloud

                Our data driven solutions and DevOps will make an impact on your business with better control and valuable business insight with IoT, modern data platforms and advanced analytics based on machine learning. How can we help you take your business to the next level? 










                  Cloud Computing News #4: IoT in the Cloud

                  CATEGORIES

                  Blog

                  This week we focus on IoT in the cloud.


                  AWS IoT platform is great for startups

                  IoT for all lists 7 reasons why startup companies like iRobot, GoPro, and Under Armour have chosen the AWS IoT platform:

                  1. Starting with AWS IoT is easy: The AWS IoT platform connects IoT devices to the cloud and allows them to securely interact with each other and with various IoT applications.
                  2. High IoT security: Amazon doesn’t spare resources to protect its customers’ data, devices, and communication.
                  3. AWS cherishes and cultivates startup culture: AWS has helped multiple IoT startups get off the ground, and startups are a valuable category of Amazon’s target audience.
                  4. Serverless approach and AWS Lambda are right for startups: startups can reduce the cost of building prototypes, add agility to the development process, and build a highly customizable, flexible and automated serverless back end.
                  5. AWS IoT Analytics paired with AI and machine learning: AWS IoT Analytics and Amazon Kinesis Analytics answer the high demand for data-analytics capabilities in IoT.
                  6. Amazon partners with a broad network of IoT device manufacturers, IoT device startups, and IoT software providers.
                  7. The range of AWS products and services: the top provider of cloud services has a range of solutions tailored for major customer categories, including startups.

                  Read more in IoT for all


                  IoT – 5 predictions for 2019 and their impact

                  Forbes makes five IoT predictions for 2019:

                  1. Growth across the board: IoT market and connectivity statistics show numbers mostly in the billions (check the article below)
                  2. Manufacturing and healthcare – deeper penetration: Market analysts predict the number of connected devices in the manufacturing industry will double between 2017 and 2020.
                  3. Increased security at all end points: Increase in end point security solutions to prevent data loss and give insights into network health and threat protection.
                  4. Smart areas or smart neighborhoods in cities: smart sensors around the neighborhood will record everything from walking routes, shared car use, sewage flow, and temperature choice 24/7.
                  5. Smart cars – increased market penetration for IoT: Diagnostic information, connected apps, voice search, current traffic information, and more to come.

                  Read more on these predictions in Forbes


                  IoT is growing at an exponential rate

                  According to Forbes, IoT is one of the most researched emerging markets globally. The magazine lists 10 charts on the explosive growth of IoT adoption and market.

                  Here are a few teasers; check out all the charts in Forbes.

                  1. According to Statista, by 2020, Discrete Manufacturing, Transportation & Logistics and Utilities industries are projected to spend $40B each on IoT platforms, systems, and services.
                  2. McKinsey predicts the IoT market will be worth $581B for ICT-based spend alone by 2020, growing at a Compound Annual Growth Rate (CAGR) between 7 and 15%.
                  3. Smart Cities (23%), Connected Industry (17%) and Connected Buildings (12%) are the top three IoT projects in progress (IoT Analytics).
                  4. GE found that 64% of power and energy (utilities) companies rely on Industrial Internet of Things (IIoT) applications to succeed with their digital transformation initiatives.
                  5. Industrial products lead all industries in IoT adoption at 45%, with an additional 22% planning to adopt within 12 months, according to Forrester.
                  6. Harley Davidson reduced its build-to-order cycle by a factor of 36 and grew overall profitability by 3% to 4% by shifting production to a fully IoT-enabled plant according to Deloitte.


                  Philips is tapping into the IoT market with AWS

                  According to NetworkWorld, IDC forecasts the IoT market will reach $1.29 trillion by 2020. Philips is turning toothbrushes and MRI machines into IoT devices to tap this market, to keep patients healthier and the machines running more smoothly.

                  “We’re transforming from mainly a device-focused business to a health technology company focused on the health continuum of care and service”, says Dale Wiggins, VP and General Manager of the Philips HealthSuite Digital Platform. “By connecting our devices and modalities in the hospital or consumer environment, it provides more data that can be used to benefit our customers.”

                  Philips relies on a combination of AWS services and tools, including the AWS IoT platform, Amazon CloudWatch and AWS CloudFormation. Philips uses predictive algorithms and data analysis tools to monitor activity, identify trends and report abnormal behavior.

                  Read more in NetworkWorld


                  Data driven solutions at Nordcloud

                  Our data driven solutions will make an impact on your business with better control and valuable business insight with IoT, modern data platforms and advanced analytics based on machine learning. How can we help you take your business to the next level? 











                    Cloud Computing News #2: Digital transformation in the cloud

                    CATEGORIES

                    Blog

                    This week we focus on digital transformation and IT transformation in the cloud.


                    Campbell’s Drives IT Transformation on Azure

                    Campbell Soup Co has partnered with Microsoft to modernize Campbell’s IT platform through the Azure cloud by streamlining workflows and driving efficiencies.

                    “Campbell’s migration to Azure will increase our flexibility, agility and resiliency,” said Francisco Fraga, Campbell Soup’s CIO. “Azure will give us the ability to respond quickly to evolving business needs, introduce new solutions, and support our 24/7, always-on architecture. The Microsoft cloud is a proven, reliable and highly secure platform.”

                    The Microsoft solution will provide additional benefits, including increased security, compliance and information protection. The move to Azure will allow Campbell to re-architect its data warehousing capabilities to be able to support the company’s data and analytics needs.

                    Read the full article here

                    Nordcloud is also a Microsoft Gold Cloud Partner and a Microsoft Azure Expert Managed Services Provider. Accelerate operations by moving IT to the public cloud with our solutions; you can find them here.


                    Walmart Picked Microsoft To Accelerate Digital Transformation in the cloud

                    According to Forbes, Walmart has signed a 5-year strategic partnership with Microsoft to accelerate digital transformation. This is an extension of an existing relationship between Walmart and Microsoft.

                    This new agreement will see the companies collaborating on machine learning, AI and data-platform solutions that span customer-facing projects as well as those aimed at optimizing internal operations.

                    The three focus areas of the partnership are:

                    1. Digital transformation: Walmart will use the full range of Microsoft cloud solutions, move hundreds of existing applications to cloud-native architectures, and migrate a significant portion of walmart.com and samsclub.com to Azure to grow and enhance the online customer experience.
                    2. Innovation: Walmart will build a global IoT platform on Azure.
                    3. Changing way of working at Walmart: Walmart is investing in its people with a phased rollout of Microsoft 365.

                    More on Walmart’s digital transformation in Forbes.

                    Read also our blog post on how to accelerate digital transformation with culture and cloud here.

                    Our data driven solutions that will make an impact on your business you can find here.


                    Gartner identifies 6 barriers to becoming a digital business

                    According to a recent survey by Gartner, companies embracing digital transformation are finding that digital business is not as simple as buying the latest technology but requires changes in systems and culture.

                    Gartner lists six barriers that CIOs must overcome to transform their business:

                    1. A Change-Resisting Culture. Digital innovation requires collaborative cross-functional and self-directed teams that are not afraid of uncertain outcomes.
                    2. Limited Sharing and Collaboration. Issues of ownership and control of processes, information and systems make people reluctant to share their knowledge. But it is not necessary to have everyone on board in the early stages.
                    3. The Business Isn’t Ready. When a CIO wants to kick-off a transformation, they find that the business doesn’t have the resources or skills needed.
                    4. The Talent Gap. Markus Blosch, research vice president at Gartner, says: “There are two approaches to breach the talent gap — upskill and bimodal.”
                    5. The Current Practices Don’t Support the Talent. Highly structured and slow traditional processes don’t work for digital.
                    6. Change Isn’t Easy. Gartner advocates adopting a platform-based strategy which supports continuous change.

                    Read more about the survey on Gartner Newsroom.

                    Read also our blog post on how to support cloud and digital transformation here.
