Why?

If you’re interested in experimenting in the IoT field, and you have been fascinated by Arduino and the Maker’s World, one natural target is Home Automation. At least, that’s how it worked for me.

But then I thought: we spend so much time in our home, and we don’t always know whether it is an ideal, healthy environment.

For example, these are some questions I wanted to answer:

  • Is the temperature too high? The humidity too low?
  • What is the Air Quality in the kitchen?
  • Is there any gas leak?

Yes, I could buy a dedicated widget, like the Netatmo Home Coach, but that way I would lose the ability to customize the solution, and it would be difficult to integrate it into a bigger one. That’s why, after some initial experiments, I finally decided to build my own.

The hardware (sensor nodes).

I already had a set of different Arduino-compatible boards and sensors available. I decided to reuse three of them:

  • An Arduino MKR1000
  • A Particle Photon
  • An (old) Intel Edison (for the kitchen, see below)

For temperature and humidity I have used DHT22 sensors.

The Intel Edison is a legacy from a hackathon I participated in some time ago (thanks to the Intel guys!). Together with the board I received a set of Grove sensors, and I have used the Air Quality and the Gas sensors.

There is a gateway at the core of my Home Automation. It is (obviously) based on a Raspberry Pi, equipped with Home Assistant and a Mosquitto MQTT broker.

A first layer of software.

The code running on the sensor nodes is written in C (Arduino compatible), using the available sensor libraries. The only exception is the code running on the Edison board. There I have decided to use Java (never forget your first love) and the Intel UPM libraries.

The communication between sensors’ nodes and the gateway is entirely built upon MQTT.

Every few seconds, the nodes send JSON messages with their readings to the MQTT broker on the RPI. MQTT is lightweight (which is good for memory-constrained devices), and with QoS 1 I can handle occasional disconnections from the Wi-Fi network.
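Just to give an idea of the shape of these messages, here is a minimal sketch of the publishing logic. I write it in Python with the Eclipse Paho client for brevity; the real node code is in C (or Java, on the Edison), and the topic name and payload fields here are only examples:

```python
# Minimal sketch (Python, paho-mqtt) of what a sensor node does.
# The real nodes run C/Java; topic and field names are examples.
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "raspberrypi.local"  # the Mosquitto broker on the RPI

client = mqtt.Client(client_id="bedroom-node")
client.connect(BROKER_HOST, 1883)
client.loop_start()  # background thread handles reconnects and ACKs

while True:
    reading = {"temp": 21.5, "humidity": 48.2}  # values read from the DHT22
    # QoS 1: the message is retried until the broker acknowledges it,
    # which covers the occasional Wi-Fi drop.
    client.publish("home/bedroom/climate", json.dumps(reading), qos=1)
    time.sleep(10)
```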

On the Intel Edison board, the MQTT client is implemented using the Eclipse Paho library.

Home Assistant has been configured to read from the MQTT topics.

Home Assistant.

I don’t want to spend too much time talking about Home Assistant; you can find enough information on their site.

It is OK for my current goals. It is (almost) easy to set up. It does integrate with MQTT (and this was a core design choice). It integrates with my Philips Hue smart lights. You can use Python for extensions. It has a nice iOS App.

Just to give you an idea, this is a picture of one of my Home Assistant screens:

Home Assistant Home page

But, as I told you before, I want to be able to integrate it into a bigger solution. Therefore, I have installed NodeRED on the gateway and developed a very simple flow that relays all the messages to a bigger Linux machine, again using MQTT.
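The real relay is nothing more than a NodeRED flow with two MQTT nodes wired together. Purely to make the idea concrete, this is the equivalent logic sketched in Python (host names are placeholders, not my actual ones):

```python
# Equivalent logic of the NodeRED relay flow, sketched with paho-mqtt.
# Host names are placeholders.
import paho.mqtt.client as mqtt

upstream = mqtt.Client()
upstream.connect("docker-machine.local", 1883)  # the bigger Linux machine
upstream.loop_start()

def on_message(client, userdata, msg):
    # Relay every sensor message unchanged to the remote broker.
    upstream.publish(msg.topic, msg.payload, qos=1)

local = mqtt.Client()
local.on_message = on_message
local.connect("localhost", 1883)  # Mosquitto on the RPI gateway
local.subscribe("home/#", qos=1)  # all the sensor topics
local.loop_forever()
```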

The Docker Machine.

I wanted to be able to feed my real-time stream of data to a set of MicroServices, implemented using Docker container technology, Docker Compose and a set of other Open Source technologies, to experiment with them and build some more interesting things.

Therefore, I took an old laptop (for everyday work and life I only use a Mac), installed Ubuntu 16.04 LTS and Docker on it, and set up a set of containers, orchestrated using Docker Compose.

If you want to have a rough idea of what I have added so far, here it is: 

Docker containers

MicroServices?

The first one I added is the mqtt-broker, which receives all the sensor messages from the RPI. This way I have all the data and can do whatever I want with it. The publish/subscribe paradigm makes it easy to add more consumers for the data, as needed.

Then, I have added bridgekafka, implemented in Python. It relays all the messages from the MQTT broker to the Kafka broker, which acts as the messaging backbone. MQTT is good for communication with low-power, constrained sensor nodes, but Kafka is much better as an Enterprise Messaging backbone.
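To give a rough idea of how such a bridge can look, here is a minimal Python sketch using paho-mqtt and kafka-python. The broker addresses and topic names are illustrative, and my actual bridgekafka differs in the details:

```python
# Sketch of an MQTT -> Kafka bridge, the idea behind bridgekafka.
# Addresses and topic names are illustrative.
import paho.mqtt.client as mqtt
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")

def on_message(client, userdata, msg):
    # Kafka topic names cannot contain '/', so remap the MQTT topic
    # (e.g. home/bedroom/climate -> home.bedroom.climate).
    producer.send(msg.topic.replace("/", "."), msg.payload)

mqtt_client = mqtt.Client()
mqtt_client.on_message = on_message
mqtt_client.connect("localhost", 1883)  # the mqtt-broker container
mqtt_client.subscribe("home/#", qos=1)
mqtt_client.loop_forever()
```

A real bridge would of course need some error handling and batching, but the core of it really is this small.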

The message-store, again implemented in Python, consumes messages from Kafka and stores them in a MySQL DB. Therefore, I can analyze them as I want. In the near future I will add a set of Jupyter Notebooks to analyze the data.
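Conceptually the consumer is very simple; this is a sketch of the idea, where the table schema and the topic name are just examples, not my actual ones:

```python
# Sketch of the message-store idea: consume readings from Kafka
# and persist them in MySQL. Schema and names are examples.
import json

import mysql.connector
from kafka import KafkaConsumer

db = mysql.connector.connect(user="iot", password="secret", database="home")
cursor = db.cursor()

consumer = KafkaConsumer(
    "home.bedroom.climate",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    reading = message.value
    cursor.execute(
        "INSERT INTO readings (topic, temp, humidity) VALUES (%s, %s, %s)",
        (message.topic, reading["temp"], reading["humidity"]),
    )
    db.commit()
```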

But if you want to be able to effectively visualize and analyze real-time telemetry data, you have to treat them as what they are: Time Series (TS). Therefore, I have added what is probably the most widely used TS DB: InfluxDB. With Docker it is easy to find pre-built images that you can customize with your own Dockerfile.
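Writing a reading into InfluxDB as a time-series point takes only a few lines with the (v1) Python client. A minimal sketch, assuming an already existing ‘home’ database (all the names here are examples):

```python
# Sketch: store one sensor reading as a time-series point in InfluxDB,
# using the influxdb (v1) Python client. Names are examples.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="home")

point = {
    "measurement": "climate",
    "tags": {"room": "bedroom"},  # tags are indexed, good for filtering
    "fields": {"temp": 21.5, "humidity": 48.2},
}
client.write_points([point])  # the timestamp defaults to "now"
```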

And, following a common pattern, I have added Grafana. With Grafana it is easy to consume data from InfluxDB and create customizable real-time dashboards. InfluxDB is, in fact, one of the pre-configured data sources.

With all these containers around, I wanted to be able to monitor the containers themselves, to see, for example, if one of them was draining my (old) laptop’s resources. Therefore I have added Prometheus, with cAdvisor and node-exporter, to capture container and system metrics.

As a result, I have all the data coming from the sensors collected in InfluxDB and visualized in a set of Grafana dashboards. I can look at the real-time behaviour and take a close look at historical data.

For example, this week it seems that the temperature has dropped in Rome, due to cold air coming from the east. Here is the picture of my room temperatures:

This week’s room temperatures

Well, as you can see, if we don’t open the windows very often, the effect is very small. And we probably keep the temperature inside too high. I should think about it.

Is everything OK, as I wanted?

Well, almost everything.

Kafka is scalable and can be made reliable, but it is still a little bit complicated. If you want to use NodeRED, you won’t find reliable, hassle-free modules for Kafka. At least, this is my experience.

But I love Python too, and I decided to write the Kafka consumers in Python (message-store, for example).

I wanted to try Apache NiFi, but I discovered that the flow consuming messages from Kafka was using 60% of my CPU. I switched to Python, and now the average CPU usage is less than 5% (thanks to Prometheus monitoring).

Conclusion.

It started as a game, but it soon also became an Architecture Design exercise for me. And I have tested what I have designed.

It doesn’t take too long to build what I have described. A few days.

It has been working without any interruption (apart from one, due to my Wi-Fi) for two months.

And I can integrate almost anything I want. Cool.

Next (big steps).

Well, what I would like to do is start experimenting with ML algorithms to better decide, for example, if the temperature is OK, how long I need to keep the window open, if I need to raise the humidity…

Then, I’ll very soon add a Google Home. It will officially go on sale in Italy this week, and I have ordered one.

Imagine: ‘Ok Google, could you tell me what the Air Quality is now?’

‘Ciao Luigi, maybe you should open the window.’

An update.

I have added a second article on the subject, where I talk about adopting the new MKR boards from Arduino.