Gravio Blog
November 10, 2021

[Tutorial] Easy IoT: Measuring CO2, Temperature, Humidity and Air Pressure in an office, and displaying it in real-time

A tutorial to learn how to show environmental data such as CO2 values, temperature and humidity on a small display in a room for everybody to see in real-time. This tutorial is based on Gravio, the IoT Edge Computing Platform.


This article is part of a series to show how to make an office environment smarter by deploying sensors, displays, and integrating into third party APIs. With Gravio you can easily make any space more interconnected and smart. 


tl;dr: we are using a Gravio CO2 sensor and an Aqara environment sensor to gather CO2 concentration, temperature, humidity and air pressure. We then display these values on a Gravio LED matrix display.

Note: The CO2 sensor and the LED matrix display devices are available with a Gravio Standard subscription or higher.



Prerequisites:

  • Gravio HubKit set up - the Gravio Edge server
  • Gravio Studio installed (available on macOS and Windows)
  • Gravio Standard account - sign up here (to be eligible for the CO2 sensor)
  • Gravio CO2 sensor
  • Aqara Temperature and Humidity sensor


Step 1:

First, you need to pair the three devices you are going to use: The CO2 sensor, the Aqara Environment Sensor and the LED matrix display. In order to achieve this, open your Gravio Studio and connect to the Gravio Hub. Once logged in, click on the “devices” button on the top right:

This will open the various connection options. Click on Zigbee to open the Zigbee pairing dialogue. Click the “Pairing” button and, while the counter is counting down, hold the devices close by and press each device’s pairing button several times, varying the press duration, until they all appear in the list of paired devices:

Note: Zigbee often requires pressing the pairing button multiple times, as the pairing timing needs to line up. Once all devices are paired, you can add them to Areas and Layers in the next step.

Step 2:

Go back to the main view and click on the green plus sign to add an Area.

Give your Area a reasonable name and add one layer per data type you are adding (CO2, Temperature, Humidity, Air Pressure):

Click the three dots on the top left to add new layers, and the + sign on the far right to assign the sensors to the respective layers.

Once you have all sensors connected, it should look something like this:

Step 3:

Once your devices are connected via Zigbee, let’s create the action that outputs the data to the LED matrix display. You can think of an action as a little program consisting of steps, where each step has an input and an output. The output of the Gravio LED Matrix step is, of course, the data shown on the device.
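The step-chain idea can be sketched in code. This is an illustrative model, not Gravio’s actual runtime: `read_sensor_db` and `led_matrix` are hypothetical stand-ins for the steps configured in the Action Editor, and the payload passed between them mirrors how each step’s output becomes the next step’s input.

```python
# Illustrative sketch (not Gravio's actual runtime): an action is a
# pipeline of steps, where each step receives the previous step's
# output as its input payload.

def read_sensor_db(payload):
    # Hypothetical stand-in for the "Read Sensor DB" step: look up
    # the latest stored value for the configured area/layer.
    payload["Data"] = 651  # e.g. a CO2 reading in ppm
    return payload

def led_matrix(payload):
    # Stand-in for the LED Matrix step: show the incoming data.
    print(f"Displaying: {payload['Data']}")
    return payload

def run_action(steps, payload=None):
    # Run the steps in order, threading the payload through.
    payload = payload or {}
    for step in steps:
        payload = step(payload)
    return payload

result = run_action([read_sensor_db, led_matrix])
```

The value 651 is a made-up example reading; the point is only that each step transforms and forwards a shared payload.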

Open the Actions window:

Click on the + sign at the top right to create a new action:

Double click the new action to open the Action Editor:

In the top bar you arrange the steps within your action. Click on the + sign and start with the LED Matrix Step:

To test that it’s connected properly, select ASCII, enter a character (for example, a T for Test) into the content field, and hit the play button. Your LED Matrix display should show that character.

Once that works, let’s read the latest values from the sensor database and show them on the Matrix display. To do this, add the Read Sensor DB step, drag it in front of the Matrix step, and enter the following values:

Select the area where your sensor is

Select the layer where your data is

For more information on the other fields, please refer to the Gravio documentation.

Then open the Gravio LED Matrix display component, set the template to the right data type (in this case CO2), and set the cp.Content field to the input (Payload) so it will be displayed on the matrix:

The cp.Content variable (which is what is sent to the Matrix display) is set to cv.Payload.Data, which is part of the output of the Read Sensor DB step.
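The binding can be pictured as a simple assignment. The `cp` and `cv` names follow the Gravio UI, but the dictionary mechanics below are purely illustrative, and 702 is a made-up sample reading:

```python
# Sketch of the variable mapping: the step's Content property (cp)
# is bound to the Data field of the incoming payload (cv).
cv = {"Payload": {"Data": 702}}          # output of the Read Sensor DB step
cp = {"Content": cv["Payload"]["Data"]}  # what the LED Matrix step displays
```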

If you now hit the “Play” button (in the top right), the LED Matrix should show the value.

If that works, repeat the above steps for each data type, putting a “Sleep” step in between. The display keeps showing the data while the action sleeps:

A sleep of 15,000 milliseconds equals 15 seconds.
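The resulting rotation through the four layers can be sketched as follows. This is a model of the step chain, not Gravio code: `read_latest` and `display` are hypothetical stand-ins for the Read Sensor DB and LED Matrix steps.

```python
import time

# Illustrative rotation mirroring the chain of
# Read Sensor DB -> LED Matrix -> Sleep steps per data type.
LAYERS = ["CO2", "Temperature", "Humidity", "AirPressure"]
SLEEP_MS = 15000  # 15,000 ms = 15 s on screen per reading

def rotate_once(read_latest, display, sleep=time.sleep):
    # Show each layer's latest value, then pause so it stays visible.
    for layer in LAYERS:
        display(layer, read_latest(layer))
        sleep(SLEEP_MS / 1000)  # convert milliseconds to seconds
```

With four layers at 15 seconds each, one full rotation takes a minute, which is why the trigger interval in the next step is set to one minute.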

Step 4:

The last step is creating the trigger. In essence, we need the “play button” to be pressed automatically to run the action. Triggers in Gravio work in two ways:

  • Activated by a sensor (including a virtual sensor such as an MQTT subscription or a video camera detection event)
  • Activated by time (e.g. in specific intervals or at specific points in time)

In our case we create a trigger called “Heartbeat” that fires every minute, running our action each time.
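The heartbeat behaviour amounts to a fixed-interval loop. A minimal sketch, assuming a callable action (this models the idea, not Gravio’s scheduler; `ticks` bounds the loop for illustration, whereas a real trigger runs until deactivated):

```python
import time

INTERVAL_S = 60  # fire once per minute

def heartbeat(action, ticks, sleep=time.sleep):
    # Run the action, then wait out the interval, 'ticks' times.
    for _ in range(ticks):
        action()
        sleep(INTERVAL_S)
```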

Click on the little clock face on the top right to create a time based trigger.

In this case, we called it “Heartbeat” and set it to trigger every minute.

Under the “Action” Tab, you can select which Action to trigger:

We select the action we just created. Click “Save”, then click the checkbox to activate it in the Triggers list.

Congratulations, that’s it! You should now see the readings of the 4 Layers on your Gravio LED Matrix for 15 seconds each.

If you have any questions or concerns, please don’t hesitate to join our Slack or write to us. We’re looking forward to seeing what you will come up with!


Setting up a Gravio LED matrix to display environmental data is easy, and can be achieved in a few minutes: pair the sensors, add them to layers, create an action that rotates through the data, and run it with a time-based trigger.
