
Sending data to the MQTT endpoint

This documentation page explains the basics of accessing the MQTT endpoint of FarmMaps, using Python code for the examples.
The same workflow can be followed in other languages, as long as an MQTT client and a Protobuf library are available for the language you are using.
FarmMaps provides an MQTT endpoint to allow your application to send signals to FarmMaps and to listen for events.
These signals are collected in a "message queue".

If any of the concepts mentioned here are unfamiliar, please see the FAQ at the end of this page.

Prerequisites

To follow along with the examples, you need:

  • Access to the FarmMaps MQTT broker (username and password); these can be requested here.
  • Python 3.7 or higher installed.
  • A unique identifier for your sensors, to use in the URN.

Generally, the URN is composed of the company or brand name and a device serial number, like so:

"urn:dev:<company-name>:<device-serial-number>"

To run the Python example code you also need to install the paho-mqtt and protobuf modules (see the FAQ at the end of this page for installation commands).

Workflow

The general workflow for pushing data to the FarmMaps MQTT endpoint consists of:

  • Preparing your data as a protobuf message
  • Connecting to the MQTT Broker
  • Publishing the message and waiting for confirmation
  • Closing the connection

To be able to publish data to the broker, you need a username and password, and possibly a dedicated topic for your data. An account can be requested through email from XXXXX. This user is only for the broker and cannot be used for the other APIs. Generally, one user is provided per application (so no individual users for different sensors within the same application).

Settings

To connect to the MQTT Broker, we'll use the following settings:

| Parameter | Default | Description |
| --- | --- | --- |
| CLIENT_ID | - | This ID is used to identify the connecting party (your software) to the broker; please use your company or brand name. |
| USER | - | Your username at the FarmMaps broker. |
| PWD | - | Your password at the FarmMaps broker. |
| HOST | farmmaps.awtest.nl | The address of the FarmMaps broker. |
| PORT | 1883 | The port number of the FarmMaps broker. |
| KEEPALIVE | 60 | Number of seconds to maintain the connection, even if no messages are published. |
| TOPIC | trekkerdata/sensors | The topic at the broker to publish the messages to. |

For preparing the message, we'll use the following settings:

| Parameter | Default | Description |
| --- | --- | --- |
| META_DATA_URL | - | HTTP address where a JSON metadata file for the messages is provided (by the data source). |
| BASE_DEV_URN | `urn:dev:<company-name>:<device-serial-number>` | To identify each unique device from different parties, FarmMaps uses a Uniform Resource Name (URN). FarmMaps uses this code to link devices and data to their owners and to manage access to them. |

All settings are required except META_DATA_URL, which is optional.

Connecting to the MQTT Broker

First, we will create an MQTT client to connect to the FarmMaps MQTT broker:

#import mqtt client
from farmmaps import mqtt 

#MQTT connection settings
CLIENT_ID = '<your client id>'
USER = "<your username>"
PWD = "<your password here>"
HOST = "farmmaps.awacc.nl"
PORT = 1883
KEEPALIVE = 60
TOPIC = "<company-name>/<topic-name>"

#set up MQTT client
mqtt_client = mqtt.create_client(CLIENT_ID, USER, PWD, HOST, PORT, KEEPALIVE)
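
If you are not using the farmmaps helper module, a roughly equivalent client can be set up directly with the paho-mqtt package. This is only a sketch of what such a helper might look like; the actual behaviour of mqtt.create_client may differ:

# sketch: create an MQTT client directly with paho-mqtt
import paho.mqtt.client as paho

def create_client(client_id, user, pwd, host, port, keepalive):
    # create the client and set the credentials
    client = paho.Client(client_id=client_id)
    client.username_pw_set(user, pwd)
    # connect to the broker and handle network traffic in a background thread
    client.connect(host, port, keepalive)
    client.loop_start()
    return client

mqtt_client = create_client(CLIENT_ID, USER, PWD, HOST, PORT, KEEPALIVE)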

Preparing your data as a protobuf message

FarmMaps uses protobuf to specify the structure of the messages published to MQTT.
More information on protobuf can be found in the Google Developer Documentation.
Python objects can easily be converted to protobuf messages and back (when receiving messages from the broker).

The message structure for the data can be found in farmmaps/farmmaps.proto, and is shown below.

Data structure explained
The main message is the DataPoint object, containing the time, location, machine ID and a reference to metadata. The sensors field holds a repeated list of SensorType objects. Each SensorType holds a key and a value, where the key identifies the sensor type and the value is the sensor reading. Finally, there is an optional metadata_url field that specifies where FarmMaps can retrieve metadata: data that describes how the values for each sensor type are to be interpreted.

syntax = "proto2";

package farmmaps;

message DataPoint {
  required string machine_id = 1;

  required int64 ts = 2;
  required float lat = 3;
  required float lon = 4;
  optional float altitude = 5;
  optional float heading = 6;
  optional float speed = 7;

  message SensorType {
    required string key = 1;
    required float value = 2;
  }

  repeated SensorType sensors = 8;

  optional string metadata_url = 9;
}

This .proto file is processed by protobuf and converted to a Python class. This class can be found in farmmaps/farmmaps_pb2.py and should not be edited directly.
Edit the .proto file and regenerate the module when you need to change the format. In our code we can then use the protobuf class like so:

#import protobuf class
from farmmaps import farmmaps_pb2 

# message settings
META_DATA_URL = 'http://68.183.9.30:8082/v3/meta/sensors'
BASE_DEV_URN = 'urn:dev:nl.trekkerdata:%s'

#create sample data for message 
device_id = "ib017"
timestamp = 1582731928 #epoch timestamp
longitude = 6.07206584
latitude = 52.959456183
altitude = 7.3
heading = 94.8534
speed = 0.026
sensordata = {"spn_898": 0, "spn_518": 0, "spn_513": 50, "spn_190": 1000}

# create empty protobuf message
msg = farmmaps_pb2.DataPoint()

# assign values to message properties
msg.machine_id = BASE_DEV_URN % (device_id)
msg.ts = timestamp
msg.lon = longitude
msg.lat = latitude
msg.altitude = altitude
msg.heading = heading
msg.speed = speed

for key, value in sensordata.items():
    measurement = msg.sensors.add()
    measurement.key = key
    measurement.value = value

# attach the optional metadata reference
msg.metadata_url = META_DATA_URL
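
With the message filled in, the remaining steps of the workflow are publishing it and closing the connection. The helper module may provide its own publish function; assuming mqtt_client is a paho-mqtt client (as in the sketch above), a minimal version could look like this:

# serialize the protobuf message to bytes
payload = msg.SerializeToString()

# publish to the agreed topic and wait for confirmation from the broker
result = mqtt_client.publish(TOPIC, payload, qos=1)
result.wait_for_publish()

# close the connection when done
mqtt_client.loop_stop()
mqtt_client.disconnect()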

If you want to modify the structure of the message, edit the .proto file and regenerate the Python module. For now, we'll stick with the pregenerated protobuf class.
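
Should you need to regenerate the module, the protoc compiler can do this. Assuming protoc is installed and you run it from the repository root, something like the following recreates farmmaps/farmmaps_pb2.py:

# regenerate the Python module from the .proto definition
protoc --python_out=. farmmaps/farmmaps.proto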

FAQ

How do I install the required Python modules?

This tutorial requires the paho-mqtt and protobuf modules. These can be installed with the following commands:

Windows command prompt:

py -m pip install paho-mqtt==1.5.0
py -m pip install protobuf==3.11.0

Linux terminal:

python3 -m pip install 'paho-mqtt==1.5.0'
python3 -m pip install 'protobuf==3.11.0'
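
To verify that both modules are installed, you can optionally check that they import without errors:

python3 -c "import paho.mqtt.client, google.protobuf"

(On Windows, use py instead of python3.)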

What is a message queue?

A message queue is commonly used to enable software programs to send messages to each other, making it easy for data to flow from one program into another. There are many variants of message queues; some popular names are Apache Kafka, MQTT and RabbitMQ.

In message queue systems there is usually one central "hub" called the broker or "message broker". This broker holds all the messages. Usually, the messages are organised in groups called "topics".

Now, there are two things that external services can do.

  • An external service (like a sensor) could publish a message to a certain topic. For example, a temperature sensor would publish the temperature at a specific time and location to the "temperatureMeasurements" topic.
  • An external service can "subscribe" to this topic by connecting to the broker. This service will then receive every temperature measurement (a minimal subscriber sketch follows this list).
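
As an illustration of the second point, a minimal subscriber written with paho-mqtt could look like the sketch below (the broker address and topic are just examples):

# sketch: subscribe to a topic and print incoming messages
import paho.mqtt.client as paho

def on_message(client, userdata, message):
    # called for every message published to a topic we are subscribed to
    print(message.topic, message.payload)

client = paho.Client(client_id="example-subscriber")
client.on_message = on_message
client.connect("broker.example.com", 1883, 60)
client.subscribe("temperatureMeasurements")
client.loop_forever()  # process network traffic and dispatch callbacks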

When the subscribed service temporarily disconnects from the broker, it will not receive any messages, but the messages will remain stored at the broker. Depending on the configuration, messages are kept for a longer or shorter time, or deleted once they have reached the subscribers, but the intent is always to ensure that messages get from the publisher to the subscriber.

Setting up communication between applications like this is a lot more flexible than connecting systems directly, and it provides a central point for managing all communication.

What is protobuf?

To quote the Google Documentation:

Protocol buffers are Google's language-neutral, platform-neutral, extensible mechanism for serializing structured data – think XML, but smaller, faster, and simpler. You define how you want your data to be structured once, then you can use special generated source code to easily write and read your structured data to and from a variety of data streams and using a variety of languages.

More information can be found in the protobuf documentation.