Decoding Payloads

Introduction

Many of you already know payload decoding from LoRaWAN devices, where it is used to convert the raw byte sequence sent by the sensor into usable values.

We have decided to use this concept for every IoT device, and therefore also for Particle.io devices.

The advantages are obvious:

Run Code in the Cloud

With a payload decoder, you have a piece of code in the cloud that is called every time a new message arrives from the device or from the platform (such as Spark status messages).

Advanced Processing

When a message is received, short operations can be performed, including:

  • Simple calculations

  • Conversions of value range or unit

  • Comparison with historical data from the database of your devices
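As a sketch of the first two points, a decoder could scale a raw sensor reading into a usable unit. Note that the event name "raw_temperature" and the conversion constants below are assumptions for illustration, not values from any real sensor datasheet:

```javascript
// Sketch only: a hypothetical device publishes a raw ADC reading
// (0-4095) under the event name "raw_temperature".
function Decoder(payload, event) {
    if (event == "raw_temperature") {
        var raw = parseInt(payload, 10);   // simple calculation
        var celsius = raw * 0.1 - 40;      // conversion of value range/unit (assumed scaling)
        return [
            {
                field: "TEMPERATURE",
                value: celsius
            }
        ];
    }
}
```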

Your first Payload Decoder

A simple payload decoder for Particle.io Devices looks like this on Datacake:

function Decoder(payload, event) {
    var temperature = payload;
    return [
        {
            field: "TEMPERATURE",
            value: temperature
        }
    ];
}

Routing of Data

Payload decoders respond to Particle.publish() calls in the device firmware. The following sections show how a publish of a simple value (temperature) is routed.

Execution

Every time your device calls Particle.publish() and the event is forwarded to Datacake via webhook, the decoder is called accordingly.

In the scope of the decoder you have access to the following elements:

  • Payload/message data of the Particle.Publish

  • Event name of the publish

  • The last measured values from the device's database on Datacake
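Put together, a short sketch using all three of these elements could look as follows. The event name "temperature" matches the examples below, while the extra field "TEMPERATURE_DELTA" is purely hypothetical:

```javascript
// measurements is a global object provided in the decoder scope on Datacake
function Decoder(payload, event) {
    if (event == "temperature") {                // (2) event name of the publish
        var temperature = parseFloat(payload);   // (1) payload/message data
        // (3) last stored value from the device database,
        //     guarded because the field may be empty on the very first message
        var previous = measurements.TEMPERATURE ? measurements.TEMPERATURE.value : temperature;
        return [
            {
                field: "TEMPERATURE",
                value: temperature
            },
            {
                // hypothetical extra field: change since the last message
                field: "TEMPERATURE_DELTA",
                value: temperature - previous
            }
        ];
    }
}
```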

Example

Code on Device

Let's assume the code on the device looks like the following:

Particle.publish("temperature", String(23.02));

Datacake Decoder

The decoder for this message type would look like the following on Datacake:

function Decoder(payload, event) {
    if (event == "temperature") {
        var temperature = parseFloat(payload);
        return [
            {
                field: "TEMPERATURE",
                value: temperature
            }
        ];
    }
}

Concept

As you can see from the above example, the decoder is used to parse the data format of the Particle.publish() and split it into the respective fields.

The advantage here is clearly that you have a completely free hand in the design of the data format on the device.

You do not need to commit to a fixed data structure; instead, you can use the decoder to react flexibly to different formats.
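For example, nothing stops the firmware from publishing a compact comma-separated string instead of JSON. A hypothetical decoder for such a format (the event name "csv/data" and the payload layout are assumptions) could look like this:

```javascript
// Hypothetical: the device publishes "23.2,560" meaning "temperature,co2"
function Decoder(payload, event) {
    if (event == "csv/data") {
        var parts = payload.split(",");
        return [
            {
                field: "TEMPERATURE",
                value: parseFloat(parts[0])
            },
            {
                field: "CO2",
                value: parseFloat(parts[1])
            }
        ];
    }
}
```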

Database Mapping

Now you probably wonder how the database of the device knows how to store the temperature. The mapping is determined by the return value of the decoder, more precisely the following:

Returning Data

Payload Decoders must always return Data in the following format:

return [
    {
        field: "TEMPERATURE",
        value: temperature
    }
];

Create Database Field

A corresponding field must be created on the device whose identifier is identical to the field property of the returned dictionary. So in the above example with temperature, we create a field with an identifier of the same name. To do this, switch to the configuration of your device and scroll down to the fields section.

There you click on the button "Add Field" to create a new field accordingly.

To learn more about fields, their types and functions, please click on the following link:

Decoding JSON Data

Since Particle does not prescribe the format in which you transmit your data, you can also send a JSON string directly from the device. A corresponding decoder would then look like this:

Code on Device

Let's assume your device firmware is publishing data directly as a JSON-String:

Particle.publish("device/data", "{\"temperature\": 23.2, \"co2\": 560, \"state\": 1}");

Datacake Decoder

A decoder on Datacake would look like the following snippet:

function Decoder(payload, event) {
    if (event == "device/data") {
        var data = JSON.parse(payload);
        return [
            {
                field: "TEMPERATURE",
                value: data.temperature
            },
            {
                field: "CO2",
                value: data.co2
            },
            {
                field: "STATE",
                value: data.state ? true : false
            }
        ];
    }
}

Routing Spark Events

A payload decoder can also respond to the corresponding Spark diagnostics and system publishes. All that is required is a matching webhook.

function Decoder(payload, event) {
    // Routing for Spark-Events
    if (event == "spark/device/diagnostics/update") {
        payload = JSON.parse(payload);
        var technology = payload.device.network.cellular.radio_access_technology;
        var operator = payload.device.network.cellular.operator;
        return [
            {
                field: "CELLULAR_TECHNOLOGY",
                value: technology
            },
            {
                field: "CELLULAR_OPERATOR",
                value: operator
            }
        ];
    }
    if (event == "temperature") {
        var temperature = parseFloat(payload);
        return [
            {
                field: "TEMPERATURE",
                value: temperature
            }
        ];
    }
}

Read Database

The decoders allow you to read the current measured values from the database and use them for decoding. This can be useful, for example, to:

  • Calculate average, minimum or maximum values

  • Set geofencing and do distance measuring

  • Count messages or state changes

It can also be used to pre-sort data and discard irrelevant changes.

Basic example

Measured values can be accessed directly in the decoder via the global measurements object. This looks something like the following:

function Decoder(payload, event) {
    if (event == "temperature") {
        var temperature = parseFloat(payload);
        // access the previous temperature value from the database
        // (guarded, because the field may still be empty on the first message)
        var oldTemperature = measurements.TEMPERATURE ? measurements.TEMPERATURE.value : null;
        if (temperature != oldTemperature) {
            return [
                {
                    field: "TEMPERATURE",
                    value: temperature
                }
            ];
        }
    }
}
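The same mechanism covers the counting use case mentioned above. A sketch that counts state changes, where the event name "state" and the field "STATE_CHANGES" are illustrative assumptions, might look like this:

```javascript
function Decoder(payload, event) {
    if (event == "state") {
        var state = parseInt(payload, 10);
        // read previous values, falling back to defaults on the first message
        var lastState = measurements.STATE ? measurements.STATE.value : state;
        var changes = measurements.STATE_CHANGES ? measurements.STATE_CHANGES.value : 0;
        if (state != lastState) {
            changes = changes + 1;   // count one state change
        }
        return [
            {
                field: "STATE",
                value: state
            },
            {
                field: "STATE_CHANGES",
                value: changes
            }
        ];
    }
}
```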

Returning the Data

You may have noticed that payload decoders return their data in a specific format. It looks like this:

return [
    {
        field: "TEMPERATURE",
        value: temperature
    }
];

The reason for this is that this exact form is what our API, and the database behind it, expects.

Record Historical Data

For exactly this reason, there are additional options, such as passing a timestamp, which allows you to import historical data or sequences of measurements. This looks something like the following:

Device Code

Let us now assume that the device measures the temperature every fifteen minutes and transmits the readings as an array every full hour.

Particle.publish("temps", "[23.2, 24.5, 24.9, 25.04]");

Datacake Decoder

The decoder for such a transmission pattern would then look as follows.

function Decoder(payload, event) {
    if (event == "temps") {
        payload = JSON.parse(payload);
        var minute = 60 * 1000;                // one minute in milliseconds
        var time_now = new Date().getTime();   // current timestamp in milliseconds
        var temperature_t0 = parseFloat(payload[0]);
        var temperature_t1 = parseFloat(payload[1]);
        var temperature_t2 = parseFloat(payload[2]);
        var temperature_t3 = parseFloat(payload[3]);
        return [
            {
                field: "TEMPERATURE",
                value: temperature_t0,
                timestamp: time_now
            },
            {
                field: "TEMPERATURE",
                value: temperature_t1,
                timestamp: time_now - 15 * minute
            },
            {
                field: "TEMPERATURE",
                value: temperature_t2,
                timestamp: time_now - 30 * minute
            },
            {
                field: "TEMPERATURE",
                value: temperature_t3,
                timestamp: time_now - 45 * minute
            }
        ];
    }
}

Uppercase

Please make sure that when returning values from the decoder, you always write the database field identifier in capital letters. Otherwise the mapping may fail.

Default Decoder

If you create a Particle.io device without a template on the Datacake platform, then this device is already delivered with a basic decoder. This decoder looks like this:

/*
   Default Decoder for Particle.io Devices
   Version: 0.1.1
   (c) Datacake GmbH
*/
function Decoder(payload, event) {
    var decoded = {};

    // User Functions
    /*
    if (event == "temperature") {
        // payload = JSON.parse(payload) // do this if you send JSON
        decoded.temperature = parseFloat(payload);
    }
    */

    // Particle Diagnostic and Online Status Events
    if (event == "spark/device/diagnostics/update") {
        payload = JSON.parse(payload);
        decoded.cellular_radio_access_technology = payload.device.network.cellular.radio_access_technology;
        decoded.cellular_operator = payload.device.network.cellular.operator;
        decoded.cellular_signal_strength = payload.device.network.signal.strength;
        decoded.cellular_signal_quality = payload.device.network.signal.quality;
    } else if (event == "spark/status") {
        if (payload == "online") {
            decoded.online_status = true;
        } else {
            decoded.online_status = false;
        }
    }

    // Array where we store the fields that are being sent to Datacake
    var datacakeFields = [];

    // take each field from decoded and convert them to Datacake format
    for (var key in decoded) {
        if (decoded.hasOwnProperty(key)) {
            datacakeFields.push({field: key.toUpperCase(), value: decoded[key]});
        }
    }

    // forward data to Datacake
    return datacakeFields;
}

In the above decoder you can see a so-called converter at the end of the code. It saves you some typing work when writing the decoder. How this works exactly is explained in the next section.

Converter

As mentioned earlier, the Datacake platform expects a decoder to return data in the following format:

return [
    {
        field: "TEMPERATURE",
        value: temperature
    }
];

However, since this is a lot of typing, there is a way to save some of that effort.

For this, we create a simple dictionary and store the data as key-value pairs.

function Decoder(payload, event) {
    var decoded = {};
    if (event == "temperature") {
        decoded.temperature = parseFloat(payload);
    }
}

A small loop at the end of the decoder then automatically converts this dictionary into the Datacake format.

function Decoder(payload, event) {
    var decoded = {};
    if (event == "temperature") {
        decoded.temperature = parseFloat(payload);
    }

    // Array where we store the fields that are being sent to Datacake
    var datacakeFields = [];

    // take each field from decoded and convert them to Datacake format
    for (var key in decoded) {
        if (decoded.hasOwnProperty(key)) {
            datacakeFields.push({field: key.toUpperCase(), value: decoded[key]});
        }
    }

    // forward data to Datacake
    return datacakeFields;
}

Auto Uppercase

Another advantage is that you don't have to worry about capitalization: during the conversion, all field names are converted to upper case (as required).

Of course, this approach mainly pays off when you handle many fields in the decoder, e.g. when you receive a JSON payload or many aggregated measurements.