Decoding Payloads
Introduction
Many of you already know payload decoding from LoRaWAN devices, where it is used to convert the data, usually sent by the sensor as a simple byte sequence, into usable values.
We have decided that we will use this concept for every IoT device, and therefore also for the Particle.io devices.
The advantages are obvious:
Run Code in the Cloud
Thanks to payload decoders, you have a piece of code that is called every time a new message arrives from the device or from the platform (Spark status messages).
Advanced Processing
When a message is received, short operations can be performed, including:
Simple calculations
Conversions of value range or unit
Comparison with historical data from the database of your devices
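A conversion of this kind is typically a one-liner inside a decoder. The helper below is a minimal sketch; the function name is illustrative, not part of the Datacake API:

```python
# Illustrative unit conversion as it might appear inside a payload decoder.
# The function name is hypothetical.
def celsius_to_fahrenheit(celsius):
    """Convert a Celsius reading to Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0
```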
Your first Payload Decoder
A simple payload decoder for Particle.io Devices looks like this on Datacake:
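Datacake payload decoders are written in Python. The exact entry-point signature may differ from the sketch below; here we assume the decoder receives the event name and the raw payload string and returns a list of field/value dictionaries:

```python
# Hypothetical minimal decoder: the entry-point name and parameters are
# assumptions; the field/value return format is what Datacake maps into
# the device database.
def decode(event_name, payload):
    # Interpret the published payload as a plain numeric string.
    temperature = float(payload)
    return [
        {"field": "TEMPERATURE", "value": temperature},
    ]
```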
Routing of Data
Payload decoders respond to calls of the Particle.publish() function in the device firmware. The following code snippet shows the routing for a publish of a simple value (temperature).
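A minimal routing sketch, assuming the firmware publishes under the event name "temperature" (the device-side C++ call is shown as a comment, since the decoder side is Python; the decoder's entry-point signature is also an assumption):

```python
# On the device, the firmware might publish (C++ firmware, as a comment):
#   Particle.publish("temperature", String(temp, 1), PRIVATE);
#
# The decoder routes on the event name of that publish.
def decode(event_name, payload):
    if event_name == "temperature":
        return [{"field": "TEMPERATURE", "value": float(payload)}]
    # Unknown events produce no database writes.
    return []
```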
Execution
Every time your device calls Particle.publish() and the event is forwarded to Datacake via webhook, the decoder is called accordingly.
In the scope of the decoder you have access to the following elements:
Payload/message data of the Particle.publish()
Event name of the publish
Last measured values via database of the device on Datacake
Example
Code on Device
Let's assume the firmware on the device reads a temperature and a humidity value and publishes both as a single comma-separated string via Particle.publish().
Datacake Decoder
The decoder for this message type would look like the following on Datacake:
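A matching decoder sketch, assuming the device publishes temperature and humidity as a comma-separated string under a hypothetical event name "sensors" (the entry-point signature is also an assumption):

```python
# Assumed device-side publish (C++ firmware, as a comment):
#   Particle.publish("sensors", String::format("%.1f,%.1f", temp, hum), PRIVATE);
#
# The decoder splits the comma-separated string into two fields.
def decode(event_name, payload):
    temperature, humidity = payload.split(",")
    return [
        {"field": "TEMPERATURE", "value": float(temperature)},
        {"field": "HUMIDITY", "value": float(humidity)},
    ]
```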
Concept
As you can see from the above example, the decoder is used to parse the data format of the Particle.publish() and split it into the respective fields.
The advantage here is clearly that you have a completely free hand in the design of the data format on the device.
You do not need to worry about structuring the data, but can use the decoders to react flexibly to different formats.
Database Mapping
Now you are probably wondering how the device's database knows how to store the temperature. The mapping is determined by the return value of the decoder, more precisely the following:
Returning Data
Payload Decoders must always return Data in the following format:
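That format is a list of dictionaries, one per database field, each with a field identifier and a value (sketched below; the identifiers are written in capital letters):

```python
# Example return value of a decoder: one dictionary per database field.
decoded = [
    {"field": "TEMPERATURE", "value": 23.5},
    {"field": "HUMIDITY", "value": 61.2},
]
```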
Create Database Field
A corresponding field must be created on the device whose identifier is identical to the field property of the return dictionary. So, in the above example with temperature, we create a field with an identifier of the same name. To do this, switch to the configuration of your device and scroll down to the fields section.
There you click on the button "Add Field" to create a new field accordingly.
To learn more about fields, their types, and functions, please refer to the corresponding section of the documentation.
Decoding JSON Data
Since Particle does not prescribe the format in which you transmit your data, you can also send a JSON string directly from the device. A corresponding decoder would then look like this:
Code on Device
Let's assume your device firmware publishes the data directly as a JSON string, for example {"temperature": 23.5, "humidity": 61.2}.
Datacake Decoder
A decoder on Datacake would look like the following snippet:
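A sketch of such a decoder, assuming the JSON payload shown above is published under a hypothetical event name "env" (the entry-point signature is also an assumption):

```python
import json

# Assumed device-side publish of a JSON string (C++ firmware, as a comment):
#   Particle.publish("env", "{\"temperature\":23.5,\"humidity\":61.2}", PRIVATE);
#
# The decoder parses the JSON and maps the keys to database fields.
def decode(event_name, payload):
    data = json.loads(payload)
    return [
        {"field": "TEMPERATURE", "value": data["temperature"]},
        {"field": "HUMIDITY", "value": data["humidity"]},
    ]
```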
Routing Spark Events
A payload decoder can also respond to Spark diagnostics and system publishes. All that is required is a corresponding webhook.
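As a sketch, the decoder can route on the system event name. "spark/status" with the data "online"/"offline" is a Particle system publish; the decoder signature and the ONLINE field used here are assumptions:

```python
# Route a Particle system event into a hypothetical ONLINE boolean field.
def decode(event_name, payload):
    if event_name == "spark/status":
        return [{"field": "ONLINE", "value": payload == "online"}]
    return []
```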
Read Database
The decoders allow you to read the current measured values from the database and use them for decoding. This can be useful to e.g.:
Calculate average, minimum, or maximum values
Set geofencing and do distance measuring
Count messages or state changes
It can also be used to pre-sort data and discard irrelevant changes.
Basic example
Measured values can be accessed directly in the decoder via the global measurements object. This then looks something like the following:
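A sketch of this pattern follows. The exact shape of the measurements object is an assumption: here it is modeled as a dictionary mapping field identifiers to their last stored value, and the stub line only makes the sketch self-contained (in a real decoder the object is provided by the Datacake runtime):

```python
measurements = {"TEMPERATURE": 21.0}  # stub of the runtime-provided object

# Compare the incoming reading with the last stored value and return a
# derived delta field alongside it. Entry-point signature is an assumption.
def decode(event_name, payload):
    new_temperature = float(payload)
    last_temperature = measurements.get("TEMPERATURE", new_temperature)
    return [
        {"field": "TEMPERATURE", "value": new_temperature},
        # Derived field: change since the last stored value.
        {"field": "TEMPERATURE_DELTA", "value": new_temperature - last_temperature},
    ]
```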
Returning the Data
You may have noticed that the payload decoders return the data in a particular format. It looks like this:
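A minimal sketch of that return format (entry-point signature assumed, as above):

```python
# A decoder always hands back a list of field/value dictionaries with
# upper-case field identifiers.
def decode(event_name, payload):
    return [
        {"field": "TEMPERATURE", "value": float(payload)},
    ]
```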
The reason for this is that this form addresses exactly our API and the database behind it.
Record Historical Data
For exactly this reason there are additional features, such as passing a timestamp, which make it possible to import historical data or sequences of measurements. This then looks something like this:
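A sketch of such a return value follows. The timestamp key name and the accepted time format (unix seconds here) are assumptions, not confirmed API details:

```python
# Assumed format for historical imports: each dictionary carries an extra
# timestamp, so values can be written into the past.
decoded = [
    {"field": "TEMPERATURE", "value": 21.5, "timestamp": 1700000000},
    {"field": "TEMPERATURE", "value": 21.9, "timestamp": 1700000900},
]
```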
Device Code
Let us now assume that the device measures the temperature every fifteen minutes and transmits the collected readings as an array every full hour.
Datacake Decoder
The decoder for such a transmission pattern would then look as follows.
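A sketch under these assumptions: the firmware publishes the four readings of the past hour as a semicolon-separated string, and the decoder spreads them over 15-minute steps backwards from the current time. The event name, entry-point signature, and timestamp key are all assumptions:

```python
import time

# Assumed device-side publish (C++ firmware, as a comment):
#   Particle.publish("temperatures", "21.5;21.7;21.9;22.0", PRIVATE);
#
# The "now" parameter only exists to make the sketch testable.
def decode(event_name, payload, now=None):
    if now is None:
        now = int(time.time())
    values = [float(v) for v in payload.split(";")]
    step = 15 * 60  # seconds between readings
    records = []
    for i, value in enumerate(values):
        # Oldest reading first: the last value is the most recent one.
        ts = now - (len(values) - 1 - i) * step
        records.append({"field": "TEMPERATURE", "value": value, "timestamp": ts})
    return records
```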
Uppercase
Please make sure that the field identifiers you return from the decoder are always written in capital letters. Otherwise the values may not be assigned to the database fields.
Default Decoder
If you create a Particle.io device without a template on the Datacake platform, then this device is already delivered with a basic decoder. This decoder looks like this:
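A sketch of such a basic decoder follows. The exact default shipped by Datacake and the entry-point signature are assumptions; what matters is the converter at the end, which turns a plain key-value dictionary into the list-of-dictionaries format and upper-cases all field names along the way:

```python
# Hypothetical basic decoder with a converter at the end.
def decode(event_name, payload):
    data = {}
    if event_name == "temperature":
        data["temperature"] = float(payload)

    # Converter: dictionary -> Datacake return format, keys upper-cased.
    return [{"field": key.upper(), "value": value} for key, value in data.items()]
```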
In the above decoder you can see a so-called converter at the end of the code. It saves you some typing work when writing the decoder. How this works exactly is explained in the next section.
Converter
As mentioned earlier, the Datacake platform expects a decoder to return data as a list of field/value dictionaries. Since writing these out by hand is a lot of typing, there is a way to save some of this effort.
For this we create a simple dictionary and store the data as a simple key-value pair.
A small function at the end of the decoder converts this dictionary automatically into the datacake format.
Auto Uppercase
Another advantage is that you don't have to worry about capitalization. During the conversion all field names are converted to upper case letters (as required).
Of course, this only pays off if the decoder handles a larger number of values, e.g. if you receive a JSON payload or many aggregated measurements.