Throughout our experience as a major Internet of Things player in Belgium, we have worked on projects where every single element of the IoT value chain sat at the cutting edge of the ICT evolution: modern low-power sensors, low-power networks such as LoRa and Narrowband IoT, cloud-based platforms from Microsoft, SAP or other big players, advanced automation use cases leveraging machine learning, … However, we also support projects where we need to help bridge legacy systems, which may reside at either extreme of the value chain:
1. Industrial devices using “historical” data protocols such as RS-232, RS-485, Modbus, etc. require a connection to modern data & application platforms leveraging LoRaWAN and Narrowband IoT transport. Proximus works with gateway and bridge providers whose devices help interconnect these worlds;
2. Newer IoT devices and sensors using modern IoT networks need to send their data to legacy applications with limited data ingestion capabilities (CSV files, FTP file transfer, email-based data ingestion, …);
3. A combination of 1 and 2 above, i.e. leveraging LoRaWAN or NB-IoT networks to receive data from legacy industrial sensors and machinery while letting legacy applications ingest this data.
CloudEngine – Swiss army knife for integrations
To support scenarios 2 and 3 above, we want the Proximus CloudEngine (CloE) to be our Swiss army knife. Until last year, CloE could:
- ingest data from Proximus LoRaWAN devices or any IP device via MQTT, HTTP or even SMS inbound traffic;
- process the incoming data using visual flows or an advanced scripting language – EnCoScript – that will feel immediately familiar to any JavaScript developer;
- send processed data to external systems using MQTT, HTTP or pre-defined integrations to SAP IoT Cloud and Microsoft Azure Event Hub.
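To give a feel for what such a script looks like, here is a minimal sketch of an EnCoScript processing flow. The run() entry point and the create() module pattern are taken from the examples further below; the “Http” module and its “post” call are purely hypothetical placeholders for one of the outbound integrations.

# Minimal illustrative sketch - the "Http" module and its "post" call are hypothetical placeholders
object http = create("Http");

function run(object data, object tags, string asset) {
    # "data" carries the incoming device payload delivered to the script;
    # forward it as-is to an external system over HTTP
    http.post("https://example.com/iot-ingest", data);
}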
Starting this month, we will add a few features to CloE to make it easier for customers, ISPs and ASPs to integrate with older legacy systems. The first feature, added to production this week, is the ability to generate CSV files out of your IoT data and mail them over. In the next sprints we will be adding scheduling triggers for scripts and outgoing communication via sFTP, and other features are still in the design phase. More on these when they become available; let’s first look at the CSV capability.
Step 1 – Accumulating data
We have introduced a new “IoTStore” module in CloE that you can use while in script mode. As shown in the example below, you just need to create the IoTStore module in your script, add your incoming IoT data to a row and add this row to the store. We have also introduced a new operator on Array objects (“.asCSV”) so you can specify how your CSV will be delimited (i.e. comma-separated, semicolon-separated, etc.).
Data stored in the IoTStore is associated with a tag or label (“tph” in the example here, in the last line of the script below). This capability lets you index your data any way you want – by type of data, origin of the data, type of device, … anything relevant for you that may help in extracting only specific data from the IoTStore when creating your CSV file.
# Import the IoTStore module
object iotstore = create("IoTStore");

function run(object data, object tags, string asset) {
    double temp = 7.5;
    int pressure = 1019;
    int humidity = 86;

    # Create a row with all needed data
    object datarow = [ temp, pressure, humidity ];

    # Format this row as a new line with the adequate value delimiter - comma by default
    string row = datarow.asCSV(null);

    # Add this new line to the IoTStore under the "tph" label
    iotstore.add("tph", row);
}
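Note that the temperature, pressure and humidity values are hard-coded here to keep the example short; in a real flow you would typically extract them from the incoming data object passed to run().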
Step 2 – Creating the CSV attachment and sending it by mail
Creating the CSV from your accumulated IoT data starts by specifying a point in time before which all accumulated data in the IoTStore will be included in the file. In the script example below, we:
- create a date object to record the current date & time;
- define the column headers for the CSV file;
- generate the CSV attachment. If there is data in the file, we send it by mail and then clear the IoTStore of all data up to the current date and time.
object datetime = create("DateTime", "'iot'_YYYYMMDD_HHmmss'.csv'");
object iotstore = create("IoTStore");
object mail = create("Mail");
object text = create("Text");

function run(object data, object tags, string asset) {
    # Save the current date and time
    object now = datetime.now();

    # Define the column headers for the csv file
    object headerrow = ["temperature", "pressure", "humidity"];
    string header = headerrow.asCSV(null);

    # Get all the data labelled "tph" from the IoTStore
    # and create the "csv" file with the header defined above
    string csv = iotstore.getBefore("tph", null, header, now);

    # Create the mail
    mail.addTo("example@mail.com");
    mail.setSubject("IoT store output");

    if (text.length(csv) > (text.length(header) + 4)) {
        mail.setBody("IoT data attached for " + now);
        # Add the csv "file" as attachment to the mail object, then send the mail
        mail.addAttachment(csv, "text/csv", datetime.format(now, "CET"));
        mail.send();
        # Mail sent, clean the IoTStore
        iotstore.deleteBefore("tph", now);
    } else {
        # No data in the IoTStore
        mail.setBody("IoT data empty at " + now);
        mail.send();
    }
}
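The length comparison against the header is a simple way to check whether getBefore() returned any data rows beyond the header line (the extra characters presumably accounting for delimiter and line-break characters); only in that case is the attachment sent and the store cleaned up to the current timestamp.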
Step 3 – Deciding when to send the data
The IoTStore object can be queried for the current number of records stored under a specific label with the “count” method. This makes it easy to trigger an action – such as generating and sending a CSV file – once a defined number of records has been reached.
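Pending the scheduling triggers mentioned earlier, such a count-based trigger can live directly in the script. The sketch below assumes that count(label) simply returns the number of records stored for that label, and uses an arbitrary threshold of 100 rows; the CSV generation and mailing itself would be identical to Step 2.

object iotstore = create("IoTStore");

function run(object data, object tags, string asset) {
    # Store the incoming row first (formatting as in Step 1)
    object datarow = [ 7.5, 1019, 86 ];
    string row = datarow.asCSV(null);
    iotstore.add("tph", row);

    # Assumed signature: count(label) returns the number of records stored for that label
    int stored = iotstore.count("tph");

    # Arbitrary threshold for illustration - build and mail the csv once 100 rows have accumulated
    if (stored >= 100) {
        # ... build the csv with getBefore(), mail it, then clean up with deleteBefore() as in Step 2 ...
    }
}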
In a future article, we will see how the new scheduling triggers, coming soon to CloE, will support scheduled CSV file generation and sending.
Stay tuned!