iot:tutorial:aws-integration [2020/06/22 08:06]
atolstov [Configure WebHMI]
iot:tutorial:aws-integration [2020/07/07 13:26] (current)
**//Decimation code://**
\\ 
If there are several target registers (//counter, counter2, etc.//) that change too often, this code helps to decimate the output trigger signal.
<code lua>
a1,a2,a3,a4 = 0,0,0,0
</code>
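The full decimation script is not shown in this comparison view. As an illustration of the idea only (the register names and the factor are assumptions, not the project's actual values), a minimal decimator could look like this:

<code lua>
-- Illustrative sketch only: pass the output trigger through on every
-- N-th change of the watched registers. The names (counter, counter2)
-- and N = 10 are assumptions, not the project's actual values.
local N = 10            -- decimation factor
local changes = 0       -- changes seen since the last trigger
local prev = { counter = 0, counter2 = 0 }

-- returns true when the output trigger should fire
function decimate(regs)
  local changed = false
  for name, value in pairs(regs) do
    if value ~= prev[name] then
      prev[name] = value
      changed = true
    end
  end
  if changed then
    changes = changes + 1
    if changes >= N then
      changes = 0
      return true       -- let this event through
    end
  end
  return false          -- suppress this event
end
</code>

With N = 10, a register that updates every second would produce an output trigger (and hence an MQTT upload) roughly every ten seconds.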
At this moment you have access to the Kibana plugin via the link provided above.
{{ :iot:tutorial:es_kibana_loaded.png?direct |}}
==== Change MQTT payload to meters data as a demo project ====
Let's upload a project with virtual electric meters to WebHMI. It simulates power consumption with a predefined daily load curve plus small fluctuations, just like a real power system does.
So let us formulate a JSON payload (according to the AWS Shadow rules) from those registers whose output data represents the consumed energy in kWh.
In a script it is possible, for example, to recalculate values from the number of impulses to kWh and to add metadata such as location, timestamp, etc.
Here are examples of code to do this.
<code lua - counters.lib.lua>1</code>
<code lua - counters simulation.lua>2</code>
<code lua - decimator.lua>3</code>
<code lua - AWS_MQTT_upload.lua>4</code>
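The embedded scripts themselves are not reproduced in this comparison view. As a hedged sketch of the recalculation idea only (the meter constant, register names and shadow layout below are assumptions, not the actual project code), an impulse-to-kWh conversion with metadata might look like:

<code lua>
-- Illustrative sketch only: convert impulse counts to kWh and attach
-- metadata. IMP_PER_KWH, the function names and the table layout are
-- assumptions, not the actual project code.
local IMP_PER_KWH = 1000   -- meter constant: impulses per kWh

function impulses_to_kwh(impulses)
  return impulses / IMP_PER_KWH
end

-- build the "reported" part of an AWS Shadow update
function build_reported(c1, c2, c3, c4, location, ts)
  return {
    counters = {
      value = {
        counter1 = impulses_to_kwh(c1),
        counter2 = impulses_to_kwh(c2),
        counter3 = impulses_to_kwh(c3),
        counter4 = impulses_to_kwh(c4),
      },
      units = "kWh",
    },
    location = location,
    timestamp = ts,
  }
end
</code>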
<code sql - IoT Core Rule SQL-like query for virtual counters>
SELECT cast(state.reported.counters.value.counter1 as DECIMAL) as counter1,
cast(state.reported.counters.value.counter2 as DECIMAL) as counter2,
cast(state.reported.counters.value.counter3 as DECIMAL) as counter3,
cast(state.reported.counters.value.counter4 as DECIMAL) as counter4,
cast(state.reported.counters.units as STRING) as units,
cast(state.reported.location as STRING) as location,
cast(state.reported.timestamp as STRING) as timestamp
FROM '$aws/things/WebHMI_Dnipro_1/shadow/update'
</code>
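For reference, this query expects a Shadow update payload shaped roughly like the following (the field names follow the query; the values here are purely illustrative):

<code json>
{
  "state": {
    "reported": {
      "counters": {
        "value": { "counter1": 12.5, "counter2": 7.1, "counter3": 3.9, "counter4": 0.4 },
        "units": "kWh"
      },
      "location": "Dnipro",
      "timestamp": "1594124760"
    }
  }
}
</code>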
As a result, every single MQTT upload will trigger the IoT Rule and put the data into DynamoDB (to process new entries from the WebHMI script, e.g. additional counters or location fields, the SQL should be rewritten accordingly).
The next step is to create a Lambda function that will push the data to the Kibana visualisation dashboard on the Elasticsearch instance.
==== Create a DynamoDB to Elasticsearch bridge using a Lambda function ====
There is an uploadable function code, written in Node.js v10.
Import it into AWS Lambda and test it. Update the roles with the appropriate policies.
To test it, let's create a predefined DDB Upload test event.
If the test is successful, let's post something in WebHMI to update the IoT Core Shadow; this will cause IoT Core Rule processing with the SQL-like query and append the shadow data to DynamoDB.
The append to DynamoDB should trigger the new Lambda function, which will repost the data to the Elasticsearch instance. To check it, just go to Kibana and click the search button. If there is no data within one minute after posting from WebHMI, something went wrong, most likely with IAM. Use the CloudWatch logs to investigate the problem.
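Such a test event mimics the DynamoDB Streams record the function receives when a row is appended. A sketch under assumptions (the region, keys and attribute values below are illustrative, not taken from the actual project):

<code json>
{
  "Records": [
    {
      "eventID": "1",
      "eventName": "INSERT",
      "eventSource": "aws:dynamodb",
      "awsRegion": "eu-central-1",
      "dynamodb": {
        "Keys": { "timestamp": { "S": "1594124760" } },
        "NewImage": {
          "timestamp": { "S": "1594124760" },
          "counter1": { "N": "12.5" },
          "units": { "S": "kWh" },
          "location": { "S": "Dnipro" }
        },
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      }
    }
  ]
}
</code>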
