iot:tutorial:aws-integration
At this moment you will have access to the Kibana plugin; the link to it is provided above.
==== Change the MQTT payload to meter data as a demo project ====

Let's load a project with virtual electric meters onto the WebHMI. It simulates power consumption following a predefined daily load curve with small fluctuations, just like a real power system does.

Now let's formulate the JSON payload (according to the AWS Shadow document rules) from the registers holding the output data, which represents the consumed energy in kWh.

In a script it is possible to recalculate the values to kWh, e.g. from the number of impulses, and to add metadata such as location, timestamp, etc.
In the demo project this is handled by the following scripts: counters.lib.lua, counters simulation.lua, decimator.lua and AWS_MQTT_upload.lua. A simplified sketch of the approach is shown below.
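The original script contents are not reproduced on this page, so the following is only a minimal Lua sketch of the idea, not the actual project code. The impulse-to-kWh constant and the read_impulses() / publish_payload() helpers are placeholders standing in for whatever register access and MQTT upload mechanism your WebHMI project actually uses.

<code lua - shadow payload sketch (illustrative only)>
-- Convert raw impulse counts to kWh and build an AWS Shadow "reported"
-- payload. read_impulses() and publish_payload() are hypothetical helpers,
-- not part of any specific WebHMI API.
local IMPULSES_PER_KWH = 1000          -- assumed meter constant

local function impulses_to_kwh(impulses)
  return impulses / IMPULSES_PER_KWH
end

local function build_shadow_payload(counters_kwh, location)
  -- AWS Device Shadow documents wrap the data in state.reported
  local values = {}
  for i, kwh in ipairs(counters_kwh) do
    values[#values + 1] = string.format('"counter%d":%.3f', i, kwh)
  end
  return string.format(
    '{"state":{"reported":{"counters":{"value":{%s},"units":"kWh"},'
    .. '"location":"%s","timestamp":"%d"}}}',
    table.concat(values, ","), location, os.time())
end

local kwh = {}
for i, impulses in ipairs(read_impulses()) do         -- placeholder helper
  kwh[i] = impulses_to_kwh(impulses)
end
publish_payload(build_shadow_payload(kwh, "site-1"))  -- placeholder helper
</code>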

<code sql - IoT Core Rule SQL-like query for virtual counters>
SELECT
  cast(state.reported.counters.value.counter2 as DECIMAL) as counter2,
  cast(state.reported.counters.value.counter3 as DECIMAL) as counter3,
  cast(state.reported.counters.value.counter4 as DECIMAL) as counter4,
  cast(state.reported.counters.units as STRING) as units,
  cast(state.reported.location as STRING) as location,
  cast(state.reported.timestamp as STRING) as timestamp
FROM '
</code>

As a result, every MQTT upload will trigger the IoT Rule, which processes the new entries produced by the WebHMI script and puts the data into DynamoDB (if you add or rename fields such as the counters or the location, the SQL query has to be rewritten accordingly).

The next step is to create a Lambda function that will push the data to the Kibana visualisation dashboard on the Elasticsearch instance.
==== Create a DynamoDB to Elasticsearch bridge using a Lambda function ====
There is ready-made function code, written for Node.js v10, that can be uploaded to AWS Lambda. Import it into AWS Lambda and test it, and update the execution role with the appropriate policies.

To test it, let's create a predefined DDB Upload test event.
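The exact contents of this test event depend on how the DynamoDB table and its stream are configured by the IoT Rule; the example below is only a rough sketch of a DynamoDB Streams INSERT record reusing the field names from the SQL query above. The table name, key attribute, region and ARN are placeholders.

<code json - DDB Upload test event (illustrative example)>
{
  "Records": [
    {
      "eventID": "1",
      "eventName": "INSERT",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "Keys": {
          "timestamp": { "S": "1594120000" }
        },
        "NewImage": {
          "counter2":  { "N": "12.500" },
          "counter3":  { "N": "8.750" },
          "counter4":  { "N": "3.200" },
          "units":     { "S": "kWh" },
          "location":  { "S": "site-1" },
          "timestamp": { "S": "1594120000" }
        },
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:123456789012:table/your-table/stream/2020-07-06T00:00:00.000"
    }
  ]
}
</code>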
If the test is successful, let's post something in WebHMI to update the IoT Core Shadow. This will cause the IoT Core Rule to process it with the SQL-like query and append the shadow data to the DynamoDB table.

Each append to the DynamoDB table should then trigger that new Lambda function.