{{ iot:
For now, WebHMI can post the data from real devices to the AWS MQTT client, which allows updating various AWS Thing shadows.
===== Testing and Next Steps =====
==== Testing ====
Okay, so now the AWS MQTT communication is working and has been tested successfully.
The next step is to reconfigure the MQTT topics so that WebHMI can post to the AWS broker: set the ''mqtt_publish'' topic to ''update'' and the ''mqtt_get'' topic to ''get''.
{{ :
There is also a trigger that initiates the publishing; setting it up is described in the next section.
{{ :
These settings will lead the message to AWS IoT Core Rules processing and, in the end, into the DynamoDB table.

==== Setting up trigger script ====
There should be a trigger that changes the auxiliary register's value to initiate the publishing.
For testing purposes you can do this manually.
{{ :
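If you prefer to do this from a script rather than from the web interface, a single register write is enough. This is only a minimal sketch; ''aws_trigger'' is a placeholder name for the auxiliary register used throughout the examples below.
<code lua>
-- One-off manual test: force the auxiliary register (placeholder name
-- "aws_trigger") to 1 so that one publish to AWS is initiated.
function main (userId)
   WriteReg("aws_trigger", 1)
end
</code>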
After that, check the DynamoDB table to ensure that the data has arrived.
{{ :
When you have ensured that data comes to DynamoDB when the auxiliary WebHMI register changes, you can set up the trigger script to automate it.

As an example, there are two common practices: a time-dependent (oscillator) trigger and a quantity-dependent (decimation) trigger. \\
\\
**//Oscillator code://** \\
If there is a necessity to post with a predefined frequency, for example every 15 seconds, it can be done with a script like the following (the register name ''aws_trigger'' is only a placeholder for the auxiliary trigger register).
<code lua>
-- Post with a fixed period by toggling the auxiliary trigger register.
-- "aws_trigger" is a placeholder name -- rename it to match your project.
PERIOD = 15          -- seconds between publishes
last_toggle = 0

function TOGGLE(reg)
   -- invert the register value (0 -> 1, 1 -> 0)
   WriteReg(reg, 1 - GetReg(reg))
end

function main (userId)

   if os.time() - last_toggle >= PERIOD then
      TOGGLE("aws_trigger")
      last_toggle = os.time()
   end--if

end--eof
</code>
**//Decimation code://** \\
\\
If there are several target registers, such as //counter, counter2, etc.//, that change too often, the following code will decimate the output trigger signal (again, ''aws_trigger'' is a placeholder for the auxiliary trigger register).
<code lua>
a1,a2,a3,a4 = 0,0,0,0          -- last seen values of the watched registers

decimation_counter = 0
DECIMATION = 10                -- publish once per 10 detected changes

function TOGGLE(reg)
   -- invert the trigger register value (0 -> 1, 1 -> 0)
   WriteReg(reg, 1 - GetReg(reg))
end

function main (userId)

   -- check if target registers changed since the previous scan
   flag = a1 ~= GetReg("counter")
       or a2 ~= GetReg("counter2")
       or a3 ~= GetReg("counter3")
       or a4 ~= GetReg("counter4")

   if flag then
      decimation_counter = decimation_counter + 1
   end

   -- update local variable values for the next scan check
   a1 = GetReg("counter")
   a2 = GetReg("counter2")
   a3 = GetReg("counter3")
   a4 = GetReg("counter4")

   -- DEBUG trace
   DEBUG("counter = " .. tostring(a1))
   DEBUG("counter2 = " .. tostring(a2))
   DEBUG("counter3 = " .. tostring(a3))
   DEBUG("counter4 = " .. tostring(a4))
   DEBUG("decimation_counter = " .. tostring(decimation_counter))

   -- decimation_counter keeps the upload to AWS from happening too often
   if decimation_counter >= DECIMATION then
      TOGGLE("aws_trigger")    -- placeholder name of the auxiliary trigger register
      decimation_counter = 0
   end--if decimation_counter

end--eof
</code>

==== Next steps ====
==== Create visualisation environment ====
To use Kibana visualisation, you first need an Elasticsearch engine running on a virtual machine.
So the idea is to create an Elasticsearch instance with the Kibana plugin on board.
Fortunately, AWS offers a managed Elasticsearch Service that already includes Kibana.
{{ :
So, create a new instance with the following settings.
{{ :
Specify the domain name and the instance type (size, e.g. t2.small).
{{ :
Specify the access policy: a common practice is a public endpoint, but an IP restriction should be specified.
{{ :
[[https://
{{ :
The first line has your current IP.
As a CIDR block you can rewrite your IP as XXX.XXX.XXX.XXX/32 (a single address; e.g. 203.0.113.7 becomes 203.0.113.7/32).
Otherwise, copy your IP into the field as is.
\\
In the end, you will arrive at settings like these. Review them and confirm the creation if everything is fine.
{{ :
After the creation, an endpoint address will appear within several minutes.
{{ :
Wait until it is done.
{{ :
At this moment you will have access to the Kibana plugin via the link shown above.
{{ :
==== Change MQTT payload to meter data as a demo project ====

Let's load a project with virtual electric meters onto WebHMI. It simulates power consumption with a predefined daily load curve plus small fluctuations, just like a real power system does.

So let us formulate a JSON payload (according to the AWS Shadow rules) from those registers with output data, which represent the consumed energy in kWh.

In a script it is possible to recalculate the values to kWh from the number of impulses, for example, and to add metadata such as location, timestamp, etc.
The demo project does this in the following scripts (their full code is not reproduced here); an illustrative sketch follows the list.
  * //counters.lib.lua//
  * //counters simulation.lua//
  * //decimator.lua//
  * //AWS_MQTT_upload.lua//
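To make the payload shape concrete, here is a minimal sketch of such a recalculation. It is not one of the project scripts listed above: the register names, the impulse weight ''IMP_PER_KWH'' and the location string are assumptions, and the script only assembles and prints the JSON string; the actual publishing is done by the AWS MQTT connection configured earlier.
<code lua>
-- Illustrative sketch only: register names, impulse weight and location
-- are made-up values -- replace them with the ones used in your project.
IMP_PER_KWH = 1000          -- assumed impulse weight of the virtual meters
LOCATION    = "site-1"      -- example metadata

function impulses_to_kwh(reg)
   return GetReg(reg) / IMP_PER_KWH
end

function main (userId)
   -- recalculate the raw impulse counters into kWh
   local c1 = impulses_to_kwh("counter")
   local c2 = impulses_to_kwh("counter2")
   local c3 = impulses_to_kwh("counter3")
   local c4 = impulses_to_kwh("counter4")

   -- assemble the shadow document in the shape the IoT Core Rule expects
   local payload = string.format(
      '{"state":{"reported":{"counters":{"value":{"counter":%.3f,"counter2":%.3f,"counter3":%.3f,"counter4":%.3f},"units":"kWh"},"location":"%s","timestamp":"%d"}}}',
      c1, c2, c3, c4, LOCATION, os.time())

   DEBUG(payload)   -- publishing itself is handled by the AWS MQTT connection
end
</code>
The field layout matches the SQL-like query of the IoT Core Rule below (''state.reported.counters.value.*'', ''units'', ''location'', ''timestamp'').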

The IoT Core Rule uses the following SQL-like query. The leading ''counter'' field and the ''FROM'' topic are shown as placeholders here; use your own Thing's shadow ''update'' topic.
<code sql - IoT Core Rule SQL-like query for virtual counters>
SELECT
cast(state.reported.counters.value.counter as DECIMAL) as counter,
cast(state.reported.counters.value.counter2 as DECIMAL) as counter2,
cast(state.reported.counters.value.counter3 as DECIMAL) as counter3,
cast(state.reported.counters.value.counter4 as DECIMAL) as counter4,
cast(state.reported.counters.units as STRING) as units,
cast(state.reported.location as STRING) as location,
cast(state.reported.timestamp as STRING) as timestamp
FROM '$aws/things/your_thing_name/shadow/update'
</code>

As a result, every single MQTT upload will trigger the IoT Rule, which puts the data into DynamoDB (if new entries such as extra counters or location fields are added in the WebHMI script, the SQL query should be rewritten accordingly).
The next step is to create a Lambda function that will push the data to the Kibana visualisation dashboard on the Elasticsearch instance.
==== Create a DynamoDB to Elasticsearch bridge using Lambda function ====
There is uploadable code with the function, written in Node.js v10.

Import it into AWS Lambda and test it. Update the roles with appropriate policies.

To test it, let's create a predefined DDB Upload test event.

If it is successful, let's post something from WebHMI to update the IoT Core Shadow, which will cause IoT Core Rule processing with the SQL-like query and append the shadow data to DynamoDB.

The append to DynamoDB should trigger the new Lambda function, which will repost the data to the Elasticsearch instance. To check it, just go to Kibana and click the search button. If there is no data within a minute after posting from WebHMI, something went wrong, most likely with IAM. Use the CloudWatch logs to investigate the problem.