{{ iot:aws:wh:wh_test_via_mqttfx_connections.png?direct |}}

For now, WebHMI can post data from real devices to the AWS MQTT client, which allows updating the Shadow of various AWS Things. The data can be visualised with the QuickSight tool and stored in DynamoDB, with the possibility of exporting it to an S3 bucket as a .csv file, for example.
  
  
==== JSON formatted payload ====
JSON is a widely used MQTT payload format.
Let’s write a small script which will use the built-in [[lua:lua_cjson|cjson]] library to encode any WebHMI register values into JSON formatted strings.
Here is a code example that encodes a string type message in the Amazon Web Services (AWS) [[https://docs.aws.amazon.com/iot/latest/developerguide/device-shadow-document-syntax.html|Thing Report Shadow]] JSON format.
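The general shape of such an encoder can be sketched as follows. This is a minimal sketch: the helper name ''shadow_payload'' is illustrative, and the JSON string is formatted by hand so the snippet is self-contained; a real WebHMI script would build a Lua table and pass it to ''cjson.encode()''.

<code lua>
-- Minimal sketch: wrap a register value into an AWS Thing Shadow
-- "reported" document. In a real script the table would be passed
-- to cjson.encode(); here the JSON is formatted by hand so the
-- example is self-contained.
local function shadow_payload(register_name, value)
  return string.format('{"state":{"reported":{"%s":%s}}}',
                       register_name, tostring(value))
end

print(shadow_payload("counter", 42))
-- {"state":{"reported":{"counter":42}}}
</code>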
  
**//Decimation code://**
\\ 
If there are several target registers (such as //counter, counter2, etc.//) which change too often, this code will help to decimate the output trigger signal.
<code lua>
a1,a2,a3,a4 = 0,0,0,0
</code>

==== Next steps ====
==== Create visualisation environment ====
To use Kibana visualisation, you first need a running Elasticsearch engine deployed on a virtual machine.
So the idea is to create an Elasticsearch instance with the Kibana plugin on board.
Fortunately, there is a dedicated menu item in the Analytics group of AWS services.
{{ :iot:tutorial:aws_es_service.png?direct |}}
So, create a new instance with the following settings.
{{ :iot:tutorial:es_type.png?direct |}}
Specify the domain name and instance type (size, e.g. t2.small).
{{ :iot:tutorial:es_domain.png?direct |}}
Specify the access policy; a public policy is good practice, but an IP restriction should be specified.
{{ :iot:tutorial:es_security.png?direct |}}
[[https://www.myip.com/|To check your IP, use myip.com]].
{{ :iot:tutorial:es_myip_blur.com.png?direct |}}
The first line shows your current IP.
As the CIDR block you can write your IP as XXX.XXX.XXX.XXX/16, where /16 means that only the first 16 bits of the address are fixed and the remaining bits may vary. This defines a whole range of IP addresses, so the rule still matches if your ISP assigns you a different IP from its pool.
Otherwise, copy your IP into the field as is.
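To see what the /16 mask does, here is a tiny illustrative helper (not part of the tutorial) that checks whether two addresses fall into the same XXX.XXX.0.0/16 block, i.e. share their first two octets:

<code lua>
-- Illustrative only: two IPv4 addresses are in the same /16 block
-- when their first 16 bits (the first two octets) are equal.
local function same_slash16(ip_a, ip_b)
  local a1, a2 = ip_a:match("^(%d+)%.(%d+)%.")
  local b1, b2 = ip_b:match("^(%d+)%.(%d+)%.")
  return a1 == b1 and a2 == b2
end

print(same_slash16("93.72.10.5", "93.72.200.17"))  -- true: same pool
print(same_slash16("93.72.10.5", "95.164.10.5"))   -- false
</code>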
\\ 
In the end, you will arrive at these settings. Review them and confirm the creation if everything is fine.
{{ :iot:tutorial:es_review.png?direct |}}
After creation, an endpoint address will appear within several minutes.
{{ :iot:tutorial:es_success_loading.png?direct |}}
Wait until it is done.
{{ :iot:tutorial:es_active.png?direct |}}
At this point, you will have access to the Kibana plugin via the link provided above.
{{ :iot:tutorial:es_kibana_loaded.png?direct |}}
==== Change MQTT payload to meter data as a demo project ====

Let's flash a project to WebHMI with virtual electric meters. It simulates power consumption with a predefined daily load curve plus small fluctuations, just like a real power system does.

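The simulation idea can be sketched like this (the curve values and fluctuation size below are illustrative, not the ones from the demo project):

<code lua>
-- Sketch of a daily load curve (kW for each hour of the day)
-- with small random fluctuations; all values are illustrative.
math.randomseed(os.time())

local base_curve = { [0]=2, 2, 2, 2, 3, 5, 8, 10, 9, 8, 8, 9,
                     10, 9, 8, 8, 9, 11, 13, 12, 10, 7, 4, 3 }

local function simulated_load(hour)
  local noise = (math.random() - 0.5) * 0.5  -- up to +/-0.25 kW
  return base_curve[hour % 24] + noise
end

print(simulated_load(12))  -- around 10 kW at noon
</code>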
So let us formulate a JSON payload (according to the AWS Shadow rules) from the registers whose output data represents the consumed energy in kWh.

It is possible to recalculate values in the script, e.g. from the number of impulses to kWh, and to add metadata such as location, timestamp, etc.
Here are examples of code to do this:
<code lua - counters.lib.lua>1</code>
<code lua - counters simulation.lua>2</code>
<code lua - decimator.lua>3</code>
<code lua - AWS_MQTT_upload.lua>4</code>
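The conversion step can be sketched as follows (a minimal sketch: the pulse constant ''imp_per_kwh'', the helper name and the field layout are assumptions for illustration, not the actual library code):

<code lua>
-- Sketch: convert a meter's impulse count to kWh and attach
-- metadata, ready to be cjson-encoded into the Shadow payload.
-- imp_per_kwh (the meter's pulse constant) is an assumed value.
local imp_per_kwh = 1000

local function meter_report(impulses, location, ts)
  return {
    value = impulses / imp_per_kwh,  -- consumed energy, kWh
    units = "kWh",
    location = location,
    timestamp = ts,
  }
end
</code>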

<code sql - IoT Core Rule SQL-like query for virtual counters>SELECT cast(state.reported.counters.value.counter1 as DECIMAL) as counter1, 
cast(state.reported.counters.value.counter2 as DECIMAL) as counter2, 
cast(state.reported.counters.value.counter3 as DECIMAL) as counter3, 
cast(state.reported.counters.value.counter4 as DECIMAL) as counter4, 
cast(state.reported.counters.units as STRING) as units, 
cast(state.reported.location as STRING) as location, 
cast(state.reported.timestamp as STRING) as timestamp 
FROM '$aws/things/WebHMI_Dnipro_1/shadow/update'
</code>

As a result, every single MQTT upload will trigger the IoT Rule, which puts the data into DynamoDB (if the WebHMI script adds new entries, such as counters or location, the SQL query should be rewritten accordingly).
The next step is to create a Lambda function that will push the data to the Kibana visualisation dashboard on the Elasticsearch instance.
==== Create a DynamoDB to Elasticsearch bridge using a Lambda function ====
There is uploadable code with the function, written in Node.js v10.

Import it into AWS Lambda and test it. Update the roles with the appropriate policies.

To test it, let's create a predefined DDB Upload test event.

If it is successful, post something in WebHMI to update the IoT Core Shadow; this will cause the IoT Core Rule to process it with the SQL-like query and append the shadow data to DynamoDB.

The append to DynamoDB should trigger the new Lambda function, which will repost the data to the Elasticsearch instance. To check it, just go to Kibana and click the search button. If no data appears within one minute after posting from WebHMI, something has gone wrong, most likely with IAM. Use the CloudWatch logs to investigate the problem.
  
  
  
  
iot/tutorial/aws-integration.1592229850.txt.gz · Last modified: 2020/06/15 14:04 by atolstov
