iot:tutorial:aws-integration — revised 2020/06/12 12:25 and 2021/07/22 07:32 (current) by atolstov; links adapted because of a move operation.
{{ iot:aws:wh:wh_test_via_mqttfx_connections.png?direct |}}
  
For now, WebHMI can post data from real devices via MQTT to AWS, which allows updating various AWS Things' Shadows. The data can be visualised with the QuickSight tool and stored in DynamoDB, with the option to export it as a .csv file to an S3 bucket, for example.
  
  
==== JSON formatted payload ====
JSON is a widely used MQTT payload format.
Let’s write a small script which will use the built-in [[lua:lua_cjson|cjson]] library to encode any WebHMI register values to JSON formatted strings.
Here is a code example to encode a string type message in the Amazon Web Services (AWS) [[https://docs.aws.amazon.com/iot/latest/developerguide/device-shadow-document-syntax.html|Thing Report Shadow]] JSON format.
  
  
===== Testing and Next Steps =====
==== Testing ====

At this point, the AWS MQTT communication is working and has been tested successfully.
The next step is to reconfigure the MQTT topics from ''iot-test'' to the Thing's Shadow ''.../update'' topic.
{{ :iot:aws:wh:wh_to_aws_update_and_trigger.png?direct |}}
There is a trigger that runs the script on the register's value change to post a message to AWS.
{{ :iot:aws:wh:wh_to_aws_update_scripts.png?direct |}}
These settings route the message through AWS IoT Core rules processing and, in the end, into the DynamoDB table.
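The Shadow ''update'' topic and its payload shape can be illustrated with a short sketch (Python here for brevity; the thing name is the one used in the rule query later in this tutorial, and ''counter1'' is just a placeholder register):

```python
import json

# Classic (unnamed) Shadow topic for a thing. The topic format is defined
# by AWS IoT; the thing name is taken from this tutorial's rule query.
THING_NAME = "WebHMI_Dnipro_1"
UPDATE_TOPIC = "$aws/things/{}/shadow/update".format(THING_NAME)

def shadow_update_payload(values):
    """Wrap register values in the reported-state document that the
    /update topic expects: {"state": {"reported": {...}}}."""
    return json.dumps({"state": {"reported": values}})

payload = shadow_update_payload({"counter1": 42})
print(UPDATE_TOPIC)  # $aws/things/WebHMI_Dnipro_1/shadow/update
print(payload)       # {"state": {"reported": {"counter1": 42}}}
```

Anything published to this topic with that document shape updates the Thing's reported state, which is what the IoT Core rule later consumes.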

==== Setting up the trigger script ====
There should be a trigger register whose value change posts the message to AWS, because the scan cycle is very short and posting on every scan would be far too frequent for cloud data processing.
For testing purposes, you can toggle it manually.
{{ :iot:aws:wh:wh_to_aws_update_triggered.png?direct |}}
After that, check that a new item has been appended to the DynamoDB table.

{{ :iot:aws:wh:ddb_update.gif?direct |}}
Once you have confirmed that data reaches DynamoDB when the auxiliary WebHMI register changes, you can toggle the trigger from a script.

Two common approaches are shown as examples: a time-dependent (oscillator) trigger and a quantity-dependent (decimation) trigger. \\ 
 \\ 
**//Oscillator code://**  \\ 
If you need to post at a predefined frequency, for example every 15 seconds, the following script will do it.

<code lua>
function TOGGLE(reg)
    WriteReg(reg, 1 - R(reg))
end

function main(userId)
    -- note: this may toggle more than once within the matching second
    -- if the scan period is well below 1 s
    if GetReg("GT") % 15 == 0 then -- Global Time (T0@Internal)
        TOGGLE("aws_trigger")
    end--if
end--eof
</code>
**//Decimation code://**
\\ 
If there are several target registers, such as //counter//, //counter2//, etc., that change too often, this code decimates the output trigger signal.
<code lua>
a1, a2, a3, a4 = 0, 0, 0, 0

decimation_counter = 0
DECIMATION = 10

function TOGGLE(reg)
    WriteReg(reg, 1 - R(reg))
end

function main(userId)
    if decimation_counter > 1 then DEBUG("decimation_counter:" .. decimation_counter) end

    -- check if any target register changed since the previous scan
    flag = a1 ~= GetReg("counter")
        or a2 ~= GetReg("counter2")
        or a3 ~= GetReg("counter3")
        or a4 ~= GetReg("counter4")

    if flag then
        decimation_counter = decimation_counter + 1
    end

    -- remember current values for the next scan's comparison
    a1 = GetReg("counter")  -- Counter 1 (D0@Internal)
    a2 = GetReg("counter2") -- Counter 2 (D0@Internal)
    a3 = GetReg("counter3") -- Counter 3 (D0@Internal)
    a4 = GetReg("counter4") -- Counter 4 (D0@Internal)

    -- DEBUG trace
    DEBUG("aws_trigger:" .. R("aws_trigger") .. " flag:" .. tostring(flag))
    DEBUG("a1:" .. a1)
    DEBUG("a2:" .. a2)
    DEBUG("a3:" .. a3)
    DEBUG("a4:" .. a4)

    -- toggle only every DECIMATION-th change so uploads to AWS are not too frequent
    if decimation_counter >= DECIMATION then
        TOGGLE("aws_trigger")
        decimation_counter = 0
    end--if decimation_counter
end--eof
</code>

==== Next steps ====
==== Create visualisation environment ====
To use Kibana visualisation, you first need an Elasticsearch engine running on a virtual machine.
So the idea is to create an Elasticsearch instance with the Kibana plugin on board.
Fortunately, there is a dedicated menu item in the Analytics group of AWS services.
{{ :iot:tutorial:aws_es_service.png?direct |}}
So, create a new instance with the following settings.
{{ :iot:tutorial:es_type.png?direct |}}
Specify the domain name and instance type (size, e.g. t2.small).
{{ :iot:tutorial:es_domain.png?direct |}}
Specify the access policy; a public one is good practice, but an IP restriction should be specified.
{{ :iot:tutorial:es_security.png?direct |}}
[[https://www.myip.com/|To check your IP, use myip.com]].
{{ :iot:tutorial:es_myip_blur.com.png?direct |}}
The first line shows your current IP.
As a CIDR block you can rewrite your IP as XXX.XXX.XXX.XXX/16, where 16 is the number of leading bits that must match; this forms a range of IP addresses, which protects you from your ISP reassigning your address within its pool.
Otherwise, copy your IP into the field as is.
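The CIDR arithmetic can be verified with Python's standard ''ipaddress'' module (the address below is a documentation example, not your real IP):

```python
import ipaddress

# A /16 block keeps the first 16 bits fixed; any address assigned from
# the same ISP pool that shares those bits will still match the range.
my_ip = ipaddress.ip_address("203.0.113.7")            # example address
block = ipaddress.ip_network("203.0.113.7/16", strict=False)

print(block)                # 203.0.0.0/16
print(block.num_addresses)  # 65536 addresses in the range
print(my_ip in block)       # True
```

Smaller suffixes widen the range: /16 allows 65,536 addresses, while /32 pins it to exactly one (your IP copied as is).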
\\ 
In the end, you will arrive at a settings summary. Review it and confirm creation if everything is fine.
{{ :iot:tutorial:es_review.png?direct |}}
After creation, an endpoint address will be created within several minutes.
{{ :iot:tutorial:es_success_loading.png?direct |}}
Wait until it is done.
{{ :iot:tutorial:es_active.png?direct |}}
At this point you will have access to the Kibana plugin via the endpoint link provided above.
{{ :iot:tutorial:es_kibana_loaded.png?direct |}}
==== Change MQTT payload to meters data as demo project ====

Let's flash a project to WebHMI with virtual electric meters. It simulates power consumption with a predefined daily load curve plus small fluctuations, just like a real power system does.

So let us formulate the JSON payload (according to the AWS Shadow rules) from those registers with output data, which represent the consumed energy in kWh.

It is possible in a script to recalculate values to kWh, e.g. from an impulse count, and to add metadata such as location, timestamp, etc.
Here are code examples to do this.
<code lua - counters.lib.lua>1</code>
<code lua - counters simulation.lua>2</code>
<code lua - decimator.lua>3</code>
<code lua - AWS_MQTT_upload.lua>4</code>
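As a hedged sketch of that recalculation (not the actual project scripts above; the meter constant of 1000 impulses per kWh and the sample counts are assumptions), the payload could be built like this:

```python
import json
import time

IMP_PER_KWH = 1000  # assumed meter constant: 1000 impulses per kWh

def to_report(impulses, location):
    """Convert raw impulse counts to kWh and attach metadata, using the
    reported-state shape that the IoT Core rule below expects."""
    counters = {"counter%d" % i: round(n / IMP_PER_KWH, 3)
                for i, n in enumerate(impulses, start=1)}
    return {"state": {"reported": {
        "counters": {"value": counters, "units": "kWh"},
        "location": location,
        "timestamp": int(time.time()),
    }}}

doc = to_report([12500, 8040, 0, 991], "Dnipro")
print(json.dumps(doc, indent=2))
```

The nested ''state.reported.counters.value.counterN'' fields are exactly the paths the SQL-like rule query selects.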

<code sql - IoT Core Rule SQL-like query for virtual counters>SELECT cast(state.reported.counters.value.counter1 as DECIMAL) as counter1, 
cast(state.reported.counters.value.counter2 as DECIMAL) as counter2, 
cast(state.reported.counters.value.counter3 as DECIMAL) as counter3, 
cast(state.reported.counters.value.counter4 as DECIMAL) as counter4, 
cast(state.reported.counters.units as STRING) as units, 
cast(state.reported.location as STRING) as location, 
cast(state.reported.timestamp as STRING) as timestamp 
FROM '$aws/things/WebHMI_Dnipro_1/shadow/update'
</code>
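To see what this query produces, here is a small sketch that mimics the flattening in Python (field names follow the query above; the sample values are made up):

```python
def flatten_shadow(doc):
    """Mimic the IoT rule's SELECT: pull each nested field out of the
    shadow document and cast it, producing one flat DynamoDB-ready item."""
    r = doc["state"]["reported"]
    v = r["counters"]["value"]
    return {
        "counter1": float(v["counter1"]),
        "counter2": float(v["counter2"]),
        "counter3": float(v["counter3"]),
        "counter4": float(v["counter4"]),
        "units": str(r["counters"]["units"]),
        "location": str(r["location"]),
        "timestamp": str(r["timestamp"]),
    }

shadow = {"state": {"reported": {
    "counters": {"value": {"counter1": 12.5, "counter2": 8.04,
                           "counter3": 0, "counter4": 0.991},
                 "units": "kWh"},
    "location": "Dnipro",
    "timestamp": 1591964723,
}}}
print(flatten_shadow(shadow))
```

Each MQTT publish to the ''update'' topic thus becomes one flat row of typed columns, which is what lands in the DynamoDB table.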

As a result, every single MQTT upload will trigger the IoT rule, which puts the data into DynamoDB (if the WH script produces different fields, such as new counters or locations, the SQL query should be rewritten accordingly).
The next step is to create a Lambda function that will push the data to the Kibana visualisation dashboard on the Elasticsearch instance.
==== Create a DynamoDB to Elasticsearch bridge using Lambda function ====
There is uploadable code with the function, written for Node.js v10.

Import it into AWS Lambda and test it. Update the roles with the appropriate policies.

To test it, let's create a predefined DDB Upload test event.

If it is successful, post something in WebHMI to update the IoT Core Shadow; this will cause IoT Core rule processing with the SQL-like query and append the shadow data to the DynamoDB table.

The append to DynamoDB should trigger the new Lambda function, which reposts the data to the Elasticsearch instance. To check it, just go to Kibana and click the search button. If no data appears within a minute after posting from WebHMI, something went wrong, most likely with IAM. Use the CloudWatch logs to investigate the problem.
  
  
  
  
  
iot/tutorial/aws-integration.1591964723.txt.gz · Last modified: 2020/06/12 12:25 by atolstov
