Even if you use industrial storage solutions with WebHMI, there is always some risk of losing your data.
This article considers one example of how to avoid the consequences of such a loss: backing up the log files with a shell script scheduled via the cron utility.
As a first step, it is a good idea to download all the existing data:
#!/bin/bash
# Create the ~/webhmi-logs-backup folder with a tmp sub-folder.
HOST=192.168.1.1
LOG_DIR="$HOME/webhmi-logs-backup"
mkdir -p "$LOG_DIR/tmp"
cd "$LOG_DIR/tmp"
# Get all logs (including today's, which is obviously still partial)
wget --user=admin --password=webhmi -nH --cut-dirs=1 "ftp://$HOST/log/log*.sqlite3"
# Archive the logs to save storage space
gzip -f -v *
# Copy the .gz files out of tmp, keeping numbered backups of any existing files
cp -r --backup=t log-*.sqlite3.gz ../.
# Clear the tmp folder
rm "$LOG_DIR"/tmp/*
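After the initial download you can check that every archive is intact with gzip's built-in test mode (a minimal sketch, assuming the backup folder created by the script above):

# Verify that each downloaded archive unpacks cleanly
for f in "$HOME/webhmi-logs-backup"/log-*.sqlite3.gz; do
    gzip -t "$f" && echo "OK: $f" || echo "CORRUPT: $f"
done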
Next, set up your Linux machine so that it automatically saves new files from the /log folder on the WebHMI's storage:
#!/bin/bash
# Back up yesterday's log file to the ~/webhmi-logs-backup folder.
HOST=192.168.1.1
LOG_DIR="$HOME/webhmi-logs-backup"
mkdir -p "$LOG_DIR/tmp"
cd "$LOG_DIR/tmp"
# Log files are named by date, so pick yesterday's
yesterday=$(date +"%Y-%m-%d" -d yesterday)
# Because the FTP login and password are stored here in plain text, you can
# restrict read access to this script file with the chmod command
wget --user=admin --password='webhmi' "ftp://$HOST/log/log-$yesterday.sqlite3"
# Archive the log to save storage space
gzip -f -v "log-$yesterday.sqlite3"
# Copy the .gz file out of tmp, keeping a numbered backup of any existing file
cp -r --backup=t "log-$yesterday.sqlite3.gz" ../.
# Clear the tmp folder
rm "$LOG_DIR"/tmp/*
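If you later need to read one of the backed-up logs, unpack a copy and open it with the sqlite3 command-line client (a sketch; the date in the filename is only an example, and the tables inside depend on your WebHMI configuration):

# Unpack a copy of one archive, keeping the .gz file (-k)
gunzip -k "$HOME/webhmi-logs-backup/log-2023-01-15.sqlite3.gz"
# List the tables stored in the log database
sqlite3 "$HOME/webhmi-logs-backup/log-2023-01-15.sqlite3" ".tables"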
Do not forget to make the script executable:
chmod u+x /path_to_run/backup.sh
Add the script to the crontab so that it runs every day at 00:10:
10 0 * * * /path_to_run/backup.sh
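You can add this entry interactively with crontab -e, or append it non-interactively (a sketch, assuming the script is saved as /path_to_run/backup.sh as above):

# Append the schedule to the current user's crontab
( crontab -l 2>/dev/null; echo "10 0 * * * /path_to_run/backup.sh" ) | crontab -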
New files will only be added to ~/webhmi-logs-backup; nothing is ever deleted from this folder.
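Since nothing is deleted automatically, the folder will grow over time. If disk space becomes a concern, old archives can be pruned with find (the one-year limit below is an arbitrary assumption; adjust it to your retention policy):

# Delete backup archives older than roughly one year
find "$HOME/webhmi-logs-backup" -maxdepth 1 -name 'log-*.sqlite3.gz' -mtime +365 -delete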