Hi @controlc,
thank you.
Problem:
I run openHABian on a Raspberry Pi and access the system through a terminal session from my Mac. I wanted to transfer my data from rrd4j to InfluxDB with the script, e.g. for the item Klima_hum_wz:
### timestamps
item: Klima_hum_wz
10y: 2009-11-01T14:36:44+01:00
1y: 2018-10-04T15:36:44+02:00
1m: 2019-10-03T15:36:44+02:00
1w: 2019-10-25T15:37:44+02:00
1d: 2019-10-31T14:37:44+01:00
8h: 2019-11-01T06:37:44+01:00
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   338  100   338    0     0   2624      0 --:--:-- --:--:-- --:--:--  2640
(the same progress output repeats for each of the six curl requests; every response is 338 bytes)
### found values: 0
Warning: Couldn't read data from file "Klima_hum_wz-*", this makes an empty
Warning: POST.
HTTP/1.1 204 No Content
Content-Type: application/json
Request-Id: a57b457c-fcac-11e9-b4c9-b827eb33e913
X-Influxdb-Build: OSS
X-Influxdb-Version: 1.7.8
X-Request-Id: a57b457c-fcac-11e9-b4c9-b827eb33e913
Date: Fri, 01 Nov 2019 13:36:45 GMT
Sleep for 5 seconds to let InfluxDB process the data...
### delete temporary files
So no values have been transferred.
However, when I browse to http://IP/rest/persistence/items/Klima_hum_wz?serviceId=rrd4j, I get a page with all the datapoints; see the extract here:
{"name":"Klima_hum_wz","datapoints":"360","data":[{"time":1570993680000,"state":"59.325"},
{"time":1570993920000,"state":"60.0"},{"time":1570994160000,"state":"59.95"},
{"time":1570994400000,"state":"59.8"},{"time":1570994640000,"state":"59.8"},
{"time":1570994880000,"state":"59.125"},{"time":1570995120000,"state":"58.9"},
{"time":1570995360000,"state":"58.925"},{"time":1570995600000,"state":"59.0"},
{"time":1570995840000,"state":"59.0"},{"time":1570996080000,"state":"58.925000000000004"},
{"time":1570996320000,"state":"58.9"},{"time":1570996560000,"state":"58.824999999999996"},
{"time":1570996800000,"state":"58.6"},{"time":1570997040000,"state":"58.6"},
{"time":1570997280000,"state":"58.75"},{"time":1570997520000,"state":"58.8"},
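To rule out the conversion part of the script, I ran a shortened copy of that extract through the same sed/tr/awk pipeline by hand (the two-datapoint `sample` string below is just an abbreviated copy of the REST response), and it produces InfluxDB line protocol as expected:

```shell
#!/bin/bash
# Feed a shortened copy of the REST extract through the script's
# own conversion pipeline to confirm it emits line protocol.
itemname="Klima_hum_wz"
sample='{"name":"Klima_hum_wz","datapoints":"2","data":[{"time":1570993680000,"state":"59.325"},{"time":1570993920000,"state":"60.0"}]}'
lines=$(printf '%s' "$sample" \
  | sed 's/}/\n/g' \
  | sed 's/data/\n/g' \
  | grep -e "time.*state" \
  | tr -d ',:[{"' \
  | sed 's/time/ /g;s/state/ /g' \
  | awk -v item="$itemname" '{print item " value=" $2 " " $1 "000000"}')
echo "$lines"
# Klima_hum_wz value=59.325 1570993680000000000
# Klima_hum_wz value=60.0 1570993920000000000
```

So the pipeline itself seems fine; the problem appears to be that the curl requests come back without data in the first place.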
Why does the script not pick up and transfer these values? This is the script I use:
#!/bin/bash
# This script reads the values of an item from openhab via REST and imports the data to influxdb
# usage: get_item_states.sh <itemname>
itemname="$1"
if [ -z "$itemname" ]
then
    echo "Please define an item!"
    exit 1
fi
source /srv/openhab2-conf/services/rest2influx.cfg
# build historical ISO-8601 timestamps for the query windows
tenyearsago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="10 years ago")
oneyearago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="-12 months 28 days ago")
onemonthago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="29 days ago")
oneweekago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="-6 days -23 hours 59 minutes ago")
onedayago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="-23 hours 59 minutes ago")
eighthoursago=$(date +"%Y-%m-%dT%H:%M:%S%:z" --date="-7 hours 59 minutes ago")
# print timestamps
echo ""
echo "### timestamps"
echo "item: $itemname"
echo "10y: $tenyearsago"
echo "1y: $oneyearago"
echo "1m: $onemonthago"
echo "1w: $oneweekago"
echo "1d: $onedayago"
echo "8h: $eighthoursago"
resturl="http://$openhabserver:$openhabport/rest/persistence/items/$itemname?serviceId=$serviceid"
# get values and write to different files
curl -X GET --header "Accept: application/json" "$resturl&starttime=${tenyearsago}&endtime=${oneyearago}" > ${itemname}_10y.xml
curl -X GET --header "Accept: application/json" "$resturl&starttime=${oneyearago}&endtime=${onemonthago}" > ${itemname}_1y.xml
curl -X GET --header "Accept: application/json" "$resturl&starttime=${onemonthago}&endtime=${oneweekago}" > ${itemname}_1m.xml
curl -X GET --header "Accept: application/json" "$resturl&starttime=${oneweekago}&endtime=${onedayago}" > ${itemname}_1w.xml
curl -X GET --header "Accept: application/json" "$resturl&starttime=${onedayago}&endtime=${eighthoursago}" > ${itemname}_1d.xml
curl -X GET --header "Accept: application/json" "$resturl&starttime=${eighthoursago}" > ${itemname}_8h.xml
# combine files
cat ${itemname}_10y.xml ${itemname}_1y.xml ${itemname}_1m.xml ${itemname}_1w.xml ${itemname}_1d.xml ${itemname}_8h.xml > ${itemname}.xml
# convert data to line protocol file
cat ${itemname}.xml \
| sed 's/}/\n/g' \
| sed 's/data/\n/g' \
| grep -e "time.*state"\
| tr -d ',:[{"' \
| sed 's/time/ /g;s/state/ /g' \
| awk -v item="$itemname" '{print item " value=" $2 " " $1 "000000"}' \
| sed 's/value=ON/value=1/g;s/value=OFF/value=0/g' \
> ${itemname}.txt
values=$(wc -l < "${itemname}.txt")
echo ""
echo "### found values: $values"
# split file in smaller parts to make it easier for influxdb
split -l $importsize ${itemname}.txt "${itemname}-"
for i in ${itemname}-*
do
curl -i -XPOST -u "$influxuser:$influxpw" "http://$influxserver:$influxport/write?db=$influxdatbase" --data-binary "@$i"
echo "Sleep for $sleeptime seconds to let InfluxDB process the data..."
sleep $sleeptime
done
echo ""
echo "### delete temporary files"
rm ${itemname}*
exit 0
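One more thing I noticed while debugging: the timestamps contain a `+` in the timezone offset (e.g. `+01:00`), and the script pastes them into the query string unencoded. An unencoded `+` in a URL query is commonly decoded as a space on the server side, so the REST API may be receiving a malformed starttime/endtime. A minimal sketch to percent-encode the reserved characters first (the `urlencode_ts` helper is my own, not part of the script):

```shell
#!/bin/bash
# Percent-encode the "+" and ":" in an ISO-8601 timestamp before
# placing it in a query string; an unencoded "+" is usually
# decoded as a space by the server.
urlencode_ts() {
  printf '%s' "$1" | sed 's/+/%2B/g; s/:/%3A/g'
}
encoded=$(urlencode_ts "2019-10-31T14:37:44+01:00")
echo "$encoded"
# 2019-10-31T14%3A37%3A44%2B01%3A00
```

Alternatively, curl can do the encoding itself if the requests are rewritten to use `-G` together with `--data-urlencode "starttime=$onedayago"` and so on, instead of appending the parameters to the URL by hand.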