Persistence data needs to be exported to Excel

OH 2.4 M8 in a Docker container
Ubuntu 18.04 Server

I am using rrd4j persistence for my Novelan heat pump. Displaying the values in graphs and so on is working. For further calculations, which I want to do in Excel, I would need the values exported in an Excel-compatible form (CSV etc.).

All my research keeps leading to hints to use persistence instead of an Excel export, but in my case that is not a solution.

Thanks for any help, which is highly appreciated.

You can try to use the REST API to read the rrd4j database. But not all data is saved; it will be compressed. Maybe this thread can give you a hint.

Hmm ok …
Is this add-on somewhere on GitHub? Maybe I can get an idea how to extract data from the heat pump directly and not take the detour through rrd4j …

You can install the REST API docs from the Misc tab in PaperUI. From there you can query rrd4j, which will give you the values in JSON format. You will then need to write your own script to convert the JSON to CSV. There is no CSV export built into OH or rrd4j.
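That JSON-to-CSV conversion can be sketched roughly like this. A minimal, hypothetical example: the payload shape with `time`/`state` datapoints matches the OH2 persistence REST responses, but verify it against the REST API docs output on your own installation (item name `Aussentemp` and the sample values are made up):

```python
import csv
import io

def persistence_json_to_csv(payload):
    """Turn an rrd4j persistence REST payload into CSV text,
    one time,state row per datapoint."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time", "state"])
    for point in payload.get("data", []):
        writer.writerow([point["time"], point["state"]])
    return buf.getvalue()

# Fabricated sample response for illustration only:
sample = {"name": "Aussentemp",
          "data": [{"time": 1546300800000, "state": "3.5"},
                   {"time": 1546304400000, "state": "3.1"}]}
print(persistence_json_to_csv(sample))
```

In a real script you would fetch the payload with `urlopen` from the persistence endpoint and pass the parsed JSON to this function instead of the sample dict.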

Thanks rlkoshak

So far the best hint (at least for me).
Is it possible to query more than one item name in a single request?
In which format is the time displayed in the JSON output?


If it’s formatted it’s probably ISO 8601. If it’s just a big number it’s probably epoch.
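Either case is straightforward to handle in Python. A small sketch that accepts both, assuming the epoch values are in milliseconds (which is what rrd4j via the OH REST API appears to return; check your own output):

```python
from datetime import datetime, timezone

def parse_time(value):
    """Accept either an epoch-milliseconds number or an ISO 8601 string."""
    if isinstance(value, (int, float)):
        # Epoch milliseconds -> aware UTC datetime
        return datetime.fromtimestamp(value / 1000, tz=timezone.utc)
    # ISO 8601 string (Python 3.7+)
    return datetime.fromisoformat(value)

print(parse_time(1546300800000))
print(parse_time("2019-01-01T00:00:00+00:00"))
```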

Thanks rlkoshak,

after some hours of coding I wrote a Python script which does the job!


Hi @Homer-Sim,

It would be nice if you could share your script as I’m looking for a solution to export power consumption data to CSV.

Sorry for my late reply.


###  Order of the data:
###  Date / outside temperature / ground collector temperature / heating energy / hot water energy /

import json
import csv
import pandas
from datetime import date, timedelta
from urllib.request import urlopen
import os

heute = date.today()                    # "today"; assignment reconstructed from the imports
gestern = heute - timedelta(days=1)     # "yesterday"; assignment reconstructed
all_filenames = []
jahr = gestern.year

if gestern.month < 10:
    monat = "0" + str(gestern.month)
else:
    monat = str(gestern.month)

folderadd = str(jahr) + "-" + str(monat) +"/"

### URLs for the heating data
### (the actual query URLs were stripped from the post; the second list entry
### would hold something like a /rest/persistence/items/<item> URL)
URL1 = ["Aussentemp", ""]
URL2 = ["Erdkollektortemp", ""]
URL3 = ["Heizenergie", ""]
URL4 = ["WWenergie", ""]
URL5 = ["Gesamtstrom", ""]

addend = str(gestern) + '&endtime=' + str(heute)
URLS = [URL1[0], URL1[1]+addend, URL2[0], URL2[1]+addend, URL3[0], URL3[1]+addend, URL4[0], URL4[1]+addend, URL5[0], URL5[1]+addend]
TEMP_PATH = "/home/homer/heizungsloggen/temp/"
SAVE_PATH = "/home/homer/heizungsloggen/logs/" + folderadd
SAFETY_PATH = "/media/daten/DOKUMENTE/LEITZORDNER/HAUS/00Heizungslogs/" + folderadd

def read_json(url):
    data = json.loads(urlopen(url).read())
#    print(data)
    return data

def write_csv(data, filename):
    # Body reconstructed (the original post omitted it). Assumes the rrd4j
    # REST payload looks like {"data": [{"time": ..., "state": ...}]}.
    with open(filename, 'w', newline='') as f:
        writer = csv.writer(f)
        for point in data.get("data", []):
            writer.writerow([point["time"], point["state"]])
    all_filenames.append(filename)
    return all_filenames

def main():
    for i in range(0, 9, 2):
        write_csv(read_json(URLS[i + 1]), TEMP_PATH + str(gestern) + '_' + URLS[i] + '.csv')

if __name__ == '__main__':
    main()
Some imports may be obsolete because I deleted some unimportant code parts.
