Ok, I’m on openHABian on a Raspberry Pi. Rough instructions below. I did some twiddles around the edges with venv, JSON and bash, but the below should get you going. If you need more detail just shout:
-
pip install pyhiveapi
-
clone/copy the example Python script https://github.com/Pyhive/Pyhiveapi/blob/master/examples/pyhiveapi_example_1.py and save it to your openHAB conf/scripts folder
-
edit the script to remove all print statements except the one that prints the temperature, adjusting as necessary. When you run the script it should just print the raw temperature/setpoint/whatever.
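For reference, here’s a minimal sketch of the shape the trimmed-down script should end up with. read_current_temperature() is a hypothetical placeholder for the pyhiveapi login and read calls you keep from the example script (those vary by pyhiveapi version); the important bit is that the script prints the bare value on stdout and nothing else:

#!/usr/bin/env python
# Sketch only: read_current_temperature() is a hypothetical placeholder.
# Substitute the pyhiveapi login + temperature lookup kept from the
# example script and return the reading as a float.
def read_current_temperature():
    return 19.5  # dummy value for illustration

if __name__ == "__main__":
    # print the raw number and nothing else, so openHAB can parse stdout
    print(read_current_temperature())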
-
whitelist the script by adding the full path to /conf/misc/exec.whitelist
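For example, assuming the script is saved as /etc/openhab2/scripts/get-hive.py (adjust to wherever you actually put it), the whitelist entry is just that path on a line of its own:

/etc/openhab2/scripts/get-hive.py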
-
create temperature/setpoint Number Items as required
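For example, a couple of Items matching the rule below might look like this (the names are up to you, the rule just needs to use whatever you pick):

Number CurrentTemp    "Hive Current Temperature [%.1f °C]"
Number TargetSetpoint "Hive Target Setpoint [%.1f °C]"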
-
Create a .rules file with a rule that runs every minute, calls the script via executeCommandLine, formats the response and posts the update to the Item:
rule "Fetch Hive Data"
when
    Time cron "0 0/1 * * * ?" // every 1 min
then
    val String hivedata = executeCommandLine("python /etc/openhab2/scripts/get-hive.py", 60000) // replace with your script path and name
    logInfo("Hive Data", hivedata)
    CurrentTemp.postUpdate(Float::parseFloat(hivedata.trim) as Number) // replace CurrentTemp with your Item name
end
That’s the basics. I actually ended up making the Python script produce a JSON string output containing all the Hive data in one call. I like working in virtualenvs, so I wrote a wrapper shell script to set up and tear down the venv containing pyhiveapi. The rule’s executeCommandLine then called the shell script, and the rule parsed the JSON into individual Item updates using the JSONPATH transformation add-on.
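As a rough sketch of that JSON variant: assuming the wrapper script prints something like {"temperature": 19.5, "target": 21.0} (the field names and script path below are made up, match whatever yours emits) and the JSONPATH transformation add-on is installed, the parsing part of the rule looks roughly like this:

val String hivedata = executeCommandLine("/etc/openhab2/scripts/get-hive.sh", 60000) // hypothetical wrapper script path
CurrentTemp.postUpdate(transform("JSONPATH", "$.temperature", hivedata))
TargetSetpoint.postUpdate(transform("JSONPATH", "$.target", hivedata))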
With any luck the proper binding can be updated easily to use the new endpoint and its Authorization: Bearer token authentication.