New binding for Centrica Hive / British Gas Hive with Multizone Heating and Radiator Valve support

Hi James, thanks for trying out my binding. It won't be you doing something wrong; it will be a problem with my binding.
Centrica don't publish full API documentation, just some auto-generated material, so I've been semi-reverse-engineering it and have obviously made a wrong assumption somewhere, which caused this bug.
I'm guessing you have some non-heating devices on your Hive account which I'm parsing incorrectly.

Do you mind letting me see your data that the Hive API returns for your account?
If you map your account thing's dump_nodes channel to a Switch item and click it, you will get a file in <openhab_home>/userdata that I can look at to see what the Hive API is actually returning for you.

N.B. the "anon.json" file will have all your UUIDs replaced with random ones, but I will be able to see things like the names of your devices, your schedule, etc. (there should be no sensitive info like your address).
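For the curious, the anonymisation step can be sketched roughly like this. This is an illustrative Python sketch only, not the binding's actual code (which is Java): every UUID in the dump is replaced consistently with a freshly generated one, so device names and schedules survive but the identifiers become meaningless.

```python
# Illustrative sketch only -- the binding itself is written in Java; this
# just demonstrates the kind of anonymisation described above.
import re
import uuid

# Matches standard 8-4-4-4-12 hex UUIDs.
UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
    re.IGNORECASE,
)

def anonymise(text):
    """Replace every UUID with a random one, consistently within the text."""
    mapping = {}

    def repl(match):
        # The same original UUID always maps to the same replacement,
        # so cross-references between nodes stay intact.
        return mapping.setdefault(match.group(0).lower(), str(uuid.uuid4()))

    return UUID_RE.sub(repl, text)
```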

If you send that file to me in a private message it will be much easier for me to diagnose what I am doing wrong.


Afraid I can’t send you a private message yet… I have however placed the file on a webserver for you to grab!
http://crossingzebras.com/hive-nodes-anon.json

Hopefully something is glaringly obvious to you!

Ok, I’ve downloaded it so you can take that file down now.

I’ve spotted at least one thing that is different. It’s a bit weird because I can’t think of a good reason for the difference. Perhaps it is something to do with you having the older hub (NANO1 vs my NANO2).


I’ve made a quick hotfix release for you to test. I will not make a “real” release until I’ve had some time to properly test it.


Sorry, much the same result there.

16:14:19.908 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:3ba1ac2a' changed from ONLINE to REMOVING
16:14:19.912 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:3ba1ac2a' changed from REMOVING to REMOVED
16:14:19.917 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:3ba1ac2a' changed from REMOVED to UNINITIALIZED
16:14:19.924 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:3ba1ac2a' changed from UNINITIALIZED to UNINITIALIZED (HANDLER_MISSING_ERROR)
16:14:19.927 [INFO ] [ome.event.ItemChannelLinkRemovedEvent] - Link 'hive_account_3ba1ac2a_dump_nodes => hive:account:3ba1ac2a:dump_nodes' has been removed.
16:14:36.551 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:fbe0418b' changed from UNINITIALIZED to INITIALIZING
16:14:36.554 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:fbe0418b' changed from INITIALIZING to UNKNOWN (CONFIGURATION_PENDING)
16:14:36.828 [INFO ] [ome.event.ThingStatusInfoChangedEvent] - 'hive:account:fbe0418b' changed from UNKNOWN (CONFIGURATION_PENDING) to ONLINE
16:14:36.946 [DEBUG] [nal.handler.DefaultHiveAccountHandler] - Polling failed with exception
org.openhab.binding.hive.internal.client.exception.HiveClientResponseException: FeatureAttribute is unexpectedly null
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.getReadOnlyFromDtoWithAdapter(FeatureAttributeFactory.java:78) ~[?:?]
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.getReadOnlyFromDto(FeatureAttributeFactory.java:70) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getZigbeeDeviceFeatureFromDto(DefaultNodeRepository.java:285) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.parseNodesDto(DefaultNodeRepository.java:547) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getAllNodes(DefaultNodeRepository.java:583) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.makeAuthenticatedApiCall(DefaultHiveClient.java:112) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.getAllNodes(DefaultHiveClient.java:144) ~[?:?]
        at org.openhab.binding.hive.internal.handler.DefaultHiveAccountHandler.poll(DefaultHiveAccountHandler.java:281) ~[?:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_252]
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_252]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]

Let me know if you want any more info. Happy to share whatever you think would be useful.

Note: I removed and re-added the account; the error was there both before and after the change.

This is a different problem, but yet again caused by the NANO1 hub reporting things slightly differently for some reason.

Here’s another hotfix to try:


Now it seems to be grumbling about the battery, I guess. See below.

2020-05-08 17:22:56.015 [DEBUG] [al.handler.DefaultHiveAccountHandler] - Polling failed with exception
org.openhab.binding.hive.internal.client.exception.HiveClientResponseException: FeatureAttribute is unexpectedly null
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.getReadOnlyFromDtoWithAdapter(FeatureAttributeFactory.java:78) ~[?:?]
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.getReadOnlyFromDto(FeatureAttributeFactory.java:70) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getBatteryDeviceFeatureFromDto(DefaultNodeRepository.java:96) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.parseNodesDto(DefaultNodeRepository.java:534) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getAllNodes(DefaultNodeRepository.java:583) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.makeAuthenticatedApiCall(DefaultHiveClient.java:112) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.getAllNodes(DefaultHiveClient.java:144) ~[?:?]
        at org.openhab.binding.hive.internal.handler.DefaultHiveAccountHandler.poll(DefaultHiveAccountHandler.java:281) ~[?:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_252]
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_252]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]

Hi James,

I’ve got another hotfix for you to try out:

I’ve tried this against the JSON file you sent me so hopefully you should at least not have any issues on the parsing side like you did before.

I don't seem to have broken anything in my refactor, but I've not got around to writing a full suite of tests yet, so it may still be buggy.


Funny, I was actually working through getting Eclipse set up and am currently trying to figure out how I even start with this! Poking around basic Java is something I am OK with, but quite how this all goes together is going to take me a while to figure out. I haven't even worked out how the hell I can run this, let alone get data from the API and figure out where it's breaking! Anyway, I digress; that's not related to the issue at hand.

Installed the new fix; afraid it's another new error now:

org.openhab.binding.hive.internal.client.exception.HiveClientResponseException: Display value is unexpectedly null.
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.buildFeatureAttribute(FeatureAttributeFactory.java:49) ~[?:?]
        at org.openhab.binding.hive.internal.client.FeatureAttributeFactory.getSettableFromDtoWithAdapter(FeatureAttributeFactory.java:104) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getTransientModeFeatureFromDto(DefaultNodeRepository.java:227) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.parseNodesDto(DefaultNodeRepository.java:541) ~[?:?]
        at org.openhab.binding.hive.internal.client.repository.DefaultNodeRepository.getAllNodes(DefaultNodeRepository.java:590) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.makeAuthenticatedApiCall(DefaultHiveClient.java:112) ~[?:?]
        at org.openhab.binding.hive.internal.client.DefaultHiveClient.getAllNodes(DefaultHiveClient.java:144) ~[?:?]
        at org.openhab.binding.hive.internal.handler.DefaultHiveAccountHandler.poll(DefaultHiveAccountHandler.java:281) ~[?:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_252]
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_252]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_252]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_252]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]

For everyone else’s benefit:

It turns out the NANO1 hub does some weird stuff that my binding does not like:

After some back and forth with James I’ve managed to make a version of the binding that kind of works with the NANO1. I’m not going to publish it as a “release” right now as it is still pretty janky and I don’t personally have a NANO1 hub to test against. If you really want a copy send me a direct message and I’ll give it to you.

Just want to add a post saying thank you; this works perfectly, good job!!!
Hope to see more great things and items (see what I did there).

Hi folks,
I was using this binding as a basis for an integration, but the endpoint used in this binding (https://api.prod.bgchprod.info/omnia/) stopped working around 30/9/2020 (the API and Swagger interface both report "GONE"/"Not supported anymore"). There are posts on other home-automation forums noticing the same problems.

Perhaps the problem is only temporary (although first noticed on 30/9), but the error message seems terminal.

The workaround for me was to change to a Python exec-based binding. The endpoint for this also changed on the same date, from
https://beekeeper.hivehome.com/1.0/global/login
to
https://beekeeper.hivehome.com/1.0/cognito/login

The Python library I used is Pyhiveapi, available on pip. Version 0.20.2 contains the updated URL, confirmed working today. I hope this helps someone out.

Literally found out something related earlier:

https://community.hivehome.com/s/question/0D50J00004z1HKO/hi-does-hive-have-any-external-apis-for-integration-if-so-how-do-i-get-hold-of-the-documentation-regarding-these-thanks

The old original API that this binding is based on is now dead; you have the rough gist of it in your reply.

Anyway, how much effort was it to change to the new binding? If it's simple enough I may go for it, though I have also thrown Ross an email hoping to get this one updated; it's been awesome and I would hate to see it die.

I have very basic needs from the integration (just a single home temperature, setpoint and boiler state), so I moved away from a Hive-specific binding, unfortunately.

I now get these values via a python script based on the pyhiveapi library.

I butchered the example script to return the desired values. I call the Python script using a rule and executeCommandLine.

I can post the setup for this if you would find it useful, but it doesn't appear to control anything other than heating, and it's certainly not plug-and-play like the marketplace bindings, sorry.

It appears there are (at least?) two subdomains for the 'beekeeper' endpoint:
https://beekeeper-uk.hivehome.com/1.0/cognito/
and
https://beekeeper.hivehome.com/1.0/cognito/
Hive is now marketed internationally (with the US version apparently supporting heating/cooling), so it is perhaps pertinent to try to determine the user's preferred endpoint in any new binding version?
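A minimal sketch of how a binding might let the user pick between the two subdomains. The "uk"/"global" region labels here are my own invention purely to illustrate the idea; only the two URLs themselves come from the observations above.

```python
# Sketch of region-aware endpoint selection. The "uk"/"global" keys are
# illustrative labels, not anything Hive documents.
BEEKEEPER_ENDPOINTS = {
    "uk": "https://beekeeper-uk.hivehome.com/1.0/cognito/",
    "global": "https://beekeeper.hivehome.com/1.0/cognito/",
}

def beekeeper_base_url(region="global"):
    """Return the beekeeper base URL for a region, defaulting to global."""
    return BEEKEEPER_ENDPOINTS.get(region, BEEKEEPER_ENDPOINTS["global"])
```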

If it's Python then I am able to butcher it some more. I've done some digging into the API using the my.hivehome.com website and Chrome dev tools, so I am sure I can bodge something together. Happily, all I am doing is monitoring temps from my TRVs and flipping the hot-water control based on a sensor I've got in the hot tank, so it should be simple enough, but not having hot water will mean Mrs M will murder me in the morning!!

If you've got pointers you can post, go for it! Happy for anything that helps.

Reckon mostly we will be on beekeeper-uk; or there must be somewhere to find out!

OK, I'm on openHABian on a Raspberry Pi. Rough instructions below. I did some twiddles around the edges with venv, JSON and bash, but the below should get you going. If you need more detail, just shout:

  1. pip install pyhiveapi

  2. clone/copy the example python script https://github.com/Pyhive/Pyhiveapi/blob/master/examples/pyhiveapi_example_1.py and save to your openhab conf/scripts folder

  3. edit the script to remove all print statements except the one that prints out the temperature; adjust as necessary. When you run the script it should just print the raw temperature/setpoint/whatever.

  4. whitelist the script by adding the full path to /conf/misc/exec.whitelist

  5. create temperature/setpoint Number Items as required

  6. Create a .rules file that runs every minute, calls the script via executeCommandLine, formats the response and posts the update to the Item:

    rule "Fetch Hive Data"
    when
        Time cron "0 0/1 * * * ?" // every 1 min
    then
        val String hivedata = executeCommandLine("python /etc/openhab2/scripts/get-hive.py", 60000) // replace with your script path and name
        logInfo("Hive Data", hivedata)
        CurrentTemp.postUpdate(Float::parseFloat(hivedata.toString) as Number) // replace CurrentTemp with your Item name
    end

That's the basics. I actually ended up making the Python script produce a JSON string output containing all the Hive data in one call. I like working in virtualenvs, so I wrote a wrapper shell script to set up and tear down the venv containing pyhiveapi. The rule's executeCommandLine then called the shell script, and the rule parsed the JSON into individual Item updates using the JSONPATH transformation add-on.
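As a rough illustration, the "everything in one JSON string" version of the script looks something like this. The get_hive_values() function is a stand-in for the real pyhiveapi calls; its field names and values are made up for the sketch.

```python
# Sketch of a get-hive.py that emits all values as one JSON string,
# ready for openHAB's JSONPATH transformation. get_hive_values() is a
# placeholder for the real pyhiveapi calls.
import json

def get_hive_values():
    # Hard-coded illustrative values; in the real script these would
    # come from the Hive API via pyhiveapi.
    return {
        "temperature": 19.5,
        "setpoint": 20.0,
        "boiler_on": True,
    }

if __name__ == "__main__":
    # openHAB's executeCommandLine captures stdout, so print the JSON.
    print(json.dumps(get_hive_values()))
```

In the rule, a JSONPATH transformation (e.g. "$.temperature") then pulls out each field for its Item.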

With any luck the proper binding can be updated easily to the new endpoint and its Authorization: Bearer token authentication.

Nice, that's going to be hugely helpful. Thanks!

Hi Steve,
I'm very keen to try the above, as I have been using the original shell script for over two years.
I managed to modify it to parse smart switches and TRVs from the JSON, so a lot of effort is now not working.
I've never used Python, but I appear to have fallen at the first step.
Attempting to install pyhiveapi on my Raspbian OH Pi 3 using "pip install pyhiveapi", I get an error:

"Could not find a version that satisfies the requirement pyhiveapi (from versions: )
No matching distribution found for pyhiveapi"

Any ideas for me ?

Edit:
I've managed to fix the original script, so I won't be trying the Python route.