MyUplink binding?

The binding already uses all datapoints which are exposed by the API. Therefore I cannot add anything which is not supported. The “old” F and VVM models seem to have only a very limited set of datapoints compared to newer models.

This is because there is one line of code missing in my binding. I will add it. :smiley:
On the other hand I am working on an even more generic way to detect such enum/status channels.

Ok, thanks for confirming what I suspected :face_with_raised_eyebrow:. The data is there, so not supporting these datapoints is a choice by Nibe. I will open a support ticket with Nibe and see what happens.

Thanks for your answers and keep up the good work :muscle:t2:

Hello, I placed the file “org.openhab.binding.myuplink-4.2.0-SNAPSHOT.jar” into the folder /usr/share/openhab/addons and changed ownership to “openhab” (instead of “openhabian”). After a sudo reboot I see the following errors in the log. What could it be?
2024-05-31 16:09:06.772 [ERROR] [Events.Framework ] - FrameworkEvent ERROR
org.osgi.framework.BundleException: Could not resolve module: org.openhab.binding.myuplink [238]
Unresolved requirement: Import-Package: com.google.gson; version="[2.10.0,3.0.0)"

at org.eclipse.osgi.container.Module.start(Module.java:463) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel$2.run(ModuleContainer.java:1847) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor$1$1.execute(EquinoxContainerAdaptor.java:136) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1840) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1783) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.doContainerStartLevel(ModuleContainer.java:1745) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1667) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:345) ~[org.eclipse.osgi-3.18.0.jar:?]

2024-05-31 16:09:06.957 [WARN ] [ty.util.ssl.SslContextFactory.config] - Trusting all certificates configured for Client@2c2076[provider=null,keyStore=null,trustStore=null]
2024-05-31 16:09:06.959 [WARN ] [ty.util.ssl.SslContextFactory.config] - No Client EndPointIdentificationAlgorithm configured for Client@2c2076[provider=null,keyStore=null,trustStore=null]
2024-05-31 16:09:08.001 [INFO ] [io.openhabcloud.internal.CloudClient] - Connected to the openHAB Cloud service (UUID = d6…f6, base URL = http://localhost:8080)
2024-05-31 16:09:11.732 [INFO ] [ab.ui.habpanel.internal.HABPanelTile] - Started HABPanel at /habpanel
2024-05-31 16:09:18.521 [INFO ] [e.automation.internal.RuleEngineImpl] - Rule engine started.
2024-05-31 16:09:19.094 [WARN ] [org.apache.felix.fileinstall ] - Error while starting bundle: file:/usr/share/openhab/addons/org.openhab.binding.myuplink-4.2.0-SNAPSHOT.jar
org.osgi.framework.BundleException: Could not resolve module: org.openhab.binding.myuplink [238]
Unresolved requirement: Import-Package: com.google.gson; version="[2.10.0,3.0.0)"

at org.eclipse.osgi.container.Module.start(Module.java:463) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:445) ~[org.eclipse.osgi-3.18.0.jar:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundle(DirectoryWatcher.java:1260) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startBundles(DirectoryWatcher.java:1233) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.startAllBundles(DirectoryWatcher.java:1221) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.doProcess(DirectoryWatcher.java:515) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.process(DirectoryWatcher.java:365) ~[?:?]
at org.apache.felix.fileinstall.internal.DirectoryWatcher.run(DirectoryWatcher.java:316) ~[?:?]

Looks like the binding version is incompatible with your openHAB version. It should work with 4.2.0.M2.

Hi Alexander,

thanks for your great work - I have already switched to myUplink.

One point regarding the writable values:

In the API the value should be writable:

  {
    "category": "SMO 40",
    "parameterId": "40940",
    "parameterName": "current value",
    "parameterUnit": "DM",
    "writable": true,
    "timestamp": "2024-05-30T10:22:31+00:00",
    "value": 98,
    "strVal": "98DM",
    "smartHomeCategories": [],
    "minValue": -30000,
    "maxValue": 30000,
    "stepValue": 100,
    "enumValues": [],
    "scaleValue": "0.1",
    "zoneId": null
  },
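For reference, here is a minimal sketch (not the binding's actual code) of how a parameter object like the one above could be deserialized with Gson, the JSON library the binding imports. The DevicePoint class and its field names are just illustrative; they simply mirror the JSON keys.

```java
import com.google.gson.Gson;
import java.util.List;

// Hypothetical POJO mirroring the myUplink parameter JSON shown above.
// Field names match the JSON keys, so Gson maps them without annotations.
public class DevicePoint {
    String category;
    String parameterId;
    String parameterName;
    String parameterUnit;
    boolean writable;
    String timestamp;
    double value;
    String strVal;
    List<String> smartHomeCategories;
    double minValue;
    double maxValue;
    double stepValue;
    List<Object> enumValues;
    String scaleValue;
    String zoneId;

    public static void main(String[] args) {
        String json = """
                {
                  "parameterId": "40940",
                  "parameterName": "current value",
                  "parameterUnit": "DM",
                  "writable": true,
                  "value": 98,
                  "enumValues": [],
                  "scaleValue": "0.1"
                }
                """;
        DevicePoint point = new Gson().fromJson(json, DevicePoint.class);
        // Prints: current value (40940) writable=true
        System.out.println(point.parameterName + " (" + point.parameterId + ") writable=" + point.writable);
    }
}
```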

When testing I get errors in the log saying that the channel does not support write access:

2024-05-31 18:05:55.265 [ERROR] [nal.common.AbstractInvocationHandler] - An error occurred while calling method 'ThingHandler.handleCommand()' on 'org.openhab.binding.myuplink.internal.handler.MyUplinkGenericDeviceHandler@672a04a2': channel (40940) does not support write access
java.lang.UnsupportedOperationException: channel (40940) does not support write access
	at org.openhab.binding.myuplink.internal.handler.MyUplinkThingHandler.handleCommand(MyUplinkThingHandler.java:101) ~[?:?]
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
	at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
	at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:147) [bundleFile:?]
	at org.openhab.core.internal.common.InvocationHandlerSync.invoke(InvocationHandlerSync.java:59) [bundleFile:?]
	at jdk.proxy20440.$Proxy20573.handleCommand(Unknown Source) [?:?]
	at org.openhab.core.thing.internal.profiles.ProfileCallbackImpl.handleCommand(ProfileCallbackImpl.java:95) [bundleFile:?]
	at org.openhab.core.thing.internal.profiles.SystemDefaultProfile.onCommandFromItem(SystemDefaultProfile.java:49) [bundleFile:?]
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
	at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
	at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:147) [bundleFile:?]
	at org.openhab.core.internal.common.Invocation.call(Invocation.java:52) [bundleFile:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
	at java.lang.Thread.run(Thread.java:840) [?:?]

That was the problem, now it works!! Thanks!!!

The latest version is no longer compatible with 4.2.0.M2 but works with the latest snapshot version.

The binding now supports any channels that have a number-to-text mapping (enum type). Write access is supported for Switch type channels. This is tested with my setup and works.
Write access should also be possible for those enum-type channels if it is supported by that specific channel in the API. This is untested so far.
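To illustrate the idea (a rough sketch, not the binding's actual implementation): a data point can be treated as an enum/status channel when its enumValues array is non-empty, and the numeric state is then translated to the mapped text. The EnumValue record and the sample values below are assumptions for illustration only.

```java
import java.util.List;

// Sketch only: detect an enum-type channel and map its numeric state to text.
class EnumChannelSketch {

    // Stand-in for an entry of the API's enumValues array (value/text pair).
    record EnumValue(String value, String text) { }

    // A data point is treated as an enum/status channel when it carries enum entries.
    static boolean isEnumChannel(List<EnumValue> enumValues) {
        return enumValues != null && !enumValues.isEmpty();
    }

    // Translate the raw numeric state to its display text, falling back to the raw value.
    static String toText(List<EnumValue> enumValues, String rawValue) {
        return enumValues.stream()
                .filter(e -> e.value().equals(rawValue))
                .map(EnumValue::text)
                .findFirst()
                .orElse(rawValue);
    }

    public static void main(String[] args) {
        // Example mapping only; the real values depend on the device and data point.
        List<EnumValue> priority = List.of(
                new EnumValue("10", "Off"),
                new EnumValue("20", "Hot water"),
                new EnumValue("30", "Heating"));
        System.out.println(isEnumChannel(priority)); // true
        System.out.println(toText(priority, "30"));  // Heating
    }
}
```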

Thanks for the update @AlexF
I upgraded to the latest OH snapshot and added your new JAR.
When I try to add extra hot water I get this in the logs:

2024-06-02 21:08:09.084 [INFO ] [handler.MyUplinkGenericDeviceHandler] - channel '61503' does not support write access - value to set '0.5'
2024-06-02 21:08:09.085 [ERROR] [nal.common.AbstractInvocationHandler] - An error occurred while calling method 'ThingHandler.handleCommand()' on 'org.openhab.binding.myuplink.internal.handler.MyUplinkGenericDeviceHandler@2681d26': channel (61503) does not support write access
java.lang.UnsupportedOperationException: channel (61503) does not support write access
        at org.openhab.binding.myuplink.internal.handler.MyUplinkThingHandler.handleCommand(MyUplinkThingHandler.java:101) ~[?:?]
        at jdk.internal.reflect.GeneratedMethodAccessor210.invoke(Unknown Source) ~[?:?]
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
        at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
        at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:147) [bundleFile:?]
        at org.openhab.core.internal.common.InvocationHandlerSync.invoke(InvocationHandlerSync.java:59) [bundleFile:?]
        at jdk.proxy2253.$Proxy2422.handleCommand(Unknown Source) [?:?]
        at org.openhab.core.thing.internal.profiles.ProfileCallbackImpl.handleCommand(ProfileCallbackImpl.java:95) [bundleFile:?]
        at org.openhab.core.thing.internal.profiles.SystemDefaultProfile.onCommandFromItem(SystemDefaultProfile.java:49) [bundleFile:?]
        at jdk.internal.reflect.GeneratedMethodAccessor209.invoke(Unknown Source) ~[?:?]
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
        at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
        at org.openhab.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:147) [bundleFile:?]
        at org.openhab.core.internal.common.Invocation.call(Invocation.java:52) [bundleFile:?]
        at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
        at java.lang.Thread.run(Thread.java:840) [?:?]

Same on a few other channels I tried… 61500 and 61517 for example.

I have also tested with the updated binding.

Number channels still don't work for me; I get the same errors in the log as MrRusch.
I only have one Switch channel (Temp. Lux) which works when switching (write access).

With the NibeUplink binding it was possible to see whether heating, cooling or hot water production was active. I can’t find any channels with this info in the myUplink binding. I use this status to calculate the COP during heating, cooling or hot water production. Is it true that this info is not available within the myUplink binding?

It depends on your model. The binding detects all datapoints that are exposed by myUplink.
For my VVM320 there is a channel named something like “priority”. That can be used.
But for my VVM320 the heat meters are not updated on a regular basis; in fact this only happens on startup of the VVM320 main unit. I am not sure whether this issue applies to other models as well.
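As a plain-Java sketch of how such a status channel could feed the COP calculation mentioned above (not an openHAB rule; the mode names and sample readings are assumptions): accumulate thermal and electrical energy per operating mode reported by the priority-style channel and compute COP per mode.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch only: accumulate thermal and electrical energy per operating mode as reported
// by a "priority"-style status channel, then compute COP per mode. All names and the
// sample readings below are assumptions for illustration.
class CopPerModeSketch {

    private final Map<String, double[]> energyByMode = new HashMap<>(); // mode -> {thermalWh, electricalWh}

    void addSample(String mode, double thermalWh, double electricalWh) {
        double[] sums = energyByMode.computeIfAbsent(mode, m -> new double[2]);
        sums[0] += thermalWh;
        sums[1] += electricalWh;
    }

    double copFor(String mode) {
        double[] sums = energyByMode.getOrDefault(mode, new double[2]);
        return sums[1] > 0 ? sums[0] / sums[1] : Double.NaN;
    }

    public static void main(String[] args) {
        CopPerModeSketch cop = new CopPerModeSketch();
        cop.addSample("Heating", 3200, 800);
        cop.addSample("Hot water", 2400, 900);
        System.out.println("COP heating:   " + cop.copFor("Heating"));   // 4.0
        System.out.println("COP hot water: " + cop.copFor("Hot water")); // ~2.67
    }
}
```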

One update from my end:

A new firmware for my VVM320 is available which fixes the heat meter issue and also adds a lot of additional channels (435 instead of 79). I will test this, but it looks like I need to adapt the binding, since there are now too many channels which are not categorized and have identical names (many are related to schedules). Currently I do not have any idea how to do this in a generic way…

Looks like the end is nigh for the old NIBE Uplink. They went from “hey your pump is not supported” to “switch now, or lose price tracking” within only a couple of months :grimacing:

I also got an update for my F750; it seems to have activated some of the things in the smart-home-categories. But there are still no smart-home-zones, despite my system being split into two circuits (floor heating on the bottom floor and radiators upstairs, which are handled as two systems on the heat pump). The only difference I see from this is that a smartHomeCategories field is now present for some of the parameters.

I also have a bunch of new parameters, mostly different settings it seems, but the descriptions are unfortunately not very good for many of them.

They do seem to have different parameterIds, so it should at least be possible to handle them in the binding and have separate channels, but making it user-friendly seems very difficult. As of now, users would likely have to manually investigate which exact parameter on the heat pump corresponds to each channel. Hopefully Nibe can provide better descriptions for the parameters; that’s the only solution I can see at the moment.

Update: If the parameters are sorted by parameterId, the ones belonging to the same schedule are at least grouped together. First comes an ‘activated’ parameter for each of the schedule types, followed by the start time for each day of the week (in reverse order), the end times, and finally the setting for each day. The setting can be used to determine what the schedule is supposed to configure, which could be used to categorize and group them. There also seem to be two schedules for each setting (which matches what can be configured on my heat pump).

The different settings I have for the schedules are: Comfort mode (for hot water), Fan speed, heating curve offset (some with a configurable heating system, some without), and blocking (compressor, electric addition, or both). Then there is one additional schedule that doesn’t follow this pattern, so I have no idea what it does.
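If the pattern described in the update above holds, a generic grouping pass could look roughly like the sketch below. The assumed block layout (one ‘activated’ parameter, then 7 start times, 7 end times and 7 per-day settings per schedule) is only my reading of that observation, not documented API behaviour, and the Param record is a stand-in, not the binding's class.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch only: chunk schedule-related parameters (sorted by parameterId) into blocks
// of one schedule each. The block size of 1 + 7 + 7 + 7 = 22 parameters is an assumption
// derived from the pattern described above.
class ScheduleGroupingSketch {

    record Param(int parameterId, String parameterName) { }

    static final int BLOCK_SIZE = 1 + 7 + 7 + 7;

    static List<List<Param>> groupSchedules(List<Param> scheduleParams) {
        List<Param> sorted = new ArrayList<>(scheduleParams);
        sorted.sort(Comparator.comparingInt(Param::parameterId));

        List<List<Param>> schedules = new ArrayList<>();
        for (int start = 0; start + BLOCK_SIZE <= sorted.size(); start += BLOCK_SIZE) {
            schedules.add(List.copyOf(sorted.subList(start, start + BLOCK_SIZE)));
        }
        return schedules;
    }
}
```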