openHAB Docker | Upgrade 3.2 to 3.3

Can you share your docker compose ? I’m using portainer instead, but I don’t have any issues updating, and I’ve been using docker since… 2.5?

It should work just like you expect. You pull down the new image and start a new container using that image.

There is a start script in the container called entrypoint.sh which gets run when the container starts. It checks whether the config mounted into the container is for a different version than what's running in the container, and if it is, it performs the same steps that apt or yum would do:

  • create a backup (see $OH_USERDATA/backup)
  • replace the files in $OH_USERDATA/etc
  • clear the cache
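The version check the entrypoint performs can be sketched roughly like this. This is a simplified stand-in, not the actual script from the image; the `demo-userdata` fixture paths and the `version.properties` line format are illustrative assumptions:

```shell
#!/bin/sh
# Simplified sketch of the upgrade check the container's entrypoint performs.
# The demo-userdata fixture stands in for a mounted 3.2 userdata volume.
set -e
OH_USERDATA=./demo-userdata
IMAGE_VERSION="3.3.0"                     # version baked into the image

# --- fixture: what a mounted 3.2 userdata volume might look like ---
mkdir -p "$OH_USERDATA/etc" "$OH_USERDATA/cache" "$OH_USERDATA/tmp"
echo "openhab-distro : 3.2.0" > "$OH_USERDATA/etc/version.properties"
touch "$OH_USERDATA/cache/stale.bundle" "$OH_USERDATA/tmp/stale.tmp"

DATA_VERSION=$(sed -n 's/^openhab-distro *: *//p' "$OH_USERDATA/etc/version.properties")

if [ "$IMAGE_VERSION" != "$DATA_VERSION" ]; then
  echo "Migrating userdata from $DATA_VERSION to $IMAGE_VERSION"
  mkdir -p "$OH_USERDATA/backup"                                   # 1. create a backup
  tar -czf "$OH_USERDATA/backup/userdata-$DATA_VERSION.tar.gz" -C "$OH_USERDATA" etc
  # 2. (the real script also replaces $OH_USERDATA/etc from the fresh image)
  rm -rf "$OH_USERDATA/cache"/* "$OH_USERDATA/tmp"/*               # 3. clear the cache
fi
```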

When you clear the cache, all the add-ons that were installed need to be reinstalled. Reinstallation involves downloading them from the Internet. The error you are seeing is caused by OH not being able to reach the site where the add-ons are made available for download.

When you cleared the cache again it didn’t do anything to help because the cache was already cleared.

Often these errors are intermittent and resolve themselves, because they are caused by an outage on the services that host the add-on artifacts. But you've cut off the end of the error, so I can't tell for certain whether it's a "failed to download" error.

Note, this would occur for any OH installation. It’s not specific to Docker.

If the error persists, you can work around the problem by downloading the add-ons .kar file. It's not part of the Docker image because it would add several hundred MB to the image's size, size that isn't strictly needed because the default for all OH installations is to download add-ons as they are needed instead of having everyone download all 350+ add-ons every time.
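The offline workaround boils down to dropping the .kar into the mounted addons volume and restarting the container. A sketch, where `KAR_URL` is a placeholder (copy the real 3.3.0 link from the download page) and the container name is just an example:

```shell
# Offline add-ons workaround (sketch). KAR_URL is a placeholder — take the
# real openhab-addons kar link from the Linux tab of the download page.
KAR_URL="https://example.org/openhab-addons-3.3.0.kar"
curl -fL -o ./openhab/addons/openhab-addons-3.3.0.kar "$KAR_URL"

# Restart the container so Karaf picks the kar up (container name is an example).
docker restart openhab
```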

As for why you cannot find it? :shrug: It's there on the download page (see the link at the top of this page). It does look like it's not on the Docker tab, but you can find the link on the Linux tab, near the bottom of the page.

I found the .kar file and put it in the addons directory for another try. Unfortunately the issue is still the same: OH 3.3 isn't able to install the needed add-on versions, even though the .kar is present.
I've switched back to OH 3.2.

Running a plain OH 3.3 works. But the plan is just to "upgrade".

I have been using Docker since OH 2.x. Since some time back in 3.0–3.2 I no longer have to empty the tmp / cache folders; it now just works.

All I need to do to upgrade / downgrade is change the image in my docker-compose.yaml, do a docker-compose pull, and docker-compose up -d.
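That flow can be sketched like this, shown against a throwaway compose file so the tag edit is visible; the docker-compose commands themselves need a live Docker daemon, so they're only shown as comments:

```shell
# Upgrade-by-tag sketch: edit the image tag, then pull and recreate.
# compose-demo.yaml stands in for the real docker-compose.yaml.
printf 'services:\n  openhab:\n    image: "openhab/openhab:3.2.0"\n' > compose-demo.yaml
sed -i 's|openhab/openhab:3.2.0|openhab/openhab:3.3.0|' compose-demo.yaml
grep 'image:' compose-demo.yaml      # now points at the 3.3.0 tag

# With the real file you would then run:
# docker-compose pull                # fetch the new image
# docker-compose up -d               # recreate the container from it
```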

I also used to have to restart openHAB after an upgrade; it now restarts by itself.

I’ve never had to touch .kar files unless I was actually installing a third party addon that has been packaged as a .kar file.

I upgrade quite often - usually from snapshot to snapshot, or from snapshot to milestone, or from milestone to snapshot, and sometimes also to release, before going back to snapshot.

I am currently running 3.4-snapshot.

Unless I’m testing a custom built addon, my addons folder is usually empty.

That’s what I also expected from using Docker. Easy to upgrade.

I get this kind of message every time … even though I've put the .kar file into addons.

2022-12-03 13:35:58.945 [ERROR] [core.karaf.internal.FeatureInstaller] - Failed installing 'openhab-package-standard': Error:
	Error downloading mvn:org.eclipse.jetty/jetty-util/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-common/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-io/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-api/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-proxy/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-client/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-client/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-http/9.4.43.v20210629
2022-12-03 13:36:00.175 [ERROR] [core.karaf.internal.FeatureInstaller] - Failed installing 'openhab-binding-hue, openhab-binding-modbus, openhab-persistence-mapdb, openhab-binding-webthing, openhab-binding-amazondashbutton, openhab-misc-openhabcloud, openhab-binding-tplinksmarthome, openhab-transformation-javascript, openhab-persistence-influxdb, openhab-transformation-regex, openhab-ui-habpanel, openhab-transformation-jsonpath, openhab-automation-jsscripting, openhab-binding-shelly, openhab-binding-mqtt, openhab-persistence-rrd4j, openhab-transformation-map, openhab-ui-basic, openhab-binding-smartmeter, openhab-binding-astro, openhab-binding-telegram, openhab-transformation-jinja': Error:
	Error downloading mvn:com.fasterxml.jackson.module/jackson-module-jaxb-annotations/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-core/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-databind/2.12.5
	Error downloading mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-json-provider/2.12.5
	Error downloading mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-base/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-xml/2.12.5
	Error downloading mvn:com.fasterxml.jackson.datatype/jackson-datatype-jsr310/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-annotations/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-cbor/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-yaml/2.12.5


2022-12-03 13:36:58.938 [ERROR] [core.karaf.internal.FeatureInstaller] - Failed installing 'openhab-package-standard': Error:
	Error downloading mvn:org.eclipse.jetty/jetty-util/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-common/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-client/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-client/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-http/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-proxy/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty.websocket/websocket-api/9.4.43.v20210629
	Error downloading mvn:org.eclipse.jetty/jetty-io/9.4.43.v20210629
2022-12-03 13:37:00.163 [ERROR] [core.karaf.internal.FeatureInstaller] - Failed installing 'openhab-binding-hue, openhab-binding-modbus, openhab-persistence-mapdb, openhab-binding-webthing, openhab-binding-amazondashbutton, openhab-misc-openhabcloud, openhab-binding-tplinksmarthome, openhab-transformation-javascript, openhab-persistence-influxdb, openhab-transformation-regex, openhab-ui-habpanel, openhab-transformation-jsonpath, openhab-automation-jsscripting, openhab-binding-shelly, openhab-binding-mqtt, openhab-persistence-rrd4j, openhab-transformation-map, openhab-ui-basic, openhab-binding-smartmeter, openhab-binding-astro, openhab-binding-telegram, openhab-transformation-jinja': Error:
	Error downloading mvn:com.fasterxml.jackson.module/jackson-module-jaxb-annotations/2.12.5
	Error downloading mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-json-provider/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-databind/2.12.5
	Error downloading mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-base/2.12.5
	Error downloading mvn:com.fasterxml.jackson.datatype/jackson-datatype-jsr310/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-xml/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-annotations/2.12.5
	Error downloading mvn:com.fasterxml.jackson.core/jackson-core/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-yaml/2.12.5
	Error downloading mvn:com.fasterxml.jackson.dataformat/jackson-dataformat-cbor/2.12.5

Are you able to access the Internet from inside the container? Does it have a functioning DNS?

Also see if this is relevant: [SOLVED] Openhab not starting - Error downloading mvn:org.openhab.ui.bundles/org.openhab.ui.dashboard/2.5.0.M2 - #27 by thefathefa

Yes. Internet is still available inside the container.

pi@pi4:~/docker $ docker exec -it oh33test /bin/bash 
root@pi4:/openhab# ls -la
total 64
drwxr-xr-x  1 openhab openhab  4096 Jun 27 02:00 .
drwxr-xr-x  1 root    root     4096 Dec  3 16:48 ..
drwxr-xr-x  2 openhab openhab  4096 Jan  2  2022 addons
drwxr-xr-x 14 openhab openhab  4096 Jan  2  2022 conf
drwxr-xr-x  1 openhab openhab  4096 Jun 27 02:00 dist
-rw-r--r--  1 openhab openhab 13430 Jun 27 01:38 LICENSE.TXT
drwxr-xr-x  1 openhab openhab  4096 Jun 27 01:38 runtime
-rwxr-xr-x  1 openhab openhab    73 Jun 27 01:22 start_debug.sh
-rwxr-xr-x  1 openhab openhab   199 Jun 27 01:22 start.sh
drwxr-xr-x 14 openhab openhab  4096 Dec  3 16:48 userdata
root@pi4:/openhab# ping google.de
PING google.de(ams17s08-in-x03.1e100.net (2a00:1450:400e:80e::2003)) 56 data bytes
64 bytes from ams17s08-in-x03.1e100.net (2a00:1450:400e:80e::2003): icmp_seq=1 ttl=120 time=6.49 ms
64 bytes from ams17s08-in-x03.1e100.net (2a00:1450:400e:80e::2003): icmp_seq=2 ttl=120 time=6.08 ms
64 bytes from ams17s08-in-x03.1e100.net (2a00:1450:400e:80e::2003): icmp_seq=3 ttl=120 time=6.06 ms
^C
--- google.de ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2004ms
rtt min/avg/max/mdev = 6.059/6.208/6.488/0.197 ms
root@pi4:/openhab# 

Found another weird situation… Maybe this is the root cause of the failed update?
The Docker image is OH 3.3, but the core in use is from 3.2 … ?

pi@pi4:~ $ docker ps | grep oh33
a619ad61ca48   openhab/openhab:3.3.0           "/entrypoint tini -s…"   12 minutes ago   Up 12 minutes (healthy)                                                                   oh33test
pi@pi4:~ $ docker exec -ti oh33test /openhab/runtime/bin/client
Logging in as openhab
Password:  

                           _   _     _     ____  
   ___   ___   ___   ___  | | | |   / \   | __ ) 
  / _ \ / _ \ / _ \ / _ \ | |_| |  / _ \  |  _ \ 
 | (_) | (_) |  __/| | | ||  _  | / ___ \ | |_) )
  \___/|  __/ \___/|_| |_||_| |_|/_/   \_\|____/ 
       |_|       3.2.0 - Release Build

Use '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
To exit, use '<ctrl-d>' or 'logout'.

openhab> list -s org.openhab.core
START LEVEL 100 , List Threshold: 50
 ID │ State  │ Lvl │ Version │ Symbolic name
────┼────────┼─────┼─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
151 │ Active │  80 │ 3.2.0   │ org.openhab.core
openhab> feature:repo-list
Repository                                         │ URL
───────────────────────────────────────────────────┼────────────────────────────────────────────────────────────────────────────────────────────────────
org.ops4j.pax.web-7.3.23                           │ mvn:org.ops4j.pax.web/pax-web-features/7.3.23/xml/features
distro-3.2.0                                       │ mvn:org.openhab.distro/distro/3.2.0/xml/features
framework-4.3.4                                    │ mvn:org.apache.karaf.features/framework/4.3.4/xml/features
standard-4.3.4                                     │ mvn:org.apache.karaf.features/standard/4.3.4/xml/features
org.openhab.binding.e3dc-3.2.0                     │ mvn:org.openhab.addons.bundles/org.openhab.binding.e3dc/3.2.0/xml/features
org.openhab.core.features.karaf.openhab-core-3.2.0 │ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/3.2.0/xml/features
org.openhab.core.features.karaf.openhab-tp-3.2.0   │ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-tp/3.2.0/xml/features
org.smarthomej.binding.amazonechocontrol-3.2.9     │ mvn:org.smarthomej.addons.bundles/org.smarthomej.binding.amazonechocontrol/3.2.9/xml/features
openhab-addons-3.2.0                               │ mvn:org.openhab.distro/openhab-addons/3.2.0/xml/features
openhab>                                                                                                                                                                                                                                                                    

I have now set up a new plain Docker OH 3.3.0 and it's the same … the core is OH 3.2. Is this known? Is this normal? I'd expect an OH 3.3 core, of course.

I also tried OH 3.4.0, with the same OH 3.2 core. That's a bit weird to me …

Has anyone had the same issue?

Check your envs.
You probably have one there stating the version.
If it says 3.2, then that's your issue. Delete it and try again.
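A quick way to check is to dump the container's environment from the host. This needs the container running; the container name is the one from this thread, and there's no guaranteed variable name to look for — anything version-like counts:

```shell
# Dump the container's environment and look for anything version-related.
docker inspect -f '{{range .Config.Env}}{{println .}}{{end}}' oh33test | grep -i version
```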

Where can I do this? Remember, the Docker container should be independent of the others. And the Docker image is set correctly …

Are you using Portainer?

I can do that. But I use docker-compose.

Created a new plain test … all directories were created by Docker. I didn't have any addons / conf / userdata folders set up for the last test.

$ cat docker-compose.yml 
version: '3'

services:

  openhab33:
#    image: openhab/openhab:latest
    image: openhab/openhab:3.3.0
    container_name: oh33test2
    restart: unless-stopped
    network_mode: host
    cap_add:
      - NET_ADMIN
      - NET_RAW
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /etc/timezone:/etc/timezone:ro
      - ./openhab/addons:/openhab/addons
      - ./openhab/conf:/openhab/conf
      - ./openhab/userdata:/openhab/userdata
    environment:
      OPENHAB_HTTP_PORT: "8082"
      OPENHAB_HTTPS_PORT: "8445"
      USER_ID: "9001"
      GROUP_ID: "9001"
      EXTRA_JAVA_OPTS: "-Duser.timezone=Europe/Berlin"
    # The command node is very important. It overrides
    # the "gosu openhab tini -s ./start.sh" command from the Dockerfile and runs as root!
    command: "tini -s ./start.sh server"

The OH 3.3 image is used, but OH 3.2 is inside?

pi@pi4:~ $ docker ps | grep test2
17bbb5cedf5d   openhab/openhab:3.3.0           "/entrypoint tini -s…"   2 hours ago    Up 2 hours (healthy)                                                                    oh33test2

pi@pi4:~ $ docker exec -ti oh33test2 /openhab/runtime/bin/client
Logging in as openhab
Password:  

                           _   _     _     ____  
   ___   ___   ___   ___  | | | |   / \   | __ ) 
  / _ \ / _ \ / _ \ / _ \ | |_| |  / _ \  |  _ \ 
 | (_) | (_) |  __/| | | ||  _  | / ___ \ | |_) )
  \___/|  __/ \___/|_| |_||_| |_|/_/   \_\|____/ 
       |_|       3.2.0 - Release Build

Use '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
To exit, use '<ctrl-d>' or 'logout'.

openhab> list -s org.openhab.core
START LEVEL 100 , List Threshold: 50
 ID │ State  │ Lvl │ Version │ Symbolic name
────┼────────┼─────┼─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
151 │ Active │  80 │ 3.2.0   │ org.openhab.core

Something is wrong here. Are you 100% positive that you are not still running the 3.2 container somewhere?

The only way you could be running the 3.3 image and have it tell you it's 3.2 when you log in to the karaf console is if you are really running the 3.2 image after all, or the upgrade failed when starting the 3.3 image and userdata/etc/version.properties didn't get replaced. But if that were the case, the bundle would still say 3.3 even if the login banner says 3.2.

So either something is seriously wrong with the official Docker image, which is unlikely or else others would be complaining about the same problem, or you are actually running the 3.2 image somehow.

Note, if OH 3.2 is still running it will have grabbed port 8101 (the karaf ssh port). And since both would be running with net=host, even if you are inside the 3.3 container, you'll be connecting to the 3.2 instance since it will have grabbed the ssh port first.

If you also change the container name with every version upgrade it will probably not remove the old container.

After you’ve pulled the new image, did you do docker-compose up -d ? Without this, it will keep using the old image.

During my current test another container is running OH 3.2; it's used for the current production version.

But that should be independent, because all Docker containers are independent of each other, right? Btw, I also use different folders and different docker-compose files.

And … in the first attempt I tried upgrading all at once: kill the old container, use only the new one. But that's where the trouble started … :smiley:

Now I'm trying to find out what the reason might be. Therefore I use a different setup for each test.

Different "projects" … different docker-compose files. I change the container name so I don't affect the old / other container.

Sure, have a look at the output. The OH 3.3 image has been used.

Not when you are using net=host:

That makes it so that there is no network isolation between the container and the host. And if you have two containers running with net=host, that means there's no network isolation between those containers either.

Since both use port 8101 to ssh to the karaf console, whichever one comes up first will work and the second one will fail. You should be seeing "Bind" exceptions in the OH 3.3 instance's logs when it attempts to bind to the ssh port (8101), the LSP port, and the broadcast port, because your OH 3.2 instance is already bound to those ports.
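With net=host you can verify this directly on the host: check who actually holds the karaf ssh port (8101 by default) and whether the second instance logged a bind failure. Both commands need the containers running; the container name is the one from this thread:

```shell
# Which process holds the karaf ssh port on the host?
ss -tlnp 'sport = :8101'

# Did the 3.3 instance fail to bind? (container name from this thread)
docker logs oh33test 2>&1 | grep -i bind
```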

Ok. You are right. I tried 3.3.0 and 3.4.0.M5 in separate containers without net=host and it seems to work. I will duplicate my OH 3.2 to a new container to test the automatic upgrade.

Additionally I found that Shelly Plus devices are only supported starting with 3.4.0.M5. I need to check when OH 3.4 will officially be released.

I still need to use net=host for the Amazon Dash Buttons, though. But I guess for the moment I will try this standalone.

A couple of weeks from now.
