Ecovacs Vacuum Cleaners Binding [3.2.0;4.0.0)

Will do. I’ve asked core maintainers about this in my PR, because I’m far from being an OSGi expert.


Hey,

thanks for your great work. So far I've been using the sucks fork via MQTT, and it crashes from time to time, which is really annoying.

So I installed the binding, added my credentials, and my Deebot 950 was super easy to configure.

But I'm experiencing some trouble :confused:

I configured and tested everything (it worked fine), then changed my rule that starts cleaning at 8 am
to use the new binding, but nothing happened.

After suspending and resuming the vacuum thing it worked again.
So I grepped through the log files and found something interesting:
At 3:11 am another binding (DeutscheBahn Timetable) went offline and came back online shortly afterwards.
Yep, the good old 24h DSL reconnect - unbelievable in 2022, but here it is.
I haven't checked the source yet (but I plan to review it); you wrote something
about sockets for XMPP listening.

This may be a plausible explanation for this behaviour.

So maybe some kind of "is connection established" check is required.

What do you think?

  • Sönke

I’m a bit confused: You say you have a Deebot 950, but that one uses MQTT, not XMPP?
What do you mean by ‘is connection established’? The binding already listens for MQTT disconnection events and reconnects in that case. The MQTT library also should continuously issue ping messages to check server connectivity. Additionally, MQTT disconnections should only affect updating some state channels, not issuing commands (as those are sent via a REST API) … this is different from the XMPP case where commands are sent over the XMPP connection.
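For illustration, this is a minimal sketch of the general disconnect/keepalive/re-subscribe pattern. It uses the Eclipse Paho client with placeholder broker and topic names — not necessarily the library or topics the binding actually uses:

import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
import org.eclipse.paho.client.mqttv3.MqttCallbackExtended;
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class MqttReconnectSketch {
    public static void main(String[] args) throws MqttException {
        // Placeholder broker URL and client id, just to show the pattern.
        MqttClient client = new MqttClient("ssl://mqtt.example.invalid:8883", "openhab-ecovacs-demo");

        MqttConnectOptions options = new MqttConnectOptions();
        options.setAutomaticReconnect(true); // library-level reconnect after a dropped link
        options.setKeepAliveInterval(60);    // periodic pings so a dead connection is detected

        client.setCallback(new MqttCallbackExtended() {
            @Override
            public void connectComplete(boolean reconnect, String serverURI) {
                // With a clean session, subscriptions are gone after a reconnect,
                // so they have to be re-established here every time.
                try {
                    client.subscribe("iot/atr/#"); // placeholder topic filter
                } catch (MqttException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void connectionLost(Throwable cause) {
                System.out.println("Connection lost: " + cause);
            }

            @Override
            public void messageArrived(String topic, MqttMessage message) {
                System.out.println(topic + ": " + new String(message.getPayload()));
            }

            @Override
            public void deliveryComplete(IMqttDeliveryToken token) {
                // not relevant for a subscribe-only client
            }
        });

        client.connect(options);
    }
}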
That being said, I just tried it myself on my 950, and indeed it seems there’s something off there (the event listener didn’t notice the water plate being attached/detached) … I’ll investigate that, but it’ll take some time as it obviously doesn’t trigger immediately.

Hey,

sorry for the confusion:

Previously I used this docker-container, which connects to the Ecovacs API and passes control commands and status updates to/from openHAB via MQTT.

I didn't care about the communication between the Ecovacs API and the robot (until now).
So I don't know the implementation details yet (which API is used for which action);
I've just observed that after a reconnection no more status updates are received and commands don't work
until the device is disabled and re-enabled.

‘is connection established’ - yes, that's the question. I don't have any idea on that, since I haven't yet
checked the implementation details. I'll try to find some time to look at the source and will let you know whether I've got an idea.

Thanks for your support!


Hi there,
First of all, thanks a lot for the binding.
I have installed it and the account can go online, but when I try to add a Deebot, it shows the following error and gets disconnected:

2022-03-09 10:05:28.287 [INFO ] [ab.event.ThingStatusInfoChangedEvent] - Thing 'ecovacs:vacuum:464077d55c:E0001076117607100181' changed from INITIALIZING to ONLINE
2022-03-09 10:05:28.290 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetBatteryInfo, retry 0
2022-03-09 10:05:28.419 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="94871143" ret="ok"><battery power="100"/></ctl>
2022-03-09 10:05:28.431 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetChargeState, retry 0
2022-03-09 10:05:28.558 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="51893249" ret="ok"><charge type="SlotCharging"/></ctl>
2022-03-09 10:05:28.569 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanState, retry 0
2022-03-09 10:05:28.740 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="48727702" ret="ok"><clean speed="standard" st="h" t="100" a="100"/></ctl>
2022-03-09 10:05:28.750 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetWaterBoxInfo, retry 0
2022-03-09 10:05:28.881 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="51791744" ret="ok" on="1"/>
2022-03-09 10:05:28.890 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetError, retry 0
2022-03-09 10:05:29.020 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="96222439" ret="ok"/>
2022-03-09 10:05:29.036 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Scheduling next poll in 0s, refresh interval 5min
2022-03-09 10:05:29.039 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Polling data
2022-03-09 10:05:29.043 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanSum, retry 0
2022-03-09 10:05:29.210 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="63016501" ret="ok" a="000000748" l="000096420" c="000000143"/>
2022-03-09 10:05:29.223 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanLogs, retry 0
2022-03-09 10:05:29.402 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: Received command response XML <ctl id="77809718" ret="ok"><CleanSt a="007" s="1646548201" l="0495" t="a" f="a"/><CleanSt a="008" s="1646548719" l="0710" t="a" f="a"/><CleanSt a="011" s="1646706614" l="1302" t="a" f="a"/><CleanSt a="007" s="1645842605" l="0670" t="a" f="a"/><CleanSt a="003" s="1645843403" l="0411" t="a" f="a"/><CleanSt a="012" s="1646015405" l="1317" t="a" f="a"/><CleanSt a="000" s="1646016746" l="0051" t="a" f="a"/><CleanSt a="007" s="1646188198" l="0782" t="a" f="a"/><CleanSt a="010" s="1646189816" l="1578" t="a" f="a"/><CleanSt a="009" s="1646360995" l="0989" t="a" f="a"/></ctl>
2022-03-09 10:05:29.414 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanSpeed, retry 0
2022-03-09 10:05:30.922 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanSpeed, retry 1
2022-03-09 10:05:32.428 [TRACE] [.internal.api.impl.EcovacsXmppDevice] - E0001076117607100181: sending command GetCleanSpeed, retry 2
2022-03-09 10:05:33.930 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Failed communicating to device, reconnecting
org.openhab.binding.ecovacs.internal.api.EcovacsApiException: No response for command GetCleanSpeed
	at org.openhab.binding.ecovacs.internal.api.impl.EcovacsXmppDevice.sendCommand(EcovacsXmppDevice.java:154) ~[?:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.lambda$12(EcovacsVacuumHandler.java:555) ~[?:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.doWithDevice(EcovacsVacuumHandler.java:702) ~[?:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.pollData(EcovacsVacuumHandler.java:519) ~[?:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:829) [?:?]
2022-03-09 10:05:33.936 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Data polling completed
2022-03-09 10:05:33.937 [INFO ] [ab.event.ThingStatusInfoChangedEvent] - Thing 'ecovacs:vacuum:464077d55c:E0001076117607100181' changed from ONLINE to OFFLINE (COMMUNICATION_ERROR)

FYI, my Deebot model is an OZMO 610.

Thanks.

Does the app allow clean speed control for your model?

No clean speed control.
Thanks.

@phui Should be fixed in the new release I just uploaded.

@soenke That issue should hopefully be fixed now as well. If the connection still breaks after some time, please enable trace logging (log:set TRACE org.openhab.binding.ecovacs in the Karaf console), restart (disable + enable) the vacuum thing, wait until it breaks again and then send me the log.
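For reference, that command in the openHAB (Karaf) console looks like this; the TRACE output then ends up in openhab.log like the excerpts above:

openhab> log:set TRACE org.openhab.binding.ecovacs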

It works perfectly now !!!
Thanks a lot !!! :slight_smile:

@maniac103 Works fine since your update now! Thanks!

Just tried updating to 3.3 Milestone 3 and the binding does not work: the Ecovacs account can go online, but my OZMO 610 can't. Falling back to 3.3 Milestone 2, it works perfectly again.
Thanks.

Please share the log.

Edit: I just tried with a completely fresh OH 3.3 M3 install and installed the binding from the Marketplace. It worked totally fine with my OZMO 950 (which is the only device I have), so either something’s off in your installation or something’s off regarding XMPP. Either way, I’d need to see a log file.

here you go:

2022-04-10 10:08:57.569 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Failed communicating to device, reconnecting
org.openhab.binding.ecovacs.internal.api.EcovacsApiException: org.jivesoftware.smack.SmackException: No supported and enabled SASL Mechanism provided by server. Server announced mechanisms: [PLAIN]. Registered SASL mechanisms with Smack: [SASL Mech: SCRAM-SHA-1-PLUS, Prio: 100, SASL Mech: SCRAM-SHA-1, Prio: 110, SASL Mech: X-OAUTH2, Prio: 410, SASL Mech: ANONYMOUS, Prio: 500]. Enabled SASL mechanisms for this connection: null. Blacklisted SASL mechanisms: [SCRAM-SHA-1-PLUS].
	at org.openhab.binding.ecovacs.internal.api.impl.EcovacsXmppDevice.connect(EcovacsXmppDevice.java:233) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.lambda$11(EcovacsVacuumHandler.java:484) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.doWithDevice(EcovacsVacuumHandler.java:680) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.connectToDevice(EcovacsVacuumHandler.java:483) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.api.util.SchedulerTask.run(SchedulerTask.java:82) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: org.jivesoftware.smack.SmackException: No supported and enabled SASL Mechanism provided by server. Server announced mechanisms: [PLAIN]. Registered SASL mechanisms with Smack: [SASL Mech: SCRAM-SHA-1-PLUS, Prio: 100, SASL Mech: SCRAM-SHA-1, Prio: 110, SASL Mech: X-OAUTH2, Prio: 410, SASL Mech: ANONYMOUS, Prio: 500]. Enabled SASL mechanisms for this connection: null. Blacklisted SASL mechanisms: [SCRAM-SHA-1-PLUS].
	at org.jivesoftware.smack.SASLAuthentication.selectMechanism(SASLAuthentication.java:361) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.SASLAuthentication.authenticate(SASLAuthentication.java:192) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.tcp.XMPPTCPConnection.loginInternal(XMPPTCPConnection.java:402) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.AbstractXMPPConnection.login(AbstractXMPPConnection.java:528) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.AbstractXMPPConnection.login(AbstractXMPPConnection.java:485) ~[bundleFile:4.3.3]
	at org.openhab.binding.ecovacs.internal.api.impl.EcovacsXmppDevice.connect(EcovacsXmppDevice.java:221) ~[bundleFile:?]
	... 10 more
2022-04-10 10:08:59.895 [DEBUG] [nternal.handler.EcovacsVacuumHandler] - E0001076117607100181: Failed communicating to device, reconnecting
org.openhab.binding.ecovacs.internal.api.EcovacsApiException: org.jivesoftware.smack.SmackException$NoResponseException: No response received within reply timeout. Timeout was 5000ms (~5s). While waiting for establishing TLS
	at org.openhab.binding.ecovacs.internal.api.impl.EcovacsXmppDevice.connect(EcovacsXmppDevice.java:233) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.lambda$11(EcovacsVacuumHandler.java:484) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.doWithDevice(EcovacsVacuumHandler.java:680) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.handler.EcovacsVacuumHandler.connectToDevice(EcovacsVacuumHandler.java:483) ~[bundleFile:?]
	at org.openhab.binding.ecovacs.internal.api.util.SchedulerTask.run(SchedulerTask.java:82) [bundleFile:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: org.jivesoftware.smack.SmackException$NoResponseException: No response received within reply timeout. Timeout was 5000ms (~5s). While waiting for establishing TLS
	at org.jivesoftware.smack.SmackException$NoResponseException.newWith(SmackException.java:93) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.SynchronizationPoint.checkForResponse(SynchronizationPoint.java:317) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.SynchronizationPoint.checkIfSuccessOrWait(SynchronizationPoint.java:160) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.SynchronizationPoint.checkIfSuccessOrWaitOrThrow(SynchronizationPoint.java:131) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.tcp.XMPPTCPConnection.connectInternal(XMPPTCPConnection.java:944) ~[bundleFile:4.3.3]
	at org.jivesoftware.smack.AbstractXMPPConnection.connect(AbstractXMPPConnection.java:417) ~[bundleFile:4.3.3]
	at org.openhab.binding.ecovacs.internal.api.impl.EcovacsXmppDevice.connect(EcovacsXmppDevice.java:214) ~[bundleFile:?]
	... 10 more

Thanks

Okay, that one is easy: see the solution here. This is a problem with OSGi dependency management which I asked about in my PR, but I haven't received a reply yet.
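For context, the log shows Smack ending up with no usable SASL mechanism even though the server offers PLAIN. A minimal sketch of what has to be in place on the client side (assuming the smack-sasl-provided classes can actually be resolved; host, domain and credentials below are placeholders, not the real values):

import org.jivesoftware.smack.SASLAuthentication;
import org.jivesoftware.smack.sasl.provided.SASLPlainMechanism;
import org.jivesoftware.smack.tcp.XMPPTCPConnection;
import org.jivesoftware.smack.tcp.XMPPTCPConnectionConfiguration;

public class SaslPlainSketch {
    public static void main(String[] args) throws Exception {
        // The server only announces PLAIN, so a PLAIN implementation has to be
        // registered and enabled on the client side. If the bundle providing it is
        // not resolved (the OSGi issue above), Smack is left with no usable mechanism.
        SASLAuthentication.registerSASLMechanism(new SASLPlainMechanism());

        // Placeholder connection data, only to show where PLAIN gets enabled.
        XMPPTCPConnectionConfiguration config = XMPPTCPConnectionConfiguration.builder()
                .setHost("xmpp.example.invalid")
                .setPort(5223)
                .setXmppDomain("example.invalid")
                .setUsernameAndPassword("resource-id", "access-token")
                .addEnabledSaslMechanism("PLAIN")
                .build();

        XMPPTCPConnection connection = new XMPPTCPConnection(config);
        connection.connect().login();
    }
}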

Hey,

I've got some trouble with the binding when restarting my openHAB(ian) system.
After a restart the binding is shown as installed, but the things are uninitialized and have no configuration.
This behaviour is reproducible in my installation.
After removing and reinstalling the binding from the marketplace it works again and the things come back online.

What I've observed is the following error message in openhab.log, which seems to be a problem with resolving some dependencies.

2022-04-13 20:56:05.495 [WARN ] [internal.service.FeaturesServiceImpl] - Can't load features repository mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/3.3.0-SNAPSHOT/xml/features
java.lang.RuntimeException: Error resolving artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT: [Could not find artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT] : mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/3.3.0-SNAPSHOT/xml/features
	at org.apache.karaf.features.internal.service.RepositoryImpl.load(RepositoryImpl.java:121) ~[bundleFile:?]
	at org.apache.karaf.features.internal.service.RepositoryImpl.<init>(RepositoryImpl.java:51) ~[bundleFile:?]
	at org.apache.karaf.features.internal.service.RepositoryCacheImpl.create(RepositoryCacheImpl.java:51) ~[bundleFile:?]
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.getFeatureCache(FeaturesServiceImpl.java:611) [bundleFile:?]
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.ensureCacheLoaded(FeaturesServiceImpl.java:582) [bundleFile:?]
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.listRequiredRepositories(FeaturesServiceImpl.java:514) [bundleFile:?]
	at org.apache.karaf.deployer.features.FeatureDeploymentListener.bundleChanged(FeatureDeploymentListener.java:247) [bundleFile:?]
	at org.apache.karaf.deployer.features.FeatureDeploymentListener.init(FeatureDeploymentListener.java:90) [bundleFile:?]
	at org.apache.karaf.deployer.features.osgi.Activator$DeploymentFinishedListener.deploymentEvent(Activator.java:86) [bundleFile:?]
	at org.apache.karaf.features.internal.service.FeaturesServiceImpl.registerListener(FeaturesServiceImpl.java:296) [bundleFile:?]
	at org.apache.karaf.deployer.features.osgi.Activator.doStart(Activator.java:53) [bundleFile:?]
	at org.apache.karaf.util.tracker.BaseActivator.start(BaseActivator.java:92) [bundleFile:?]
	at org.eclipse.osgi.internal.framework.BundleContextImpl$2.run(BundleContextImpl.java:814) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.internal.framework.BundleContextImpl$2.run(BundleContextImpl.java:1) [org.eclipse.osgi-3.16.300.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
	at org.eclipse.osgi.internal.framework.BundleContextImpl.startActivator(BundleContextImpl.java:806) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.internal.framework.BundleContextImpl.start(BundleContextImpl.java:763) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.internal.framework.EquinoxBundle.startWorker0(EquinoxBundle.java:1028) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.internal.framework.EquinoxBundle$EquinoxModule.startWorker(EquinoxBundle.java:371) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.Module.doStart(Module.java:605) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.Module.start(Module.java:468) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel$2.run(ModuleContainer.java:1849) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor$1$1.execute(EquinoxContainerAdaptor.java:136) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1842) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1785) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.doContainerStartLevel(ModuleContainer.java:1747) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1669) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) [org.eclipse.osgi-3.16.300.jar:?]
	at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:345) [org.eclipse.osgi-3.16.300.jar:?]
Caused by: java.io.IOException: Error resolving artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT: [Could not find artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.configureIOException(AetherBasedResolver.java:803) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:774) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:657) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:598) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:565) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:555) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.Connection.getInputStream(Connection.java:123) ~[?:?]
	at java.net.URL.openStream(URL.java:1140) ~[?:?]
	at org.apache.karaf.features.internal.service.RepositoryImpl.load(RepositoryImpl.java:114) ~[bundleFile:?]
	... 29 more
	Suppressed: shaded.org.eclipse.aether.transfer.ArtifactNotFoundException: Could not find artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:403) ~[?:?]
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:215) ~[?:?]
		at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:192) ~[?:?]
		at shaded.org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveArtifact(DefaultRepositorySystem.java:247) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:767) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:657) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:598) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:565) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:555) ~[?:?]
		at org.ops4j.pax.url.mvn.internal.Connection.getInputStream(Connection.java:123) ~[?:?]
		at java.net.URL.openStream(URL.java:1140) ~[?:?]
		at org.apache.karaf.features.internal.service.RepositoryImpl.load(RepositoryImpl.java:114) ~[bundleFile:?]
		at org.apache.karaf.features.internal.service.RepositoryImpl.<init>(RepositoryImpl.java:51) ~[bundleFile:?]
		at org.apache.karaf.features.internal.service.RepositoryCacheImpl.create(RepositoryCacheImpl.java:51) ~[bundleFile:?]
		at org.apache.karaf.features.internal.service.FeaturesServiceImpl.getFeatureCache(FeaturesServiceImpl.java:611) [bundleFile:?]
		at org.apache.karaf.features.internal.service.FeaturesServiceImpl.ensureCacheLoaded(FeaturesServiceImpl.java:582) [bundleFile:?]
		at org.apache.karaf.features.internal.service.FeaturesServiceImpl.listRequiredRepositories(FeaturesServiceImpl.java:514) [bundleFile:?]
		at org.apache.karaf.deployer.features.FeatureDeploymentListener.bundleChanged(FeatureDeploymentListener.java:247) [bundleFile:?]
		at org.apache.karaf.deployer.features.FeatureDeploymentListener.init(FeatureDeploymentListener.java:90) [bundleFile:?]
		at org.apache.karaf.deployer.features.osgi.Activator$DeploymentFinishedListener.deploymentEvent(Activator.java:86) [bundleFile:?]
		at org.apache.karaf.features.internal.service.FeaturesServiceImpl.registerListener(FeaturesServiceImpl.java:296) [bundleFile:?]
		at org.apache.karaf.deployer.features.osgi.Activator.doStart(Activator.java:53) [bundleFile:?]
		at org.apache.karaf.util.tracker.BaseActivator.start(BaseActivator.java:92) [bundleFile:?]
		at org.eclipse.osgi.internal.framework.BundleContextImpl$2.run(BundleContextImpl.java:814) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.internal.framework.BundleContextImpl$2.run(BundleContextImpl.java:1) [org.eclipse.osgi-3.16.300.jar:?]
		at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
		at org.eclipse.osgi.internal.framework.BundleContextImpl.startActivator(BundleContextImpl.java:806) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.internal.framework.BundleContextImpl.start(BundleContextImpl.java:763) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.internal.framework.EquinoxBundle.startWorker0(EquinoxBundle.java:1028) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.internal.framework.EquinoxBundle$EquinoxModule.startWorker(EquinoxBundle.java:371) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.Module.doStart(Module.java:605) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.Module.start(Module.java:468) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel$2.run(ModuleContainer.java:1849) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor$1$1.execute(EquinoxContainerAdaptor.java:136) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1842) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.incStartLevel(ModuleContainer.java:1785) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.doContainerStartLevel(ModuleContainer.java:1747) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1669) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.dispatchEvent(ModuleContainer.java:1) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) [org.eclipse.osgi-3.16.300.jar:?]
		at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:345) [org.eclipse.osgi-3.16.300.jar:?]
Caused by: shaded.org.eclipse.aether.resolution.ArtifactResolutionException: Error resolving artifact org.openhab.core.features.karaf:org.openhab.core.features.karaf.openhab-core:xml:features:3.3.0-SNAPSHOT
	at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:413) ~[?:?]
	at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:215) ~[?:?]
	at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:192) ~[?:?]
	at shaded.org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveArtifact(DefaultRepositorySystem.java:247) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:767) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:657) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:598) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:565) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.AetherBasedResolver.resolve(AetherBasedResolver.java:555) ~[?:?]
	at org.ops4j.pax.url.mvn.internal.Connection.getInputStream(Connection.java:123) ~[?:?]
	at java.net.URL.openStream(URL.java:1140) ~[?:?]
	at org.apache.karaf.features.internal.service.RepositoryImpl.load(RepositoryImpl.java:114) ~[bundleFile:?]
	... 29 more

When I check the active bundles in the openHAB console, the file bundle is listed as active:

346 ¦ Active   ¦  80 ¦ 4.3.3                 ¦ smack-resolver-javax
350 ¦ Active   ¦  80 ¦ 0                     ¦ wrap_file__var_lib_openhab_tmp_kar_org.openhab.binding.ecovacs-3.3.0-SNAPSHOT_org_lastnpe_eea_eea-all_2.2.1_eea-all-2.2.1.jar

After reinstalling, the "real" bundle is available and active too:

351 ¦ Active   ¦  80 ¦ 4.3.3                 ¦ smack-java7
352 ¦ Active   ¦  80 ¦ 4.3.3                 ¦ smack-resolver-javax
353 ¦ Active   ¦  80 ¦ 3.3.0.202203091202    ¦ openHAB Add-ons :: Bundles :: Ecovacs Binding
356 ¦ Active   ¦  80 ¦ 0                     ¦ wrap_file__var_lib_openhab_tmp_kar_org.openhab.binding.ecovacs-3.3.0-SNAPSHOT_org_lastnpe_eea_eea-all_2.2.1_eea-all-2.2.1.jar

Since I'm not that familiar with the OSGi bundle handling here and this might be a problem with dependencies: can anyone else reproduce this issue,
or is this a more general bug with marketplace bindings? In that case I would open an issue in openhab-core.

Here is my system information:

runtimeInfo:
  version: 3.2.0
  buildString: Release Build
locale: de-DE
systemInfo:
  configFolder: /etc/openhab
  userdataFolder: /var/lib/openhab
  logFolder: /var/log/openhab
  javaVersion: 11.0.10
  javaVendor: Azul Systems, Inc.
  javaVendorVersion: Zulu11.45+27-CA
  osName: Linux
  osVersion: 5.10.103-v7l+
  osArchitecture: arm
  availableProcessors: 4
  freeMemory: 58791288
  totalMemory: 212209664
bindings:
  - ahawastecollection
  - astro
  - avmfritz
  - denonmarantz
  - deutschebahn
  - ecovacs
  - exec
  - heos
  - mqtt
  - netatmo
  - network
  - ntp
  - tankerkoenig
  - tradfri
clientInfo:
  device:
    ios: false
    android: false
    androidChrome: false
    desktop: true
    iphone: false
    ipod: false
    ipad: false
    edge: false
    ie: false
    firefox: false
    macos: false
    windows: true
    cordova: false
    phonegap: false
    electron: false
    nwjs: false
    webView: false
    webview: false
    standalone: false
    os: windows
    pixelRatio: 1.5
    prefersColorScheme: light
  isSecureContext: true
  locationbarVisible: true
  menubarVisible: true
  navigator:
    cookieEnabled: true
    deviceMemory: 8
    hardwareConcurrency: 8
    language: de-DE
    languages:
      - de-DE
      - de
      - en-US
      - en
    onLine: true
    platform: Win32
  screen:
    width: 2560
    height: 1440
    colorDepth: 24
  support:
    touch: false
    pointerEvents: true
    observer: true
    passiveListener: true
    gestures: false
    intersectionObserver: true
  themeOptions:
    dark: light
    filled: true
    pageTransitionAnimation: default
    bars: filled
    homeNavbar: default
    homeBackground: default
    expandableCardAnimation: default
  userAgent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML,
    like Gecko) Chrome/100.0.4896.75 Safari/537.36
timestamp: 2022-04-13T19:12:12.950Z

Hi, great job!
Confirming it is working perfectly with Deebot Ozmo 905 that I have.

Just a question: I see that the channel list includes last-clean#last-clean-map, which I don't get when configuring the Thing. Since the Ecovacs app shows me the map of the last cleaning (or rather of the whole clean history), I'm wondering if it is just an option that needs enabling.
I tried all commands and they are working; for example, I'm able to control which room to clean using the spotArea command.

No, this isn't an option to enable; it depends on a device capability list contained in the binding. I've included the mapping capability now, so it should work with the new release. You'll have to recreate the vacuum thing (once), though, for the missing channels to appear.
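For illustration, the gating works roughly like this (a hypothetical sketch: the capability names, channel IDs other than last-clean#last-clean-map, and the per-model data are simplified assumptions, not the binding's actual list):

import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;
import java.util.Set;

public class CapabilityGateSketch {
    // Hypothetical capability flags; the binding ships a far more detailed per-model list.
    enum Capability { BATTERY, CLEAN_SPEED, MAPPING }

    // Assumed example data: the OZMO 905 entry now includes MAPPING, the OZMO 610 does not.
    static Set<Capability> capabilitiesFor(String model) {
        switch (model) {
            case "DEEBOT OZMO 905":
                return EnumSet.of(Capability.BATTERY, Capability.CLEAN_SPEED, Capability.MAPPING);
            case "DEEBOT OZMO 610":
                return EnumSet.of(Capability.BATTERY);
            default:
                return EnumSet.of(Capability.BATTERY);
        }
    }

    // Channels are only created for capabilities the device has, which is why the thing
    // has to be recreated once after the capability list changes.
    static List<String> channelsFor(String model) {
        Set<Capability> caps = capabilitiesFor(model);
        List<String> channels = new ArrayList<>();
        channels.add("battery#level"); // hypothetical always-present channel
        if (caps.contains(Capability.CLEAN_SPEED)) {
            channels.add("cleaning#speed"); // hypothetical channel id
        }
        if (caps.contains(Capability.MAPPING)) {
            channels.add("last-clean#last-clean-map"); // the channel discussed above
        }
        return channels;
    }

    public static void main(String[] args) {
        System.out.println(channelsFor("DEEBOT OZMO 905"));
        System.out.println(channelsFor("DEEBOT OZMO 610"));
    }
}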

Very good, almost everything is working now.
For the channels last-clean#last-clean-map and last-clean#last-clean-mode I'm getting NULL and UNDEF.
I see data coming:


This is the list of MQTT topics I'm receiving:
iot/atr/BatteryInfo
iot/atr/BigDataCleanInfoReport
iot/atr/ChargeState
iot/atr/CleanedMap
iot/atr/CleanedPos
iot/atr/CleanedTrace
iot/atr/CleanReport
iot/atr/CleanReportServer
iot/atr/CleanSt
iot/atr/MapP
iot/atr/Pos
iot/atr/trace

Is there any other test I can do?
If the full log is required, of course I will upload it.

Clean logs are polled, not updated via MQTT event messages. You can search for ‘cleaning logs’ in the log to find the place where the respective HTTP request is made.
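As a rough sketch of the difference (placeholder call only, not the binding's actual code): the clean-log data is fetched on a schedule like this, while the iot/atr topics above only push live state events.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CleanLogPollingSketch {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Placeholder for the REST request; in a TRACE log it shows up around the
        // 'cleaning logs' message rather than on any iot/atr/... MQTT topic.
        Runnable pollCleanLogs = () ->
                System.out.println("GET clean logs via the Ecovacs REST API (placeholder)");

        // Mirrors the "refresh interval 5min" visible in the trace log earlier in the thread.
        scheduler.scheduleWithFixedDelay(pollCleanLogs, 0, 5, TimeUnit.MINUTES);
    }
}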

I suppose there isn't any map in the polling result then. If so, my test is complete, and the binding is working as designed on my Deebot Ozmo 905.