This should not be broken anymore after the next snapshot is released, since Basic Auth is no longer offered with https://github.com/openhab/openhab-core/pull/1713.
I've been playing with Home Assistant lately. It's all integrated. I didn't even need to SSH in to install the deCONZ binding, and I can even do the config from within Home Assistant and look at the Zigbee map from within Home Assistant. I can even use SSH or edit files from within Home Assistant. I never have to leave Home Assistant. The documentation is a lot better. You feel it's made for users.
Then, looking at openHAB, even version 3, it all looks dated. And the thing I liked the most about OH, Jython… well, the answer is: here is the URL, figure it out yourself. While, some time ago, OH3 was supposed to have the NGRE with Jython as the default language… Very disappointed… Apart from some UI changes, OH3 will not be installed here. I'll keep OH 2.5 or go to Home Assistant…
I am strongly considering moving back to HA but I keep hoping somebody will fork OH.
Yes, Home Assistant looks very good. The quality of the integrations (like bindings in OH) I use is also better. It's sad, I was just getting used to OH…
But looking for solutions is frustrating, because the docs are not updated in several places…
In HA I was able to find info much, much quicker.
Well, I guess I'll buy another Raspberry Pi and use both, and the one that works best, has the best support and documentation, and looks best will be chosen. At the moment HA has the best points. But you never know, OH3 may become much better, though they have a lot of work… also on the bindings. Let's hope for the best. I liked OH, but the way it's going now…
Home Assistant, at least a year ago was not stable and, last I checked a few months ago, the code quality was poor. I tried using their new Z-Wave2MQTT just after one of their main developers released an update. I needed to work around serious errors that prevented the code from even operating.
I just see enough discontent here to think there may be a fork soon that is more user focused.
Googling for the error, it seems that this problem occurs because the value is sometimes an integer and sometimes a decimal, and InfluxDB doesn't accept both in the same series.
But I'm curious about the cause, because the attempt to preserve the type was inherited from the old version, here: org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils#convertBigDecimalToNum
I don't know whether it could be a change in the Z-Wave add-on (did it previously always work in doubles?) or some obscure effect in the new add-on that I will have to investigate.
In any case, can you test whether you still have the problem with the version that always uses doubles?
It would be helpful if you could enter the openHAB console and set the TRACE level for the InfluxDB binding:
log:set TRACE org.openhab.persistence.influxdb
and then post the log entries just prior to the exception.
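For anyone following along, the type conflict described above can be sketched like this. The class and method names below are hypothetical illustrations of the behavior, not the binding's actual code: a type-preserving conversion writes whole numbers as InfluxDB integer fields and decimals as float fields, which conflicts as soon as one series receives both, while the workaround build always writes doubles.

```java
import java.math.BigDecimal;

public class StateConvertSketch {
    // Rough sketch of the old type-preserving behavior: a value with no
    // decimal places is written as an InfluxDB integer field, a decimal
    // value as a float field. If the same series receives both, InfluxDB
    // rejects the write with a FieldTypeConflictException.
    static Object convertPreservingType(BigDecimal value) {
        if (value.scale() == 0) {
            return value.longValue();   // stored as integer field type
        }
        return value.doubleValue();     // stored as float field type
    }

    // The workaround being tested: always write doubles, so a measurement's
    // field type can never flip between integer and float.
    static Object convertAlwaysDouble(BigDecimal value) {
        return value.doubleValue();
    }

    public static void main(String[] args) {
        System.out.println(convertPreservingType(new BigDecimal("24")).getClass().getSimpleName());
        System.out.println(convertPreservingType(new BigDecimal("24.5")).getClass().getSimpleName());
        System.out.println(convertAlwaysDouble(new BigDecimal("24")).getClass().getSimpleName());
    }
}
```

Note that always-double only covers numeric items; string states (like the "lastEnd" measurement in the logs below) written into a series that already holds integers would still conflict.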
I ran into this issue with a humidity channel, so I've created an issue for it:
Have you ever updated any of the docs?
I don't think it is fair to compare openHAB 3, which is not released or finished yet and whose docs are only just getting started. The milestone is for people to try, give feedback on, and test for bugs. I agree with you that the better software is going to come down to which one has better support for your hardware and the quality of the bindings/integrations, and HA may be better for some people. For me openHAB has better bindings, as I have been testing both at the moment. As for support, this forum is far more helpful.
Hopefully you have contacted the person that maintains the binding in openHAB with some feedback done in a nice friendly way?
Home Assistant claims to not be finished yet either. It has not even reached version 1.0.
From what I understand, openHAB 3.0 stable will be out at the end of December this year. When HA 1.0 will arrive is anyone's guess. Both projects are continuously changing and will never be "finished"; however, the milestone build is for early adopters who understand that the docs are not done and are missing completely in some areas. I find it far better than v2 and Paper UI already.
Good morning,
before I started asking questions about the deCONZ binding, I was in contact with someone from Dresden Electronic, who supports the USB stick (
We did a lot of testing, and the result was that their software worked perfectly.
Home Assistant also works perfectly with it.
For openHAB, the reply I got was: "That feature is not well designed. It's disabled by default."
Ah, OK. I use Zigbee.
I looked at Z-Wave2MQTT and there seem to be a lot of changes (and hopefully progress). But I have no idea how good or bad it is right now.
About forking:
That's what's lacking now: user focus. From what I understand, OH wants to attract "ordinary users" too. I am a professional programmer on IBM i5. Nothing with Java, nothing to do with Linux, …
But, I try to find my way.
I guess forking will be hard. I have the feeling that there are not that many developers…
why?
Congrats, you finally managed to be added to my ignore list.
Not a single helpful comment, only ranting about how badly openHAB is managed and how much better other software is. Please move on to that software and let us work on the future of openHAB. Thanks.
I have installed the custom persistence jar (and uninstalled the old one first). It seems it solves the issue for doubles, but not for other data types; I can enable TRACE if needed.
thanks
Thomas
2020-10-19 09:37:00.652 [ERROR] [org.influxdb.impl.BatchProcessor ] - Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "value" on measurement "Shutter_OG_Schlafzimmer" is type float, already exists as type integer dropped=24
at org.influxdb.InfluxDBException.buildExceptionFromErrorMessage(InfluxDBException.java:144) ~[bundleFile:?]
at org.influxdb.InfluxDBException.buildExceptionForErrorState(InfluxDBException.java:173) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:827) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:460) ~[bundleFile:?]
at org.influxdb.impl.OneShotBatchWriter.write(OneShotBatchWriter.java:22) ~[bundleFile:?]
at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:340) [bundleFile:?]
at org.influxdb.impl.BatchProcessor$2.run(BatchProcessor.java:370) [bundleFile:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
2020-10-19 09:37:00.674 [ERROR] [org.influxdb.impl.BatchProcessor ] - Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "value" on measurement "lastEnd" is type string, already exists as type integer dropped=49
at org.influxdb.InfluxDBException.buildExceptionFromErrorMessage(InfluxDBException.java:144) ~[bundleFile:?]
at org.influxdb.InfluxDBException.buildExceptionForErrorState(InfluxDBException.java:173) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:827) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:460) ~[bundleFile:?]
at org.influxdb.impl.OneShotBatchWriter.write(OneShotBatchWriter.java:22) ~[bundleFile:?]
at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:340) [bundleFile:?]
at org.influxdb.impl.BatchProcessor$2.run(BatchProcessor.java:370) [bundleFile:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
2020-10-19 09:37:00.681 [ERROR] [org.influxdb.impl.BatchProcessor ] - Batch could not be sent. Data will be lost
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "value" on measurement "Fibaro_LUX_8" is type float, already exists as type integer dropped=7
at org.influxdb.InfluxDBException.buildExceptionFromErrorMessage(InfluxDBException.java:144) ~[bundleFile:?]
at org.influxdb.InfluxDBException.buildExceptionForErrorState(InfluxDBException.java:173) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.execute(InfluxDBImpl.java:827) ~[bundleFile:?]
at org.influxdb.impl.InfluxDBImpl.write(InfluxDBImpl.java:460) ~[bundleFile:?]
at org.influxdb.impl.OneShotBatchWriter.write(OneShotBatchWriter.java:22) ~[bundleFile:?]
at org.influxdb.impl.BatchProcessor.write(BatchProcessor.java:340) [bundleFile:?]
at org.influxdb.impl.BatchProcessor$1.run(BatchProcessor.java:287) [bundleFile:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Hi. I am trying to run OH3 on Windows 7 x64, but got this error:
openhab> org.apache.felix.resolver.reason.ReasonException: Unable to resolve root: missing requirement [root] osgi.identity; osgi.identity=openhab-runtime-base; type=karaf.feature; version="[3.0.0.M1,3.0.0.M1]"; filter:="(&(osgi.identity=openhab-runtime-base)(type=karaf.feature)(version>=3.0.0.M1)(version<=3.0.0.M1))" [caused by: Unable to resolve openhab-runtime-base/3.0.0.M1: missing requirement [openhab-runtime-base/3.0.0.M1] osgi.identity; osgi.identity=openhab-core-model-rule; type=karaf.feature [caused by: Unable to resolve openhab-core-model-rule/3.0.0.M1: missing requirement [openhab-core-model-rule/3.0.0.M1] osgi.identity; osgi.identity=org.openhab.core.model.rule; type=osgi.bundle; version="[3.0.0.M1,3.0.0.M1]"; resolution:=mandatory [caused by: Unable to resolve org.openhab.core.model.rule/3.0.0.M1: missing requirement [org.openhab.core.model.rule/3.0.0.M1] osgi.wiring.package; filter:="(&(osgi.wiring.package=org.openhab.core.common)(version>=3.0.0)(!(version>=4.0.0)))" [caused by: Unable to resolve org.openhab.core/3.0.0.M1: missing requirement [org.openhab.core/3.0.0.M1] osgi.ee; filter:="(&(osgi.ee=JavaSE)(version=11))"]]]]
What am I missing?
Here is a hint: what version of Java are you using?
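To spell the hint out: the last `caused by` in that resolver error filters on `(&(osgi.ee=JavaSE)(version=11))`, i.e. the core bundles require a Java 11 runtime. A minimal check of the JVM you are launching with (the class name here is just for illustration; running `java -version` in a terminal tells you the same thing):

```java
public class JavaVersionCheck {
    public static void main(String[] args) {
        // The osgi.ee filter in the error demands JavaSE 11, so the
        // specification version of the running JVM must be "11".
        String spec = System.getProperty("java.specification.version");
        System.out.println("Running on Java " + spec);
        if (!"11".equals(spec)) {
            System.out.println("openHAB 3 needs Java 11; this JVM cannot resolve its bundles.");
        }
    }
}
```

If this prints something like `Running on Java 1.8`, install a Java 11 JDK/JRE and make sure openHAB is started with it.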