I’m getting this error message when I try to resolve my dependencies:
Resolution failed. Capabilities satisfying the following requirements could not be found:
[<>]
⇒ osgi.identity: (osgi.identity=org.openhab.binding.mynewbinding)
⇒ [org.openhab.binding.mynewbinding version=3.1.0.202101182120]
⇒ osgi.wiring.package: (osgi.wiring.package=com.sun.jndi.dns)
So I added a dependency to my pom.xml for the “missing” artifact (which didn’t solve the issue):
Why is this not working? Could it be caused by the downloadURL, which doesn’t seem to work? (Accessing it from my web browser, I can’t download anything.)
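A minimal sketch of such a pom.xml entry (the coordinates below are placeholders, not the actual artifact):

```xml
<!-- Hypothetical coordinates: substitute the artifact you actually added -->
<dependency>
  <groupId>org.example</groupId>
  <artifactId>some-missing-artifact</artifactId>
  <version>1.0.0</version>
</dependency>
```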
Selenium is a browser orchestration tool. It makes sense to use it only if you really want to test something that requires a browser.
Is that your plan, or are you trying to scrape data from some webpage with it? If so, there are other (simpler) ways of doing that.
Anyhow, if you ever run into a dependency problem like the one you had, you can always check where it comes from with mvn dependency:tree and use Maven dependency management to exclude the problematic jar. (Be aware this doesn’t work in all situations.)
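As a sketch, once mvn dependency:tree shows which artifact drags in the unwanted jar, an exclusion like this removes it (the excluded coordinates here are a made-up example):

```xml
<dependency>
  <groupId>org.seleniumhq.selenium</groupId>
  <artifactId>selenium-java</artifactId>
  <version>3.141.59</version>
  <exclusions>
    <!-- Hypothetical: exclude a transitive jar that breaks resolution -->
    <exclusion>
      <groupId>com.example</groupId>
      <artifactId>problematic-jar</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```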
Thanks so far! I indeed want to use it to scrape data from a webpage. What would be a more appropriate way to do that? The website is dynamic and uses some complex JavaScript code to generate the data I need.
I have already programmed solutions with Selenium and Puppeteer, but in both cases had problems integrating the dependencies into my openHAB binding.
Try a web page scraper instead of a full browser. Also, if there is plenty of JavaScript involved, it usually means the data is fetched by the browser from an endpoint which can be read independently (obviously with some authentication). Have you looked at the “web developer” tools in your favorite browser to see which data is read directly and which is brought in via generated HTML?
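To illustrate reading such an endpoint directly, here is a minimal sketch using Java’s built-in HttpClient. The URL and Accept header are placeholders; copy the real ones from the request you find in the browser’s network tab.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: call the endpoint the page's JavaScript fetches, instead of
// driving a full browser. URL and headers below are placeholders.
public class EndpointFetch {
    static HttpRequest buildRequest(String url) {
        // Mirror the headers the browser sends, as seen in the network tab
        return HttpRequest.newBuilder(URI.create(url))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildRequest("https://example.com/api/data");
        System.out.println(req.uri());
        // Sending it is then a one-liner:
        // HttpResponse<String> res = HttpClient.newHttpClient()
        //         .send(req, HttpResponse.BodyHandlers.ofString());
    }
}
```

Once the request matches what the browser sends, the response is usually plain JSON, which is far easier to parse in a binding than generated HTML.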
A while ago I wrote a crawler which walked through a vendor gateway device that relied on some jQuery and Ajax calls. Here you can see how to bootstrap a crawler.
My code is more complicated than needed because I was also capturing traffic on the physical interface. If you factor that part out, you will see that the crawling itself is rather simple.