Binding development: supporting Alexa

Just looking for some advice for supporting Alexa for a binding.

I have a binding which can discover and create Things for speakers. Can/should I add the Alexa metadata right in the binding, so the end user does not need to enter it manually for all the Things? Is this a bad idea?

I’m hoping to have the user not have to enter deep Alexa info for say 50 lights.

I guess I am hoping I could add something to the thing-types definition which would result in the Metadata.json file automatically getting filled out when the user adds the equipment via the GUI, for example.

I am just migrating to openHAB 3 now, so sorry if my terminology is off.


I can’t answer your questions but I’ve moved the post to a more appropriate category.


It will probably be up to the two people who review and sign off on the binding before it is merged; however, if the openHAB architectural council has made a ruling on it, then the reviewers have to follow it. That only happens if there are arguments, so mostly it will be up to the reviewers.

I want to be clear I really do not know the official stance or even if one has been set, the following is just my opinion.

I would not want Alexa to have access without my express knowledge, and others have this same view, so it is best to take the “OPT IN” approach: if a user wants this feature, they have to turn it on. Does your binding set up a BRIDGE thing that all things connect through? If not, it is also possible to set configuration at the binding level, where a user can flick a switch from the default NO to, say, YES (true): “auto add Alexa tags when discovery is done”. To access this configuration for an unmerged binding in the addons folder, go to the INBOX, press the PLUS (+) icon, and in the list of bindings press the gear icon. Note how only the Chromecast binding has this gear icon.
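For illustration, a binding-level switch along those lines could be declared in the binding's config-description XML. This is only a sketch: the `uri` and parameter name are made up, and the exact file layout should be checked against the binding development docs.

```xml
<!-- OH-INF/config/config.xml — sketch only; uri and parameter name are hypothetical -->
<config-description:config-descriptions
    xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0
        https://openhab.org/schemas/config-description-1.0.0.xsd">
  <config-description uri="binding:russound">
    <parameter name="autoAddAlexaMetadata" type="boolean">
      <label>Auto-add Alexa Metadata</label>
      <description>When enabled, automatically add Alexa metadata for
        discovered zones. Opt-in; defaults to off.</description>
      <default>false</default>
    </parameter>
  </config-description>
</config-description:config-descriptions>
```

With `<default>false</default>` the feature stays off unless the user explicitly enables it, which keeps the opt-in behavior described above.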

If you do want feedback on how your binding is going when it is not finished, you can always do a PR and mark it as a WIP (work in progress); then people can take quick looks and make comments. They would usually look at the readme and the thing-types.xml files to get a quick idea of what the binding does, so if those are up to date, the comments should be helpful. This is why the readme is important: it is usually the first file to get looked at to learn what the binding does before the rest of the files get looked at.

Just to add a bit of clarity here on the role of the AC. It has two jobs really. If there is a disagreement among maintainers that they cannot solve on their own they can appeal to the AC for a decision. That has never happened in the time that the AC has been around. The second job is to deal with decisions that have very large impact over the future of OH and/or impact a lot of different repos. Prior to OH 3 there were a few such decisions.

In a case like this where a binding author might have a conflict with the maintainers, I can’t imagine that the AC would overrule the maintainers. They have the authority and trust to run their repos as they see fit and it is not the AC’s job to come dictate differently. It’s the maintainers who have to deal with the day-to-day management of the repo, not the AC.

But note that the AC hasn’t been used much. And that is by design. So its role will likely largely be established through precedent over time. What I’ve said is my understanding right now and is subject to change.

Also, I do not speak on behalf of any one here. I’m merely expressing my personal understanding. In other words, this is not an official statement.

I agree. This also feels a little bit like OH 2’s old “Smart Mode”, which automatically created Items for you. For the most part that was pretty much a disaster. In general, as Skinah states, most OH users tend to prefer an opt-in approach over an opt-out approach.

But maybe understanding what this metadata is might lead to a different answer. Are you talking about something like Item metadata, which can be used to expose Items to Alexa? Or are we talking about something that is specific to the binding and the Things? Is this data automatically discovered? Is it similar to what the Z-Wave binding does, where it discovers what type of device it is and then looks up the Channels in a database of metadata?

If it’s a Things only metadata or related to discovery and such I don’t think there is any problem hard coding that into the binding somewhere/somehow. I can’t comment on the somewhere/somehow though.

Thanks for the feedback.

I will give some more specifics to help target the discussion.

I am currently working on bringing support for a protocol to the Russound binding. Russound makes whole-home audio amps which power multiple zones of speakers. You end up with any number of speaker zones (I have 8; no idea what an average install would be).

You could easily have six channels linked to Items and want Alexa support for each zone, e.g. power, volume, source, mute, etc.

I found setting up Alexa support via the current GUI is quite a bit of work, and quite monotonous as things are now. It also requires that the end user spend significant time reading the Alexa documentation to figure out how to set up a speaker.

I was thinking that when adding the bridge to the system, there would be a config flag, something like ‘Generate Alexa metadata automatically’. Even if the Alexa metadata is generated, this does not mean Alexa would automatically have access. The end user would still have to go over to the Alexa smart device section and initiate a scan to detect smart devices within their system.

In looking at the recent binding dev documentation, I haven’t found any mention of how to access the metadata registry, get a reference to it, etc.

The binding obviously has intimate knowledge of the Things it is creating, so it seemed reasonable to add the necessary information into openHAB to allow a much simpler experience for end users.

That is where I am coming from. I was looking for feedback on whether this feature is a good idea in general and, if so, whether others with more experience with bindings, especially with the newer APIs in version 3, have tackled something like this.


@digitaldan, just wanted to bring your attention to this because would appreciate your thoughts, if you have time. I know you’ve worked on many bindings which have devices like this, and also I believe you were/are involved in the Alexa support.


Does that even work? Alexa metadata is on Items, not Channels or Things. It’s my understanding that bindings do not have access to Items so how can it set the metadata?

I’m not surprised as I don’t think it’s allowed. I know there are some things that the binding can pass on to the Item like a default label format and icon and stuff like that. But I don’t know if there is a way for a binding to set Item metadata.


It could very well not work with the current APIs. I certainly haven’t found an easy way to do it so far.

I am struggling with this a little, as the binding itself has a very good idea of the capabilities of the devices it accesses, so it seems like something the binding could make a lot easier for general end users.

For myself, I can just write a Jython script which does all this, but that will be far from easy for an average user to utilize.
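As a self-contained sketch of what such a script would generate (plain Python rather than the Jython/openHAB APIs; the item names and the Alexa capability strings are illustrative assumptions, the real values come from the Alexa skill documentation):

```python
# Sketch: generate Alexa metadata entries for each speaker zone.
# Item naming convention and capability strings are hypothetical.

def alexa_metadata_for_zone(zone_name):
    """Map each per-zone item name to an illustrative Alexa metadata value."""
    return {
        f"{zone_name}_Power":  {"value": "PowerController.powerState"},
        f"{zone_name}_Volume": {"value": "Speaker.volume"},
        f"{zone_name}_Mute":   {"value": "Speaker.muted"},
    }

def alexa_metadata_for_zones(zone_names):
    """Collect metadata entries for every zone into one dict."""
    entries = {}
    for zone in zone_names:
        entries.update(alexa_metadata_for_zone(zone))
    return entries

if __name__ == "__main__":
    # Print something resembling item-file metadata for two zones.
    for item, meta in alexa_metadata_for_zones(["Kitchen", "Patio"]).items():
        print(f'{item}: alexa="{meta["value"]}"')
```

A real Jython version would push these entries into the metadata registry via the scripting helper libraries instead of printing them.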

I’m trying to think of a good way to make this very easy for a user that really has no idea of the Alexa metadata, but would like to say “Alexa, decrease volume of Kitchen speakers”, etc.

Thanks again for the feedback.

Are we talking about the Amazon control binding or our Alexa-hosted skill? There is no Alexa binding to add anything to if the latter is the case. We have talked about creating one and moving much of the hosted logic there, and in fact Amazon announced a local skill API (still unreleased) where the hosted app runs on the local Echo device instead of in the cloud, which would be awesome and might make this easier, but right now there is not much of a reason to have one.

The Alexa skill.

I was hoping to hear what you thought because I know from our work on the omnilink binding you would be familiar with the audio zones etc, and I believe you were pretty involved in the Alexa skill at least at one time.

I’m trying to add a specific protocol to the Russound binding, and it seems to me that it would be very helpful for openHAB users if the binding somehow had the capability to add Alexa metadata, saving its users the tedious work of adding the metadata themselves.

It’s not clear to me how adding the required metadata should be accomplished at the binding level. I just recently began migrating to 3.x.


Not trying to get anyone too excited yet, but I am almost done refactoring the entire Alexa skill code, which will introduce a syntax change pivoting towards a device-based metadata configuration style similar to the HomeKit and Google Assistant integrations. It basically introduces a new layer which should simplify the configuration for most basic functionalities while keeping most of the advanced custom configuration for more technical users. So hopefully this will address a good portion of the concerns around the complexity of the current Alexa metadata syntax, which I will concede can be daunting to non-technical users.
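As an illustration of what that device-based style might look like in an items file (a sketch only; the device type and attribute names here are assumptions, the real ones are whatever the refactored skill documents):

```
Group  GKitchenSpeaker "Kitchen Speakers"           { alexa="Speaker" }
Dimmer KitchenVolume   "Volume" (GKitchenSpeaker)   { alexa="VolumeLevel" }
Switch KitchenMute     "Mute"   (GKitchenSpeaker)   { alexa="MuteState" }
Switch KitchenPower    "Power"  (GKitchenSpeaker)   { alexa="PowerState" }
```

The idea being that the Group declares what kind of device it is, and the members only name the attribute they provide, instead of each Item carrying the full capability syntax.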

As far as configuring metadata, it is much easier to do so in the 3.0 UI than before. So I think if the effort is to streamline the setup process, it should be improved at the UI level, not through a specific binding.


I can understand why you want to do this, but it does raise the question: why Alexa, but not Google Home or Apple HomeKit? From a user’s perspective this would look like openHAB favors one vendor over others, which goes against OH’s philosophy of being “vendor and technology agnostic”.


Sounds good, Jeremy.

I will just write a little script of sorts to bang out my speakers for now and see how your work goes.

It feels like I was swimming upstream trying to add support within the binding. I need to continue with my migration so I become more familiar with 3.x.


For me, just because Alexa is what I use. I originally didn’t think adding Alexa support was going to be much of a task; I was looking for advice from other binding devs on how they did it.

Bindings certainly can at least “suggest” some metadata for linked Items. I do not know the mechanism, only the effects, where bindings like Z-Wave add presentation options and formats to Items.
Currently this seems to take the form of a hidden Item edit when you link an Item to a channel. Occasionally it becomes a nuisance, e.g. when a channel profile is used to transform to an unmatched Item type and the force-fed format is not appropriate.