Common Intents catalog

Hello all,

This thread is about gathering information around the question “is there some place in openHAB for a common Intent catalog?”

I’m interested in openHAB’s capability to act as a voice assistant.
I eagerly follow all the great efforts in this category, and as I read topics and checked what is done on other open source projects, I’m now thinking that maybe openHAB should have an Intent concept.
Is there already such an effort somewhere?

For those wondering what an “Intent” is: an Intent is used nearly everywhere a user tries to interact in a complex way with a system. A little example of what a very simple Intent could be:

- type: controlling light
- action: on
- location: bedroom

It’s some kind of “already interpreted user input”. The user input can be a voice command, a text-based interaction through a binding like Telegram/Signal/SMS, HABot, or in fact nearly anything.
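
To make it a bit more concrete, here is a minimal sketch in Java of what such an Intent object could look like. The class and field names are purely hypothetical, nothing like this exists in openHAB today:

```java
import java.util.Map;

// Hypothetical Intent model, not an existing openHAB class.
// An Intent is the already-interpreted result of a user input,
// independent of whether it came from voice, a chat binding, or HABot.
public record Intent(String type, Map<String, String> slots) {

    // Convenience factory, e.g.
    // Intent.of("controlLight", Map.of("action", "on", "location", "bedroom"))
    public static Intent of(String type, Map<String, String> slots) {
        return new Intent(type, Map.copyOf(slots));
    }
}
```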

As far as I know, the current openHAB way of dealing with user input is to give everything to the HLI concept:

User input → HLI → action on item

While this gives HLI add-ons great power, it makes their development more complex and doesn’t allow for modularity.
If openHAB core added an Intent catalog / functionality / concept / whatever, it could mean:

User input → HLI → (core openHAB Intent module → Intent subscriber →) action on item
(the new, optional part is in parentheses)
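
As a very rough sketch (all names are hypothetical, nothing like this exists in core today), an HLI could hand an Intent over to such a module instead of resolving items itself:

```java
import java.util.Map;

// Hypothetical sketch: an HLI hands off an already-interpreted Intent to a
// core publish/subscribe module instead of resolving and commanding items itself.
public interface IntentPublisher {
    /** Publishes an Intent; the (hypothetical) core module dispatches it to subscribers. */
    void publish(Intent intent);
}

class ExampleInterpreter {
    private final IntentPublisher intentPublisher;

    ExampleInterpreter(IntentPublisher intentPublisher) {
        this.intentPublisher = intentPublisher;
    }

    void interpret(String text) {
        // "turn on the bedroom light" -> Intent, reusing the hypothetical record above
        intentPublisher.publish(Intent.of("controlLight",
                Map.of("action", "on", "location", "bedroom")));
    }
}
```

The HLI would still do the natural language part; only the step of finding and commanding the right items would move behind the Intent module.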

The benefit of this is the “Intent subscriber” part. An “Intent subscriber” can take many forms (a sketch of a possible subscriber follows the list):

  • openHAB core could subscribe to some Intents and perform actions on items for basic operations like lights, rollershutters, etc. It would pool code and effort in core, thus relieving HLI add-ons of the burden of analyzing the ontology/semantics of all the items in the house.
  • Add-ons could subscribe to some Intents exposed in a catalog and act accordingly. For example, a music player add-on like Spotify or LMS could subscribe to music-related Intents for advanced capabilities (search, etc.).
  • Users could subscribe to Intents from within rules (with a new kind of trigger: Intent).
  • We could imagine letting the user choose which subscriber will receive which Intent.
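
For the add-on case, a subscriber could look roughly like this (again a purely hypothetical interface, just to illustrate):

```java
import java.util.Set;

// Hypothetical subscriber interface and a music add-on implementing it.
public interface IntentSubscriber {
    /** Intent types this subscriber wants to receive, e.g. "playMusic". */
    Set<String> supportedIntentTypes();

    /** Called by the (hypothetical) core Intent module when a matching Intent arrives. */
    void onIntent(Intent intent);
}

class MusicIntentSubscriber implements IntentSubscriber {

    @Override
    public Set<String> supportedIntentTypes() {
        return Set.of("playMusic", "searchMusic");
    }

    @Override
    public void onIntent(Intent intent) {
        String query = intent.slots().getOrDefault("query", "");
        String location = intent.slots().getOrDefault("location", "living room");
        // ... forward the search/playback request to the music service for that location
    }
}
```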

Advanced topic: we can imagine reverse control, by letting add-ons or users define custom Intents (with some syntactic sugar) and send them to HLIs for them to add to their recognition engines.
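
A possible shape for such a custom Intent declaration (hypothetical as well):

```java
import java.util.List;
import java.util.Set;

// Hypothetical: an add-on (or a user) declares a custom Intent so that HLIs
// can add it to their recognition engines.
public record IntentDefinition(
        String type,                        // e.g. "feedCat"
        Set<String> slotNames,              // e.g. {"portion"}
        List<String> exampleUtterances) {   // e.g. "feed the cat a {portion} portion"
}
```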

I don’t know anything about Home Assistant, but I just found a page about this in their documentation.
Mycroft also uses this concept.

If it is deemed appropriate, I can try to initiate something with the time I have available. But if someone else wants to, I would be glad to help them.
I would also be very happy to receive any advice, directions, etc.

Related question for those interested: do you know if there is an open catalog of Intents somewhere? Some kind of standard?

Thanks,

openHAB is not designed to be a smart assistant, so it has no built-in concept of an intent (you could kind of see a command as being like an intent if more data were added to it, but not really), and it’s probably not something that can be easily shoehorned into the existing architecture.

One of OH’s strengths is the relatively stable core architecture and concepts. A radical change to either is going to be a hard sell.

But it’s definitely something that could be implemented as a plugin. In fact OH has really good integration with existing third party assistants, including Google Assistant, Alexa, and even Mycroft. In addition, OH has its own natural language processing plugin in HABot (there are some tutorials on using HABot with chat services on this forum). The model is not super robust, but the concern is that it needs to be performant on an RPi.

If you were to start somewhere, I’d recommend starting with HABot to see what it does, how it does it, and whether there is something you can do there. The alternative would be to open an issue in core to discuss what would and would not be accepted along these lines. I wouldn’t spend too much time implementing something, as there might not be a willingness to change the core so drastically. But if there’s any chance, it’ll be now in OH 4, as breaking changes are usually not allowed in point updates.

I understand that the core has to remain “core” and solid, and as such, minimal. This is why I wanted to ask for opinions first, so thank you very much for taking the time to share your thoughts!

I don’t think this has to be a breaking change.
It could be totally optional. Some HLIs could use it, or continue with the straight line to items.
And like any publish/subscribe system already in the code, it should be modular and optional for add-ons.
If there is traction for it, the core could add a new trigger system for rules using it. But this is not a first step. And as a new thing, it doesn’t even need to break anything.

As I see it, the footprint would also be minimal: a handful of classes/text files defining common Intents and a publish/subscribe system. To me it is an interface between add-ons, defining a common language.
The majority of the work would be in the add-ons subscribing to it.
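
For illustration, reusing the hypothetical IntentDefinition record from my earlier post, catalog entries could be as simple as this (whether they live in code or in plain text/JSON files is an open question):

```java
import java.util.List;
import java.util.Set;

// Hypothetical catalog of common Intents, shared by HLIs and subscribers.
public final class CommonIntents {

    public static final IntentDefinition CONTROL_LIGHT = new IntentDefinition(
            "controlLight",
            Set.of("action", "location"),
            List.of("turn {action} the {location} light"));

    public static final IntentDefinition PLAY_MUSIC = new IntentDefinition(
            "playMusic",
            Set.of("query", "location"),
            List.of("play {query} in the {location}"));

    private CommonIntents() {
    }
}
```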

Indeed, I already did a little inquiry, including into some other HLIs, and this is why I made this post :sweat_smile:
I saw some opportunities.

I will think about opening an issue in core, thank you for the suggestion.

No, you might have misunderstood the concept of openHAB.
A “voice assistant” would need to be developed as an add-on, subscribing to openHAB’s existing event bus, not the other way round. No changes to the existing add-ons/bindings.

@Miguel_M.A.D is certainly the most experienced with voice assistants in openHAB and could be interested in your ideas.

Hi @dalgwen, it seems interesting and I agree with @rlkoshak that it could fit into the current HLI interface and be implemented as an add-on.

You can use the voiceManager to get access to any HLI by id from your Signal plugin, so I think you could write integrations against those right now; let me know if I’m missing some point.
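
For example, a rough sketch of how a chat add-on could forward incoming text to the HLIs through the VoiceManager (package and method names from memory, to be checked against the current core API):

```java
import org.openhab.core.voice.VoiceManager;
import org.openhab.core.voice.text.InterpretationException;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// Rough sketch: forwarding text received by a chat add-on (e.g. Signal/Telegram)
// to the configured HLIs through the core VoiceManager.
// Names are from memory; check them against the current core API.
@Component(service = ChatMessageHandler.class)
public class ChatMessageHandler {

    private final VoiceManager voiceManager;

    @Activate
    public ChatMessageHandler(@Reference VoiceManager voiceManager) {
        this.voiceManager = voiceManager;
    }

    public String onIncomingMessage(String text) {
        try {
            // Let the default (or configured) HLI interpret the text and act on items,
            // then send the HLI's answer back to the chat.
            return voiceManager.interpret(text);
        } catch (InterpretationException e) {
            return "Sorry, I could not understand that: " + e.getMessage();
        }
    }
}
```

With something like that, the chat add-on doesn’t need to know anything about items; the HLI chain does the interpretation.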

I have this add-on in the marketplace, and also a current PR to add it to OH4:
ActionTemplateInterpreter Next (3.4.0,4.0.0]
I believe you already saw it. It seems to have some similarities with the Intents concept; let me know if you have any feedback about it.
