Official Alexa Smart Home Skill for openHAB 2

You need to trigger an Alexa discovery after making that change on the openHAB side.

Of course I did that, and after every edit to the OH item config as well.

So I think I found the culprit here. What version of OH are you running? The regional settings service pid was renamed in 2.5. The skill is expecting the previous pid.

In the meantime, as I mentioned to @m4rk, you can append @de-DE to your preset text-based name as a workaround until this change is implemented.
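To illustrate the workaround (a sketch only; the item and preset names below are examples, not taken from your config):

```
// Appending @de-DE to a text-based preset name forces the German
// interpretation until the server language fix is implemented.
Rollershutter MyBlind "Rollladen" {alexa="RangeController.rangeValue" [supportedRange="0:100:1", unitOfMeasure="Percent", presets="20=blau@de-DE"]}
```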

Thanks for helping identify this one! :+1:

I’m glad I could contribute, when really you were the one helping me.

@m4rk @Marc_W The two bugs that I listed above are now fixed and released to the live skill.

Can you please remove the workaround I mentioned previously and trigger an Alexa discovery? As long as your regional language setting is set up in line with Alexa’s, you shouldn’t have to specify it at the item level.

The previous “locale” format is no longer supported (the usage documentation may take up to 24 hours to reflect the latest changes). If you need to override the server language property at the item level, you can add it to the language metadata parameter. This setting covers all text-based names defined in the item configuration.

String TestString "Test" {alexa="ModeController.mode" [supportedModes="Normal=Normal:Cottons,Delicate=@Value.Delicate:Knits", language="en"]}

Hi Jeremy,

Thank you. All working now without any extra bits of language code.

Alexa had trouble with ‘Knits’ in your example. She thinks I’m saying “two minutes” and sets a timer. Doh! So I changed it to Woolens :grinning:

For my blinds, open, close, raise and lower all work.

The workaround for the lack of an Alexa stop command for blinds is working and is handled entirely by openHAB, without an Alexa routine. I used both mode and range controllers to do it.

Or I should say it could be fully handled by openHAB. However, to stop a blind I have to say ‘Set blind to stop’ … hmmm … it’s a bit clunky.

To make it more natural, I created an Alexa routine for the ‘Stop blind’ voice command.

So, it all works with less code now.

Here is the code for others. It may need a bit more tidying!


Group OfficeTest "Office test blind" {alexa="Endpoint.Other"}
String OfficeBlindStop "Stop office blind" (OfficeTest) {alexa="ModeController.mode" [supportedModes="STOP=STOP,UP=UP,DOWN=DOWN"]} // could use expire="5s,command=STOPPED" instead of the command in the rule
Rollershutter OfficeBlinds "Office blinds" (OfficeTest) {alexa="RangeController.rangeValue" [category="EXTERIOR_BLIND",supportedRange="0:100:10", unitOfMeasure="Percent", actionMappings="Close=100,Open=0,Lower=(+10),Raise=(-10)"], channel="openwebnet:bus_automation:Screen10:55:shutter" }
rule "Office blind stop"
when
    Item OfficeBlindStop changed to 'STOP'
then
    OfficeBlinds.sendCommand(STOP)
end
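The expire variant mentioned in the comment could look like this (a sketch only; it assumes the Expire binding is installed, and the 5s/STOPPED values are illustrative):

```
// Let the Expire binding reset the proxy item to a neutral state after 5s,
// so a repeated "stop" is seen as a change again. STOPPED is deliberately
// not one of the supported Alexa modes.
String OfficeBlindStop "Stop office blind" (OfficeTest) {alexa="ModeController.mode" [supportedModes="STOP=STOP,UP=UP,DOWN=DOWN"], expire="5s,command=STOPPED"}
```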

I elaborated more for the particular blinds binding I am using here:

Now I will test room awareness.


Is there a way to auto-create Alexa groups (not categories like plugs, lights, …) according to the openHAB groups?
I want the skill to sync the rooms from OH to Alexa, so that, for example, a living room group matching the group in OH lets me switch all its lights and/or plugs on/off simultaneously. As far as I understand the current docs there is no way to do this, or am I wrong?

See screenshots:


Hi Jeremy.

Of course. My setup is working now as follows:

Rollershutter RS1 "Avocado" <rollershutter> { channel="...", alexa="RangeController.rangeValue" [supportedRange="0:100:1", presets="0=@Value.Close,100=@Value.Open,20=blau", unitOfMeasure="Percent", actionMappings="Close=0,Open=100,Lower=(-10),Raise=(+10)", stateMappings="Closed=0,Open=1:100"]}  

(I hope I understood correctly that Lower/Raise with upper-case first letters refer to the built-in actions, so that Alexa translates my voice commands and matches the translation against those built-in actions!?)

and responds correctly to the following commands in German:

  • “Alexa, setze Avocado auf blau!” (set Avocado to blue)
  • “Alexa, setze Avocado auf 30 Prozent!” (set Avocado to 30 percent)
  • “Alexa, schließe/öffne Avocado!” (close/open Avocado)
  • “Alexa, fahre Avocado ganz runter!” (move Avocado all the way down)
  • “Alexa, fahre Avocado ganz rauf!” (move Avocado all the way up)

I noticed some strange behaviour here:

  • “Alexa, fahre Avocado runter!” (move Avocado down; in my opinion, this should be equivalent to “Alexa, lower Avocado!”) -> Result: Alexa executes something, but on each even command the rollershutter goes down (by a seemingly random amount, not 10%), and on each odd command it goes up (by a different, seemingly random amount).
  • “Alexa, fahre Avocado hoch!” (move Avocado up) -> Same behaviour as the previous point (moving up and down on alternate commands, seemingly random travel distance).

In summary: yes, presets work in the local language, but there is unexpected behaviour with the Raise/Lower commands.

So this is exactly what I would have recommended. The only difference is that I would set your proxy item to autoupdate="false" so you don’t have to mess around with its state; the skill will also use the command received as the current state. Below, I updated your item definition (added the missing stateMappings parameter) and rule.

Group OfficeTest "Office test blind" {alexa="Endpoint.ExteriorBlind"}
String OfficeBlindsCommand "Office blinds command" (OfficeTest) {alexa="ModeController.mode" [friendlyNames="@Setting.Direction", supportedModes="STOP=STOP,UP=UP,DOWN=DOWN"], autoupdate="false"}
Rollershutter OfficeBlinds "Office blinds" (OfficeTest) {alexa="RangeController.rangeValue" [friendlyNames="@Setting.Opening", supportedRange="0:100:10", unitOfMeasure="Percent", actionMappings="Close=100,Open=0,Lower=(+10),Raise=(-10)", stateMappings="Closed=100,Open=1:100"], channel="openwebnet:bus_automation:Screen10:55:shutter"}
rule "Office Blinds Command"
when
    Item OfficeBlindsCommand received command STOP or
    Item OfficeBlindsCommand received command UP or
    Item OfficeBlindsCommand received command DOWN
then
    OfficeBlinds.sendCommand(receivedCommand)
end

On a side note, after giving some more thought to potentially allowing Rollershutter items to be a ModeController, I realized that this would be a problem for state reporting and would basically require adding each number value from 0 to 100 as a supported mode. So this will unfortunately not happen, and the solution above should be the recommended one for the time being.


You can’t auto-create Alexa groups, but you can have an openHAB group modeled as a device. You just need to make sure to define a group type on that item.

In the example below, all four items will be modeled as devices on the Alexa side. If you request “Alexa, turn off all lights”, all the lights associated with the group will be turned off. Likewise, you can control each light individually: “Alexa, set color light to blue”.

Group:Switch:OR(ON,OFF) Lights "All Lights" {alexa="Lighting"}
Switch SwitchLight "Switch Light" (Lights) {alexa="Lighting"}
Dimmer DimmerLight "Dimmer Light" (Lights) {alexa="Lighting"}
Color ColorLight "Color Light" (Lights) {alexa="Lighting"}

It is important to note that if you are looking to take advantage of the Alexa-enabled group feature (aka room awareness), you will have to use Alexa groups. There is no way around it.

So your item definition looks good. And according to Amazon’s utterance examples, the lower/raise requests you are using should work as intended. Have you checked your event logs to see which commands were received? Also, check your Alexa voice transcript history to see how your requests were understood.

I tested the rule and it goes into a loop.

How so? Can you post some of your event logs?

openHAB got into a loop of some sort; the log was filling up with GET requests. I fixed that by clearing the cache and a few reboots.

Now I get this in the log:
Command ‘STOP’ has been ignored for group ‘OfficeBlind’ as it is not accepted

Group OfficeBlind "Office test" {alexa="Endpoint.ExteriorBlind"}
    String OfficeBlindCommand "Office blind command" (OfficeBlind) {alexa="ModeController.mode" [supportedModes="STOP=STOP,UP=UP,DOWN=DOWN"], autoupdate="false"}
    Rollershutter Office_RollerShutter "Office blind [%d %%]" <blinds> (OfficeBlind,gAllBlinds) {alexa="RangeController.rangeValue" [supportedRange="0:100:10", unitOfMeasure="Percent", actionMappings="Close=100,Open=0,Lower=(+10),Raise=(-10)"], channel="openwebnet:bus_automation:Screen10:55:shutter"}

This causes the error, and the blind does not stop:

rule "Office blind command"
when
    Item OfficeBlindCommand received command STOP or
    Item OfficeBlindCommand received command UP or
    Item OfficeBlindCommand received command DOWN
then
    OfficeBlind.sendCommand(receivedCommand)
end

This works:

rule "Office blind command"
when
    Item OfficeBlindCommand received command STOP or
    Item OfficeBlindCommand received command UP or
    Item OfficeBlindCommand received command DOWN
then
    Office_RollerShutter.sendCommand(receivedCommand)
end

Oh I see a mistake in the item name

Why are you trying to send the command to the group item? Now I understand the looping issue. The group’s only purpose is to model a single endpoint in Alexa for the two components. Your rule should point directly to the item that is controlling your blinds.

See above :stuck_out_tongue:

The rule now runs without error, but the blind does not stop.

rule "Office blind command"
when
    Item OfficeBlindCommand received command STOP or
    Item OfficeBlindCommand received command UP or
    Item OfficeBlindCommand received command DOWN
then
    logInfo("TEST", "rule ran . rC = " + receivedCommand)
    Office_RollerShutter.sendCommand(receivedCommand)
end

However, if I replace sendCommand(receivedCommand) with sendCommand(STOP), it works.

For both STOP and receivedCommand I see this in the log:

[INFO ] [.eclipse.smarthome.model.script.TEST] - rule ran . rC = STOP

fixed it with this change…


The log shows this for “Alexa, set Office test to UP/DOWN/STOP”, and the blinds behave accordingly:

[INFO ] [.eclipse.smarthome.model.script.TEST] - rule ran . rC = UP

[INFO ] [.eclipse.smarthome.model.script.TEST] - rule ran . rC = DOWN

[INFO ] [.eclipse.smarthome.model.script.TEST] - rule ran . rC = STOP

So, it’s all working now. Thank you.

Final rule…

rule "Office blind command"
when
    Item OfficeBlindCommand received command STOP or
    Item OfficeBlindCommand received command UP or
    Item OfficeBlindCommand received command DOWN
then
    // convert the String command so the Rollershutter item parses it as UP/DOWN/STOP
    Office_RollerShutter.sendCommand(receivedCommand.toString)
end

Nice! :+1:

Well, the Amazon logs say my voice commands were understood exactly as I intended.

This is part of the event log:

2019-12-21 21:10:01.364 [ome.event.ItemCommandEvent] - Item 'RS1 ' received command 41
2019-12-21 21:10:01.432 [nt.ItemStatePredictedEvent] - RS1 predicted to become 41
2019-12-21 21:10:01.465 [vent.ItemStateChangedEvent] - RS1 changed from 31 to 41
2019-12-21 21:10:03.103 [vent.ItemStateChangedEvent] - RS1 changed from 41 to 37

I don’t really understand why there is a second ChangedEvent. It seems to be related to the blind moving back to correct the slats’ tilt. However, the amount is inconsistent between runs.

Ultimately, I think this is no longer a problem of the Alexa Smart Home skill, but of tuning some motor parameters for my blinds. It might take a while to get this sorted out, and I will report back once it’s done.

In the meantime, I have one more question: can you provide an example of how to address the rollershutter in German when it is included inside a group endpoint? I could not find the correct utterances anywhere, and I just can’t address the items if they are defined like this:

Group gOne "Ente" {alexa="INTERIOR_BLIND"}

Rollershutter RS1 "Avocado" <rollershutter> (gOne) { channel="zwave:device:babe135b:node3:blinds_control", alexa="RangeController.rangeValue" [supportedRange="0:100:1", presets="0=@Value.Close,100=@Value.Open,20=blau", unitOfMeasure="Percent", actionMappings="Close=0,Open=100,Lower=(-10),Raise=(+10)", stateMappings="Closed=0,Open=1:100"]} 
Dimmer RS1_Tilt "Ananas"  <slats>  (gOne) { channel="zwave:device:babe135b:node3:blinds_control2", alexa="RangeController.rangeValue" [ supportedRange="0:100:1", presets="0=@Value.Open,65=rot,100=@Value.Close"]}

If I utter “Alexa, setze Ente auf blau” (set Ente to blue), Alexa asks “Welche Einstellung?” (which setting?), I respond “Avocado!”, and nothing happens.

The first change is related to the command received by the skill, and the second one is the actual change that your binding applied. As you mentioned, this is not related to the skill.

Since you didn’t specify the friendlyNames parameter on either of the items in the group, it defaults to the item label. So to address a component separately, you have to use that label (e.g. “Alexa, setze Ente Avocado auf blau” or “Alexa, setze Ente Ananas auf rot”). Also, keep in mind that each semantic extension can only be specified once in a group endpoint. You can mix them, for example lower/raise for the opening and open/close for the tilting, or vice versa, but no overlap (including state mappings) is allowed; the Alexa discovery will silently fail in that case.
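As an illustration, a sketch of explicit friendlyNames (reusing your item names; the custom names “Rollo” and “Lamellen” are my own examples, the group endpoint syntax mirrors the Endpoint.ExteriorBlind form used earlier, and the channels are omitted for brevity):

```
Group gOne "Ente" {alexa="Endpoint.InteriorBlind"}
// Addressed as "Ente Rollo": a custom text-based friendly name in addition to the asset
Rollershutter RS1 "Avocado" <rollershutter> (gOne) {alexa="RangeController.rangeValue" [friendlyNames="@Setting.Opening,Rollo", supportedRange="0:100:1", unitOfMeasure="Percent", actionMappings="Close=0,Open=100,Lower=(-10),Raise=(+10)", stateMappings="Closed=0,Open=1:100"]}
// Addressed as "Ente Lamellen": the tilt component with its own custom name
Dimmer RS1_Tilt "Ananas" <slats> (gOne) {alexa="RangeController.rangeValue" [friendlyNames="Lamellen", supportedRange="0:100:1", presets="0=@Value.Open,65=rot,100=@Value.Close"]}
```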

This is odd. It should have worked as long as the preset name is unique across the items of a group endpoint. I just tested your item definition on the Alexa simulator and it’s working fine. Maybe the Alexa language processing didn’t understand your device name. You should always check your Alexa voice transcript history to see what was understood.