[openWebNet/BTicino] openHAB3

OK, got it already: in JavaScript I need to go via the actions object. This code works for me:

var cen = actions.get("openwebnet", "<thing_uid>"); // thing actions of the CEN/CEN+ scenario control thing
cen.virtualPress("START_PRESS", 2); // simulate a press of button 2

And now thanks to this, Alexa is able to call up my scenarios from the MH202. I am really amazed how easy all of this comes together. :partying_face:

cool :sunglasses:
How do you connect the Alexa command to the activation of your script, which in turn activates the MH202 CEN scenario via virtualPress?

Basically I have non-semantic Items that are connected to Alexa by adding the Alexa metadata, and rules that react to commands sent to those Items. I am using either Items of type Switch with the Alexa device type Light, or Items of type String with the Alexa device type Scene; both work fine.
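As an illustration, a minimal sketch of such a non-semantic Item (the Item name and label are made up; the metadata value follows the Alexa skill's documented Scene capability):

```
// Hypothetical example: a String Item exposed to Alexa as a Scene.
// A rule reacting to commands on this Item can then call virtualPress.
String Scenario_Evening "Evening scenario" {alexa="Scene.sceneActivated"}
```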

Also of course you can switch single lights, group of lights or other devices by just adding the Alexa metadata to them in openHAB.

I think I will write another blog article about it anyway. :grinning:


Done:

As always, any thoughts and comments are appreciated!


Hi Robert,

Thanks for your contributions. :+1: I added your two latest blogs to the list of useful links.

I have voice control but have a more general approach. Note: I always use text files for configuration.

I set an Alexa tag as mode control on a proxy Item. Then you can command Alexa to set the proxy Item to any 'mode' you want. The change of mode on the proxy Item can then trigger an openHAB rule, which can even detect what mode was requested.

Group NorthLights                                   "North lights"                                                                              {alexa="Endpoint.LIGHT"}
    String NorthLights_Command                      "North lights command [%s]"                                     (NorthLights)               {alexa="ModeController.mode" [supportedModes="ON=ON,OFF=OFF,CONSTANT=CONSTANT"], expire="4h,command=NORMAL"}
    Switch NorthLights_Switch                       "North lights switch"                       <light>             (NorthLights, gAllLights)   {alexa="PowerController.powerState", channel="openwebnet:bus_on_off_switch:gateway:22:switch"}

To make voice commands more natural I also create an Alexa routine that translates a nice phrase (or multiple phrases) to the 'mode'. This also means you can create nice Alexa voice responses to your command for a mode to be set, and direct the response to different Alexas. A further refinement is to create an Alexa openHAB device as a group with various capabilities. Then, when setting up the routine, you are presented with the capabilities and the options for the grouped device.

You can get Alexa to run any device or scenario this way, using more natural phrases.

If, like me, you have various ways to trigger the openHAB rule, then the following is useful to log which item triggered the rule:

import org.openhab.core.model.script.ScriptServiceUtil

rule "Outside north lights"
when
    Channel "openwebnet:bus_cenplus_scenario_control:gateway:CENPLUS10:button#25" triggered //Screen10 SHORT_PRESS
    or Item NorthLights4hrs_CENplus_proxy received command 'PRESSED' //Sitemap
    or Item NorthLights_Command received command 'CONSTANT' //Alexa command
then
    // triggeringItemName is only set for the two Item triggers, not for the Channel trigger
    if (triggeringItemName !== null) {
        val triggering_item = ScriptServiceUtil.getItemRegistry.getItem(triggeringItemName)
        logInfo('Lights', 'Triggering item = ' + triggering_item.label)
    }
    if (receivedCommand == "CONSTANT") {
        logInfo("Alexa", 'North lights voice command received: ' + receivedCommand)
    }
    logInfo("Lights", 'Outside north lights constant mode started')

    // .............do stuff...............
end

Log for the voice command:

2023-01-01 12:23:02.126 [INFO ] [org.openhab.core.model.script.Lights] - Triggering item = North lights command

2023-01-01 12:23:02.128 [INFO ] [org.openhab.core.model.script.Alexa ] - North lights voice command received: CONSTANT

2023-01-01 12:23:02.129 [INFO ] [org.openhab.core.model.script.Lights] - Outside north lights constant mode started

I can investigate further, but I think the current behavior is correct and coherent: if you switch OFF an address on the BUS and it changes state ON → OFF, and then you switch it OFF again, the state does not change on the hardware side, so no state update is generated on the switch channel, and therefore none on any associated Item (nor is an updated trigger fired on any rule associated with the Item).

Yes, what you are saying makes sense for normal addresses with an actuator. But what I have been trying to do is process group commands (e.g. WHERE=0 for general or WHERE=7 for area 7), and for those there is no state, so an OFF can be followed by another OFF, which still would need some processing.

Ciao Massimo!
I'd like to help with the F522 energy value readings…

ok, good! then write me a PM so we can discuss some ideas

Hi,
thank you for the fantastic work done with this binding.
I've found a strange behaviour on the alarm channel (bus_alarm_zone) that I suppose is a bug.
This channel should have the value SILENT, INTRUSION, TAMPERING or ANTI_PANIC when the specific zone has an alarm, and NULL otherwise.
Instead it remains in the alarm status (INTRUSION, for example) after the alarm has been disengaged or newly engaged.
The status can obviously be set back to NULL with an expire on the Item or with a rule/script, but I don't know if this is the expected behaviour.
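For reference, the expire workaround mentioned above could look like this (a sketch only; the Item name and the thing/channel UIDs are assumptions, and the timeout is arbitrary):

```
// Hypothetical sketch: reset the zone alarm state some time after an alarm
// via expire metadata; Item name and thing/channel UID are made up.
String Alarm_Zone1 "Alarm zone 1 [%s]" {channel="openwebnet:bus_alarm_zone:mybridge:1:alarm", expire="10m,state=UNDEF"}
```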

Thank you.

Today I have been moving from lights to shutters and imported my first two shutters into openHAB. The binding seems to work great again in that regard, I really like the auto calibration feature, it seems a lot of thought went into this. :+1:

Also it seems that - in contrast to the lights - for shutters the group commands are not an issue. Even if controlling the shutters from BTicino via group commands, openHAB shows the correct status/opening percentage. I guess in case of shutters, they provide individual status updates on the bus, even if controlled by group commands?

That sounds like pretty advanced Alexa-stuff. :slight_smile:

So with the Alexa device type "Mode Controller" you can define a list of modes yourself? What are the actual voice commands that can then be used to set the mode?

Those routines are a pure Alexa function, right? So they are set up via the Alexa app, without involving any configuration in openHAB?

Not sure if I understand that: what is a group with various capabilities? Does that mean having Points of different types in it? Or assigning multiple Alexa device types?

A 'mode' is any text you want to represent some action, and there can be some translation done by the Alexa syntax too, e.g. Full cycle = mode 1, ON, etc. At the moment I usually set up a routine to respond to the phrase I like to use, but natively it should respond to some simple phrases like … set to command or something like that.

Another approach is to capture the spoken phrase to Alexa and then use a regex in openHAB to decipher what was said, take the appropriate action, and even speak back on the Alexa that was used. I do this for my garage door operations, which require quite a bit of extra coding and sensors to track the door state.
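The deciphering step can be sketched as a small JavaScript function (hypothetical; the function name, action words and phrasing are made up for illustration, and in practice the phrase would come from an Item the routine writes to):

```javascript
// Hypothetical sketch: pick an action for the garage door out of a captured phrase.
function parsePhrase(phrase) {
  // look for an action word followed somewhere later by the word "garage"
  var m = phrase.toLowerCase().match(/\b(open|close|stop)\b.*\bgarage\b/);
  return m ? m[1] : null; // the action to act on, or null if not understood
}
```

A rule would then send the returned action as a command to the door Item, or speak back that the phrase was not understood.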

Yes, routines are pure Alexa app. You can get Alexa to respond to any phrase you like and then do stuff.

A group is a collection of capabilities (OH Items with Alexa tagging) for a device, e.g. you could have a thermostat group that has the capabilities 'setpoint' and 'current temperature' as defined by OH Items. This affects how the thermostat is displayed in the Alexa app: as a group it is displayed as one device that shows the setpoint temperature and, below it, the current temperature. It also responds in a more natural way to voice commands, as one device.
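Such a thermostat group might look like this (a hedged sketch; the Item names and labels are made up, while the Alexa metadata values follow the skill's documented capability names):

```
// Hypothetical sketch: a thermostat modelled as one Alexa endpoint
// grouping two capability Items.
Group  LivingRoom_Thermostat         "Living room thermostat"                                   {alexa="Endpoint.Thermostat"}
Number LivingRoom_TargetTemperature  "Target temperature [%.1f °C]"   (LivingRoom_Thermostat)   {alexa="ThermostatController.targetSetpoint"}
Number LivingRoom_CurrentTemperature "Current temperature [%.1f °C]"  (LivingRoom_Thermostat)   {alexa="TemperatureSensor.temperature"}
```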

There is a lot more that I have not tested yet, and I am in the process of fine-tuning and expanding the voice control. But once you get past the tipping point, voice commands or queries become the easiest way to control things: lights, blinds, thermostats and other things that are not in OH. I have been testing on my 86-year-old Mum, who was already used to an Alexa at her house, and it seems that guests to my house quickly get used to it too… they don't need to use an app or figure out how to turn things on/off, which wall switch does what, etc. They just ask Alexa to do it and it works; most of the time :wink:

The link below goes to the general capabilities section of the Alexa skill, but the post contains a lot more >>

Here is an example of Alexa controlling an openwebnet blind:

Rollershutter WestBedroomWest_RollerShutter         "Bedroom West blind [%d %%]"        <blinds>            (gAllBlinds) {
    alexa="Blind.RangeValue" [inverted=false, supportedCommands="UP=@Value.Up:@Value.Open,DOWN=@Value.Down:@Value.Close,STOP=@Value.Stop", supportedRange="0:100:1", unitOfMeasure="Percent", actionMappings="Close=DOWN,Open=UP,Lower=(+5),Raise=(-5)", stateMappings="Closed=100,Open=0:99"],
    channel="openwebnet:bus_automation:gateway:65:shutter"
}

With this, anyone can command a blind to close, open, stop, or go to a set % position. It also natively allows requests like 'lower' or 'raise', after which the blind moves a fixed %, as set in the syntax; I set +/- 5%.

A further step to make it work more easily for everyone is Alexa room awareness. With that, the Alexa knows which room it is in and responds more intelligently to requests like 'switch the light ON', i.e. it assumes you mean the light in the room it is in, so there is no need to give the specific item name.

All this is just the icing on the cake for my automation, but it actually was my original goal: to make the automation work mostly without intervention, and when something does need to be done, it should be intuitive, with no need to know my system or be an expert in my way of doing things.

This last effort comes towards the end of a long road of integration, and I am still working on it. So, I would appreciate some discussions with others on how to do it the best way, motivation to explore alternative methods, etc. I seem to be one of very few going this way and talking about it.

edit… The Alexa room awareness feature only works fully for lights, and it's been that way for a long time. It partially works for blinds: raise and lower commands work, but open and close do not. So, there is a workaround posted in the community by the Alexa skill developer, and I created my own rule specifically to operate my openwebnet blinds, which I may expand to operate other items. It is working well for openwebnet BUS blinds :grinning:

This means anyone, e.g. guests, can command a light or blind without knowing the specific Alexa names of the devices. The Alexa in the room that receives the command 'knows' which are the correct lights or blinds to operate. I will post my code if anyone asks, but will move the discussion to the general thread.


Hi,

I noticed that the thermo unit's atLeastOneProbeManual channel detects when I set a zone to manual, but doesn't update when it is back in auto mode; the Item remains ON even though the central unit doesn't show any zones as manual.

Is that just me or anyone else?

To be honest, I don't understand the workings of the channels atLeastOneProbeOff, atLeastOneProbeManual and atLeastOneProbeProtection, as they are sometimes ON even when that seems incorrect to me, and I could not find out how they work.

For example, I just went to the CU and confirmed my standard mode program (= automatic), but all three channels stay ON, although in my opinion they should all be OFF (the red lines only appear when switched ON).


I am also still missing the possibility to switch a zone not only to manual but then back to auto.

As I understand it, when at least one zone is in manual, off, or frost protection, the relevant channel switch changes to ON. This is the same as the CU showing the symbol for those conditions.

Manual is set on the CU as a fixed temperature, not via the zone offset on the wall-mounted thermostats.

Another way to get a zone into manual is to use the Alexa app and change the target temperature.

The switches are read-only, and although I can manually put them back to the correct state, doing so generates an error:

handleChannelCommand() Unsupported ChannelUID atLeastOneProbeManual

Thanks for the inspiration - I am not sure yet how far my road of integration will take me, still being at the basics. Next comes a weather station with a wind sensor for bringing up the shutters, and then I'd like to integrate our car gate so I can control it from BTicino controls. :wink:

With Alexa I also have yet to see how far I want to take it - so far I only have BTicino scenarios linked with Alexa, but I'd like to start with individual lights next and see how well that goes - I will for sure try to make it work with the room awareness feature that you mentioned.


I have a weather station, as well as wind, rain and sun switches set up. I use some BTicino contact sensors, and I also have my electric garage door integrated, again using contact sensors. If you want to discuss, please use the general thread unless it's binding-specific.

Good morning,

I am back with a question about shutters and their state.

I am now using a non-semantic group hierarchy where I put all my shutters to send them commands. All the groups have member base type Rollershutter and AVG as aggregation function.
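For reference, such a hierarchy might be defined like this (a sketch of the setup described above; the group and Item names and the thing UID are made up):

```
// Hypothetical sketch: nested Rollershutter groups with AVG aggregation.
Group:Rollershutter:AVG gHouseShutters       "House shutters [%d %%]"
Group:Rollershutter:AVG gGroundFloor         "Ground floor shutters [%d %%]"  (gHouseShutters)
Rollershutter Kitchen_Shutter                "Kitchen shutter [%d %%]"        (gGroundFloor)   {channel="openwebnet:bus_automation:mybridge:51:shutter"}
```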

  • After I reboot openHAB, the state of all the shutters and groups is UNDEF, as expected.

  • If I then send an UP command to the top-level group (the "house"), its state changes to "0", also as expected.

  • However, the sub-groups stay at UNDEF, which I'd expect to be set to 0 as well.

  • The top-level group state also changes back to UNDEF after 20-30 seconds. It looks like the aggregation function kicks in and overwrites the top-level state with the UNDEF that the sub-groups are still in.

Is this a bug? Shouldn't the UP to the top-level group also set the state of all the sub-groups to 0?

In this state I also cannot send a percentage command to the shutters; it gives me the following error, which is probably because the state is still UNDEF:

2023-01-11 05:42:05.755 [INFO ] [.handler.OpenWebNetAutomationHandler] - Command 10 cannot be executed: unknown position or shutterRun configuration params not/wrongly set (thing=openwebnet:bus_automation:22eef285a7:51)

What is the best way to bring the shutters in a defined state after a reboot of openHAB?