Release Candidate and Support: Amazon Echo Control Binding

Hi everyone,
I would like to thank everyone from my side for building those amazing things here.
I want to be quick and not waste your valuable time.
I have read more than 1000 replies regarding this binding, but I couldn't find an answer. Is there any way, or a rule, to ask Alexa to do something, let's say turn the light on (the light is an Item), without discovering the device in the Alexa app?
Thank you again.
Mike

You could use the lastVoiceCommand Channel, or a group of Items linked to them, as a trigger for a rule that parses the contents, and then does things based on the content.

Thank you @5iver ,
could you please give an example for a simple switch?
My point is that I don't want to use the [ "Switchable" ] tag.

I use this for the VoiceCommand Item, with "Alert" as its command word (nothing to do with Alexa… just OH). This is in Jython, but I can dig up an archived Rules DSL version if you can't convert it. It should be very similar to what you'd need for this type of rule. I forgot to mention: you'll need to create a routine with a confirmation of 'OK', or Alexa will always respond that she doesn't know what you're talking about.

@rule("Alert: Voice command alert")
@when("Item VoiceCommand received command")
def voiceCommandAlert(event):
    log.debug("VoiceCommand received [{}]".format(event.itemCommand))
    if "alert " in str(event.itemCommand).lower():
        content = str(event.itemCommand).lower().replace("alert ", "")
        if len(content) > 0:
            events.sendCommand("Audio_Notification", content)

Found it…

rule "Alert: VoiceCommand"
when
    Item VoiceCommand received command
then
    // Use receivedCommand rather than the Item state, which may not be updated yet
    logInfo("Rules", "Alert: VoiceCommand received [{}]", receivedCommand)
    if (receivedCommand.toString.toLowerCase.contains("alert ")) {
        val content = receivedCommand.toString.toLowerCase.replace("alert ", "")
        if (content.length > 0) {
            Audio_Notification.sendCommand(content)
        }
    }
end
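The keyword handling in both rules above boils down to a "strip the keyword, keep the payload" step, which can be sketched and tested outside openHAB as plain Python (the keyword and command strings here are just examples):

```python
def extract_payload(command, keyword="alert "):
    """Return the text following the keyword, or None when the keyword
    is absent or nothing follows it (mirrors the checks in the rules)."""
    lowered = command.lower()
    if keyword not in lowered:
        return None
    payload = lowered.replace(keyword, "").strip()
    return payload or None
```

In the rule, the returned payload is what gets sent as a command to `Audio_Notification`.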

That helps a lot!!! Thank you, I'll try it and get back to you.
Really, thank you for your time @5iver.


You’re very welcome! You actually got me thinking about this… and I’d wanted to play with it for a while, so here is what I got working. Happy holidays!


Update.
In the end I didn't get it working with the above. I need to find a way to trigger Items without asking the Echo to discover the devices… so without adding ["Switchable"] or ["Lighting"].
Any ideas?
Any ideas?

P.S. What does the lastVoiceCommand channel do?

I will answer my own question after some months :slight_smile:
Thanks to @idkpmiller [post 1213] I was able to write a rule which triggers on the "alexa" keyword.

I have two echo dots.

items:

    Group gAlexa_voice
    String Echo_LastCommand_W       "Last Command"          (gAlexa_voice)  {channel="amazonechocontrol:echo:account1:echo2:lastVoiceCommand"}
    String Echo_LastCommand_K       "Last Command"          (gAlexa_voice)  {channel="amazonechocontrol:echo:account1:echo1:lastVoiceCommand"}

rule:

rule "Get Alexa keyword"
when
    Member of gAlexa_voice changed to "alexa"
then
    // do something
end
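Because the rule uses a Member of trigger, the implicit `triggeringItem` variable tells you which Echo heard the keyword. A small lookup table lets the rule react per device; sketched here as plain Python, with the location labels being pure assumptions (the item name suffixes aren't explained in the post):

```python
# Hypothetical mapping from lastVoiceCommand Item name to a location label.
ECHO_LOCATIONS = {
    "Echo_LastCommand_W": "living room",  # assumed location
    "Echo_LastCommand_K": "kitchen",      # assumed location
}

def location_of(item_name):
    # Fall back to "unknown" for Items not in the table.
    return ECHO_LOCATIONS.get(item_name, "unknown")
```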

BETA 2.5 (1) released:

This version implements a cache for the dynamic channel state descriptions, which results in better performance for REST calls and PaperUI.

Check the first post in this thread for the download link.

Best,
Michael


Hello @michi. The 2.4/2.5 updates still do not fix the character-encoding issue I raised a few months ago. I just installed OH 2.4 on a brand-new Windows 10 PC and still see the wrong encoding when accessing OH at localhost:8080 (tested in the Edge and Chrome browsers).

How can I help to troubleshoot this? Is anyone else experiencing the same?

In PaperUI:
[screenshots showing the mis-encoded characters]

I really need this to be fixed to use that nice binding properly!

Hi Yann,

you can do the following to provide more information:

Check the special characters in the result, and send me the request and response headers from the network tab for the device request.

Do the same for the URL http(s)://youropenhab/amazonechocontrol/<accountid>/PROXY/api/devices-v2/device (e.g. http://localhost:8080/amazonechocontrol/account1/PROXY/api/devices-v2/device)
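When comparing the captured headers, the interesting part is usually the `charset` parameter of the `Content-Type` response header. A quick helper to see what a given header value declares (note that for `text/*` responses, HTTP/1.1 historically defaulted to ISO-8859-1 when no charset was given, which is a common cause of exactly this kind of garbling):

```python
def declared_charset(content_type, default="iso-8859-1"):
    """Extract the charset parameter from a Content-Type header value,
    e.g. 'application/json;charset=UTF-8' -> 'utf-8'."""
    for param in content_type.split(";")[1:]:
        name, _, value = param.strip().partition("=")
        if name.lower() == "charset":
            return value.strip('"').lower()
    return default
```

If the proxied response declares no charset (or the wrong one), that would explain why the browser renders the special characters incorrectly.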

Best regards,
Michael

@michi: Just replied by PM with the details of headers / contents requested :wink:

This issue will be fixed in the next beta (to be released in the next few days).


@michi I've got two questions regarding the binding and Alexa in general.

Is there a reason why I'm not able to trigger rules with lastVoiceCommand when there's no empty routine in the Alexa app?
openHAB sees every call to one of my Echos, but the rules are only triggered when there's an empty routine that stops Alexa from saying "sorry, I may not understand you" (or something like that).
I made some tests, and my rules are only triggered with this workaround.


Do you know if there's any info about who gave a command to Alexa?
Maybe this would be a nice idea for an additional channel, to make rules more personal.
There's an option to train Alexa on your voice, and she's able to say who started the radio station when continuing it.

Kind regards
Michael

Hi michi,

I realized that I got an exception shortly after the TTS channel was triggered by a rule with the new beta. Currently I'm not able to reproduce the exception; I only see it sometimes.

Greets Udo

2019-01-01 17:22:36.597 [ERROR] [nal.common.AbstractInvocationHandler] - An error occurred while calling method 'ThingHandler.handleCommand()' on 'org.openhab.binding.amazonechocontrol.internal.handler.EchoHandler@a9b412': POST url 'https://alexa.amazon.de/api/behaviors/preview' failed: Bad Request
org.openhab.binding.amazonechocontrol.internal.HttpException: POST url 'https://alexa.amazon.de/api/behaviors/preview' failed: Bad Request
    at org.openhab.binding.amazonechocontrol.internal.Connection.makeRequest(Connection.java:583) ~[?:?]
    at org.openhab.binding.amazonechocontrol.internal.Connection.executeSequenceNode(Connection.java:1108) ~[?:?]
    at org.openhab.binding.amazonechocontrol.internal.Connection.executeSequenceCommand(Connection.java:1093) ~[?:?]
    at org.openhab.binding.amazonechocontrol.internal.Connection.textToSpeech(Connection.java:1084) ~[?:?]
    at org.openhab.binding.amazonechocontrol.internal.handler.EchoHandler.startTextToSpeech(EchoHandler.java:658) ~[?:?]
    at org.openhab.binding.amazonechocontrol.internal.handler.EchoHandler.handleCommand(EchoHandler.java:545) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:?]
    at org.eclipse.smarthome.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:153) [102:org.eclipse.smarthome.core:0.10.0.oh240]
    at org.eclipse.smarthome.core.internal.common.InvocationHandlerSync.invoke(InvocationHandlerSync.java:59) [102:org.eclipse.smarthome.core:0.10.0.oh240]
    at com.sun.proxy.$Proxy108.handleCommand(Unknown Source) [224:org.openhab.binding.amazonechocontrol:2.5.0.Beta_01]
    at org.eclipse.smarthome.core.thing.internal.profiles.ProfileCallbackImpl.handleCommand(ProfileCallbackImpl.java:75) [109:org.eclipse.smarthome.core.thing:0.10.0.oh240]
    at org.eclipse.smarthome.core.thing.internal.profiles.SystemDefaultProfile.onCommandFromItem(SystemDefaultProfile.java:49) [109:org.eclipse.smarthome.core.thing:0.10.0.oh240]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:?]
    at org.eclipse.smarthome.core.internal.common.AbstractInvocationHandler.invokeDirect(AbstractInvocationHandler.java:153) [102:org.eclipse.smarthome.core:0.10.0.oh240]
    at org.eclipse.smarthome.core.internal.common.Invocation.call(Invocation.java:53) [102:org.eclipse.smarthome.core:0.10.0.oh240]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:?]
    at java.lang.Thread.run(Thread.java:748) [?:?]

Thanks for your reply, I got it working, except for the TTS part for the Sonos.
I will give it a try with Google TTS and the use of audio sinks; if I find a solution, I will come back to you with it. :wink:

Happy new Year :slight_smile:

@Bredmich
I think you missed the part that we are talking about Sonos speakers with integrated Alexa, not the Amazon Echo or Dot itself. Or did you get TTS working on a Sonos One? :slight_smile:

Greets Udo

I answered @Kfm, who said that only one out of his three "Echos" performs the TTS.
As he didn't mention which one is a real Echo and which is a Sonos, I recommended waiting a second between multiple TTS commands.

I don't have a Sonos, but this limitation seems to apply to anything controlled by this binding.
@michi, do you know anything about issues when sending too many commands through the binding without a pause?

My next steps would be:

  • Build a dummy switch that can be triggered through the sitemap
  • Write a simple rule that sends a TTS message to one speaker
  • Trigger the switch and listen for whether the speaker says the TTS text
  • Change the speaker and repeat
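Sketched as an openHAB Rules DSL rule, in the style of the rules earlier in this thread, under the assumption of a dummy Switch Item `TTS_Test` and a String Item `Echo_Living_TTS` linked to an Echo's textToSpeech channel (both names are made up for illustration):

```
rule "TTS smoke test"
when
    Item TTS_Test received command ON
then
    // Send a short announcement to the assumed TTS item;
    // swap the item to test another speaker.
    Echo_Living_TTS.sendCommand("This is a TTS test")
end
```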

As I haven't found any info about @Kfm's setup, I can only guess which of his three "Echos" are Sonos speakers.

kind regards
Michael

Hi @Bredmich, thanks for your reply.

Yes, @Kfm's post about his Sonos problem was very old. I answered him, he answered me directly without using the quote feature, and then you answered and missed the Sonos part. It doesn't matter, there are a lot of posts going on here. You tried to help, thanks for it!!! :slight_smile:

TTS works fine, even if you send all three requests in a row or to a group, without any problem. It's just that the Sonos Alexa implementation doesn't handle TTS calls; that was the problem we talked about. In my opinion it has nothing to do with the binding, it's just the Sonos Alexa implementation, or restrictions from Amazon on using it on third-party devices.

I think @Kfm thought you meant that TTS isn't handled for the Sonos because of the three calls in a row. But that is not the case; it just doesn't work because it's a Sonos, and because of that only the real Amazon Echo is talking.

Happy new Year.
Greets Udo

I love this binding and also use it to make sure my kids don't listen to music after their bedtime.
As there is no general on/off channel, I use the player channel to check whether an Echo is playing music. What I am not able to check for is whether a skill (like the many quiz skills available) is being actively used. Is there a channel or a method to check whether a given Echo is currently using a skill?

No, because the trigger does not know about a routine in the Alexa app. And anyway, you should see the last command in the channel. Have you tried binding it to a GUI element so that you can see what happens?

Currently it seems that the user information is not available, so I see no way to provide this as a channel.