Binding Request: Amazon Echo

Has anyone thought about creating an Alexa Skill binding for openHAB and the Amazon Echo? This would be an amazing combination.

“Alexa, ask openhab to turn on the light.”
“Alexa, ask openhab what the status of the house is.”

The current “bridge” is neat and works really well, but it is limited to the pre-programmed switches it can see. A skill would open up the entire platform.

Also following along at:
https://groups.google.com/forum/#!topic/openhab/nuqiEOcJbv4


Not Amazon Echo but here’s a thread showing my progress with voice control so far.

That sounds pretty cool! I’ll definitely follow along with your progress. At the moment I’m most interested in using Alexa for the voice integration in my system. In theory, this would also make it much easier for lay people to add voice control in openHAB 2.

The Alexa skills are very new, so I’m hoping someone has been working on openHAB integration. I’m happy to help test anything with my setup.

@belovictor I was wondering how you were making out with your Echo and creating an Alexa Skill that links to my.openhab.org? You mentioned back in July in a Google Group thread that you were planning on it.

I put together an Alexa Skill myself but ran into a roadblock: I couldn’t figure out how to get outside the world of Amazon’s servers. Any attempt at making an HTTP request failed with no apparent message in the logs. I am admittedly a bit of an amateur at these things, but I was able to get a skill working where I could capture the entire voice command. I thought it would then be a simple matter of passing that command ( VoiceCommand=[message] ) via a simple HTTP GET to my.openhab.org, but I failed. I did find this bit of code that I thought would help, and even created a Twilio dev account so I could run the code as-is, but it didn’t work. I also found this, but again no luck. So any progress on your end?
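
For reference, here is roughly what I was attempting from the skill’s Lambda handler: a minimal sketch, assuming the classic openHAB 1.x CMD servlet proxied through my.openhab.org with HTTP Basic auth (the /CMD path and the VoiceCommand item are assumptions based on a standard openHAB 1.x setup, not something I have confirmed works):

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Base64;

public class VoiceCommandSender {

    // Assumed endpoint: the openHAB 1.x CMD servlet as proxied by my.openhab.org
    private static final String BASE_URL = "https://my.openhab.org/CMD";

    public static void send(String username, String password, String spokenText) throws IOException {
        String query = "VoiceCommand=" + URLEncoder.encode(spokenText, "UTF-8");
        URL url = new URL(BASE_URL + "?" + query);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // HTTP Basic auth with the my.openhab.org account credentials
        String auth = Base64.getEncoder()
                .encodeToString((username + ":" + password).getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        int status = conn.getResponseCode();
        System.out.println("my.openhab.org responded with HTTP " + status);
        conn.disconnect();
    }
}
```

If the Lambda really can’t reach anything outside Amazon’s network, a request like this would simply time out, which would match the symptom of nothing useful showing up in the logs.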

Well, I’m still thinking about the right way to integrate. Just passing the text would be simple but would not use the power of the Echo. I would rather think of integrating it with item names or something like that, but I haven’t made huge progress yet.
The trick here is to authenticate your request in my.openHAB and connect it with your account in the right way. We can discuss how to integrate your work with my.openHAB to make this process easier. I would imagine something like what we did for IFTTT: an OAuth2 connection between your Alexa account and your my.openHAB account to transparently send those commands to openHAB.
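
To illustrate the account-linking idea, here is a minimal sketch of the standard OAuth2 authorization-code exchange the skill would perform after the user links the two accounts. The token endpoint URL and the client credentials are hypothetical; my.openHAB would have to expose something like this, as it does for IFTTT:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Scanner;

public class MyOpenHabTokenExchange {

    // Hypothetical token endpoint; my.openHAB would need to provide one like this
    private static final String TOKEN_URL = "https://my.openhab.org/oauth2/token";

    /** Standard OAuth2 authorization-code exchange; returns the raw JSON response. */
    public static String exchangeCode(String code, String clientId, String clientSecret,
                                      String redirectUri) throws IOException {
        String body = "grant_type=authorization_code"
                + "&code=" + URLEncoder.encode(code, "UTF-8")
                + "&client_id=" + URLEncoder.encode(clientId, "UTF-8")
                + "&client_secret=" + URLEncoder.encode(clientSecret, "UTF-8")
                + "&redirect_uri=" + URLEncoder.encode(redirectUri, "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) new URL(TOKEN_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        conn.getOutputStream().write(body.getBytes("UTF-8"));

        // The JSON response would carry the access token the skill stores per user
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8")) {
            return s.useDelimiter("\\A").next();
        }
    }
}
```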

@belovictor What were your thoughts/ideas for this?

I’m not so sure passing the text would be that simple unless I’m completely missing something. As I noted, using HTTP within the skill does not seem to work. I am using the username and password for my.openhab.org and coded it just like the examples I linked to above. Since I couldn’t get the Twilio code sample to work either, I am at a loss as to what to try next; I may be missing something fundamental. I was hoping someone else has had better success and could point out what I have been doing wrong.

Yes, I do understand that using the power of the Echo would be nice, but I first wanted to at least get a baseline working. I am currently using the amazon-echo-ha-bridge and it does work very well with no need to create an Echo Skill. But it is limited to commands that control lighting: on/off/dim %. So to close or open something like a garage door you have to say “Turn off (or on) the garage door”. It’s workable, but…

Even though I will have to say ‘ask openHAB to…’ every time, I would prefer a separate skill instead of that bridge.
Well, the example you pointed to is a Lambda service (an Amazon Lambda app). So first you need to be sure your skill actually calls this Lambda, and after that diagnose whether it works inside. I haven’t had a deep look into how to configure the skill yet, though.
My thought was to add a little bit of context: the Echo sits in a certain room or space of your house, so you could assign it a group to enable things like ‘all lights’ and add some context to the commands you say, so that they operate controls in that particular area. The main trouble is that we don’t have an addressable hierarchy in openHAB, so you can’t automatically interpret things like ‘main lights in guest room on second floor’. We could agree on a certain item name or label scheme so that parts of it can be mapped onto speakable things (see the sketch below). I’m not sure about labels though, because I have a lot of ‘main lights’ or ‘temperature’ labels in my house while those items sit in different groups (mapped to rooms…). So quite a bit of research needs to be done here on how to map voice commands onto items. I just don’t want to create a huge ‘if then’ ruleset to process voice commands and edit it every time I add something to my house configuration…
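
As an example of what such a scheme might look like, here is a minimal sketch; the underscore-separated naming convention and the Floor2_GuestRoom_MainLights item are just placeholders for illustration, not anything openHAB defines:

```java
import java.util.Arrays;
import java.util.List;

public class ItemNameScheme {

    /** Splits an item name like "Floor2_GuestRoom_MainLights" into speakable tokens. */
    public static List<String> speakableParts(String itemName) {
        String[] parts = itemName.split("_");
        for (int i = 0; i < parts.length; i++) {
            // "GuestRoom" -> "guest room", "MainLights" -> "main lights"
            parts[i] = parts[i].replaceAll("(?<=[a-z0-9])(?=[A-Z])", " ").toLowerCase();
        }
        return Arrays.asList(parts);
    }

    public static void main(String[] args) {
        // prints [floor2, guest room, main lights]
        System.out.println(speakableParts("Floor2_GuestRoom_MainLights"));
    }
}
```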

I mentioned this before in my own thread but the way I’m tackling that problem at the moment is by having a hierarchical group structure and assigning items to multiple groups. I don’t know if it’s necessarily the best way (and that’s why I’m always keen to hear other people’s experience/solutions) but it seems to work.

So far I have a ‘location’ group (e.g. living_room, bedroom, bathroom) and a ‘thing’ group (e.g. lights, tvs).

If I say “turn off the bedroom light”, it will first look for an item called “bedroom_light” and, if found, turn that item off.
If I say “turn off the bedroom lights” (and it can’t find an item called “bedroom_lights”), it will look for items that are members of BOTH the bedroom group and the lights group and turn them all off.
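
A rough sketch of that lookup logic, with the item-to-groups data hard-coded here rather than pulled from openHAB’s item registry (so treat the names as placeholders):

```java
import java.util.*;

public class VoiceItemResolver {

    // Placeholder data: item name -> the groups it belongs to
    private final Map<String, Set<String>> itemGroups = new LinkedHashMap<>();

    public VoiceItemResolver() {
        itemGroups.put("bedroom_light", new HashSet<>(Arrays.asList("bedroom", "lights")));
        itemGroups.put("bedroom_lamp", new HashSet<>(Arrays.asList("bedroom", "lights")));
        itemGroups.put("living_room_tv", new HashSet<>(Arrays.asList("living_room", "tvs")));
    }

    /** Resolve a spoken "location thing" phrase to the item name(s) to command. */
    public Collection<String> resolve(String location, String thing) {
        // 1. Try an exact item name first, e.g. "bedroom_light"
        String exactName = location + "_" + thing;
        if (itemGroups.containsKey(exactName)) {
            return Collections.singletonList(exactName);
        }
        // 2. Otherwise collect every item that is a member of BOTH groups
        List<String> matches = new ArrayList<>();
        for (Map.Entry<String, Set<String>> entry : itemGroups.entrySet()) {
            if (entry.getValue().containsAll(Arrays.asList(location, thing))) {
                matches.add(entry.getKey());
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        VoiceItemResolver resolver = new VoiceItemResolver();
        System.out.println(resolver.resolve("bedroom", "light"));  // exact match: [bedroom_light]
        System.out.println(resolver.resolve("bedroom", "lights")); // both groups: [bedroom_light, bedroom_lamp]
    }
}
```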

Cool. That’s what I meant :smile:
Where is your thread?

Link is below. I’m using Google Now to capture speech to text and wit.ai to process text to intent.

I think the hierarchical structure is great. My system very much fits this mold. In fact, my sitemap is populated and organized based on the groupings, so it would work perfectly.

It is large but manageable and easily scalable.

Actually, mine does work with close/open the garage door commands… it must equate open to ON, etc. Pretty neat, but I would still like to see some further development.

Two questions:

  1. The Echo HA bridge sounds like a good starting point to have some simple integration (at least for lights). As this bridge is implemented in Java, it should be fairly easy to integrate it smoothly into openHAB as a bundle. Did anybody start such an effort? I would love to have that!

  2. For an Alexa skill, I would not want to use “openHAB” as the greeter. Just like the Echo has Alexa, openHAB should have Bob or something like that. So shall we do a poll on what would be a good name for our personal assistant…?

I’d like to be able to customize the name, but that probably wouldn’t be possible if we’re creating a permanent skill for everyone to use.

Mine is called “Gregory” (House).


The Echo HA bridge is a great starting point. I have 15 devices so far and it is VERY fast, both with X10 and Insteon devices from the Insteon PLM binding and with RFM69 Arduino nodes.

I know several “trekkies” have been pushing for “computer” as a trigger name on the Amazon forums :smile: That would definitely be neat… I think any word that is not very common (such as “Alexa”) helps prevent unintentional triggers, although as a skill perhaps this does not matter. It would be fantastic to get it set up natively with Alexa as a skill, opening up additional possibilities. Even the limited functionality through the hue emulator has already proven this to be a viable, “real deal” voice recognition system.

I’m currently using the echobridge successfully to arm the security system via the DSC binding, and thanks to recent improvements to that binding I am also able to enable/disable the chime feature and open the garage door through its output.

I also have a switch item set up to run a rule that changes the thermostats when leaving for the day, and lights too of course. Switches work very fast, sometimes responding before she has even said “OK”.

Having the echobridge as a binding would be great; I wish I knew how to code Java well, as I would jump on it.

That’s funny @ubergeek, the very first thing I did when I got my Echo was put in a request to change the wake word from Alexa to “computer”. I am also using the echo bridge project to control about a half dozen lights, my Sonos equipment and my pool. The official API requires an application wake word (“tell openhab to …”), which is not as convenient as just saying “Alexa, turn kitchen lights on”. I now own three Echos! Good voice recognition is a game changer.


@digitaldan knows how to code Java :slight_smile: Have you thought about creating a real bundle for it yet, so that no additional bridge needs to be set up? As I will get my Echo tomorrow, I am now very interested in it myself and will thus have every possibility for review and testing!

I think you could take each clause as a group name and constrain the set of items returned to only those belonging to all named groups. The example “(main lights) in (guest room) on (second floor)” means the items returned would have to belong to all three groups (main_lights, guest_room and second_floor), so no hierarchy is needed.
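
A small sketch of that constraint, reusing the same kind of hand-rolled item-to-groups map as earlier in the thread (the item and group names are placeholders, not real openHAB API calls):

```java
import java.util.*;
import java.util.stream.Collectors;

public class MultiGroupFilter {

    /** Returns the items belonging to ALL of the named groups. */
    static List<String> itemsInAllGroups(Map<String, Set<String>> itemGroups,
                                         Collection<String> groups) {
        return itemGroups.entrySet().stream()
                .filter(e -> e.getValue().containsAll(groups))
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Set<String>> itemGroups = new LinkedHashMap<>();
        itemGroups.put("guest_ceiling_light",
                new HashSet<>(Arrays.asList("main_lights", "guest_room", "second_floor")));
        itemGroups.put("guest_bedside_lamp",
                new HashSet<>(Arrays.asList("lights", "guest_room", "second_floor")));

        // "(main lights) in (guest room) on (second floor)" -> [guest_ceiling_light]
        System.out.println(itemsInAllGroups(itemGroups,
                Arrays.asList("main_lights", "guest_room", "second_floor")));
    }
}
```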