Hi,
Thanks to all of you who attended my talk at the Smart Home Day about offline voice recognition – I hope you enjoyed it as much as I enjoyed preparing and delivering it.
I’ve been asked to share the configuration I put together, so here it is before I wipe the setup – hopefully I haven’t missed anything:
Hardware
- Raspberry Pi 3 Model B (anything will work for openHAB, really, but Snips requires an RPi 3 or a compatible ARM device)
- Microphone: PlayStation Eye (https://www.amazon.de/Sony-9600459-PLAYSTATION-Eye/dp/B000W3YQ1Y/ref=sr_1_1?ie=UTF8&qid=1508829019&sr=8-1&keywords=playstation+eyeF)
Configuration
https://gist.github.com/ghys/a55555481e22bd705cdba7742ae99373
referred to as “the Gist” below.
Sphinx configuration
- Create a directory structure (I used /opt/stt):
sudo mkdir /opt/stt
sudo mkdir /opt/stt/de
sudo mkdir /opt/stt/de/grammar
- Download https://sourceforge.net/projects/cmusphinx/files/Acoustic%20and%20Language%20Models/German/cmusphinx-de-ptm-voxforge-5.2.tar.gz/download (the German acoustic model) and extract it to /opt/stt/de/cmusphinx-de-ptm-voxforge-5.2
- Download https://sourceforge.net/projects/cmusphinx/files/Acoustic%20and%20Language%20Models/German/cmusphinx-voxforge-de.dic/download (the German dictionary) and save it as /opt/stt/de/de.dic (both downloads are sketched as shell commands below).
Edit the file with a text editor and add the German pronunciation for “openHAB” on its own line (anywhere in the file is fine):
openhab Q OOH P AX N HH AAH B
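If you prefer doing both downloads from the shell, the steps above look roughly like this (a sketch: the SourceForge /download URLs redirect to the actual files, and I’m assuming the archive unpacks to a directory named after itself):
cd /opt/stt/de
# acoustic model: download, then extract to cmusphinx-de-ptm-voxforge-5.2/
sudo wget -O cmusphinx-de-ptm-voxforge-5.2.tar.gz "https://sourceforge.net/projects/cmusphinx/files/Acoustic%20and%20Language%20Models/German/cmusphinx-de-ptm-voxforge-5.2.tar.gz/download"
sudo tar -xzf cmusphinx-de-ptm-voxforge-5.2.tar.gz
# dictionary: download directly to de.dic, then add the openhab line as shown above
sudo wget -O de.dic "https://sourceforge.net/projects/cmusphinx/files/Acoustic%20and%20Language%20Models/German/cmusphinx-voxforge-de.dic/download"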
- Download commands.gram and openhab.gram from the Gist and place them both in the /opt/stt/de/grammar directory (a sketch of the grammar format follows).
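The actual grammars are in the Gist; just to illustrate the JSGF format they use, a grammar accepting phrases like the examples below could look like this (hypothetical contents, not the Gist’s):
#JSGF V1.0;
grammar commands;
public <command> = <location> Licht <action>;
<location> = Büro | Schlafzimmer;
<action> = einschalten | ausschalten;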
Optional: You can use LiveDE.java from the Gist to test the setup against the commands.gram grammar (you’ll need the Sphinx library in your CLASSPATH: sphinx4-core-5prealpha-SNAPSHOT.jar). I used BlueJ during the demo since it’s installed by default on Raspberry Pi Desktop. Note: it didn’t work well during the demo, but that was bad luck – it performed far better while I was preparing. Use phrases following the grammar, for example: Büro Licht einschalten, Schlafzimmer Licht ausschalten, etc.
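If you’d rather avoid BlueJ, compiling and running from a terminal should look roughly like this (a sketch assuming LiveDE.java and the sphinx4 jar sit in the current directory and the class is named LiveDE with no package):
javac -cp sphinx4-core-5prealpha-SNAPSHOT.jar LiveDE.java
java -cp .:sphinx4-core-5prealpha-SNAPSHOT.jar LiveDE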
Snips configuration
(note: it is recommended to install Snips on a dedicated Raspberry Pi for best performance)
- Install with
curl https://install.snips.ai -sSf | sh
(or follow the instructions: https://github.com/snipsco/snips-platform-documentation/wiki)
- Download this assistant: https://drive.google.com/file/d/0Bw230k8qOwrmdmdYSUxXSGotbE0/view?usp=sharing and install it with
snips-install-assistant assistant_proj_arkkdl6O7.zip
(or you can go to https://console.snips.ai/ and make your own)
- Run Snips with
snips
- You can test your setup with
snips-watch
in another terminal window – say “Hey Snips”, then for instance “can you please make the lights in the kitchen yellow”.
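snips-watch shows what the platform understood; the recognized intents are also published over MQTT on the hermes/intent/# topics. The payload is a JSON document shaped roughly like this (field names as consumed by the script further below; the values are only illustrative):
{
  "input": "make the lights in the kitchen yellow",
  "intent": { "intentName": "ActivateLightColor" },
  "slots": [
    { "slotName": "objectLocation", "rawValue": "kitchen" },
    { "slotName": "objectColor", "rawValue": "yellow" }
  ]
}
You can also watch these messages directly with mosquitto_sub -v -t 'hermes/intent/#' on the Snips Pi (assuming the mosquitto clients are installed).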
openHAB configuration
- Download the latest openHAB snapshot (I made a manual installation to /opt/openhab)
- Add the required add-on .jars – notably the CMU Sphinx speech-to-text bundle from the beta-testing topic linked below – to the addons folder
- Configure the items: place voice.items from the Gist in conf/items
- Configure the add-ons: place addons.cfg from the Gist in conf/services and adjust if necessary
- Configure the MQTT binding: place mqtt.cfg from the Gist in conf/services and adjust if necessary (a sketch of what it needs to contain follows)
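mqtt.cfg points openHAB’s MQTT binding at the broker used by the Snips platform. A minimal sketch, assuming the Snips Pi is reachable at 192.168.1.x and that voice.items references a broker alias named snips (check the Gist for the actual values):
# connection to the MQTT broker running on the Snips Pi
snips.url=tcp://192.168.1.x:1883
snips.clientId=openhab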
- Run openHAB and finish the configuration in Paper UI:
- Configuration > System: check the regional/language settings
- Configuration > Services > Voice > CMU Sphinx Speech-to-Text: point the add-on to the acoustic model, dictionary and grammars set up above
Note! Only click Save with “Start listening” activated when you’re ready and Snips is stopped (the Sphinx bundle and Snips cannot share the microphone), because the hotword spotting/voice recognition will start right away! Check the openHAB logs for any errors!
- Configuration > System: set the defaults in the Voice section (e.g. CMU Sphinx as the default Speech-to-Text service)
- Open the HABPanel local configuration editor (http://localhost:8080/habpanel/index.html#/settings/localconfig) and import this config: habpanel-config.json. This will make the “Voice Demo” dashboard available.
- You can now try Sphinx in German:
- Speak the keyword: openhab
- Speak a command supported by the default interpreter:
- Schalte das erste Licht ein
- Mach das zweite Licht rot
- etc.
See the interpreter’s source code in the Eclipse SmartHome repository for more details about the supported sentences: StandardInterpreter.java
For Snips:
- First disable the Sphinx bundle (with the Karaf console or by moving the .jar out of the addons folder). Restart openHAB to ensure the microphone input is released.
- Open Flows Builder from http://localhost:8080/flowsbuilder/index.html
- Drag and drop a When an item is updated node from the left column
- Click the node and change “Item” to Snips_Intent in the right column
- Drag and drop a Execute a given script node from the left column
- Click the node, then the “Edit script” button in the right column, paste this script, and click Save:
var colors = { yellow: '50,100,100', blue: '200,100,100', red: '0,100,100' }; // etc. – color name to HSB value
// 'state' holds the triggering update: the intent JSON published by Snips over MQTT
var parsed = JSON.parse(state);
if (!parsed.input || !parsed.intent) {
  throw 'intent not recognized';
}
// extract the color and location slots from the intent, if present
var colorSlots = parsed.slots.filter(function (s) { return s.slotName == 'objectColor'; });
var color = (colorSlots.length) ? colorSlots[0].rawValue : null;
var locationSlots = parsed.slots.filter(function (s) { return s.slotName == 'objectLocation'; });
var location = (locationSlots.length) ? locationSlots[0].rawValue : null;
// map the spoken location to an actual item name
var item = (location == 'kitchen') ? 'Light1' : (location == 'bedroom') ? 'Light2' : null;
var colorValue = (color && colors[color]) ? colors[color] : '0,0,100'; // default to white
print('location=' + location + ' colorValue=' + colorValue + ' item=' + item);
if (item) {
  switch (parsed.intent.intentName) {
    case 'ActivateLightColor':
      events.sendCommand(item, colorValue);
      break;
    case 'ActivateObject':
      events.sendCommand(item, ON);
      break;
    case 'DeactivateObject':
      events.sendCommand(item, OFF);
      break;
  }
}
- Link the two nodes by clicking the output of the orange one and dragging it to the input of the blue one.
- Save (3rd icon in the toolbar), give the flow a name (e.g. “snips”) and click the blue Publish button.
- Start using Snips: say “hey Snips” then a supported sentence (“please set the kitchen lights blue”) – the MQTT binding will update the Snips_Intent item and the rule will execute the script accordingly.
That’s it – please share your feedback if you try this!
Additional resources:
- Calling for help beta-testing a CMU Sphinx speech recognition add-on
- Integrate a Snips voice assistant with openHAB - Walkthrough
- Also check out the Mycroft openHAB skill for another alternative: Mycroft Openhab Skill