Welcome to the HABot Walkthrough series!
This is the first in a series of topics about HABot, the chatbot for openHAB and a new card-based user interface. It has been available in the openHAB distribution since version 2.4.0.M5 (Milestone 5).
This is a wiki topic, regular members as well as staff may edit it freely and are encouraged to do so! The content will eventually be migrated somewhere in the main documentation when it’s stabilized and enough feedback has been received. When replying to posts on this series, please try to stay on-topic - there will be several of them, each covering a particular feature or area, so please look for the appropriate topic and reply there, or create a new topic. Thanks!
What is HABot?
HABot is a natural language interpreter; you can ask it questions about your items in plain language, or have it perform actions. Unlike the simple or rule-based interpreters, HABot uses machine-learning natural language processing (powered by Apache OpenNLP) to figure out what you mean; thanks to its training, it will try to understand you even if you don’t phrase the question exactly the way it has been pre-programmed, so you can for example be polite and add “please” or “could you tell me” even though it’s useless information (and perhaps a little awkward!). In this regard, it’s similar to voice-only assistants like Alexa, Siri, Google Assistant or Cortana, but it is focused on your openHAB smart home setup and doesn’t need any cloud service to do the interpretation.
HABot could be used as a building block for voice-only interaction (à la Amazon Echo, Google Home or Mycroft/Snips) which was the original intent, but this use case is not the main focus anymore: a major feature of its chat-based user interface is that it will also provide visual information in the form of a “card” along with the natural language answer. These cards may contain up-to-date read-only information (relevant items’ states, images and charts) but may also contain interactive controls like switches, sliders and buttons to allow control in a traditional fashion. Therefore, a common use case for HABot, instead of giving an order, is to ask it to find items corresponding to the query (for instance, “lights on the first floor”) and use the controls on the card to visualize the states and act on the items, like switching them on or off.
Introduction: Skills & intents
To function, the natural language processing interpreter needs training data to derive an Intent and extract Entities from the query. It also needs to know how to respond to a query once the intent has been identified. Both of those aspects are handled by a Skill, which will provide the training data and the implementation of a particular intent.
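To make the skill/intent relationship concrete, here is a deliberately simplified sketch in Java. Note that the names below (`Skill`, `getIntentId`, `reply`) are illustrative only and are *not* HABot’s actual API — they just model the idea that each skill declares the intent it handles and produces the answer once the interpreter has extracted the entities:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of the skill/intent architecture (not the real HABot API):
// a skill declares which intent it handles and knows how to answer it.
interface Skill {
    String getIntentId();                        // e.g. "get-status"
    String reply(Map<String, String> entities);  // entities extracted from the query
}

// A toy "get-status" skill: looks up a fake item state by the "object" entity.
class GetStatusSkill implements Skill {
    private final Map<String, String> itemStates = Map.of(
        "kitchen temperature", "21 °C",
        "front door", "CLOSED");

    public String getIntentId() { return "get-status"; }

    public String reply(Map<String, String> entities) {
        String object = entities.getOrDefault("object", "");
        return itemStates.getOrDefault(object, "I couldn't find that item.");
    }
}

public class SkillDemo {
    public static void main(String[] args) {
        // In the real interpreter, OpenNLP derives the intent and entities
        // from the free-text query; here we hard-code them to show the
        // dispatch step: intent id -> skill -> reply.
        Map<String, Skill> skills = new HashMap<>();
        Skill status = new GetStatusSkill();
        skills.put(status.getIntentId(), status);

        Map<String, String> entities = Map.of("object", "kitchen temperature");
        System.out.println(skills.get("get-status").reply(entities)); // 21 °C
    }
}
```

In HABot itself, the training data and the reply logic for an intent live together in the skill, which is what makes it possible to add new intents without touching the interpreter.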
HABot comes with the following built-in skills:
Get information (Intent Id: get-status) - finds information and replies with a card
Examples: what’s the temperature in the kitchen, show me the lights in the bedroom, shutters in the living room, is the front door open?
Activate object (Intent Id: activate-object) - finds switchable items and turns them on before presenting them in a card
Examples: turn on the tv, activate the alarm, i’d like some music in the bathroom
Deactivate object (Intent Id: deactivate-object) - finds switchable items and turns them off before presenting them in a card
Examples: turn off the air conditioning, switch off the projector in the living room, no more heating in the bedroom
Set value (Intent Id: set-value) - finds appropriate items and tries to set their state to the specified “value”, which can be a number (percentage, setpoint…) or a color.
Examples: set the thermostat to 80 °F, volume in the living room to 30%, change the temperature to 21 degrees, set the lights in the patio to yellow, turn the color of the lights on the first floor blue
Get historical information (Intent Ids: get-history-hourly/daily/weekly/monthly) - these intents will all try to generate charts with historical information on the matched items, using the default persistence service.
Examples: temperature over the last 2 days, show a graph of the power consumption for last week, draw a chart of the water consumption for the past 3 weeks, what was the humidity of the last 4 hours, show me the temperature history for the past 6 months
Get the last change for an item (Intent Id: get-history-last-changes) [note: not working so well at the moment!] - shows a card indicating the last time an item’s state changed.
Examples: when was the window in the bathroom opened?, when was the alarm triggered?, when did you detect movement in the garden
Create a rule or timer (Intent Id: create-rule) [note: not taking the time & date information in the query into account at the moment] - shows an interactive & user-friendly card to quickly design a rule on the fly for the next-gen experimental rule engine (which must be activated first).
Examples: create a rule, set up a timer, i want to set up something to run later
Tip: A note on “shortcuts” - if you simply state an attribute (an item’s label or synonym, or terms derived from tags, more on that later) like “front door”, “Amy’s room”, “garage” or “sunrise time” and it matches one or more items, it will automatically be considered a get-status intent, so it will present you a card with what was found without performing any action.
Review the built-in training data for your language (HABot currently supports English, French, German and Dutch) at https://github.com/openhab/openhab-webui/tree/master/bundles/org.openhab.ui.habot/src/main/resources/train for clues on how best to formulate your queries. The more training data there is, the better it will perform, so PRs are appreciated in this area!
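To give an idea of what this training data looks like: each intent has a set of example sentences per language, with placeholders standing in for the entities to extract. The exact file layout and placeholder syntax may differ from what is shown here — check the repository linked above; the lines below are only an illustration of the principle:

```text
# illustrative training sentences for the get-status intent
# (placeholder syntax is approximate - see the actual files in the repo)
what is the <object> in the <location>
show me the <object> in the <location>
is the <object> open
```

The interpreter is trained on these sentences, which is why phrasings close to the training data are recognized most reliably — and why contributing more variations improves accuracy for everyone.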
Additional skills can be added to HABot via OSGi services coming from other add-ons, which must provide the training data for the intents.
If you’re new to HABot, you can download and run a fresh instance of openHAB 2.4.0.M5 or later and select the demo package when starting up: the demo items will be tagged and annotated so that you can play with HABot right away before configuring it for your real instance - it’s also a great base to learn and experiment with how the tagging is done!
Install HABot from Paper UI: look for it under Add-ons > User interfaces and click Install:
Be patient while the HABot bundle is installed (it can take up to a minute), then return to the openHAB dashboard, where you’ll find HABot among the other user interfaces:
You can already use the interface, but it won’t perform well until you semantically tag your items. HABot is also a so-called “Progressive Web App”, so under the right conditions (mainly being served over HTTPS), you can add it to your phone’s home screen or your desktop as an app.
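Tagging is done on the items themselves, and a later topic in this series covers the exact conventions. Purely as a hypothetical illustration of the idea (the tag values below are made up for this example — do not copy them verbatim), a `.items` file entry with tags might look like:

```text
// hypothetical example - see the dedicated tagging topic for the real conventions
Switch  Kitchen_Light  "Kitchen Light"  <light>  (gKitchen)  ["object:light", "location:kitchen"]
```

The label, tags and group memberships are what give HABot the vocabulary to match your queries (“lights in the kitchen”) to the right items.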