New HumanLanguageInterpreter addon based on Stanford CoreNLP

Hello all,

I’ve been working on a new addon which uses the Stanford CoreNLP parser for understanding simple queries and commands (English only at the moment). If you want to give it a try, you can find instructions here:

It’s designed to work against items defined via openHAB’s Home Builder without needing additional metadata. So if you use Home Builder, you can test it with your own site; otherwise, for now it’s probably best to test it with the demo items. Currently only boolean items (switches/contacts) are understood.

Here are some examples of what it can understand so far:

  • how many lights on the ground floor are on?
  • turn off all lights on the first floor
  • turn on the bathroom ceiling light
  • is the kitchen window open?
  • is any light on in the kitchen or the bedroom?
  • are any doors open?
  • which doors are open?
  • which lights are on in the kitchen?
  • is there a window in the kitchen?
  • how many rooms are on the ground floor?
  • which lights are in the cellar?

This is quite rudimentary, mostly because the big effort so far has been to lay the groundwork for a world model + parser + interpreter while struggling with the quirks of CoreNLP (plus wrangling OSGi). I have a lot of ideas for enhancing it, but I would love feedback from others about what features would be most useful.



I’ve added a listing for this addon to the marketplace as well:

Since it’s not a binding, and it’s distributed as a zip file, I’m not sure whether the autoinstall will work; I’m going to try that out. If anyone knows of a way to combine a bunch of OSGi bundles into a single uber bundle, that would simplify things considerably.

I figured out how to roll my own uber OSGi bundle, so I’ve updated the listing, and it’s now possible to install this addon via the Paper UI.

I also updated the instructions page. The addon is currently listed as a binding, even though it’s really a voice addon. That’s because there’s currently no way to list a voice addon in the marketplace.

Wow! Pretty cool. How’d you do it?

The parser starts with a constituent parse (NP, VP, etc) from Stanford CoreNLP, and then applies rules top-down, attempting to match the phrase tree to sentence patterns it understands, and transforming the syntax tree into a semantic tree.
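To make the top-down rule matching concrete, here’s a small illustrative sketch (not the addon’s actual code, which is written against CoreNLP’s Java API): a hand-built constituent tree for “turn off all lights” is matched against an imperative-command pattern and rewritten into a semantic structure. The `Tree` class, the pattern, and the output dictionary shape are all invented for illustration; the node labels (S, VP, NP, etc.) follow the Penn Treebank style that CoreNLP’s constituent parsers produce.

```python
from dataclasses import dataclass, field

@dataclass
class Tree:
    label: str
    children: list = field(default_factory=list)
    word: str = ""  # leaf token, if any

def words(t):
    """Flatten a subtree back into its surface words."""
    if t.word:
        return [t.word]
    out = []
    for c in t.children:
        out.extend(words(c))
    return out

def to_semantics(t):
    """Match the phrase tree top-down against known sentence patterns."""
    labels = [c.label for c in t.children]
    # Pattern: imperative command, e.g. (S (VP (VB turn) (PRT off) (NP ...)))
    if t.label == "S" and labels == ["VP"]:
        vp = t.children[0]
        verb = words(vp.children[0])[0]
        particle = words(vp.children[1])[0]
        target = " ".join(words(vp.children[2]))
        return {"kind": "command", "action": verb + " " + particle,
                "target": target}
    return None  # no pattern matched; caller can fall back to another parse

# "turn off all lights" as a hand-built constituent tree
tree = Tree("S", [Tree("VP", [
    Tree("VB", word="turn"),
    Tree("PRT", [Tree("RP", word="off")]),
    Tree("NP", [Tree("DT", word="all"), Tree("NNS", word="lights")]),
])])
print(to_semantics(tree))
```

Returning `None` when no pattern matches is what makes the fallback strategy below possible: a parse that can’t be transformed is simply rejected rather than treated as an error.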

CoreNLP provides a number of different constituent parser implementations, and some of them work well for certain sentence patterns but not for others. So SHLURD uses a fallback parsing technique: if it fails to come up with a semantically valid transformation for the first parse attempt, it tries again with the next constituent parser implementation, giving up only if it has exhausted the list.
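The fallback loop itself can be sketched in a few lines. The parser and interpreter functions here are toy placeholders, not CoreNLP API; the point is just the control flow: keep the first parse that yields a semantically valid transformation, and give up only after every parser implementation has been tried.

```python
def parse_with_fallback(sentence, parsers, interpret):
    """Return the first semantically valid interpretation, else None."""
    for parser in parsers:
        tree = parser(sentence)
        semantics = interpret(tree)
        if semantics is not None:  # valid transformation found
            return semantics
    return None  # exhausted every parser implementation

# Toy stand-ins: the first "parser" fails on this sentence, the second works.
bad_parser = lambda s: None
good_parser = lambda s: ("S", s.split())
interpret = lambda tree: {"tokens": tree[1]} if tree else None

print(parse_with_fallback("are any doors open",
                          [bad_parser, good_parser], interpret))
# -> {'tokens': ['are', 'any', 'doors', 'open']}
```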

If parsing is successful, then SHLURD attempts to interpret the semantic tree by matching references against its world model (which is built by reading your item definitions from openHAB). If the sentence is a query, it responds with the answer based on current item states; if it’s a command, it carries it out by sending an openHAB command event to the relevant item(s).
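Here’s a rough sketch of that interpretation step: references are resolved against a tiny world model, a query reads current item states, and a command emits events for the matching items. The item names, the tag-set world model, and the `send_command` stand-in are all invented for illustration; the real addon builds its model from your openHAB item definitions.

```python
world = {  # a tiny world model: item name -> (qualities, current state)
    "GF_Kitchen_Light": ({"light", "kitchen"}, "ON"),
    "GF_Kitchen_Window": ({"window", "kitchen"}, "CLOSED"),
}

sent_events = []

def send_command(item, command):
    """Stand-in for posting an openHAB command event to the bus."""
    sent_events.append((item, command))

def resolve(tags):
    """Find items whose qualities include every tag in the reference."""
    return [name for name, (kinds, _) in world.items() if tags <= kinds]

def interpret(semantics):
    items = resolve(semantics["tags"])
    if semantics["kind"] == "query":
        return [world[i][1] for i in items]  # answer from current states
    for i in items:  # command: act on the relevant item(s)
        send_command(i, semantics["command"])
    return "OK"

print(interpret({"kind": "query", "tags": {"kitchen", "window"}}))
print(interpret({"kind": "command", "tags": {"light"}, "command": "OFF"}))
print(sent_events)
```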

Will this be updated for openHAB version 2.5.x? When installing through the Paper UI I get:

[ERROR] [com.lingeringsocket.shlurd.openhab                ] - bundle com.lingeringsocket.shlurd.openhab:2.2.0.qualifier (359)[org.openhab.shlurdhli(433)] : The activate method has thrown an exception

in openhab2.log…

I’ll take a look and see if I can bring it up to speed with the latest…