Nanoleaf Canvas, capturing Touch Events

Here is how to get Nanoleaf Canvas touch events sent to openHAB items: a small Python script listens to the Canvas touch event stream and posts the panel id and gesture to two Number items via the openHAB REST API.

.items

Group:Number canvasTouch
Number CanvasTouchId      (canvasTouch)
Number CanvasTouchGesture (canvasTouch) 

.rules

rule "Nanoleaf touch"
when
    Member of canvasTouch received update
then
    logInfo("CanvasTouch", triggeringItem.name + " State=" + triggeringItem.state)

    switch(triggeringItem.name) {
        case "CanvasTouchId":{  
          logInfo("CanvasTouch", "PanelId=" + triggeringItem.state + " gesture=" + CanvasTouchGesture.state)
          switch(triggeringItem.state) {
              case 22092: {
                  CanvasKitchenPower.sendCommand(if (CanvasKitchenPower.state==ON) OFF else ON)
                  // the toggle command has not been processed yet, so this check still sees the previous state:
                  // when the panels were OFF and are now being switched ON, restore brightness and effect
                  if (CanvasKitchenPower.state==OFF) {
                      CanvasKitchenColor.sendCommand(100)
                      CanvasKitchenEffect.sendCommand("Fray")
                  }
              }
              case 15141: {
                      CanvasKitchenColor.sendCommand(25)
                      CanvasKitchenEffect.sendCommand("WhiteDimmed")
              }
          }
        } 
        default: {
        }
    }
end
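
To test the rule without touching a panel, something like this should work (a sketch: it posts a dummy gesture and panel id to the two items via the openHAB REST API, in the same order the script below uses):

curl -X POST -H "Content-Type: text/plain" -d "0" http://localhost:8080/rest/items/CanvasTouchGesture
curl -X POST -H "Content-Type: text/plain" -d "15141" http://localhost:8080/rest/items/CanvasTouchId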

nanoleafTouches.py

# -*- coding: utf-8 -*-

import requests
import logging
import json

logging.basicConfig(filename='/home/omr/nanoleafTouches.log', level=logging.DEBUG, format='%(asctime)s %(message)s')
logging.info("Started")

canvasIP='192.168.1.12:16021'
canvasToken='snJIIotZTyLiBomN2Lym6PHvH5pUkHkd' # API auth token (create one via POST to /api/v1/new while the Canvas is in pairing mode)
canvasEvent='/events?id=4' # touch

openHabIp='localhost:8080'
openhabGestureItem='/rest/items/CanvasTouchGesture'
openhabIdItem='/rest/items/CanvasTouchId'

canvasGetUrl='http://'+canvasIP+'/api/v1/'+canvasToken+canvasEvent
openHabPostUrlGesture='http://'+openHabIp+openhabGestureItem
openHabPostUrlId='http://'+openHabIp+openhabIdItem

headersGET = { 'Accept': 'application/json',  'Content-Type': 'application/json', }
headersPOST = { 'Accept': 'application/json', 'Content-Type': 'text/plain', }

#print(canvasGetUrl)
#print(openHabPostUrlGesture)
#print(openHabPostUrlId)

def connectToCanvas():
  global response
  global rr
  global newlineSeen

  response = requests.get(canvasGetUrl, headers=headersGET, stream=True)
  logging.info('Listening...')
  rr=''
  newlineSeen=False

def parseEvent():
  global rr
  rr=rr[12:] # strip the non-JSON prefix "id: 4 data: " (always 12 characters)
  logging.info(rr)
  loaded_json = json.loads(rr)
  gesture=loaded_json['events'][0]['gesture']
  panelId=loaded_json['events'][0]['panelId']
#  print(gesture)
#  print(panelId)
  # post the gesture first so that it is already up to date when the rule triggers on the panel id
  res1=requests.post(openHabPostUrlGesture, headers=headersPOST, data=str(gesture))
  res2=requests.post(openHabPostUrlId, headers=headersPOST, data=str(panelId))

def rxEvent():
  global response
  global rr
  global newlineSeen

  r=response.raw.read(1).decode('utf-8', 'ignore') # read one byte at a time until \n\n; the decode keeps this working on Python 3 too
  if len(r) == 0:
    connectToCanvas() # connection was lost, re-connect
  else:
    if newlineSeen:
      if r == '\n':
        parseEvent()
        newlineSeen=False
        rr=''  # start over
      else:
        newlineSeen=False
        rr=rr+r # accumulate
    else:
      if r == '\n':
        newlineSeen=True
        rr=rr+' ' # replace \n with ' '
      else:
        rr=rr+r # accumulate


connectToCanvas()
while True:
  # typical response:
  # id: 4\ndata: {"events":[{"panelId":42794,"gesture":0}]}\n\n
  rxEvent()
  
# never stop listening  
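
For a first test, the script can be run by hand and the log watched in a second terminal (the log path is the one configured at the top of the script):

python nanoleafTouches.py
tail -f /home/omr/nanoleafTouches.log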

.log (.py)

2019-06-25 23:06:39,093 Starting new HTTP connection (1): 192.168.1.12
2019-06-25 23:06:52,721 Started
2019-06-25 23:06:52,727 Starting new HTTP connection (1): 192.168.1.12
2019-06-25 23:06:52,741 "GET /api/v1/snJIIotZTyLiBomN2Lym6PHvH5pUkHkd/events?id=4 HTTP/1.1" 200 None
2019-06-25 23:06:52,742 Listening...
2019-06-25 23:07:39,429 {"events":[{"panelId":15141,"gesture":0}]}

.log (oh2)

2019-06-25 23:07:39.443 [INFO ] [e.smarthome.model.script.CanvasTouch] - CanvasTouchGesture State=0
2019-06-25 23:07:39.448 [INFO ] [e.smarthome.model.script.CanvasTouch] - CanvasTouchId State=15141
2019-06-25 23:07:39.450 [INFO ] [e.smarthome.model.script.CanvasTouch] - PanelId=15141 gesture=0

.service

[Unit]
After=network.target

[Service]
Type=simple
ExecStart=/usr/bin/python /home/omr/nanoleafTouches.py
StandardInput=tty-force

[Install]
WantedBy=default.target
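
Installed and enabled roughly like this (assuming the unit file is saved as /etc/systemd/system/canvasTouch.service):

sudo cp canvasTouch.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable canvasTouch.service
sudo systemctl start canvasTouch.service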

@OMR

I am very much interested in using the panels as switches (actually this is the main reason I bought them). I am pretty familiar with openHAB, but I don’t exactly know the following:

  • Where (directory in openhab) should I put the nanoleafTouches.py
  • Where (directory) would I find the log file?
  • What is the name of the service file (nanoleaftouch.cfg?) and does it go into the services directory of openhab?

Cheers
Stefan

Hello.
Nice to know someone else has use for this :slight_smile:
It should definitely be part of the binding some day.
This solution works pretty well, but I have 2 issues:

  1. The service doesn’t start after a reboot. Don’t know why. I have enabled it.
  2. If the Canvas restarts, the service also needs to be restarted (a possible unit tweak for both is sketched below).
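
My current guess (untested) is that a restart policy plus waiting for the full network would cover both, i.e. extending the unit with:

[Unit]
After=network-online.target
Wants=network-online.target

[Service]
Restart=always
RestartSec=10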

Please let me know if you are able to figure this out.

The .py just lives in my ~ folder (/home/omr).

The log file is also in ~ (or change the path in the program). The service file is a plain systemd unit, not an openHAB .cfg; mine is /etc/systemd/system/canvasTouch.service:

omr@shs2:~$ sudo systemctl status canvasTouch.service
● canvasTouch.service
   Loaded: loaded (/etc/systemd/system/canvasTouch.service; enabled; vendor preset: enabled)
   Active: active (running) since Sat 2019-09-28 18:30:20 CEST; 8s ago
 Main PID: 19570 (python)
    Tasks: 1
   Memory: 9.4M
      CPU: 113ms
   CGroup: /system.slice/canvasTouch.service
           └─19570 /usr/bin/python /home/omr/nanoleafTouches.py

Sep 28 18:30:20 shs2 systemd[1]: Stopped canvasTouch.service.
Sep 28 18:30:20 shs2 systemd[1]: Started canvasTouch.service.
omr@shs2:~$

Thanks @OMR,

I am now working on setting it up on my system.

One question before I start using it :wink:

I’m trying to understand the code (I am pretty good in other languages but I don’t know Python and its API very well), and there is one thing that strikes me: do you ever wait at some point to give the CPU / the panels some kind of relief, or are you firing one request after the other at the panels?

cheers
Stefan

Meanwhile I was brave enough to try it out, and even in top there’s hardly any CPU usage visible for that process. … and it works. Now I can play around with it a bit. Sometimes it doesn’t recognize my touches, so I will investigate in which cases and why (I think you shouldn’t be too soft but rather “tap” a bit harder). Maybe in the future I’ll look into how this could be applied to the nanoleaf binding and provide a merge request (at least I feel some itch for that at the moment). Let’s see how it goes…

By the way, if you want to show the values in your sitemap, you should define the items like this:

Items:
Number CanvasTouchId "Nano Touchid [%.0f]"     (canvasTouch)
Number CanvasTouchGesture "Nano Gesture [%.0f]" (canvasTouch) 

Sitemap:
Frame label="Nano" {
          Default item=CanvasTouchId  
          Default item=CanvasTouchGesture 
}

For some reason %d doesn’t work here - not clear why.

It also seems that receiving gestures is pretty unreliable (I have only seen them work rather seldom, once in a while).

Regarding the service: Did you put it in init.d?

I think it does the equivalent of a Linux select()/poll() on this line:

r=response.raw.read(1).decode('utf-8', 'ignore') # read one byte at a time until \n\n
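
So the loop is not busy-waiting: the read blocks on the socket until the Canvas sends another byte. An untested simplification would be to let requests split the stream into lines instead of reading single bytes:

for line in response.iter_lines(chunk_size=1): # blocks just like raw.read(1); chunk_size=1 so events are not held back in a buffer
    if line.startswith(b'data:'):              # the payload line of each touch event
        event = json.loads(line[len(b'data:'):])
        # ...post event['events'][0]['gesture'] and ['panelId'] as in parseEvent()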

I’m not using Python as my primary programming language either, but I was inspired by some other examples for integrating my car into OH2 (https://github.com/ardevd/jlrpy).

As for init vs. systemd, have a read (or google): https://www.tecmint.com/systemd-replaces-init-in-linux/

As for the Canvas touches and gestures in particular, it takes a bit of practice. I don’t use gestures yet, but I use touches quite a lot. I also find that the people over at Nanoleaf are quite interested in user feedback, so they might be able to refine the gesture detection if it gives you trouble.

As for contributing bindings, I have read on this forum and on GitHub that there has been some tool-chain trouble over the last months. I know they are trying to come up with a better how-to for setting that up. I haven’t checked in a while, so they might have something already.

Everything works pretty well now, so thanks again for your work! Very much appreciated!

The way I did the startup for the Python script in my case is, as I mentioned earlier, with init.d. I have written a pretty simple init script which just starts the Python script in the background and looks as follows:

Filename: nanoleaf
Directory: /etc/init.d

#! /bin/sh
### BEGIN INIT INFO
# Provides:          nanoleaf python script for openhab2
# Required-Start:    $all
# Required-Stop:     $all
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: nanoleaf events
# Description:       Start nanoleaf python script
### END INIT INFO
# Author: Stefan Höhn

nohup python /etc/openhab2/python/nanoleaf.py &

Then do

sudo chmod 755 /etc/init.d/nanoleaf

and

sudo update-rc.d nanoleaf defaults

and this works well on restart of the system.
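
If you ever need to stop it cleanly as well, an (untested) extension would be to keep the same LSB header but track the PID, roughly:

case "$1" in
  start)
    nohup python /etc/openhab2/python/nanoleaf.py >/dev/null 2>&1 &
    echo $! > /var/run/nanoleaf.pid
    ;;
  stop)
    [ -f /var/run/nanoleaf.pid ] && kill "$(cat /var/run/nanoleaf.pid)" && rm -f /var/run/nanoleaf.pid
    ;;
  *)
    echo "Usage: $0 {start|stop}"
    ;;
esac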

Just to let you know, Ole: I recently got together with Martin, who has written the nanoleaf binding, and he agreed to let me contribute to it. The good news is that meanwhile it is done and under review. There are small things I’d like to optimize, but basically it is working pretty nicely.

So stay tuned: touch events will be available per individual panel, and you can write rules based on a pulse on a switch item.
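
Once it is released, a rule could then look roughly like this (Panel1_Tap is only a placeholder for whatever switch item you link to the new panel channel, so take it as a sketch):

rule "Kitchen panel tapped"
when
    Item Panel1_Tap received update ON
then
    CanvasKitchenPower.sendCommand(if (CanvasKitchenPower.state == ON) OFF else ON)
end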

cheers
Stefan

Hi Ole, I just wanted to let you know that touch support is now officially available in 2.5.

A big thanks to you, because your working proof of concept got me thinking about implementing it in the binding, which I then finally did. So kudos to you for doing the proof and convincing me to do it.

Thanks
Stefan

Thank you also.
Will try it out :slight_smile:

@stefan.hoehn will it be part of the 2.5 release soon, or do I need to install a .jar ?

It is already part of 2.5.1. Installing 2.5.1 should be enough to have it working. One note though: I just noticed that setting the color of an individual panel got broken at some stage during the official Canvas support implementation. The fix is already under review and will soon be out with 2.5.2.

Hmm, I’m already on 2.5.1.
I’m using auto-discovery.
Maybe I need to delete my Light Panels and re-discover them?


I had the same problem: I was on 2.5.1 but no touch was available. Either using a .jar or updating the installed binding solved it for me.

Can you do me a favor and do the following:

  • turn on trace logging in karaf (log:set trace org.openhab.binding.nanoleaf)
  • Delete the light panels. It is likely that panel things are not compatible anymore.
  • Do not delete the controller (at least for now)
  • Rediscover your panels.
  • They now should have the new channels available.
  • If not, provide the full log, filtered for “nano” (see the example below)
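
For the last point, filtering from the shell is enough, assuming the default openHAB 2 log location:

grep -i nano /var/log/openhab2/openhab.log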

Hope that helps
Stefan

Tried removing 1 panel and re-discovering it. No change.
Removing 102 panels will take some time … :wink:

OH my gosh! 102 panels? Really? I can’t believe that there are really people out there who have so many devices :-).

  • Do they all belong to the same controller?
  • What are you doing with so many panels?

Can you contact me directly? Just to be sure, I want to send you the latest snapshot of the bundle and see if that makes a difference.
And can you also provide me the log of all nano log statements? Be sure to set it to trace when you discover a panel.

Believe it :slight_smile:

Yes, same controller. (Have another one with only 6 panels behind the TV)

Sure, just send a PM through this forum?

Awesome. Yep, PM me with your mail address.