Just a Smart Home or your Smart Home – Providing Personalized User Interfaces for Eclipse Smart Home

(Lukas) #1

We would like to discuss some topics with the openHAB/Eclipse Smart Home community that we presented a while ago to Thomas Eichstädt-Engelen and Kai Kreuzer.
First of all, let us introduce ourselves. We (Lukas Smirek and Professor Gottfried Zimmermann’s team) are researchers at Stuttgart Media University (“Hochschule der Medien”). Our research focus is on personalized user interfaces and accessibility.
Our idea is to implement a mechanism for personalized and pluggable user interfaces in Eclipse Smart Home. These mechanisms and concepts have already been applied in another project, called the Universal Remote Console (URC).
In order to get more specific, let us discuss some examples:
Use case 1: Exchange of broken devices
Imagine you have a TV set or a washing machine in your smart home that you can control with your smartphone. After a while one of the devices breaks and you have to exchange it for a new one. This time you decide to buy a different model from a different vendor. Unfortunately, the user interface of the new device is very inconvenient. Thanks to the user interface exchange mechanism of the URC framework, you can keep your old, familiar user interface and do not have to install a new one.
Use case 2: Your new holiday flat
Imagine you buy a holiday flat and equip it with some smart devices that you want to control remotely. Unfortunately, you have to choose different models for your holiday flat than in your regular home. Having different user interfaces in the two places can be very inconvenient and confusing. Fortunately, by using the concepts of URC you do not have to adjust to different user interfaces, because you can use the same ones in both places.
Use case 3: Supporting your parents
Imagine one of your parents suffers a stroke and now has problems with fine motor skills. Hence he or she can no longer control the heating system, which requires a touch screen. With a hard-wired user interface, it might be necessary to replace the whole heating system. The URC user interface exchange mechanism lets you keep the main system and merely replace the conventional touch screen with a tablet with eye tracking, making the system accessible to both parents.
Use case 4: The baby sitter
Imagine you have a nice old lady who takes care of your kids when you want to go out with your spouse. After bringing the children to bed she wants to watch TV. Unfortunately, she is not very good at learning new technical things and has problems with your audio/video system. Fortunately, she can use her familiar user interface on your tablet to control the unfamiliar audio/video system. Additionally, the user interface is a simplified one that shows only the most relevant functionalities.
Use case 5: Your business trip to China
Imagine you are on a business trip in China. When you arrive in your hotel room you want to adjust the air conditioning settings. Unlike in former days, when you had to fight with a control panel with Chinese characters, this time the control panel automatically switches to German and uses an icon set that is familiar to you. Furthermore, the system recommends some environmental settings similar to the ones in your home.

In order to enable these use cases, two major things are necessary: first, an abstraction layer that abstracts from specific devices and separates the user interface from the actual hardware; second, a user interface server that provides globally available user interfaces and user interface settings.
Concerning the abstraction layer, the current concept of thing types and channel types is a very good starting point. However, we think that the potential of these concepts is not yet used to its full extent. Enabling use cases like exchanging one device for another requires that both devices use the same abstraction. Hence we propose to build a binding that contains a set of standardised channel types that have the property “system”. These channel types can then be reused to build bindings for concrete devices. A nice side effect of such a set of standard channel types is that the developer’s effort for integrating new devices decreases.
Furthermore, enabling use cases like the one with the holiday flat requires that the relevant items to be controlled via a user interface are no longer hard-coded in the user interface. Instead, a library is required that enables user interface developers to first specify the thing type they want to connect to, and then the channel type to control a certain functionality. We intend to develop a JavaScript library that could look as follows:

// connect to the openHAB gateway
var openHab = new OpenHAB();

// Get a list of all connected TV sets
var tvs = openHab.getThings('TV');

// ... select a certain TV set

// connect to the selected TV set 
var tv = openHab.getThing('tv_1');

// look up a list with all available channels
var channels = tv.getAvailableChannels();

// ... do something/ select a channel

// control a channel 
tv.setChannel('tv:volume', 75);

In order to have user interfaces and settings recommendations globally available – like in the example of the business trip to China – a user interface resource server is required. For this purpose, we intend to use the OpenAPE framework (Open Accessibility Personalisation Extension). OpenAPE provides a server and a dedicated Java client that can be easily integrated into Eclipse Smart Home.
User interface developers can upload all kinds of user interfaces (or links to them) to this server to make them available in a global, central place. In order to get more specific user interface recommendations, a user can define the context of use. The context of use consists of four parts: user context, equipment context, environment context and task context. So, a user interface in Eclipse Smart Home is needed that can be used to define the different context types. The equipment context is defined by the things for which user interfaces shall be looked up, the environment context is defined by a subset of the available linked channels that give information about the environment (time, temperature, ambient light, etc.), and the task context is defined by linked channels or items that indicate the current activity of the user.
The user context is defined directly on the OpenAPE server.
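To make the four parts concrete, here is an illustrative sketch of a context-of-use record in JavaScript. All property names and values are made up for illustration; the real ISO/IEC 24752-8 and OpenAPE formats differ.

```javascript
// Illustrative context-of-use record with the four parts named above.
// Property names and values are assumptions, not the OpenAPE schema.
const contextOfUse = {
  user:        { preferredLanguage: "de", fontSize: "large" },   // stored on the OpenAPE server
  equipment:   { things: ["tv_1"] },                             // things to look up UIs for
  environment: { localTime: "20:00", ambientLight: "low" },      // from linked channels
  task:        { currentActivity: "watching_tv" }                // from channels/items indicating activity
};

console.log(Object.keys(contextOfUse));
// → ["user", "equipment", "environment", "task"]
```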

We have just started to work on a first prototype to demonstrate our ideas. Soon it will be available at:

Further information about the URC framework and a comparison between URC and ESH can be found in
L. Smirek, G. Zimmermann, and M. Beigl, “Just a Smart Home or Your Smart Home – A Framework for Personalized User Interfaces Based on Eclipse Smart Home and Universal Remote Console,” Procedia Computer Science, vol. 98, pp. 107–116, 2016.

Further information on OpenAPE:
L. Smirek, P. Münster, and G. Zimmermann, “OpenAPE - A Framework for Personalised Interaction in Smart Environments,” presented at the SMART ACCESSIBILITY 2017, The Second International Conference on Universal Accessibility in the Internet of Things and Smart Environments, Nice, France, 2017, pp. 1–5.


(Rich Koshak) #2

I believe openHAB/ESH provides a good abstraction layer between devices/APIs/technologies and not only the user interfaces but also storage for analysis and rules logic.

If you think about it like the TCP/IP networking protocol stack (Level — OH Concept: Purpose):

  • Application — Rules, Persistence, UIs: Defines behaviors in reaction to events; represents the HA system to the users
  • Transport — Items and the Event Bus: Items get linked to Channels; events get published on the Event Bus that the rest of the system can react to
  • Internet — Things and Channels: Represents a device or API and exposes its data and control points
  • Physical — Bindings and Actions: Interfaces between OH/ESH and a technology or API; for example, communicating with a Zwave USB controller to interact with a Zwave network

This approach is what lets OH users, for example, have a Sonoff MQTT switch appear and function exactly like a Zwave switch everywhere in OH from the Item level and above. As a user, I don’t need to know or care that my Sonoff switches work over WiFi using MQTT messages while my Zwave switches use a proprietary wireless mesh network. On my UI they are both just a switch.

This abstraction is also available to the UIs, I believe, as they are all driven by the OH/ESH REST API. The existing UIs are all quite different and serve different functions, but they all work through the same REST API.

I’m probably not the best person to comment as I’m not a binding developer and tend to approach advocacy and comments on the forum from the end user’s perspective.

That being said, I think I’d need some convincing that this:

  • would indeed work
  • would improve the current abstraction concepts without confusing the users
  • is significantly different from the current set of standard Item types.

As OH is built right now, that standardized set of channel types is effectively implemented by the current set of Item types. There are admittedly some inconsistencies, such as binding authors using a Switch when they should be using a Contact for binary sensors like a reed switch.

Again, this is what Items are for. If I swap out a device, I only need to link the Channels for the new device, even if they are for a different technology, to my existing Items. And then my UI, my Persistence, and my Rules will work unchanged.

I’m the first to admit that there is plenty of room for improvement in Items and how they link to Things. But this proposal, as written in this posting, appears to completely ignore Items, and it proposes a solution that would require significant changes to the core and would only work for UIs, not for the rest of OH. For many if not most of us, the UIs are the least important part of our home automation. It is the Rules and Persistence that are of utmost importance.

tl;dr: I like the concept, but the approach as described would require lots of changes, ignores the existing mechanism for this, and only addresses the UIs, leaving Rules and Persistence out. So I’m not convinced.

(Lukas) #3

Hi Rich

Thanks for your answer. There are many points on which I totally agree with you. Still, I would like to clarify some things to avoid misunderstandings.

  1.   Of course, rules and home automation are most important for controlling a smart home. However, there are always moments when you want to switch off a rule or control something explicitly. Most users can use Paper UI, sitemaps, etc. to do this, but these user interfaces are not optimal for everyone. We are concerned with people with disabilities who can benefit from smart homes but have problems using the standard openHAB user interfaces and need specialized ones.

In order to connect these specialized user interfaces to openHAB, we definitely want to use the REST API and control the relevant items.

  2.   Concerning standard items and my idea of standard channel types: as far as I understand, the item type defines what kind of control element will be displayed in the standard UIs, but it still does not tell you what kind of functionality is controlled (of course the label gives some hint). Additional information about the controlled functionality can be provided by the linked channel type.

Now coming back to specialized user interfaces that automatically connect to openHAB: say we have a UI for controlling the lights. If it only checks the item types, it does not have enough information about which item it should connect to, because it does not know whether a switch item controls the light or, e.g., the main switch of the heating system.

By knowing which channel type an item is linked to, a UI can automatically select the correct items to connect to. With standardized channel types that are used across bindings, the UI can be used even more universally.

So, I think we do not want to change that much. Instead, we want to point out possibilities of the existing concepts.
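As a minimal sketch of this selection step (the link records and the channel type ID below are made-up assumptions, not the real REST schema): a lighting UI keeps only the Items whose linked channel carries the standardized channel type it understands.

```javascript
// Hypothetical item/channel link records, loosely modeled on what a
// REST endpoint could return (simplified assumption, not the real schema).
const links = [
  { itemName: "LivingRoom_Light", channelTypeUID: "system:power" },
  { itemName: "Heating_Master",   channelTypeUID: "acme:heating-master" }
];

// Keep only the Items linked to a given (standardized) channel type.
function itemsForChannelType(links, channelTypeUID) {
  return links
    .filter(link => link.channelTypeUID === channelTypeUID)
    .map(link => link.itemName);
}

console.log(itemsForChannelType(links, "system:power"));
// → ["LivingRoom_Light"]
```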

(Rich Koshak) #4

That is what Tags are for. You assign Tags to an Item to provide other subsystems with additional information, like what you are referring to. For example, one uses Tags to tell Amazon Alexa which Items are switchable. A different set of Tags is used for Hue Emulation.

For something like this, I would expect one to use Tags on Items to inform your UI what Items are related to what subsystems.
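To sketch what this could look like on the client side (a hypothetical helper, not a HABPanel built-in; the item objects are a made-up sample shaped like the JSON that a GET to /rest/items returns):

```javascript
// Filter an array of Item objects (as returned by GET /rest/items)
// down to those carrying a given tag.
function filterByTag(items, tag) {
  return items.filter(item => (item.tags || []).includes(tag));
}

// Made-up sample payload
const items = [
  { name: "LivingRoom_Light", type: "Switch", tags: ["Lighting"] },
  { name: "Heating_Main",     type: "Switch", tags: ["Heating"] }
];

console.log(filterByTag(items, "Lighting").map(i => i.name));
// → ["LivingRoom_Light"]
```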

I still think you are coming up with a new approach to solving a problem that OH already has mechanisms to address, namely Items and Tags.

(Lukas) #5

All right, got your point. So, is there some JavaScript code that can be used in UIs and that automatically connects to items with certain tags or categories?

(Benjamin) #6

In HABPanel there are indeed functions to retrieve groups as well as tags:

itemsWithTag(tagname) gets an array of items with the specified tag.

(Rich Koshak) #7

All the UI work goes through the REST API. If you haven’t already, install the REST API Documentation in the Misc tab of Addons in PaperUI. It provides the full documentation of the REST calls available and it is interactive, letting you make calls and see the results.

In this case, an HTTP GET to https://openhabserver:8080/rest/items?tags=lighting&recursive=false will return a JSON formatted list of all the Items that have the tag “lighting.”

I don’t do web development so I’m not sure the best ways to handle making those calls from JavaScript but if you look at the code for any of the existing UIs you will see similar calls being made.
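As one possible sketch in JavaScript (the server URL is a placeholder; only the URL construction is shown as runnable code):

```javascript
// Build the REST query URL for a tag filter, mirroring the GET request
// described above.
function itemsByTagUrl(base, tag) {
  return `${base}/rest/items?tags=${encodeURIComponent(tag)}&recursive=false`;
}

const url = itemsByTagUrl("http://openhabserver:8080", "lighting");
console.log(url);
// → "http://openhabserver:8080/rest/items?tags=lighting&recursive=false"

// In a browser (or Node 18+), the call itself could then look like:
// fetch(url)
//   .then(response => response.json())
//   .then(items => items.forEach(item => console.log(item.name, item.state)));
```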

And just to be clear about the use of the term “category”: in PaperUI, the “Category” field on an Item is used to define that Item’s icon. So when you see the category in the JSON that gets returned, that is the name of the icon the user defined for that Item to use. You could use this to carry alternative information for your UI, I think, without causing too many problems, but if you do so it will break BasicUI and ClassicUI at a minimum, possibly HABPanel as well. What this really means is a user would have to choose to use your UI or one of the others, but not both without a little extra work.

(Lukas) #8

Hello all,

I would like to draw your attention to a webinar that may be of interest to you. It is about the open-source implementation of ISO/IEC 24752-8 and how it relates to Eclipse Smart Home and openHAB. The standard specifies services for managing user profiles (“user contexts”) and other context-of-use information. All context information can be used for the personalisation of applications and user interfaces. The open-source implementation is OpenAPE (Open Accessible Personalization Extension, http://openape.gpii.eu/index).

The webinar will take place on March 13th at 2:00 pm CET.

The agenda:

  •      Explanation of the OpenAPE architecture and workflow model
  •      Latest developments
  •      Demonstration of the new web interface to manage the different context formats
  •      Explanation of OpenAPE.js, which can be used in applications in order to personalize them
  •      Eclipse Smart Home and OpenAPE
  •      Time for your questions and ideas

The webinar is going to be broadcast via Adobe Connect at https://webconf.vc.dfn.de/openape/.

Note: If you have not used Adobe Connect before, please connect 5 minutes prior to the meeting. You may need to download and install an add-in for your browser.

Option 1: Via Internet browser (audio & screen sharing, optional video):

Go to https://webconf.vc.dfn.de/openape/