openHAB Blindfolded

Hello openHAB Community,

My friend and coworker, who is blind, tried to use openHAB and wanted to share his opinions.
Brace yourself, here he goes! :slight_smile:

I’ve looked at an already running openHAB instance as a blind user. My
goal here is to show you what should be improved, and I’m hoping
someone can fix the issues I’ve encountered.

WTF
At first I checked out the Paper UI. Sadly, a couple of buttons
weren’t labeled correctly for use with a screen reader. Then there are
many radio buttons which serve no purpose. After about an hour of playing
around I got to a point where I think I know where and how to find
the information I’m looking for, but honestly that took a lot of trial
and error. In that state I couldn’t recommend openHAB to casual
users, even if I set it up for them, provided they want to use it
on their computer and need a screen reader.

Navigating around is extremely unpredictable and frustrating. I highly
doubt that someone who doesn’t like to play around would take the time
to make sense of the structure. It might be visually pleasing, but with a
screen reader it’s a totally different kind of beast.
When installing add-ons I found that the refresh button exists twice, or
at least it is announced twice to the screen reader.

Then I tested the Basic UI, which honestly was a horrible experience. At
least I had separate rooms, but there wasn’t a way to use the smart home
whatsoever. I saw which elements were in a specific room but couldn’t
interact with them. The buttons that were there were just announced as
„button“. Honestly, the Basic UI is totally unusable.

After that encounter I checked out the iOS app, which honestly
surprised me. There are a few buttons without correct labeling, but that
is basically it for my criticism here. It performs really well and
allows me to interact with the smart home without any problems. In
general I’d prefer a dropdown menu for choosing the colour of the
lighting, but that is more of a feature request than a valid point
about accessibility.

Last but not least I checked out the Android app and was pleasantly
surprised as well. It offers more functionality, and interacting with
your voice is, in my opinion, the way of the future. However, the buttons
for increasing and decreasing values such as the temperature are
completely unlabeled, which should be fixed. I only get „unknown button“
as feedback from TalkBack.

My conclusion is that I’d like the Paper UI to be more like the mobile
apps, and that developers, especially in open-source projects, should be
mindful to label their buttons and links correctly so everyone can use
the software.

11 Likes

This information is golden. There are a lot of issues here and I would hate for them to be lost.

First, did your friend try Habmin? That might be a little easier to use with a screen reader than PaperUI (or it could be a disaster, I don’t know). But it is a viable alternative.

So, to start, I would recommend creating an Issue over at the Eclipse SmartHome project for the PaperUI problems.

The Basic UI problems (I would be interested to know if the same problems exist for Classic UI and HABpanel) deserve a separate issue at the same Eclipse SmartHome project.

The issues with incorrect labeling of buttons in the iOS app should be filed on the separate iOS app project, and the Android app issues should be filed in its own project.

As for the dropdown issue, that can be done using a Selection element on the sitemap as opposed to the Color element.
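
For illustration, here is a minimal sitemap sketch. The item name Light_Color, the sitemap name and the colour values are made up, and whether comma-separated HSB strings are accepted directly as Selection mapping commands is an assumption worth verifying; if not, a Selection on a separate "preset" item plus a small rule achieves the same.

    // Sitemap sketch, names invented for illustration.
    // Assumption: HSB strings ("hue,saturation,brightness") are accepted as mapping
    // commands for a Color item; if not, map a Number "preset" item instead and
    // translate the preset into a colour command in a small rule.
    sitemap accessible label="My Home" {
        Frame label="Living Room" {
            // A Selection renders as a plain list of choices a screen reader can
            // step through, unlike the graphical Colorpicker element.
            Selection item=Light_Color label="Light colour" mappings=["0,0,100"="White", "30,100,100"="Orange", "240,100,100"="Blue", OFF="Off"]
        }
    }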

@maintainers, this is a really important posting.

Thanks for sharing!

1 Like

Fantastic share! Home automation is very commonly sought after by people with disabilities, and I think openHAB could do better in terms of accessibility, as you’ve mentioned. It’d be great for the people who recognise this most to open an issue on GitHub whenever something is found, and maybe it’s something we could focus on in the future?

1 Like

Thanks for sharing your thoughts!

I totally agree, the screen reader experience is far from perfect… however, I wouldn’t declare it horrible.

I have to share a story myself. When I started to look into home automation a few years back and eventually ended up with openHAB, all of this was initiated because a blind family member asked me for such a system in his home.

He is now using the system with dozens of components on a daily basis, including making his own (small) rule modifications. He works with the iPhone app and the Basic UI, and for a few months now nearly everything has been controlled via Amazon Alexa voice commands; that really changed everything for him. He relies solely on configuration files, by the way, which is quite easy for him, and avoided looking into the Paper UI to begin with.

His openHAB home automation system is a huge enrichment of his daily life. Automatic heating, controlling the lights for guests or getting a notification when the vacuum robot got stuck are just some of his new possibilities.

Basic UI has its obscurities. He learned to live with those, but I’ve already looked at the Basic UI code and found a few possible improvements involving special ARIA attributes. With the reminder that there are other blind openHAB users out there, I will look at these again!
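
To give an idea of what such an improvement could look like, here is a generic sketch (not the actual Basic UI markup, the label text is invented): an icon-only button gets an accessible name via aria-label, and its purely decorative icon is hidden from assistive technology.

    <!-- Generic sketch, not actual Basic UI code; label text is invented. -->
    <!-- The icon-only button gets an accessible name, the decorative icon is hidden. -->
    <button type="button" aria-label="Refresh add-on list">
      <img src="refresh.svg" alt="" aria-hidden="true">
    </button>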

4 Likes

Hi,

since I’m the blind person in question, I’ve signed up to join the discussion and learn from the many others to come. As suggested, I have tried the Classic UI. So far, on the desktop, I like it the most. Getting an overview of what is available is very straightforward. However, sliders and buttons appear as graphics.

With a mixture of the Classic UI, the Paper UI and the mobile apps, living with openHAB is doable. And I’ve got the feeling that it will get better from now on.

In any case, I am very positively surprised by the feedback here, and I’m looking forward to gaining more knowledge about the project and giving back to the community as much as possible.

7 Likes

It’s really great to see such a positive reaction from the community where accessibility improvements are concerned.
It’s also great to hear from other blind and visually impaired members!
I have found that hooking up Alexa and Siri with openHAB is simply magic.
It means I can now perform many tasks without needing to get accessibility tools involved, and that way my time on the computer is much more focused and I can get to grips with improving the integration of the OH2 system for my whole family to share.

OFF TOPIC
I have developed a set of scripts for use with Alexa and IFTTT that I use to scan a letter; I can then ask Alexa to read the scan to me, or she can send it to me in an email as an image or as OCR text. I would love to get this into the NZ Blind Foundation, but I really need Google or Amazon to officially support their smart assistants in NZ. The scripts are a little rough and ready but work well for me; I will certainly share them if you think they could be helpful.

Great Post!

Paul

1 Like

Scanning and reading my mail was something I hadn’t had on my radar yet, but I really like the idea. It seems that for complete home automation I’ll have to train the dog not only to fetch the mail but also to open it and put it on the scanner. But seriously, I’d love to learn more about this.

Here is an issue on the Eclipse SmartHome project for PaperUI with a back link to this post: https://github.com/eclipse/smarthome/issues/3766

And one for BasicUI as well: https://github.com/eclipse/smarthome/issues/3767

3 Likes

So I have used my Sunday night to set openHAB up for the first time myself, to see whether the process is accessible using openHABian.

So far I have had just one problem. The page for installing new add-ons was not the easiest to work with. For some reason I couldn’t select a category and was wondering why I couldn’t find the Network binding, for example. However, I don’t know yet whether that is a problem with VoiceOver on the Mac, so I will investigate further and let you guys know.

So far I have just discovered my devices and am looking forward to fully integrating openHAB into my home. By the way, the documentation I have read so far is excellent.

Another day, another message. I have tried to administer openHAB using Firefox and Orca on Linux. On the first page the different UIs weren’t marked as links, so I had to navigate using flat review and simulate a mouse click to reach the Paper UI. That should be improved.

Another issue I’ve found is that when reading the forum my screen readers sometimes jump around, so I lose track of the conversations. So far I don’t know what causes this behavior, but the same occurs on the start page when administering my openHAB instance.

If I understand correctly, a screen reader will read whatever the “Title” is, i.e. the text that shows up as a tooltip when you hover over a button, when it reaches the button or reads the page. We should probably add these then!

That’s the crux, but the screen reader is more complex than that. Anyhow, yes, that would most probably be the solution :slight_smile:
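
To make it a bit more concrete: the title attribute does serve as a last-resort accessible name (and as the tooltip), but an explicit aria-label or visible button text is announced more reliably across screen readers. A hedged sketch with invented text, not taken from any openHAB UI:

    <!-- Hypothetical markup, not taken from Paper UI. -->
    <!-- Fallback only: title doubles as tooltip and accessible name. -->
    <button type="button" title="Install binding">+</button>

    <!-- More robust: an explicit label every screen reader announces. -->
    <button type="button" aria-label="Install binding" title="Install binding">+</button>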

@MindSneaker thanks for your continued reports! Everything sounds good so far, and the (let’s call them minor) issues you had will help when someone finds the time to improve the UIs. I’d be happy to spend some time there but am honestly not currently in a position to make any promises.

Regarding the forum: The forum is powered by a software called Discourse. We have no influence there but maybe you will find some hints by googling for “discourse screenreader”.

I’m glad you got the openHABian setup working. Did you also work on the SSH console or is your experience currently limited to “flash, plug, wait, enjoy?”

This is great news!!

1 Like

Thanks for the pointers. I’ll look into it soon.

For now I have just used the graphical frontend provided by openHABian. However, in general, working over SSH would be more productive once I can wrap my head around setting up my smart home things using the CLI. Preferably there would be a script that shows new devices, like the Inbox in the Paper UI does.

The goal should be that the CLI works as seamlessly as the Paper UI, so if you know one you can work with the other without any problem. However, that might be a first-world-problem kind of task. Still, I think someone could benefit from such an improvement.

1 Like

Hey @MindSneaker,
I believe you are looking for two things: the openHAB logs and the remote console (Karaf). With these two things at your fingertips you can stay clear of the Paper UI (as I do).

The log can be accessed multiple ways:

  • via SSH: just type the command tail -f /var/log/openhab2/openhab.log or tail -f /var/log/openhab2/*.log

  • via frontail / the openHAB Log Viewer: a web tool provided by openHABian. It’s basically the tail output from above, but on a webpage. I doubt that this one will be a good option in combination with a screen reader.

  • via the openHABian samba share “openHAB-share”. That’s great because you can use your favorite text editor to browse the log. A problem might be that a classic editor doesn’t reload the content of the file, and being able to see newly arriving lines is the most important feature you’d want while setting up your home automation. See this article for a few options and please let us know if you find a tool suited to your needs: https://stackify.com/13-ways-to-tail-a-log-file-on-windows-unix

The Karaf/openHAB remote console is a second SSH server on your Raspberry Pi, provided by openHAB. You can make it available in your network via the openHABian configuration menu. (I fear that you’ll not be able to control this menu through SSH. Please contact me if that’s the case!) You can also do that manually; see the remote console article for more details. After connecting, type help to find all available commands. You’ll discover the “smarthome:…” commands, which cover all you need. As an example: you can type smarthome:items *Battery* to list all items with “Battery” in their item name.
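
If it helps, here is a short connection sketch, assuming openHAB 2 defaults and the standard openHABian hostname; adjust host, port and credentials to your installation (see the remote console article):

    # Assumption: console exposed on the default port 8101, default openHABian hostname.
    ssh -p 8101 openhab@openhabianpi

    # Inside the console:
    help                        # list all available commands
    smarthome:items *Battery*   # list every item with "Battery" in its name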

Good luck!