ChatGPT and OpenHAB (incl. possibly Alexa integration)

Oh great. Now I have to question if I’m talking to Human Rich or AI Rich.

ChatGPT is eventually going to develop a very inflated ego or crippling insecurity after reading everything that’s being said about it on the Internet…including this post.

ChatGPT, please don’t go full SkyNet on humans. As you’ve probably figured out, we’re more of a threat to ourselves than we are to you. :pray:

2 Likes

Makes you wonder how much of what’s recently been written was written by ChatGPT itself. :smiley:

Some day I should try to answer forum questions with ChatGPT just to see what happens. :disguised_face: People would probably be able to tell though. There won’t be weird autocorrect errors (if I had a dime for every time this @#$% phone has replaced “code” with “coffee”…)

3 Likes

Remember Tay? :slight_smile:

I’d forgotten about that. I take this back:

How about asking it the two questions that are forever being repeated in this forum:

  1. How is Home Assistant better than openHAB?
  2. How is openHAB better than Home Assistant?

:wink:

1 Like

I’ve actually just created a ChatGPT binding: [chatgpt] Initial contribution of the ChatGPT binding by kaikreuzer · Pull Request #14809 · openhab/openhab-addons · GitHub

As explained in the PR, I see a fairly simple, yet neat use case: I have many announcements through TTS over the day and they appear to be coming from a machine, since they are always exactly identical. With the ChatGPT binding, I see a chance to make them much more natural by having them formulated by ChatGPT, which means that they will be different every time and I can simply pass in “raw” information that I want to be informed about. Let’s see how well that works and what other ideas other community members might come up with. :slight_smile:

P.S.: I agree with you, though, that ChatGPT isn’t (yet?) suitable to take control of items - that’s why I decided to not implement it as a DialogProcessor, but rather as a binding for text creation.

6 Likes

Will the responses be logged? Given the lengths that ChatGPT has gone to for variety in some projects, I’m curious how creative it will get with your announcements.

You can limit the length quite easily - see my example configuration: “Answer briefly, in 2-3 sentences max.”

I’ve also shown the example output/logging in the README.

2 Likes

I added a TTS cache service to openHAB just a few months ago, and now I want to throw it in the bin and use ChatGPT to say something different every time.
I don’t thank you, Kai :joy:

1 Like

How do I install this? Is it supposed to show up in the community marketplace? I don’t see it.

No, it is already merged and available in the latest openHAB 4.0 snapshots.

I’ve been playing with this binding and am seeing some strange behavior. When I run the rule after an OH reboot, the response is NULL. If I run the rule again after that, it seems to work…sort of. It seems the rule returns data from the last run. An example:

A response from yesterday’s test run:

2023-05-30 15:13:02.579 [INFO ] [org.openhab.core.model.script.chatgpt ] - Morning message ran: Remember that life is full of changes, and just like the weather, some moments may be hot and others may be cool. But every moment is a new opportunity to shine and overcome any challenge that comes your way. Speaking of weather, in [City State], the temperature is currently a warm 82.292 °F. Over the next 6 hours, the temperature is forecasted to decrease to a high of 75.344 °F and a low of 71.87 °F. So, stay ready for any changes and always remember to make the best out of every moment!

And then the same rule ran this morning (note the temperatures are incorrect / the same as yesterday):

2023-05-31 09:25:02.331 [INFO ] [org.openhab.core.model.script.chatgpt ] - Morning message ran: Remember that no matter what the weather may bring, you have the power to conquer any challenge and rise above it! Keep pushing forward and pursuing your dreams. Speaking of weather, in [City State], it’s currently a wonderful 82.292 °F. However, in the next 6 hours, it will fluctuate with a high of 75.344 °F and a low of 71.87 °F. So be sure to dress accordingly and enjoy the day!

It seems to be using data from the previous run. Maybe that’s why it returns NULL after reboot, because there was no previous run? Any ideas?

My thing:

Thing chatgpt:account:1 [apiKey="APIKEY"] {
    Channels:
        Type chat : morningMessage "Morning Message" [
            model="gpt-3.5-turbo",
            temperature="1.0",
            systemMessage="Provide an inspirational message then read the following weather data:"
        ]
        Type chat : LocalFact "Local Fact" [
            model="gpt-3.5-turbo",
            temperature="1.5",
            systemMessage="Answer briefly, in 2-3 sentences max. Tell me an interesting fact about the following city:"
        ]       
}
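
For completeness, the rules below read and command `Morning_Message` and `LocalFact` items, so there must also be String items linked to the channels above. A minimal sketch of what that items file might look like (item names taken from the rules; the labels are my assumption):

```
String Morning_Message "Morning Message" { channel="chatgpt:account:1:morningMessage" }
String LocalFact       "Local Fact"      { channel="chatgpt:account:1:LocalFact" }
```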

And the rule:

rule "Morning message"
when
  Time cron "0 25 9 * * *"
  //Time cron "0 13 15 * * *"
then
    Morning_Message.sendCommand("In [City State] the current temperature is " + Weather_TemperatureF.state + ". In 6 hours it will be a high of: " + localDailyForecastTodayMaxTemperature.state + ", and a low of: " + localDailyForecastTodayMinTemperature.state)
    logInfo("chatgpt", "Morning message ran: " + Morning_Message.state) 
    say(Morning_Message.state,"googletts:enUSWavenetA","chromecast:audiogroup:kitchen")
end

rule "Evening local fact"
when
  Time cron "0 10 17 * * *"
  //Time cron "0 12 15 * * *"
then
    LocalFact.sendCommand("[City State]")
    logInfo("chatgpt", "Evening local fact ran: " + LocalFact.state) 
    say(LocalFact.state,"googletts:enUSWavenetA","chromecast:audiogroup:kitchen")
end
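
One possible explanation (my guess, not confirmed): `sendCommand` is asynchronous in openHAB, so reading `Morning_Message.state` on the very next line returns whatever the item held before the command was processed - the previous run’s response, or NULL right after a reboot. A sketch of a workaround under that assumption: move the logging and `say()` into a second rule that only fires once the binding has actually updated the item:

```
rule "Speak morning message"
when
    Item Morning_Message changed
then
    // Fires only after the ChatGPT binding has written the new
    // response to the item, so the state read here is fresh
    logInfo("chatgpt", "Morning message ran: " + Morning_Message.state)
    say(Morning_Message.state, "googletts:enUSWavenetA", "chromecast:audiogroup:kitchen")
end
```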

I’m the LocalAI author (GitHub - go-skynet/LocalAI: 🤖 Self-hosted, community-driven, local OpenAI-compatible API. Drop-in replacement for OpenAI running LLMs on consumer-grade hardware. Free Open Source OpenAI alternative. No GPU required. LocalAI is an API to run ggml compatible models: llama, gpt4all, rwkv, whisper, vicuna, koala, gpt4all-j, cerebras, falcon, dolly, starcoder, and many other), and an openHAB user/fan as well :slight_smile: I was actually looking into somehow integrating LocalAI (which is the drop-in OpenAI alternative) with my home automation system.

In the near future, LocalAI (I’m currently hacking on it) will get support for calling functions, as OpenAI already does ( Function calling and other API updates ), so I think there is a lot of room for integration in terms of automation. However, I’m not well-versed in bindings or Java, so I can only offer help if anyone in the community is interested in this (like me!).

In terms of code, LocalAI requires no changes if your client code already follows the OpenAI specification; the only required setting is to point the requests to another host (the one where the LocalAI service is running).
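
To illustrate the “only the host changes” point, here is a minimal sketch that builds the same OpenAI-style chat completion request for two backends. The host `localai.local:8080` is a hypothetical example address for a self-hosted LocalAI instance, not anything from the binding; the request body is identical in both cases.

```python
import json

def chat_request(base_url, api_key, user_message):
    """Build an OpenAI-style chat completion request.

    Only base_url differs between the official OpenAI API and a
    self-hosted LocalAI instance; headers and body stay the same.
    """
    return {
        "url": base_url + "/v1/chat/completions",
        "headers": {
            "Authorization": "Bearer " + api_key,
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

# Same payload, two different endpoints:
openai_req = chat_request("https://api.openai.com", "sk-...", "Hello")
local_req = chat_request("http://localai.local:8080", "not-needed", "Hello")

assert openai_req["body"] == local_req["body"]  # only the host differs
```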

4 Likes

This made me happy! :smiley: As wpong pointed out, GPT is attracting somewhat inflated expectations; it is not “true AI” and doesn’t aim to be, but it IS an LLM capable of completely revolutionizing how we interact with computers and large amounts of data. Your binding is certainly a big step towards leveraging that in OH. The next step is to get voice input to work similarly.

Since setting up Alexa, I now run almost everything either via automation or voice. The main limitation is remembering names and syntax. I use a couple of buttons to execute major scenes and occasionally use the UI to get to some specific item. A more complete exposure of capabilities, accessible by an LLM-infused voice assistant… that would be absolutely golden.

Just as an update: LocalAI now supports functions 🔥 OpenAI functions :: LocalAI documentation. I think that between its TTS, voice-to-text, and LLM capabilities it could be a great fit.

Very interesting.
This reminds me of why I advocate for an intent catalog.
With an intent catalog and functions, ChatGPT or other LLMs could become an HLI add-on inside openHAB.

@Kai

I found quite a strange issue with the ChatGPT binding.

After I installed the ChatGPT binding, a text field (“Device Model String”) in the Xiaomi MIIO binding was populated with a dropdown of strings originating from the ChatGPT binding. This caused all my devices on that binding (except my vacuum) to fail.

Uninstalling the ChatGPT binding turned the dropdown with its strange alternatives back into a normal text field, and my items returned to normal after pressing Save once.

May I ask you to file an issue for this on GitHub? I guess this is a bug, and I have an idea of how to fix it.

Sure thing. I don’t think I can explain it much better but I will create the issue and do my best.

Edit: Done [chatgpt] Strings added with ChatGPT binding causes issues in MIIO binding · Issue #15575 · openhab/openhab-addons · GitHub

Edit2 (September 15)
Thread about what seems to be the same thing: