ChatGPT and OpenHAB (incl. possible Alexa integration)

I’ve been playing with this binding and am seeing some strange behavior. When I run the rule after an OH reboot, the response is NULL. If I run the rule again after that, it seems to work… sort of: it returns data from the last run. An example:

A response from yesterday’s test run:

2023-05-30 15:13:02.579 [INFO ] [org.openhab.core.model.script.chatgpt ] - Morning message ran: Remember that life is full of changes, and just like the weather, some moments may be hot and others may be cool. But every moment is a new opportunity to shine and overcome any challenge that comes your way. Speaking of weather, in [City State], the temperature is currently a warm 82.292 °F. Over the next 6 hours, the temperature is forecasted to decrease to a high of 75.344 °F and a low of 71.87 °F. So, stay ready for any changes and always remember to make the best out of every moment!

And then the same rule run this morning (note the temperatures are incorrect, identical to yesterday’s):

2023-05-31 09:25:02.331 [INFO ] [org.openhab.core.model.script.chatgpt ] - Morning message ran: Remember that no matter what the weather may bring, you have the power to conquer any challenge and rise above it! Keep pushing forward and pursuing your dreams. Speaking of weather, in [City State], it’s currently a wonderful 82.292 °F. However, in the next 6 hours, it will fluctuate with a high of 75.344 °F and a low of 71.87 °F. So be sure to dress accordingly and enjoy the day!

It seems to be using data from the previous run. Maybe that’s why it returns NULL after a reboot: there was no previous run yet. Any ideas?

My Thing definition:

Thing chatgpt:account:1 [apiKey="APIKEY"] {
    Channels:
        Type chat : morningMessage "Morning Message" [
            model="gpt-3.5-turbo",
            temperature="1.0",
            systemMessage="Provide an inspirational message then read the following weather data:"
        ]
        Type chat : LocalFact "Local Fact" [
            model="gpt-3.5-turbo",
            temperature="1.5",
            systemMessage="Answer briefly, in 2-3 sentences max. Tell me an interesting fact about the following city:"
        ]       
}

And the rule:

rule "Morning message"
when
  Time cron "0 25 9 * * *"
  //Time cron "0 13 15 * * *"
then
    Morning_Message.sendCommand("In [City State] the current temperature is " + Weather_TemperatureF.state + ". In 6 hours it will be a high of: " + localDailyForecastTodayMaxTemperature.state + ", and a low of: " + localDailyForecastTodayMinTemperature.state)
    logInfo("chatgpt", "Morning message ran: " + Morning_Message.state) 
    say(Morning_Message.state,"googletts:enUSWavenetA","chromecast:audiogroup:kitchen")
end

rule "Evening local fact"
when
  Time cron "0 10 17 * * *"
  //Time cron "0 12 15 * * *"
then
    LocalFact.sendCommand("[City State]")
    logInfo("chatgpt", "Evening local fact ran: " + LocalFact.state) 
    say(LocalFact.state,"googletts:enUSWavenetA","chromecast:audiogroup:kitchen")
end
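
One theory (I haven’t verified it against the binding code): sendCommand is asynchronous, so the next line reads Morning_Message.state before the binding has stored the new reply; right after a reboot that state is still NULL, and on later runs it still holds the previous answer. A sketch of a workaround I’m considering, where the cron rule only does the sendCommand and a second rule reacts once the item actually changes:

rule "Speak morning message when the response arrives"
when
  Item Morning_Message changed
then
    // Fires only once the binding has written ChatGPT's reply to the item
    logInfo("chatgpt", "Morning message response: " + Morning_Message.state)
    say(Morning_Message.state.toString, "googletts:enUSWavenetA", "chromecast:audiogroup:kitchen")
end

The existing cron rule would then drop its logInfo/say lines and keep only the sendCommand.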

I’m the LocalAI author (GitHub - go-skynet/LocalAI: 🤖 a self-hosted, community-driven, local OpenAI-compatible API. It’s a drop-in replacement for OpenAI that runs LLMs on consumer-grade hardware, free and open source, no GPU required. LocalAI is an API to run ggml-compatible models: llama, gpt4all, rwkv, whisper, vicuna, koala, gpt4all-j, cerebras, falcon, dolly, starcoder, and many others), and an openHAB user/fan as well :slight_smile: I was actually looking into somehow integrating LocalAI (the drop-in OpenAI alternative) with my home automation system.

In the near future LocalAI (I’m currently hacking on it) will get support for calling functions, as OpenAI already does (see “Function calling and other API updates”), so I think there is a lot of room for integration in terms of automation. However, I’m not familiar with bindings or Java, so I can only offer to help if anyone in the community is interested in this (like me!).

In terms of code, LocalAI requires no changes if your client already follows the OpenAI specification; the only required setting is to point the requests at another host (the one where the LocalAI service is running).
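
For the openHAB ChatGPT binding that could be as small as a change to the account Thing, assuming the binding lets you override the API endpoint (I haven’t checked its parameter names, so treat apiUrl below as an assumption) and LocalAI listening on, say, 192.168.1.50:8080:

Thing chatgpt:account:local [
    apiKey="not-needed",                                  // LocalAI usually requires no key; placeholder value
    apiUrl="http://192.168.1.50:8080/v1/chat/completions" // assumed parameter name; points the binding at LocalAI
] {
    Channels:
        Type chat : morningMessage "Morning Message" [
            model="ggml-gpt4all-j",                       // whatever model your LocalAI instance serves
            temperature="1.0",
            systemMessage="Provide an inspirational message then read the following weather data:"
        ]
}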


This made me happy! :smiley: As wpong pointed out, GPT is the subject of somewhat inflated expectations; it is not “true AI” and doesn’t aim to be, but it IS an LLM capable of completely revolutionizing how we interact with computers and large amounts of data. Your binding is certainly a big step towards leveraging that in OH. The next step is to get voice input to work similarly.

Since setting up Alexa, I now run almost everything either via automation or voice. The main limitation is remembering names and syntax. I use a couple of buttons to execute major scenes and occasionally use the UI to get to some specific item. A more complete exposure of capabilities, accessible through an LLM-infused voice assistant… that would be absolutely golden.

Just as an update: LocalAI now supports functions 🔥 (see “OpenAI functions” in the LocalAI documentation). I think that between its TTS, voice-to-text, and LLM capabilities it could be a great fit.
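
Even without touching the binding, a rule can talk to LocalAI directly over its OpenAI-compatible API using openHAB’s built-in HTTP actions. A rough sketch (the host, model name, and the LocalAI_Prompt item are placeholders; the raw JSON response would still need something like a JSONPATH transformation to extract the message text):

rule "Ask LocalAI directly"
when
  Item LocalAI_Prompt received command
then
    // Build a minimal OpenAI-style chat completion request from the received command
    val payload = '{"model": "ggml-gpt4all-j", "messages": [{"role": "user", "content": "' + receivedCommand.toString + '"}]}'
    // sendHttpPostRequest(url, contentType, body, timeoutMs) is a built-in openHAB HTTP action
    val response = sendHttpPostRequest("http://192.168.1.50:8080/v1/chat/completions", "application/json", payload, 30000)
    logInfo("localai", "Raw LocalAI response: " + response)
end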

Very interesting,
This reminds me of why I advocate for an intent catalog.
With an intent catalog and functions, ChatGPT or other LLMs could become a Human Language Interpreter (HLI) add-on inside openHAB.

@Kai

I found quite a strange issue with the ChatGPT binding.

After I installed the ChatGPT binding, a text field (“Device Model String”) in the Xiaomi Miio binding was populated with a dropdown of strings originating from the ChatGPT binding. This caused all my devices on that binding (except my vacuum) to fail.

Uninstalling the ChatGPT binding turned the dropdown with its strange alternatives back into a normal text field, and my items went back to normal after pressing save once.

May I ask you to open an issue for this on GitHub? I guess this is a bug, and I have an idea of how to fix it.

Sure thing. I don’t think I can explain it much better, but I will create the issue and do my best.

Edit: Done [chatgpt] Strings added with ChatGPT binding causes issues in MIIO binding · Issue #15575 · openhab/openhab-addons · GitHub

Edit 2 (September 15)
Thread about what seems to be the same thing: