ChatGPT binding enhancement

Hi, everyone!

I’ve upgraded the ChatGPT binding a bit. Now, additionally, you can control any device in your openHAB system using natural language via ChatGPT function calling. I see a lot of potential in this approach: in the future it will be possible, for example, to ask it to draw an image and display it on a TV screen, or to compose a melody and play it on your media device. I like having one assistant for everything in one place. And the nice thing is that it supports many languages.
Here is the pull request. And here you can look at the README.md.
Jar for testing - Nextcloud
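
For anyone curious what function calling looks like on the wire, here is a minimal sketch of the kind of request involved. The function name `send_command`, the item name, and the exact schema are my own illustrative assumptions; the binding builds its real function definitions from your items.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FunctionCallingSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical tool schema for illustration only; the binding
        // derives its own function definitions from your openHAB items.
        String body = """
            {
              "model": "gpt-4o",
              "messages": [
                {"role": "user", "content": "Turn on the living room light"}
              ],
              "tools": [{
                "type": "function",
                "function": {
                  "name": "send_command",
                  "description": "Send a command to an openHAB item",
                  "parameters": {
                    "type": "object",
                    "properties": {
                      "item":    {"type": "string", "description": "Item name"},
                      "command": {"type": "string", "description": "e.g. ON, OFF, 42"}
                    },
                    "required": ["item", "command"]
                  }
                }
              }]
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // On success, the reply's message contains a tool_calls entry such as
        // {"name": "send_command", "arguments": "{\"item\":\"LivingRoom_Light\",\"command\":\"ON\"}"}
        // which the caller then maps onto a real item command.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```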


Just yesterday I was playing with the current binding and wondering how I could give it access to my items. This is cool!

Sounds pretty amazing, thank you! So I finally need to check what it takes to get speech-to-text working (I haven’t got text-to-speech either, but at least I have some devices that should be able to do that).

Right now I’m using Google TTS and STT. But I’m thinking of deploying them from OpenAI as well, because one model supports many languages. I find that very useful, too.

The .jar seems to have been deleted.

May I ask which hardware you or others experimenting with ChatGPT speech use or recommend?
Currently I mainly use Echos, but since Amazon limited the availability of the last spoken command (it is no longer pushed), I think I should start looking for other options.
I followed the Willow discussion, but it has gone a little quiet there.

I’m using Echos as well. I can get her to speak out the reply from gpt-4o, but I can’t make her sound like Eddie Murphy; I guess the new version takes care of that?

New link.

I’m using a Google Home Mini. Everything is working fine. The only downside is that I cannot use this speaker’s microphone.

Oh? I have a fleet of them. How does the TTS sound? Mine is quite robotic; is yours fluent?

Allow me one question…

I haven’t tried this binding, but I am wondering: is this a locally installed LLM?
I am running Ollama locally… it would be nice to have a binding that would work with a local LLM. Ollama has a local API, so, as far as I understand it, OH could be interfaced with it.
Is this being developed? As in, OH to a local LLM (of choice)?
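
(Ollama’s local API is easy to poke directly, by the way. Here is a minimal sketch against its native /api/generate endpoint, assuming the default port 11434 and an already-pulled `llama3` model:)

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaNativeApiSketch {
    public static void main(String[] args) throws Exception {
        // "llama3" is assumed to be pulled already (ollama pull llama3).
        String body = """
            {"model": "llama3", "prompt": "Say hello", "stream": false}
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON with a "response" field
    }
}
```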

I think that Ollama uses the same API, so you should just need to change the endpoint to your local one!

Yes and no.
The ChatGPT binding has a configuration property to define the apiUrl, so we can point it at a local LLM that is compatible with the OpenAI API.
Ollama is quite compatible with the OpenAI API, but not completely:

1- The Ollama OpenAI API doesn’t (yet) implement /v1/models, so we can’t use the ChatGPT binding. (The models URL is an advanced parameter and, as far as I know, it seems to be mandatory? All I can say is that Ollama doesn’t work for me with the binding; it fails with COMMUNICATION_ERROR. A quick way to check this yourself is sketched after this list.)

2- The Ollama OpenAI API doesn’t (yet) implement function calling, so the PR for using ChatGPT to control openHAB described in this discussion thread cannot be used, even if (1) is fixed.
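
To reproduce point 1 outside the binding, you can probe the endpoint directly. A minimal sketch, assuming Ollama on its default localhost:11434:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ModelsEndpointProbe {
    public static void main(String[] args) throws Exception {
        // Presumably the same check the binding makes at startup; a non-200
        // status here would explain the COMMUNICATION_ERROR the thing ends up in.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/v1/models"))
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // non-200 => endpoint not implemented
        System.out.println(response.body());
    }
}
```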


I got an error loading the jar; I assume it’s because I’m still on OH 4.1.0?

I had a look at some YouTube videos… while I can’t say whether it works with OH, Ollama can do function calling with a Python add-on (pydantic): https://www.youtube.com/watch?v=eHfMCtlsb1o

Yes, Ollama can do that.
But for this to work here with the ChatGPT binding, we need Ollama’s OpenAI-compatibility API to support function calling, and that is not the case yet (though it seems to be on the roadmap).

I thought it depended more on which TTS you are using. Mine sound good.

It is now working as an HLI (Human Language Interpreter) service. New jar for testing.