Chromecast / Google Home - automatically start playing a stream when motion is detected in a room

Yeah, the Karaf console gives the same sink names. But those are still difficult to visually associate with the actual “device”.

My goal is to use a “human readable” name and be able to translate that into the sink name for use in the multimedia function calls. Via PaperUI I allowed the Chromecast binding to scan and find all my Chromecast devices. I then went into the Things config and linked an Item to each Thing’s Control Channel.

Group gChromecasts

Player      myAudioGroup
            "My Audio Group" (gChromecasts)

My approach is still evolving as I am trying to “globalize” my Chromecast notification function.

val Functions.Function1<String,String> getChromecastSink = [ String itemName |
    try
    {
        val allLinks = sendHttpGetRequest("http://localhost:8080/rest/links")
        var String filter = "$..[?(@.itemName=='"+itemName+"')].channelUID"
        var output = transform("JSONPATH", filter, allLinks)
        var chromecastSink = output.split(":").get(0)+":"+output.split(":").get(1)+":"+output.split(":").get(2)
        return chromecastSink
    }
    catch(Exception e)
    {
        logError("notifications", "Failure in getChromecastSink: " + e.toString)
        return null
    }
]

var String itemName = null
var String broadcastMessage = null

rule "Broadcast Notification"
when
    Item MyBroadcastNotification received update
then
    try
    {
        // If it's between 8am and 10pm, broadcast a message to the Whole House
        if ((now.getHourOfDay > 7) && (now.getHourOfDay < 22))
        {
            itemName = MyBroadcastNotification.state.toString
            broadcastMessage = null

            if (itemName.contains(";"))
            {
                broadcastMessage = itemName.split(";").get(1)
                itemName = itemName.split(";").get(0)
            }

            val audioSink = getChromecastSink.apply(itemName)
            playSound(audioSink, "doorbell.mp3")
            if (broadcastMessage !== null)
            {
                createTimer(now.plusSeconds(3)) [ |
                    say(broadcastMessage, "voicerss:enUS", audioSink)
                ]
            }
        }
    }
    catch(Exception e)
    {
        logError("notifications", "Failure in Broadcast Notification: {}", e.toString)
    }
end

rule "Broadcast a notification"
when
    "an event is triggered"
then
    MyBroadcastNotification.postUpdate("myAudioGroup;This is a test")
end

Mike

2 Likes

The 2.4.0-SNAPSHOT build has PR ESH #6043, which should make it easier to associate the sink names with your actual Chromecasts.

openhab> audio sinks
  All amps (chromecast:audiogroup:928bff2a-3d13-4e1b-bdc2-fcefb70483c6)
  Amp 1 (chromecast:audio:7fda5762a13ce62aea8f35e74a0cc9bd)
  Amp 2 (chromecast:audio:176fc3d5dcae6d7fa49c407ed836fe3a)
  Google Home (chromecast:chromecast:9ee48f2e334e85865a2c80a996a84654)
  System Speaker (javasound)
* System Speaker (with mp3 support) (enhancedjavasound)
  TV (chromecast:chromecast:60fb8f54e9f427cf7666dcb17718a6bb)
  Web Audio (webaudio)

You could then define a variable in your rules with a human-readable name that holds the sink ID as its value.
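For example (a minimal sketch; the sink ID is copied from the console listing above, and the trigger Item is hypothetical):

// A human-readable variable holding a sink ID from the "audio sinks" output
val String amp1Sink = "chromecast:audio:7fda5762a13ce62aea8f35e74a0cc9bd"

rule "Chime on Amp 1"
when
    Item doorbell_Button received update
then
    playSound(amp1Sink, "doorbell.mp3")
end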

1 Like

Hi

That sounds like a great idea.

You might want to build in a delay, slightly longer than the chime duration, as I’ve noticed that initial commands are ignored if a second command is issued too soon.

For example:

I tried a Say instruction, followed by a “play stream”.

I could hear the Chromecast activating, but it rarely played the TTS.

1 Like

Yup - I’d already run into the fact that the sounds don’t queue.

if (broadcastMessage !== null)
{
    createTimer(now.plusSeconds(3)) [ |
        say(broadcastMessage, "voicerss:enUS", audioSink)
    ]
}

Mine works pretty seamlessly. I installed the VoiceRSS voice add-on, got my API key, and edited the cfg… and it worked. Having said that, I’ve always played the chime first, followed by the TTS. Maybe I “warmed up” the Chromecast (I hear the Google Home “bling”, then the chime, then the TTS). Perhaps if I sent the TTS first it would get “lost”.
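For anyone setting this up, the cfg edit amounts to a single line (a sketch, assuming the 2.x add-on’s services/voicerss.cfg format; the key below is a placeholder):

# services/voicerss.cfg (use the API key from your VoiceRSS account)
apiKey=YOUR_VOICERSS_API_KEY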

Mike

@wborn

Now that I have the latest Chromecast binding snapshot, how are these human-readable sink names associated? In the Karaf console, the audio sinks command still lists only the “cryptic” sink IDs.

I named each Thing in PaperUI… but I’d done this when I was using the 2.3.0 binding. Do I need to refresh somehow?

Mike

The class providing the audio sinks command is not part of the Chromecast binding, which is why you don’t see the improved output if you only update the binding. It will only be available once you fully upgrade to 2.4.0-SNAPSHOT or newer.

Ah. I’m going to put off upgrading OH until we work through the Chromecast testing. For now I have the Link Item approach to find my audio sink.

Are you suggesting manually getting the sinks from the console and hardcoding them into a rule? I was hoping that this PR might expose the AudioManager.getAllSinks() method for use through the Automation API (I use JSR223-Jython), but I haven’t been able to get this to work.

Yes, my comment was about the Karaf console command. I don’t think there is a way to get them programmatically in such rules; I haven’t tried it with a JSR223 engine myself.

1 Like

@5iver @wborn,

I declared an Item for each of my Chromecasts and linked them to the auto-discovered Things in PaperUI. This lets me have a “known” name and label for each of my Chromecasts. It also lets me declare a Group for my Chromecasts whose members I can loop through in rules logic.
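For example, a whole-house chime becomes a simple loop (a sketch, reusing the gChromecasts group and the getChromecastSink function from my earlier posts):

// Play the chime on every Chromecast in the group
gChromecasts.members.forEach [ chromecast |
    val sink = getChromecastSink.apply(chromecast.name)
    if (sink !== null)
    {
        playSound(sink, "doorbell.mp3")
    }
]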

Mike

The format of the sink names is consistent, so I made Items to hold the UUIDs of my speakers, and Items for the URI to send to the speaker (this is openhab2-Jython)…

from org.eclipse.smarthome.model.script.actions.Audio import playStream

# Derive the sink ID from the *_UUID Item paired with the triggering *_PlayURI Item
audioSink = "upnpcontrol:upnprenderer:{}".format(str(items[event.itemName.replace("_PlayURI", "_UUID")]))
playStream(audioSink, str(event.itemCommand))

Similar conceptually - using Items to find the audio sink. My Item declarations are not assigned a hardcoded value. Rather, I use PaperUI to link them to the auto-discovered audio sink Things. In this manner, I can search my links (http://localhost:8080/rest/links) for my “known” Item name (or Item label) and grab the UUID from the link information returned by the REST API call.
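For reference, each entry returned by /rest/links looks roughly like this (illustrative values); the JSONPATH filter in my function picks the channelUID out of the entry whose itemName matches:

{
    "channelUID": "chromecast:audio:7fda5762a13ce62aea8f35e74a0cc9bd:control",
    "configuration": {},
    "itemName": "myAudioGroup"
}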

Many ways to skin this cat :wink:

1 Like

Could you please explain this more… maybe a screenshot? I don’t seem to have any audio sink Things. Are these binding-specific (I’m not using a Google Home/Chromecast)?

EDIT: Never mind… I see from your code that you’re just parsing the channelUID of an Item linked to a Google Home/Chromecast Thing. Calling it an audio sink Thing confused me… there is no such Thing (pun intended).

Ah! Since this topic is titled “Chromecast / Google Home…”, I made a bad assumption that you were referring to a Google device. The Chromecast binding auto-discovers Chromecast devices (including Google Home devices) and adds them as openHAB Things.

Is there an openHAB binding for the class of audio devices you are dealing with?

Mike

This is a generic UPnP binding that includes functionality to use a renderer as an audio sink. Sorry to hijack the thread… the discussion about getting an audio sink from an Item name caught my eye, as I was also trying to do the same programmatically.

The documentation on that binding seems pretty sparse; I can’t find information on its Thing declarations or whether it supports auto-discovery. Nevertheless, for the case of Things present in PaperUI…

val Functions.Function1<String,String> getChromecastSink = [ String itemName |
    try
    {
        val allLinks = sendHttpGetRequest("http://localhost:8080/rest/links")
        var String filter = "$..[?(@.itemName=='"+itemName+"')].channelUID"
        var output = transform("JSONPATH", filter, allLinks)
        var chromecastSink = output.split(":").get(0)+":"+output.split(":").get(1)+":"+output.split(":").get(2)
        return chromecastSink
    }
    catch(Exception e)
    {
        logError("my_log", "Failure in getChromecastSink: {}", e.toString)
        return null
    }
]

var String itemName = null
var String broadcastMessage = null

rule "Broadcast Notification"
when
    Item messageBroadcast received update
then
    try
    {
        itemName = messageBroadcast.state.toString

        broadcastMessage = null
        if (itemName.contains(";"))
        {
            broadcastMessage = itemName.split(";").get(1)
            itemName = itemName.split(";").get(0)
        }

        val audioSink = getChromecastSink.apply(itemName)
        playSound(audioSink, "my.mp3")
        if (broadcastMessage !== null)
        {
            createTimer(now.plusSeconds(3)) [ |
                say(broadcastMessage, "voicerss:enUS", audioSink)
            ]    
        }
    }
    catch(Exception e)
    {
        logError("my_log", "Failure in Broadcast Notification: {}", e.toString)
    }
end

The corresponding Item declarations:

String      messageBroadcast
            "Notification Broadcast Information"

Player      backBedroom_audioPlayer
            "Back Bedroom Clock" (gChromecasts)

Once the backBedroom_audioPlayer Item is declared, I add a link to the Control Channel in the PaperUI Thing configuration dialog. Since that Channel requires a Player Item, the Link dialog will only present Player Items in the selection list. Once you select the Item and click Link, the Link will be set after a few seconds.

1 Like

I also ran into the non-blocking issue with the say command. My workaround is to invoke the say command first and then, in a while loop, wait until the Chromecast’s title field is empty. If you are using Python, you can use something like this:

# Excerpt from the linked cast_manager module; say(), hasTitle(), pause() and
# MAX_SAY_WAIT_TIME_IN_SECONDS are defined elsewhere in that module.
import time

# Play the given message and wait till it finishes (up to
# MAX_SAY_WAIT_TIME_IN_SECONDS seconds). Afterward, pause the player.
# After this call, isActive() will return False.
# @param message string the message to tts
def playMessage(message, castItemPrefix = 'FF_GreatRoom_ChromeCast'):
    say(message)

    # Wait until the cast is available again or a specific number of seconds
    # has passed. This is a workaround for the limitation that the openHAB
    # say method is non-blocking.
    seconds = 2
    time.sleep(seconds)
    while seconds <= MAX_SAY_WAIT_TIME_IN_SECONDS:
        if hasTitle(): # this means the announcement is still happening.
            time.sleep(1)
            seconds += 1
        else: # announcement is finished.
            seconds = MAX_SAY_WAIT_TIME_IN_SECONDS + 1

    pause(castItemPrefix) # the Player needs to be manually reset to PAUSE.

Source: https://github.com/yfaway/openhab-rules/blob/master/automation/jsr223/aaa_modules/cast_manager.py

I wonder if the Title channel of the Chromecast Thing is equivalent? One could link a String Item to that channel and use it in Rules logic to implement the same delay logic as in the Python script you posted. I’ll bookmark this for a future enhancement to my Chromecast notifications logic.

Thanks.

Mike

1 Like

Yes, it is the title channel. You can do the same thing with Xtend as well.

String FF_GreatRoom_ChromeCastTitle "Title [%s]"                                
  { channel="chromecast:audio:greatRoom:title" } 

Here is the Title check implemented as openHAB Rules code:

Group gChromecastsTitle
Player myAudioPlayer
        "Google Home"
String  myAudioPlayer_title
        "Google Home: Title" (gChromecastsTitle)
val Functions.Function1<String,String> getChromecastSink = [ String itemName |
    try
    {
        val allLinks = sendHttpGetRequest("http://localhost:8080/rest/links")
        var String filter = "$..[?(@.itemName=='"+itemName+"')].channelUID"
        var output = transform("JSONPATH", filter, allLinks)
        var chromecastSink = output.split(":").get(0)+":"+output.split(":").get(1)+":"+output.split(":").get(2)
        return chromecastSink
    }
    catch(Exception e)
    {
        logError("myLog", "Failure in getChromecastSink: {}", e.toString)
        return null
    }
]

var String itemName = null
var String broadcastMessage = null
var Timer tBroadcastWait = null

rule "Broadcast Notification"
when
    Item VT_messageBroadcast received update
then
    try
    {
        // If it's between 8am and 10pm, broadcast the message
        if ((now.getHourOfDay > 7) && (now.getHourOfDay < 22))
        {
            itemName = VT_messageBroadcast.state.toString
            broadcastMessage = null
            
            if (itemName.contains(";"))
            {
                broadcastMessage = itemName.split(";").get(1)
                itemName = itemName.split(";").get(0)
            }

            val audioSink = getChromecastSink.apply(itemName)
            playSound(audioSink, "dingding.mp3")
            if (broadcastMessage !== null)
            {
                tBroadcastWait = createTimer(now.plusSeconds(1), [ |
                    // The title channel reports "Notification" while the chime is still playing
                    if (gChromecastsTitle.members.filter [titleItem | titleItem.name == (itemName + "_title")].head.state.toString == "Notification")
                    {
                        tBroadcastWait.reschedule(now.plusSeconds(1))
                    }
                    else
                    {
                        tBroadcastWait = null
                        say(broadcastMessage, "voicerss:enUS", audioSink)
                    }
                ])
            }
        }
    }
    catch(Exception e)
    {
        logError("myLog", "Failure in Broadcast Notification: {}", e.toString)
    }
end
2 Likes