InfluxDB query for data from the last finished month and this month so far?

I have a super simple time series with one value per day of the solar production.
It seems I can’t query for the values from the current month only and from the last finished month. Grafana has such filters in the graph display but I can’t figure out the syntax for that…
Can anyone help me with the syntax on that?

select * from B_Power_energy_daily_log 

B_Power_energy_daily_log
time                        value
2017-01-18T21:27:30.293Z    134.513
2017-01-19T21:06:00.045Z    5.951
2017-01-20T21:06:00.04Z     12.539
2017-01-21T22:50:00.115Z    13.033
2017-01-22T22:50:00.028Z    11.301
2017-01-23T21:20:00.027Z    13.73
2017-01-24T21:22:00.05Z     8.992

That should do it.

Edit: I might have understood you wrong. What are you trying to do regarding last month and this month only?

I guess he wanted to do something else.

To chart data from the beginning of the previous month until now, you need to set (upper right corner of Grafana):
from: now/M-M
to: now

To chart up until the end of the current month, set now/M instead of now.


If today is 6th Feb, I want to get results for 1st-31st Jan and for 1st-6th Feb into two separate SingleStat panels.

As I need a single stat with it, it needs to be incorporated directly into the metric query.
What is the syntax there?

@PawelGrzembka, that is exactly what I am trying to do, just not for a graph but for a single stat.

I’m not sure if that’s possible from inside Grafana. InfluxDB doesn’t support months for aggregation (e.g. GROUP BY time()): https://docs.influxdata.com/influxdb/v1.2/query_language/data_exploration/#relative-time
You could go with “4w”…

That would be really limiting!
But I see that Grafana can take months (like “1M”) in the Time Range - Override relative time field.
Does that mean it is locally calculated in the client and passed to InfluxDB as a date range?

I’m afraid so

Workaround mode on. Would it be possible to get the date range prepared in OH2 with a rule and then somehow insert it into the influx/Grafana query with the newly supported subquery option?

Or can a rule in OH2 make a custom query to the persistence DB and put that value into a variable?

You want openHAB to query for “mean() over a certain time range” and store that inside an openHAB number item?

Totally possible, but not with the influxdb persistence service. You will have to build your own script, which could run once per day and update an openHAB item via REST API. Check out the forum for examples.

A script to access an InfluxDB instance is easy. Either use the HTTP interface or one of the programming APIs.
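
To illustrate, here is a minimal untested sketch along those lines: it queries the sum over a time range through the InfluxDB 1.x HTTP interface and pushes the result into an openHAB item over the REST API. The host/ports, credentials and the item name Solar_LastMonth are assumptions, not something from this thread, so adjust them to your setup:

import datetime
import time
import requests

# Hypothetical endpoints and credentials - adjust to your installation
INFLUX_URL = "http://localhost:8086/query"
OPENHAB_ITEM_URL = "http://localhost:8080/rest/items/Solar_LastMonth/state"

def month_sum(start_ns, end_ns):
    # Ask InfluxDB (1.x HTTP API) for the sum of the daily values in [start, end)
    q = ("SELECT sum(value) FROM B_Power_energy_daily_log "
         "WHERE time >= %d AND time < %d" % (start_ns, end_ns))
    r = requests.get(INFLUX_URL,
                     params={"db": "openhab_db", "q": q},
                     auth=("admin", "password"))
    r.raise_for_status()
    # The JSON answer nests the number as results[0].series[0].values[0][1]
    return r.json()["results"][0]["series"][0]["values"][0][1]

def update_item(value):
    # Update the openHAB item state via the REST API (plain-text body)
    requests.put(OPENHAB_ITEM_URL, data=str(value),
                 headers={"Content-Type": "text/plain"})

if __name__ == "__main__":
    # Previous calendar month, expressed as nanosecond epochs
    today = datetime.date.today()
    first_of_this_month = today.replace(day=1)
    first_of_prev_month = (first_of_this_month - datetime.timedelta(days=1)).replace(day=1)
    start_ns = int(time.mktime(first_of_prev_month.timetuple())) * 1000000000
    end_ns = int(time.mktime(first_of_this_month.timetuple())) * 1000000000
    update_item(month_sum(start_ns, end_ns))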

I need sum("value") of the daily solar production values from the last month.

I will have to resort to a shell/Python script run from crontab that queries InfluxDB from the CLI and publishes the result with MQTT to populate the variable in OH. For some reason I use MQTT for all external script communication.

Just to be able to populate that one field in the Grafana dashboard.
Meh, lots of work.

I’m not sure why you need MQTT in the mix, but yes, that sounds like a solution. Bothersome, I know. Take my sine Python script as a starting point.

Please post the result! I would like to do something similar with energy consumption.

Here it is:

A Python script that calls a shell script to execute an InfluxDB query with the correct time boundaries for the last month. The influx CLI seems to only take dates in epoch format. I cron it for 1 minute past midnight on the 1st of each month.

import calendar
import datetime
import time
import subprocess
import paho.mqtt.publish as publish

# Boundaries of the current month, built as "dd.mm.yyyy HH:MM:SS" strings
now = datetime.datetime.now()
cur_month = now.month
cur_year = now.year
cur_month_days = calendar.monthrange(cur_year, cur_month)[1]
cur_month_start = "01." + str(cur_month) + "." + str(cur_year) + " 00:00:00"
cur_month_end = str(cur_month_days) + "." + str(cur_month) + "." + str(cur_year) + " 23:59:59"
pattern = '%d.%m.%Y %H:%M:%S'
# InfluxDB expects nanosecond epoch timestamps, hence the factor of 1000000000
epoch_cur_month_start = 1000000000 * int(time.mktime(time.strptime(cur_month_start, pattern)))
epoch_cur_month_end = 1000000000 * int(time.mktime(time.strptime(cur_month_end, pattern)))

# Boundaries of the previous month, handling the January -> December rollover
prev_month = cur_month - 1
prev_year = cur_year
if cur_month == 1:
    prev_month = 12
    prev_year = cur_year - 1

prev_month_days = calendar.monthrange(prev_year, prev_month)[1]
prev_month_start = "01." + str(prev_month) + "." + str(prev_year) + " 00:00:00"
prev_month_end = str(prev_month_days) + "." + str(prev_month) + "." + str(prev_year) + " 23:59:59"
epoch_prev_month_start = 1000000000 * int(time.mktime(time.strptime(prev_month_start, pattern)))
epoch_prev_month_end = 1000000000 * int(time.mktime(time.strptime(prev_month_end, pattern)))

# Run the query via the helper shell script and parse the sum out of the
# rfc3339-formatted result line (everything after the trailing "Z" of the timestamp)
response = subprocess.Popen("./get_influx_data.sh " + str(epoch_prev_month_start) + " " + str(epoch_prev_month_end),
                            shell=True, stdout=subprocess.PIPE).stdout.read()
words = response.split("Z")
publish.single("inverter/energy_montly_log", float(words[1]))

And a companion shell script get_influx_data.sh, as I was too lazy to escape the whole thing in Python.

#!/bin/sh
influx -username 'admin' -password 'SuperSecretPassword123+' -database 'openhab_db' -precision=rfc3339 -execute "select sum(value) from B_Power_energy_daily_log WHERE time >= $1 AND time <  $2"
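
For completeness, the cron schedule mentioned above (1 minute past midnight on the 1st of each month) would be a crontab entry roughly like the following; the path and script name here are made up, so point them at wherever the two files actually live:

1 0 1 * * cd /home/openhab/scripts && python influx_monthly.py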

Thanks @xsherlock!

Why the need for the shell script or the influx CLI tool? Can’t you use the HTTP interface?

Also I believe your timestamp building can be optimized. Untested:

now = datetime.datetime.now()
cur_month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
# multiply by 1e9 to get the nanosecond epoch that the query above expects
epoch_cur_month_start = int(cur_month_start.strftime("%s")) * 1000000000
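
Following the same idea (again untested), the previous month's boundaries can be derived with replace() and a timedelta, keeping the nanosecond precision the query expects:

import datetime
import time

now = datetime.datetime.now()
cur_month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
# Step back one day from the start of this month, then jump to day 1 of that month
prev_month_start = (cur_month_start - datetime.timedelta(days=1)).replace(day=1)
# Previous month runs from prev_month_start (inclusive) to cur_month_start (exclusive)
epoch_prev_month_start = int(time.mktime(prev_month_start.timetuple())) * 1000000000
epoch_prev_month_end = int(time.mktime(cur_month_start.timetuple())) * 1000000000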