I am a data junkie and have a lot of sensors in my home for which I want to keep records. Most of them measure temperature and electrical power, and I want to see short peaks in the persisted data, so today I store them very frequently or even on every change. As a result, a lot of the MySQL tables are flooded with data that is not very interesting and could be “compressed”: e.g. if the temperature decreases linearly by 1 °C per minute over 30 minutes, I wouldn’t lose much information by storing a value only every 30 minutes, but as soon as there are fast changes I want to see them as accurately as possible, and the min and max values (or rather, every peak) should be kept. If possible, even the integral should match the original data. All without producing GBs of data.
To solve this, I’d wish for a dynamic persistence strategy that monitors every change of the respective items, (temporarily) stores all of them, and then filters them so that the input dataset can be reproduced by interpolation within a target accuracy.
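To illustrate what I mean by “filters them”: something like the Ramer-Douglas-Peucker line-simplification algorithm seems to match, since it keeps exactly those points (including every peak above a threshold) that are needed to reproduce the series by linear interpolation. A rough Python sketch of the idea (the epsilon value is my own choice for illustration, and timestamps are assumed to be strictly increasing):

```python
from typing import List, Tuple

Sample = Tuple[float, float]  # (timestamp in seconds, value)

def rdp(samples: List[Sample], epsilon: float) -> List[Sample]:
    """Keep only the points needed so that linear interpolation stays
    within `epsilon` of the original data. Endpoints and every peak
    that sticks out by more than epsilon survive automatically."""
    if len(samples) < 3:
        return list(samples)
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    # Find the sample with the largest vertical deviation from the
    # straight line between the first and the last point.
    max_dev, max_idx = 0.0, 0
    for i, (t, v) in enumerate(samples[1:-1], start=1):
        interpolated = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        dev = abs(v - interpolated)
        if dev > max_dev:
            max_dev, max_idx = dev, i
    if max_dev <= epsilon:
        # The straight line is good enough: drop everything in between.
        return [samples[0], samples[-1]]
    # Otherwise split at the worst point and simplify both halves.
    left = rdp(samples[: max_idx + 1], epsilon)
    right = rdp(samples[max_idx:], epsilon)
    return left + right[1:]  # avoid duplicating the split point

# Example: a 30-minute linear ramp collapses to its two endpoints.
ramp = [(60.0 * m, 21.0 - m) for m in range(31)]
print(rdp(ramp, epsilon=0.1))  # -> [(0.0, 21.0), (1800.0, -9.0)]
```

This bounds the interpolation error by epsilon, but it does not preserve the integral exactly; that would need an additional correction step on top.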
In theory I could duplicate all items, persist the originals in a round-robin database, and do the long-term storage with rules. But first, this looks quite complex to achieve, and duplicating ALL items is high effort and error prone; second, I don’t like the idea of copy-pasting a rule just because I added a new item…
Any idea is welcome