Monitoring electricity and solar panels
Installation
Some time ago the local energy provider installed a digital electric meter, an EMH eHZ. This electric meter can report the current power consumption, so I bought the Tasmota-based bitshake device and mounted it on the EMH meter to track our electricity.
The meter provides three values:
- The overall (total) consumption counted by the electric meter
- The current power drawn from the outside energy provider
- The current power sent to the outside energy provider when the solar panels are in the sun and generating energy
Here is an overview of my infrastructure for tracking the solar energy:
[Figure: bitshake Tasmota infrastructure overview]
The energy provider, not the energy vendor, has to unlock the electric meter so that it exposes this information; a formal email or service request is enough to get this enabled. Once the meter is unlocked, the Tasmota reader is attached to the meter's optical (infrared diode) interface. The Tasmota device can expose the readings in several ways, but I use MQTT to send the current values to a Mosquitto MQTT server running on a Raspberry Pi in my local home network.
The Raspberry Pi works well as the server for the following tasks:
- Running the Mosquitto MQTT broker that receives all MQTT events
- Running the database (I use Postgres) that stores all data received from all sources
- Hosting the Grafana-based web service used to analyze the received data
At the very beginning I used a Raspberry Pi Model 3, but with the new SSD support I switched to a Raspberry Pi Model 4, which uses less energy, and storing the data on an SSD is much safer than using USB devices. My Raspberry Pi Model 4 carries an M.2 SSD; USB hard disks and SD cards are far less reliable.
Inside the Tasmota firmware you need to enter a script for the meter-specific protocol of your energy meter. This script is entered through the Tasmota console. My script for the eHZ device looks like this:
>D
>B
=>sensor53 r
>M 1
+1,5,s,16,9600,eHZ,4
1,77070100010800ff@1000,Verbrauch,kWh,E_in,3
1,77070100020800ff@1000,Einspeisung,kWh,E_out,3
1,77070100100700ff@1,akt. Leistung,W,Power,0
#
This script maps the data received from the eHZ into an MQTT message and sends it to a specific topic on the MQTT server. The Mosquitto server runs on my Raspberry Pi; it lets clients subscribe to specific topics, and each time new data arrives on a topic it is forwarded to all subscribers of that topic. Topics can be used to implement a need-to-know principle, so that clients only listen to the information relevant to them.
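For illustration, a telemetry message published by Tasmota on the tele/tasmota_9291A6/SENSOR topic then looks roughly like this (the field names follow the script above; the values and the exact JSON layout are examples, not captured output):

{
  "Time": "2025-01-15T12:30:00",
  "eHZ": {
    "E_in": 12345.678,
    "E_out": 2345.678,
    "Power": 190
  }
}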
MQTT message processing
In my private time I am a Go developer. At work I have to handle a lot of governance topics, so Go development on the side is where the fun is.
I wanted to do something useful with the Tasmota MQTT messages, and in the end I decided to simply store them in the database so that I could analyze all this data later. Over time I had written some special tools for other projects. One of these tools handles multiple databases through one framework; with this flynn API I was able to store the data easily in any database I liked. At the beginning I worked with specific databases, then with MariaDB, but finally I decided to use Postgres.
The small service I developed runs as a Docker microservice. It is very simple: it subscribes to the specific topic, receives the MQTT messages, and stores the data in the database.
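As an illustration, here is a minimal sketch of such a service in Go, assuming the Eclipse Paho MQTT client and the pgx Postgres driver. The topic and the table name home are taken from the configuration shown below; the column names are assumptions based on the queries later in this post, and the real mqtt2db service uses the flynn API and proper error handling instead:

package main

import (
	"context"
	"encoding/json"
	"log"
	"os"

	mqtt "github.com/eclipse/paho.mqtt.golang"
	"github.com/jackc/pgx/v5"
)

// sensorMessage mirrors the JSON that Tasmota publishes on tele/<device>/SENSOR.
type sensorMessage struct {
	EHZ struct {
		EIn   float64 `json:"E_in"`
		EOut  float64 `json:"E_out"`
		Power float64 `json:"Power"`
	} `json:"eHZ"`
}

func main() {
	ctx := context.Background()

	// Connect to Postgres; the connection URL comes from the environment.
	db, err := pgx.Connect(ctx, os.Getenv("MQTT_STORE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close(ctx)

	// Connect to the Mosquitto broker.
	opts := mqtt.NewClientOptions().AddBroker("tcp://mqtt5:1883")
	client := mqtt.NewClient(opts)
	if t := client.Connect(); t.Wait() && t.Error() != nil {
		log.Fatal(t.Error())
	}

	// Subscribe to the Tasmota sensor topic and store each event as one row.
	client.Subscribe("tele/tasmota_9291A6/SENSOR", 0, func(_ mqtt.Client, m mqtt.Message) {
		var msg sensorMessage
		if err := json.Unmarshal(m.Payload(), &msg); err != nil {
			log.Println("skipping malformed message:", err)
			return
		}
		// Assumed columns; inserted_on is expected to default to now().
		if _, err := db.Exec(ctx,
			"INSERT INTO home (total, powerout, powercurr) VALUES ($1, $2, $3)",
			msg.EHZ.EIn, msg.EHZ.EOut, msg.EHZ.Power); err != nil {
			log.Println("insert failed:", err)
		}
	})

	select {} // block forever and keep receiving events
}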
The Mosquitto MQTT broker and the Postgres database are open source software and can be used free of charge.
The Mosquitto server, the MQTT-to-database service, and the database itself run inside Docker or Podman containers on the same Raspberry Pi Model 4.
The Mosquitto server and the mqtt2db service run in one Docker Compose project, the database in a separate one. Here is what the docker-compose file could look like:
version: '3'
services:
  mqtt5:
    image: docker.io/library/eclipse-mosquitto
    container_name: mosquitto-mqtt5
    restart: unless-stopped
    volumes:
      - ./mqtt/config:/mosquitto/config:rw
      - ./mqtt/data:/mosquitto/data:rw
      - ./mqtt/log:/mosquitto/log:rw
      - "/etc/localtime:/etc/localtime:ro"
  mqtt2db:
    image: thknie/mqtt2db:0.9.0.63
    container_name: mqtt2db
    restart: unless-stopped
    environment:
      - MQTT_STORE_URL=postgres://<Postgres host>:5422/home
      - MQTT_STORE_TABLENAME=home
      - MQTT_STORE_USER=postgres
      - MQTT_STORE_PASS=${POSTGRES_PASSWORD}
      - LOGPATH=/mqtt2db/log
      - MQTT_TOPIC_URL=mqtt5:1883
      - MQTT_TOPIC=tele/tasmota_9291A6/SENSOR
      - MQTT_TOPIC_USERNAME=<user>
      - MQTT_TOPIC_PASSWORD=<password>
    volumes:
      - "$PWD/mqtt2db/log:/mqtt2db/log"
      - "/etc/localtime:/etc/localtime:ro"
On the Tasmota device, the matching MQTT settings need to be configured. I use MQTT user and password authentication so that only known services can connect. Tasmota itself has no API to write directly into a database; publishing to a Mosquitto server is the most common setup.
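The broker settings can be entered in the Tasmota web UI (Configuration -> Configure MQTT) or via the console; a sketch with placeholder values for host, user, and password:

Backlog MqttHost <raspberry-pi-host>; MqttPort 1883; MqttUser <user>; MqttPassword <password>; Topic tasmota_9291A6

The Topic value determines the tele/<topic>/SENSOR channel the mqtt2db service subscribes to.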
The MQTT2DB service subscribes to the Tasmota topic and stores the data in a database. Inside the MQTT2DB tool I defined the mapping from the names in the event to the column names in the database manually. Doing this dynamically might be an option; I left that for future development.
Of course it is possible to extend the docker-compose configuration or the Podman pod configuration to contain all services: Mosquitto, MQTT2db, and Postgres. But several of my services use the same database, and in addition it is better to be able to restart only the database for backups, maintenance, or updates of Postgres.
The result is that all data received from the energy meter is collected and stored in the database.
Power consumption review
By tracking the power consumption and generation I was able to identify the corresponding consumers. The big consumers are, unsurprisingly, the washing machine, the dishwasher, and the refrigerator. Some consumers I could identify and replace with more efficient ones; LED lighting in particular brought a big reduction. But the washing machine and the dishwasher especially do not consume power constantly: the wash cycle has heating and cooling phases. The solar panels in combination with the battery provide constant energy. My main aim was to cover the base load, so that this part is generated by the solar panels as much as possible. Our household base consumption is 190 watts, which is not much considering that two NAS servers, a number of small devices, and the refrigerator are running all the time.
To identify and manage the overall consumption it was necessary to analyze the house's energy use, which becomes possible once you have the big picture with all this data in the database. Sure, I could use artificial intelligence for this, but I assume it is not necessary; it is enough to analyze the data over time and decide whether to feed the generated energy into the house directly or to fill up the battery first.
Either of these strategies is possible:
- Fill up the battery first and feed the house only with what the battery does not need
- Feed the house up to the base consumption first and put the rest into the battery
It is important to keep in mind that battery storage costs extra energy for charging and battery management. In addition, my 1024 Wh battery is full after about one and a half hours; after that the generated energy cannot be used and goes out to the grid without any charge or benefit on my side. And all that even though our solar panels only get direct sun in the afternoon.
Scenario | Consumption
---|---
Base consumption, summer | ca. 140 W
Base consumption, winter | ca. 190 W
TV active | ca. 350 W (each TV ca. 150-200 W)
Home office desk with three monitors | ca. 380 W (down to ca. 200 W when the screen saver is active)
The analysis of the energy and electricity data is possible because everything is stored in the database. I use DBeaver or Grafana to analyze the energy consumption; a good graph shows best what is consumed and what goes into the public grid. At the very beginning I powered off each floor of the house to see which part used how much energy. Later on I replaced a number of old lamps and light bulbs, which gave a fundamental reduction in consumption: replacing a 60 W bulb with a 10 W one saves a lot, especially when the previous owner of the house had put up to 8 bulbs in a single lamp. In the early 70s electricity consumption was simply not a concern.
Some unexpected things came to light as well. For example, our heating runs on oil, and the burner pre-heats the oil electrically before it is burned. This cannot be disabled, but because we control the heater through an app, we optimized the heating schedule and switch the heating off entirely when we don't need it.
On the other hand, the technical equipment, like my Raspberry Pi or the NAS servers, uses less energy than expected. And each kitchen appliance that had to be replaced because of a defect, like the refrigerator, the oven, the coffee machine, or the dishwasher, reduced the consumption further.
Finally, I checked which devices can be switched off at night. It turned out that the TVs consume very little overnight, so I leave them in standby. The NAS servers, however, are not used at night at all, so I enabled a hibernation schedule for them: we don't need the NAS servers from midnight until 9 o'clock in the morning, and with Wake-on-LAN I can wake them up if needed, as the sketch below shows.
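Since Wake-on-LAN is nothing more than a UDP broadcast, waking a NAS can even be scripted. A minimal sketch in Go (the MAC address is a placeholder):

package main

import (
	"bytes"
	"net"
)

// wake sends a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by
// the target MAC address repeated 16 times, broadcast over UDP port 9.
func wake(mac string) error {
	hw, err := net.ParseMAC(mac)
	if err != nil {
		return err
	}
	packet := append(bytes.Repeat([]byte{0xFF}, 6), bytes.Repeat(hw, 16)...)

	conn, err := net.Dial("udp", "255.255.255.255:9")
	if err != nil {
		return err
	}
	defer conn.Close()
	_, err = conn.Write(packet)
	return err
}

func main() {
	// Placeholder MAC address of the NAS to wake up.
	if err := wake("00:11:22:33:44:55"); err != nil {
		panic(err)
	}
}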
Solar panel data
One basic buying factor for my solar panels was that I can access the data through an API. Unfortunately, a lot of hardware depends on the cloud; I did not find any solar panel system without cloud usage, but Ecoflow at least provides a REST API. So I decided to buy solar panels with a microinverter from Ecoflow. Later on, after some practice, I bought a battery as well. Both the microinverter data and the battery data can be read using the REST API.
There are two ways to access the data:
- Ecoflow offers an MQTT stream one can listen to. The problem is that this MQTT service only sends data while you are in the Ecoflow app, and the message syntax, a Protocol Buffers based format, is poorly documented. In particular, how to enable or disable the MQTT streaming is not documented, and the format keeps changing.
- The HTTP REST API can be used to read the device data of all Ecoflow devices registered to your account. Through HTTP requests to this API I can get the current consumption, the watts generated by the solar panels, and the battery parameters. It is also possible to set some device parameters: for example, I can set how many watts should be sent into the house, with the rest going into the battery for later.
Here as well I wrote a small application. The application ecoflow2db can read the MQTT event stream or poll the HTTP API for each device and writes the data into the database. Because the parameters change from time to time, the corresponding table in the database is adapted when that happens. The tool can also set the power currently needed by the house; depending on the solar panel output, it may or may not charge the battery with whatever exceeds that value.
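To give an idea of the HTTP side, here is a rough sketch of reading all device quotas from the Ecoflow developer API in Go. The endpoint, header names, and HMAC-SHA256 signing scheme reflect the published Ecoflow IoT open API as I understand it, but they are assumptions and may have changed; ecoflow2db is the authoritative implementation:

package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"math/rand"
	"net/http"
	"os"
	"time"
)

// sign computes the hex-encoded HMAC-SHA256 of msg with the secret key,
// which the Ecoflow API expects in the "sign" header (assumed scheme).
func sign(secretKey, msg string) string {
	mac := hmac.New(sha256.New, []byte(secretKey))
	mac.Write([]byte(msg))
	return hex.EncodeToString(mac.Sum(nil))
}

func main() {
	accessKey := os.Getenv("ECOFLOW_ACCESS_KEY")
	secretKey := os.Getenv("ECOFLOW_SECRET_KEY")
	serial := os.Getenv("ECOFLOW_DEVICE_SN")

	nonce := fmt.Sprintf("%06d", rand.Intn(1000000))
	ts := fmt.Sprintf("%d", time.Now().UnixMilli())

	// Assumption: request parameters are signed in this concatenated form.
	msg := fmt.Sprintf("sn=%s&accessKey=%s&nonce=%s&timestamp=%s",
		serial, accessKey, nonce, ts)

	req, err := http.NewRequest(http.MethodGet,
		"https://api.ecoflow.com/iot-open/sign/device/quota/all?sn="+serial, nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("accessKey", accessKey)
	req.Header.Set("nonce", nonce)
	req.Header.Set("timestamp", ts)
	req.Header.Set("sign", sign(secretKey, msg))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // JSON document with the current device parameters
}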
Graphical examination
How do I analyze all the data stored in the database? The best option is to create graphs and charts, and the best tool for that at the moment is Grafana. Before Grafana I used DBeaver with some SQL queries to get at the data; that is a good way to develop the SQL statements that are later used inside Grafana to render the graphs. The Grafana server can read the data directly from the database through its SQL data source connections. Below is the docker-compose configuration I use to start the Grafana microservice. It does not take many resources, so I run it on the Raspberry Pi as well. If a query or a dashboard page is really complex, the Grafana server can use a lot of CPU and memory; in that case it might be better to move it to another server. Keep in mind that the SQL query itself is always executed on the server where the database runs.
I run it in a Podman pod. The idea at the beginning was to run everything in Podman user space, but configuring Podman to restart containers automatically after a reboot did not work as expected; it is too complex. Therefore the really important services are still running inside Docker.
version: '3.8'
services:
  grafana:
    image: grafana/grafana-enterprise
    container_name: grafana
    restart: unless-stopped
    environment:
      # log level; raise from info to debug for troubleshooting
      - GF_LOG_LEVEL=info
    ports:
      - '3001:3001'
    volumes:
      - 'grafana_storage:/var/lib/grafana'
      - './grafana.ini:/etc/grafana/grafana.ini'
volumes:
  grafana_storage:
    driver_opts:
      type: none
      device: ${GRAFANA_DATA}
      o: bind
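The Postgres connection can be added as a data source in the Grafana web UI; alternatively, it can be provisioned from a file mounted under /etc/grafana/provisioning/datasources. A minimal sketch, where host, database, and user are placeholders:

apiVersion: 1
datasources:
  - name: PostgreSQL
    type: postgres
    url: <Postgres host>:5432
    user: grafana
    jsonData:
      database: home
      sslmode: disable
    secureJsonData:
      password: ${POSTGRES_PASSWORD}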
SQL examination
The ecoflow2db tool creates the corresponding database tables automatically; each device gets its own table. The MQTT data goes into an extra table containing the Ecoflow MQTT events decoded from the protobuf format. Because that format changes, the table has to change just as often; that is why I keep this part of the implementation minimal.
These SQL tables can now be used to analyze the data. Here is a simple example to get the data of the electric meter:
SELECT "inserted_on" AT TIME ZONE 'UTC' AT TIME ZONE 'GMT' AS "time",
powercurr AS "Stromnetz", powerout AS "Ins Stromnetz"
FROM home ORDER BY inserted_on desc
It may be necessary for the query to handle the time zone explicitly. Depending on the database configuration or the client's INSERT statement, the stored time is either UTC or local time; the query above adapts it to the time zone used inside Grafana. For an analysis directly on the database, an SQL tool such as DBeaver can be used.
To get a rough forecast of the year's consumption from the data so far, the idea is simple: divide the consumption since the start of the year by the number of days elapsed and multiply by 365. The following SQL statement calculates this:
select
  (ce.total - mx.total) / (CURRENT_DATE - date '2024-12-31') * 365
from
  (select total, inserted_on
   from home
   order by inserted_on desc
   limit 10000) ce,
  (select total
   from home
   where date_part('year', inserted_on) = date_part('year', now())
   order by inserted_on
   limit 1) mx
order by inserted_on
And here is the query behind a graph showing, per month, the consumption so far and the forecast for the whole year:
with t1 as (
  select h.inserted_on,
         inserted_on::date as day,
         row_number() over (partition by to_char(inserted_on::date, 'mm/yyyy')
                            order by h.inserted_on desc) as rn,
         date_part('year', inserted_on) as oyear,
         h.total as total
  from home h),
t2 as (
  select date_part('year', inserted_on) as tyear,
         row_number() over (partition by date_part('year', inserted_on)
                            order by h.inserted_on asc) as rx,
         h.total as start
  from home h)
select t1.inserted_on,
       t1.total - t2.start as "Verbrauch bis Monat",
       greatest(0, (t1.total - t2.start) / date_part('doy', t1.inserted_on) * 365
                   - (t1.total - t2.start)) as "Verbrauchsprognose"
from t2, t1
where rx = 1 and rn = 1 and t2.tyear = t1.oyear
order by inserted_on
The dashboard combining the different data can look like the example below; I created several views of the energy consumption: