After a lot of troubleshooting I got this to work, but the ambient and meat temperatures are mixed up.
I also made a Node-RED flow that detects (with the FRITZ!Box device_tracker) when the Meater is on the network and then runs the script.
I'll make a pull request for detailed Hass setup information in your repository later.
@nailik, if the ambient and meat temps were switched, can you provide me with the software version of your Meater Block and of the probes? They are not switched for me, so I want to figure out what is causing this.
@JayBinks , you said that you had figured out the protocol buffers. I have been working on the same thing for my code instead of the brute force I am currently doing. I was wondering if you would be willing to share your .proto file?
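For anyone else brute-forcing the packets in the meantime: without a .proto file you can still walk the generic protobuf wire format, since every field starts with a varint key (`field_number << 3 | wire_type`). This is just a sketch of that generic decoding, not the actual Meater schema:

```python
def read_varint(data, pos):
    """Decode one base-128 varint starting at pos; return (value, new_pos)."""
    result = 0
    shift = 0
    while True:
        byte = data[pos]
        pos += 1
        result |= (byte & 0x7F) << shift
        if not byte & 0x80:  # high bit clear -> last byte of the varint
            return result, pos
        shift += 7

def scan_fields(data):
    """Yield (field_number, wire_type, value) for each top-level field."""
    pos = 0
    while pos < len(data):
        key, pos = read_varint(data, pos)
        field, wire = key >> 3, key & 0x07
        if wire == 0:                      # varint
            value, pos = read_varint(data, pos)
        elif wire == 2:                    # length-delimited: bytes/string/submessage
            length, pos = read_varint(data, pos)
            value = data[pos:pos + length]
            pos += length
        else:                              # fixed32/fixed64 omitted for brevity
            raise ValueError(f"unhandled wire type {wire}")
        yield field, wire, value
```

Length-delimited fields (wire type 2) can themselves be nested messages, so you can feed `value` back into `scan_fields` to explore the packet layer by layer.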
Seems like most of these approaches are reverse engineering the Meater. Just curious why, since there is an API https://github.com/apption-labs/meater-cloud-public-rest-api ?
I am doing it because I don't want my data sent to the cloud, and I don't want to rely on the cloud for my data. Also, reverse engineering is fun and I learn a lot; here, for example, I am learning about Google Protocol Buffers.
I have made a bunch of updates that should resolve your temp mix-up, among other fixes. A new program has been added called meater_reader_buf.py. Give it a try. (You need to add SCALE = C to your config.ini.)
Works, thank you.
I would love some more changes:
- Adding MQTT username/password to the config.
- Adding a base path to the config, because I have to set paths, e.g.:
file = open("/config/python_scripts/meat_table.txt", "r")
but I also need to set this for the config file itself.
These changes have been added.
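With those changes, a config.ini along these lines should work. The section and option names below are illustrative assumptions; check the repository's sample config for the exact keys (SCALE = C is the only one confirmed above):

```ini
; Illustrative config.ini -- option names are assumptions,
; check the repository's sample config for the exact spelling.
[mqtt]
host = 192.168.1.10
username = meater
password = secret

[meater]
SCALE = C
; base path for the lookup tables, per the path change requested above
path = /config/python_scripts/
```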
Nice, I appreciate your work.
However, now that the old script is deprecated and the new one uses protobuf, how do I install protobuf on HassOS?
I run the Hass Docker image, not HassOS, so I run this program on a separate system. But this suggestion might work:
There's a Hassio "Custom Deps" addon that lets you install whatever Python things you'd like.
What was the reason to use protobuf? It's unfortunate if I need to install another addon just to run a simple script; I would like to keep it as simple as possible.
protobuf is what is used to create the UDP packet, so it is the easiest and most reliable way to read the data packet. And I designed the code to be standalone from Home Assistant. I am looking into ways to get this written as a Hass plugin so that it will install its own dependencies. That, however, will take some trial and error, since I run the vanilla Home Assistant Docker version instead of HassOS.
I am also working on a Docker container for this now, and it will work with HassOS + Portainer: https://github.com/hassio-addons/addon-portainer .
Just wanted to let you know I am following this thread with great interest. Thus far all solutions work nicely at my end, and I am really eager to see a solution that works as a wrapped component. It's just that my coding skills are virtually non-existent…
Keep up the good work!
The user Sotolotl has been working on an official integration that we will hopefully see in the next release of HA:
I will follow what Sotolotl is doing closely. I like the integration idea, just not the use of the Meater cloud API.
So from what I understand, this only works with the MEATER Block, not with the MEATER or MEATER+, correct? I want to buy a MEATER and would like to have it integrated into my system. I prefer the MQTT route because I also use Node-RED, and with MQTT I can have it working in both.
This works with all three. It will connect to a Meater Block directly, or to the other two via the Meater app (MeaterLink) on a smart device.
I read about that in the posts above, but I was under the impression that it was not working 100% via the app. So basically we use another smartphone to connect to the probe via Bluetooth, the phone then communicates over the network, and you grab those packets and publish them to MQTT?
Thank you
Correct. For the Meater and Meater+, the traffic looks like this:
Meater(+) -> Bluetooth -> Meater App(MeaterLink) -> wifi -> MeaterBlockMQTT app -> mqtt
And I do have this working 100%; I just hadn't posted updates on its working status here. Most of those updates are on GitHub.