Two ADC GPIO inputs with ESP32 DevKit V1

I am using a single ESP32 Devkit V1 on WiFi to monitor utility power current in Amps.

My YAML code is not working.

sensor:
  # Current measurement CT B
  - platform: adc
    pin: GPIO32 # Current measurement CT B
    name: "Power Sensor ESP32_3_B"
    id: power_sensor_esp32_3_b
    update_interval: 5s
    accuracy_decimals: 2
    #filters:
    #  - calibrate_linear:
    #    - 0.000002 -> 0.0
    #    #- 0.00372 -> 7.65
    #    - 0.00372 -> 0.015
    #  - lambda: return x;
    unit_of_measurement: "Amps"
    on_value:
      then:
        - script.execute: flash_led

  # Current measurement CT C
  - platform: adc
    pin: GPIO33 # Current measurement CT C
    name: "Power Sensor ESP32_3_C"
    id: power_sensor_esp32_3_c
    update_interval: 5s
    accuracy_decimals: 2
    #filters:
    #  - calibrate_linear:
    #    - 0.000002 -> 0.0
    #    #- 0.00372 -> 7.65
    #    - 0.00372 -> 0.015
    #  - lambda: return x;
    unit_of_measurement: "Amps"

Both CTs show the exact same current. This differs from what is actually happening, as shown by my AC clamp meter, which shows different incoming Amps on each phase.

Obviously I have not yet attempted to calibrate the CTs. I do not understand why the ESP32 is reporting the same current down to the milliamps on both legs.

What have I done wrong please?

BTW: this works perfectly when I use two ESP8266s, which only have one ADC input each, hence the ESP32 migration.

Thanks - Pete

What is the actual voltage read on your clamps (with a multimeter)? Remember the ESP8266 and ESP32 ADCs do read slightly differently, it’s possible you may need to set an attenuation value.

The clamp output voltage is very low. I'm reading less than 0.25V at about an 8 Amp load. The CT I am using is the Emporia Gen 2 200A Current Sensor, which outputs 0.0 – 0.333V (measuring 0 – 200A), so the output voltage is proportional to current. There is an RC network before the ADC input.

That looks like this. The 22R is a ballast resistor. My circuit is only using the left-hand part up to the NodeMCU, so just the 2 x 10K and 22R plus the 10uF capacitor. Obviously the CT plugs into Tip and Sleeve of the jack.
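For when calibration is attempted later, a calibrate_linear filter can map the raw ADC voltage to Amps. This is only a sketch: it assumes the CT's 0 – 0.333V output (for 0 – 200A) arrives at the ADC pin unchanged, and it ignores the bias/divider network described above, so the actual calibration points would need to be measured against the clamp meter.

```yaml
sensor:
  - platform: adc
    pin: GPIO32
    name: "Power Sensor ESP32_3_B"
    update_interval: 5s
    accuracy_decimals: 2
    unit_of_measurement: "Amps"
    filters:
      # Assumed mapping from the Emporia CT figures quoted above:
      # 0 V -> 0 A, 0.333 V -> 200 A. Replace these points with
      # values measured against a clamp meter, since the divider
      # and bias network will shift the voltage seen at the pin.
      - calibrate_linear:
          - 0.0 -> 0.0
          - 0.333 -> 200.0
```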

Please remember that I am seeing differing readings on a pair of ESP8266s, whereas they are absolutely identical on the ESP32. Worse, since I'm using WiFi I cannot use the ESP32's second ADC. The YAML syntax checker won't allow it.

I am pretty sure I have messed up my code but I cannot see where – yet.

There’s not much to mess up with a simple ADC - it looks fine. Your circuit looks fine too so as you can guess I am no help… :slight_smile:

My only suggestion is that your voltage at the ESP is over 1.1V and the auto attenuation isn't working, which is why I suggested setting a manual attenuation as per the ADC instructions.

From the ADC page:

ESP32 Attenuation

On the ESP32 the voltage measured with the ADC caps out at ~1.1V by default as the sensing range (attenuation of the ADC) is set to 0db by default. Measuring higher voltages requires setting attenuation to one of the following values: 0db, 2.5db, 6db, 11db. There’s more information at the manufacturer’s website.

Thanks for your reply. I read the info on ESP32s capping out at 1.1V. I believe that this may relate to earlier versions. Why? Because I put 3V3 and GND alternately on both GPIO32 and 33. They both reported 3V2 when held high and about 0.08V when grounded. However, you are right. My potential divider of 2 x 10K resistors will always give > 1.1V at the GPIO pin.

Need to rethink this.

Thanks again - Pete

I am no expert on this subject but have gotten the ESPHome power sensor working.
The ESP8266 (Wemos D1 mini) has a built-in voltage divider so it can measure 0V-3.3V. I believe this is hard-wired on the D1 mini.
The ESP32 will by default read 0V-1.1V but has an attenuation: parameter that can change that to 0V-3.3V.
This is explained here.

Looks like adding
attenuation: 11db
to your YAML file should make the ESP32 work like the NodeMCU ESP8266.
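Applied to the config from the first post, one of the two sensors would then look something like this (a sketch; 11db is assumed to be the right range for the > 1.1V seen at the pin, per the ADC docs):

```yaml
sensor:
  - platform: adc
    pin: GPIO32
    name: "Power Sensor ESP32_3_B"
    id: power_sensor_esp32_3_b
    attenuation: 11db  # full-scale ~3.3V instead of the ~1.1V default
    update_interval: 5s
    accuracy_decimals: 2
    unit_of_measurement: "Amps"
```

The same attenuation: line would go on the GPIO33 sensor as well, so both channels use the same range.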
Your schematic looks like this one for an ESP8266.

I missed the earlier post about attenuation: