Hello there,
I have a 0.28-inch DC 100V 100A LED digital ammeter-voltmeter that I've been using for quite some time. Now I want to integrate it with a microcontroller.
I have a basic idea of how to go about it, but I could use some more suggestions from you guys.
The device has two wiring harnesses: one powers the meter, and the other measures voltage and current. I need to connect the power harness to a 5V source and the measurement harness to the circuit I want to monitor. The voltage and current readings can then be fed into the microcontroller's analog input pins, so I think I need voltage dividers or a current sensor to keep the signals within the microcontroller's input range. I'm going to write a program that reads the analog signals and converts them into readable voltage and current values.
What else can I do to ensure the proper working of this integration?
Need help to integrate
- that_embedded_guy
- Posts: 1
- Joined: Thu Apr 03, 2025 1:15 pm
The approach seems fine. The ADC conversion can be a little tricky, since what we want to measure is voltage and current itself, so it has to be accurate: both the 0-100 V and 0-100 A ranges have to be scaled down into the MCU's ADC input range. For precise conversion you might want to use a microcontroller with a high-resolution ADC, but keep in mind: WITH HIGH RESOLUTION COMES HIGH NOISE :)
So from the hardware point of view, you can add some capacitors at the ADC channel to smooth out the readings, and from the software point of view, use averaging or a low-pass filter to keep them stable.
ALL THE BEST ^_^