I'm developing an object detection system built around an infrared sensor: an LED paired with an infrared receiver. The goal is to identify characteristics of non-uniform objects passing through the sensor, using the analog signal generated by the receiver.
To achieve this, I'm using a microcontroller to process the signal and classify the detected objects. However, the system is battery-powered and objects cross the sensor at random intervals, so I would like to implement strategies to save power and extend the battery life.
My first question is whether there's a way to put the microcontroller into a "deep sleep" mode and wake it up only when there's a change in the voltage level of the receiver, ensuring that no millisecond of received signal information is lost.
Additionally, I'd like to know if there's any integrated circuit that can convert the analog signal to digital and temporarily store it until the microcontroller is triggered to perform object classification.
Thanks in advance for any guidance or suggestions you can provide.
Comment (periblepsis, May 13, 2024): Too little information. This could be a colored/clear broken/crushed glass sorter for all I can tell. And what the signals look like, in time, isn't ancillary; it's central to narrowing down suggestions to consider. While I may have some experience that may apply, I can't offer any advice from what I see above.
1 Answer
My first question is whether there's a way to put the microcontroller into a "deep sleep" mode and wake it up only when there's a change in the voltage level of the receiver, ensuring that no millisecond of received signal information is lost.
That depends on the microcontroller, but assuming you're doing this on something comparable to a Cortex-M3 or -M4, it's very likely that your specific MCU supports relatively deep sleep modes from which it can wake up in less than half a millisecond. Whether all clocks are already stable within 500 μs is a different question; but this ends up being a trade-off between microcontroller power and sampling jitter.
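To illustrate the wake-up part: here's a minimal sketch assuming an STM32F4 with the ST HAL, where the receiver (or a comparator on its output) drives a GPIO configured as an EXTI interrupt line. The pin choice and the clock-setup function are assumptions, not anything from your question:

```c
#include "stm32f4xx_hal.h"  // assuming an STM32F4 part with the ST HAL

extern void SystemClock_Config(void);  // the usual CubeMX-generated clock setup

// Hypothetical wiring: receiver/comparator output on PA0, configured
// (e.g., in CubeMX) as an EXTI line with a rising-edge interrupt.
void wait_for_receiver_edge(void)
{
    HAL_SuspendTick();                       // stop SysTick so it can't wake us early
    HAL_PWR_EnterSTOPMode(PWR_LOWPOWERREGULATOR_ON, PWR_STOPENTRY_WFI);
    // Execution resumes here after the EXTI interrupt fires.
    SystemClock_Config();                    // clocks fall back to HSI after Stop mode
    HAL_ResumeTick();
}

// EXTI callback: called by the HAL when PA0 changes; nothing to do here,
// the wake-up itself is the point.
void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    (void)GPIO_Pin;
}
```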
However, I'd guess you're barking up the wrong tree here: typical infrared LEDs run in the 5 to 20 mA region. That is much, much more than a microcontroller draws in a "not so deep" sleep mode. So the important thing would probably be to not run the LED continuously, rather than to minimize CPU time.
So my guess is that you want your microcontroller firmware to turn the LED off any time you're not explicitly sampling the infrared receiver; you can then optimize your power usage a little more by sleeping in between samples.
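As a sketch of that duty-cycling idea (again assuming an STM32 with the ST HAL; the LED pin, ADC handle, settling time, and helper names are all made up):

```c
#include "stm32f4xx_hal.h"

// Hypothetical wiring: IR LED on PB1, receiver on an ADC channel of hadc1.
extern ADC_HandleTypeDef hadc1;
extern void process_sample(uint16_t v);  // hypothetical: your detection code

uint16_t sample_with_led_pulse(void)
{
    uint16_t value = 0;

    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_1, GPIO_PIN_SET);    // LED on
    for (volatile int i = 0; i < 100; i++) { }             // settling time (made-up figure)

    HAL_ADC_Start(&hadc1);
    if (HAL_ADC_PollForConversion(&hadc1, 1) == HAL_OK)
        value = (uint16_t)HAL_ADC_GetValue(&hadc1);
    HAL_ADC_Stop(&hadc1);

    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_1, GPIO_PIN_RESET);  // LED off again
    return value;
}

// Main loop: sample, process, then doze until the next periodic timer interrupt.
void sampling_loop(void)
{
    for (;;) {
        process_sample(sample_with_led_pulse());
        HAL_PWR_EnterSLEEPMode(PWR_MAINREGULATOR_ON, PWR_SLEEPENTRY_WFI);
    }
}
```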
Note that proper analog-to-digital conversion of, e.g., the current coming from a photodiode requires analog low-pass filtering anyway. That means that if you switch the LED on and off faster than you expect the light to change due to moving objects, your low-pass filter will remove the flickering as well. It's probably also a good idea to add one measurement every couple hundred milliseconds with the LED left off for longer, to compensate for dark current (and ambient light, if any).
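The dark/ambient compensation could be as simple as occasionally taking a reading with the LED off and subtracting it as a baseline (sketch, reusing the hypothetical helper from above; the interval is arbitrary):

```c
#include "stm32f4xx_hal.h"

extern ADC_HandleTypeDef hadc1;
uint16_t sample_with_led_pulse(void);   // the hypothetical helper from above

static uint16_t dark_level;             // baseline: ambient light + dark current

int32_t compensated_sample(uint32_t sample_index)
{
    // Every 400th sample (at 0.5 ms each, roughly every 200 ms),
    // take a reading with the LED left off and store it as the baseline.
    if (sample_index % 400 == 0) {
        HAL_ADC_Start(&hadc1);
        if (HAL_ADC_PollForConversion(&hadc1, 1) == HAL_OK)
            dark_level = (uint16_t)HAL_ADC_GetValue(&hadc1);
        HAL_ADC_Stop(&hadc1);
    }
    return (int32_t)sample_with_led_pulse() - (int32_t)dark_level;
}
```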
Additionally, I'd like to know if there's any integrated circuit that can convert the analog signal to digital and temporarily store it until the microcontroller is triggered to perform object classification.
That is literally what the ADC inside the microcontroller already does; an external IC wouldn't be inherently better or less power-hungry at that.
…is triggered to perform object classification.
The most sensible triggering mechanism in terms of complexity and power consumption is very likely just doing it in microcontroller software. These things are built for control jobs like yours! Every reputable modern microcontroller with an ADC that I can think of has enough timer units to initiate an ADC conversion at regular intervals, and a DMA unit that, without the CPU core getting involved, copies each value from the ADC to a memory location. The CPU can read the values from there the next time a timer wakes it up, and then decide whether it needs to do more complex detection.
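As one concrete sketch of that pipeline, again in STM32 HAL terms: here TIM3's update event is assumed to be routed (via CubeMX or registers) as ADC1's external conversion trigger, with the DMA in circular mode; all instance names and the buffer size are arbitrary:

```c
#include "stm32f4xx_hal.h"

// Assumes ADC1 is configured with TIM3 TRGO as its external conversion
// trigger and DMA in circular mode, e.g., via CubeMX.
extern ADC_HandleTypeDef hadc1;
extern TIM_HandleTypeDef htim3;

#define N_SAMPLES 64
static uint16_t samples[N_SAMPLES];   // DMA writes here without CPU involvement

void start_sampling(void)
{
    HAL_ADC_Start_DMA(&hadc1, (uint32_t *)samples, N_SAMPLES);
    HAL_TIM_Base_Start(&htim3);       // each timer update triggers one conversion
}

// Called by the HAL from the DMA interrupt once the buffer is full;
// the CPU only wakes up here to look at the data.
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    (void)hadc;
    // inspect `samples`, decide whether an object is crossing, go back to sleep
}
```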
But quite honestly, if your whole sensing setup is a single infrared receiver, giving you a single signal of light intensity over time, I think you might be overestimating the time your MCU will need to process it before going back to idle/sleep. Consider this made-up usage example:
Assume you get a new value from your ADC every 0.5 ms. You then look at that value, and maybe a history of your signal as represented by a filter bank made of four parallel digital filters, each consisting of five biquad sections. Each biquad section costs about five multiply-accumulates, so that makes 4 × 5 × 5 = 100 multiply-accumulates per sample.
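For reference, a single biquad section in direct form II transposed costs exactly five multiplies per sample, which is where the 4 × 5 × 5 = 100 figure comes from. A generic sketch (coefficients omitted):

```c
typedef struct {
    float b0, b1, b2;   // feed-forward coefficients
    float a1, a2;       // feedback coefficients (a0 normalized to 1)
    float z1, z2;       // state (delay) registers
} Biquad;

// Direct form II transposed: 5 multiply-accumulates per sample.
static float biquad_step(Biquad *s, float x)
{
    float y = s->b0 * x + s->z1;
    s->z1 = s->b1 * x - s->a1 * y + s->z2;
    s->z2 = s->b2 * x - s->a2 * y;
    return y;
}

// One filter of the bank: five cascaded sections = 25 MACs per sample;
// four such filters in parallel = 100 MACs per input sample.
static float filter_step(Biquad sections[5], float x)
{
    for (int i = 0; i < 5; i++)
        x = biquad_step(&sections[i], x);
    return x;
}
```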
Your average Cortex-M4F STM32 microcontroller (e.g., the STM32F446, a slightly older MCU) might run at 16 MHz when in use (up to 168 MHz is possible, should you need the performance). A single multiply-accumulate takes around two to four clock cycles, and we can assume the data is still in caches most of the time. So we'd be, worst case, 400 cycles in for the math, then maybe another 100 for deciding on your object: 500 cycles per sample. In 0.5 ms = 500 μs, your CPU core has 8000 clock cycles. So your CPU can idle for roughly 94% of the time; but even if you ran it with all peripherals enabled (which you don't need) and without using any sleep, you'd be drawing about 6 mA at room temperature. Possibly less than your LED already!
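Spelled out, the duty cycle works out as:

\$\$ \frac{500\ \text{cycles per sample}}{16\ \text{MHz} \times 0.5\ \text{ms}} = \frac{500}{8000} \approx 6\,\%\ \text{busy}, \quad \text{i.e. } \approx 94\,\%\ \text{idle.} \$\$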
Because you haven't really told us what your detection (and pre-detection, which I guess is what I'd call your "triggering the MCU to do detection") needs to achieve, it's hard to make concrete recommendations, aside from the usual engineering advice:
Solve what the problems really are, not what you think they are. How much power goes where; where is your optimization potential? Do you already have a software implementation of your detection? How well can you describe the signal conditions for pre-detection? Do you really need that 1 ms interval? That seems overly ambitious for anything that doesn't inherently sit next to a very large power source. (You wouldn't worry about milliwatts in a car engine, a paper production machine, or an air traffic control tower!)