This is my loop() function:
    if (Serial.read() == 't') {
        tareState = true;
    }
    while (tareState == true) {
        timeNowTare = millis();
        if (millis() - timeNowTare >= 3000) {
            Serial.println("Tare");
        }
        tareState = false;
    }
Here tareState is a boolean and timeNowTare is a long.
I need to print "Tare" on serial three seconds after I enter "t" into serial.
With everything I have tried to date, the program either stops or tares immediately after entering "t".
EDIT: Thanks to everyone who responded with answers. The solution was simply to move the timeNowTare = millis(); assignment up into the Serial.read() if statement. I also changed the while loop to an if statement because it seemed to make my code run quicker.
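For reference, the edited loop() described above would look roughly like this (a minimal sketch, not the exact code; it assumes the global declarations mentioned in the question):

    bool tareState = false;
    unsigned long timeNowTare = 0;   // millis() returns an unsigned long (the question declares it as a long)

    void loop() {
        if (Serial.read() == 't') {
            tareState = true;
            timeNowTare = millis();                      // remember when 't' arrived
        }
        if (tareState && millis() - timeNowTare >= 3000) {
            Serial.println("Tare");                      // three seconds have elapsed
            tareState = false;                           // done until the next 't'
        }
    }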
2 Answers
Move timeNowTare = millis(); inside the first if:
    if (Serial.read() == 't') {
        tareState = true;
        timeNowTare = millis();
    }
and tareState = false; inside the second if:
    while (tareState) {
        if (millis() - timeNowTare >= 3000) {
            Serial.println(F("Tare"));
            tareState = false;
        }
    }
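Put together (a sketch only; it assumes tareState and timeNowTare are globals as declared in the question), the whole loop() becomes:

    void loop() {
        if (Serial.read() == 't') {
            tareState = true;
            timeNowTare = millis();         // start the three-second timer
        }
        while (tareState) {
            if (millis() - timeNowTare >= 3000) {
                Serial.println(F("Tare"));
                tareState = false;          // exits the while loop
            }
        }
    }

Note that the while loop still blocks loop() for the full three seconds after a 't' arrives; the second answer below discusses that trade-off.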
The easiest solution is to just use delay():
    if (Serial.read() == 't') {
        delay(3000);
        Serial.println(F("Tare"));
    }
What? You said "no delay()
"? OK, then just write your own
implementation of delay()
and inline it in place of the actual call to
delay()
:
    if (Serial.read() == 't') {
        uint32_t timeNowTare = millis();
        while (millis() - timeNowTare < 3000) continue;  // wait
        Serial.println(F("Tare"));
    }
This seems to be more or less what you attempted. You write a while
loop that blocks the program for three seconds. It is also what leoc7's
answer does.
However, doing so is completely foolish. Why would you write your own version of delay() when the Arduino core provides you with a well-tested one? Why would you want to avoid the standard delay() in the first place?
It turns out there is a very good reason to avoid delay(). The reason is that delay() blocks your program for the entire delay duration. If your code needs to be responsive to some external inputs (as any non-trivial program at some point needs), this means the program will be completely unresponsive within that time window. The bad thing about delay() is not the implementation (there is no point in replacing it with your own); it is the fact that it blocks your program. If you replace delay() with your own blocking code, you have gained nothing.
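To make that concrete, here is an illustrative sketch (the LED pin, blink period, and baud rate are arbitrary choices, not part of the original question): a loop() that blinks an LED while also using the delay()-based tare. Every time a 't' arrives, the blinking freezes for three seconds.

    const int ledPin = 13;   // built-in LED on many boards

    void setup() {
        pinMode(ledPin, OUTPUT);
        Serial.begin(9600);
    }

    void loop() {
        digitalWrite(ledPin, !digitalRead(ledPin));  // toggle the LED
        delay(200);                                  // blink period

        if (Serial.read() == 't') {
            delay(3000);                 // blocks loop(): the LED stops toggling for 3 s
            Serial.println(F("Tare"));
        }
    }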
The real solution is to think in terms of a finite state machine. Either the program is running in "tare mode" or not. The possible state transitions are:

- upon reading a 't' on the serial port, you enter tare mode and take note of the current time
- if you are already in tare mode and three seconds have elapsed since you entered that mode, you print out "Tare" and exit tare mode.
Here is a straightforward implementation of such a state machine:
    static bool tare_mode = false;
    static uint32_t tare_start_time;  // meaningful only in tare_mode

    if (Serial.read() == 't') {
        tare_mode = true;
        tare_start_time = millis();
    }
    if (tare_mode && millis() - tare_start_time >= 3000) {
        Serial.println(F("Tare"));
        tare_mode = false;
    }
Now you can add more code to loop()
and this extra code will never be
blocked by your "tare" feature.
You set timeNowTare just before you use it ... so millis() - timeNowTare will never be >= 3000 ... perhaps move timeNowTare = millis(); inside the first if.