I will explain my task in a simplified way. I'm communicating among 3 XBees.
The attached code is for the sender XBee. This XBee actually sends multiple values of value1 (here, for simplicity, I have assigned it the constant value 10). To find value1 it does some other calculations as well and then assigns it.
But I find that if I reduce or remove the delay(100), I end up receiving improper values. I guess the delay gives buffer time for the data transmission, but my question is: how do I calculate the minimum delay, or how can I justify that this is the delay needed?
I hope you can understand my question. If not, please comment and I will give more pertinent details.
// Sender XBee
void setup()
{
  Serial.begin(9600);
}

void loop()
{
  int value1 = 10;        // some hardcoded value to send
  Serial.print('H');      // unique header to identify start of message
  Serial.print(value1, DEC);
  Serial.print(",");      // note that a comma is sent after the last field
  Serial.print('\n');     // terminate the message with a newline (LF)
  delay(100);
}
If timing is critical (and assuming you do not want to lose any message), you probably need some notification back that the message has been received correctly (possibly with a checksum value). – Michel Keijzers, Apr 20, 2017
1 Answer
Based on my experience with XBee, I would say the limit is 50 ms, since the lowest sampling rate an XBee can do is 50 ms.