I'm trying to debug a problem and need to reduce the clock speed of my Arduino Mega from 16 MHz to 8 MHz.
I can't find any simple way of doing this, so I wanted to ask whether it's possible and, if so, how to do it.
WITHOUT PRESCALER:
long int runTime;

void setup()
{
  Serial.begin(9600); // Setting the data transfer rate
}

void loop()
{
  delay(1000);
  runTime = millis();
  Serial.print("Runtime: ");
  Serial.println(runTime);
  delay(100);
  exit(0);
}
OUTPUT: 999
WITH PRESCALER:
long int runTime;

void setup()
{
  Serial.begin(19200); // Setting the data transfer rate for when the clock frequency is halved
  CLKPR = _BV(CLKPCE); // enable change of the clock prescaler
  CLKPR = _BV(CLKPS0); // divide frequency by 2
}

void loop()
{
  delay(1000);
  runTime = millis();
  Serial.print("Runtime: ");
  Serial.println(runTime);
  delay(100);
  exit(0);
}
OUTPUT: 999
1 Answer
You can set the clock prescaler for that:
void setup() {
  noInterrupts();
  CLKPR = _BV(CLKPCE); // enable change of the clock prescaler
  CLKPR = _BV(CLKPS0); // divide frequency by 2
  interrupts();
}
This is explained in sections 10.12 and 10.13 of the ATmega2560 datasheet.
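If you need a larger division factor, the same two-step write sequence should work with other settings of the CLKPS3:0 bits in CLKPR. A minimal sketch, assuming the standard CLKPS encoding where 0b0011 selects divide-by-8:

void setup() {
  noInterrupts();
  CLKPR = _BV(CLKPCE);                // enable change of the clock prescaler
  CLKPR = _BV(CLKPS1) | _BV(CLKPS0);  // CLKPS = 0b0011 -> divide by 8 (16 MHz -> 2 MHz)
  interrupts();
}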
Of course, changing the clock frequency will mess with the time-related functions (millis(), delay() and co.) and the baud rate of the serial port.
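One way to keep the serial output readable is to configure the port for twice the desired baud rate before applying the prescaler: the baud registers are computed for a 16 MHz clock, so once the clock is halved the effective rate comes out at the value you actually want. A minimal sketch of that idea (the 9600 baud target is just an example):

void setup() {
  Serial.begin(2 * 9600);   // registers set for 19200 at 16 MHz
  noInterrupts();
  CLKPR = _BV(CLKPCE);      // enable change of the clock prescaler
  CLKPR = _BV(CLKPS0);      // divide frequency by 2
  interrupts();
  Serial.println("Hello at an effective 9600 baud");
}

void loop() {}

Alternatively, halve the baud rate on the PC side, as noted in the comments below.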
Edit: Here is a small program to demonstrate the slowing of the clock:
//#define SLOW_CLOCK

void setup()
{
#ifdef SLOW_CLOCK
  noInterrupts();
  CLKPR = _BV(CLKPCE); // enable change of the clock prescaler
  CLKPR = _BV(CLKPS0); // divide frequency by 2
  interrupts();
#endif
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop()
{
  digitalWrite(LED_BUILTIN, HIGH);
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
}
This makes the LED flash at 1 Hz. If you uncomment the line #define SLOW_CLOCK, it instead flashes at 0.5 Hz.
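If you only slow the clock down temporarily for a test, the same write sequence with all CLKPS bits cleared should bring it back to full speed. A small sketch, under the same register assumptions as above:

void restoreFullSpeed() {
  noInterrupts();
  CLKPR = _BV(CLKPCE);  // enable change of the clock prescaler
  CLKPR = 0;            // CLKPS = 0b0000 -> divide by 1, back to 16 MHz
  interrupts();
}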
- So when I added that I got a bunch of random symbols (i.e. †f憞f) and I'm not sure why. Are the settings different for an Arduino Mega? – Isabel Alphonse, Jun 23, 2016 at 15:49
- Note that delay(1000); will now take 2 seconds, instead of 1. – Gerben, Jun 23, 2016 at 15:49
- You should also halve the baud rate of your serial console, or double the baud rate in Serial.begin(). – Gerben, Jun 23, 2016 at 15:49
- I wrote some test code and there didn't seem to be any change. Is this what @EdgarBonet was talking about? Am I not supposed to see a difference in the output of millis()? If so, then is there another way for me to test if it's working? – Isabel Alphonse, Jun 23, 2016 at 16:13
- @IsabelAlphonse: What test did you do? The wacky symbols on the serial port kind of prove that the baud rate was unintentionally changed. – Edgar Bonet, Jun 23, 2016 at 16:15
- Keep the two CLKPR writes between noInterrupts() and interrupts(). Otherwise there is a small possibility of this failing. Also, your test only proves the clocks of delay() and millis() have changed consistently, which is to be expected since it's the very same clock.