I have a PIC32MX795F512L, and because XC32 doesn't provide a built-in delay function, I am using Timer1 to generate a 1 s delay. The problem is that it generates more than a 1 s delay. Can anyone help me get an exact 1 s delay on the PIC32?
CODE:

#define SYSTEM_FREQUENCY 72000000L

void Delayms(unsigned t)
{
    OpenTimer1(T1_ON | T1_PS_1_256, 0xFFFF);
    while (t--)
    {
        WriteTimer1(0);
        while (ReadTimer1() < SYSTEM_FREQUENCY / 256 / 1000)
            ;
    }
    CloseTimer1();
}
and I am calling it from a while(1) loop in main():
int main()
{
    SYSTEMConfigPerformance(SYSTEM_FREQUENCY);
    while (1)
    {
        PORTDbits.RD0 = 1;
        Delayms(1000);
        PORTDbits.RD0 = 0;
        Delayms(1000);
    }
}
- Don't write something like SYSTEM_FREQUENCY/256/1000; use parentheses to be more readable. — Junior, Aug 20, 2015 at 6:53
- How much more than a second? A millisecond? Two seconds? — gbarry, Aug 20, 2015 at 7:19
- It is more than 10 seconds. — Aircraft, Aug 20, 2015 at 7:24
- Are you sure your timer runs at 72 MHz? If it runs with the internal 8 MHz you'll wait 9 times longer (close to the 10 seconds). — Arsenal, Aug 20, 2015 at 7:50
- I have edited my code to include SYSTEMConfigPerformance(SYSTEM_FREQUENCY); this is the code I am using. — Aircraft, Aug 20, 2015 at 8:58
5 Answers
I finally worked out timer logic that gives me a 1 ms delay:

#define FCY 72000000L
#define FPB 36000000L

#pragma config POSCMOD=XT, FNOSC=PRIPLL
#pragma config FPLLIDIV=DIV_2, FPLLMUL=MUL_18, FPLLODIV=DIV_1
#pragma config FPBDIV=DIV_2, FWDTEN=OFF, CP=OFF, BWP=OFF

void Delayms(unsigned t)
{
    T1CON = 0x8000;              /* Timer1 on, 1:1 prescale */
    while (t--)
    {
        TMR1 = 0;
        while (TMR1 < FPB / 1000)
            ;
    }
}
I'm not familiar with PIC32 devices and the library provided by Microchip, but from what I read in the peripheral library guide, the call to SYSTEMConfigPerformance(SYSTEM_FREQUENCY); does not set the actual clocks to 72 MHz; it just configures all the other features needed for maximum performance at 72 MHz (e.g. wait states, caches, peripheral clock pre-divider).
So in your code the MCU runs with the default reset settings, which (if I understand correctly) are the 8 MHz internal FRC clock with a divider of 2, i.e. 4 MHz. You can check this by defining SYSTEM_FREQUENCY as 4000000L and seeing whether you then get a 1 s delay. (Currently the delay should be 18 seconds instead of 1.)
If I am not mistaken, and you actually want your device to run at 72 MHz, you have to configure your oscillators first:

    OSCConfig(OSC_FRC_PLL, OSC_PLL_MULT_18, OSC_PLL_POST_1, OSC_FRC_DIV_2);

Note that the PLL input frequency should not exceed 5 MHz (if I got that right), so you take the 8 MHz FRC, divide it by 2 (OSC_FRC_DIV_2), feed it into the PLL (OSC_FRC_PLL), and multiply it by 18 (OSC_PLL_MULT_18) with no post-divider (OSC_PLL_POST_1). The result is 72 MHz ±2% at 25 °C (as the FRC is calibrated that way).
As I'm really not familiar with these parts, please double-check all of this and don't hold me responsible if you kill your chip. The first suggestion (defining the system frequency lower) should be safe.
- You said that the PLL input frequency should not exceed 5 MHz, which is correct; the datasheet also says it should satisfy 4 MHz < FIN < 5 MHz. So say I have an 11 MHz oscillator: dividing it by 2 gives 5.5 MHz, which is more than 5 MHz, and dividing it by 3 gives about 3.67 MHz, which is below 4 MHz. What, in your view, would be a good configuration here? I am dividing by 3 and it is working OK, but I just want to confirm. — Aircraft, Dec 7, 2015 at 9:34
- @CZAbhinav Well, it is out of specification either way, but reducing the frequency is less likely to break things than overclocking the circuits, so I'd probably go with the divide-by-3 as well if there is no other source available. Another approach is to use an internal RC clock and trim it (if possible) against an external oscillator at regular intervals to keep the frequency roughly constant. — Arsenal, Dec 7, 2015 at 9:51
I hope this helps.
#define CCLK (80000000L)        // system clock
#define PBCLK (CCLK / 2)        // peripheral bus clock
#define CCLK_MS (PBCLK / 1000)  // core timer ticks per millisecond
                                // (the core timer runs at CCLK/2)

void delay_ms(uint32_t wait_ms)
{
    uint32_t startTime = ReadCoreTimer();
    uint32_t delayCount = wait_ms * CCLK_MS;
    while (ReadCoreTimer() - startTime < delayCount)
        ;
}
If you cannot afford function calls to ReadCoreTimer(), use _CP0_GET_COUNT() instead; it is a macro provided by the plib headers.
I am not a PIC programmer, but I am sure the extra delay is caused by:
- the 1000 loop iterations when generating a 1 second delay (decrementing the t variable, compare, jump, etc.)
- the time taken to open and close the timer.

You can do some experiments (like I did on AVR and MCS-51):
- Create a function for a one-second delay; call it delay_s. It can simply call delay_ms(1000).
- Execute delay_s(). Use a stopwatch (many smartphones have one) to measure the actual duration in milliseconds. Record this duration as T. For the first experiment, T should be around 1000 (greater than 1000), but not exactly 1000.
- Replace SYSTEM_FREQUENCY/256/1000 in your code with SYSTEM_FREQUENCY/256/T.
- Repeat the last two steps, varying T if necessary, until the timing is as close to exact as you need.
- I find it hard to imagine that an MCU running at 72 MHz would take 10 times the intended time (10 seconds instead of 1) just because of some opening/closing of a timer module and some small calculations. — Arsenal, Aug 20, 2015 at 8:03
- @Arsenal Yes, maybe because of a wrong programmer fuse setting (accidentally using the internal clock) or a defect in the hardware (resonator or capacitor). My method will work once all the clock and hardware problems have been addressed. — Oka, Aug 20, 2015 at 8:09
If the total delay is just over 1 second, it may simply be the extra time spent performing the calculated loop 1000 times. Couldn't you calculate the exact number of clock ticks for 1 second and use a single loop?
After your update indicating it is taking more than 10 seconds, I would suggest checking that the clock is actually running at the speed you think it is.
You are calculating the clock ticks (after the prescaler) for 1 ms as SYSTEM_FREQUENCY/256/1000; multiplying that by 1000 gives the ticks for 1 s, i.e. SYSTEM_FREQUENCY/256.