I have written a function called _MY_Delay() which uses the 16-bit Timer/Counter1 of the ATmega16, and I used it to blink an LED. The problem is that it doesn't work and I don't know why. Here is my code:
#define F_CPU 1000000
#include <avr/io.h>
#include <avr/interrupt.h>
void _MY_Delay(int delay){
    int n = (delay*F_CPU)/(1000*64) ; // number of counts required for the given delay
    OCR1BL = n;                       // n = T(overflow time) * F_CPU / 64
    OCR1BH = n >> 8 ;
    while (!(TIFR & 1<<OCF1B));
}

int main(void)
{
    DDRA = 0x00;
    DDRA |= 0x01;
    TCCR1B = (1<<CS10)|(1<<CS11);     // divide by 64 (prescaler)
    sei();
    TIMSK |= 1<< OCIE1B ;
    WDTCR = 0x00;                     // disable watchdog timer

    while(1)
    {
        PORTA |= 0x01 ;
        _MY_Delay(100);
        PORTA &= ~(0x01);
        _MY_Delay(100);
    }
}
- the WDTCR register should also be used to disable the watchdog (user3629249, Mar 25, 2015)
- it seems the Clock Select bits (CS12:0) need to be set to a desired value to select which clock drives the timer, otherwise the timer will not count. (user3629249, Mar 26, 2015)
- CS12, CS11 and CS10 are used to select the desired prescaler according to a table in the datasheet; setting CS11 and CS10 selects the /64 prescaler, and that's what I have done in the code. (Abdelrahman Elshafiey, Mar 26, 2015)
3 Answers
The primary problem is with your arithmetic.
The argument to _MY_Delay() is declared as int, and you're passing in the value 100. The first thing you do is divide the argument by 1000; the result of this division will always be zero.
You'll get more useful results if you do the multiplication first:
void _MY_Delay(int delay){
    int n = (delay*F_CPU)/1000; // number of counts required for the given delay
    ...
}
... but make sure that the intermediate result won't overflow an integer. If it would, use a long.
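For illustration, here is a minimal sketch of that arithmetic done with a 32-bit intermediate. The uint32_t cast, the helper name ticks_for_ms, and the /64 divisor (taken from the question's prescaler setting) are my assumptions, not part of the answer above:

#include <stdint.h>
#define F_CPU 1000000UL   // as in the question's code

/* Compute the Timer1 compare value for a delay in milliseconds.
   The cast forces a 32-bit multiply, so delay_ms * F_CPU cannot
   overflow the 16-bit int that avr-gcc uses. */
static uint16_t ticks_for_ms(uint16_t delay_ms)
{
    uint32_t n = ((uint32_t)delay_ms * F_CPU) / (1000UL * 64UL); // ticks at the /64 prescaler
    return (uint16_t)n;   // 100 ms at 1 MHz gives 1562, which fits in 16 bits
}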
- but the result of the arithmetic won't be zero (Abdelrahman Elshafiey, Mar 25, 2015)
- anyway, I tried your advice and it didn't work either (Abdelrahman Elshafiey, Mar 25, 2015)
- @AbdelrahmanTarief (delay*F_CPU)/1000 for a delay of 1000 and a clock of 1 MHz results in 1000000. You can't assign such a value to a 16-bit register (such as OCR1B). The maximum value for a 16-bit register is 65535, but 1000000 is 0xF4240, so when you assign it to a 16-bit register it gets truncated to 0x4240 (= 16960 decimal). (alexan_e, Mar 25, 2015)
- I know what you are talking about, so I modified the equation and divided F_CPU by the prescaler's 64, and it still doesn't work. (Abdelrahman Elshafiey, Mar 25, 2015)
The problem with your current code is that you enable the Timer1 output compare B interrupt while you don't provide an interrupt handler for it; when the interrupt fires, this leads to a reset of the MCU.
Remove the following line from your code and it will work
TIMSK |= 1<< OCIE1B ; // enables timer1 output compare B interrupt
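For reference (my addition, not part of this answer), avr-libc also lets you keep the interrupt enabled by supplying a handler, so execution never falls through to the bad-interrupt default that restarts the program. Note that a handler causes OCF1B to be cleared in hardware when the ISR runs, so the polling loop in the question's delay would then need a different completion check. A minimal sketch, assuming avr-libc's ATmega16 vector name TIMER1_COMPB_vect:

#include <avr/interrupt.h>

// Do-nothing handler: its only purpose is to give the enabled
// compare-match-B interrupt somewhere valid to go.
ISR(TIMER1_COMPB_vect)
{
}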
Regarding your delay function: since you don't use an interrupt and the execution waits for the timer to finish, you may as well use the _delay_ms() function of the delay library with the same results (unless you just want to experiment with timers).
#include <util/delay.h>

void my_delay_ms(uint16_t n) {
    while(n--) {       // loop until 0
        _delay_ms(1);  // 1 ms delay
    }
}
It uses a 16-bit variable so you can use it for delays up to 65535 ms (65.5 s); you can use an 8-bit variable for delays up to 255 ms if you wish. There is a small overhead in the delay caused by the loop, but if you are interested in accurate delays then you should use a timer interrupt anyway.
As an example, for a 0.5 s delay you call it as my_delay_ms(500).
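For completeness (my addition), here is the blink loop from the question rewritten around this helper, keeping PA0 as the LED pin from the original code and assuming the my_delay_ms() definition above is in the same file:

#include <avr/io.h>

int main(void)
{
    DDRA |= 0x01;              // PA0 as output
    while (1)
    {
        PORTA |= 0x01;         // LED on
        my_delay_ms(100);
        PORTA &= ~0x01;        // LED off
        my_delay_ms(100);
    }
}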
I'm not familiar with this CPU so cannot speak to the register usage. However, this line: PORTA |= 0x00;
is not the way to turn a bit off; rather, use: PORTA &= ~(0x01);
As a side note: identifiers that begin with an underscore followed by an uppercase letter (such as _MY_Delay) are reserved for the C implementation, so names with that kind of leading underscore can collide with compiler- or library-internal symbols.
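For example (my addition), simply dropping the leading underscore keeps the name out of the reserved space:

void my_delay(int delay);   // instead of _MY_Delay()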
The document at http://www.atmel.com/images/doc2466.pdf contains the details of the chip, the register names and bit definitions, and the details of how to access those registers in C. Amongst other things, the section titled "Accessing 16-bit Registers" states:
"To do a 16-bit write, the high byte must be written before the low byte."
However, the posted code performs the 16-bit write in the opposite order, i.e. junk is being written to the high byte of the 16-bit register.
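To illustrate (my addition, not part of this answer), here is the same store with the bytes in the order the datasheet calls for, using the question's n:

OCR1BH = (uint8_t)(n >> 8);   // high byte first, per "Accessing 16-bit Registers"
OCR1BL = (uint8_t)n;          // then the low byte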
- For turning off the bit of port A you are right, it's my fault. Do you mean by interrupt handler an ISR function? If so, I don't think this is going to work, because the purpose of the delay is to stop the controller for a specific time. As for resetting the flag, the controller does that part for me. (Abdelrahman Elshafiey, Mar 25, 2015)
- You can forget what I previously said about interrupts, as the code is neither enabling nor using them. Though that leaves me to wonder why the interrupt.h header file is being included. (user3629249, Mar 26, 2015)