I need to measure the frequency of a square wave that can vary between 0 and 1 MHz, with a resolution of 0.25 Hz.
I haven't decided on a controller yet, but it will most likely be one of the 20-pin ATtinys.
The way I would normally measure lower-frequency signals is with two timers: one configured to interrupt on, say, the rising edges of the external signal, and another set up to interrupt every second. The first timer's counter register value after one second is then equal to the frequency of the signal.
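For example, on an AVR that arrangement looks roughly like the sketch below, with the signal clocking Timer 1 via the T1 pin so the counter register itself holds the edge count, and Timer 0 providing the one-second gate (ATmega-style register names; the prescaler and compare values assume an 8 MHz clock and are only illustrative):
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t measured_hz;            /* latest 1 s count                         */
volatile uint16_t gate_ms = 1000;         /* counts down 1000 x 1 ms ticks            */

ISR(TIMER0_COMP_vect)                     /* fires every 1 ms                         */
{
    if (--gate_ms == 0)
    {
        measured_hz = TCNT1;              /* edges counted during the 1 s gate        */
        TCNT1 = 0;                        /* only valid below ~65 kHz, Timer 1 is 16-bit */
        gate_ms = 1000;
    }
}

int main(void)
{
    TCCR1B = (1 << CS12) | (1 << CS11) | (1 << CS10);  /* Timer 1 clocked by rising edges on T1 */
    TCCR0  = (1 << WGM01) | (1 << CS01) | (1 << CS00); /* Timer 0: CTC, clk/64                  */
    OCR0   = 124;                                      /* 8 MHz / 64 / 125 = 1 kHz tick         */
    TIMSK  = (1 << OCIE0);
    sei();
    while (1)
    {
        /* read measured_hz here (briefly disable interrupts for the 16-bit access) */
    }
}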
However, this method obviously won't work for capturing signals ranging between 0 and 1 MHz with a resolution of 0.25 Hz; for that I would need a 22-bit counter (a 4-second gate at 1 MHz is 4,000,000 counts, and AFAIK 8-bit micros only have 8/16-bit counters).
One idea I had was to divide the signal before applying it to the micro, but this would be impractical: the signal would have to be divided by 61, so the frequency could only be updated every 61 seconds, where I would like it to be every few seconds.
Is there another method that would allow the frequency to be updated say every 4 seconds?
Update:
The simplest solution is to use an external interrupt or a timer capture to interrupt on the rising edge of the signal and have the ISR increment a variable of type long int. Read the variable every 4 seconds (to allow frequencies down to 0.25 Hz to be measured).
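In outline (the interrupt source and variable names are just placeholders, and the pin would need to be configured for rising-edge interrupts):
#include <avr/interrupt.h>

volatile unsigned long edge_count;   /* 32-bit running count of rising edges */

ISR(INT0_vect)                       /* or the input-capture ISR             */
{
    edge_count++;
}

/* Every 4 seconds: frequency = edge_count / 4.0, then clear edge_count
   (copy and clear it with interrupts disabled, since it is a 4-byte value). */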
Update 2:
As pointed out by JustJeff, an 8-bit MCU will not be able to keep up with a 1 MHz signal, so that rules out interrupting on every rising edge and incrementing a long int
...
I have chosen the method suggested by timrorr. Once I get around to implementing it I'll post back and share the results. Thanks to all for your suggestions.
Progress Report:
I've started to test some of the ideas presented here. Firstly I tried vicatcu's code. There was an obvious problem of TCNT1 not being cleared after the frequency was calculated - not a big deal...
Then I noticed while debugging that about every 2 to 7 times the frequency was calculated, the overflow count of Timer 1 (the timer configured to count external events) would be short by two. I put this down to the latency of the Timer 0 ISR, so I moved the if-statement block from the ISR into main (see snippet below) and just set a flag in the ISR. Some debugging showed that the first measurement would be OK, but with every subsequent reading Timer 1's overflow count would be over by 2, which I can't explain - I would have expected it to be under, not over...
int main()
{
    while(1)
    {
        if(global_task_timer_ms > 0 && (T0_overflow == 1))  //T0_overflow is set by the Timer 0 ISR
        {
            global_task_timer_ms--;
            T0_overflow = 0;
        }
        .....
    }
}
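For reference, the Timer 0 ISR was reduced to roughly this (the preload constant is the same one used in the snippet further down):
ISR(TIMER0_OVF_vect)
{
    TCNT0 = TIMER0_PRELOAD;   /* reload for the 1 kHz overflow rate    */
    T0_overflow = 1;          /* flag consumed by the main loop above  */
}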
Next I decided to try to implement timrorr's suggestion. To generate the necessary interval (approx. 15 ms between timer ISR interrupts) I would have to cascade the two 8-bit timers, as the only 16-bit timer on the ATmega16 is being used to capture the rising edges of the external signal.
I thought this solution would work and be much more efficient, as most of the overhead is shifted to the timers and only one short ISR is left for the CPU to handle. However it wasn't as accurate as I had hoped: measurements shifted back and forth by approx. 70 Hz, which I wouldn't mind at high frequencies but is definitely not acceptable at lower frequencies. I didn't spend too much time analysing the problem, but I'm guessing the timer cascading arrangement is not that accurate; I implemented a similar arrangement to timrorr's suggestion on a far slower 8051 controller that had two 16-bit timers, and the results were quite accurate.
I have now gone back to vicatcu's suggestion, but with the frequency calculation moved into the Timer 0 ISR (see snippet below). This code has produced consistent and reasonably accurate measurements. With a little calibration, accuracy should be approximately +/-10 Hz.
ISR(TIMER0_OVF_vect)
{
    TCNT0 = TIMER0_PRELOAD;                             //Reload timer for 1KHz overflow rate
    if(task_timer_ms > 0)
    {
        task_timer_ms--;
    }
    else                                                //Runs once every TASK_PERIOD_MS
    {
        frequency_hz = 1.0 * TCNT1;                     //Pulses counted by Timer 1 since last calculation
        TCNT1 = 0;
        frequency_hz += global_num_overflows * 65536.0; //Add pulses lost to Timer 1 overflows
        global_num_overflows = 0;
        frequency_hz /= (TASK_PERIOD_MS / 1000.0);      //Scale count to Hz
        task_timer_ms = TASK_PERIOD_MS;
    }
}
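For reference, the setup this ISR relies on is roughly the following (ATmega16 register names; the clock frequency, preload value and gate time are assumptions, and the Timer 1 overflow ISR simply maintains the overflow count used above):
#include <avr/io.h>
#include <avr/interrupt.h>

#define TASK_PERIOD_MS  1000                  /* example gate time                          */
#define TIMER0_PRELOAD  131                   /* 256 - 125: 1 kHz overflow at 8 MHz / 64    */

volatile uint16_t task_timer_ms = TASK_PERIOD_MS;
volatile uint16_t global_num_overflows;
volatile double   frequency_hz;

ISR(TIMER1_OVF_vect)
{
    global_num_overflows++;                   /* Timer 1 wrapped past 65535                 */
}

void timers_init(void)
{
    TCCR1B = (1 << CS12) | (1 << CS11) | (1 << CS10);  /* Timer 1: external clock on T1, rising edge */
    TCCR0  = (1 << CS01) | (1 << CS00);                /* Timer 0: clk/64                            */
    TCNT0  = TIMER0_PRELOAD;
    TIMSK  = (1 << TOIE1) | (1 << TOIE0);              /* enable both overflow interrupts            */
    sei();
}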
If anybody has any other suggestions I'm open to them, although I'd rather not have to use ranges... I'm also no longer intent on getting 0.25 Hz resolution; there doesn't seem to be much point given the level of accuracy I have at the moment.
If possible, I'd suggest selecting a microcontroller that supports counting external events on a timer input; rather than manually incrementing a counter inside an ISR (which at high frequencies quickly ends up saturating the microcontroller), you let the hardware handle the counting. At that point your code simply becomes a matter of waiting for the periodic interrupt and then calculating the frequency.
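On an AVR, for example, that just means clocking a timer from its external input pin (register names are ATmega-style and purely illustrative):
#include <avr/io.h>

void counter_init(void)
{
    TCCR1B = (1 << CS12) | (1 << CS11) | (1 << CS10);  /* Timer 1 clocked by rising edges on the T1 pin */
    /* TCNT1 now increments once per input edge with no CPU involvement */
}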
To extend the range and make the frequency counter more generalised (removing the need for multiple ranges at the expense of a little more work for the MCU) you could use the following technique.
Select a periodic interrupt rate that allows for measurement accuracy at the highest input frequency; this should take into account your counter size (you need to select the timer period such that the timer counter will not overflow at the maximum input frequency). For this example I'll assume that the input counter value can be read from the variable "timer_input_ctr".
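For instance, with a 16-bit input counter and a 1 MHz maximum input, the counter wraps after 65536 / 1,000,000 ≈ 65.5 ms, so any interrupt period comfortably below that (the ~15 ms mentioned in the question, for example) avoids overflow.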
Include a variable for counting periodic interrupts (should be initialised to 0 at startup); for this example I'll refer to this variable as "isr_count". The interrupt period is contained in the constant "isr_period".
Your periodic interrupt should be implemented as (C pseudo-code):
void timer_isr()
{
    isr_count++;
    if (timer_input_ctr > 0)
    {
        /* at low input frequencies isr_count keeps growing, automatically
           extending the measurement period until at least one pulse arrives */
        frequency = timer_input_ctr / (isr_count * isr_period);
        timer_input_ctr = 0;
        isr_count = 0;
    }
}
Obviously this rough example relies on some floating-point math that may not be suitable for low-end microcontrollers; there are techniques to overcome this, but they are outside the scope of this answer.