microcontroller avr atmel atmega avr-studio5

Why should I calibrate the oscillator in AVR programming


I'm new to AVR programming. I found some sample code on the web for simple USART communication with a PC, and I have a small doubt about it.

The main function starts like this:

void main(){
  OSCCAL_calibration(); 
  USARTinit();
  //start communicating with PC
}

I can't understand the reason for calibrating the oscillator using the OSCCAL_calibration() function.


FUNCTIONS

OSCCAL_calibration() function

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay_basic.h>

void OSCCAL_calibration(void){
    unsigned char calibrate = 0;
    unsigned int temp;                    // must be unsigned: 0xFFFF marks an overflow
    unsigned char tempL;

    CLKPR = (1<<CLKPCE);                  // enable clock prescaler change
    CLKPR = (1<<CLKPS1) | (1<<CLKPS0);    // divide by 8: 8 MHz internal RC -> 1 MHz CPU clock

    TIMSK2 = 0;                           // disable Timer2 interrupts
    ASSR = (1<<AS2);                      // clock Timer2 asynchronously from the 32.768 kHz crystal
    OCR2A = 200;                          // measurement window: 200 / 32768 s (about 6.1 ms)
    TIMSK0 = 0;                           // disable Timer0 interrupts
    TCCR1B = (1<<CS10);                   // start Timer1 on the CPU clock, no prescaling
    TCCR2A = (1<<CS20);                   // start Timer2, no prescaling
    while((ASSR & 0x01) | (ASSR & 0x04)); // wait until the Timer2 registers have updated

    for(int i = 0; i < 10; i++)           // give the watch crystal time to stabilise
        _delay_loop_2(30000);

    while(!calibrate){
        cli();                            // the measurement must not be interrupted
        TIFR1 = 0xFF;                     // clear all Timer1 flags
        TIFR2 = 0xFF;                     // clear all Timer2 flags
        TCNT1H = 0;                       // reset both counters
        TCNT1L = 0;
        TCNT2 = 0;

        while ( !(TIFR2 & (1<<OCF2A)) );  // wait until Timer2 reaches OCR2A (bitwise &, not &&)

        TCCR1B = 0;                       // stop Timer1
        sei();

        if (TIFR1 & (1<<TOV1)){           // Timer1 overflowed: CPU clock far too fast
            temp = 0xFFFF;
        }else{                            // read how many CPU cycles fit into the window
            tempL = TCNT1L;
            temp = TCNT1H;
            temp = (temp << 8);
            temp += tempL;
        }

        if (temp > 6250){                 // faster than ~1.024 MHz: slow the RC oscillator down
            OSCCAL--;
        } else if (temp < 6120){          // slower than ~1.003 MHz: speed it up
            OSCCAL++;
        } else {
            calibrate = 1;                // inside the window: calibration done
        }
        TCCR1B = (1<<CS10);               // restart Timer1 for the next pass
    }
}
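
As far as I can tell, the 6120 and 6250 limits come from the length of the measurement window. A rough back-of-the-envelope check (my own numbers, not part of the original sample):

/* Timer2 counts 200 ticks of the 32.768 kHz crystal, i.e. a window of
   200 / 32768 s (about 6.1 ms). Timer1 counts CPU cycles over the same
   window, so the expected count is proportional to the CPU frequency. */
#define XTAL_HZ   32768UL
#define T2_TICKS  200UL
#define T1_COUNT(f_cpu)  ((f_cpu) * T2_TICKS / XTAL_HZ)

/* T1_COUNT(1000000UL) evaluates to 6103. Inverting the formula,
   6250 counts corresponds to 6250 * 32768 / 200 = 1,024,000 Hz and
   6120 counts to about 1,003,000 Hz, so the accepted window is a CPU
   clock of roughly 1.00-1.02 MHz (the 8 MHz internal RC divided by 8). */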

USARTinit() function

void USARTinit(){
    CLKPR = (1<<CLKPCE);                  // enable clock prescaler change
    CLKPR = (1<<CLKPS1);                  // divide by 4: 8 MHz internal RC -> 2 MHz CPU clock
    UBRR0H = 0;
    UBRR0L = 12;                          // 2 MHz / (8 * (12 + 1)) = 19231 baud, i.e. ~19200
    UCSR0A = (1<<U2X0);                   // double-speed mode
    UCSR0B = (1<<RXEN0)|(1<<TXEN0)|(0<<RXCIE0)|(0<<UDRIE0);               // enable RX and TX, no interrupts
    UCSR0C = (0<<UMSEL00)|(0<<UPM00)|(0<<USBS0)|(3<<UCSZ00)|(0<<UCPOL0);  // asynchronous, 8N1
}
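
For reference, a minimal sketch of blocking transmit/receive routines that would sit on top of USARTinit() (my own sketch, not from the sample):

#include <avr/io.h>

/* Blocking transmit: wait until the data register is empty, then write. */
void USART_transmit(unsigned char data){
    while (!(UCSR0A & (1<<UDRE0)));
    UDR0 = data;
}

/* Blocking receive: wait until a byte has arrived, then read it. */
unsigned char USART_receive(void){
    while (!(UCSR0A & (1<<RXC0)));
    return UDR0;
}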

I'm using Atmel Studio 6 and tested this with an ATmega2560 (actually, with my Arduino Mega). After a few changes I could make it work, but it still works without the calibration function.

I'll itemize my questions as below.

  1. What does calibrating the oscillator actually do?
  2. Is it a must?
  3. Is there a similar function in PIC microcontrollers? (I'm somewhat experienced in PIC programming, but I never came across anything like this.)

I also have another small doubt:

Why do you set a clock prescaler in the USARTinit() function before setting the baud rate? Can't you set the baud rate without a prescaler (i.e. with prescaler = 1)?

Is it to save power or something? I tried with prescaler = 1 and it doesn't seem to work (still trying). Yes, I've calculated the baud rate properly, using the equation given in the datasheet.
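
For reference, this is the kind of change I'd expect prescaler = 1 to need, assuming the target is still 19200 baud and the full 8 MHz internal RC clock (my own sketch, not the original code):

#include <avr/io.h>

/* With the clock prescaler left at 1 the CPU runs at the full 8 MHz, so
   UBRR has to be recomputed from the double-speed formula in the
   datasheet: UBRR = F_CPU / (8 * baud) - 1. */
#define F_CPU_HZ  8000000UL
#define BAUD      19200UL
#define UBRR_VAL  (F_CPU_HZ / (8UL * BAUD) - 1)   /* = 51, about 0.2% error */

void USARTinit_no_prescale(void){
    CLKPR = (1<<CLKPCE);                 /* enable prescaler change */
    CLKPR = 0;                           /* clock prescaler = 1 -> 8 MHz */
    UBRR0H = (unsigned char)(UBRR_VAL >> 8);
    UBRR0L = (unsigned char)UBRR_VAL;
    UCSR0A = (1<<U2X0);                  /* keep double-speed mode */
    UCSR0B = (1<<RXEN0)|(1<<TXEN0);      /* enable RX and TX */
    UCSR0C = (3<<UCSZ00);                /* asynchronous, 8N1 */
}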


Solution

  • If you are doing any timing-related communication outside the microcontroller (serial, pushing SPI to its limits, etc.) or keeping time, then you need a more accurate clock; see the rough error-budget sketch at the end of this answer.

    It is not really about power; marginally, perhaps: if the clock runs a little fast you burn slightly more power, and if it runs a little slow you save a little.

    Many but not all microcontrollers offer an internal R/C oscillator so that you don't need an external one (extra components, extra cost). This is not one family vs another (AVR, MSP430, PIC, etc.); some chips within a family have internal oscillators and some don't. The PICs I used back in the day required an external oscillator; I don't know that family in that detail today. How the calibration happens also varies from family to family.
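
    A rough way to see why clock accuracy matters for the serial case (my own back-of-the-envelope helper, not part of any AVR library): an asynchronous UART frame only tolerates a couple of percent of total baud-rate error before sampling drifts out of the last bits, while an uncalibrated internal RC can easily be off by several percent.

    #include <stdio.h>

    /* Actual baud-rate error for a given clock and UBRR setting, using
       the AVR double-speed formula: baud = F_CPU / (8 * (UBRR + 1)). */
    static double baud_error_pct(double f_cpu, unsigned ubrr, double target){
        double actual = f_cpu / (8.0 * (ubrr + 1));
        return 100.0 * (actual - target) / target;
    }

    int main(void){
        /* Calibrated 2 MHz clock, UBRR = 12: about 0.2% error, works fine. */
        printf("calibrated: %+.2f%%\n", baud_error_pct(2.0e6, 12, 19200));
        /* Same settings with the RC oscillator running 5% fast: about 5%
           error, enough to cause framing errors on a UART link.            */
        printf("5%% fast RC: %+.2f%%\n", baud_error_pct(2.0e6 * 1.05, 12, 19200));
        return 0;
    }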