I have built a simple MicroBlaze system on a Kintex-7 on Avnet's MMP2 board. I am using the UARTLite (v2.0) IP in this system and communicate with a PC using Teraterm (v4.85). The baud rate for the UARTLite core must be fixed at a particular value at the design stage, and I have chosen 19200 bps for my design. I have written a very simple application that sends six consecutive "At"s to the PC through Teraterm, then gets an ASCII character from the Teraterm console and prints it back. The problem is that, for every "At", Teraterm displays only a single 'Ç' on the console. The transmission parameters are as follows:
19200 bps; 8 data bits; 1 stop bit; no parity; no flow control
However, after a lot of head scratching and trial and error, I discovered that if I change the baud rate in Teraterm to 38400 bps, I get the desired behaviour: I see the "At"s on the console, and a character typed into the console is echoed back.
As far as I can tell from the code (which is really very simple), I have not changed the baud rate, yet somehow I am sending data at a rate faster than the specified one. The clock used is 100 MHz. Since the baud rate MUST be specified in the design phase, how is it even possible to attain a higher baud rate? What have I done wrong?
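To make my confusion concrete, here is how I picture the baud generation. The divisor formula below is my own assumption (based on the clock frequency and baud rate parameters in the IP configuration dialog), not something taken from the Xilinx sources:

    // Minimal sketch of my mental model: one divisor, fixed when the bitstream
    // is built, so the line rate should be pinned to 19200 bps.
    // The formula is my assumption, not taken from the UARTLite HDL.
    #include <stdio.h>

    int main(void)
    {
        const unsigned long clk_hz  = 100000000UL;   // clock I believe I am supplying
        const unsigned long baud    = 19200UL;       // baud rate configured in the IP
        const unsigned long divisor = clk_hz / baud; // frozen at design time

        // If the divisor is frozen in hardware, I don't see how the wire could
        // ever run faster than this.
        printf("divisor = %lu, resulting baud ~= %lu\n", divisor, clk_hz / divisor);
        return 0;
    }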
The code goes like this:
#include <stdio.h>
#include "platform.h"
#include "xgpio_l.h"
#include "xintc_l.h"
#include "xparameters.h"
#include "xuartlite_l.h"

#define MAX_UART_BUFFER_LENGTH 16

u8 uart_rx_data = 0;

int main()
{
    init_platform();

    // Init GPIOs: configure the DIP switch port as input
    XGpio_WriteReg(XPAR_AXI_GPIO_0_BASEADDR, XGPIO_TRI_OFFSET, 0xFFFFFFFF);

    char count_data = 0x0F;

    while (1)
    {
        unsigned int dip_gpio_data = XGpio_ReadReg(XPAR_AXI_GPIO_0_BASEADDR, XGPIO_DATA_OFFSET) & 0x000000FF;
        if (dip_gpio_data == 0)
        {
            u8 send_data[MAX_UART_BUFFER_LENGTH] = "AtAtAtAtAtAt";
            u8 i = 0;

            // Send the string up to the terminating '\0'
            for (i = 0; i < MAX_UART_BUFFER_LENGTH; i++)
            {
                if (send_data[i] != '\0')
                    XUartLite_SendByte(XPAR_AXI_UARTLITE_0_BASEADDR, send_data[i]);
                else
                    break;
            }

            // Rx something from Teraterm (blocking read)
            uart_rx_data = XUartLite_RecvByte(XPAR_AXI_UARTLITE_0_BASEADDR);

            // Send the same thing back, followed by newline and carriage return
            XUartLite_SendByte(XPAR_AXI_UARTLITE_0_BASEADDR, uart_rx_data);
            XUartLite_SendByte(XPAR_AXI_UARTLITE_0_BASEADDR, '\n');
            XUartLite_SendByte(XPAR_AXI_UARTLITE_0_BASEADDR, '\r');
        }
    }

    return 0;
}
EDIT:
Today, I reduced the baud rate to 9600 in the UARTLite IP core, regenerated the bitstream and ran again, with the same behaviour as before: setting Teraterm to 9600 baud does not work. Teraterm also offers a 14400 baud option; with that I receive the 12 characters, but as gibberish. If I set exactly double the configured rate, 19200, it works perfectly. I will try other baud rates as well and update. Please help! This is still the basic part of my design. Image below for reference:
EDIT2:
I tried the following:
Uninstalled Teraterm and re-installed it
Installed RealTerm
Installed HyperTerminal
Tried with all three. No change in the behaviour. :(
I found the reason for the strange behaviour. It was a mistake on my side during the design phase. In my previous design, the Clocking Wizard expects 100 MHz at its input and is configured to produce 100 MHz at its output (no division is done). The UARTLite core also expects a 100 MHz clock and applies the corresponding divider value to generate a baud rate of 19200. But the board is actually supplying 200 MHz. The Clocking Wizard assumes it is 100 MHz and passes the clock through without any division, and the chain continues up to the UARTLite. There the UARTLite divides the 200 MHz clock by the value it computed for a 100 MHz clock, and thus I am transmitting at double the intended speed.
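To convince myself that this really explains the factor of two, I ran the divider arithmetic for both configurations I tried. A minimal sketch, assuming the core derives its divisor as clock frequency over baud rate (my assumption; the clock and baud values are the ones from my design and tests):

    // Rough check that the clock mix-up explains the factor of two.
    // The divisor formula is my assumption of how the core derives it;
    // the clock and baud values are the ones from my design and tests.
    #include <stdio.h>

    int main(void)
    {
        const unsigned long assumed_clk_hz = 100000000UL; // what the design was built for
        const unsigned long actual_clk_hz  = 200000000UL; // what the board really supplies
        const unsigned long configured_baud[] = { 19200UL, 9600UL };

        for (int i = 0; i < 2; i++)
        {
            unsigned long divisor        = assumed_clk_hz / configured_baud[i]; // frozen at build time
            unsigned long effective_baud = actual_clk_hz / divisor;             // what the wire actually runs at
            printf("configured %5lu baud -> effective ~%lu baud\n",
                   configured_baud[i], effective_baud);
        }
        // Prints roughly 38400 and 19200, which matches why Teraterm only
        // worked at exactly double the configured rate in both experiments.
        return 0;
    }

So the fix on my side should be to give the Clocking Wizard the real 200 MHz input frequency (or, equivalently, make sure the UARTLite's clock frequency parameter matches the clock it actually receives).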
Anyway, thank you for your time.