Tags: c, linux, archlinux, i2c, archlinux-arm

Calculating the delay between write and read on I2C in Linux


I am currently working with I2C on Arch Linux ARM and I am not quite sure how to calculate the absolute minimum delay required between a write and a read. If I don't have this delay, the read naturally does not come through. For now I have just applied usleep(1000) between the two commands, which works, but that value was found empirically and should be optimized down to the real minimum (somehow). But how?

Here is a code sample of the write_and_read function I am using:

#include <unistd.h>

int write_and_read(int handler, char *buffer, const int bytesToWrite, const int bytesToRead) {
    /* Send the command/register bytes first and make sure they all went out. */
    if (write(handler, buffer, bytesToWrite) != bytesToWrite) {
        return -1;
    }
    usleep(1000); /* empirically chosen delay -- the value in question */
    /* Read the reply back into the same buffer. */
    int r = read(handler, buffer, bytesToRead);
    if (r != bytesToRead) {
        return -1;
    }
    return 0;
}

Solution

  • Normally there's no need to wait. If your write and read paths are somehow threaded in the background (why would you do that?), then synchronizing them is mandatory.

    I2C is a very simple linear communication, and all the devices I have used were able to produce their output data within microseconds.

    Are you using 100 kHz, 400 kHz or 1 MHz I2C?
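
    On Linux you can usually drop the delay entirely by issuing the write and the read as one combined transfer through the i2c-dev I2C_RDWR ioctl: the kernel driver puts a repeated start between the two messages, and the bus clock paces the whole exchange (at 100 kHz, one byte plus its ACK takes roughly 90 µs on the wire, so a fixed 1 ms sleep is mostly wasted time). Below is a minimal sketch of that approach; the helper name and parameters are mine, not something from your code:

    /* Hypothetical helper: write wlen bytes, then read rlen bytes from the
       same slave as a single combined transaction (repeated start), so no
       usleep() is needed between the two halves. */
    #include <linux/i2c.h>
    #include <linux/i2c-dev.h>
    #include <sys/ioctl.h>

    int i2c_write_then_read(int fd, unsigned char addr,
                            unsigned char *wbuf, unsigned short wlen,
                            unsigned char *rbuf, unsigned short rlen)
    {
        struct i2c_msg msgs[2] = {
            { .addr = addr, .flags = 0,        .len = wlen, .buf = wbuf },
            { .addr = addr, .flags = I2C_M_RD, .len = rlen, .buf = rbuf },
        };
        struct i2c_rdwr_ioctl_data xfer = { .msgs = msgs, .nmsgs = 2 };

        /* Each message carries its own slave address, so no I2C_SLAVE
           ioctl is required; just open the adapter (e.g. /dev/i2c-1)
           with O_RDWR first and pass the descriptor in here. */
        return ioctl(fd, I2C_RDWR, &xfer) < 0 ? -1 : 0;
    }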

    Edit: after some discussion, I suggest you try this:

    // I2C slave callbacks, registered with Wire.onRequest()/Wire.onReceive()
    void dataRequest() {
        Wire.write(0x76);   // answer the master's read with one byte
        x = 0;              // clear the receive flag
    }

    void dataReceive(int numBytes)
    {
        x = numBytes;       // note how many bytes the master wrote
        for (int i = 0; i < numBytes; i++) {
            Wire.read();    // drain the receive buffer
        }
    }
    

    Where x is a global variable defined in the header and then assigned 0 in setup(). You may try adding a simple if condition to the main loop, e.g. if x > 0, send something via Serial.print() as a debug message, then reset x to 0; see the sketch below.
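
    A minimal sketch of how those pieces could fit together (the slave address 0x08 and the 9600 baud rate are placeholders I chose, not values from your setup):

    #include <Wire.h>

    volatile int x;                   // set in dataReceive(), checked in loop()

    void setup() {
        Serial.begin(9600);           // placeholder baud rate
        Wire.begin(0x08);             // join the bus as a slave; placeholder address
        Wire.onReceive(dataReceive);  // runs when the master writes to us
        Wire.onRequest(dataRequest);  // runs when the master reads from us
        x = 0;
    }

    void loop() {
        if (x > 0) {                  // flag set by dataReceive()
            Serial.print("received bytes: ");
            Serial.println(x);        // debug output outside the I2C handlers
            x = 0;
        }
    }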

    With this approach you are not blocking the I2C operation with the serial traffic.