Tags: c, clock, qsort, time.h, time-t

clock() function doesn't measure qsort() time


I'm trying to write a program that takes some song data (~1000 lines) from a CSV file and puts it into a struct array. Each line of the CSV contains the title of the song, the artist, and the release year: string, string, and int.
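
A simplified version of the struct looks like this (the string fields are only sketched here as placeholders; release_year is the one that matters for the sorting):

typedef struct {
    char title[100];     /* song title from the CSV   */
    char artist[100];    /* artist name from the CSV  */
    int release_year;    /* release year as an int    */
} song;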

Reading the data in works without problems.

The lines are correctly written into my struct array (called s), and the bubble sort part works fine. Calculating the execution time for the bubble sort also works; I store that time in a variable.

After that I reset the array and insert the data into the struct array again, then use qsort() to sort it. The function does what it's supposed to do: the array ends up sorted the same way as with bubble sort. The problem is that the measured time is 0.000....

Here's my bubbleSortTime() function, which returns the execution time needed for the bubble sort:

float bubbleSortTime(song *s, int length) {
    int flag;
    song aux;

    clock_t start, end;

    start = clock();

    do {
        flag = 0;
        for (int j = 0; j<length-1; j++) {
            if (s[j].release_year > s[j+1].release_year) {
                aux = s[j];
                s[j] = s[j+1];
                s[j+1] = aux;
                flag = 1;
            }
        }
    } while (flag == 1);

    end = clock();

    clock_t time = end-start;
    float finTime = (float)time/CLOCKS_PER_SEC;


    return finTime;
}

And here's my qsortTime() function, together with comp():

int comp (const void * a, const void * b)
{
    song *s1 = (song *)a;
    song *s2 = (song *)b;

    return (s1->release_year - s2->release_year);
}

float qsortTime(song *s, int length) {

    clock_t start, end;

    start = clock();

    qsort(s, length, sizeof(s[0]), comp);

    end = clock();

    clock_t time = end-start;
    float finTime = (float)time/CLOCKS_PER_SEC;

    return finTime;
}

When displaying the two returned values in main(), using:

float time1 = bubbleSortTime(s, i);

/** 
reset code and more things here
**/

float time2 = qsortTime(s, i);
printf("%f %f", time1, time2);

(...) the output is 0.062500 0.000000, which suggests that the time for the bubble sort call was measured properly, while the time for the qsort call wasn't.

I know that qsort is usually much faster than bubble sort, but I didn't expect the measured time to be exactly zero, and that's why I'm confused.

If you need me to provide more code, please leave a comment and I'll do it. Thank you so much!


Solution

  • Use a larger N.

    First try exercising just qsortTime() with a larger N.

    Given the reported times of "0.062500 0.000000" for roughly 1000 elements, and O(n*n) vs. O(n*log n) behavior, the two sorts can easily differ by a factor of 100 or more, so the qsort() call most likely finishes in less time than one tick of clock() and the result rounds down to zero. (The sketch at the end of this answer shows another way to get a measurable number without a huge N.)

    For me, with N = 100,000 it was

    Bubble: 47.125
    QTSort: 0.016
    

    Also, be sure that "reset code and more things here" gives qsortTime(s, i) an unsorted list, and not the list already sorted by bubbleSortTime().

    Example

    // assumes the song struct and the comp(), bubbleSortTime() and
    // qsortTime() functions from the question are in scope
    #include <stdio.h>
    #include <stdlib.h>

    #define SS (100000 * 2)
    song s[SS];

    int main() {
      // bubble sort run left commented out - at this size it takes far
      // longer than the qsort() run:
      // for (int i = 0; i < SS; i++)
      //   s[i].release_year = rand() % 1000;
      // printf("Bubble: %g\n", bubbleSortTime(s, SS));
      for (int i = 0; i < SS; i++)
        s[i].release_year = rand() % 1000;
      printf("QTSort: %g\n", qsortTime(s, SS));
    }
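
    If you need to keep the list at its original size (~1000 songs), another option is to repeat the sort enough times that the total comfortably exceeds one clock() tick and divide by the repetition count. Here is a minimal, self-contained sketch of that idea; the trimmed-down song struct and the REPS value are placeholders, not code from the question:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    typedef struct { int release_year; } song;   /* trimmed-down stand-in */

    int comp(const void *a, const void *b) {
        const song *s1 = a;
        const song *s2 = b;
        return s1->release_year - s2->release_year;
    }

    #define N    1000   /* roughly the size of the CSV in the question */
    #define REPS 1000   /* arbitrary; raise it until the total is measurable */

    int main(void) {
        static song s[N];
        clock_t total = 0;

        for (int r = 0; r < REPS; r++) {
            /* refill with unsorted data on every pass, outside the timed region */
            for (int i = 0; i < N; i++)
                s[i].release_year = rand() % 1000;

            clock_t start = clock();
            qsort(s, N, sizeof s[0], comp);
            total += clock() - start;
        }

        printf("average qsort() time: %g s\n",
               (double)total / REPS / CLOCKS_PER_SEC);
    }

    With REPS large enough, the accumulated total spans many clock() ticks, so the average is meaningful even though a single qsort() call on 1000 elements is far below the timer's resolution.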