Tags: c, memory, valgrind, free

Valgrind Error: Freeing memory in the scope of a function in C


I am receiving the following error when running my program under Valgrind:

==696839== HEAP SUMMARY:
==696839==     in use at exit: 136 bytes in 1 blocks
==696839==   total heap usage: 6 allocs, 5 frees, 12,624 bytes allocated
==696839== 
==696839== 136 bytes in 1 blocks are definitely lost in loss record 1 of 1
==696839==    at 0x4848899: malloc (in /usr/libexec/valgrind/vgpreload_memcheck-amd64-linux.so)
==696839==    by 0x109A35: insert_word_into_table (dictionary.c:57)
==696839==    by 0x109BD9: dict_insert (dictionary.c:82)
==696839==    by 0x109606: main (spell_check.c:91)
==696839== 
==696839== LEAK SUMMARY:
==696839==    definitely lost: 136 bytes in 1 blocks
==696839==    indirectly lost: 0 bytes in 0 blocks
==696839==      possibly lost: 0 bytes in 0 blocks
==696839==    still reachable: 0 bytes in 0 blocks
==696839==         suppressed: 0 bytes in 0 blocks

Here is a basic, minimal version of the function where the error is occurring; I am iterating through an array of linked lists.

int insert_word_into_table(table_t *table, const char *word){
    int hsh_code = hash_code(word);
    int duplicateWordFlag = 0;

    //initialize the new node for the word
    list_node_t *newNode = malloc(sizeof(list_node_t));//line 57 that error is referring to
    //printf("%lu", sizeof(list_node_t)); checking the size, it is 136 bytes

    if (table->array[hsh_code] == NULL) {//if nothing is in the array
        table->array[hsh_code] = newNode;
        table->array[hsh_code]->next = NULL;
        strcpy(table->array[hsh_code]->word, word);
    } else {//if there is a node there
        list_node_t* i = table->array[hsh_code];
        while (i->next != NULL) {
            if (strcmp(word, i->word) == 0){
                duplicateWordFlag = -1;
            }
            i = i->next;
        }
        i->next = malloc(sizeof(list_node_t));
        strcpy(i->next->word, word);
        i->next->next = NULL;
    }
    return duplicateWordFlag;
}

I attempted to free the memory in the two spots where I malloc'd in the function, but that did not work either. I am a bit confused: if the memory is allocated in a function, is it not freed when the program leaves the function, or is it on the heap? In any case, at the end of my whole program I free the memory for the whole array of linked lists, which I thought would take care of it.

Any advice or thoughts would be great. Thanks!

Edit 2 (the plot thickens):

So I changed the function to instead use

i->next = newNode;//new way

and the error persisted.

This is how I am freeing memory; this function is called at the very end of my program:

void dict_free(dictionary_t *dict) {
    free(dict->table->array);//base array of linked lists
    free(dict->table);
    free(dict);
}

Edit 3: Solution

This fixed it; I guess freeing via the pointer to the table struct was not enough, and I had to free all the nodes.

void dict_free(dictionary_t *dict) {
    for (int i = 0; i < dict->table->length; i++) {
        free(dict->table->array[i]);
    }
    free(dict->table->array);
    free(dict->table);
    free(dict);
}

Update/Edit 4 (realizing the magnitude of my error): Trying to free everything for real.

I realize now that I am not freeing everything (as some pointed out).

While my previous solution fixed the immediate issue, it was not freeing all the memory (again, as some pointed out).

Here is my solution for iterating through all of the linked lists and freeing all the memory that was allocated.

//free the table; a structure that is an array of linked lists
void table_free(table_t *table) {
    list_node_t *node;
    list_node_t *tempNode;
    for (int i = 0; i < table->length; i++) {
        if (table->array[i] != NULL) {
            if (table->array[i]->next == NULL) {//if only one node in linked list
                free(table->array[i]);
                continue;
            } else {
                node = table->array[i];
                while (node->next != NULL) {
                    tempNode = node->next;
                    free(node);
                    node = tempNode;
                }
            }
        }
    }

    for (int i = 0; i < table->length; i++) {
        if (table->array[i] == NULL) { //free the spots I missed
            free(table->array[i]);
        }
    }
    free(table->array);
    free(table);
}

void dict_free(dictionary_t *dict) {
    //old way not freeing everything
    // for (int i = 0; i < dict->table->length; i++) {
    //     free(dict->table->array[i]);
    // }
    // free(dict->table->array);
    // free(dict->table);
    // free(dict);    

    //new attempt
    table_free(dict->table);
    free(dict);
}

I feel confident that this would free everything, based on what others were suggesting, but I am still receiving Valgrind errors:

==4020884== HEAP SUMMARY:
==4020884==     in use at exit: 97,376 bytes in 716 blocks
==4020884==   total heap usage: 8,606 allocs, 7,890 frees, 1,209,040 bytes allocated
==4020884== 
==4020884== 97,376 bytes in 716 blocks are definitely lost in loss record 1 of 1
==4020884==    at 0x4848899: malloc (in /usr/libexec/valgrind/vgpreload_memcheck-amd64-linux.so)
==4020884==    by 0x109B32: insert_word_into_table (dictionary.c:56)
==4020884==    by 0x109C8E: dict_insert (dictionary.c:97)
==4020884==    by 0x10A16A: read_dict_from_text_file (dictionary.c:245)
==4020884==    by 0x10981A: main (spell_check.c:142)

My big question (as this is a larger program): if I free everything I allocated only at the very end, is that good practice, and is it sufficient to free all the memory? I allocate memory inside various functions without freeing it there.

Thank you to the community for their patience and helping me solve this.


Solution

  • if the memory is allocated in a function is it not freed when the program leaves the function, or is it on the heap?

    Memory allocated dynamically (such as via malloc) remains allocated until it is explicitly freed. "Heap" in this sense is not a C-language concept, but generally, yes, the memory is allocated on the heap on systems where that is meaningful.
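
    As a small illustration (this is hypothetical code, not from your program), the block allocated inside the function below outlives the call that allocated it; only an explicit free() releases it:

    #include <stdlib.h>
    #include <string.h>

    char *duplicate_string(const char *s) {
        char *copy = malloc(strlen(s) + 1);  /* allocated on the heap */
        if (copy != NULL)
            strcpy(copy, s);
        return copy;   /* the allocation is still valid after the function returns */
    }

    int main(void) {
        char *p = duplicate_string("hello");
        /* ... use p ... */
        free(p);       /* omit this and Valgrind reports the block as leaked */
        return 0;
    }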

  • In any case, at the end of my whole program I free the memory for the whole array of linked lists, which I thought would take care of it.

    The fact that some memory does not get freed that way is a sign of a memory leak, which I presume is exactly the sort of thing you wanted Valgrind to detect for you. It appears to me that the problem is that the node you allocate at dictionary.c:57 is linked into the list only when table->array[hsh_code] == NULL is true, i.e., when the bucket is empty. Otherwise, the function returns without freeing it or retaining any pointer to it. This is a classic memory leak.

    Note that in the other branch, you allocate another node. It looks like you could instead use the one you've already allocated, which might solve your leak problem.
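
    For illustration, here is a minimal sketch of that approach, keeping the names and the duplicate handling from your code (and assuming, as your strcpy() calls imply, that word in list_node_t is a sufficiently large char array; checking malloc's return value is omitted, as in your original):

    int insert_word_into_table(table_t *table, const char *word) {
        int hsh_code = hash_code(word);
        int duplicateWordFlag = 0;

        /* set up the single node this insertion needs */
        list_node_t *newNode = malloc(sizeof(list_node_t));
        strcpy(newNode->word, word);
        newNode->next = NULL;

        if (table->array[hsh_code] == NULL) {
            table->array[hsh_code] = newNode;   /* empty bucket */
        } else {
            list_node_t *i = table->array[hsh_code];
            while (i->next != NULL) {           /* walk to the tail */
                if (strcmp(word, i->word) == 0)
                    duplicateWordFlag = -1;
                i = i->next;
            }
            i->next = newNode;                  /* reuse the node allocated above */
        }
        return duplicateWordFlag;
    }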


    The code you have now shown for freeing the hash table reveals an additional deficiency. This ...

        free(dict->table->array);//base array of linked lists
    

    ... frees only the array wherein the node pointers are stored, not any of the allocated nodes themselves. Based on your insert_word_into_table() function, before freeing the array, you need at least to iterate over all the hash buckets, and for each one, free every linked list node associated with that bucket. Maybe something like this:

    for (size_t bucket = 0; bucket < NUMBER_OF_BUCKETS; bucket++) {
        for (list_node_t *node = dict->table->array[bucket]; node; ) {
            list_node_t *next = node->next;
            free(node);
            node = next;
        }
    }
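
    With that loop placed before your existing free(dict->table->array), free(dict->table), and free(dict) calls, every node allocated in insert_word_into_table() is released before the array that points to it is freed.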