computation-theory

Is it possible to write a program of a thousand or fewer characters that can generate every possible 128-kilobyte file?


I found this question in a German math textbook [Konkrete Mathematik (nicht nur) für Informatiker].

I know that this is not possible, but I cannot come up with a convincing argument. I think there are sequences of bytes that cannot be computed by a program much shorter than themselves, and thus you would need at least as many bytes of program as you want to write to the file.

I hope somebody can provide a more accurate and convincing argument. Thanks.

I tried to find answers on the internet, with no success. I am hoping for a computation-theoretic kind of answer.


Solution

  • It is possible. As a demonstration, below is a short C program (well under the thousand-character limit once comments and whitespace are stripped) that will generate every possible file that is numBytes bytes long (where numBytes is a constant that you can set on line 7). It prints the contents of each possible file to stdout as a row of hexadecimal byte-values.

    Note that I set numBytes to 3 so that when you run the program you can see it run to completion in a reasonable amount of time. You can change numBytes to 128KB (128*1024) if you want, but then it will take a long time for the program to complete :)

    #include <stdio.h>
    #include <string.h>
    #include <stdlib.h>
    
    int main(int argc, char ** argv)
    {
       const int numBytes = 3; //128*1024;
    
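       /* Allocate a buffer for the current file's contents and start with the all-zeroes file. */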
       unsigned char * buf = (unsigned char *) malloc(numBytes);
       memset(buf, 0, numBytes);
    
       int keepGoing = 1;
       while(keepGoing)
       {  
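          /* Print the current file's contents as one row of hexadecimal byte-values. */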
          for (int i=0; i<numBytes; i++) printf("%02x ", buf[i]);
          printf("\n");
          
          /* Treat buf as one big base-256 counter and add 1 to it, carrying to the left. */
          unsigned char * p = buf+(numBytes-1);
          while(++(*p) == 0x00)                      /* this byte wrapped around to 0x00, so carry */
          {
             if (p == buf) {keepGoing = 0; break;}   /* every byte has wrapped: all files printed */
             --p;
          }
       }
       free(buf);
    
       printf("done!\n");
    }
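
    As a quick usage sketch (assuming the listing above is saved as, say, enumerate.c and built with any C99 compiler; the file name and compiler are arbitrary): with numBytes left at 3, the program should print 256^3 = 16,777,216 lines, one per possible 3-byte file, followed by "done!":

    00 00 00
    00 00 01
    00 00 02
    ...
    ff ff fe
    ff ff ff
    done!

    With numBytes set to 128*1024 there would be 256^(128*1024) = 2^1048576 possible files, which is why that run cannot finish in any realistic amount of time.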