I found this amazing piece of work by Arthur Whitney - http://www.jsoftware.com/jwiki/Essays/Incunabulum
It compiled with a few warnings:
$ gcc-4.7 incuna.c -o incuna.o
incuna.c: In function 'ma':
incuna.c:8:15: warning: incompatible implicit declaration of built-in function 'malloc' [enabled by default]
incuna.c: In function 'pi':
incuna.c:26:7: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'nl':
incuna.c:26:24: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'pr':
incuna.c:28:10: warning: incompatible implicit declaration of built-in function 'printf' [enabled by default]
incuna.c: In function 'ex':
incuna.c:35:36: warning: assignment makes integer from pointer without a cast [enabled by default]
incuna.c:35:25: warning: return makes pointer from integer without a cast [enabled by default]
incuna.c: In function 'noun':
incuna.c:37:57: warning: return makes integer from pointer without a cast [enabled by default]
incuna.c: In function 'wd':
incuna.c:39:21: warning: incompatible implicit declaration of built-in function 'strlen' [enabled by default]
But it segfaulted on entering a basic input, 1 + 1:
./incuna.o
warning: this program uses gets(), which is unsafe.
1 + 1
[1] 11525 segmentation fault ./incuna.o
I'm guessing this has something to do with how C compilers have changed since 1989.
How would I be able to run this? Can I get it working on a recent Linux or Mac, in a VirtualBox VM, or anything else?
My Google searches turned up nothing related.
It converts pointers to int and to long, and vice versa. This breaks on 64-bit architectures, where those types have different sizes.
Compile it for a 32-bit target, e.g. with “-arch i386” using clang/LLVM on Mac OS X, or with “-m32” using GCC on Linux (which may require installing 32-bit multilib support).