I'm working on a Chip-8 emulator in C, with the goal of making it as cross-platform and as small as possible for compatibility with embedded systems and low-spec machines (and to challenge myself). That means being able to use SDL, ncurses (when I get to that point), and something else as well. As such, I've been using unsigned chars in place of ints or unsigned ints, and I used "typedef unsigned char byte" to make that more convenient. Am I wasting my time, even with the goal of ideally supporting very small systems, or would just using "typedef unsigned int byte" be enough without sacrificing performance?
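For reference, this is roughly the pattern I mean (the struct layout and field names below are simplified illustrations, not my actual code):

    #include <stdint.h>

    typedef unsigned char byte;   /* what I'm currently doing */

    /* Simplified Chip-8 state, just to show where the typedef gets used */
    typedef struct {
        byte     memory[4096];    /* 4 KiB of RAM */
        byte     v[16];           /* V0..VF registers */
        uint16_t i;               /* index register */
        uint16_t pc;              /* program counter */
    } chip8_t;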
You should just separate the interpreter from the I/O bindings.
Only the interpreter needs to be portable down to 8-bit systems.
Each target platform will have a different set of technologies available for I/O.
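For example, the portable core can be reduced to a small header that each front end consumes; the names below (chip8_step, chip8_framebuffer, and so on) are made up for illustration, not taken from any particular project:

    /* chip8_core.h - the portable interpreter, written once, 8-bit friendly */
    #ifndef CHIP8_CORE_H
    #define CHIP8_CORE_H

    #include <stdint.h>

    #define CHIP8_W 64
    #define CHIP8_H 32

    void           chip8_reset(void);
    void           chip8_load(const uint8_t *rom, uint16_t len);
    void           chip8_step(void);            /* fetch/decode/execute one opcode */
    const uint8_t *chip8_framebuffer(void);     /* 64x32 monochrome pixel buffer */
    void           chip8_set_key(uint8_t key, uint8_t down);

    #endif

The SDL, curses, and bare-metal builds then each ship their own main loop, input handling, and display code around that one interface.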
SDL or curses will give you some portability between *nix and Windows platforms, but wherever those libraries are available you'll almost certainly have 32-bit integers, and at worst have to deal with 16-bit ones; you surely won't be limited to 8-bit integers.
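As a rough sketch of what the desktop side of that split could look like, assuming SDL2 and the illustrative chip8_core.h interface above (keyboard input and timer handling omitted):

    /* sdl_frontend.c - desktop binding; the core knows nothing about SDL */
    #include <SDL2/SDL.h>
    #include "chip8_core.h"

    #define SCALE 10

    int main(int argc, char **argv)
    {
        (void)argc; (void)argv;

        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("chip8",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            CHIP8_W * SCALE, CHIP8_H * SCALE, 0);
        SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

        int running = 1;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT)
                    running = 0;

            chip8_step();   /* run the portable core */

            /* Blow the 64x32 monochrome framebuffer up to the window size */
            const uint8_t *fb = chip8_framebuffer();
            SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
            SDL_RenderClear(ren);
            SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
            for (int y = 0; y < CHIP8_H; y++)
                for (int x = 0; x < CHIP8_W; x++)
                    if (fb[y * CHIP8_W + x]) {
                        SDL_Rect px = { x * SCALE, y * SCALE, SCALE, SCALE };
                        SDL_RenderFillRect(ren, &px);
                    }
            SDL_RenderPresent(ren);
            SDL_Delay(2);
        }

        SDL_DestroyRenderer(ren);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }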
On the other hand, on 8-bit and 16-bit bare-metal processors, you'll probably just hook your project directly into the graphics driver.