Why is the von Neumann architecture preferred over the Harvard architecture when designing personal computers, while the Harvard architecture is used for microcontroller-based systems and DSP-based systems?
Well, current CPU designs for PCs have both Harvard and von Neumann elements (more von Neumann, though).
If you look at the L1 caches, you will see that AMD, ARM and Intel systems have an L1 instruction cache and an L1 data cache that can be accessed independently and in parallel. That's the Harvard part. However, in L2, L3 and DRAM, data and code are mixed. That's the von Neumann part.
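On Linux you can see this split directly in sysfs. Here is a minimal C sketch (the `/sys/devices/system/cpu/cpu0/cache/` paths are Linux-specific and assumed to exist) that prints the cache hierarchy of cpu0: L1 shows up twice, once as "Data" and once as "Instruction", while L2 and L3 show up as "Unified".

```c
/* Sketch only, Linux-specific: list the cache levels of cpu0 from sysfs. */
#include <stdio.h>

int main(void)
{
    char path[128], level[16], type[32], size[16];

    for (int i = 0; i < 8; i++) {
        FILE *f;

        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu0/cache/index%d/level", i);
        if (!(f = fopen(path, "r")))
            break;                      /* no more cache indices */
        fscanf(f, "%15s", level);
        fclose(f);

        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu0/cache/index%d/type", i);
        if (!(f = fopen(path, "r")))
            break;
        fscanf(f, "%31s", type);        /* "Data", "Instruction" or "Unified" */
        fclose(f);

        snprintf(path, sizeof(path),
                 "/sys/devices/system/cpu/cpu0/cache/index%d/size", i);
        if (!(f = fopen(path, "r")))
            break;
        fscanf(f, "%15s", size);
        fclose(f);

        printf("L%s %-11s %s\n", level, type, size);
    }
    return 0;
}
```

On a typical x86 desktop this prints something like a 32K L1 Data cache and a 32K L1 Instruction cache, followed by Unified L2 and L3 caches.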
So why isn't a pure Harvard architecture adopted for PCs? My opinion is that it does not make sense. If you profile the vast majority of applications, you will see that the L1 instruction cache miss ratio is very small. This means that code size is generally not a problem, so it wouldn't make sense to design a fully separate path for code all the way down the memory hierarchy. Data can grow very large, but code can't really.
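You can measure that miss ratio yourself with hardware performance counters. Below is a rough sketch using Linux's `perf_event_open` (Linux-specific; the L1I "access" event is not supported on every CPU, in which case the counter simply fails to open) that counts L1 instruction-cache reads and misses around a dummy workload. The `workload()` function is just a stand-in for whatever application you want to profile.

```c
/* Sketch only: count L1I reads and misses for the calling thread on Linux. */
#define _GNU_SOURCE
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/syscall.h>
#include <linux/perf_event.h>

static int open_counter(uint64_t cache_result)
{
    struct perf_event_attr attr;
    memset(&attr, 0, sizeof(attr));
    attr.size = sizeof(attr);
    attr.type = PERF_TYPE_HW_CACHE;
    attr.config = PERF_COUNT_HW_CACHE_L1I
                | (PERF_COUNT_HW_CACHE_OP_READ << 8)
                | (cache_result << 16);
    attr.disabled = 1;
    attr.exclude_kernel = 1;
    /* pid = 0, cpu = -1: measure the calling thread on any CPU */
    return (int)syscall(SYS_perf_event_open, &attr, 0, -1, -1, 0);
}

static void workload(void)
{
    /* stand-in for the application you want to profile */
    volatile double x = 0.0;
    for (long i = 0; i < 50 * 1000 * 1000; i++)
        x += (double)i * 1e-9;
}

int main(void)
{
    int fd_access = open_counter(PERF_COUNT_HW_CACHE_RESULT_ACCESS);
    int fd_miss   = open_counter(PERF_COUNT_HW_CACHE_RESULT_MISS);
    if (fd_access < 0 || fd_miss < 0) {
        perror("perf_event_open (L1I events may be unsupported here)");
        return 1;
    }

    ioctl(fd_access, PERF_EVENT_IOC_RESET, 0);
    ioctl(fd_miss, PERF_EVENT_IOC_RESET, 0);
    ioctl(fd_access, PERF_EVENT_IOC_ENABLE, 0);
    ioctl(fd_miss, PERF_EVENT_IOC_ENABLE, 0);

    workload();

    ioctl(fd_access, PERF_EVENT_IOC_DISABLE, 0);
    ioctl(fd_miss, PERF_EVENT_IOC_DISABLE, 0);

    uint64_t accesses = 0, misses = 0;
    read(fd_access, &accesses, sizeof(accesses));
    read(fd_miss, &misses, sizeof(misses));

    printf("L1I accesses: %llu, misses: %llu, miss ratio: %.4f%%\n",
           (unsigned long long)accesses, (unsigned long long)misses,
           accesses ? 100.0 * (double)misses / (double)accesses : 0.0);

    close(fd_access);
    close(fd_miss);
    return 0;
}
```

For a tight loop like this the miss ratio will be tiny, which is exactly the point: the working set of instructions fits comfortably in the L1 instruction cache.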
In DSPs it makes sense to use separate code and data paths. That's because DSPs work mainly on streaming data, meaning that the need for caching is rather small. Also, DSP code can contain pre-computed coefficients that increase the code size. So there is a balance between data size and code size, meaning that it makes sense to use a Harvard architecture.
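As an illustration, here is a small FIR filter in plain C. The `const` coefficient table is the kind of pre-computed data that, on a real Harvard DSP, would typically be placed in program memory (usually via a vendor-specific pragma or attribute, assumed and not shown here), while the streaming samples live in data memory, so each multiply-accumulate can fetch a coefficient and a sample in the same cycle.

```c
/* Illustrative sketch: an 8-tap FIR filter in plain C. */
#include <stdio.h>

#define TAPS 8

/* Pre-computed filter coefficients: the "code side" on a Harvard DSP. */
static const float coeffs[TAPS] = {
    0.02f, 0.08f, 0.16f, 0.24f, 0.24f, 0.16f, 0.08f, 0.02f
};

/* One output sample = dot product of coefficients with the last TAPS inputs. */
static float fir_step(const float *history)
{
    float acc = 0.0f;
    for (int k = 0; k < TAPS; k++)
        acc += coeffs[k] * history[k];  /* coefficient fetch + data fetch */
    return acc;
}

int main(void)
{
    float history[TAPS] = { 0 };

    /* Feed a simple step input through the filter. */
    for (int n = 0; n < 16; n++) {
        for (int k = TAPS - 1; k > 0; k--)  /* shift the delay line */
            history[k] = history[k - 1];
        history[0] = 1.0f;                  /* streaming input sample */
        printf("y[%2d] = %f\n", n, fir_step(history));
    }
    return 0;
}
```

Nothing here is specific to a DSP toolchain; it just shows where the code-size/data-size balance comes from: the coefficients travel with the code while the samples stream through as data.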