When can a design pattern make your software worse?
I once saw a program that used the facade pattern between the GUI and the logic layer. The team decided that no objects could be passed across that boundary, so only primitive types were allowed, which made the code painful to write and maintain.
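To make that concrete, here is a minimal sketch (all names hypothetical) of what a primitives-only facade forces on the GUI, next to what the facade pattern is actually meant to provide: a simpler interface, not a dumber one.

```java
// The primitives-only version: the GUI must juggle parallel values
// that really belong together in one object, and every piece of data
// costs a separate round-trip through the facade.
class PrimitiveCustomerFacade {
    long findCustomerId(String name)   { /* delegate to logic layer */ return 42L; }
    String getCustomerName(long id)    { return "Ada"; }
    String getCustomerEmail(long id)   { return "ada@example.com"; }
    double getCustomerBalance(long id) { return 10.0; }
}

// What the pattern is actually for: hiding the logic layer's complexity
// behind a simpler interface. Passing a small immutable object keeps
// related data together instead of scattering it across calls.
record Customer(long id, String name, String email, double balance) {}

class CustomerFacade {
    Customer findCustomer(String name) {
        // delegate to the real logic layer here
        return new Customer(42L, "Ada", "ada@example.com", 10.0);
    }
}
```

Nothing about the facade pattern requires the primitives-only restriction; that was an extra rule bolted on in the pattern's name, and it's exactly the kind of misunderstanding the answer below describes.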
For decades, programmers have spent time making software worse by applying practices they don't understand rather than learning why those practices are good in the first place. Using a certain tool, keyword, or framework can make a programmer feel sophisticated when they're really just screwing things up. Some common examples: singletons used as dressed-up global variables, factory classes that only ever produce one type, and interfaces slapped onto every class "just in case."
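For instance, the singleton-as-global case typically looks something like this hypothetical sketch:

```java
// A Singleton used purely as a global variable: it adds pattern
// vocabulary without adding any of the benefits.
class AppGlobals {
    private static final AppGlobals INSTANCE = new AppGlobals();
    private AppGlobals() {}
    public static AppGlobals getInstance() { return INSTANCE; }

    // Mutable global state, now with extra ceremony:
    public String currentUser;
    public boolean debugMode;
}
```

Every class that calls `AppGlobals.getInstance()` gains an invisible dependency that never shows up in its constructor or method signatures, which makes the code harder to test and reason about, the opposite of what the pattern's apparent sophistication suggests.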
This list could go on forever. The tendency has always been around and always will be, because we humans always look for shortcuts. We see it a lot with patterns now because patterns are currently popular. The root problem is the same: developers lacking respect for how complicated software development is, and thinking a blindly applied pattern or practice is a substitute for learning and humility.
The solution? Hard to say. But you can't go too wrong handing a junior developer Code Complete or something similar, to help them understand that this is all about applying principles to particular situations, and that great developers keep it simple.