Tags: java, spring-boot, junit, enums, mockito

How can I test for an invalid enum type in a switch case?


I'm facing an issue where all the enum values are implemented in the switch statement, so I'm unable to force an UnsupportedOperationException to be thrown. I've read several articles, but nothing works for me.

Here’s the method I want to test:

public ResponsePaginatedDto execute(DeviceType type) {
    return switch (type) {
        case APP -> handleApp();
        case WEB -> handleWeb();
        case DESKTOP -> handleDesktop();
        default -> throw new UnsupportedOperationException();
    };
}

And here's my enum:

public enum DeviceType {
    APP,
    WEB,
    DESKTOP;
}

What I Tried:

What I Was Expecting: I was expecting to find a way to test the UnsupportedOperationException branch without having to modify the original enum or introduce artificial values. My goal was a clean, maintainable way to simulate an unsupported enum value and trigger the exception.


Solution

  • This specific example - the fix

    The switch block here is used as an expression, which means the compiler requires it to be exhaustive. So you can just delete the default arm completely:
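
    A minimal sketch of that default-free version, with hypothetical names: DeviceType is inlined and String stands in for ResponsePaginatedDto so the snippet compiles on its own.

```java
enum DeviceType { APP, WEB, DESKTOP }

class DeviceService {
    // No default arm: a switch *expression* must be exhaustive, so the
    // compiler verifies every DeviceType constant has a case. If a constant
    // is added later and this class is not recompiled, the runtime throws
    // MatchException (JDK 21+) instead of UnsupportedOperationException.
    String execute(DeviceType type) {
        return switch (type) {
            case APP -> handleApp();
            case WEB -> handleWeb();
            case DESKTOP -> handleDesktop();
        };
    }

    String handleApp() { return "app"; }
    String handleWeb() { return "web"; }
    String handleDesktop() { return "desktop"; }
}
```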

    This solves all your problems, as long as you are willing to accept MatchException instead of UnsupportedOperationException. It's a lot better, even: you can now get 100% test coverage, and failing to cover an enum constant becomes a compile-time error instead of a nebulous 'hopefully some error someday' situation. It's less typing, too, and reading the switch no longer incorrectly suggests that the 3 enum values you wrote a case for don't cover everything and that there is a 4th or 5th. After all, if there wasn't, why write a default?

  • This specific example - bad code

    100% test coverage is a silly kludge; it doesn't mean your code is great. For example, here you wrote quite bad code. If ever the impossible occurs and that UnsupportedOperationException is thrown, the very point of an exception's message is to convey useful information pertinent to the case: the 'parameters that led to the error', more or less. Here, of course, you should convey the value. That line should read throw new UnsupportedOperationException(type.name()); the fact that it does not is a bug, straight up. It's a bug you can't test, though. See? Your very snippet proves that chasing 100% test coverage is a bizarre thing to want to do. It does not get you some vaunted, exalted 'no bugs at all possible' reward, and that last 1% demands that you delete defensive/fallback code entirely, because you can't get it covered without ruining your code base, and in fact you increase the odds of that problem by adding ways to reach an impossible state (even if only so that your tests can cover the impossible state). [1]
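
    A sketch of what that better default arm looks like, again with stand-in names. Note that the default is unreachable today, which is exactly the untestable-line situation described above:

```java
enum DeviceType { APP, WEB, DESKTOP }

class DeviceHandler {
    static String execute(DeviceType type) {
        return switch (type) {
            case APP -> "app";
            case WEB -> "web";
            case DESKTOP -> "desktop";
            // Unreachable today, hence untestable; but if you keep a default
            // at all, convey the offending value in the message:
            default -> throw new UnsupportedOperationException(type.name());
        };
    }
}
```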

  • There isn't a nice solution for all such scenarios!

    Here we were able to 'fix' the problem, give you nicer code than you had, and pick up 100% test coverage along with the other benefits.
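
    For completeness, here is roughly how that 100% coverage is reached: feed the method every constant. In JUnit 5 the idiomatic form is @ParameterizedTest with @EnumSource(DeviceType.class); below is a plain-Java sketch of the same idea, with stand-in names:

```java
enum DeviceType { APP, WEB, DESKTOP }

class CoverageDemo {
    static String execute(DeviceType type) {
        return switch (type) {
            case APP -> "app";
            case WEB -> "web";
            case DESKTOP -> "desktop";
        };
    }

    public static void main(String[] args) {
        // Iterating values() exercises every arm of the exhaustive switch.
        // A JUnit 5 @EnumSource test does the same, and automatically picks
        // up any constants added to DeviceType later.
        for (DeviceType type : DeviceType.values()) {
            System.out.println(type + " -> " + execute(type));
        }
    }
}
```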

    But things aren't always that nice: sometimes a fallback/defensive line of code is quite useful as a safeguard, but you then face a dilemma. Either:

      • you contort your code or your tests to make the 'impossible' state reachable, purely so that the fallback line counts as covered, or
      • you keep the safeguard as-is and accept that the coverage tool reports less than 100%.

    100% code coverage is a tool. If using the tool demands that you make your code bad and harder to reason about, the right course of action is to conclude that the tool is a poor fit for the scenario you find yourself in, not to bend over backwards and die on that hill.

    A third way to go is to simply delete all fallbacks: any code that triggers on a scenario you cannot fathom could instead be deleted. This is a really bad idea, though. Not just because it presumes you are omniscient, but also because fallbacks guard against future programmers misinterpreting how the written code is meant to be used.

    For example, let's say I have a method that returns the width of a rectangle, and some other code that accepts a rectangle and does some math on its width. You cannot fathom what a negative width of a rectangle even means, so you decide to write something like:

    if (rectangle.getWidth() < 0) throw new IllegalArgumentException("Negative width rectangles not allowed");
    

    This is a good thing. Even if it is not possible to create rectangles with negative widths today, maybe tomorrow some enterprising soul adds that to the system and fails to take into account that the code with that if needs to be updated to deal with it. Before you go "Negative widths? What nonsense, such a ludicrous thing can never happen": for a long time mathematicians thought the same of taking the square root of a negative number, until someone went: lemme just... define some constant as the value of the sqrt of minus 1 and see what happens.

    You don't know what you don't know and you cannot account for the unaccountable.

    Hence, defensive/fallback lines are a necessary thing in coding: they can be a good idea. And they are untestable.

    Remember the hammer and your face?

    Stop swinging it.


    [1] Unfortunately, MatchException doesn't print the uncovered value either. This is bad, and OpenJDK should fix it; a later release may well add that benefit. This isn't just some hopeful pipe dream: it happened to NullPointerException. On modern JVMs, if an NPE is caused by a dereference (instead of a new NullPointerException(...)), the message of the generated NPE will include the textual representation of the expression that was null, if it is simple enough to do so. That means if you had just left it at return person.getName(), you'd now get an NPE whose message names person, whereas if you had explicitly written if (person == null) throw new NullPointerException(); you wouldn't. The 'lazy' programmer wins.
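
    The NPE behaviour described above can be demonstrated directly on JDK 15+ (JEP 358, "Helpful NullPointerExceptions"); the Person class and method names here are hypothetical:

```java
class NpeDemo {
    static class Person {
        String getName() { return "Alice"; }
    }

    // Dereferencing a null local produces a JVM-generated NPE. On JDK 15+
    // its message names the failed call, e.g.:
    //   Cannot invoke "NpeDemo.Person.getName()" because "person" is null
    // (the variable name requires -g debug info; without it the JVM falls
    // back to a placeholder like "<local1>").
    static String message() {
        Person person = null;
        try {
            return person.getName();
        } catch (NullPointerException e) {
            return e.getMessage() == null ? "(no message: pre-JDK 15?)" : e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(message());
    }
}
```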