I really can't wrap my head around what is happening here. I have the following (minimal) code:
    import java.util.ArrayList;
    import java.util.List;

    public class SillyCast {
        interface B {}
        interface D<T extends B> {}
        class C<T extends D<?>> {}
        class A<T extends B> extends C<D<T>> {}

        List<Class<? extends C<?>>> list = new ArrayList<>();

        void m() {
            list.add((Class<? extends C<?>>) A.class);
        }
    }
When compiling (and running) this code in Eclipse, everything works as expected, but when compiling it from the command line (Oracle JDK) I get the following error:
    error: incompatible types: Class<A> cannot be converted to Class<? extends C<?>>
            list.add((Class<? extends C<?>>) A.class);
I know that the Oracle JDK does not always behave exactly like the Eclipse JDT compiler, but this seems weird.
How can I add A.class to my ArrayList? Intuitively it seems like this should be possible.
Why do the Oracle JDK and Eclipse JDT disagree about this cast? Why does the Oracle JDK have a problem with it?
At first I thought this might be a bug, but I tested it on different Java versions (8 and 11) and the problem occurs on all of them. A cast this simple also seems too obvious to have gone unnoticed across multiple versions, so I expect this to be by design.
I think, as @ferrouskid pointed out, the reason this doesn't work is probably that, contrary to what is generally assumed, C<?> is not a supertype of C<D<?>>.
Although this is more of a confusion tactic than a real solution, my current workaround is to distract the Oracle compiler by assigning A.class to a variable before casting it (this may require an @SuppressWarnings("unchecked")):
    Class<?> aa = A.class;
    list.add((Class<? extends C<?>>) aa);
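For completeness, here is a sketch of the same workaround in context. The placement of @SuppressWarnings and the inline double cast at the end are my additions and not verified against every javac version mentioned above, but both route the conversion through the same Class<?> intermediate step as the temporary variable, so they should behave identically.

    // Sketch: the workaround inside the method from the question.
    // The inline double cast is an assumed equivalent of the temporary variable.
    @SuppressWarnings("unchecked")
    void m() {
        Class<?> aa = A.class;                        // widen to Class<?> first
        list.add((Class<? extends C<?>>) aa);         // unchecked cast that javac accepts

        // Same two-step conversion written as a one-liner:
        list.add((Class<? extends C<?>>) (Class<?>) A.class);
    }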
Disclaimer: This kind of operation is very dangerous because the cast may fail at runtime. So only do this if you have complete test coverage of the impacted methods.