I have a trait function which takes a reference to self and returns a 'static future. But I can't seem to use it. The compiler still insists that my future captures the lifetime of self even when it clearly doesn't:
use std::future::Future;

trait Foo: 'static {
    fn bar(&self) -> impl Future<Output = Bar> + 'static;
}

struct Bar;
struct Baz;

fn bar_to_baz(_: Bar) -> Baz {
    Baz
}

fn baz<T: Foo>(t: &T) -> impl Future<Output = Baz> + 'static {
    let fut = t.bar();
    async move { bar_to_baz(fut.await) }
}
This results in:
error[E0700]: hidden type for `impl Future<Output = Baz> + 'static` captures lifetime that does not appear in bounds
  --> src/main.rs:17:5
   |
15 | fn baz<T: Foo>(t: &T) -> impl Future<Output = Baz> + 'static {
   |                   --     ----------------------------------- opaque type defined here
   |                   |
   |                   hidden type `{async block@src/main.rs:17:5: 17:41}` captures the anonymous lifetime defined here
16 |     let fut = t.bar();
17 |     async move { bar_to_baz(fut.await) }
   |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This is due to a design decision that was not required for soundness, but rather was an attempt to avoid a footgun: when you use return-position impl Trait in a trait, the opaque return type is automatically assumed to capture all of the method's generic parameters, including lifetimes. That way, if you subsequently change an implementation of the function so that it does capture the lifetime, you won't end up breaking all the downstream code as a consequence.
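For contrast, here's a sketch of the same shape written without a trait, reusing the question's Bar, Baz and bar_to_baz. On edition 2021, where a free function's impl Trait return type captures only the lifetimes that appear in its bounds, this compiles (the 2024 edition has since moved free functions to the capture-everything default as well):

// The same shape as Foo::bar, but as a free function: on edition 2021
// the anonymous lifetime is not captured, so the 'static bound is satisfied.
fn bar_free(_: &Bar) -> impl Future<Output = Bar> + 'static {
    async { Bar } // borrows nothing from the argument
}

// Consequently the free-function equivalent of baz compiles without complaint.
fn baz_free(t: &Bar) -> impl Future<Output = Baz> + 'static {
    let fut = bar_free(t); // fut is 'static, independent of t's lifetime
    async move { bar_to_baz(fut.await) }
}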
Unfortunately, the implementation of this check seems to be somewhat buggy, or at least very unintuitive; my answer to the duplicate question has some demonstrations of the check acting oddly, causing apparently equivalent code to compile or not compile depending on the details of how it was implemented.
For what it's worth, I think it would make sense to be able to disable the automatic lifetime capture with an explicit + 'static (if you're writing that explicitly, you probably don't intend to subsequently change the return type to something with a less static lifetime), but that isn't currently implemented in the Rust compiler, and there hasn't been significant activity on the relevant bug report recently.
Note that a possible workaround, which helps with some programs, is to constrain the generic parameter itself to be 'static (i.e. fn baz<T: Foo + 'static>); however, that will reduce the number of T types with which the function works, so it is not applicable in all cases.
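As a sketch of a case where that workaround does help, consider a hypothetical trait Source (unlike Foo, it has no 'static supertrait), together with a function that takes its argument by value; here the extra bound is what lets the 'static requirement on the returned future be proven:

use std::future::Future;

struct Out;

// A hypothetical trait: like Foo, but without the 'static supertrait.
trait Source {
    fn get(&self) -> impl Future<Output = Out>;
}

// Without `S: 'static`, `S` could contain borrowed data, and the async
// block below (which owns `s`) could not be proven to meet the 'static bound.
fn run<S: Source + 'static>(s: S) -> impl Future<Output = Out> + 'static {
    async move { s.get().await } // the borrow of `s` stays inside the block
}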
UPDATE: Apparently, this problem is being worked on, but the fix is still unstable for now. Rust has recently added a use<…> syntax that can be used to specify exactly what a method's return type captures, in cases like this one where the compiler's conservative default is not what you want. It's stable in some situations, but currently unstable in trait methods. On current nightly Rust, you can make your code work by enabling the unstable precise_capturing_in_traits feature and declaring your trait as follows:
#![feature(precise_capturing_in_traits)]

use std::future::Future;

trait Foo: 'static {
    fn bar(&self) -> impl Future<Output = Bar> + use<Self>;
}
Here, the use<Self> specifies that the output captures only the Self type, and not the lifetime of the &self reference.
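With that declaration (a sketch, assuming the rest of the question's code is unchanged), the original baz should compile on nightly: the future returned by t.bar() is known not to borrow from the &T reference, and since Foo: 'static implies T: 'static, capturing Self is compatible with the 'static bound:

fn baz<T: Foo>(t: &T) -> impl Future<Output = Baz> + 'static {
    let fut = t.bar(); // captures T (which is 'static), not the reference
    async move { bar_to_baz(fut.await) }
}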
This is still an unstable feature, so the details might be subject to change before it appears in a stable version of Rust.