c++, initialization, constexpr, std-ranges, static-assert

Is there a performance benefit to using C++'s std(::ranges)::uninitialized_... algorithms, and is it worth giving up constexpr?


I'm implementing a container type that owns some memory which I am creating using std::make_unique_for_overwrite(). Given that this function is specifically returning a std::unique_ptr to uninitialized memory, I wanted to use the uninitialized memory algorithms provided by the standard. I didn't notice that these functions weren't constexpr until I tried to write static_assert tests for my container, at which point I realized that my constexpr constructors weren't so constexpr after all, due to my calls to std::ranges::uninitialized_copy().

My question is: what exactly is the difference between std::ranges::copy() and its uninitialized sibling, std::ranges::uninitialized_copy(), and is the performance difference significant enough to give up constexpr compatibility?


Solution

  • Given that this function is specifically returning a std::unique_ptr to uninitialized memory

    It isn't. It's a unique_ptr to a default-initialized object (or array of objects). Default-initialized is not uninitialized: using the uninitialized_* algorithms on it constructs a new object on top of an already-existing one, and because the original object's destructor never runs, anything it owns is leaked.


    The "performance benefit" of these algorithms is that the usual alternative is to default-construct every element and then assign over it. This is really a correctness benefit, since you can't expect a generic element type to be default-constructible at all, but as a side effect you also skip a (possibly costly) default construction per element.


    And when you have a choice between a constexpr-friendly algorithm and a more performant one, use if consteval and take both: the constexpr-friendly version during constant evaluation, the faster one at run time.