Would it be possible for a programming language to consistently have Rust-style ownership and lifetimes (for automatic memory management) while dropping the requirement that only one mutable reference to a piece of data can exist at any time (used to suppress data races)?
In other words, are Rust-style ownership and lifetimes on the one hand, and Rust-style borrow checking on the other, two separable concepts? Or are the two ideas inherently entangled at a semantic level?
> Would it be possible for a programming language to consistently have Rust-style ownership and lifetimes (for automatic memory management) while dropping the requirement that only one mutable reference to a piece of data can exist at any time (used to suppress data races)?
A language can do anything, so sure.
The problem is that dropping this requirement would make a language like Rust a nest of undefined behavior: if mutable references no longer have to be unique, then they serve no purpose, so you just have references (always mutable) whose only property is being lexically scoped. That means you can hold a reference to a sub-part of an object while a second reference mutates the object in a way that invalidates that sub-part (e.g. holding a reference to a Vec element while clearing the Vec [0]). The first reference is now dangling: it points to garbage.
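A minimal Rust sketch of that hazard. Today's borrow checker rejects this program, and that rejection is exactly the check the hypothetical language would be giving up:

    fn main() {
        let mut v = vec![1, 2, 3];

        // Shared borrow of the first element: a reference to a sub-part of `v`.
        let first = &v[0];

        // A second access mutates `v` and frees its backing storage.
        // Rust rejects this line (error[E0502]: cannot borrow `v` as mutable
        // because it is also borrowed as immutable); without the uniqueness
        // rule it would compile and leave `first` dangling.
        v.clear();

        // Reading through `first` would then be a use-after-free.
        println!("{first}");
    }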
The way to solve that would be to… add a GC? And from that point on the value of "Rust-style ownership and references" becomes limited to nonexistent: you now need a GC (i.e. non-lexical, automated memory management), and your references can keep objects alive, so having all types be affine by default isn't very useful anymore.
Now what can be useful (and what some languages explore) is for substructural types to be opt-in: types would be normal by default but could be opted into being affine, linear, or even ordered on a needs basis. This would be purely a type-safety measure.
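For contrast, Rust sits at the opposite default: every type is affine (usable at most once) unless it opts back into normal structural rules via Copy. A small sketch of the two behaviors, with illustrative type names:

    // Opts back into "normal" structural rules: freely duplicated.
    #[derive(Clone, Copy)]
    struct Normal(u32);

    // Rust's default: affine, each value is consumed by its first use.
    struct Affine(u32);

    fn main() {
        let n = Normal(1);
        let _a = n;
        let _b = n; // fine: `Normal` is Copy, so `n` is duplicated

        let x = Affine(1);
        let _y = x;
        // let _z = x; // error[E0382]: use of moved value: `x`
    }

The proposal above inverts these defaults: Normal-style behavior everywhere, with Affine-style behavior opted into where it's needed.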
> If so, are there any existing languages which achieve this?
Not to my knowledge.
> If not, why not?
Because nobody's written one? Affine types by default are useful to Rust, but they're not that useful in general, so most of the research and design has focused on linear types (which must be used exactly once, rather than at most once). Linear types provide more guarantees and are therefore more useful if only a small subset of your types are going to be substructural.
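As an aside, Rust itself can only approximate linearity, and only at runtime: a known pattern is a type whose Drop implementation panics unless the value was explicitly consumed. A sketch, with `Token` and `consume` as illustrative names:

    // A linear-style handle: must be explicitly consumed; merely dropping it
    // is treated as a bug. Rust can only enforce this at runtime, whereas a
    // language with true linear types would reject the omission at compile time.
    struct Token;

    impl Token {
        fn consume(self) {
            // Skip the Drop panic: the value was used exactly once, as required.
            std::mem::forget(self);
        }
    }

    impl Drop for Token {
        fn drop(&mut self) {
            panic!("Token dropped without being consumed");
        }
    }

    fn main() {
        let ok = Token;
        ok.consume(); // fine: consumed exactly once

        let _leaked = Token; // panics at end of scope: never consumed
    }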
[0] Which shows that "data races" are not solely a concurrency problem: they are one instance of a broader class of aliasing bugs which occur commonly in sequential code (e.g. iterator invalidation).
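Iterator invalidation is the same shape of bug in sequential clothing, and Rust's uniqueness rule is what catches it. A minimal sketch:

    fn main() {
        let mut v = vec![1, 2, 3];

        // The `for` loop holds a shared borrow of `v` for its whole body.
        for x in &v {
            if *x == 2 {
                // Growing the Vec may reallocate its storage, invalidating
                // the iterator. Rust rejects this line (error[E0502]) because
                // it needs `&mut v` while the shared borrow is still live.
                v.push(4);
            }
        }
    }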