I have a virtual Conda environment named dev that was created using the following YAML file:
# *** dev.yml ***
name: dev
channels:
- defaults # Check this channel first
- conda-forge # Fallback to conda-forge if packages are not available in defaults
dependencies:
- python==3.12
- numpy>=2.0.1
# ...more libraries without fixed versions
The environment was created without any dependency issues, and NumPy was installed at version 2.0.1. However, when I try to update the packages using the command:
conda update --all
I get the following output suggesting that NumPy will be downgraded:
The following packages will be DOWNGRADED:
numpy 2.0.1-py312h2809609_1 --> 1.26.4-py312h2809609_0
numpy-base 2.0.1-py312he1a6c75_1 --> 1.26.4-py312he1a6c75_0
Why is Conda trying to downgrade NumPy during the update? I understand that dependencies might conflict during an update, but I thought this would have been caught during the initial environment creation, which ran without dependency warnings.
Is there a way to conditionally update packages where dependencies are solvable while keeping the current NumPy version?
Most likely the newest build of one of your other packages now constrains NumPy to less than v2 (e.g. a numpy<2 requirement), so the solver downgrades NumPy to satisfy it. To override this behavior, you can pin the version during the update:
conda update --all "numpy=2.*"
Or you can pin the package persistently by adding the package spec to the file at <environment>/conda-meta/pinned before you run the update (Windows cmd syntax shown; %CONDA_PREFIX% points at the active environment):
echo numpy=2.* >> %CONDA_PREFIX%\conda-meta\pinned
conda update --all
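On Linux or macOS the same pin can be written from Bash. This is a sketch; dev_prefix below is a stand-in for the environment's path (with the env active, $CONDA_PREFIX already holds it):

```shell
# Append a version pin to the environment's conda-meta/pinned file.
# dev_prefix is a placeholder: with the dev environment active you
# can use "$CONDA_PREFIX" directly. The fallback path only exists so
# this sketch also runs outside an activated environment.
dev_prefix="${CONDA_PREFIX:-/tmp/dev-env-example}"
mkdir -p "$dev_prefix/conda-meta"
echo 'numpy=2.*' >> "$dev_prefix/conda-meta/pinned"
```

Once the pin is in place, conda update --all will keep NumPy within 2.x (or report an unsolvable environment rather than downgrade it); delete the line from conda-meta/pinned when you want to lift the constraint.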