python machine-learning deep-learning wandb

How do I print the wandb sweep url in python?


For runs I do:

wandb.run.get_url()

How do I do the same but for sweeps, given the sweep_id?


Full sample run:

"""
Main Idea:
- create a sweep with a sweep config & get a sweep_id for the agents (note: this creates a sweep on wandb's website)
- create an agent to run one setting of hps by giving it the sweep_id (which matches the sweep on the wandb website)
- keep running agents with the sweep_id until you're done

note:
    - Each individual training session with a specific set of hyperparameters in a sweep is considered a wandb run.

ref:
    - read: https://docs.wandb.ai/guides/sweeps
"""

import wandb
from pprint import pprint
import math
import torch

sweep_config: dict = {
    "project": "playground",
    "entity": "your_wanbd_username",
    "name": "my-ultimate-sweep",
    "metric":
        {"name": "train_loss",
         "goal": "minimize"}
    ,
    "method": "random",
    "parameters": None,  # not set yet
}

parameters = {
    'optimizer': {
        'values': ['adam', 'adafactor']}
    ,
    'scheduler': {
        'values': ['cosine', 'none']}  # todo: decide how to handle the scheduler
    ,
    'lr': {
        "distribution": "log_uniform_values",
        "min": 1e-6,
        "max": 0.2}
    ,
    'batch_size': {
        # integers between 32 and 256
        # with evenly-distributed logarithms
        'distribution': 'q_log_uniform_values',
        'q': 8,
        'min': 32,
        'max': 256,
    }
    ,
    # it's often the case that some hps we don't want to vary in the run e.g. num_its
    'num_its': {'value': 5}
}
sweep_config['parameters'] = parameters
pprint(sweep_config)

# create the sweep on wandb's website & get the sweep_id used to launch agents, each of which runs one setting of hps
sweep_id = wandb.sweep(sweep_config)
print(f'{sweep_id=}')


def my_train_func():
    # read the current hyperparameter values chosen by the sweep controller from wandb.config
    # I don't think we need the group since the sweep name is already the group
    run = wandb.init(config=sweep_config)
    print(f'{run=}')
    pprint(f'{wandb.config=}')
    lr = wandb.config.lr
    num_its = wandb.config.num_its

    train_loss: float = 8.0 + torch.rand(1).item()
    for i in range(num_its):
        # simulate an update step: lr times a random value in [0.0, 1.0) from torch
        update_step: float = lr * torch.rand(1).item()
        wandb.log({"lr": lr, "train_loss": train_loss - update_step})
    run.finish()


# run the sweep: launch an agent that runs my_train_func 5 times, using the randomly-generated hyperparameter values returned by the Sweep Controller.
wandb.agent(sweep_id, function=my_train_func, count=5)

cross-posted: https://community.wandb.ai/t/how-do-i-print-the-wandb-sweep-url-in-python/4133


Solution

  • You can get it from the run object returned by wandb.init. That object is quite useful; see the docs URL below or its source code:

    run = wandb.init(mode=mode)
    ...
    print(f'{run.get_sweep_url()=}')
    

    https://github.com/wandb/wandb/blob/c4726707ed83ebb270a2cf84c4fd17b8684ff699/wandb/sdk/wandb_run.py#L1122-L1130
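
    For completeness, here's a minimal end-to-end sketch of that approach (the entity/project names and the tiny sweep config below are placeholders, not the ones from the question):

    import wandb

    # tiny sweep config just for illustration; values are placeholders
    sweep_config = {
        "method": "random",
        "metric": {"name": "train_loss", "goal": "minimize"},
        "parameters": {"lr": {"values": [1e-3, 1e-2]}},
    }

    def train():
        run = wandb.init()  # inside an agent, the controller injects the chosen hps into wandb.config
        print(f'{run.get_url()=}')        # URL of this individual run
        print(f'{run.get_sweep_url()=}')  # URL of the parent sweep
        run.log({"train_loss": wandb.config.lr})
        run.finish()

    sweep_id = wandb.sweep(sweep_config, entity="your_wandb_username", project="playground")
    wandb.agent(sweep_id, function=train, count=1)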

    These also work, I think:

    def get_sweep_url(sweep_config: dict, sweep_id: str) -> str:
        sweep_url = f"Sweep URL: https://wandb.ai/{sweep_config['entity']}/{sweep_config['project']}/sweeps/{sweep_id}"
        return sweep_url
    
    
    def get_sweep_url(entity: str, project: str, sweep_id: str) -> str:
        """
    
        https://wandb.ai/{username}/{project}/sweeps/{sweep_id}
        """
        api = wandb.Api()
        sweep = api.sweep(f'{entity}/{project}/{sweep_id}')
        return sweep.url
    
    
    from pathlib import Path

    import yaml

    def get_sweep_config(path2sweep_config: str) -> dict:
        """ Get the sweep config from a yaml file path. """
        config_path = Path(path2sweep_config).expanduser()
        with open(config_path, 'r') as file:
            sweep_config = yaml.safe_load(file)
        return sweep_config
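
    Hypothetical usage of the helpers above (assumes a ~/sweep.yaml file exists and contains 'entity' and 'project' keys, like the sweep_config in the question):

    cfg = get_sweep_config('~/sweep.yaml')  # load the sweep config from yaml
    sweep_id = wandb.sweep(cfg)             # create the sweep on wandb's website
    # with the wandb.Api() version of get_sweep_url:
    print(get_sweep_url(cfg['entity'], cfg['project'], sweep_id))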
    

    https://docs.wandb.ai/ref/python/run