I used MLflow to log parameters, metrics, and a model with the function below (adapted from the pydataberlin tutorial).
import warnings

import numpy as np
import mlflow
import mlflow.sklearn
from sklearn.linear_model import ElasticNet

# load_data and eval_metrics are helper functions defined earlier in the notebook.

def train(alpha=0.5, l1_ratio=0.5):
    # Train a model with the given parameters
    warnings.filterwarnings("ignore")
    np.random.seed(40)

    # Read the wine-quality csv file (make sure you're running this from the root of MLflow!)
    data_path = "data/wine-quality.csv"
    train_x, train_y, test_x, test_y = load_data(data_path)

    # Useful for multiple runs (only doing one run in this sample notebook)
    with mlflow.start_run():
        # Fit an ElasticNet model
        lr = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, random_state=42)
        lr.fit(train_x, train_y)

        # Evaluate metrics on the test set
        predicted_qualities = lr.predict(test_x)
        (rmse, mae, r2) = eval_metrics(test_y, predicted_qualities)

        # Print out metrics
        print("Elasticnet model (alpha=%f, l1_ratio=%f):" % (alpha, l1_ratio))
        print("  RMSE: %s" % rmse)
        print("  MAE: %s" % mae)
        print("  R2: %s" % r2)

        # Log parameters, metrics, and model to MLflow
        mlflow.log_param(key="alpha", value=alpha)
        mlflow.log_param(key="l1_ratio", value=l1_ratio)
        mlflow.log_metric(key="rmse", value=rmse)
        mlflow.log_metrics({"mae": mae, "r2": r2})
        mlflow.log_artifact(data_path)
        print("Save to: {}".format(mlflow.get_artifact_uri()))

        mlflow.sklearn.log_model(lr, "model")
Once I run train() with its parameters, I cannot see any artifacts in the UI, but I can see the model, its parameters, and the metrics. The Artifacts tab just says: "No Artifacts Recorded. Use the log artifact APIs to store file outputs from MLflow runs."
Yet in Finder, all the artifacts exist in the run's folders, including the model pickle.
Any help is appreciated.
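In case it matters, this prints where MLflow is writing on my side (a minimal check; by default it is a relative ./mlruns under the current working directory):

import mlflow

# Defaults to a file store at ./mlruns relative to the current working directory
print(mlflow.get_tracking_uri())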
Had a similar issue. In my case, I solved it by running mlflow ui inside the mlruns directory of your experiment.
See the full discussion on GitHub here.
Hope it helps!
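If changing directories is inconvenient, a sketch of an alternative (assuming the default local file store; the path below is a placeholder for your project root) is to pin the tracking URI to an absolute path before training, so the run and the UI resolve to the same mlruns folder:

import mlflow

# Use an absolute file store so results don't depend on the current
# working directory ("/path/to/project" is a placeholder).
mlflow.set_tracking_uri("file:///path/to/project/mlruns")

mlflow ui also accepts a --backend-store-uri option, so you can point the UI at that same location instead of cd-ing into it.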