python-3.x · docker · kubernetes · kubeflow-pipelines · kfp

Passing data between Kubeflow Pipelines containers


I am wondering how I can pass a data frame and then a model between Kubeflow Pipelines containers that I deployed locally on Kubernetes. Right now I am using the v2 SDK and the func_to_container_op decorator, but I am also interested in how to do it with Dockerfiles, building the containers from my own images.

I found out that I should use the Input and Output annotations and artifacts, but I am not sure how they work when it comes to passing local files.


Solution

  • A local file cannot be passed directly between components. You can either serialize its contents to a string or wrap it in an Artifact. With an Artifact, the URI or path of the artifact output is passed to the container as an argument, so the container can write its result to that file (see the sketches below):

    https://www.kubeflow.org/docs/components/pipelines/v2/author-a-pipeline/components/#3-custom-container-components
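
A minimal sketch of the Artifact approach with KFP v2 lightweight Python components, showing a DataFrame handed from one component to another and a trained model emitted as a Model artifact. The component names, pipeline name, and installed packages here are illustrative assumptions, not part of the original answer:

```python
from kfp import dsl
from kfp.dsl import Dataset, Input, Model, Output


@dsl.component(packages_to_install=["pandas"])
def make_dataset(out_data: Output[Dataset]):
    import pandas as pd

    df = pd.DataFrame({"x": [1, 2, 3], "y": [2, 4, 6]})
    # out_data.path is a local path backed by the pipeline's artifact store;
    # whatever is written here becomes the artifact passed downstream.
    df.to_csv(out_data.path, index=False)


@dsl.component(packages_to_install=["pandas", "scikit-learn", "joblib"])
def train_model(in_data: Input[Dataset], out_model: Output[Model]):
    import joblib
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # The upstream artifact is materialized at in_data.path.
    df = pd.read_csv(in_data.path)
    model = LinearRegression().fit(df[["x"]], df["y"])
    joblib.dump(model, out_model.path)


@dsl.pipeline(name="pass-dataframe-and-model")
def pipeline():
    data_task = make_dataset()
    train_model(in_data=data_task.outputs["out_data"])


if __name__ == "__main__":
    from kfp import compiler

    compiler.Compiler().compile(pipeline, "pipeline.yaml")
```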
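
For the Docker-based route the question asks about, the linked page describes custom container components, where the artifact output path is passed to your own image as a command-line argument. Below is a hedged sketch: the image name, entrypoint script, and flag are assumptions; the only requirement is that the program inside the container writes its result to the path it receives.

```python
from kfp import dsl
from kfp.dsl import Dataset, Output


@dsl.container_component
def make_dataset_container(out_data: Output[Dataset]):
    return dsl.ContainerSpec(
        image="my-registry/make-dataset:latest",  # hypothetical image
        command=["python", "make_dataset.py"],    # hypothetical entrypoint
        # KFP substitutes out_data.path with a writable file path; the script
        # inside the image should write its output (e.g. a CSV) to that path.
        args=["--output-path", out_data.path],
    )
```

A downstream component can then declare `Input[Dataset]` the same way as in the Python-component sketch above and read from its `.path`.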