I am using the tensorflow/serving image and ran into this error while trying to run the container.
Step 1: Pulled the TensorFlow Serving image
docker pull tensorflow/serving
Step 2: Stored my models in a saved-models folder within my project folder (detection-potato-lite)
Elijah-A-W@DESKTOP-34M2E8U MINGW64 /d/myn/ML_Prediction_Project/New_folder/detection-potato-lite
$ ls
Api/ 'Data Eda'/ models.config Plant/ saved-models/
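For reference, TensorFlow Serving expects each model version in its own numbered subdirectory under the base path, roughly like this (the version numbers here are just examples):

saved-models/
    1/
        saved_model.pb
        variables/
    2/
        saved_model.pb
        variables/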
Step 3: Created the models.config file in the project folder so that TensorFlow Serving loads the models dynamically
model_config_list {
  config {
    name: 'potatoes_model'
    base_path: '/detection-potato-lite/saved-models'
    model_platform: 'tensorflow'
    model_version_policy: {all: {}}
  }
Step 4: Got an error (No such file or directory) when trying to run the image in Windows PowerShell
PS C:\Users\Elijah-A-W> docker run -t --rm -p 8501:8501 -v D:/myn/ML_Prediction_Project/New_Folder/detection-potato-lite:/detection-potato-disease tensorflow/serving --rest_api_port=8501 --model_config_file=/detection-potato-lite/models.config
Failed to start server. Error: Not found: /detection-potato-lite/models.config; No such file or directory
I see two problems here:
EDIT:
By using -v D:/myn/ML_Prediction_Project/New_Folder/detection-potato-lite:/detection-potato-disease
in your docker run command, you are binding your local directory (D:/myn/ML_Prediction_Project/New_Folder/detection-potato-lite) to a directory within the Docker container (/detection-potato-disease).
This means that when your application runs inside the Docker container, all the files/folders from that local directory now live under '/detection-potato-disease'.
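You can confirm what actually landed inside the container by overriding the image's entrypoint and listing that directory (the tensorflow/serving image is Ubuntu-based, so ls is available):

docker run --rm --entrypoint ls -v D:/myn/ML_Prediction_Project/New_Folder/detection-potato-lite:/detection-potato-disease tensorflow/serving /detection-potato-disease

If the bind mount is working, you should see Api/, models.config, saved-models/ and the rest of your project files listed.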
So you have two options:

Option 1: Modify base_path in your models.config from /detection-potato-lite/saved-models to /detection-potato-disease/saved-models, AND modify --model_config_file in your docker run command from --model_config_file=/detection-potato-lite/models.config to --model_config_file=/detection-potato-disease/models.config.

Option 2: Leave models.config and the flags as they are, and instead mount your project folder to /detection-potato-lite so the container path matches:
docker run -t --rm -p 8501:8501 -v D:/myn/ML_Prediction_Project/New_folder/detection-potato-lite:/detection-potato-lite tensorflow/serving --rest_api_port=8501 --model_config_file=/detection-potato-lite/models.config
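Whichever option you pick, once the server starts you can check that the model actually loaded through the REST API (8501 comes from your --rest_api_port flag, potatoes_model from your config):

curl http://localhost:8501/v1/models/potatoes_model

A healthy server responds with a model_version_status list whose entries report state AVAILABLE.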
EDIT 2:
You'll also need to add another closing } bracket on line 7 of the models.config file once you've applied the previous fix: as posted, the model_config_list block is never closed.
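Putting both fixes together (keeping the /detection-potato-lite mount from the docker run command above; use the /detection-potato-disease prefix instead if you went with option 1), models.config should end up as:

model_config_list {
  config {
    name: 'potatoes_model'
    base_path: '/detection-potato-lite/saved-models'
    model_platform: 'tensorflow'
    model_version_policy: {all: {}}
  }
}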