tensorflow

Can a model trained on a GPU be used on a CPU for inference, and vice versa?


I was wondering if a model trained on the GPU could be used to run inference on the CPU (and vice versa). Thanks!


Solution

  • You can do it as long as your model doesn't have explicit device allocations, i.e., if your model has blocks like with tf.device('gpu:0'), it will complain when you run it on a machine without a GPU.

    In such cases, you must make sure your imported model doesn't have explicit device assignments, for instance by using the clear_devices argument of import_meta_graph.
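
    A minimal sketch of this, using the TF1-style API (via tf.compat.v1 on newer TensorFlow). It builds a tiny graph with an explicit GPU device assignment, saves it, then re-imports it with clear_devices=True so it can run on a CPU-only machine; the graph, variable names, and checkpoint path are illustrative, not from the original question:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt = os.path.join(tempfile.mkdtemp(), "model.ckpt")  # hypothetical path

# Build and save a tiny graph with an explicit GPU device assignment.
# allow_soft_placement lets this run even on a CPU-only machine, but the
# "/gpu:0" string is still recorded in the exported graph's node defs.
with tf.Graph().as_default():
    with tf.device("/gpu:0"):
        x = tf.placeholder(tf.float32, shape=[None, 2], name="x")
        w = tf.Variable(tf.ones([2, 1]), name="w")
        y = tf.matmul(x, w, name="y")
    saver = tf.train.Saver()
    config = tf.ConfigProto(allow_soft_placement=True)
    with tf.Session(config=config) as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt)

# Re-import for CPU inference: clear_devices=True strips the explicit
# device assignments, so the graph can be placed on whatever is available.
with tf.Graph().as_default() as g:
    saver = tf.train.import_meta_graph(ckpt + ".meta", clear_devices=True)
    with tf.Session() as sess:
        saver.restore(sess, ckpt)
        x = g.get_tensor_by_name("x:0")
        y = g.get_tensor_by_name("y:0")
        out = sess.run(y, feed_dict={x: [[1.0, 2.0]]})
        print(out)
```

    Without clear_devices=True, running the imported graph in a plain session on a machine with no GPU would raise a placement error for the ops pinned to /gpu:0.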