machine-learning, deep-learning, nlp, gpu, kaggle

How to use both GPUs in Kaggle for training in PyTorch?


I was training a model on a Kaggle GPU instance, but as far as I can see, only one GPU is working. I use the ordinary method for training, like:

device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')
model = model.to(device)

How can I use both GPUs?


Solution

  • Using multiple GPUs is handled by the machine learning library, not by Kaggle itself. I stumbled upon the same problem while doing image segmentation in PyTorch. The solution is to wrap the model in torch.nn.DataParallel. The given code can be changed as follows:

    device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')
    model = torch.nn.DataParallel(model, device_ids=[0, 1]).to(device)
    

    Here, device_ids lists the indices of the GPUs to use. If you have 4 GPUs, it would be device_ids = [0, 1, 2, 3], or whatever the indices may be.

    After this change, both GPUs show utilization during training.

    PS: This is my first post on the prestigious Stack Overflow; please do share your comments and views.
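
A minimal end-to-end sketch of the approach above, using a hypothetical toy `nn.Linear` model (the model name and sizes are illustrative, not from the original post). It wraps the model in `DataParallel` only when more than one GPU is actually present, so the same script also runs unchanged on a single GPU or on CPU:

```python
import torch
import torch.nn as nn

# Toy model for illustration; substitute your own model here.
model = nn.Linear(10, 2)

device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')

# Wrap in DataParallel only when multiple GPUs are available.
# On Kaggle's two-GPU instances device_count() is 2, so this
# becomes DataParallel(model, device_ids=[0, 1]).
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model, device_ids=list(range(torch.cuda.device_count())))
model = model.to(device)

# Forward pass: DataParallel splits the batch along dim 0 across the
# GPUs and gathers the outputs back onto device_ids[0], so the output
# shape is the same as with a single device.
x = torch.randn(8, 10, device=device)
out = model(x)
print(out.shape)  # torch.Size([8, 2])
```

Note that the batch is split across GPUs, so the effective per-GPU batch size is your batch size divided by the number of devices; you may want to increase the batch size accordingly.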