python machine-learning nlp nlg

How do I train GPT-2 from scratch?


I want to train GPT-2 from scratch, but all of the articles I have found only cover fine-tuning on top of pretrained models. I have used https://github.com/nshepperd/gpt-2 to train with an existing model. Should I edit these Python scripts to train from scratch?


Solution

  • I found the answer in https://github.com/nshepperd/gpt-2/issues/11:

    If you want to not use the released model at all, for instance because you want to train a model with incompatible hyperparameters, it should be sufficient to just skip the restore from the released model checkpoint (around train.py:164-177) on your first run so the parameters will all be randomly initialized.
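    In other words, the model's variables are always created and randomly initialized first; loading the released checkpoint is a separate restore step, and skipping that step leaves you with an untrained model of whatever size your hyperparameters define. The sketch below illustrates the pattern, assuming `train.py` follows the usual TensorFlow 1.x flow of `tf.global_variables_initializer()` followed by `saver.restore()` (the function name `init_or_restore` and the `from_scratch` flag are illustrative, not part of the repo):

        import tensorflow as tf  # TensorFlow 1.x, as used by nshepperd/gpt-2

        def init_or_restore(sess, saver, checkpoint_dir, from_scratch=False):
            """Initialize all variables, then optionally restore a pretrained checkpoint.

            With from_scratch=True the saver.restore call is skipped entirely,
            so the parameters keep their random initialization -- this mirrors
            skipping the restore block (around train.py:164-177) described in
            the linked issue.
            """
            sess.run(tf.global_variables_initializer())
            if from_scratch:
                print('Training from scratch: parameters left randomly initialized')
                return
            ckpt = tf.train.latest_checkpoint(checkpoint_dir)
            if ckpt is None:
                raise FileNotFoundError('No checkpoint found in %s' % checkpoint_dir)
            print('Restoring pretrained weights from', ckpt)
            saver.restore(sess, ckpt)

    Note that if you change the hyperparameters (for instance a different `n_embd` or `n_layer` in `hparams.json`), restoring the released checkpoint would fail anyway because the variable shapes no longer match, so skipping the restore is the natural path for a from-scratch run.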