Hi Jason Guo,
No, there are four different versions of the model in total, distinguished by their number of parameters. nshepperd’s fork lets us fine-tune the model on our own dataset. Here is the statement from the official GPT-2 site:
“We have currently released small (124M parameter), medium (355M parameter), and large (774M parameter) versions of GPT-2*, with only the full model as of yet unreleased. We have also released a dataset for researchers to study their behaviors.”
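Since you asked about fine-tuning: here is a minimal sketch of what that looks like in practice. It uses gpt-2-simple, a pip-installable wrapper built around nshepperd’s fine-tuning code (you can also run nshepperd’s train.py directly). The file name my_corpus.txt and the step count are placeholders, not anything from the original repos.

```python
# Minimal fine-tuning sketch using gpt-2-simple (pip install gpt-2-simple),
# a wrapper around nshepperd's fine-tuning code.
# "my_corpus.txt" is a hypothetical plain-text training file.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")   # fetch the released small model

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="my_corpus.txt",  # your own dataset
              model_name="124M",
              steps=1000)               # training steps; tune to your corpus

gpt2.generate(sess)                     # sample text from the fine-tuned model
```

The small 124M model is usually the practical choice for fine-tuning on a single GPU; the larger released checkpoints need considerably more memory.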
Hope this helps!