Ng Wai Foong
1 min read · Sep 10, 2019


Hi Jason Guo,

No, there are altogether four different versions of the model, distinguished by their number of parameters. nshepperd's version allows us to fine-tune the model on our own dataset. The following is the statement from the official GPT-2 repository:

“We have currently released small (124M parameter), medium (355M parameter), and large (774M parameter) versions of GPT-2, with only the full model as of yet unreleased. We have also released a dataset for researchers to study their behaviors.”
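If you want to try fine-tuning yourself, the rough workflow with nshepperd's fork looks like the sketch below. This is a minimal sketch, assuming the usual layout of that repository (download_model.py and train.py at the top level, with the library code under src/); the dataset path is a placeholder for your own text file.

```python
import os
import subprocess

MODEL = "124M"             # also released: 355M and 774M; the full model was still unreleased
DATASET = "my_corpus.txt"  # placeholder: path to your own training text

# Fetch the pretrained weights for the chosen model size.
subprocess.run(["python", "download_model.py", MODEL], check=True)

# Fine-tune on your own dataset; train.py writes checkpoints as training runs.
subprocess.run(
    ["python", "train.py", "--dataset", DATASET, "--model_name", MODEL],
    env={**os.environ, "PYTHONPATH": "src"},
    check=True,
)
```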

Hope it helps you!
