---
license: mit
base_model: openai-community/gpt2-large
---

# Null-GPT2-Large

## Description

This is a GPT2-Large model with only the architecture: none of the pre-trained weights, biases, attention parameters, etc. It is useful for researchers who want to experiment with training the model from scratch (rather than fine-tuning it).

Generated via the GitHub repo [Model Architecture Generator](https://github.com/ivanhe123/Model-Architecture-Generator).

## Use

First, from the directory containing the model, clone the generator repository:

```
git clone https://github.com/ivanhe123/Model-Architecture-Generator
```

Then run the randomization script on the model directory:

```
python -m randomnize_params -in "./Null-GPT2-Large" -out path_model_out
```

`path_model_out` is the output path for the newly randomized model.
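The randomized checkpoint can then be loaded and trained like any other GPT-2 model. Below is a minimal sanity-check sketch, assuming the output was saved in the standard Hugging Face format to a hypothetical path `./gpt2-large-random` and that `transformers` and `torch` are installed:

```
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Hypothetical output path from the randomization step above.
model = GPT2LMHeadModel.from_pretrained("./gpt2-large-random")

# The tokenizer is unaffected by randomization, so the original one can be reused.
tokenizer = GPT2TokenizerFast.from_pretrained("openai-community/gpt2-large")

# Run a forward pass to confirm the architecture works end to end.
inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (1, sequence_length, vocab_size)
```

Since the parameters are random, the outputs are meaningless until the model has been trained.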