---
license: mit
base_model: openai-community/gpt2-large
---

# Null-GPT2-Large

## Description

This is a GPT-2 Large model with only the architecture: no pre-trained weights, biases, attention parameters, etc.

This is useful for researchers who want to experiment with training the model from scratch (not fine-tuning).

Generated via the GitHub repo [Model Architecture Generator](https://github.com/ivanhe123/Model-Architecture-Generator).
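
For context, a randomly initialized GPT-2 Large can also be built directly from its configuration with the `transformers` library. This is only a minimal sketch of what "architecture only, no pre-trained weights" means; it is an alternative to the generator repo used for this checkpoint, not the method that produced it:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Load only the GPT-2 Large configuration (layer count, hidden size, etc.),
# not the pre-trained weights.
config = GPT2Config.from_pretrained("openai-community/gpt2-large")

# Instantiating from the config alone yields randomly initialized parameters,
# which is the same idea as this null checkpoint.
model = GPT2LMHeadModel(config)
print(sum(p.numel() for p in model.parameters()))  # roughly 774M parameters
```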

## Use

First, clone the generator repo and run the randomization script from the directory containing the model:

```bash
git clone https://github.com/ivanhe123/Model-Architecture-Generator
python -m randomnize_params -in "./Null-GPT2-Large" -out path_model_out
```

`path_model_out` is simply the output path of the newly randomized model.
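
Assuming the script writes a standard Hugging Face model directory to `path_model_out` (an assumption about the generator's output format, not something verified here), the randomized model can then be loaded and trained from scratch, for example:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical output directory produced by the command above.
path_model_out = "./Null-GPT2-Large-randomized"

# The tokenizer is unchanged from GPT-2; only the weights are random.
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2-large")
model = AutoModelForCausalLM.from_pretrained(path_model_out)

# Sanity check: before any training, the cross-entropy loss should sit
# near ln(vocab_size) ≈ 10.8, confirming the weights really are random.
inputs = tokenizer("Hello world", return_tensors="pt")
outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)
```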