
GPT-NoSleep-1.5b

This is the largest release of GPT-NoSleep, a version of GPT2-XL finetuned on the 'reddit-nosleep-posts' dataset. Smaller releases include:

And the accompanying prompt generator can be found here:

Training Procedure

This model was trained on the 'reddit-nosleep-posts' dataset on Google Colab, for 2 epochs with a learning rate of 1e-2. Special thanks to Skyler for helping to train a model this large!
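The run described above can be sketched with the Hugging Face Trainer API. This is a hedged reconstruction, not the author's actual script: the dataset id, text column name, and sequence length are assumptions; only the 2 epochs and 1e-2 learning rate come from the card.

```python
# Hyperparameters stated in the model card (epochs, learning rate); fp16 is
# an assumption inferred from the published tensor type.
HYPERPARAMS = {"num_train_epochs": 2, "learning_rate": 1e-2, "fp16": True}

if __name__ == "__main__":
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
    tokenizer.pad_token = tokenizer.eos_token  # GPT2 has no pad token by default
    model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

    # Assumed dataset id and column name; the card only names
    # 'reddit-nosleep-posts' without a full path.
    ds = load_dataset("reddit-nosleep-posts")["train"]

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = ds.map(tokenize, batched=True, remove_columns=ds.column_names)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="gpt-nosleep-1.5b", **HYPERPARAMS),
        train_dataset=tokenized,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
```

Note that 1e-2 is an unusually high learning rate for finetuning a model of this size; the value is reproduced here exactly as the card states it.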

Biases & Limitations

This model likely carries the same biases and limitations as the original GPT2 it is based on, plus heavy biases from the dataset. Since its purpose is to generate horror stories, its output is not suitable for all audiences.

Intended Use

This model is meant for fun, nothing else.
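For anyone who does want to play with it, a minimal inference sketch using the transformers text-generation pipeline is below. The model id comes from this card; the prompt and sampling settings are invented examples (the card's companion prompt generator would presumably produce better prompts).

```python
# Model id as published on this card.
MODEL_ID = "DarwinAnim8or/GPT-NoSleep-1.5b"

if __name__ == "__main__":
    from transformers import pipeline

    # Downloads ~3 GB of weights on first run.
    generator = pipeline("text-generation", model=MODEL_ID)
    story = generator(
        "I should never have answered the knock at 3 AM.",  # example prompt
        max_new_tokens=200,
        do_sample=True,
        temperature=0.9,
    )[0]["generated_text"]
    print(story)
```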

Model size: 1.61B params (Safetensors)
Tensor types: FP16, BOOL
