
πŸ‡ΉπŸ‡­ OpenThaiGPT 1.0.0-beta

🇹🇭 OpenThaiGPT Version 1.0.0-beta is a 7B-parameter Thai-language LLaMA v2 Chat model, fine-tuned to follow Thai-translated instructions. More than 24,554 of the most common Thai words have been added to the LLM's vocabulary, substantially speeding up Thai text generation.

LoRA Adapter Format of OpenThaiGPT 1.0.0-beta

Upgrade from OpenThaiGPT 1.0.0-alpha

  • Added more than 24,554 of the most common Thai words to the LLM's vocabulary and re-pretrained the embedding layers, enabling the model to generate Thai text up to 10 times faster than the previous version.
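The speedup comes from tokenization: an LLM emits one token per forward pass, so when whole Thai words are single vocabulary entries, far fewer steps are needed per sentence. The toy sketch below illustrates the idea with a greedy longest-match tokenizer and a two-word vocabulary; it is not the actual OpenThaiGPT tokenizer, which is a SentencePiece-style tokenizer with a much larger merged vocabulary.

```python
# Illustrative sketch only (assumed toy tokenizer, not the real one):
# adding whole-word entries to the vocabulary shrinks the token count,
# and fewer tokens per sentence means fewer generation steps.

def tokenize(text, vocab):
    """Greedy longest-match tokenization, falling back to single characters."""
    tokens = []
    i = 0
    while i < len(text):
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in vocab or length == 1:
                tokens.append(piece)
                i += length
                break
    return tokens

text = "สวัสดีครับ"  # "Hello" in Thai (10 characters)

base_vocab = set()                        # no Thai words: character-level fallback
extended_vocab = {"สวัสดี", "ครับ"}        # whole Thai words added to the dictionary

base = tokenize(text, base_vocab)         # 10 character tokens
extended = tokenize(text, extended_vocab)  # 2 word-level tokens

print(len(base), len(extended))  # → 10 2
```

With the extended vocabulary the same greeting costs 2 generation steps instead of 10, which is the mechanism behind the claimed speedup for Thai text.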

Pretrain Model

Support

License

Source Code: Apache License 2.0.
Weights: free for research and commercial use.

Code and Weight

Web Demo: https://demo-beta.openthaigpt.aieat.or.th/
Colab Demo: https://colab.research.google.com/drive/1NkmAJHItpqu34Tur9wCFc97A6JzKR8xo?usp=sharing
Finetune Code: https://github.com/OpenThaiGPT/openthaigpt-finetune-010beta
Inference Code: https://github.com/OpenThaiGPT/openthaigpt
Weight (LoRA Adapter): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-7b-chat
Weight (Huggingface Checkpoint): https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf
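A minimal usage sketch for the merged Hugging Face checkpoint linked above, assuming the `transformers` library. Because the beta extends the base vocabulary, the merged checkpoint is the simplest entry point. Downloading the 7B weights and having a GPU (or ample RAM) are required, and the Thai prompt here is just an illustrative example, so treat this as a sketch rather than the official inference recipe (see the Inference Code repository for that).

```python
# Hedged sketch: load the merged checkpoint and generate a Thai reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "openthaigpt/openthaigpt-1.0.0-beta-7b-chat-ckpt-hf"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # half precision to fit a 7B model in ~14 GB
    device_map="auto",          # place layers on available GPU/CPU
)

prompt = "สวัสดีครับ ช่วยแนะนำตัวหน่อย"  # "Hello, please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
reply = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(reply)
```

The LoRA-adapter repository instead holds only the adapter deltas; applying them requires a base model whose embeddings have already been resized to the extended vocabulary, which is why the merged checkpoint is shown here.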

Sponsors

Pantip.com, ThaiSC, Promes

Powered by

OpenThaiGPT Volunteers, Artificial Intelligence Entrepreneur Association of Thailand (AIEAT), and Artificial Intelligence Association of Thailand (AIAT)

Authors

Disclaimer: Responses generated by this model are not guaranteed to be accurate.
