Upload folder using huggingface_hub (#1)
- 0b3417ab82c083c3eb7deabcaa0bac1812774f58e5c2f0ffb69845b8ccd28136 (823e3d7445ccb35877c36b0510fe2a8e82c629a8)
- 58c01dd032c4bdf16eaeb0f9d0086ee9cb5d413f19f887dc022a2f6b4eff890f (07cbb667c1e2fdfa28ff6989f774aacbca718ec1)
- 0c602f1ca1edf5d5b7a4b47c1da8343b6ac111579f5953a71a1d6f9cd806c869 (820e52ef2af6d0e02c7287e925e0747281d70e0b)
- 5bd10eb16bde5f63d946d52a63aec28c1dcc01c4c0fdea6a52d17d3367c1bfae (fb4d3a191eb59f2bab3dc5ed1787d78f7dbc9d22)
- eba0747fc33ab9c20549b4ced91209cbe9b612994d9a230632885046272f8706 (3cee1c24dcd17a2fbb63faa5273c5d535ca99078)
- 568598357ea84c71e59a9a4419096423236b574d58f16a4a31bb82312a300fa4 (416f4ff5a02375df1cf1bfd3e0495c3b5f60e09d)
- 82e61277e6e45c57d7a5f794b85a5fa2f004ec5f6909268823f2753d98817cd9 (95ccb4c86e860537b0e7d68f048b4ac1bf3a0485)
- 13c4b88ce036cc11f71273e954be9adb99ef4d3df794246e8a5b17620c31b51f (5d4d3ebea3d10b0a41a8ab1937874e6930a9ff38)
- cded3cf6dcd64a41007651efa15981f53e9023d63df04a1d1d7259aeb8a92e27 (030a216223404f2783bdbba4c0148e31ac1a2d9a)
- bd314fd6c642c92eba845482c4a6b4c0e58cbbac79b13c9dfa9cac41c8b40841 (b1f343dd5d715b8dddce5bb622869639eb90de19)
- 9e383fd4aadc860cae2bc0dbe7ed48a56d445db3353fa1b0fb61dfcdb8a7c022 (d17594cc06c94c3ad3a830f07e108ea7579b3c89)
- .gitattributes +11 -0
- Phi-3-Context-Obedient-RAG.Q2_K.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q3_K_L.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q3_K_M.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q3_K_S.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q4_K_M.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q4_K_S.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q5_K_M.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q5_K_S.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q6_K.gguf +3 -0
- Phi-3-Context-Obedient-RAG.Q8_0.gguf +3 -0
- Phi-3-Context-Obedient-RAG.fp16.gguf +3 -0
- README.md +55 -0
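
The commit title indicates the folder was pushed with `huggingface_hub`. A minimal sketch of that kind of upload (the local folder path and login handling are assumptions, not details taken from this commit):

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes a write token is already configured via `huggingface-cli login`

# Upload the local folder (GGUF files, README.md, .gitattributes) as one commit.
api.upload_folder(
    folder_path="./Phi-3-Context-Obedient-RAG-GGUF",  # hypothetical local path
    repo_id="MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF",
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```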
.gitattributes
@@ -33,3 +33,14 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.fp16.gguf filter=lfs diff=lfs merge=lfs -text
+Phi-3-Context-Obedient-RAG.Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Phi-3-Context-Obedient-RAG.Q2_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ee36c9742dd34dcb04ee5524f3e90ba1f60bd481fe422ba41e03d53d28292298
+size 1416202976
Phi-3-Context-Obedient-RAG.Q3_K_L.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:36be5a75cdf9744521ff42b29c10732a4a4ffc56dea9ebcfcaa4c6674ae02d24
+size 2087596256
Phi-3-Context-Obedient-RAG.Q3_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1d99432ce09346eea60dc39aeeb534e50828845026709b23d0066d0d81d3d9d2
+size 1955475680
Phi-3-Context-Obedient-RAG.Q3_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:02448ae2ee625f896974f48476f99f18f93a4fa6b954c5ab5d88062760be6abf
+size 1681797344
Phi-3-Context-Obedient-RAG.Q4_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9148035c45196c366ca1e5becf1d7b5e132b7bfac129c1cd61ea9867bf9b81bf
+size 2393231072
Phi-3-Context-Obedient-RAG.Q4_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1924d70a54a540fa598891110743c9f029fef875cddf78ff1ff66b7f4aa7a0c5
+size 2188758752
Phi-3-Context-Obedient-RAG.Q5_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5ada5418271b482618857899b6e42e3ec558e172236cea209cf1cd6b1f63784a
+size 2815274720
Phi-3-Context-Obedient-RAG.Q5_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8cbeff49f1ff520551e5e0b9ff9d8b3f17c067cae1d33ee3d4ce66860450a510
+size 2641473248
Phi-3-Context-Obedient-RAG.Q6_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b230f4ccba881e9f4db50c2844ba562cd52a0b65e1cad4f4a41188f7efb21b03
+size 3135851744
Phi-3-Context-Obedient-RAG.Q8_0.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c3f2f525b783dbd86c941cd3f9096cc933c15b15ca83eb50f2db5a72d411262a
+size 4061221088
Phi-3-Context-Obedient-RAG.fp16.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c9671e22eab5ae821d22d48b3de035b6b67d37e2e943988a22b986711d4a5046
+size 7643295904
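
Each hunk above is a Git LFS pointer (a `version` line, the `oid sha256:` of the real file, and its `size` in bytes) rather than the model weights themselves. A minimal sketch for checking a downloaded file against its pointer, using the Q2_K values above (the local path is an assumption):

```python
import hashlib
import os

# Values copied from the Q2_K pointer hunk above.
expected_oid = "ee36c9742dd34dcb04ee5524f3e90ba1f60bd481fe422ba41e03d53d28292298"
expected_size = 1416202976

path = "Phi-3-Context-Obedient-RAG.Q2_K.gguf"  # assumed download location

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert os.path.getsize(path) == expected_size, "size does not match the LFS pointer"
assert sha.hexdigest() == expected_oid, "sha256 does not match the LFS pointer"
print("downloaded file matches the LFS pointer")
```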
README.md
@@ -0,0 +1,55 @@
+---
+tags:
+- quantized
+- 2-bit
+- 3-bit
+- 4-bit
+- 5-bit
+- 6-bit
+- 8-bit
+- GGUF
+- transformers
+- safetensors
+- phi3
+- text-generation
+- conversational
+- custom_code
+- license:cc-by-sa-4.0
+- autotrain_compatible
+- endpoints_compatible
+- region:us
+- text-generation
+model_name: Phi-3-Context-Obedient-RAG-GGUF
+base_model: TroyDoesAI/Phi-3-Context-Obedient-RAG
+inference: false
+model_creator: TroyDoesAI
+pipeline_tag: text-generation
+quantized_by: MaziyarPanahi
+---
+# [MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF](https://huggingface.co/MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF)
+- Model creator: [TroyDoesAI](https://huggingface.co/TroyDoesAI)
+- Original model: [TroyDoesAI/Phi-3-Context-Obedient-RAG](https://huggingface.co/TroyDoesAI/Phi-3-Context-Obedient-RAG)
+
+## Description
+[MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF](https://huggingface.co/MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF) contains GGUF format model files for [TroyDoesAI/Phi-3-Context-Obedient-RAG](https://huggingface.co/TroyDoesAI/Phi-3-Context-Obedient-RAG).
+
+### About GGUF
+
+GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
+
+Here is an incomplete list of clients and libraries that are known to support GGUF:
+
+* [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option.
+* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and an OpenAI-compatible API server.
+* [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
+* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
+* [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
+* [GPT4All](https://gpt4all.io/index.html), a free and open-source locally running GUI, supporting Windows, Linux and macOS with full GPU accel.
+* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection.
+* [Faraday.dev](https://faraday.dev/), an attractive and easy-to-use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
+* [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use.
+* [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and an OpenAI-compatible API server. Note: as of the time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.
+
+## Special thanks
+
+🙏 Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and the whole team working on [llama.cpp](https://github.com/ggerganov/llama.cpp/) for making all of this possible.
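
The README above lists llama.cpp-based tools that can run these GGUF files. One concrete path is a minimal sketch using `huggingface_hub` to fetch a quant and `llama-cpp-python` to load it; the chosen file, context size, GPU offload, and prompt are assumptions, not recommendations from the model card:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the quantized files listed in this commit.
model_path = hf_hub_download(
    repo_id="MaziyarPanahi/Phi-3-Context-Obedient-RAG-GGUF",
    filename="Phi-3-Context-Obedient-RAG.Q4_K_M.gguf",
)

# n_gpu_layers=-1 offloads all layers when a GPU-enabled build of llama-cpp-python is installed.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize the provided context in two sentences."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```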