Tags: GGUF, code, Inference Endpoints, imatrix, conversational
CISCai committed
Commit 6341fa7
1 parent: cd4f4cd
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -86,7 +86,7 @@ Refer to the Provided Files table below to see what files use which methods, and
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ2_XXS.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ2_XXS.gguf) | IQ2_XXS | 2 | 5.1 GB| 6.1 GB | very small, high quality loss |
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ2_XS.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ2_XS.gguf) | IQ2_XS | 2 | 5.4 GB| 6.4 GB | very small, high quality loss |
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ2_S.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ2_S.gguf) | IQ2_S | 2 | 5.4 GB| 6.4 GB | small, substantial quality loss |
- | [DeepSeek-Coder-V2-Lite-Instruct.IQ2_M.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ2_M.gguf) | IQ2_M | 2 | 5.7 GB| 5.7 GB | small, greater quality loss |
+ | [DeepSeek-Coder-V2-Lite-Instruct.IQ2_M.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ2_M.gguf) | IQ2_M | 2 | 5.7 GB| 6.7 GB | small, greater quality loss |
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ3_XXS.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ3_XXS.gguf) | IQ3_XXS | 3 | 6.3 GB| 7.3 GB | very small, high quality loss |
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ3_XS.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ3_XS.gguf) | IQ3_XS | 3 | 6.5 GB| 7.5 GB | small, substantial quality loss |
  | [DeepSeek-Coder-V2-Lite-Instruct.IQ3_S.gguf](https://huggingface.co/CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct.IQ3_S.gguf) | IQ3_S | 3 | 6.8 GB| 7.8 GB | small, greater quality loss |
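The changed row only corrects the second size figure for the IQ2_M quant (6.7 GB instead of 5.7 GB), which in tables of this layout is typically the estimated maximum RAM required rather than the file size. As a minimal, illustrative sketch (not part of this commit), one way to fetch a listed quant is with the `huggingface_hub` Python client; the repo id and filename below are taken from the URLs in the table, and the choice of IQ2_M is an assumption for the example.

```python
# Minimal sketch: download one of the GGUF quants listed in the table.
# Repo id and filename come from the table URLs above; IQ2_M is just an example.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="CISCai/DeepSeek-Coder-V2-Lite-Instruct-SOTA-GGUF",
    filename="DeepSeek-Coder-V2-Lite-Instruct.IQ2_M.gguf",
)
print(path)  # local cache path of the ~5.7 GB GGUF file
```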