Maani committed on
Commit 17fefd3
1 Parent(s): 5c99029

Update README.md

Files changed (1)
  1. README.md +9 -6
README.md CHANGED
@@ -2,6 +2,10 @@
  language:
  - en
  - es
+ - fr
+ - fa
+ - hi
+ - th
  library_name: transformers
  license: apache-2.0
  pipeline_tag: text-generation
@@ -15,12 +19,12 @@ pipeline_tag: text-generation

  Excelling in writing.

- A fine tune of H2O Danube 1.8b on HQ DATA™ from Pixie Zehir.
+ A fine tune of LLAMA 3.2 1B on HQ DATA™ from Pixie Zehir.

  ## Model Details

  - **Developed by:** [Maani x BLNKBLK]
- - **Language(s) (NLP):** [English, Spanish]
+ - **Language(s) (NLP):** [English, Spanish, French, Persian, Hindi, Thai]
  - **License:** [Apache 2.0]
  - **Finetuned from model :** [h2oai/h2o-danube-1.8b-chat]

@@ -32,7 +36,7 @@ Model is created for research purposes, it can and will hallucinate, use with ca
  ## Usage

  ```bash
- pip install transformers==4.36.1
+ pip install transformers==4.45.0
  ```
  ```python
  import torch
@@ -43,8 +47,7 @@ pipe = pipeline(
  torch_dtype=torch.bfloat16,
  device_map="auto",
  )
- # We use the HF Tokenizer chat template to format each message
- # https://huggingface.co/docs/transformers/main/en/chat_templating
+
  messages = [
  {"role": "user", "content": "Write a haiku."},
  ]
@@ -59,4 +62,4 @@ res = pipe(
  )
  print(res[0]["generated_text"])
  # <|prompt|>Write a haiku.</s><|answer|> In the windowless room, Digital dreams consume, Unseen sun sets on a white rabbit's ears: [...]
- ```
+ ```
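
For reference, a minimal, self-contained sketch of the usage snippet the updated README describes, assembled from the context lines in the diff above. The model repository id below is a placeholder (the diff does not show the actual Hub id), and the `apply_chat_template` step is an assumption based on the removed chat-templating comment and the `<|prompt|>`/`<|answer|>` output format:

```python
import torch
from transformers import pipeline

# Placeholder repo id -- the actual Hub id is not visible in this diff.
MODEL_ID = "Maani/your-model-id"

pipe = pipeline(
    "text-generation",
    model=MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write a haiku."},
]

# Format the chat messages with the tokenizer's chat template
# (assumed, mirroring the snippet this README is based on).
prompt = pipe.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

res = pipe(
    prompt,
    max_new_tokens=256,  # assumed generation budget; adjust as needed
)
print(res[0]["generated_text"])
```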