Dragneel committed
Commit
d6b7d99
1 Parent(s): 8e44f09

Update README.md

Files changed (1)
  1. README.md +20 -2
README.md CHANGED
@@ -23,6 +23,24 @@ datasets:
  - **License:** apache-2.0
  - **Finetuned from model :** unsloth/Phi-3-mini-4k-instruct-bnb-4bit

- This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
+ # Use The Model

- [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ ## Load the tokenizer and model
+ tokenizer = AutoTokenizer.from_pretrained("Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16")
+ model = AutoModelForCausalLM.from_pretrained("Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16")
+
+ ## Example input text
+ input_text = "Your input text here."
+
+ ## Tokenize the input text
+ input_ids = tokenizer.encode(input_text, return_tensors='pt')
+
+ ## Generate text with adjusted parameters
+ outputs = model.generate(input_ids, max_new_tokens=50)
+
+ ## Decode the generated tokens
+ generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
+
+ print(generated_text)
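
The usage snippet added in this commit feeds raw text straight into `generate`. Since the base checkpoint is an instruct-tuned Phi-3 model, prompting through its chat template may produce better summaries. The sketch below is a minimal variant under two assumptions not stated on the card: that the chat template still applies after fine-tuning, and that a plain "summarize" instruction is the expected prompt.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Assumption: wrap the text in a single user turn with a generic summarization
# instruction; the prompt format used during fine-tuning is not documented here.
messages = [
    {"role": "user", "content": "Summarize the following text:\n" + "Your Nepali text here."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate, then decode only the newly generated tokens (the summary).
outputs = model.generate(input_ids, max_new_tokens=100)
summary = tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(summary)
```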