---
library_name: transformers
tags:
- code
license: llama3
language:
- en
pipeline_tag: text-generation
---

Download the model from Hugging Face: https://huggingface.co/Milkyway-islander/Movie_Reivews_Llama-3-8B/tree/main
Model ID: `Milkyway-islander/Movie_Reivews_Llama-3-8B`
- Input: models take text input only.
- Output: models generate text and code only.
This model is fine-tuned on 1,500 movie reviews from the IMDB movie review dataset. It aims to generate highly human-like movie reviews. This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
## Model Details

### Model Description

- Developed by: Amber Zhan
- Funded by [optional]: [More Information Needed]
- Shared by [optional]: [More Information Needed]
- Model type: Text Generation
- Language(s) (NLP): English
- License: llama3
- Finetuned from model [optional]: Llama-3-8B

### Model Sources

- Repository: [More Information Needed]
- Paper [optional]: [More Information Needed]
- Demo [optional]: [More Information Needed]
You can run conversational inference by loading the model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Milkyway-islander/Movie_Reivews_Llama-3-8B"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    # quantization_config=quantization_config,  # optional: pass a BitsAndBytesConfig here
    attn_implementation="flash_attention_2",
)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"

prompt_text = ""  # your prompt here

inputs = tokenizer(
    prompt_text,
    return_tensors="pt",
    padding=True,
    truncation=True,
    max_length=4096,
).to("cuda")
num_input_tokens = inputs["input_ids"].shape[1]  # prompt length, used to strip it from the output

output = model.generate(
    **inputs,  # passes input_ids and attention_mask together
    max_length=4096 + num_input_tokens,  # adjust max_length to account for prompt tokens
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens, skipping the echoed prompt
response = tokenizer.decode(output[0][num_input_tokens:], skip_special_tokens=True)
print(response)
```
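If GPU memory is tight, the optional `quantization_config` mentioned in the loading code can be supplied via `BitsAndBytesConfig`. A minimal sketch, assuming the `bitsandbytes` package is installed; the 4-bit settings shown are illustrative choices, not values specified by this card:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Illustrative 4-bit quantization settings (assumed, not prescribed by this model card)
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "Milkyway-islander/Movie_Reivews_Llama-3-8B",
    device_map="auto",
    quantization_config=quantization_config,
)
```

Loading in 4-bit roughly quarters the weight memory of the 8B model at some cost in generation quality.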
## Uses

[More Information Needed]

## Bias, Risks, and Limitations

[More Information Needed]

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model

Use the code above to get started with the model.
## Training Details

### Training Data

[More Information Needed]

### Training Procedure

- Training regime: [More Information Needed]

## Evaluation

[More Information Needed]
## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- Hardware Type: [More Information Needed]
- Hours used: [More Information Needed]
- Cloud Provider: [More Information Needed]
- Compute Region: [More Information Needed]
- Carbon Emitted: [More Information Needed]
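Until the values above are filled in, a rough estimate can be computed by hand with the same relation the calculator uses: energy drawn by the hardware, scaled by datacenter overhead, times the grid's carbon intensity. A minimal sketch; the 0.35 kW draw, PUE of 1.1, and 0.4 kg CO2eq/kWh intensity are illustrative assumptions, not measurements for this model:

```python
def estimate_co2_kg(power_kw: float, hours: float, pue: float,
                    intensity_kg_per_kwh: float) -> float:
    """Emissions estimate: energy used (kWh) x datacenter overhead (PUE)
    x grid carbon intensity (kg CO2eq per kWh)."""
    return power_kw * hours * pue * intensity_kg_per_kwh

# e.g. one GPU drawing ~0.35 kW for 24 h, PUE 1.1, grid at 0.4 kg CO2eq/kWh
print(round(estimate_co2_kg(0.35, 24, 1.1, 0.4), 3))  # -> 3.696
```

Plugging in the actual hardware type, hours, and compute region from the list above would replace these placeholder numbers.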
## Technical Specifications

[More Information Needed]
## Citation

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]