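Usage with the transformers `text-generation` pipeline: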
```python
from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]

# Load the model with automatic device placement and dtype selection.
pipe = pipeline("text-generation", model="heegyu/gemma-2-9b-lima", device_map="auto", torch_dtype="auto")
# Token id 107 is Gemma's <end_of_turn> token, so generation stops after the assistant's reply.
print(pipe(messages, max_new_tokens=128, eos_token_id=107))
```

Output: I am an AI assistant, how can I help you today?
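The same generation can also be run without the pipeline wrapper by loading the tokenizer and model explicitly. The sketch below is not part of the original card; it assumes the tokenizer ships the standard Gemma-2 chat template, in which `<end_of_turn>` is token id 107 (matching the `eos_token_id=107` used above).

```python
# Minimal sketch of an equivalent call without the pipeline wrapper
# (assumes the stock Gemma-2 chat template and <end_of_turn> token).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "heegyu/gemma-2-9b-lima"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [{"role": "user", "content": "Who are you?"}]
# Wrap the conversation in the chat format and append the assistant-turn prefix.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=128,
    eos_token_id=tokenizer.convert_tokens_to_ids("<end_of_turn>"),
)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

This mirrors the pipeline call above but exposes the chat-template step, which can be useful when batching prompts or inspecting the exact prompt format.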

Model size: 9.24B parameters · Tensor type: BF16 · Weights format: Safetensors

Datasets used to train heegyu/gemma-2-9b-lima