---
license: cc-by-sa-4.0
metrics:
- accuracy
pipeline_tag: text-generation
tags:
- code
---

A capable language model for text-to-SQL generation for Postgres, Redshift, and Snowflake that is on par with the most capable generalist frontier models.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/603bbad3fd770a9997b57cb6/h52Z_OKYBaDDQMFZyU5pF.png)

## Model Description

- **Developed by:** Defog, Inc
- **Model type:** Text to SQL
- **License:** CC-BY-SA-4.0
- **Finetuned from model:** [Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)

## defog/llama-3-sqlcoder-8b for CTranslate2

**This model is a quantized version of [defog/llama-3-sqlcoder-8b](https://huggingface.co/defog/llama-3-sqlcoder-8b), converted with int8_float16 quantization for use with [CTranslate2](https://github.com/OpenNMT/CTranslate2).**
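
For reference, a CTranslate2 checkpoint like this is typically produced with CTranslate2's Transformers converter. Below is a minimal sketch of that conversion, assuming you want to reproduce it yourself; the output directory name is illustrative, not necessarily the one used for this repository.

```python
import ctranslate2

# Convert the original Hugging Face checkpoint to the CTranslate2 format,
# quantizing the weights to int8_float16 (illustrative output path).
converter = ctranslate2.converters.TransformersConverter("defog/llama-3-sqlcoder-8b")
converter.convert("llama-3-sqlcoder-8b-ct2-int8_float16", quantization="int8_float16")
```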



## How to use

Install the required packages (the example below also uses `transformers` and `huggingface_hub`):

```
pip install ctranslate2 transformers huggingface_hub
```

This repository is intended for use with [CTranslate2](https://github.com/OpenNMT/CTranslate2).

### Use with CTranslate2

This example code is adapted from the [CTranslate2 Transformers guide](https://opennmt.net/CTranslate2/guides/transformers.html#mpt) and the [Transformers `AutoTokenizer` documentation](https://huggingface.co/docs/transformers/main_classes/tokenizer).
More detailed information about the `generate_batch` method can be found in the [CTranslate2 `Generator.generate_batch` documentation](https://opennmt.net/CTranslate2/python/ctranslate2.Generator.html#ctranslate2.Generator.generate_batch).

```python
import ctranslate2
import transformers
from huggingface_hub import snapshot_download

# Download the quantized CTranslate2 model files and load them.
model_id = "ByteForge/Defog_llama-3-sqlcoder-8b-ct2-int8_float16"
model_path = snapshot_download(model_id)
model = ctranslate2.Generator(model_path)  # pass device="cuda" to run on GPU
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id)

prompt="""
CREATE TABLE stadium (
    stadium_id number,
    location text,
    name text,
    capacity number,
    highest number,
    lowest number,
    average number
)

CREATE TABLE singer (
    singer_id number,
    name text,
    country text,
    song_name text,
    song_release_year text,
    age number,
    is_male others
)

CREATE TABLE concert (
    concert_id number,
    concert_name text,
    theme text,
    stadium_id text,
    year text
)

CREATE TABLE singer_in_concert (
    concert_id number,
    singer_id text
)

-- Using valid SQLite, answer the following questions for the tables provided above.

-- What is the maximum, the average, and the minimum capacity of stadiums? (Generate one SQL query. No explanation needed.)

answer:
"""

messages = [
    {"role": "system", "content": "You are SQL Expert. Given a input question and schema, answer with correct sql query"},
    {"role": "user", "content": prompt},
]

# Build the chat-formatted prompt as plain text (tokenize=False returns a string).
prompt_text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

# Stop generation at either the EOS token or Llama 3's end-of-turn token.
terminators = [
    tokenizer.eos_token_id,
    tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

# CTranslate2 expects token strings; the chat template already added the
# special tokens, so do not add them again when encoding.
input_tokens = tokenizer.convert_ids_to_tokens(
    tokenizer.encode(prompt_text, add_special_tokens=False)
)

# Generate the completion; the prompt tokens are excluded from the result.
results = model.generate_batch(
    [input_tokens],
    include_prompt_in_result=False,
    max_length=256,
    sampling_temperature=0.6,
    sampling_topp=0.9,
    end_token=terminators,
)
output = tokenizer.decode(results[0].sequences_ids[0])

print(output)
```
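
Note that the chat template is applied with `tokenize=False` and the resulting text is then re-tokenized into token strings, because CTranslate2's `generate_batch` consumes sequences of token strings rather than token IDs.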

## Ideal prompt and inference parameters
For best results, set the temperature to 0 and do not use sampling (i.e. use greedy decoding).
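
With CTranslate2 this corresponds to greedy search, which is what `generate_batch` does by default when `sampling_topk` is left at 1. A minimal sketch, reusing `model`, `tokenizer`, `input_tokens`, and `terminators` from the example above:

```python
# Greedy decoding (no sampling): with sampling_topk=1 the most likely token
# is always selected, so the output is deterministic.
results = model.generate_batch(
    [input_tokens],
    include_prompt_in_result=False,
    max_length=256,
    sampling_topk=1,
    end_token=terminators,
)
print(tokenizer.decode(results[0].sequences_ids[0]))
```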

## Evaluation
This model was evaluated on SQL-Eval, a PostgreSQL-based evaluation framework developed by Defog for testing and aligning model capabilities.

You can read more about the methodology behind SQL-Eval [here](https://defog.ai/blog/open-sourcing-sqleval/).