---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mistralai/Mistral-7B-Instruct-v0.2
- cognitivecomputations/dolphin-2.8-mistral-7b-v02
- mlx
base_model:
- mistralai/Mistral-7B-Instruct-v0.2
- cognitivecomputations/dolphin-2.8-mistral-7b-v02
---

# mlx-community/pandafish-dt-7b-4bit

This model was converted to MLX format from [`ichigoberry/pandafish-2-7b-32k`](https://huggingface.co/ichigoberry/pandafish-2-7b-32k) using mlx-lm version **0.6.0**.
Refer to the [original model card](https://huggingface.co/ichigoberry/pandafish-2-7b-32k) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/pandafish-dt-7b-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
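
Because the underlying merge is built on Mistral-7B-Instruct, prompts can also be wrapped in the tokenizer's chat template before generation. The snippet below is a minimal sketch, assuming the converted tokenizer ships the upstream chat template; the message content is illustrative.

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/pandafish-dt-7b-4bit")

# Wrap the user message in the model's chat template (if the tokenizer provides one)
messages = [{"role": "user", "content": "Write a haiku about the ocean."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```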