Getting ModuleNotFoundError: No module named 'transformers_modules.moss-moon-003-sft-int4.custom_autotune'

#4
by karfly - opened

I get this error while executing model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, trust_remote_code=True).half().cuda()

Full traceback:

Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/app/basaran/__main__.py", line 40, in <module>
    stream_model = load_model(
  File "/app/basaran/model.py", line 342, in load_model
    model = AutoModelForCausalLM.from_pretrained(name_or_path, trust_remote_code=True).half().cuda()
  File "/usr/local/lib/python3.8/dist-packages/transformers/models/auto/auto_factory.py", line 466, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py", line 2629, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/moss-moon-003-sft-int4/modeling_moss.py", line 608, in __init__
    self.quantize(config.wbits, config.groupsize)
  File "/root/.cache/huggingface/modules/transformers_modules/moss-moon-003-sft-int4/modeling_moss.py", line 732, in quantize
    from .quantization import quantize_with_gptq
  File "/root/.cache/huggingface/modules/transformers_modules/moss-moon-003-sft-int4/quantization.py", line 8, in <module>
    from .custom_autotune import *
ModuleNotFoundError: No module named 'transformers_modules.moss-moon-003-sft-int4.custom_autotune'

How can I solve it?

You could try adding the Hugging Face modules cache to sys.path, so the dynamically downloaded model code can find custom_autotune:

import sys
sys.path.append('/root/.cache/huggingface/modules')
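A minimal sketch of this workaround, assuming the cache directory from the traceback above (adjust the path if your cache lives elsewhere, e.g. under your own home directory):

```python
import sys

# Path taken from the traceback; the dynamically fetched model code
# (modeling_moss.py, quantization.py, custom_autotune.py) is cached here.
MODULES_DIR = "/root/.cache/huggingface/modules"

# Make the cached module directory importable before loading the model,
# so the relative import of custom_autotune can be resolved.
if MODULES_DIR not in sys.path:
    sys.path.append(MODULES_DIR)

# Then load the model as before (requires transformers and a CUDA device):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, trust_remote_code=True).half().cuda()
```

This must run before from_pretrained is called; also make sure custom_autotune.py actually exists in the cached moss-moon-003-sft-int4 directory, since the error can also mean the file was never downloaded.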
