
PackageNotFoundError: No package metadata was found for bitsandbytes

#2 opened by Adamsadx

I just copied the template and ran it, but I get the error below. Please help me.

from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("./falcon-mamba-7b-instruct-4bit", device_map="auto")


PackageNotFoundError                      Traceback (most recent call last)
Cell In[4], line 1
----> 1 model = AutoModelForCausalLM.from_pretrained("./falcon-mamba-7b-instruct-4bit", device_map="auto")

File c:\Users\76551\.conda\envs\mamba\lib\site-packages\transformers\models\auto\auto_factory.py:564, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    562 elif type(config) in cls._model_mapping.keys():
    563     model_class = _get_model_class(config, cls._model_mapping)
--> 564     return model_class.from_pretrained(
    565         pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    566     )
    567 raise ValueError(
    568     f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
    569     f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."
    570 )

File c:\Users\76551\.conda\envs\mamba\lib\site-packages\transformers\modeling_utils.py:3389, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, *model_args, **kwargs)
   3387 if pre_quantized or quantization_config is not None:
   3388     if pre_quantized:
-> 3389         config.quantization_config = AutoHfQuantizer.merge_quantization_configs(
   3390             config.quantization_config, quantization_config
   3391         )
   3392     else:
   3393         config.quantization_config = quantization_config
...
    546     return dist
    547 else:
--> 548     raise PackageNotFoundError(name)

PackageNotFoundError: No package metadata was found for bitsandbytes
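What the traceback shows: the checkpoint is pre-quantized, so from_pretrained() reads the quantization_config stored in the checkpoint's config.json and tries to set up a bitsandbytes quantizer; the PackageNotFoundError comes from importlib.metadata because the bitsandbytes package is not installed in the environment. A minimal sketch to confirm this on disk, assuming the checkpoint follows the standard transformers layout:

import json

# Pre-quantized checkpoints carry their quantization settings in config.json;
# transformers reads this block and requires bitsandbytes before loading any weights.
with open("./falcon-mamba-7b-instruct-4bit/config.json") as f:
    config = json.load(f)

# For a 4-bit bitsandbytes checkpoint this typically includes
# "quant_method": "bitsandbytes" and "load_in_4bit": true.
print(config.get("quantization_config"))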

Technology Innovation Institute org

Hi @Adamsadx,
You need to run this model on a GPU that is compatible with the bitsandbytes library, as mentioned in the model card. If you have one, make sure to run pip install bitsandbytes before trying to load the model.
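A minimal sketch of the whole fix, assuming a CUDA-capable GPU and the local checkpoint path from the question; run pip install bitsandbytes (and accelerate, which device_map="auto" relies on) in the same environment first:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# bitsandbytes 4-bit kernels need a CUDA GPU, so fail early if none is visible.
assert torch.cuda.is_available(), "bitsandbytes 4-bit loading requires a CUDA GPU"

tokenizer = AutoTokenizer.from_pretrained("./falcon-mamba-7b-instruct-4bit")

# The checkpoint is already quantized, so its saved quantization_config is
# picked up automatically; no extra quantization arguments are needed here.
model = AutoModelForCausalLM.from_pretrained(
    "./falcon-mamba-7b-instruct-4bit",
    device_map="auto",
)

# Quick smoke test to confirm the model loads and generates.
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))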
