[Symptom]
Running the evaluation in Colab fails with the error below.
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
    r = _request_wrapper(
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
    response = _request_wrapper(
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py", line 296, in hf_raise_for_status
    raise EntryNotFoundError(message, response) from e
huggingface_hub.utils._errors.EntryNotFoundError: 404 Client Error. (Request ID: Root=1-65d2f58b-6bbe66047d0d87c019aa16ba;557e17ef-bc31-490e-bd8b-52d4921f268b)
Entry Not Found for url: https://huggingface.co/cashbook/SOLAR-Platypus-10.7B-v1-kjw/resolve/main/config.json.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/lm_eval", line 8, in <module>
    sys.exit(cli_evaluate())
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/__main__.py", line 279, in cli_evaluate
    results = evaluator.simple_evaluate(
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/utils.py", line 288, in _wrapper
    return fn(*args, **kwargs)
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/evaluator.py", line 123, in simple_evaluate
    lm = lm_eval.api.registry.get_model(model).create_from_arg_string(
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/api/model.py", line 134, in create_from_arg_string
    return cls(**args, **args2)
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/models/huggingface.py", line 187, in __init__
    self._get_config(
  File "/content/drive/MyDrive/FastCampus-LLM/lm-evaluation-harness/lm_eval/models/huggingface.py", line 444, in _get_config
    self._config = transformers.AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 1048, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 622, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 677, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py", line 481, in cached_file
    raise EnvironmentError(
OSError: cashbook/SOLAR-Platypus-10.7B-v1-kjw does not appear to have a file named config.json. Checkout 'https://huggingface.co/cashbook/SOLAR-Platypus-10.7B-v1-kjw/main' for available files.
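Whether the file is really absent from the repo can be confirmed from the Hub itself. A minimal sketch, assuming the huggingface_hub package is installed; `list_repo_files` is its real API, while `missing_required_files` is a hypothetical helper name of mine:

```python
def missing_required_files(files, required=("config.json",)):
    """Return the required file names that are absent from a repo file list."""
    present = set(files)
    return [f for f in required if f not in present]

# Usage (requires network access and huggingface_hub):
# from huggingface_hub import list_repo_files
# files = list_repo_files("cashbook/SOLAR-Platypus-10.7B-v1-kjw")
# print(missing_required_files(files))  # nonempty if config.json is missing
```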
[Solution]
The repo was missing config.json. Hugging Face itself treats a model repo without this file as if no model were uploaded. Of the files needed for the model upload, only config.json had been left out, so downloading the copy that was on Google Drive and manually uploading just that one file resolved the error.
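For reference, that manual upload can also be done from code instead of the web UI. A minimal sketch, assuming huggingface_hub is installed and you are authenticated with a write token; `HfApi.upload_file` is its real API, the helper name and the local path are hypothetical:

```python
def upload_missing_config(repo_id, local_config_path):
    """Push a single config.json into an existing Hub model repo."""
    from huggingface_hub import HfApi  # imported here: needs network and auth
    HfApi().upload_file(
        path_or_fileobj=local_config_path,
        path_in_repo="config.json",
        repo_id=repo_id,
        repo_type="model",
    )

# Usage (requires a write token, e.g. via `huggingface-cli login`):
# upload_missing_config("cashbook/SOLAR-Platypus-10.7B-v1-kjw", "./config.json")
```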