Bug 8066 - [ModelScope][Container image] The image is missing dependencies required by transformers, causing the modelscope community's built-in test cases to fail.
Summary: [ModelScope][Container image] The image is missing dependencies required by transformers, causing the modelscope community's built-in test cases to fail.
Status: NEW
Alias: None
Product: Anolis OS 8
Classification: Anolis OS
Component: Images&Installations
Version: 8.8
Hardware: All
OS: Linux
Importance: P3-Medium S3-normal
Target Milestone: ---
Assignee: zhongling
QA Contact: shuming
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2024-01-26 18:16 UTC by yunmeng365524
Modified: 2024-01-29 15:09 UTC
CC List: 1 user

See Also:


Attachments

Description yunmeng365524 2024-01-26 18:16:00 UTC
Description of problem:
The image is missing dependencies required by transformers, so the modelscope community's built-in test cases fail to run.

Version-Release number of selected component (if applicable):
Container image information:
[root@localhost modelscope]# docker images
REPOSITORY                                     TAG          IMAGE ID       CREATED       SIZE
registry.openanolis.cn/openanolis/modelscope   1.10.0-an8   736efcaf9a7c   5 weeks ago   25.3GB


How reproducible:
Create and start the container with GPU support:
docker create --gpus all -it -v /tmp:/tmp 736efcaf9a7c
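(Note: docker create only creates the container; a minimal sketch of starting and entering it is shown below. The container ID is the one printed by docker create, and /bin/bash is an assumed shell path for this image.)
docker start <container-id>
docker exec -it <container-id> /bin/bash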
After entering the container, clone the modelscope community code:
git clone https://github.com/modelscope/modelscope.git
cd modelscope

Run the test case:
[root@1a186991ac23 modelscope]# python3 /tmp/modelscope/tests/utils/test_hf_util.py
2024-01-26 17:01:32,381 - modelscope - INFO - PyTorch version 2.0.1+cu118 Found.
2024-01-26 17:01:32,385 - modelscope - INFO - TensorFlow version 2.9.2 Found.
2024-01-26 17:01:32,385 - modelscope - INFO - Loading ast index from /root/.cache/modelscope/ast_indexer
2024-01-26 17:01:32,522 - modelscope - INFO - Loading done! Current index file version is 1.10.0, with md5 da292646adab6334a8f0cd8a272bf9b1 and a total number of 946 components indexed
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1382, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/opt/conda/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/conda/lib/python3.8/site-packages/transformers/models/auto/image_processing_auto.py", line 26, in <module>
    from ...image_processing_utils import ImageProcessingMixin
  File "/opt/conda/lib/python3.8/site-packages/transformers/image_processing_utils.py", line 28, in <module>
    from .image_transforms import center_crop, normalize, rescale
  File "/opt/conda/lib/python3.8/site-packages/transformers/image_transforms.py", line 47, in <module>
    import tensorflow as tf
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/__init__.py", line 45, in <module>
    from tensorflow.python.feature_column import feature_column_lib as feature_column
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/feature_column/feature_column_lib.py", line 18, in <module>
    from tensorflow.python.feature_column.feature_column import *
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/feature_column/feature_column.py", line 143, in <module>
    from tensorflow.python.layers import base
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/layers/base.py", line 16, in <module>
    from tensorflow.python.keras.legacy_tf_layers import base
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/keras/__init__.py", line 25, in <module>
    from tensorflow.python.keras import models
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/keras/models.py", line 22, in <module>
    from tensorflow.python.keras.engine import functional
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/keras/engine/functional.py", line 32, in <module>
    from tensorflow.python.keras.engine import training as training_lib
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/keras/engine/training.py", line 52, in <module>
    from tensorflow.python.keras.saving import hdf5_format
  File "/opt/conda/lib/python3.8/site-packages/tensorflow/python/keras/saving/hdf5_format.py", line 37, in <module>
    import h5py
  File "/opt/conda/lib/python3.8/site-packages/h5py/__init__.py", line 46, in <module>
    from ._conv import register_converters as _register_converters
  File "h5py/h5t.pxd", line 14, in init h5py._conv
  File "h5py/h5t.pyx", line 293, in init h5py.h5t
  File "/opt/conda/lib/python3.8/site-packages/numpy/__init__.py", line 320, in __getattr__
    raise AttributeError("module {!r} has no attribute "
AttributeError: module 'numpy' has no attribute 'typeDict'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/modelscope/utils/import_utils.py", line 451, in _get_module
    return importlib.import_module('.' + module_name, self.__name__)
  File "/opt/conda/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/conda/lib/python3.8/site-packages/modelscope/utils/hf_util.py", line 5, in <module>
    from transformers import AutoImageProcessor as AutoImageProcessorHF
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/opt/conda/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1373, in __getattr__
    value = getattr(module, name)
  File "/opt/conda/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1372, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/conda/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1384, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.image_processing_auto because of the following error (look up to see its traceback):
module 'numpy' has no attribute 'typeDict'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/tmp/modelscope/tests/utils/test_hf_util.py", line 7, in <module>
    from modelscope import (AutoConfig, AutoModel, AutoModelForCausalLM,
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/opt/conda/lib/python3.8/site-packages/modelscope/utils/import_utils.py", line 434, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/conda/lib/python3.8/site-packages/modelscope/utils/import_utils.py", line 453, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import modelscope.utils.hf_util because of the following error (look up to see its traceback):
Failed to import transformers.models.auto.image_processing_auto because of the following error (look up to see its traceback):
module 'numpy' has no attribute 'typeDict'
[root@1a186991ac23 modelscope]#
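
The traceback bottoms out in h5py, which still references np.typeDict, a deprecated alias of np.sctypeDict that was removed in NumPy 1.24. In other words, the NumPy shipped in the image is newer than the h5py pulled in via TensorFlow/transformers can handle. As a rough check, and a possible but unverified workaround, one could compare the installed versions inside the Anolis container and pin NumPy below 1.24 (or upgrade/rebuild h5py against the installed NumPy). The commands below are a sketch that assumes pip is available in the image's /opt/conda environment:
[root@1a186991ac23 modelscope]# pip3 list 2>/dev/null | grep -iE 'numpy|h5py|tensorflow|transformers|torch'
[root@1a186991ac23 modelscope]# pip3 install 'numpy<1.24'    # assumption: NumPy releases before 1.24 still provide the typeDict alias this h5py build expects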


Steps to Reproduce:
Same as in "How reproducible" above.

Actual results:
The test case fails.

Expected results:
The test case passes.

Additional info:
For comparison, the result of running the same test in the official modelscope Ubuntu image:
root@ee8906738fa2:/tmp/modelscope# python3 /tmp/modelscope/tests/utils/test_hf_util.py
2024-01-26 17:01:03,767 - modelscope - INFO - PyTorch version 2.1.0+cu118 Found.
2024-01-26 17:01:03,769 - modelscope - INFO - TensorFlow version 2.14.0 Found.
2024-01-26 17:01:03,769 - modelscope - INFO - Loading ast index from /mnt/workspace/.cache/modelscope/ast_indexer
2024-01-26 17:01:03,815 - modelscope - INFO - Loading done! Current index file version is 1.10.0, with md5 44f0b88effe82ceea94a98cf99709694 and a total number of 946 components indexed
2024-01-26 17:01:04.113310: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-01-26 17:01:04.113353: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-01-26 17:01:04.113385: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-01-26 17:01:04.123578: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2024-01-26 17:01:05.247654: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
2024-01-26 17:01:06,136 - modelscope - INFO - Use user-specified model revision: v1.0.3
Downloading: 100%|███████████████████████████████████████████| 154k/154k [00:00<00:00, 7.60MB/s]
Downloading: 100%|███████████████████████████████████████████| 183k/183k [00:00<00:00, 8.68MB/s]
Downloading: 100%|███████████████████████████████████████████| 142k/142k [00:00<00:00, 7.83MB/s]
Downloading: 100%|███████████████████████████████████████████| 196k/196k [00:00<00:00, 9.04MB/s]
Downloading: 100%|█████████████████████████████████████████████| 773/773 [00:00<00:00, 9.37MB/s]
Downloading: 100%|█████████████████████████████████████████████| 215/215 [00:00<00:00, 2.63MB/s]
Downloading: 100%|█████████████████████████████████████████| 1.46k/1.46k [00:00<00:00, 14.0MB/s]
Downloading: 100%|█████████████████████████████████████████████| 284/284 [00:00<00:00, 3.48MB/s]
Downloading: 100%|█████████████████████████████████████████| 20.7k/20.7k [00:00<00:00, 3.37MB/s]
Downloading: 100%|█████████████████████████████████████████| 3.16k/3.16k [00:00<00:00, 39.9MB/s]
Downloading: 100%|█████████████████████████████████████████| 20.6k/20.6k [00:00<00:00, 3.76MB/s]
Downloading: 100%|█████████████████████████████████████████| 13.9k/13.9k [00:00<00:00, 2.27MB/s]
Downloading: 100%|████████████████████████████████████████████| 72.0/72.0 [00:00<00:00, 694kB/s]
Downloading: 100%|█████████████████████████████████████████| 8.52k/8.52k [00:00<00:00, 44.1MB/s]
Downloading: 100%|█████████████████████████████████████████| 2.00M/2.00M [00:00<00:00, 35.8MB/s]
Downloading: 100%|█████████████████████████████████████████████| 819/819 [00:00<00:00, 7.58MB/s]
2024-01-26 17:01:09,426 - modelscope - INFO - Use user-specified model revision: v1.0.3
.2024-01-26 17:01:10,110 - modelscope - WARNING - Model revision not specified, use revision: v1.0.7
Downloading: 100%|███████████████████████████████████████████| 757k/757k [00:00<00:00, 22.9MB/s]
Downloading: 100%|█████████████████████████████████████████████| 656/656 [00:00<00:00, 7.66MB/s]
Downloading: 100%|█████████████████████████████████████████████| 216/216 [00:00<00:00, 2.63MB/s]
Downloading: 100%|█████████████████████████████████████████| 2.29k/2.29k [00:00<00:00, 28.1MB/s]
Downloading: 100%|█████████████████████████████████████████████| 132/132 [00:00<00:00, 1.56MB/s]
Downloading: 100%|███████████████████████████████████████████| 171k/171k [00:00<00:00, 6.37MB/s]
Downloading: 100%|█████████████████████████████████████████| 27.9k/27.9k [00:00<00:00, 3.89MB/s]
Downloading: 100%|█████████████████████████████████████████| 2.74k/2.74k [00:00<00:00, 34.3MB/s]
Downloading: 100%|████████████████████████████████████████▉| 13.0G/13.0G [19:08<00:00, 12.2MB/s]
Downloading: 100%|█████████████████████████████████████████| 1.13k/1.13k [00:00<00:00, 8.87MB/s]
Downloading: 100%|█████████████████████████████████████████| 12.1k/12.1k [00:00<00:00, 1.91MB/s]
Downloading: 100%|█████████████████████████████████████████████| 411/411 [00:00<00:00, 4.61MB/s]
Downloading: 100%|█████████████████████████████████████████| 9.35k/9.35k [00:00<00:00, 50.3MB/s]
Downloading: 100%|█████████████████████████████████████████| 1.08M/1.08M [00:00<00:00, 12.7MB/s]
Downloading: 100%|█████████████████████████████████████████████| 802/802 [00:00<00:00, 8.90MB/s]
Downloading: 100%|███████████████████████████████████████████| 149k/149k [00:00<00:00, 8.03MB/s]
.2024-01-26 17:22:19,228 - modelscope - INFO - Use user-specified model revision: v1.0.3
Downloading: 100%|█████████████████████████████████████████████| 716/716 [00:00<00:00, 7.22MB/s]
Downloading: 100%|█████████████████████████████████████████████| 215/215 [00:00<00:00, 2.44MB/s]
Downloading: 100%|█████████████████████████████████████████| 2.39k/2.39k [00:00<00:00, 29.0MB/s]
Downloading: 100%|█████████████████████████████████████████████| 285/285 [00:00<00:00, 3.43MB/s]
Downloading: 100%|█████████████████████████████████████████| 2.90k/2.90k [00:00<00:00, 34.7MB/s]
Downloading: 100%|█████████████████████████████████████████| 32.3k/32.3k [00:00<00:00, 2.57MB/s]
Downloading: 100%|█████████████████████████████████████████| 3.28k/3.28k [00:00<00:00, 40.9MB/s]
Downloading: 100%|█████████████████████████████████████████| 8.93k/8.93k [00:00<00:00, 52.5MB/s]
Downloading: 100%|█████████████████████████████████████████| 9.24k/9.24k [00:00<00:00, 54.7MB/s]
Downloading: 100%|█████████████████████████████████████████████| 548/548 [00:00<00:00, 6.47MB/s]
Downloading: 100%|█████████████████████████████████████████| 9.39k/9.39k [00:00<00:00, 55.7MB/s]
Downloading: 100%|█████████████████████████████████████████| 1.91M/1.91M [00:00<00:00, 7.47MB/s]
Downloading: 100%|█████████████████████████████████████████████| 795/795 [00:00<00:00, 9.58MB/s]
..2024-01-26 17:22:22,969 - modelscope - INFO - Use user-specified model revision: v1.0.1
Downloading: 100%|█████████████████████████████████████████| 21.7M/21.7M [00:01<00:00, 16.2MB/s]
Downloading: 100%|█████████████████████████████████████████████| 427/427 [00:00<00:00, 3.92MB/s]
Downloading: 100%|█████████████████████████████████████████████| 302/302 [00:00<00:00, 3.40MB/s]
Downloading: 100%|█████████████████████████████████████████| 1.13k/1.13k [00:00<00:00, 13.6MB/s]
Downloading: 100%|█████████████████████████████████████████████| 124/124 [00:00<00:00, 1.48MB/s]
Downloading: 100%|█████████████████████████████████████████| 10.4k/10.4k [00:00<00:00, 59.0MB/s]
Downloading: 100%|█████████████████████████████████████████| 8.35k/8.35k [00:00<00:00, 49.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 2.00/2.00 [00:00<00:00, 24.3kB/s]
Downloading: 100%|███████████████████████████████████████████| 488k/488k [00:00<00:00, 12.1MB/s]
Downloading: 100%|█████████████████████████████████████████████| 141/141 [00:00<00:00, 1.70MB/s]
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
2024-01-26 17:22:27,876 - modelscope - INFO - Use user-specified model revision: v1.0.1
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.9MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.8MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:31<00:00, 12.7MB/s]
Downloading: 100%|███████████████████████████████████████████| 386M/386M [00:32<00:00, 12.6MB/s]
Downloading: 100%|██████████████████████████████████████████▉| 500M/500M [00:42<00:00, 12.4MB/s]
Downloading: 100%|█████████████████████████████████████████| 24.9k/24.9k [00:00<00:00, 3.24MB/s]
Loading checkpoint shards: 100%|████████████████████████████████| 33/33 [00:18<00:00,  1.79it/s]
.
----------------------------------------------------------------------
Ran 5 tests in 2471.450s

OK
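
Note on the comparison: the two logs already show a dependency gap (PyTorch 2.0.1+cu118 / TensorFlow 2.9.2 in the Anolis image versus PyTorch 2.1.0+cu118 / TensorFlow 2.14.0 in the official Ubuntu image). A quick way to capture the full difference, sketched here on the assumption that pip is on PATH in both containers (the file names are purely illustrative), is to dump and diff the package lists:
[root@1a186991ac23 modelscope]# pip3 freeze > /tmp/anolis-packages.txt
root@ee8906738fa2:/tmp/modelscope# pip3 freeze > /tmp/ubuntu-packages.txt
(copy both files to one machine, then)
diff /tmp/anolis-packages.txt /tmp/ubuntu-packages.txt | grep -iE 'numpy|h5py|tensorflow|transformers|torch'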