text2light

Following the instructions at https://github.com/FrozenBurning/Text2Light, running the command below produces the following error.



(text2light) D:\biye\Text2Light-master>python text2light.py -rg logs/global_sampler_clip -rl logs/local_sampler_outdoor --outdir ./generated_panorama --text "brown wooden hallway during night time" --clip clip_emb.npy --sritmo ./logs/sritmo.pth --sr_factor 4
Resuming from global sampler ckpt...
logdir:logs/global_sampler_clip
logs/global_sampler_clip\checkpoints\last.ckpt
Deleting the first-stage restore-ckpt path from the config...
Working with z of shape (1, 256, 16, 16) = 65536 dimensions.
Working with z of shape (1, 256, 16, 16) = 65536 dimensions.
Traceback (most recent call last):
  File "text2light.py", line 296, in <module>
    global_sampler = load_model(config, ckpt, gpu, eval_mode)
  File "text2light.py", line 259, in load_model
    model = load_model_from_config(config.model, state_dict, gpu=gpu, eval_mode=eval_mode)["model"]
  File "text2light.py", line 242, in load_model_from_config
    model = instantiate_from_config(config)
  File "D:\biye\Text2Light-master\taming\util.py", line 28, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "D:\biye\Text2Light-master\taming\models\global_sampler.py", line 127, in __init__
    super().__init__(transformer_config, first_stage_config, cond_stage_config, permuter_config, ckpt_path, ignore_keys, first_stage_key, cond_stage_key, downsample_cond_size, pkeep, sos_token, unconditional)
  File "D:\biye\Text2Light-master\taming\models\base_sampler.py", line 39, in __init__
    self.transformer = instantiate_from_config(config=transformer_config)
  File "D:\biye\Text2Light-master\taming\util.py", line 28, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "D:\biye\Text2Light-master\taming\util.py", line 23, in get_obj_from_str
    return getattr(importlib.import_module(module, package=None), cls)
  File "C:\Users\Admin\.conda\envs\text2light\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "", line 1014, in _gcd_import
  File "", line 991, in _find_and_load
  File "", line 975, in _find_and_load_unlocked
  File "", line 671, in _load_unlocked
  File "", line 843, in exec_module
  File "", line 219, in _call_with_frames_removed
  File "D:\biye\Text2Light-master\taming\modules\transformer\mingpt.py", line 17, in <module>
    from transformers import top_k_top_p_filtering
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\transformers\__init__.py", line 43, in <module>
    from . import dependency_versions_check
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\transformers\dependency_versions_check.py", line 41, in <module>
    require_version_core(deps[pkg])
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\transformers\utils\versions.py", line 94, in require_version_core
    return require_version(requirement, hint)
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\transformers\utils\versions.py", line 85, in require_version
    if want_ver is not None and not ops[op](version.parse(got_ver), version.parse(want_ver)):
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\packaging\version.py", line 52, in parse
    return Version(version)
  File "C:\Users\Admin\.conda\envs\text2light\lib\site-packages\packaging\version.py", line 197, in __init__
    raise InvalidVersion(f"Invalid version: '{version}'")
packaging.version.InvalidVersion: Invalid version: '0.10.1,<0.11'


  • Judging from the traceback, the program fails while importing the transformers module, before any Text2Light code actually runs. The final error, packaging.version.InvalidVersion: Invalid version: '0.10.1,<0.11', is raised inside transformers' dependency version check: the combined requirement string that this transformers release stores for one of its dependencies (here '>=0.10.1,<0.11', most likely the tokenizers pin) cannot be parsed by the packaging library installed in the environment. The first snippet after this answer reproduces the failing call in isolation.

  • Upgrading or downgrading the transformers module may resolve this. For example, pip install transformers==4.4.2 pins it to 4.4.2, or pip install transformers==4.14.0 moves it to a newer release. After reinstalling, the second snippet at the end of this answer can be used to verify which versions actually ended up in the environment.

  • If the error persists after changing transformers, another dependency is likely conflicting with it. Since the exception is raised inside packaging itself, the installed packaging release is the most likely culprit (packaging 22 and later dropped support for loose version strings like this one); pinning it with pip install "packaging<22" is worth trying before digging further.
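
To confirm the diagnosis in isolation, the sketch below reproduces the failing call outside of transformers. It assumes the installed packaging release is 22 or newer (older releases fell back to LegacyVersion and only emitted a deprecation warning); the '0.10.1,<0.11' string is copied straight from the error message.

```python
# Reproduce the failing parse outside of transformers: older transformers
# releases store combined pins such as "tokenizers>=0.10.1,<0.11" and pass
# everything after the first operator to packaging.version.parse().
from packaging import version

try:
    version.parse("0.10.1,<0.11")
except version.InvalidVersion as exc:  # raised by packaging >= 22
    print(f"packaging cannot parse the pin: {exc}")
else:
    print("parsed fine - the installed packaging still accepts loose version strings")
```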
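
After any pip upgrade or downgrade, the standard-library check below (Python 3.8+) prints what is actually installed in the text2light environment. tokenizers is included on the assumption that it is the dependency whose pin triggers the parse error.

```python
# Print the installed versions of the packages involved in the traceback,
# so the effect of a pip upgrade/downgrade can be verified immediately.
from importlib.metadata import PackageNotFoundError, version

for name in ("transformers", "tokenizers", "packaging"):
    try:
        print(f"{name}: {version(name)}")
    except PackageNotFoundError:
        print(f"{name}: not installed")
```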