七十二时plus 2024-03-03 05:42 · Acceptance rate: 33.3%
319 views
Closed

Failed to install llama-cpp-python==0.1.83

I installed the C++ toolset from Visual Studio 2022, and then installed llama-cpp-python==0.2.23 successfully.
However, installing llama-cpp-python==0.1.83 fails with the errors below. Any advice would be much appreciated!

pip install llama-cpp-python==0.1.83 --verbose

Using pip 23.3.1 from D:\install\Anaconda3\envs\python310\lib\site-packages\pip (python 3.10)
Collecting llama-cpp-python==0.1.83
  Using cached llama_cpp_python-0.1.83.tar.gz (1.8 MB)
  Successfully installed cmake-3.28.3 distro-1.9.0 ninja-1.11.1.1 packaging-23.2 scikit-build-0.17.6 setuptools-69.1.1 tomli-2.0.1 wheel-0.42.0
  Installing build dependencies ... done
  Running command Getting requirements to build wheel
  running egg_info
  writing llama_cpp_python.egg-info\PKG-INFO

Requirement already satisfied: typing-extensions>=4.5.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (4.10.0)
Requirement already satisfied: numpy>=1.20.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (1.26.4)
Requirement already satisfied: diskcache>=5.6.1 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (5.6.3)
Building wheels for collected packages: llama-cpp-python
  Running command Building wheel for llama-cpp-python (pyproject.toml)

  -- Trying 'Ninja (Visual Studio 17 2022 x64 v143)' generator - success
  --------------------------------------------------------------------------------

  Not searching for unused variables given on the command line.
  -- The C compiler identification is MSVC 19.39.33521.0
  -- The CXX compiler identification is MSVC 19.39.33521.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found Git: D:/install/Git/Git/cmd/git.exe (found version "2.31.1.windows.1")
  fatal: not a git repository (or any of the parent directories): .git
  fatal: not a git repository (or any of the parent directories): .git
  CMake Warning at vendor/llama.cpp/CMakeLists.txt:118 (message):
    Git repository not found; to enable automatic generation of build info,
    make sure Git is installed and the project is a Git repository.


  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - not found
  -- Found Threads: TRUE
  -- CMAKE_SYSTEM_PROCESSOR: AMD64
  -- x86 detected
  -- Configuring done (3.4s)
  -- Generating done (0.1s)
  -- Build files have been written to: C:/Users/23618/AppData/Local/Temp/pip-install-sxkzepw8/llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754/_skbuild/win-amd64-3.10/cmake-build
  [1/11] Building CXX object vendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj
  FAILED: vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.obj
  D:\install\vs2022\VC\Tools\MSVC\14.39.33519\bin\Hostx86\x64\cl.exe  /nologo /TP -DGGML_USE_K_QUANTS -DLLAMA_BUILD -DLLAMA_SHARED -D_CRT_SECURE_NO_WARNINGS -Dllama_EXPORTS -IC:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\. /DWIN32 /D_WINDOWS /EHsc /O2 /Ob2 /DNDEBUG -MD /arch:AVX2 /showIncludes /Fovendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj /Fdvendor\llama.cpp\CMakeFiles\llama.dir\ /FS -c C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\llama.cpp
  D:\install\mingw\include\unistd.h(40): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\msvcrtver.h(35): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\w32api.h(35): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\io.h(38): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\stdint.h(34): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\io.h(94): warning C4005: 'FILENAME_MAX': macro redefinition
  D:\Windows Kits\10\include\10.0.22621.0\ucrt\stdio.h(63): note: see previous definition of 'FILENAME_MAX'
  D:\install\mingw\include\io.h(201): warning C4229: anachronism used: modifiers on data are ignored
  [2/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml-alloc.c.obj
  [3/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\k_quants.c.obj
  [4/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\grammar-parser.cpp.obj
  [5/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\console.cpp.obj
  [6/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj
  [7/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\common.cpp.obj
  C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): warning C4477: 'fprintf': format string '%ld' requires an argument of type 'long', but variadic argument 1 has type 'const size_t'
  C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): note: consider using '%zd' in the format string
  ninja: build stopped: subcommand failed.
  Traceback (most recent call last):
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\setuptools_wrap.py", line 674, in setup
      cmkr.make(make_args, install_target=cmake_install_target, env=env)
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 697, in make
      self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 742, in make_impl
      raise SKBuildError(msg)

  An error occurred while building with CMake.
    Command:
      'C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\cmake\data\bin/cmake.exe' --build . --target install --config Release --
    Install target:
      install
    Source directory:
      C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
    Working directory:
      C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\_skbuild\win-amd64-3.10\cmake-build
  Please check the install target is valid and see CMake's output for more information.

  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.

  note: This error originates from a subprocess, and is likely not a problem with pip.
  full command: 'D:\install\Anaconda3\envs\python310\python.exe' 'D:\install\Anaconda3\envs\python310\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py' build_wheel 'C:\Users\23618\AppData\Local\Temp\tmp0juy06rg'
  cwd: C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects



22 answers

  • GISer Liu 2024-03-03 05:43
    Awarded a ¥0.45 question bounty

    This answer draws on GPT-3.5 and was composed by blogger GISer Liu:

    Based on the error messages you provided, the problem occurs while building the llama-cpp-python package: the build fails, so no wheel file can be generated. Since this is a package build error, we need to analyze the likely causes in detail and propose solutions.
    First, check the dependencies of llama-cpp-python and whether they are compatible with your current environment. One possible cause is that some libraries this version depends on are incompatible with your Python version or operating system. You can determine the dependencies from the package's documentation or source code and make sure they are available in your environment (a quick way to inspect the source is sketched below).
    Second, check whether the build output contains other errors or warnings. Warnings emitted during the build can hint at underlying problems, such as missing libraries or dependencies.
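
    For that dependency check, here is a minimal sketch of pulling down the source archive and printing its build configuration (hedged: it assumes Windows 10's built-in tar and the standard sdist layout, with the archive named as in the log above):

      :: download only the source archive (no dependencies, no prebuilt wheels) into the current dir
      pip download llama-cpp-python==0.1.83 --no-deps --no-binary :all: -d .
      :: print pyproject.toml to stdout; the inner path assumes the usual <name>-<version>/ top-level dir
      tar -xOzf llama_cpp_python-0.1.83.tar.gz llama_cpp_python-0.1.83/pyproject.toml

    The [build-system] table it prints shows which versions of cmake, ninja, and scikit-build the build pulls in, which is what to compare against your environment.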
    In addition, you can try the following solutions:

    1. Update dependency versions: try newer versions of llama-cpp-python's dependencies; the old release may be incompatible with them.
    2. Review the environment configuration: make sure your current environment matches the package's requirements, including the Python version and the operating system.
    3. Search for known issues: look for existing reports and solutions for llama-cpp-python; other users may have hit the same problem and posted fixes.
    4. Contact the package author: if none of the above helps, report the problem to the author or maintainers of llama-cpp-python and ask for help.
    5. Try a manual build: download the llama-cpp-python source and build it locally so you can inspect the build errors in more detail (see the sketch after this list).
    6. Consider another package: if the problem cannot be solved, consider a package or library with similar functionality as a replacement.
      Finally, if you run into specific errors or warnings while troubleshooting, please share the details so we can help you more precisely.
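
    For step 5, a minimal sketch of a manual source build (hedged: it assumes git, CMake, and the VS2022 x64 toolchain are on PATH, and that the tag name v0.1.83 follows the project's usual v<version> scheme):

      :: clone the bindings together with the vendored llama.cpp submodule
      git clone --recursive https://github.com/abetlen/llama-cpp-python.git
      cd llama-cpp-python
      :: check out the release being debugged and sync its submodule state
      git checkout v0.1.83
      git submodule update --init --recursive
      :: build and install from the local tree, keeping the full CMake/ninja output
      pip install . --verbose

    Building from a real git checkout also avoids the "Git repository not found" CMake warning seen in the log, and leaves the _skbuild directory in place for inspection after a failure.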

    If this answer solves your problem, please accept it!



Question events

  • Closed by the system on Mar 11
  • Question created on Mar 3
