七十二时plus · 2024-03-03 05:42 · acceptance rate: 33.3%
497 views
Question closed

Installing llama-cpp-python==0.1.83 fails

I installed the C++ toolset from Visual Studio 2022, and installing llama-cpp-python==0.2.23 then succeeded.
However, installing llama-cpp-python==0.1.83 fails with the error below. Any advice would be much appreciated!

pip install llama-cpp-python==0.1.83 --verbose

Using pip 23.3.1 from D:\install\Anaconda3\envs\python310\lib\site-packages\pip (python 3.10)
Collecting llama-cpp-python==0.1.83
  Using cached llama_cpp_python-0.1.83.tar.gz (1.8 MB)
  Successfully installed cmake-3.28.3 distro-1.9.0 ninja-1.11.1.1 packaging-23.2 scikit-build-0.17.6 setuptools-69.1.1 tomli-2.0.1 wheel-0.42.0
  Installing build dependencies ... done
  Running command Getting requirements to build wheel
  running egg_info
  writing llama_cpp_python.egg-info\PKG-INFO

Requirement already satisfied: typing-extensions>=4.5.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (4.10.0)
Requirement already satisfied: numpy>=1.20.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (1.26.4)
Requirement already satisfied: diskcache>=5.6.1 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (5.6.3)
Building wheels for collected packages: llama-cpp-python
  Running command Building wheel for llama-cpp-python (pyproject.toml)

  -- Trying 'Ninja (Visual Studio 17 2022 x64 v143)' generator - success
  --------------------------------------------------------------------------------

  Not searching for unused variables given on the command line.
  -- The C compiler identification is MSVC 19.39.33521.0
  -- The CXX compiler identification is MSVC 19.39.33521.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found Git: D:/install/Git/Git/cmd/git.exe (found version "2.31.1.windows.1")
  fatal: not a git repository (or any of the parent directories): .git
  fatal: not a git repository (or any of the parent directories): .git
  CMake Warning at vendor/llama.cpp/CMakeLists.txt:118 (message):
    Git repository not found; to enable automatic generation of build info,
    make sure Git is installed and the project is a Git repository.


  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - not found
  -- Found Threads: TRUE
  -- CMAKE_SYSTEM_PROCESSOR: AMD64
  -- x86 detected
  -- Configuring done (3.4s)
  -- Generating done (0.1s)
  -- Build files have been written to: C:/Users/23618/AppData/Local/Temp/pip-install-sxkzepw8/llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754/_skbuild/win-amd64-3.10/cmake-build
  [1/11] Building CXX object vendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj
  FAILED: vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.obj
  D:\install\vs2022\VC\Tools\MSVC\14.39.33519\bin\Hostx86\x64\cl.exe  /nologo /TP -DGGML_USE_K_QUANTS -DLLAMA_BUILD -DLLAMA_SHARED -D_CRT_SECURE_NO_WARNINGS -Dllama_EXPORTS -IC:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\. /DWIN32 /D_WINDOWS /EHsc /O2 /Ob2 /DNDEBUG -MD /arch:AVX2 /showIncludes /Fovendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj /Fdvendor\llama.cpp\CMakeFiles\llama.dir\ /FS -c C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\llama.cpp
  D:\install\mingw\include\unistd.h(40): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\msvcrtver.h(35): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\w32api.h(35): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\io.h(38): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\stdint.h(34): warning C4068: unknown pragma 'GCC'
  D:\install\mingw\include\io.h(94): warning C4005: 'FILENAME_MAX': macro redefinition
  D:\Windows Kits\10\include\10.0.22621.0\ucrt\stdio.h(63): note: see previous definition of 'FILENAME_MAX'
  D:\install\mingw\include\io.h(201): warning C4229: anachronism used: modifiers on data are ignored
  [2/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml-alloc.c.obj
  [3/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\k_quants.c.obj
  [4/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\grammar-parser.cpp.obj
  [5/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\console.cpp.obj
  [6/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj
  [7/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\common.cpp.obj
  C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): warning C4477: 'fprintf': format string '%ld' requires an argument of type 'long', but variadic argument 1 has type 'const size_t'
  C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): note: consider using '%zd' in the format string
  ninja: build stopped: subcommand failed.
  Traceback (most recent call last):
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\setuptools_wrap.py", line 674, in setup
      cmkr.make(make_args, install_target=cmake_install_target, env=env)
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 697, in make
      self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
    File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 742, in make_impl
      raise SKBuildError(msg)

  An error occurred while building with CMake.
    Command:
      'C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\cmake\data\bin/cmake.exe' --build . --target install --config Release --
    Install target:
      install
    Source directory:
      C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
    Working directory:
      C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\_skbuild\win-amd64-3.10\cmake-build
  Please check the install target is valid and see CMake's output for more information.

  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.

  note: This error originates from a subprocess, and is likely not a problem with pip.
  full command: 'D:\install\Anaconda3\envs\python310\python.exe' 'D:\install\Anaconda3\envs\python310\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py' build_wheel 'C:\Users\23618\AppData\Local\Temp\tmp0juy06rg'
  cwd: C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects



22 answers (sorted by: newest)

  • 专家-赤兔 [online] · Quality Creator in programming frameworks · 2024-03-03 05:43
    Received a ¥0.45 bounty for this question

    Quoted from the free WeChat mini-program 皆我百晓生:

    In the error output you provided, building llama-cpp-python==0.1.83 fails. Here is an analysis of the problem and some possible steps toward a fix:

    1. Compiler warnings and errors

      • During compilation, the C++ compiler (MSVC) reports several warnings: unknown pragma 'GCC', a format-string mismatch (fprintf uses %ld for a size_t argument), and a macro redefinition (FILENAME_MAX).
      • These warnings do not by themselves fail the build, but they are a sign that the code was not written with MSVC as a primary target.
    2. Where the build actually fails

      • The error occurs while compiling llama.cpp: FAILED: vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.obj
      • The closing ninja: build stopped: subcommand failed. message means the Ninja build system aborted because a subcommand (the compiler invocation) failed.
    3. Possible causes

      • This version of llama-cpp-python may not be adapted to the Visual Studio 2022 C++ toolset, or its Windows support may be incomplete, especially with respect to compiler differences.
      • Dependency or interface changes between versions may have introduced incompatibilities with your particular Python version or other libraries.
      • Note also that the unknown-pragma and FILENAME_MAX warnings all come from D:\install\mingw\include: the MSVC build is pulling in MinGW headers. A MinGW directory on the header search path conflicts with the Windows SDK headers and is a plausible cause of the failed compile.
    4. Suggested fixes

      • Check the llama.cpp official documentation or GitHub repository to see whether this version supports your configuration (Windows, Python 3.10, Visual Studio 2022).
      • Try editing the project's CMakeLists.txt to eliminate the compiler warnings and make sure all compile options and macro definitions are correct.
      • If the project depends on a specific compiler or toolchain version, verify that it is installed and actually picked up by pip's build process.
      • Contact the project maintainers or search the repository's GitHub issue tracker to see whether others have hit the same problem and found a solution.
    5. Fallback

      • You have already installed llama-cpp-python==0.2.23 successfully, so that version works in your environment. If it provides all the functionality you need from 0.1.83, consider staying on it rather than downgrading.
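    If the MinGW headers in the log are indeed leaking into the MSVC build via the INCLUDE environment variable (an assumption; the log does not show where the path comes from), one approach is to filter the MinGW entry out before retrying the install. A minimal sketch; the helper name `strip_mingw_entries` is mine, not part of any tool:

```python
def strip_mingw_entries(include_value: str) -> str:
    """Drop path entries containing 'mingw' from a Windows-style
    semicolon-separated INCLUDE list, keeping everything else."""
    kept = [p for p in include_value.split(";")
            if p and "mingw" not in p.lower()]
    return ";".join(kept)

# Example with paths taken from the build log above:
cleaned = strip_mingw_entries(
    r"D:\install\mingw\include;D:\Windows Kits\10\include\10.0.22621.0\ucrt"
)
print(cleaned)
```

    After cleaning INCLUDE in the shell (for example, `set INCLUDE=...` in cmd), rerun `pip install llama-cpp-python==0.1.83 --verbose` in the same session so the build no longer sees the MinGW headers.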
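    As for the C4477 warning itself: on 64-bit Windows, C `long` stays 32-bit while `size_t` is 64-bit (pointer-sized), which is exactly why `%ld` mismatches a `size_t` argument; the portable C99 conversion is `%zu` (the compiler's note suggests `%zd`, its signed sibling). A quick ctypes check of the underlying sizes:

```python
import ctypes

# size_t is pointer-sized on mainstream platforms, so it always matches
# the size of a void*:
assert ctypes.sizeof(ctypes.c_size_t) == ctypes.sizeof(ctypes.c_void_p)

# On 64-bit Windows, C 'long' stays at 4 bytes while size_t is 8 bytes.
# Passing a size_t where printf's "%ld" expects a long is the mismatch
# MSVC flags as warning C4477.
print("long:", ctypes.sizeof(ctypes.c_long), "bytes;",
      "size_t:", ctypes.sizeof(ctypes.c_size_t), "bytes")
```

    This is only background on the warning; it is unlikely to be the cause of the failed object file, but fixing the format string upstream would silence it.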


Question timeline

  • The system closed the question on March 11
  • The question was created on March 3