I installed the C++ toolset that ships with Visual Studio 2022, and installing llama-cpp-python==0.2.23 succeeded.
However, attempting to install llama-cpp-python==0.1.83 fails with the error below. Any advice would be much appreciated!
pip install llama-cpp-python==0.1.83 --verbose
Using pip 23.3.1 from D:\install\Anaconda3\envs\python310\lib\site-packages\pip (python 3.10)
Collecting llama-cpp-python==0.1.83
Using cached llama_cpp_python-0.1.83.tar.gz (1.8 MB)
Successfully installed cmake-3.28.3 distro-1.9.0 ninja-1.11.1.1 packaging-23.2 scikit-build-0.17.6 setuptools-69.1.1 tomli-2.0.1 wheel-0.42.0
Installing build dependencies ... done
Running command Getting requirements to build wheel
running egg_info
writing llama_cpp_python.egg-info\PKG-INFO
Requirement already satisfied: typing-extensions>=4.5.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (4.10.0)
Requirement already satisfied: numpy>=1.20.0 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (1.26.4)
Requirement already satisfied: diskcache>=5.6.1 in d:\install\anaconda3\envs\python310\lib\site-packages (from llama-cpp-python==0.1.83) (5.6.3)
Building wheels for collected packages: llama-cpp-python
Running command Building wheel for llama-cpp-python (pyproject.toml)
-- Trying 'Ninja (Visual Studio 17 2022 x64 v143)' generator - success
--------------------------------------------------------------------------------
Not searching for unused variables given on the command line.
-- The C compiler identification is MSVC 19.39.33521.0
-- The CXX compiler identification is MSVC 19.39.33521.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: D:/install/vs2022/VC/Tools/MSVC/14.39.33519/bin/Hostx86/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: D:/install/Git/Git/cmd/git.exe (found version "2.31.1.windows.1")
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
CMake Warning at vendor/llama.cpp/CMakeLists.txt:118 (message):
Git repository not found; to enable automatic generation of build info,
make sure Git is installed and the project is a Git repository.
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- x86 detected
-- Configuring done (3.4s)
-- Generating done (0.1s)
-- Build files have been written to: C:/Users/23618/AppData/Local/Temp/pip-install-sxkzepw8/llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754/_skbuild/win-amd64-3.10/cmake-build
[1/11] Building CXX object vendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj
FAILED: vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.obj
D:\install\vs2022\VC\Tools\MSVC\14.39.33519\bin\Hostx86\x64\cl.exe /nologo /TP -DGGML_USE_K_QUANTS -DLLAMA_BUILD -DLLAMA_SHARED -D_CRT_SECURE_NO_WARNINGS -Dllama_EXPORTS -IC:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\. /DWIN32 /D_WINDOWS /EHsc /O2 /Ob2 /DNDEBUG -MD /arch:AVX2 /showIncludes /Fovendor\llama.cpp\CMakeFiles\llama.dir\llama.cpp.obj /Fdvendor\llama.cpp\CMakeFiles\llama.dir\ /FS -c C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\llama.cpp
D:\install\mingw\include\unistd.h(40): warning C4068: unknown pragma 'GCC'
D:\install\mingw\include\msvcrtver.h(35): warning C4068: unknown pragma 'GCC'
D:\install\mingw\include\w32api.h(35): warning C4068: unknown pragma 'GCC'
D:\install\mingw\include\io.h(38): warning C4068: unknown pragma 'GCC'
D:\install\mingw\include\stdint.h(34): warning C4068: unknown pragma 'GCC'
D:\install\mingw\include\io.h(94): warning C4005: 'FILENAME_MAX': macro redefinition
D:\Windows Kits\10\include\10.0.22621.0\ucrt\stdio.h(63): note: see previous definition of 'FILENAME_MAX'
D:\install\mingw\include\io.h(201): warning C4229: anachronism used: modifiers on data are ignored
[2/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml-alloc.c.obj
[3/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\k_quants.c.obj
[4/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\grammar-parser.cpp.obj
[5/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\console.cpp.obj
[6/11] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj
[7/11] Building CXX object vendor\llama.cpp\common\CMakeFiles\common.dir\common.cpp.obj
C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): warning C4477: 'fprintf': format string '%ld' requires an argument of type 'long', but variadic argument 1 has type 'const size_t'
C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\vendor\llama.cpp\common\common.cpp(1008): note: consider using '%zd' in the format string
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\setuptools_wrap.py", line 674, in setup
cmkr.make(make_args, install_target=cmake_install_target, env=env)
File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 697, in make
self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
File "C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\skbuild\cmaker.py", line 742, in make_impl
raise SKBuildError(msg)
An error occurred while building with CMake.
Command:
'C:\Users\23618\AppData\Local\Temp\pip-build-env-guvl62t8\overlay\Lib\site-packages\cmake\data\bin/cmake.exe' --build . --target install --config Release --
Install target:
install
Source directory:
C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
Working directory:
C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754\_skbuild\win-amd64-3.10\cmake-build
Please check the install target is valid and see CMake's output for more information.
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: 'D:\install\Anaconda3\envs\python310\python.exe' 'D:\install\Anaconda3\envs\python310\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py' build_wheel 'C:\Users\23618\AppData\Local\Temp\tmp0juy06rg'
cwd: C:\Users\23618\AppData\Local\Temp\pip-install-sxkzepw8\llama-cpp-python_2b7ae0b70a2844b7924b4ea517d62754
Building wheel for llama-cpp-python (pyproject.toml) ... error
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
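One detail worth noting from the log above: the warnings on `llama.cpp.obj` all come from `D:\install\mingw\include` (`unistd.h`, `io.h`, `stdint.h`, ...), which suggests a MinGW include directory is on the `INCLUDE` path and its headers are being mixed into the MSVC build; this is a common cause of otherwise-unexplained compile failures on Windows. This is only a hypothesis based on the paths in the log, not a confirmed diagnosis. A small sketch (the `find_conflicting_entries` helper is hypothetical, written for illustration) of scanning a semicolon-separated include path for stray MinGW entries, the same check you could run against `os.environ.get("INCLUDE", "")` in the failing environment:

```python
def find_conflicting_entries(include_path: str, marker: str = "mingw") -> list[str]:
    """Return entries of a semicolon-separated path list that contain `marker`."""
    return [p for p in include_path.split(";") if p and marker in p.lower()]


# Example mirroring the directories seen in the build log above.
sample = (
    r"D:\install\mingw\include;"
    r"D:\Windows Kits\10\include\10.0.22621.0\ucrt;"
    r"D:\install\vs2022\VC\Tools\MSVC\14.39.33519\include"
)
print(find_conflicting_entries(sample))  # -> ['D:\\install\\mingw\\include']
```

If such an entry shows up, retrying the install from a clean "x64 Native Tools Command Prompt for VS 2022" (which sets `INCLUDE` itself, without the MinGW directory) may behave differently.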