When using the vulkan feature on both llama_cpp and whisper-rs from the same crate, the build fails with many error messages similar to this one:
/bin/ld: /home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/deps/libllama_cpp_sys-6172acf8774b4f66.rlib(3487caebe0aa41d4-ggml-vulkan.o):/home/user/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llama_cpp_sys-0.3.2/./thirdparty/llama.cpp/ggml-vulkan-shaders.hpp:69574: multiple definition of `sqr_f32_data'; /home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/deps/libwhisper_rs_sys-a2fb035baa881db4.rlib(ggml-vulkan-shaders.cpp.o):/home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/build/whisper-rs-sys-ea85f180824249f0/out/build/ggml/src/ggml-vulkan-shaders.cpp:126068: first defined here
collect2: error: ld returned 1 exit status
Beware that just creating a crate with dependencies on llama_cpp and whisper-rs with the vulkan feature isn't enough to reproduce this error, because (I guess) the Rust compiler optimizes the linking away, so you need to actually use both crates in the code.
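The dependency setup that triggers the failure looks roughly like this (a sketch; the version numbers are illustrative, not taken from the issue):

```toml
[dependencies]
# Both crates enable their vulkan feature, so both pull in a
# Vulkan-enabled build of ggml.
llama_cpp = { version = "0.3", features = ["vulkan"] }
whisper-rs = { version = "0.11", features = ["vulkan"] }
```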
The problem is that both crates link to ggml, and that results in duplicate symbols.
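The failure can be reproduced in miniature without Rust at all: two translation units each providing an initialized definition of the same global produce the same class of linker error (the symbol name is borrowed from the error message above):

```shell
# Two objects define the same initialized global...
cat > a.c <<'EOF'
int sqr_f32_data[2] = {1, 2};  /* stands in for the ggml shader data */
int main(void) { return 0; }
EOF
cat > b.c <<'EOF'
int sqr_f32_data[2] = {3, 4};
EOF
cc -c a.c
cc -c b.c
# ...so the final link fails with a multiple-definition error.
if ! cc a.o b.o -o demo 2>link.log; then
    grep -Ei 'multiple definition|duplicate symbol' link.log
fi
```

This is exactly what happens when the linker sees the ggml objects bundled into both rlibs.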
There are a few options:
Link ggml in only one crate (modify the build.rs of one crate), but you will need to make sure that both crates can work with the same linked ggml (it should be the same version)
Link both, but rename the symbols of one of them before linking (complicated and somewhat inefficient, but it can work)
The Rusty solution for the future is to create a ggml-rs crate that both whisper-rs and llama-rs use. I started working on it a while ago in ggml-rs, but it requires more work
Link ggml in only one crate (modify the build.rs of one crate), but you will need to make sure that both crates can work with the same linked ggml (it should be the same version)
Yeah, I'm trying that to see if it works
Link both, but rename the symbols of one of them before linking (complicated and somewhat inefficient, but it can work)
Interestingly enough, llama_cpp-rs has already been doing this, and it works well; that's the reason the llama and whisper crates can work together at all. The problem is that it doesn't work for some Vulkan-specific code in ggml, so the error appears only when both of them use Vulkan.
To reproduce

I have made a simple crate to reproduce this error. Build it with:
git clone https://github.com/FireFragment/vulkan-ai-linkage-error
cd vulkan-ai-linkage-error
cargo b
If you use Nix, you can reproduce the full environment in which I got the build failure:
git clone https://github.com/FireFragment/vulkan-ai-linkage-error
cd vulkan-ai-linkage-error
nix develop -c cargo b