Linking error with llama_cpp-rs when using vulkan: multiple definitions of symbol #194

Open
FireFragment opened this issue Dec 8, 2024 · 2 comments


When using the vulkan feature on both llama_cpp and whisper-rs from the same crate, the build fails with many error messages similar to this one:

/bin/ld: /home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/deps/libllama_cpp_sys-6172acf8774b4f66.rlib(3487caebe0aa41d4-ggml-vulkan.o):/home/user/.cargo/registry/src/index.crates.io-6f17d22bba15001f/llama_cpp_sys-0.3.2/./thirdparty/llama.cpp/ggml-vulkan-shaders.hpp:69574: multiple definition of `sqr_f32_data'; /home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/deps/libwhisper_rs_sys-a2fb035baa881db4.rlib(ggml-vulkan-shaders.cpp.o):/home/user/Programming/Rust/tests/problem_isolation/vulkan-ai/target/debug/build/whisper-rs-sys-ea85f180824249f0/out/build/ggml/src/ggml-vulkan-shaders.cpp:126068: first defined here
collect2: error: ld returned 1 exit status

Note: See also the similar issue I opened in the llama_cpp-rs repo: edgenai/llama_cpp-rs#95

To reproduce

Beware that just creating a crate that depends on llama_cpp and whisper-rs with their vulkan features enabled isn't enough to reproduce this error; I guess the Rust compiler optimizes the unused linking away, so you need to actually call into both crates from the code.
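For illustration, a minimal main.rs along these lines should be enough to force the linker to pull in both vendored copies of ggml. The exact API calls and feature names below are assumptions from memory and may differ between crate versions; the point is only that something from each crate has to be called.

// Hypothetical minimal reproduction. Assumes Cargo.toml depends on llama_cpp
// and whisper-rs with their respective vulkan features enabled; the API names
// are assumptions and may not match your exact crate versions.
fn main() {
    // whisper-rs: creating a context is enough to pull in its copy of ggml.
    // It will fail at runtime without a real model file, but it has to link.
    let _ = whisper_rs::WhisperContext::new_with_params(
        "model.bin",
        whisper_rs::WhisperContextParameters::default(),
    );

    // llama_cpp: loading a model pulls in the second copy of ggml.
    let _ = llama_cpp::LlamaModel::load_from_file(
        "model.gguf",
        llama_cpp::LlamaParams::default(),
    );
}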

I have made a simple crate to reproduce this error. Build it with

git clone https://github.com/FireFragment/vulkan-ai-linkage-error
cd vulkan-ai-linkage-error
cargo b

If you use Nix, you can reproduce the full environment in which I got the build failure:

git clone https://github.com/FireFragment/vulkan-ai-linkage-error
cd vulkan-ai-linkage-error
nix develop -c cargo b
@thewh1teagle (Contributor) commented Dec 8, 2024

The problem is that both crates link against ggml, which results in duplicate symbols.
You have a couple of options:

  • Link ggml in only one crate (modify the build.rs of one of them), but you will need to make sure that both crates work with the same linked ggml (it should be the same version)
  • Link both, but rename the symbols of one of them before linking (complicated and somewhat inefficient, but it can work; a rough build.rs sketch follows after this list)
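A rough sketch of what option 2 could look like in a -sys crate's build.rs, assuming a GNU toolchain with objcopy on PATH. The archive path and the single renamed symbol are placeholders for illustration; a real build script would derive the full list of colliding ggml symbols (for example via nm) and rename or prefix all of them before emitting the link instructions.

// build.rs sketch (assumptions: the C/C++ build step has already produced
// libggml.a in OUT_DIR, and binutils objcopy is available on PATH).
use std::path::Path;
use std::process::Command;

fn main() {
    let out_dir = std::env::var("OUT_DIR").expect("OUT_DIR not set");
    let archive = Path::new(&out_dir).join("libggml.a");

    // Rename one colliding symbol as an example; a real script would rename
    // every ggml symbol so the two static libraries can coexist in one binary.
    let status = Command::new("objcopy")
        .args(["--redefine-sym", "sqr_f32_data=whisper_sqr_f32_data"])
        .arg(&archive)
        .status()
        .expect("failed to run objcopy");
    assert!(status.success(), "objcopy returned a non-zero exit status");
}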

The Rusty solution for the future is to create a ggml-rs crate that both whisper-rs and llama-rs use. I started working on it a while ago in ggml-rs, but it requires more work.

@FireFragment (Author) commented Dec 17, 2024

Thanks, @thewh1teagle!

Link ggml in only one crate (modify the build.rs of one of them), but you will need to make sure that both crates work with the same linked ggml (it should be the same version)

Yeah, I'm trying that now to see if it works.

Link both, but rename the symbols of one of them before linking (complicated and somewhat inefficient, but it can work)

Interestingly enough, llama_cpp-rs has already been doing this, and it works well; that's the reason the llama and whisper crates can work together at all. The problem is that it doesn't cover some Vulkan-specific code in ggml, so the error appears only if both of them use the vulkan feature.
