Revise backend to use llama-cpp-python instead of transformers #45

package-release (cuda, windows-latest, cuda.txt) — succeeded Dec 3, 2024 in 10m 19s