
Implementation doubt #1889

Open · lmbelo opened this issue Jan 7, 2025 · 1 comment

Comments

@lmbelo commented Jan 7, 2025

**** | DUMB QUESTION WARNING | ****

I’ve got a dumb question, but I’m curious: why does this library go for a completely new chat-formatting implementation instead of using LlamaCpp’s standard chat templates (llama_chat_apply_template)? I see it adds a lot of extra features, like function calling, but aren’t most (or at least a lot of) the chat formats already covered in LlamaCpp? Does the same go for grammar and speculative decoding, maybe?
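
For context: a chat template ultimately just renders the role/content message list into one prompt string. A minimal hand-rolled sketch of the ChatML-style rendering that llama_chat_apply_template produces for many models (purely illustrative, not the actual llama.cpp code):

```python
# Illustrative only: roughly what a ChatML chat template expands to.
# llama_chat_apply_template does the equivalent in C, driven by the template
# stored in the model's GGUF metadata.
def render_chatml(messages, add_assistant_prefix=True):
    prompt = ""
    for msg in messages:
        prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    if add_assistant_prefix:
        # Leave the assistant turn open so generation continues from here.
        prompt += "<|im_start|>assistant\n"
    return prompt


print(render_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]))
```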

@sriramsowmithri9807 commented

Great question! While LlamaCpp’s llama_chat_apply_template covers a lot of the basics, this library offers more flexibility with features like function calling, grammar constraints, and speculative sampling, which aren’t fully addressed by the standard templates. Think of it as a multi-tool for more complex use cases.

Cheers,
sriram.
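
To make the contrast concrete, here is a rough sketch of the kind of request that extra layer supports, which a plain chat template cannot express on its own because the tool-schema injection, constrained decoding, and output parsing all happen around the template. Names follow llama-cpp-python’s documented high-level API, assuming that is the project in question; the model path and tool definition are placeholders, not anything from this thread:

```python
# A rough sketch of a tool-calling chat request through the high-level API.
# Assumes llama-cpp-python; the model path and the get_weather tool are
# hypothetical placeholders for illustration.
from llama_cpp import Llama

llm = Llama(
    model_path="./model.gguf",              # placeholder path
    chat_format="chatml-function-calling",  # chat handler that adds tool-call support
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a function-calling assistant."},
        {"role": "user", "content": "What's the weather in Berlin?"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)

print(response["choices"][0]["message"])
```

A bare llama_chat_apply_template call would only hand back the formatted prompt string; constraining the generation to the tool’s argument schema and parsing the call back out of the completion is the part the custom chat handlers add on top.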
