
[Bug]: use_beam_search not a supported parameter #15

Closed · blfletcher opened this issue Jan 19, 2025 · 2 comments

Labels
bug Something isn't working

Comments

@blfletcher
Your current environment

https://github.com/OpenBMB/vllm

Model Input Dumps

No response

🐛 Describe the bug

Your documentation suggests using the following SamplingParams:

sampling_params = SamplingParams(
    stop_token_ids=stop_token_ids,
    use_beam_search=True,
    temperature=0,
    best_of=3,
    max_tokens=1024
)

outputs = llm.generate(inputs, sampling_params=sampling_params)

But your forked copy of sampling_params.py does not include use_beam_search as a parameter:

class SamplingParams(

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
blfletcher added the bug (Something isn't working) label on Jan 19, 2025
@HwwwwwwwH (Collaborator) commented Jan 19, 2025

I just checked the vLLM examples. In recent versions of vLLM, beam search has been split out of SamplingParams into a dedicated class. If you want to use beam search, you should use this class:

# vllm/sampling_params.py
class BeamSearchParams(
        msgspec.Struct,
        omit_defaults=True,  # type: ignore[call-arg]
        # required for @cached_property.
        dict=True):  # type: ignore[call-arg]
    """Beam search parameters for text generation."""
    beam_width: int
    max_tokens: int
    ignore_eos: bool = False
    temperature: float = 0.0
    length_penalty: float = 1.0
    include_stop_str_in_output: bool = False
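
For completeness, here is a minimal sketch of how this class is typically driven. The beam_search entry point on LLM and the model name are assumptions based on upstream vLLM examples, not something confirmed in this thread, so check both against the fork you are running. Note that beam_width takes over the role the old best_of played:

from vllm import LLM
from vllm.sampling_params import BeamSearchParams

# Hypothetical model name, used only for illustration.
llm = LLM(model="openbmb/MiniCPM-V-2_6", trust_remote_code=True)

# temperature=0.0 matches the old use_beam_search recipe;
# beam_width=3 corresponds to the old best_of=3.
params = BeamSearchParams(beam_width=3, max_tokens=1024, temperature=0.0)

# Beam search goes through its own entry point rather than llm.generate.
outputs = llm.beam_search([{"prompt": "Describe the image."}], params)
for output in outputs:
    # Beams are returned best-first; print the top candidate.
    print(output.sequences[0].text)

Also note that BeamSearchParams has no stop_token_ids field, so that part of the old recipe has no direct equivalent here.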

@blfletcher (Author) commented

Thank you
