
Passing runtime value args to a tool created by subclassing BaseTool: tool.invoke doesn't see the runtime arg even when provided #29412

Open
4 tasks done
shruthiR-fauna opened this issue Jan 24, 2025 · 0 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: core Related to langchain-core

Comments

@shruthiR-fauna

Discussed in #29411

Originally posted by shruthiR-fauna January 24, 2025

Checked other resources

  • I added a very descriptive title to this question.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.

Commit to Help

  • I commit to help with one of those options 👆

Example Code

from typing import Annotated, Any, Dict, List, Type

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import chain
from langchain_core.tools import BaseTool, InjectedToolArg
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field


class RobotDogStateManager:
    # Simulated state management for the robot dog
    def __init__(self):
        self.battery_level = 100
        self.current_location = ""


class GoToObjectInput(BaseModel):
    """
    Input for going to an object, visible to the model
    """

    object: str = Field(description="The object to go to")


class GoToObjectTool(BaseTool):
    name: str = "go_to_object"
    description: str = "Instruct the robot dog to go to a specific object"
    args_schema: Type[BaseModel] = GoToObjectInput

    def _run(
        self,
        object: str,
        # Use InjectedToolArg for arguments the model shouldn't see
        robot_state: Annotated[RobotDogStateManager, InjectedToolArg],
        battery_threshold: Annotated[float, InjectedToolArg] = 20.0,
    ) -> str:
        """
        Go to an object, with additional context not visible to the model
        """
        # Check battery level before proceeding
        if robot_state.battery_level < battery_threshold:
            return (
                f"Cannot go to {object}. Battery too low: {robot_state.battery_level}%"
            )

        # Update robot's location
        robot_state.current_location = object
        return f"Successfully went to {object}"


# Example usage
def main():
    # Create a shared state manager
    robot_state = RobotDogStateManager()
    robot_state.battery_level = 10

    # Create the tool
    go_to_object_tool = GoToObjectTool()

    # Input and output schemas for the tool
    print(f"Input schema for tool: {go_to_object_tool.get_input_schema().schema()}")
    
    # Try to invoke the tool
    go_to_object_tool.invoke({'object': 'kitchen', 'robot_state': robot_state})


if __name__ == "__main__":
    main()

Description

I'm trying to use the LangChain library to build an agent that serves as the control system for a robot dog; the code above is a minimal toy version of my setup.
I've created a go_to_object tool that can be bound to any LLM that supports tool calling. Two of the tool's arguments can only be supplied at runtime, so they are annotated with InjectedToolArg in the _run method.
However, when I call this tool with go_to_object_tool.invoke and supply all of the required arguments, I see the following error:
*** TypeError: GoToObjectTool._run() missing 1 required positional argument: 'robot_state'
I've spent a lot of time digging through the API and the docs, and I believe this exposes a bug in how tool_args and tool_kwargs are parsed.
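The error is consistent with the tool-input parsing step dropping any key that does not appear in args_schema before _run is called: the schema printed above exposes only object, so robot_state never reaches _run. A minimal sketch of that filtering behavior, assuming the input dict is validated against the model-visible schema (parse_input below is a hypothetical stand-in, not langchain-core's actual code):

```python
def parse_input(tool_input: dict, schema_fields: set) -> dict:
    # Keys absent from the model-visible schema are silently discarded,
    # so injected runtime arguments never reach _run().
    return {k: v for k, v in tool_input.items() if k in schema_fields}


schema_fields = {"object"}  # GoToObjectInput exposes only `object`
parsed = parse_input(
    {"object": "kitchen", "robot_state": "<state manager>"}, schema_fields
)
print(parsed)  # {'object': 'kitchen'} -- robot_state has been dropped
```

If the filtering works this way, _run is then called with only object, which would produce exactly the "missing 1 required positional argument: 'robot_state'" TypeError reported above.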

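One possible workaround, assuming the tool subclass can hold per-instance state: carry the runtime state on the tool object instead of passing it as an injected call argument, so invoke only needs the model-visible arguments. A langchain-free sketch of the pattern (GoToObjectTool here is a plain-Python stand-in for the BaseTool subclass, not the real LangChain class):

```python
class RobotDogStateManager:
    # Simulated state management for the robot dog
    def __init__(self):
        self.battery_level = 100
        self.current_location = ""


class GoToObjectTool:
    """Plain-Python stand-in for the BaseTool subclass: the runtime
    state lives on the instance, so invoke() only needs the
    model-visible arguments."""

    def __init__(self, robot_state, battery_threshold=20.0):
        self.robot_state = robot_state
        self.battery_threshold = battery_threshold

    def invoke(self, args: dict) -> str:
        # Only model-visible arguments are passed through to _run().
        return self._run(**args)

    def _run(self, object: str) -> str:
        if self.robot_state.battery_level < self.battery_threshold:
            return (
                f"Cannot go to {object}. "
                f"Battery too low: {self.robot_state.battery_level}%"
            )
        self.robot_state.current_location = object
        return f"Successfully went to {object}"


state = RobotDogStateManager()
state.battery_level = 10
tool = GoToObjectTool(state)
print(tool.invoke({"object": "kitchen"}))  # Cannot go to kitchen. Battery too low: 10%
```

The same shape should translate to a real BaseTool subclass by declaring robot_state and battery_threshold as model fields on the tool rather than parameters of _run, though that is untested against this langchain-core version.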
System Info

System Information

OS: Linux
OS Version: #52~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Dec 9 15:00:52 UTC 2
Python Version: 3.10.12 (main, Jan 17 2025, 14:35:34) [GCC 11.4.0]

Package Information

langchain_core: 0.3.10
langchain: 0.3.3
langchain_community: 0.3.2
langsmith: 0.1.133
langchain_fireworks: 0.2.1
langchain_google_genai: 2.0.1
langchain_groq: 0.2.0
langchain_ollama: 0.2.0
langchain_openai: 0.2.2
langchain_text_splitters: 0.3.0

Optional packages not installed

langgraph
langserve

Other Dependencies

aiohttp: 3.10.9
async-timeout: 4.0.3
dataclasses-json: 0.6.7
fireworks-ai: 0.15.4
google-generativeai: 0.8.3
groq: 0.11.0
httpx: 0.27.2
jsonpatch: 1.33
numpy: 1.26.4
ollama: 0.3.3
openai: 1.51.2
orjson: 3.10.7
packaging: 24.1
pillow: 11.0.0
pydantic: 2.9.2
pydantic-settings: 2.5.2
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.35
tenacity: 8.5.0
tiktoken: 0.8.0
typing-extensions: 4.12.2

@dosubot dosubot bot added Ɑ: core Related to langchain-core 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Jan 24, 2025