feat: Support Box AI features #877

Merged · 4 commits · Jul 8, 2024
76 changes: 76 additions & 0 deletions boxsdk/client/client.py
@@ -1692,3 +1692,79 @@ def get_sign_template(
session=self._session,
response_object=response.json(),
)

@api_call
def send_ai_question(
self,
items: Iterable,
prompt: str,
mode: Optional[str] = None
) -> Any:
"""
Sends an AI request to supported LLMs and returns an answer specifically focused on the user's
question given the provided context.

:param items:
The items to be processed by the LLM, often files.
:param prompt:
The prompt provided by the client to be answered by the LLM.
The prompt's length is limited to 10000 characters.
:param mode:
The mode specifies whether this request is for a single item or multiple items.
If you select single_item_qa, the items array can contain only one element.
Selecting multiple_item_qa allows you to provide up to 25 items.

Value is one of `multiple_item_qa`, `single_item_qa`
:returns:
A response including the answer from the LLM.
"""
url = self._session.get_url('ai/ask')
if mode is None:
mode = ('single_item_qa' if len(items) == 1 else 'multiple_item_qa')
body = {
'items': items,
'prompt': prompt,
'mode': mode
}

box_response = self._session.post(url, data=json.dumps(body))
response = box_response.json()
return self.translator.translate(
session=self._session,
response_object=response,
)

@api_call
def send_ai_text_gen(
self,
dialogue_history: Iterable,
items: Iterable,
prompt: str,
) -> Any:
"""
Sends an AI request to supported LLMs and returns an answer specifically focused on the creation of new text.

:param dialogue_history:
The history of prompts and answers previously passed to the LLM.
This provides additional context to the LLM in generating the response.
:param items:
The items to be processed by the LLM, often files. The array must contain exactly one element.
:param prompt:
The prompt provided by the client to be answered by the LLM.
The prompt's length is limited to 10000 characters.
:returns:
A response including the generated text from the LLM.
"""
url = self._session.get_url('ai/text_gen')
body = {
'dialogue_history': dialogue_history,
'items': items,
'prompt': prompt
}

box_response = self._session.post(url, data=json.dumps(body))
response = box_response.json()
return self.translator.translate(
session=self._session,
response_object=response,
)
71 changes: 71 additions & 0 deletions docs/usage/ai.md
@@ -0,0 +1,71 @@
AI
==

The AI endpoints allow you to send an intelligence request to supported large language models and receive an answer based on the provided prompt and items.

<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->

- [Send AI request](#send-ai-request)
- [Send AI text generation request](#send-ai-text-generation-request)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->

Send AI request
------------------------

Calling the [`client.send_ai_question(items, prompt, mode)`][send-ai-question] method sends an AI request to the supported large language models. The `items` parameter is a list of items to be processed by the LLM, often files. The `prompt` is the question, provided by the client, to be answered by the LLM; its length is limited to 10000 characters. The `mode` specifies whether this request is for a single item or multiple items. If you select `single_item_qa`, the items array can contain only one element; selecting `multiple_item_qa` allows you to provide up to 25 items.



<!-- sample post_ai_ask -->
```python
items = [{
"id": "1582915952443",
"type": "file",
"content": "More information about public APIs"
}]
answer = client.send_ai_question(
items=items,
prompt="What is this file?",
mode="single_item_qa"
)
print(answer)
```
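
Because the SDK infers `mode` when it is not supplied (the implementation in this PR picks `single_item_qa` for a single item and `multiple_item_qa` otherwise), the argument can be omitted. A minimal sketch reusing the `items` list from the sample above:

```python
# `mode` is omitted here; send_ai_question picks it based on the number of items.
answer = client.send_ai_question(
    items=items,
    prompt="What is this file?"
)
print(answer)
```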

NOTE: The AI endpoint may return a 412 status code if your request uses a file that has just been uploaded to Box.
It usually takes a few seconds for the file to be indexed and become available to the AI endpoint.
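
One possible way to cope with that delay, sketched below rather than provided by the SDK, is a small retry helper. It assumes the 412 surfaces as a `boxsdk.exception.BoxAPIException` whose `status` attribute carries the HTTP status code:

```python
import time

from boxsdk.exception import BoxAPIException


def ask_with_retry(client, items, prompt, mode=None, attempts=3, delay=2):
    # Hypothetical helper: retry while a freshly uploaded file is still being indexed.
    for attempt in range(attempts):
        try:
            return client.send_ai_question(items=items, prompt=prompt, mode=mode)
        except BoxAPIException as exc:
            if exc.status != 412 or attempt == attempts - 1:
                raise
            time.sleep(delay)
```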

[send-ai-question]: https://box-python-sdk.readthedocs.io/en/latest/boxsdk.client.html#boxsdk.client.client.Client.send_ai_question

Send AI text generation request
------------------------

Calling the [`client.send_ai_text_gen(dialogue_history, items, prompt)`][send-ai-text-gen] method sends an AI text generation request to the supported large language models. The `dialogue_history` parameter is the history of prompts and answers previously passed to the LLM; it provides additional context to the LLM for generating the response. The `items` parameter is a list of items to be processed by the LLM, often files; the array must contain exactly one element. The `prompt` is the instruction, provided by the client, to be answered by the LLM; its length is limited to 10000 characters.

<!-- sample post_ai_text_gen -->
```python
items = [{
"id": "1582915952443",
"type": "file",
"content": "More information about public APIs"
}]
dialogue_history = [{
"prompt": "Make my email about public APIs sound more professional",
"answer": "Here is the first draft of your professional email about public APIs",
"created_at": "2013-12-12T10:53:43-08:00"
},
{
"prompt": "Can you add some more information?",
"answer": "Public API schemas provide necessary information to integrate with APIs...",
"created_at": "2013-12-12T11:20:43-08:00"
}]
answer = client.send_ai_text_gen(
dialogue_history=dialogue_history,
items=items,
prompt="Write an email to a client about the importance of public APIs."
)
print(answer)
```
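
Both methods return a translated API response whose fields, as the tests added in this PR show, can be read with dictionary-style access. A short sketch using the `answer` object from the sample above and the field names from the mocked response in the unit tests:

```python
# Field names as asserted in the tests added by this PR.
print(answer['answer'])             # the generated text
print(answer['completion_reason'])  # e.g. 'done'
print(answer['created_at'])         # timestamp of the response
```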

[send-ai-text-gen]: https://box-python-sdk.readthedocs.io/en/latest/boxsdk.client.html#boxsdk.client.client.Client.send_ai_text_gen
56 changes: 56 additions & 0 deletions test/integration_new/object/ai_itest.py
@@ -0,0 +1,56 @@
from datetime import datetime

import pytest

from test.integration_new import CLIENT
from test.integration_new.context_managers.box_test_folder import BoxTestFolder
from test.integration_new.context_managers.box_test_file import BoxTestFile

FOLDER_TESTS_DIRECTORY_NAME = 'folder-integration-tests'


@pytest.fixture(scope='module', autouse=True)
def parent_folder():
with BoxTestFolder(name=f'{FOLDER_TESTS_DIRECTORY_NAME} {datetime.now()}') as folder:
yield folder


def test_send_ai_question(parent_folder, small_file_path):
with BoxTestFile(parent_folder=parent_folder, file_path=small_file_path) as file:
items = [{
'id': file.id,
'type': 'file',
'content': 'The sun rises in the east.'
}]
answer = CLIENT.send_ai_question(
items=items,
prompt='Which direction does the sun rise?',
mode='single_item_qa'
)
assert 'east' in answer['answer'].lower()
assert answer['completion_reason'] == 'done'


def test_send_ai_text_gen(parent_folder, small_file_path):
with BoxTestFile(parent_folder=parent_folder, file_path=small_file_path) as file:
items = [{
'id': file.id,
'type': 'file',
'content': 'The sun rises in the east.'
}]
dialogue_history = [{
'prompt': 'How does the sun rise?',
'answer': 'The sun rises in the east.',
'created_at': '2013-12-12T10:53:43-08:00'
}, {
'prompt': 'How many hours does it take for the sun to rise?',
'answer': 'It takes 24 hours for the sun to rise.',
'created_at': '2013-12-12T11:20:43-08:00'
}]
answer = CLIENT.send_ai_text_gen(
dialogue_history=dialogue_history,
items=items,
prompt='Which direction does the sun rise?'
)
assert 'east' in answer['answer'].lower()
assert answer['completion_reason'] == 'done'
66 changes: 66 additions & 0 deletions test/unit/client/test_client.py
@@ -1766,6 +1766,16 @@ def mock_sign_template_response():
return mock_sign_template


@pytest.fixture(scope='module')
def mock_ai_question_response():
mock_ai_question_response = {
'answer': 'Public APIs are important because of key and important reasons.',
'completion_reason': 'done',
'created_at': '2021-04-26T08:12:13.982Z',
}
return mock_ai_question_response


def test_get_sign_requests(mock_client, mock_box_session, mock_sign_request_response):
expected_url = f'{API.BASE_API_URL}/sign_requests'

@@ -1906,3 +1916,59 @@ def test_get_sign_templates(mock_client, mock_box_session, mock_sign_template_re
assert isinstance(sign_template, SignTemplate)
assert sign_template.id == '93153068-5420-467b-b8ef-8e54bfb7be42'
assert sign_template.name == 'important-file.pdf'


def test_send_ai_question(mock_client, mock_box_session, mock_ai_question_response):
expected_url = f'{API.BASE_API_URL}/ai/ask'
mock_box_session.post.return_value.json.return_value = mock_ai_question_response

items = [{
'type': 'file',
'id': '12345'
}]
question = 'Why are public APIs important?'
mode = 'single_item_qa'

answer = mock_client.send_ai_question(items, question, mode)

mock_box_session.post.assert_called_once_with(expected_url, data=json.dumps({
'items': items,
'prompt': question,
'mode': mode
}))
assert answer['answer'] == 'Public APIs are important because of key and important reasons.'
assert answer['completion_reason'] == 'done'
assert answer['created_at'] == '2021-04-26T08:12:13.982Z'


def test_send_ai_text_gen(mock_client, mock_box_session, mock_ai_question_response):
expected_url = f'{API.BASE_API_URL}/ai/text_gen'
mock_box_session.post.return_value.json.return_value = mock_ai_question_response

items = [{
'type': 'file',
'id': '12345'
}]
dialogue_history = [{
"prompt": "Make my email about public APIs sound more professional",
"answer": "Here is the first draft of your professional email about public APIs",
"created_at": "2013-12-12T10:53:43-08:00"
}, {
"prompt": "Can you add some more information?",
"answer": "Public API schemas provide necessary information to integrate with APIs...",
"created_at": "2013-12-12T11:20:43-08:00"
}]
answer = mock_client.send_ai_text_gen(
dialogue_history=dialogue_history,
items=items,
prompt="Write an email to a client about the importance of public APIs."
)

mock_box_session.post.assert_called_once_with(expected_url, data=json.dumps({
'dialogue_history': dialogue_history,
'items': items,
'prompt': "Write an email to a client about the importance of public APIs."
}))
assert answer['answer'] == 'Public APIs are important because of key and important reasons.'
assert answer['completion_reason'] == 'done'
assert answer['created_at'] == '2021-04-26T08:12:13.982Z'