Currently I notice that for every query to OpenAI you send all the predefined contexts along with the query text to get the results. I think the OpenAI API is stateless, which is why you need to send the entire context every time.

In that case, how would this solution scale as the system evolves, or in other cases (I have some in mind that I am currently exploring) where there might be thousands (if not more) of complex predefined contexts? Will it have to send all of them on every request to get the desired results? Are there plans to keep some state in the cloud, perhaps with a session ID, cookie, or some other mechanism, so that you don't have to send the entire context every time?
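To make the concern concrete, here is a minimal sketch of the stateless request pattern described above. The context strings, model name, and helper function are hypothetical, but it illustrates how every request must re-send the predefined contexts plus the growing conversation history:

```python
# Hypothetical predefined contexts; in the real system there could be
# thousands of these, and all of them ride along with every request.
PREDEFINED_CONTEXTS = [
    {"role": "system", "content": "Predefined context rule 1 ..."},
    {"role": "system", "content": "Predefined context rule 2 ..."},
]

def build_payload(history, query):
    """Build the body of a chat-completions-style request.

    Because the API keeps no server-side state, the full predefined
    context and the entire running history are included every time.
    """
    return {
        "model": "gpt-3.5-turbo",  # placeholder model name
        "messages": PREDEFINED_CONTEXTS
        + history
        + [{"role": "user", "content": query}],
    }

# Each successive query carries strictly more messages (and tokens)
# than the previous one, since nothing is remembered in the cloud.
history = []
first = build_payload(history, "first question")
history += [first["messages"][-1],
            {"role": "assistant", "content": "first answer"}]
second = build_payload(history, "second question")
assert len(second["messages"]) > len(first["messages"])
```

This is exactly the cost the question is about: with a stateful session (a session ID or server-held context), only the new user message would need to cross the wire on each call.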
However, a current limitation of this feature restricts the capability to the current interactive chat session. We want to be able to do this (perhaps as a separate model, deployment, or endpoint) so that it can be queried externally via APIs.