Test
This endpoint tests the configured model.
Request
Use the following request to test the LLM:
The "ThreadId" field can be left blank on the first message.
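A minimal sketch of what such a request might look like. The endpoint path, the `Message` field name, and the bearer-token header are assumptions for illustration and are not confirmed by this page; only `ThreadId` comes from the documentation above.

```
# Hypothetical request shape -- path and "Message" field are assumptions.
# "ThreadId" is left blank because this is the first message of a new conversation.
curl -X POST "https://api.example.com/llm/test" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{ "Message": "Hello, are you configured correctly?", "ThreadId": "" }'
```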
Response
If successful, the request returns the reply in the response with the following format:
Pass the "ThreadId" returned in the reply into your next call to continue the conversation (making the LLM aware of the previous messages).
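The continuation step described above can be sketched as follows. The response field names (`Reply`, `ThreadId`), the endpoint path, and the example identifier are assumptions for illustration, not the confirmed response schema.

```
# Hypothetical: a successful (200) reply is assumed to return a body such as
#   { "Reply": "<model output>", "ThreadId": "f3c1a2b4" }
# Pass the returned ThreadId back unchanged to continue the same conversation:
curl -X POST "https://api.example.com/llm/test" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{ "Message": "My follow-up question is...", "ThreadId": "f3c1a2b4" }'
```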
Response Codes
This endpoint replies with the following standard HTTP status codes.
| Code | Meaning |
| --- | --- |
| 200 | OK. The request has been successfully fulfilled. |
| 400 | Bad Request. The supplied token is invalid or does not have enough permissions. |
| 404 | Not Found. The account specified by the token could not be found. |