Test

This endpoint will test the configured model.

Request

Use the following request to test the configured model:

```
curl --location --request POST 'https://publicapi.xenioo.com/llm/test' \
--header 'Authorization: Bearer <BOT_AUTH_TOKEN>' \
--header 'Content-Type: application/json' \
--data-raw '{
    "Text": "<USER PROMPT>",
    "SplitParagraphs": <BOOLEAN>,
    "ThreadId": "<THREAD ID>"
}'
```

The "ThreadId" can be blank with the first message.

Response

If successful, the request returns the model reply in the following format:

```
{
    "ThreadId": "acd0c03a-f91b-4189-a5b6-39e95cd794d2",
    "Usage": {
        "PromptTokens": 1149,
        "CompletionTokens": 24,
        "Total": 1173
    },
    "Paragraphs": [
        "I am an AI assistant designed to help and provide information to users. How can I assist you today?"
    ],
    "Elapsed": 0.7142745,
    "Failed": false
}
```

Write the "ThreadId" that the reply will provide to you into your next call in order to continue with the conversation (making the LLM aware of the previous messages)

Response Codes

This endpoint replies with the following standard HTTP status codes.

| Code | Meaning |
| --- | --- |
| 200 | Ok. The request has been successfully fulfilled. |
| 400 | Bad Request. The supplied token is invalid or does not have enough permissions. |
| 404 | Not Found. The account specified by the token could not be found. |
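A client can branch on these status codes. Below is a minimal, illustrative sketch in Python, reusing the placeholder token variable from the earlier examples.

```
import os
import requests

response = requests.post(
    "https://publicapi.xenioo.com/llm/test",
    headers={
        "Authorization": f"Bearer {os.environ['XENIOO_BOT_TOKEN']}",  # placeholder name
        "Content-Type": "application/json",
    },
    json={"Text": "ping", "SplitParagraphs": False, "ThreadId": ""},
    timeout=30,
)

if response.status_code == 200:
    print(response.json()["Paragraphs"])
elif response.status_code == 400:
    print("Bad Request: check the token and its permissions.")
elif response.status_code == 404:
    print("Not Found: the account for this token could not be found.")
else:
    print(f"Unexpected status: {response.status_code}")
```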
