Classify chunks with streaming results
POST /knowledge-bases/{knowledgeBaseId}/classify/stream
Same as /classify but streams results as Server-Sent Events. Each event contains a single classified chunk.
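For orientation, here is a minimal sketch of starting the stream from TypeScript. Bearer-token authorization and the body field names used here (`query`, `labels`) are illustrative assumptions drawn from the parameter descriptions below, not confirmed names.

```ts
// Sketch only: start a streaming classification request.
// Bearer auth and the body field names ("query", "labels") are assumptions.
async function startClassifyStream(
  baseUrl: string,
  knowledgeBaseId: string,
  apiToken: string
): Promise<Response> {
  const response = await fetch(
    `${baseUrl}/knowledge-bases/${encodeURIComponent(knowledgeBaseId)}/classify/stream`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
        Accept: "text/event-stream", // request the SSE stream
      },
      body: JSON.stringify({
        query: "renewable energy policy", // fetch chunks via query (assumed field name)
        labels: ["supporting", "contradicting", "neutral"], // simple labels (assumed field name)
      }),
    }
  );
  if (!response.ok) {
    throw new Error(`classify/stream request failed: ${response.status}`);
  }
  return response;
}
```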
Authorizations
Parameters
Path Parameters
knowledgeBaseId: ID of the knowledge base
Request Body
object
- Chunks to classify directly (alternative to query): object
  - Unique identifier for the chunk
  - Text content of the chunk
- Query to fetch chunks (alternative to providing chunks)
- Options for fetching chunks via query: object
  - Maximum number of chunks to fetch via query
  - Embedding model to use for query (defaults to primary)
- Use a built-in classification preset
- Simple classification labels (e.g., ['supporting', 'contradicting', 'neutral'])
- Detailed label definitions with natural language guidance: object
- Custom JSON Schema for advanced classification (power users): object
- Classification behavior options: object
  - Additional context to help the LLM classify chunks accurately
  - Number of chunks to classify per LLM call (default: 5)
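Putting the fields above together, the request body can be modeled roughly as the following TypeScript interface. Every property name here is reconstructed from the descriptions and should be treated as an assumption; the API's generated OpenAPI schema is authoritative.

```ts
// Assumed shape of the classify/stream request body.
// All property names are reconstructions from the field descriptions above.
interface ClassifyStreamRequest {
  /** Chunks to classify directly (alternative to query). */
  chunks?: Array<{
    id: string;      // unique identifier for the chunk
    content: string; // text content of the chunk
  }>;
  /** Query to fetch chunks (alternative to providing chunks). */
  query?: string;
  /** Options for fetching chunks via query. */
  queryOptions?: {
    limit?: number;          // maximum number of chunks to fetch via query
    embeddingModel?: string; // embedding model to use (defaults to primary)
  };
  /** Use a built-in classification preset. */
  preset?: string;
  /** Simple classification labels, e.g. ['supporting', 'contradicting', 'neutral']. */
  labels?: string[];
  /** Detailed label definitions with natural language guidance (shape assumed). */
  labelDefinitions?: Record<string, string>;
  /** Custom JSON Schema for advanced classification (power users). */
  schema?: Record<string, unknown>;
  /** Classification behavior options. */
  options?: {
    context?: string;   // additional context to help the LLM classify accurately
    batchSize?: number; // chunks to classify per LLM call (default: 5)
  };
}
```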
Responses
Streaming classification results (SSE)
Server-Sent Events stream with chunk classification results (see the consumption sketch after the response list below).
Bad Request - Validation error or invalid input (object)
Unauthorized - Authentication required or invalid token (object)
Forbidden - Insufficient permissions (object)
Not Found - Resource does not exist
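Each SSE event in the success response carries a single classified chunk. Below is a sketch of consuming the stream returned by the request example near the top of this page, assuming events are separated by blank lines and each `data:` line holds a JSON payload; the payload shape itself is not documented in this reference.

```ts
// Sketch: read the SSE stream and log one classified chunk per event.
// Assumes "data:" lines contain JSON; the exact payload shape is unspecified here.
async function readClassifications(response: Response): Promise<void> {
  if (!response.body) throw new Error("Response has no body to stream");
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line.
    let boundary: number;
    while ((boundary = buffer.indexOf("\n\n")) !== -1) {
      const rawEvent = buffer.slice(0, boundary);
      buffer = buffer.slice(boundary + 2);
      for (const line of rawEvent.split("\n")) {
        if (line.startsWith("data:")) {
          const payload = JSON.parse(line.slice(5).trim());
          console.log("classified chunk:", payload); // one chunk per event
        }
      }
    }
  }
}
```

The browser `EventSource` API only supports GET requests, which is why this sketch uses fetch plus manual parsing for a POST endpoint.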