Qore OpenAiDataProvider Module Reference 1.5

Introduction to the OpenAiDataProvider Module

The OpenAiDataProvider module provides a data provider API for OpenAI services.

Examples

Responses with Tool Calls and Structured Output

%requires OpenAiDataProvider

OpenAiRestConnection rc = get_connection("openai");
OpenAiDataProvider::OpenAiDataProvider prov(rc.get());
AbstractDataProvider resp = prov.getChildProvider("responses").getChildProvider("create");
hash<auto> result = resp.doRequest({
    "model": "gpt-4o-mini",
    "input": (
        {
            "input_item": {
                "role": "user",
                "content": (
                    {
                        "type": "input_text",
                        "text": "Return the sum of 2 and 3",
                    },
                ),
            },
        },
    ),
    "tools": (
        {
            "function": {
                "name": "sum",
                "description": "Returns a+b",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "integer"},
                        "b": {"type": "integer"},
                    },
                    "required": ("a", "b"),
                },
            },
        },
    ),
    "text": {
        "format": {
            "type": "json_schema",
            "json_schema": {
                "name": "SumAnswer",
                "schema": {
                    "type": "object",
                    "properties": {
                        "answer": {"type": "integer"},
                    },
                    "required": ("answer",),
                },
                "strict": True,
            },
        },
    },
});

# Tool call handling and tool result submission:
if (result.tool_calls) {
    hash<auto> call = result.tool_calls[0];
    # ...call your tool with call.name and call.arguments...
    hash<auto> tool_result = {
        "type": "tool_result",
        "tool_call_id": call.call_id,
        "output": "{\"answer\":5}",
    };
    # tool results can be sent as additional input items (role=tool)
    hash<auto> follow_up = resp.doRequest({
        "model": "gpt-4o-mini",
        "previous_response_id": result.id,
        "input": (
            {
                "input_item": {
                    "role": "tool",
                    "content": (tool_result,),
                },
            },
        ),
    });
}

# Convenience helpers for downstream use:
list<hash<auto>> tool_calls = result.tool_calls;
list<hash<auto>> tool_results = result.tool_results;
string text = result.output_text;

Conversation Context Store

%requires OpenAiDataProvider

class MyContextStore inherits OpenAiConversationContextStore {
    *hash<auto> loadContext(string key) { return get_from_store(key); }
    nothing saveContext(string key, hash<auto> ctxt) { save_to_store(key, ctxt); }
}

OpenAiRestConnection rc = get_connection("openai");
OpenAiDataProvider::OpenAiDataProvider prov(rc.get());
prov.setConversationContextStore(new MyContextStore());
AbstractDataProvider resp = prov.getChildProvider("responses").getChildProvider("create");
hash<auto> result = resp.doRequest({
    "model": "gpt-4o-mini",
    "input": (
        {
            "input_item": {
                "role": "user",
                "content": (
                    {
                        "type": "input_text",
                        "text": "Continue our previous conversation",
                    },
                ),
            },
        },
    ),
    "conversation_key": "user-42",
});

Conversation Context Store Callbacks

%requires OpenAiDataProvider

OpenAiConversationContextStoreCallbacks store(
    *hash<auto> sub (string key) { return get_from_store(key); },
    nothing sub (string key, hash<auto> ctxt) { save_to_store(key, ctxt); }
);
OpenAiRestConnection rc = get_connection("openai");
OpenAiDataProvider::OpenAiDataProvider prov(rc.get());
prov.setConversationContextStore(store);

Streaming Responses

%requires OpenAiDataProvider

OpenAiRestConnection rc = get_connection("openai");
OpenAiDataProvider::OpenAiDataProvider prov(rc.get());
AbstractDataProvider stream = prov.getChildProvider("responses").getChildProvider("stream");
stream.registerObserver(myObserver);
stream.observersReady();
stream.doRequest({
    "model": "gpt-4o-mini",
    "input": (
        {
            "input_item": {
                "role": "user",
                "content": (
                    {
                        "type": "input_text",
                        "text": "Stream a response",
                    },
                ),
            },
        },
    ),
});

For Creator WebSocket integrations, the action-based OpenAiResponseActionSession class provides a transport-agnostic interface for response creation and streaming with normalized rich content output.

Stream Event Payloads

The stream provider emits SSE events with a `type` field matching the Responses API event types. A few common examples:

  • `response.output_text.delta`
  • `response.tool_call.delta`
  • `response.completed`

The emitted event hash also includes `event` and `id` when provided by the server. The `responses/stream` provider will persist conversation context on `response.completed` when a `conversation_key` is used.
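As a sketch of consuming these events, an observer can dispatch on the `type` field. The `DeltaPrinter` class below is illustrative and assumes the standard DataProvider `Observer` API (an `update()` method receiving the event ID and data hash); the `delta` field on text delta events is an assumption based on the Responses API event shapes:

```
%new-style
%requires DataProvider

# illustrative observer that prints streamed text as it arrives; assumes
# update(string event_id, hash<auto> data_) from the DataProvider Observer API
class DeltaPrinter inherits DataProvider::Observer {
    update(string event_id, hash<auto> data_) {
        switch (data_.type) {
            case "response.output_text.delta":
                # incremental text chunk (field name assumed from the Responses API)
                printf("%s", data_.delta);
                break;
            case "response.completed":
                # final event; terminate the output line
                print("\n");
                break;
        }
    }
}

# register before making the streaming request:
# stream.registerObserver(new DeltaPrinter());
```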

Release Notes

OpenAiDataProvider v1.5

  • added OpenAiResponseActionSession for action-based response creation and streaming
  • added typed response output items for tool calls, tool results, and output content parts
  • added response stream data provider for Server-Sent Events streaming
  • response stream requests now accept a model argument for responses API compatibility
  • action session text normalization now tolerates missing output format hints
  • added conversation context store hooks for response continuity
  • added action catalog entry for streaming responses
  • added responses API input content support for text, image, audio, and file parts
  • added conversation context helpers for response continuity
  • added missing APIs and actions to handle image analysis (issue 4947)
  • added APIs to handle image generation

OpenAiDataProvider v1.0

  • initial release of the module

Migration Notes

  • Response output items are now typed; downstream code should prefer `output_text`, `tool_calls`, and `tool_results` instead of parsing raw hashes.
  • Response streaming is exposed as the `responses/stream` data provider and emits SSE events with a `type` field matching the Responses API event types.
  • Conversation persistence can be configured by supplying a `conversation_context_store` and using `conversation_key` in requests.
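As a migration sketch, code that previously walked raw output hashes can switch to the typed convenience fields shown in the examples above (`req` stands for any request hash accepted by the `responses/create` provider):

```
hash<auto> result = resp.doRequest(req);

# typed convenience fields instead of parsing raw output items
string text = result.output_text;
foreach hash<auto> tc in (result.tool_calls) {
    printf("tool call: %s(%s)\n", tc.name, tc.arguments);
}
```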