feat(sdk/cs): add Responses API client for OpenAI/OpenResponses compat #505
Pull request overview
Adds a new HTTP-based C# SDK v2 client for Foundry Local’s embedded web service that implements OpenAI/OpenResponses-compatible Responses API operations (including SSE streaming), and wires it into existing model/manager entry points.
Changes:
- Introduces `OpenAIResponsesClient` with CRUD + streaming over `HttpClient` and source-generated JSON serialization.
- Adds Responses API DTOs + custom JSON converters and a dedicated `ResponsesJsonContext` for AOT/trimming.
- Exposes the client via `FoundryLocalManager.GetResponsesClient(...)` and `IModel.GetResponsesClientAsync()` (delegated through `Model`/`ModelVariant`).
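A hypothetical usage sketch of the new surface. Only the member names mentioned in this PR (`GetResponsesClient`, `GetResponsesClientAsync`, the CRUD verbs, `OutputText`) come from the summary; the request type and its properties are assumptions, not the final API shape:

```csharp
// Sketch only: request type name and property names are assumptions.
using var client = manager.GetResponsesClient();              // factory on FoundryLocalManager
// or: using var client = await model.GetResponsesClientAsync();

var request = new ResponseRequest                             // hypothetical request DTO
{
    Input = "Write one sentence about local inference."
};

var response = await client.CreateAsync(request);
Console.WriteLine(response.OutputText);                       // convenience property, per the commit notes

await foreach (var evt in client.CreateStreamingAsync(request))
{
    // handle each SSE-backed streaming event as it arrives
}
```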
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 7 comments.
Summary per file:
| File | Description |
|---|---|
| sdk_v2/cs/src/OpenAI/ResponsesTypes.cs | Adds request/response DTOs, polymorphic items/content, and JSON converters for the Responses API. |
| sdk_v2/cs/src/OpenAI/ResponsesJsonContext.cs | Adds a dedicated source-generated JsonSerializerContext for Responses API types (AOT-friendly). |
| sdk_v2/cs/src/OpenAI/ResponsesClient.cs | Implements the HTTP/SSE Responses API client and settings container. |
| sdk_v2/cs/src/IModel.cs | Adds GetResponsesClientAsync() to the public model interface. |
| sdk_v2/cs/src/ModelVariant.cs | Implements GetResponsesClientAsync() with loaded-model + web service checks. |
| sdk_v2/cs/src/Model.cs | Delegates GetResponsesClientAsync() to the selected variant. |
| sdk_v2/cs/src/FoundryLocalManager.cs | Adds a GetResponsesClient(...) factory that requires the web service to be running. |
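Since `ResponsesClient.cs` implements streaming over raw HTTP/SSE, the core loop generally looks like the sketch below. This is a generic illustration, not the PR's actual code: the `v1/responses` path and the `[DONE]` terminator follow OpenAI conventions and are assumptions here.

```csharp
// Generic SSE-reading helper sketch; caller deserializes each payload
// into a streaming-event DTO (e.g. via the source-generated context).
static async IAsyncEnumerable<string> ReadSsePayloadsAsync(
    HttpClient http, string requestJson,
    [EnumeratorCancellation] CancellationToken ct = default)
{
    using var request = new HttpRequestMessage(HttpMethod.Post, "v1/responses")
    {
        Content = new StringContent(requestJson, Encoding.UTF8, "application/json")
    };
    // ResponseHeadersRead lets us consume the body incrementally as events arrive.
    using var response = await http.SendAsync(
        request, HttpCompletionOption.ResponseHeadersRead, ct).ConfigureAwait(false);
    response.EnsureSuccessStatusCode();

    using var stream = await response.Content.ReadAsStreamAsync(ct).ConfigureAwait(false);
    using var reader = new StreamReader(stream);
    while (await reader.ReadLineAsync(ct).ConfigureAwait(false) is { } line)
    {
        if (!line.StartsWith("data: ", StringComparison.Ordinal)) continue;
        var payload = line["data: ".Length..];
        if (payload == "[DONE]") yield break;   // conventional end-of-stream marker
        yield return payload;
    }
}
```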
…tibility

Add OpenAIResponsesClient to the C# SDK v2 with full CRUD support for the Responses API served by Foundry Local's embedded web service.

New files:
- src/OpenAI/ResponsesClient.cs: HTTP-based client with SSE streaming
- src/OpenAI/ResponsesTypes.cs: request/response DTOs, items, streaming events
- src/OpenAI/ResponsesJsonContext.cs: AOT-compatible source-generated JSON context

Modified files:
- src/IModel.cs: GetResponsesClientAsync() on the IModel interface
- src/ModelVariant.cs: implementation with web service URL validation
- src/Model.cs: delegation to SelectedVariant
- src/FoundryLocalManager.cs: GetResponsesClient() factory method

Key design decisions:
- HTTP-based (HttpClient + SSE), not FFI, since no CoreInterop command exists
- AOT-compatible: all serialization uses a source-generated JsonSerializerContext
- IDisposable: HttpClient properly disposed
- Follows existing patterns: Utils.CallWithExceptionHandling, ConfigureAwait(false)
- Factory on FoundryLocalManager + convenience method on IModel
- ResponseObject.OutputText convenience property (matches the OpenAI Python SDK)
- Full CRUD: Create, CreateStreaming, Get, Delete, Cancel, GetInputItems
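The AOT/trimming-safe serialization decision above rests on System.Text.Json source generation, whose general shape is sketched below. The member list and naming policy are illustrative assumptions, not the PR's exact `ResponsesJsonContext` declarations:

```csharp
using System.Text.Json.Serialization;

// Illustrative sketch of a source-generated context; the real
// ResponsesJsonContext in this PR may register different types/options.
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.SnakeCaseLower)]
[JsonSerializable(typeof(ResponseRequest))]   // hypothetical DTO names
[JsonSerializable(typeof(ResponseObject))]
internal sealed partial class ResponsesJsonContext : JsonSerializerContext
{
}

// Callers then serialize without reflection, keeping trimming/AOT safe:
// JsonSerializer.Serialize(request, ResponsesJsonContext.Default.ResponseRequest);
```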
0c3da98 to 1b00229
@copilot re-review post all the changes please.
…equest

Agent-Logs-Url: https://github.com/microsoft/Foundry-Local/sessions/3b54ccd8-7b43-410d-95da-8c7add815a6b
Co-authored-by: MaanavD <24942306+MaanavD@users.noreply.github.com>
Completed a full re-review of all changes. The code is solid; all previous review comments were properly addressed.
Found and fixed one minor inconsistency: `_logger` in `BuildRequest` / `FoundryLocalException` (consistency fix).

CodeQL scan: 0 alerts.

Warning: firewall rules blocked me from connecting to one or more addresses. If you need me to access, download, or install something from one of these locations, you can either: