SF-30 add OAuth and OpenAPI shared packages (#83)
Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. This behavior is configurable in the CodeRabbit settings.
📝 Walkthrough

Adds a new http-openapi package and a suite of OAuth packages (oauth-core, oauth-flow, oauth-http, oauth-providers, oauth-router, oauth-storage, oauth-testing), including implementations, adapters (memory/SQL/DB), token ciphering, a token HTTP client, provider policies, route factories, extensive tests, TypeScript configs, packaging, docs, and release manifest updates.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor User
    participant Browser as Client
    participant Router as OAuth Router
    participant Flow as OAuth Flow
    participant Core as OAuth Core
    participant HTTP as OAuth HTTP
    participant Storage as OAuth Storage
    participant Provider as OAuth Provider
    User->>Browser: initiate login
    Browser->>Router: POST /start (provider)
    Router->>Flow: start(providerId, subject, requestId)
    Flow->>Core: createAuthorizationRequest(...)
    Core->>Core: generate state/nonce/PKCE
    Core->>Storage: put state record
    Core-->>Flow: authorizationUrl + stateId
    Flow-->>Router: return authorizationUrl/stateId
    Browser->>Provider: redirect to authorizationUrl
    Provider-->>Browser: redirect callback (code,state)
    Browser->>Router: GET /callback?code=...&state=...
    Router->>Flow: handleCallback(providerId, code, state, redirectUri)
    Flow->>Core: handleCallback(...)
    Core->>Storage: consume state
    Core->>HTTP: exchangeCodeForToken(...)
    HTTP->>Provider: POST /token (code exchange)
    Provider-->>HTTP: token response
    HTTP-->>Core: OAuthTokenSet
    Core-->>Flow: tokenSet + subject
    Flow->>Storage: persist token (possibly encrypted)
    Flow-->>Router: respond (redirect or JSON)
```

```mermaid
sequenceDiagram
    participant Client
    participant Router
    participant Flow
    participant HTTP
    participant Storage
    participant Provider
    Client->>Router: POST /refresh (connectionId)
    Router->>Flow: refresh(connectionId, requestId)
    Flow->>Storage: get token by connection
    Flow->>HTTP: exchange token (refresh_token)
    HTTP->>Provider: POST /token (refresh)
    Provider-->>HTTP: new access token
    HTTP-->>Flow: updated tokenSet
    Flow->>Storage: put refreshed token
    Flow-->>Router: return refreshed tokens
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
Sorry @thyaravind, your pull request is larger than the review limit of 150000 diff characters
Review Summary by Qodo: Add OAuth 2.0 and OpenAPI shared packages with complete flow orchestration and provider support
Walkthrough

Description:
• Add comprehensive OAuth 2.0 foundation packages with complete flow orchestration, state management, and token lifecycle handling
• Implement oauth-core package with provider descriptors, PKCE generation, and state validation for authorization flows
• Implement oauth-http package with token client supporting multiple authentication methods, retry logic, and error normalization
• Implement oauth-storage package with state/token persistence contracts and adapters (memory, SQL, generic database)
• Implement oauth-flow package orchestrating complete OAuth lifecycle with token encryption, refresh, and disconnect operations
• Implement oauth-router package providing browser-auth and connection-based HTTP route factories
• Implement oauth-providers package with curated provider descriptors (Google, Apple, GitHub, Microsoft, Yahoo) and policy registry
• Implement oauth-testing package with in-memory stores, fixtures, and secret redaction utilities
• Add http-openapi shared package for generating OpenAPI 3.1.0 specifications from route metadata
• Include comprehensive test coverage for all OAuth layers with lifecycle, error handling, and integration scenarios
• Provide detailed package documentation with quick-start examples and production deployment guidance

Diagram:

```mermaid
flowchart LR
    A["oauth-core<br/>Provider Descriptors<br/>PKCE & State"]
    B["oauth-http<br/>Token Client<br/>Retry & Auth"]
    C["oauth-storage<br/>State & Token<br/>Persistence"]
    D["oauth-flow<br/>Flow Orchestration<br/>Token Lifecycle"]
    E["oauth-router<br/>HTTP Routes<br/>Browser & Connection"]
    F["oauth-providers<br/>Provider Registry<br/>Policies"]
    G["oauth-testing<br/>Test Utilities<br/>Fixtures"]
    H["http-openapi<br/>OpenAPI Generation<br/>Route Metadata"]
    A --> B
    A --> C
    A --> F
    B --> D
    C --> D
    F --> D
    D --> E
    C --> G
    A --> G
    E --> H
```
File Changes:
1. packages/oauth-flow/src/service.ts
Code Review by Qodo
Warning Review the following alerts detected in dependencies. According to your organization's Security Policy, it is recommended to resolve "Warn" alerts. Learn more about Socket for GitHub.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: abfe149981
```ts
providerId: context.params.provider,
code: asRequiredString(resolvedInput.code ?? url.searchParams.get("code"), "code"),
state: asRequiredString(resolvedInput.state ?? url.searchParams.get("state"), "state"),
redirectUri: asRequiredString(resolvedInput.redirectUri ?? url.searchParams.get("redirectUri"), "redirectUri"),
```
Derive callback redirect URI instead of requiring query param
The callback handler currently requires redirectUri from resolveCallbackInput or ?redirectUri=..., but standard OAuth callbacks only send code and state. With the default route config (as shown in the package quick start), this path throws before flowService.handleCallback runs, so normal provider callbacks fail (and become 500s because a plain Error is thrown). This breaks out-of-the-box callback handling unless every consumer adds custom callback input resolution.
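A minimal sketch of the suggested fix, assuming the router knows its own configured callback path; the helper name `resolveRedirectUri` and the `configuredCallbackPath` parameter are illustrative, not the package's actual API:

```typescript
// Hypothetical helper: when neither the input resolver nor the query string
// supplies a redirectUri, derive the one the provider actually used
// (request origin + configured callback path) instead of throwing.
function resolveRedirectUri(
  explicit: string | undefined,
  callbackUrl: URL,
  configuredCallbackPath: string
): string {
  return (
    explicit ??
    callbackUrl.searchParams.get("redirectUri") ??
    new URL(configuredCallbackPath, callbackUrl.origin).toString()
  );
}
```

With this fallback, a standard provider callback carrying only `code` and `state` no longer fails out of the box.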
```ts
await this.deleteTokenByConnection(input.connectionId);
localTokenDeleted = tokenRecord !== null;
```
Validate provider matches token record before disconnect delete
disconnect() looks up the token only by connectionId and then always deletes it, but never verifies that tokenRecord.record.providerId matches input.providerId. If a stale or incorrect provider is passed (for example via :provider path mismatch), the service can delete a different provider’s token and skip/revoke against the wrong upstream provider. This should fail fast on provider mismatch before local deletion.
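The guard could look like the following sketch; the record shape and plain `Error` are assumptions drawn from the comment above, not the actual service code:

```typescript
// Hypothetical shapes mirroring the review's description.
interface StoredTokenRecord {
  record: { providerId: string; connectionId: string };
}

// Fail fast when the token found by connectionId belongs to a different
// provider than the one named in the request, before any delete or revoke.
function assertProviderMatches(tokenRecord: StoredTokenRecord | null, providerId: string): void {
  if (tokenRecord && tokenRecord.record.providerId !== providerId) {
    throw new Error(
      `provider mismatch: token belongs to "${tokenRecord.record.providerId}", request named "${providerId}"`
    );
  }
}
```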
```ts
} catch {
  return null;
```
Re-throw adapter failures when consuming OAuth state
The consume() implementation swallows every adapter.update() error and returns null, which makes infrastructure or query failures look like “state not found/expired.” Upstream this is mapped to user-facing OAuth state errors instead of surfacing a storage failure, so real DB outages become misleading 4xx auth failures and lose retry/debug signals. Only “not found” should be converted to null; unexpected adapter errors should be re-thrown.
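A sketch of the narrower error handling, with a hypothetical `StateNotFoundError` standing in for whatever "not found" signal the real adapter uses:

```typescript
// Hypothetical error class; the real adapter may signal "not found" differently.
class StateNotFoundError extends Error {}

interface StateAdapter {
  update(stateId: string, consumedAt: string): Promise<unknown>;
}

// Only "not found/expired" becomes null; infrastructure failures propagate,
// so a DB outage surfaces as a server error rather than a misleading
// user-facing OAuth state error.
async function consumeState(adapter: StateAdapter, stateId: string, consumedAt: string) {
  try {
    return await adapter.update(stateId, consumedAt);
  } catch (error) {
    if (error instanceof StateNotFoundError) {
      return null;
    }
    throw error;
  }
}
```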
15 issues found across 84 files
Note: This PR contains a large number of files. cubic only reviews up to 75 files per PR, so some files may not have been reviewed.
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/oauth-http/package.json">
<violation number="1" location="packages/oauth-http/package.json:41">
P1: Avoid `"*"` for runtime dependencies; pin to a compatible range (or workspace protocol) to prevent unbounded breaking upgrades.</violation>
</file>
<file name="packages/oauth-flow/scripts/run-vitest.mjs">
<violation number="1" location="packages/oauth-flow/scripts/run-vitest.mjs:15">
P1: Do not treat `null` child exit status as success; signal-terminated test runs currently exit with code 0.</violation>
</file>
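A sketch of the fix for the test-runner wrapper: map a `null` exit status (signal-terminated child) to a failing exit code rather than success. The helper is illustrative; the real script spawns vitest via `child_process`.

```typescript
// Node's child_process reports a null exit status when the child was killed
// by a signal; treating that as 0 would report a crashed test run as passing.
function toExitCode(status: number | null): number {
  return status ?? 1;
}
```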
<file name="packages/oauth-providers/src/providers/yahoo.ts">
<violation number="1" location="packages/oauth-providers/src/providers/yahoo.ts:47">
P1: `mail.read.fullbody` is allowed in operation policies, but Yahoo has no matching `featureScopes` entry. Scope validation for that feature will throw `feature scope policy not found`.</violation>
</file>
<file name="packages/oauth-providers/src/policy-registry.ts">
<violation number="1" location="packages/oauth-providers/src/policy-registry.ts:184">
P1: `restrictedScopes` is never enforced. Google and Microsoft providers configure restricted scopes (`Mail.ReadWrite`, `https://mail.google.com/`), but `validateScopes` only checks `allowedScopes`. Requesting a restricted scope will silently pass validation as long as it appears in `allowedScopes`.</violation>
</file>
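A sketch of enforcing both lists; the policy shape mirrors the comment's field names and the opt-in flag is an assumption, not necessarily the package's exact type:

```typescript
interface ScopePolicy {
  allowedScopes: ReadonlyArray<string>;
  restrictedScopes?: ReadonlyArray<string>;
}

// Reject scopes outside the allowlist, and also reject restricted scopes
// unless the caller explicitly opts in (e.g. after extra consent review).
function validateScopes(
  policy: ScopePolicy,
  requested: ReadonlyArray<string>,
  allowRestricted = false
): void {
  for (const scope of requested) {
    if (!policy.allowedScopes.includes(scope)) {
      throw new Error(`scope not allowed: ${scope}`);
    }
    if (!allowRestricted && policy.restrictedScopes?.includes(scope)) {
      throw new Error(`scope is restricted: ${scope}`);
    }
  }
}
```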
<file name="packages/oauth-http/src/errors.ts">
<violation number="1" location="packages/oauth-http/src/errors.ts:26">
P2: The custom error constructor drops the original error cause, so thrown OAuthHttpError instances lose causal stack/context.</violation>
</file>
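One way to keep the causal chain, sketched against a simplified version of the class (the real `OAuthHttpError` carries more fields):

```typescript
// Store the originating error on `cause` so stack traces and logs retain
// the underlying fetch/parse failure that triggered this error.
class OAuthHttpError extends Error {
  readonly cause?: unknown;
  constructor(message: string, options?: { cause?: unknown }) {
    super(message);
    this.name = "OAuthHttpError";
    this.cause = options?.cause;
  }
}
```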
<file name="packages/oauth-flow/src/service.ts">
<violation number="1" location="packages/oauth-flow/src/service.ts:327">
P2: `pendingConnectionByState` is never pruned for abandoned OAuth flows, causing a slow memory leak. Every call to `start()` with a connection subject adds an entry, but only a completed `handleCallback()` removes it. Consider adding a TTL-based eviction (e.g., entries older than the state TTL) or removing this in-memory cache in favor of reading the connectionId solely from the persisted state record.</violation>
</file>
<file name="packages/oauth-core/src/service.ts">
<violation number="1" location="packages/oauth-core/src/service.ts:302">
P2: The legacy-record fallback can produce a `browser-auth` subject with an empty `intentId`. When the branch is entered because `regionId`, `returnTo`, or `metadata` is truthy—but `intentId` is absent—`intentId` defaults to `""`, which the storage layer rejects as invalid. Consider tightening the guard to require `state.intentId` before entering the browser-auth path.</violation>
</file>
<file name="packages/oauth-storage/src/adapters/memory.ts">
<violation number="1" location="packages/oauth-storage/src/adapters/memory.ts:78">
P1: Reusing a tokenId with a different connectionId leaves a stale connection index entry, so `getByConnection` can return the wrong token.</violation>
</file>
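An illustrative memory-index fragment showing the fix: on `put()`, drop the old connection-index entry when a tokenId is reused with a different connectionId. Field and class names are assumptions based on the comment, not the adapter's real code.

```typescript
class MemoryTokenIndex {
  private byId = new Map<string, { tokenId: string; connectionId: string }>();
  private byConnection = new Map<string, string>(); // connectionId -> tokenId

  put(record: { tokenId: string; connectionId: string }): void {
    const previous = this.byId.get(record.tokenId);
    if (previous && previous.connectionId !== record.connectionId) {
      // Remove the stale index entry so getByConnection cannot return
      // this token under its old connection.
      if (this.byConnection.get(previous.connectionId) === record.tokenId) {
        this.byConnection.delete(previous.connectionId);
      }
    }
    this.byId.set(record.tokenId, record);
    this.byConnection.set(record.connectionId, record.tokenId);
  }

  getByConnection(connectionId: string) {
    const tokenId = this.byConnection.get(connectionId);
    return tokenId ? this.byId.get(tokenId) : undefined;
  }
}
```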
<file name="packages/oauth-storage/src/token-vault.ts">
<violation number="1" location="packages/oauth-storage/src/token-vault.ts:239">
P2: `Date.parse()` accepts many non-ISO date formats (e.g. `"March 4, 2025"`), so this does not actually enforce ISO-8601 as the function name and error message claim. Use a regex or stricter check to validate the format:
```ts
const ISO_RE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;
```</violation>
</file>
<file name="packages/oauth-router/src/browser-routes.ts">
<violation number="1" location="packages/oauth-router/src/browser-routes.ts:116">
P2: Six private helper functions (`resolveRouteMeta`, `mergeAuthMeta`, `mergeOpenApiMeta`, `joinPath`, `asRequiredString`, `parseDisconnectBody`) are duplicated verbatim between `browser-routes.ts` and `connection-routes.ts`. Extract them into a shared internal module (e.g., `route-utils.ts`) within this package to avoid divergence when one copy is updated but not the other.</violation>
</file>
<file name="packages/oauth-storage/src/adapters/db.ts">
<violation number="1" location="packages/oauth-storage/src/adapters/db.ts:157">
P1: Silent `catch { return }` in `rotateKey` masks all DB errors, making key rotation appear successful when it may have failed. For a security-critical operation this can lead to data loss—if the caller decommissions the old key believing rotation succeeded, the token becomes permanently un-decryptable. Propagate or wrap the error instead.</violation>
<violation number="2" location="packages/oauth-storage/src/adapters/db.ts:207">
P1: `rotateKey` updates only `key_ref` without re-encrypting `encrypted_payload`. After this call, the stored payload is still encrypted with the old key but `key_ref` points to the new key, making the token un-decryptable on the next `get()`. `EncryptedTokenVault` never calls this method (it uses `put`), so when called directly on `DbAdapterTokenVault` it silently corrupts data. Consider either implementing the full decrypt-reencrypt-store cycle or documenting/restricting this method as internal-only.</violation>
</file>
<file name="packages/oauth-storage/src/adapters/sql.ts">
<violation number="1" location="packages/oauth-storage/src/adapters/sql.ts:670">
P2: `null` from the DB is not converted to `undefined`—the `as` cast hides the mismatch at compile time but consumers will see `null` at runtime. Every other nullable field in this file uses `?? undefined`; this one was missed.</violation>
</file>
<file name="packages/oauth-core/src/state.ts">
<violation number="1" location="packages/oauth-core/src/state.ts:39">
P1: Do not fail open when the redirect allowlist is empty; reject this configuration instead of allowing any redirect URI.</violation>
</file>
<file name="packages/oauth-http/src/token-client.ts">
<violation number="1" location="packages/oauth-http/src/token-client.ts:341">
P1: Basic auth credentials must be percent-encoded before base64 encoding per RFC 6749 §2.3.1. If a `clientId` or `clientSecret` contains reserved characters (e.g. `:`, `+`, `%`), the resulting Authorization header will be malformed and some providers will reject the request or misparse the credentials.</violation>
</file>
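A sketch per RFC 6749 §2.3.1: both credentials are form-urlencoded before being joined and base64-encoded. `encodeURIComponent` is a close, commonly used approximation of that encoding, and `Buffer` makes this Node-specific; the helper name is illustrative.

```typescript
function basicAuthHeader(clientId: string, clientSecret: string): string {
  // Percent-encode first so reserved characters (":", "+", "%") in either
  // credential cannot corrupt the "id:secret" split on the provider side.
  const pair = `${encodeURIComponent(clientId)}:${encodeURIComponent(clientSecret)}`;
  return `Basic ${Buffer.from(pair, "utf8").toString("base64")}`;
}
```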
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```ts
private readonly config: OAuthFlowServiceConfig;
private readonly coreService: DefaultOAuthService;
private readonly tokenHttpClient: OAuthTokenHttpClient;
private readonly pendingConnectionByState = new Map<string, string>();
```
P2: pendingConnectionByState is never pruned for abandoned OAuth flows, causing a slow memory leak. Every call to start() with a connection subject adds an entry, but only a completed handleCallback() removes it. Consider adding a TTL-based eviction (e.g., entries older than the state TTL) or removing this in-memory cache in favor of reading the connectionId solely from the persisted state record.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/oauth-flow/src/service.ts, line 327:
<comment>`pendingConnectionByState` is never pruned for abandoned OAuth flows, causing a slow memory leak. Every call to `start()` with a connection subject adds an entry, but only a completed `handleCallback()` removes it. Consider adding a TTL-based eviction (e.g., entries older than the state TTL) or removing this in-memory cache in favor of reading the connectionId solely from the persisted state record.</comment>
<file context>
@@ -0,0 +1,1313 @@
+ private readonly config: OAuthFlowServiceConfig;
+ private readonly coreService: DefaultOAuthService;
+ private readonly tokenHttpClient: OAuthTokenHttpClient;
+ private readonly pendingConnectionByState = new Map<string, string>();
+ private readonly now: () => Date;
+
</file context>
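A sketch of the TTL-based eviction option: a small wrapper map that prunes expired entries on access. The class name, pruning cadence, and injected clock are assumptions, not the service's real configuration.

```typescript
class PendingConnectionMap {
  private readonly entries = new Map<string, { connectionId: string; expiresAt: number }>();

  constructor(private readonly ttlMs: number, private readonly now: () => number = Date.now) {}

  set(stateId: string, connectionId: string): void {
    this.prune();
    this.entries.set(stateId, { connectionId, expiresAt: this.now() + this.ttlMs });
  }

  take(stateId: string): string | undefined {
    this.prune();
    const entry = this.entries.get(stateId);
    this.entries.delete(stateId); // one-shot, mirroring state consumption
    return entry?.connectionId;
  }

  private prune(): void {
    const cutoff = this.now();
    for (const [key, entry] of this.entries) {
      if (entry.expiresAt <= cutoff) this.entries.delete(key);
    }
  }
}
```

Aligning `ttlMs` with the persisted state TTL means abandoned flows are evicted on the same schedule as their state records.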
```ts
// @@ -0,0 +1,225 @@
import type { AuthRouteMeta, HttpRouteMeta, OpenApiRouteMeta, Route, RouteContext } from "@superfunctions/http";
```
P2: Six private helper functions (resolveRouteMeta, mergeAuthMeta, mergeOpenApiMeta, joinPath, asRequiredString, parseDisconnectBody) are duplicated verbatim between browser-routes.ts and connection-routes.ts. Extract them into a shared internal module (e.g., route-utils.ts) within this package to avoid divergence when one copy is updated but not the other.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/oauth-router/src/browser-routes.ts, line 116:
<comment>Six private helper functions (`resolveRouteMeta`, `mergeAuthMeta`, `mergeOpenApiMeta`, `joinPath`, `asRequiredString`, `parseDisconnectBody`) are duplicated verbatim between `browser-routes.ts` and `connection-routes.ts`. Extract them into a shared internal module (e.g., `route-utils.ts`) within this package to avoid divergence when one copy is updated but not the other.</comment>
<file context>
@@ -0,0 +1,225 @@
+ ];
+}
+
+function resolveRouteMeta(defaultMeta: HttpRouteMeta, overrideMeta: HttpRouteMeta | undefined): HttpRouteMeta {
+ if (!overrideMeta) {
+ return defaultMeta;
</file context>
Greptile Summary

This PR adds seven new shared packages forming a layered OAuth 2.0 stack.

Key findings:
Confidence Score: 3/5

Safe to merge for PostgreSQL/SQLite deployments only; the empty redirect-URI allowlist bypass is a meaningful security risk that should be resolved before production use. The redirect URI bypass in assertRedirectUriAllowed is a genuine security issue that can lead to open redirects and authorization-code interception if a provider's runtime config omits its allowlist. The RETURNING-clause dependency is a hard runtime failure for MySQL/MariaDB consumers. Both issues are in entirely new packages with no existing callers on this branch, which limits the immediate blast radius, but they should be corrected before the packages are consumed by any service.

Affected files: packages/oauth-core/src/state.ts (redirect URI bypass), packages/oauth-storage/src/adapters/sql.ts (RETURNING clause + table-name interpolation)
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant OAuthRouter
    participant OAuthFlowService
    participant OAuthCoreService
    participant OAuthTokenHttpClient
    participant OAuthStateStore
    participant TokenVault
    Client->>OAuthRouter: GET /oauth/:provider/start
    OAuthRouter->>OAuthFlowService: start(input)
    OAuthFlowService->>OAuthCoreService: createAuthorizationRequest(input)
    OAuthCoreService->>OAuthStateStore: put(stateRecord)
    OAuthCoreService-->>OAuthFlowService: AuthorizationResult {authorizationUrl, stateId}
    OAuthFlowService-->>OAuthRouter: OAuthFlowStartResult
    OAuthRouter-->>Client: 302 Redirect to provider
    Client->>OAuthRouter: GET /oauth/:provider/callback?code=&state=
    OAuthRouter->>OAuthFlowService: callback(input)
    OAuthFlowService->>OAuthCoreService: handleCallback(input)
    OAuthCoreService->>OAuthStateStore: consume(stateId, consumedAt)
    OAuthStateStore-->>OAuthCoreService: OAuthStateRecord
    OAuthCoreService->>OAuthTokenHttpClient: exchangeToken(request)
    OAuthTokenHttpClient-->>OAuthCoreService: OAuthTokenSet
    OAuthCoreService-->>OAuthFlowService: OAuthTokenSet
    OAuthFlowService->>TokenVault: putTokenSet(record)
    OAuthFlowService-->>OAuthRouter: OAuthFlowCallbackResult
    OAuthRouter-->>Client: 302 Redirect or JSON response
    Client->>OAuthRouter: POST /oauth/:provider/disconnect
    OAuthRouter->>OAuthFlowService: disconnect(input)
    OAuthFlowService->>OAuthTokenHttpClient: revokeToken(request)
    OAuthFlowService->>TokenVault: deleteByConnection(connectionId)
    OAuthFlowService-->>OAuthRouter: OAuthFlowDisconnectResult
    OAuthRouter-->>Client: 200 OK
```
```ts
    throw new OAuthCoreError("OAUTH_REDIRECT_DISALLOWED", "redirect URI not allowed");
  }
}

export function assertCallbackStateMatches(input: CallbackStateMatchInput, state: OAuthStateRecord): void {
  if (input.providerId !== state.providerId) {
```
Empty allowlist silently permits all redirect URIs
When allowlistedRedirectUris is an empty array the function returns immediately, accepting any redirect URI the caller supplies. A provider whose runtime config omits or accidentally clears allowlistedRedirectUris will pass validation for every URI — including attacker-controlled ones — enabling an open-redirect and authorization-code interception attack.
The safer default is to treat an empty allowlist as deny-all, or require callers to explicitly pass a wildcard sentinel to opt into unrestricted mode:
```ts
export function assertRedirectUriAllowed(redirectUri: string, allowlistedRedirectUris: ReadonlyArray<string>): void {
  if (!allowlistedRedirectUris.includes(redirectUri)) {
    throw new OAuthCoreError("OAUTH_REDIRECT_DISALLOWED", "redirect URI not allowed");
  }
}
```
```ts
        consumed_at = excluded.consumed_at
      `,
      [
        normalized.stateId,
        normalized.providerId,
        normalized.subject!.kind,
        getOAuthSubjectKey(normalized.subject!),
        JSON.stringify(normalized.subject),
        normalized.redirectUri,
        JSON.stringify(normalized.requestedScopes),
        normalized.codeVerifier ?? null,
        normalized.nonce ?? null,
        normalized.createdAt,
        normalized.expiresAt,
        normalized.consumedAt ?? null
      ]
    );
  }
```
RETURNING clause is not supported on MySQL / MariaDB
The consume method issues UPDATE … RETURNING …, which is only available in PostgreSQL (≥ 8.2) and SQLite (≥ 3.35). MySQL and MariaDB do not support RETURNING on UPDATE statements, so any deployment running against those databases will encounter a SQL parse error at runtime when the first OAuth callback is processed.
If the SqlClient abstraction is meant to be database-agnostic, the atomic consume pattern needs a database-compatible fallback — typically a BEGIN / UPDATE / SELECT / COMMIT transaction, or a feature-detection path keyed to the underlying engine. At minimum, the class-level JSDoc or README should clearly state that only PostgreSQL and SQLite are supported.
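A portable sketch of the atomic consume without `RETURNING`: an UPDATE guarded by `consumed_at IS NULL`, followed by a SELECT, inside one transaction. The `SqlTx` interface shown here is an assumption about the `SqlClient` abstraction, not its real surface.

```typescript
interface SqlTx {
  execute(sql: string, params: unknown[]): Promise<{ rowCount: number }>;
  query(sql: string, params: unknown[]): Promise<Array<Record<string, unknown>>>;
}

// rowCount 0 means the state is missing or already consumed -> null;
// otherwise read back the row the UPDATE just claimed.
async function consumePortable(tx: SqlTx, table: string, stateId: string, consumedAt: string) {
  const updated = await tx.execute(
    `UPDATE ${table} SET consumed_at = ? WHERE state_id = ? AND consumed_at IS NULL`,
    [consumedAt, stateId]
  );
  if (updated.rowCount === 0) return null;
  const rows = await tx.query(`SELECT * FROM ${table} WHERE state_id = ?`, [stateId]);
  return rows[0] ?? null;
}
```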
```ts
  }

  async put(record: OAuthStateRecord): Promise<void> {
    validateOAuthStateRecord(record);
    const normalized = applySubjectToStateRecord(record);
    await this.client.execute(
      `
        INSERT INTO ${this.tableName} (
          state_id,
          provider_id,
          subject_kind,
          subject_key,
          subject_payload,
          redirect_uri,
          requested_scopes,
          code_verifier,
          nonce,
```
Table names interpolated directly into SQL strings without validation
this.tableName (and the other table name fields) are concatenated into raw SQL throughout this file without any sanitization, for example:

```ts
`SELECT ${STATE_COLUMNS} FROM ${this.tableName} WHERE state_id = ? LIMIT 1`
```

Although the values currently come from the constructor (not user input), if a table name is ever read from an environment variable or config file and passed through, it becomes an SQL injection vector. Adding an identifier-validation guard at construction time eliminates the risk entirely:

```ts
if (!/^[a-zA-Z_][a-zA-Z0-9_]*$/.test(tableName)) {
  throw new Error(`Invalid SQL table name: ${tableName}`);
}
```

The same pattern applies to SqlTokenVault, SqlOAuthConsentStore, and SqlOAuthRevocationFailureStore.
```ts
          revokeMessage: revokeError.message
        }
      });
    }

    return {
      remoteRevokeAttempted: input.revokeSupported,
      localTokenDeleted,
      connectionDeleted
    };
  }

async function resolveRequestCredentials(
  input: Pick<OAuthTokenEndpointRequest | OAuthRevocationRequest, "clientId" | "clientSecret" | "clientSecretResolver" | "provider">,
  context: OAuthSecretResolverContext
): Promise<ResolvedRequestCredentials> {
  if (!input.clientId) {
    throw invalidRuntimeConfigError("OAuth runtime config missing clientId", {
```
revokeMessage in error details may surface sensitive provider content
disconnectWithRevoke includes the raw provider error message in the thrown error's details:
```ts
details: {
  ...
  revokeMessage: revokeError.message
}
```

Error messages from an OAuth provider's revocation endpoint could contain token values or other credential material (particularly from non-RFC-compliant providers). This `details` object propagates to callers who may log it or serialize it into an API response.
Consider either omitting revokeMessage from the details payload, or running it through the same sanitizeDetails / sanitizeValue scrubbing that oauth-flow/service.ts already applies to event payloads.
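A sketch of scrubbing provider error text before it reaches error details; the token-shaped patterns below are heuristics, not the package's actual `sanitizeDetails` implementation:

```typescript
// Redact anything that looks like a credential in a provider error message
// before attaching it to a throwable/loggable details payload.
function redactSecrets(message: string): string {
  return message
    .replace(/(access_token|refresh_token)=([^&\s"]+)/gi, "$1=[REDACTED]")
    .replace(/Bearer\s+[A-Za-z0-9._-]+/g, "Bearer [REDACTED]");
}
```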
Prompt To Fix With AI
This is a comment left during a code review.
Path: packages/oauth-http/src/token-client.ts
Line: 197-214
Comment:
**`revokeMessage` in error details may surface sensitive provider content**
`disconnectWithRevoke` includes the raw provider error message in the thrown error's `details`:
```ts
details: {
...
revokeMessage: revokeError.message
}
```
Error messages from an OAuth provider's revocation endpoint could contain token values or other credential material (particularly from non-RFC-compliant providers). This `details` object propagates to callers who may log it or serialize it into an API response.
Consider either omitting `revokeMessage` from the details payload, or running it through the same `sanitizeDetails` / `sanitizeValue` scrubbing that `oauth-flow/service.ts` already applies to event payloads.
How can I resolve this? If you propose a fix, please make it concise.
1 issue found across 10 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/oauth-core/src/state.ts">
<violation number="1" location="packages/oauth-core/src/state.ts:76">
P2: Storage read failures are being misclassified as client invalid-state errors. This maps backend issues to a 400/non-retryable OAuth error instead of a server error.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
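A sketch of the suggested split, with illustrative error classes standing in for the actual oauth-core error types:

```typescript
// Illustrative error types; the real oauth-core errors differ.
class OAuthServerError extends Error {
  readonly retryable = true;
}
class OAuthInvalidStateError extends Error {
  readonly retryable = false;
}

interface StateStoreLike {
  get(stateId: string): Promise<Record<string, unknown> | null>;
}

// Wrap the storage read so backend failures surface as server errors,
// while a missing/consumed state remains a client-side invalid_state.
async function loadState(
  store: StateStoreLike,
  stateId: string
): Promise<Record<string, unknown>> {
  let record: Record<string, unknown> | null;
  try {
    record = await store.get(stateId);
  } catch (cause) {
    throw new OAuthServerError(`state store read failed: ${String(cause)}`);
  }
  if (record === null) {
    throw new OAuthInvalidStateError("unknown or already-consumed state");
  }
  return record;
}
```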
1 issue found across 10 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/oauth-router/src/browser-routes.ts">
<violation number="1" location="packages/oauth-router/src/browser-routes.ts:79">
P2: `redirectUri` is no longer validated as non-empty, so empty-string inputs now bypass required-field checks and can cause downstream callback/token-exchange failures.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
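A minimal guard for the missing check; the helper name is hypothetical:

```typescript
// Hypothetical helper: reject both missing and empty-string values so an
// empty redirectUri fails fast instead of surfacing later in the
// callback/token exchange.
function asRequiredNonEmptyString(value: unknown, field: string): string {
  if (typeof value !== "string" || value.trim().length === 0) {
    throw new Error(`${field} is required and must be a non-empty string`);
  }
  return value;
}

// e.g. const redirectUri = asRequiredNonEmptyString(body.redirectUri, "redirectUri");
```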
Actionable comments posted: 18
Note
Due to the large number of review comments, Critical and Major severity comments were prioritized as inline comments.
🟡 Minor comments (4)
packages/oauth-testing/src/__tests__/docs-inventory.test.ts-27-31 (1)
27-31: ⚠️ Potential issue | 🟡 Minor

Guard against `undefined` pages array before assertion.

If `nav.pages` is `undefined` (as the type annotation indicates is possible), `expect(nav.pages).toContain("oauth-router")` will fail with an unclear error message rather than explicitly indicating the missing `pages` property.

Proposed fix

```diff
 const nav = JSON.parse(
   readFileSync(resolve(repoRoot, "docs/content/docs/authentication/meta.json"), "utf8"),
 ) as { pages?: string[] };
-expect(nav.pages).toContain("oauth-router");
+expect(nav.pages).toBeDefined();
+expect(nav.pages).toContain("oauth-router");
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-testing/src/__tests__/docs-inventory.test.ts` around lines 27 - 31, The test reads and parses meta.json into nav (const nav) but then calls expect(nav.pages).toContain(...) without handling nav.pages possibly being undefined; update the test in __tests__/docs-inventory.test.ts to first assert that nav.pages is defined (e.g., expect(nav.pages).toBeDefined() or expect(Array.isArray(nav.pages)).toBe(true)) or coerce to an array before calling toContain, so the assertion on "oauth-router" only runs when nav.pages is a real array and yields a clear failure if pages is missing.

packages/oauth-router/README.md-47-48 (1)
47-48: ⚠️ Potential issue | 🟡 Minor

Fix broken documentation links.

The referenced documentation files do not exist in the repository:

- `oauth-router.mdx` in `docs/content/docs/authentication/`
- `oauth-flow-architecture.mdx` in `docs/content/docs/architecture/`

Either create these documentation files or update/remove the links if documentation is not yet available.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-router/README.md` around lines 47 - 48, The README links to non-existent docs files (oauth-router.mdx and oauth-flow-architecture.mdx); update or remove those links in packages/oauth-router/README.md by either creating the missing files at docs/content/docs/authentication/oauth-router.mdx and docs/content/docs/architecture/oauth-flow-architecture.mdx or point the links to the correct existing doc locations or to a placeholder (e.g., TODO or a top-level docs index); ensure the two referenced filenames (oauth-router.mdx and oauth-flow-architecture.mdx) are corrected so the Markdown links resolve.

packages/oauth-storage/src/state-store.ts-144-157 (1)
144-157: ⚠️ Potential issue | 🟡 Minor

Redundant ternary on line 149.

Both branches of `subject.kind === "connection" ? subject.tenantId : subject.tenantId` return `subject.tenantId`. This appears to be a copy-paste artifact, or intended to be `undefined` for browser-auth subjects without `tenantId`.

🐛 Suggested fix if browser-auth tenantId should be optional

```diff
 tenantId: subject.kind === "connection" ? subject.tenantId : subject.tenantId,
+// If browser-auth subjects may not have tenantId, consider:
+// tenantId: subject.tenantId,
```

If `browser-auth` subjects always have `tenantId`, simplify to just `tenantId: subject.tenantId`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/state-store.ts` around lines 144 - 157, In applySubjectToStateRecord, the tenantId uses a redundant ternary that always returns subject.tenantId; update the tenantId assignment to reflect intent by making it conditional for connection subjects (tenantId: subject.kind === "connection" ? subject.tenantId : undefined) or, if browser-auth always includes tenantId, simply use tenantId: subject.tenantId; locate this change inside applySubjectToStateRecord (alongside resolveOAuthStoredSubject, cloneOAuthStoredSubject, and cloneUnknownRecord) and remove the duplicated ternary so tenantId behavior is explicit.

packages/oauth-flow/src/service.ts-366-371 (1)
366-371: ⚠️ Potential issue | 🟡 Minor

Treat `expiresIn = 0` as a real expiry.

Both branches use truthy checks, so a provider response of `0` seconds falls back to `undefined` or the previous expiry instead of now.

Suggested fix

```diff
-expiresAt: response.expiresIn
+expiresAt: response.expiresIn !== undefined
   ? new Date(this.now().getTime() + response.expiresIn * 1000).toISOString()
   : undefined,
 ...
-expiresAt: response.expiresIn
+expiresAt: response.expiresIn !== undefined
   ? new Date(this.now().getTime() + response.expiresIn * 1000).toISOString()
   : existing.expiresAt,
```

Also applies to: 746-748
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-flow/src/service.ts` around lines 366 - 371, The code treats response.expiresIn with a truthy check so an expiresIn of 0 is ignored; change the ternary that sets expiresAt to explicitly check for null/undefined (e.g., response.expiresIn !== undefined && response.expiresIn !== null or Number.isFinite(response.expiresIn)) and compute expiresAt as new Date(this.now().getTime() + response.expiresIn * 1000).toISOString() when that check passes; apply the same fix to the other identical branch that uses response.expiresIn so 0 seconds becomes a real expiry.
🧹 Nitpick comments (20)
packages/http-openapi/src/generate.ts (1)
154-172: Consider adding null check in `sortValue` for completeness.

The `sortObject` function is called recursively via `sortValue`. While the current check `value && typeof value === "object"` handles `null` (since `null && ...` is falsy), this relies on short-circuit evaluation. The logic is correct but slightly implicit.

🔧 Optional: Make null handling explicit

```diff
 function sortValue(value: unknown): unknown {
   if (Array.isArray(value)) {
     return value.map((item) => sortValue(item));
   }
-  if (value && typeof value === "object") {
+  if (value !== null && typeof value === "object") {
     return sortObject(value as Record<string, unknown>);
   }
   return value;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/http-openapi/src/generate.ts` around lines 154 - 172, sortValue currently relies on short-circuiting to avoid treating null as an object; make the null check explicit by updating the conditional that decides to call sortObject so it only recurses when value !== null && typeof value === "object" (keep the existing Array.isArray branch intact) to make the intent clearer and avoid any ambiguity in sortValue and sortObject.

packages/http-openapi/src/__tests__/generate.test.ts (1)
5-184: Consider adding edge case tests for broader coverage.

The current tests cover the main scenarios well. For increased robustness, consider adding tests for:
- Multiple routers combined
- Empty routers array
- Duplicate `operationId` values (current behavior: silently accepts)

Would you like me to generate additional test cases for these edge scenarios?
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/http-openapi/src/__tests__/generate.test.ts` around lines 5 - 184, Add three new tests in the same suite using createRouter and generateOpenApiDocument: (1) "merges multiple routers" — create two routers with different routes (e.g., POST /a and GET /b) and assert the produced document contains both paths merged; (2) "handles empty routers array" — call generateOpenApiDocument with routers: [] and assert paths is an empty object; (3) "allows duplicate operationId values" — create two routes (in same or different routers) that share the same operationId and assert generation does not throw and both paths exist in the document (verifying current silent-accept behavior). Reference functions/types: createRouter, generateOpenApiDocument, and OpenApiGenerationError for pattern of error assertions used elsewhere.

packages/http-openapi/package.json (1)
40-42: Use workspace protocol for explicit local resolution.

The monorepo uses npm workspaces, which supports the `workspace:*` protocol for internal dependencies. Updating `"@superfunctions/http": "*"` to `"@superfunctions/http": "workspace:*"` provides explicit local resolution and prevents unexpected behavior if the package is published separately.

This is a monorepo-wide pattern: other packages like `oauth-router` and `oauth-core` use the same `"*"` convention. Adopting `workspace:*` consistently across all packages would improve clarity.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/http-openapi/package.json` around lines 40 - 42, Update the dependency entry for "@superfunctions/http" in the package.json to use the workspace protocol by changing the version from "*" to "workspace:*" (i.e., update the "dependencies" key where "@superfunctions/http" is declared); also scan other package.json files that currently use "*" for internal packages (e.g., oauth-router, oauth-core) and replace them with "workspace:*" to ensure explicit local resolution across the monorepo.

packages/oauth-storage/README.md (1)
24-30: Consider adding a comment clarifying this is a placeholder key.

The `Buffer.alloc(32, 7)` is clearly example-only, but developers might copy-paste this snippet. A brief inline comment would prevent accidental misuse.

📝 Suggested clarification

```diff
 const cipher = new AesGcmTokenCipher((keyRef) => {
   if (keyRef !== "oauth-default") {
     throw new Error("unknown keyRef");
   }
-  return Buffer.alloc(32, 7);
+  // ⚠️ Example only – use a secure 256-bit key from your secrets manager
+  return Buffer.alloc(32, 7);
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/README.md` around lines 24 - 30, Add a short inline comment above the Buffer.alloc(32, 7) in the AesGcmTokenCipher key provider to state that this is a placeholder/example key and must not be used in production; mention that real usage should fetch a secure 32-byte key (e.g., from KMS/secret store) and that keyRef handling should be replaced with the appropriate key lookup logic for real deployments.

packages/oauth-storage/tests/shared-schema.test.ts (1)
22-29: Consider caching the joined statements.

The `statements.join("\n")` is called four times. While acceptable in tests, extracting to a variable improves readability.

Optional refactor

```diff
 it("renders deterministic SQL schema statements for all shared OAuth tables", () => {
   const statements = getOAuthStorageSchema();
+  const schema = statements.join("\n");

-  expect(statements.join("\n")).toContain("CREATE TABLE IF NOT EXISTS oauth_states");
-  expect(statements.join("\n")).toContain("CREATE TABLE IF NOT EXISTS oauth_tokens");
-  expect(statements.join("\n")).toContain("CREATE TABLE IF NOT EXISTS oauth_consents");
-  expect(statements.join("\n")).toContain("CREATE TABLE IF NOT EXISTS oauth_revocation_failures");
+  expect(schema).toContain("CREATE TABLE IF NOT EXISTS oauth_states");
+  expect(schema).toContain("CREATE TABLE IF NOT EXISTS oauth_tokens");
+  expect(schema).toContain("CREATE TABLE IF NOT EXISTS oauth_consents");
+  expect(schema).toContain("CREATE TABLE IF NOT EXISTS oauth_revocation_failures");
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/tests/shared-schema.test.ts` around lines 22 - 29, The test repeatedly calls statements.join("\n") which is redundant; after calling getOAuthStorageSchema() (the statements variable), compute a single joined string (e.g., const sql = statements.join("\n")) and use that cached sql variable in each expect call in the "renders deterministic SQL schema statements for all shared OAuth tables" test so the join happens once and improves readability.

packages/oauth-providers/package.json (1)
40-42: Use `workspace:*` for internal dependency references.

This package currently uses `"*"` for `@superfunctions/oauth-core`, matching the pattern across other packages in the monorepo (oauth-router, oauth-flow, oauth-core, oauth-storage). However, per established best practices for this monorepo, internal workspace dependencies should use the `workspace:*` protocol instead. This ensures proper version resolution in monorepo environments and clearer intent for workspace-local references.

Consider updating all internal package dependencies in the monorepo to use `workspace:*` rather than `"*"`.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/oauth-providers/package.json` around lines 40 - 42, The package.json dependency for the internal package "@superfunctions/oauth-core" currently uses "*"; update the dependency declaration in packages/oauth-providers/package.json to use the workspace protocol by replacing "*" with "workspace:*" (and similarly audit other internal packages in this monorepo for the same change) so internal workspace resolution is enforced; locate the dependency entry for "@superfunctions/oauth-core" in the package.json and make that single-value substitution.

packages/oauth-router/tsconfig.json (1)
2-24: Consider extracting a shared base tsconfig to avoid config drift.

Lines 2-24 match other new package tsconfigs almost exactly. A root `tsconfig.base.json` + per-package `extends` would reduce maintenance churn and keep compiler behavior consistent.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-router/tsconfig.json` around lines 2 - 24, Extract the duplicated compilerOptions into a shared tsconfig.base.json and have this package's tsconfig.json extend it; create a tsconfig.base.json that contains the common "compilerOptions" block (target, module, lib, moduleResolution, resolveJsonModule, allowJs, checkJs, outDir, rootDir, declaration, declarationMap, sourceMap, strict, noUnusedLocals, noUnusedParameters, noImplicitReturns, noFallthroughCasesInSwitch, esModuleInterop, skipLibCheck, forceConsistentCasingInFileNames, isolatedModules) and then update packages/oauth-router/tsconfig.json to replace that block with an "extends": "./tsconfig.base.json" (or a repo-level base path) and only keep package-specific overrides if needed.

packages/oauth-router/README.md (1)
21-24: Clarify the `as never` cast in the quick start example.

The `as never` type assertion in the example code is unconventional and may confuse readers. Consider using a more descriptive type or adding a comment explaining this is a placeholder for the actual input type.

📝 Suggested improvement

```diff
 async resolveStartInput(request) {
-  return (await request.json()) as never;
+  // Replace with your actual StartBrowserAuthInput type
+  return (await request.json()) as StartBrowserAuthInput;
 },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-router/README.md` around lines 21 - 24, The example uses an unconventional type assertion "as never" in the async resolveStartInput(request) example which is confusing; update the snippet to use a descriptive placeholder type (e.g., StartInput) or add an inline comment explaining that "never" is a temporary placeholder and should be replaced with the real input type, and change the assertion in resolveStartInput to that descriptive type so readers know where to supply their actual request body shape.

packages/oauth-testing/src/index.ts (1)
12-26: Mock descriptor is missing `revocationUrl` property.

The mock provider descriptor omits `revocationUrl`, which exists in the real provider descriptors (e.g., Google, GitHub). Tests exercising revocation flows may need to provide this via overrides. Consider adding a default mock revocation URL for completeness.

🔧 Optional: Add default revocationUrl

```diff
 export function createMockOAuthProviderDescriptor(
   overrides: Partial<OAuthProviderDescriptor> = {}
 ): OAuthProviderDescriptor {
   return {
     id: "mock",
     authorizationUrl: "https://mock.example/auth",
     tokenUrl: "https://mock.example/token",
+    revocationUrl: "https://mock.example/revoke",
     defaultScopes: ["read"],
     supportsPkce: true,
     supportsRefreshToken: true,
     scopeSeparator: " ",
     tokenAuthMethod: "client_secret_post",
     ...overrides
   };
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-testing/src/index.ts` around lines 12 - 26, The mock provider descriptor returned by createMockOAuthProviderDescriptor is missing the revocationUrl property; update the function to include a sensible default revocationUrl (e.g., "https://mock.example/revoke") in the returned OAuthProviderDescriptor object while keeping the ...overrides spread so callers can override or omit it as needed; ensure the property name exactly matches revocationUrl used in real descriptors.

packages/oauth-providers/src/providers/google.ts (1)
45-48: Clarify the dual listing of `https://mail.google.com/` scope.

The `mail.read.fullbody` feature lists the same scope in both `allowedScopes` and `restrictedScopes`. This appears intentional to mark it as a sensitive/restricted scope that requires additional consent or audit, but a brief inline comment would clarify this design decision for future maintainers.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-providers/src/providers/google.ts` around lines 45 - 48, Update the "mail.read.fullbody" entry to include a short inline comment explaining why "https://mail.google.com/" is listed in both allowedScopes and restrictedScopes (to indicate it's both required for functionality and considered a sensitive/restricted scope that needs elevated consent/audit); locate the "mail.read.fullbody" object and add the clarifying comment adjacent to the allowedScopes/restrictedScopes arrays so future maintainers understand the intentional dual-listing.

packages/oauth-storage/tests/db-adapter.test.ts (1)
366-375: Consider adding defensive error handling in test helper.
`JSON.parse` at line 371 can throw if the stored value is a malformed JSON string. While this is test code, wrapping it in a try-catch would provide clearer test failures when debugging unexpected adapter behavior.

🔧 Optional: Add defensive parsing

```diff
 function parseStoredJsonFields(record: Record<string, unknown>): Record<string, unknown> {
   const nextRecord = { ...record };
   for (const key of ["subject_payload", "requested_scopes", "scope_set", "metadata"]) {
     const value = nextRecord[key];
     if (typeof value === "string") {
-      nextRecord[key] = JSON.parse(value);
+      try {
+        nextRecord[key] = JSON.parse(value);
+      } catch {
+        // Leave as-is if not valid JSON - let the store validation catch it
+      }
     }
   }
   return nextRecord;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/tests/db-adapter.test.ts` around lines 366 - 375, The helper function parseStoredJsonFields should defensively handle malformed JSON: wrap each JSON.parse call (for keys "subject_payload", "requested_scopes", "scope_set", "metadata") in a try-catch inside parseStoredJsonFields, and on parse failure throw a new Error (or rethrow) that includes the key name and the original value to make test failures clearer and easier to debug.

packages/oauth-core/tests/oauth-core.test.ts (1)
128-145: Entropy and uniqueness test could be more rigorous.

The test generates 32 state IDs and verifies uniqueness, which is reasonable. However, testing that `length >= 20` (line 141) is a weak entropy check. The actual state format from `pkce.ts` should produce ~48+ character IDs (prefix + 43 base64url chars from 32 bytes).

🔧 Optional: Tighten entropy length assertion

```diff
   stateIds.add(result.stateId);
-  expect(result.stateId.length).toBeGreaterThanOrEqual(20);
+  // st_ prefix (3) + base64url of 32 bytes (43) = 46+ chars
+  expect(result.stateId.length).toBeGreaterThanOrEqual(40);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-core/tests/oauth-core.test.ts` around lines 128 - 145, The entropy assertion in the "generates unique high-entropy state values across requests" test is too weak; update the assertion for result.stateId (produced by createAuthorizationRequest and pkce.ts) to enforce the expected higher length (e.g., >= 48) and optionally validate the expected prefix and base64url character set for stronger confidence in entropy/format rather than just length >= 20. Ensure you reference result.stateId in the loop and adjust the expect(...) call accordingly.

packages/oauth-testing/src/browser-auth-fixtures.ts (1)
78-100: `redactSecrets()` misses secrets embedded in generic string fields.

Only key names are checked here. A token or secret inside `message`, `error`, or another free-form field will survive unchanged, which makes failure snapshots easy to leak.
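One possible extension is to complement key-based redaction with value scanning; the pattern set below is an assumption about what leaked token material looks like, not the regex the fixtures actually use:

```typescript
// Sketch: scan string values themselves so secrets inside free-form
// fields (message, error, ...) are scrubbed, not just known key names.
const SECRET_VALUE_PATTERN =
  /(bearer\s+\S+|ey[A-Za-z0-9_-]{10,}\.[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+)/gi;

function redactDeep(value: unknown): unknown {
  if (typeof value === "string") {
    return value.replace(SECRET_VALUE_PATTERN, "[REDACTED]");
  }
  if (Array.isArray(value)) {
    return value.map(redactDeep);
  }
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [k, redactDeep(v)])
    );
  }
  return value;
}
```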
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-testing/src/browser-auth-fixtures.ts` around lines 78 - 100, redactSecrets currently only redacts based on key names (shouldRedactKey) so secret tokens embedded in free-form strings (e.g., message, error) are missed; update redactValue to also test primitive string values against the same secret regex and replace matching strings with REDACTED (e.g., if typeof value === "string" and /(token|secret|authorization|bearer|password)/i.test(value) then return REDACTED) while preserving existing array/object recursion and key-based redaction in redactValue and keeping shouldRedactKey as-is for key checks.

packages/oauth-http/src/token-client.ts (2)
498-526: Consider adding a request timeout to the fetch wrapper.

The global fetch wrapper doesn't configure a timeout. Long-running or hanging requests to provider endpoints could block indefinitely. Consider adding `AbortController` with a configurable timeout.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-http/src/token-client.ts` around lines 498 - 526, The getGlobalFetch wrapper currently uses globalThis.fetch without timeout; update getGlobalFetch to create an AbortController and race a configurable timeout against the fetch call (e.g., accept an optional timeoutMs parameter or read a module-level DEFAULT_TIMEOUT), attach controller.signal to the fetch call inside the returned async function, and ensure the controller is aborted when the timeout elapses so the promise rejects; also propagate or translate AbortError into an OAuthHttpError with code like "TIMEOUT" and retryable=false so callers of getGlobalFetch (and functions that call it) can handle timeouts uniformly.
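A sketch of a timeout-aware wrapper using only the standard AbortController API; mapping the abort into an `OAuthHttpError` with a retryable timeout code is left out and would follow the package's existing error conventions:

```typescript
// Wrap fetch with an AbortController-based timeout so a hanging provider
// endpoint cannot block the token client indefinitely.
type FetchInit = NonNullable<Parameters<typeof fetch>[1]>;

async function fetchWithTimeout(
  url: string,
  init: FetchInit = {},
  timeoutMs = 10_000
): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    // When the timer fires, controller.abort() rejects the in-flight
    // fetch with an AbortError.
    return await fetch(url, { ...init, signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}
```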
350-373: Robust response parsing with sensible fallbacks.

The content-type heuristics with JSON-first fallback handle various provider behaviors well. However, `parseFormBody` on line 364 will also be called for `text/plain` responses; this works because `URLSearchParams` gracefully handles non-form text (returning a single key-value), but consider whether this is intentional.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-http/src/token-client.ts` around lines 350 - 373, The current parseResponseBody treats "text/plain" same as urlencoded and hands it to parseFormBody, which can accidentally coerce arbitrary text into a single form entry; update parseResponseBody so that only "application/x-www-form-urlencoded" triggers parseFormBody, while "text/plain" should first try safeParseJson and otherwise return the rawBodyText (or another explicit raw-string fallback) instead of parseFormBody; adjust the content-type checks in parseResponseBody and use safeParseJson, parseFormBody, and rawBodyText appropriately to implement this behavior.

packages/oauth-storage/src/state-store.ts (1)
245-247: Shallow clone for metadata may not suffice for deeply nested objects.
`cloneUnknownRecord` uses spread which is shallow. If metadata contains nested objects, mutations could affect the stored copy. Consider using `structuredClone` or a deep clone utility if nested metadata is expected.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/state-store.ts` around lines 245 - 247, cloneUnknownRecord currently performs a shallow clone using object spread which will not protect stored metadata from mutations when the value contains nested objects; replace the shallow clone with a deep-clone operation (e.g., use structuredClone(value) when available or a reliable deep-clone utility such as lodash.clonedeep or JSON.parse(JSON.stringify(value)) as a fallback) while preserving the undefined handling so cloneUnknownRecord(value: Record<string, unknown> | undefined) returns undefined for undefined input and a deep-cloned object otherwise.

packages/oauth-providers/src/index.ts (1)
69-75: Consider explicit re-export list for better tree-shaking.

Using `export *` re-exports everything from each module. If some exports are internal implementation details, consider explicit named exports to reduce public API surface and improve tree-shaking.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-providers/src/index.ts` around lines 69 - 75, Replace the broad "export *" barrel in packages/oauth-providers/src/index.ts with explicit named exports for each module to reduce public surface and improve tree-shaking: inspect each module ("policy-registry.js", "stores.js", "providers/google.js", "providers/microsoft.js", "providers/yahoo.js", "providers/apple.js", "providers/github.js") and export only the intended public symbols (functions, classes, types, constants) by name (e.g., export { PublicFunc, PublicType } from "./providers/google.js") rather than using export *, ensuring internal helpers are not re-exported and the package's public API is explicit and minimal.

packages/oauth-router/src/browser-routes.ts (1)
216-230: Add validation for unexpected fields in disconnect body.
`parseDisconnectBody` casts the JSON body without validating for unexpected fields. While not a security issue (extra fields are ignored), explicit validation or strict typing could prevent subtle bugs from typos like `revokRemote` instead of `revokeRemote`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-router/src/browser-routes.ts` around lines 216 - 230, parseDisconnectBody currently casts request.json() without rejecting unexpected or misspelled fields; update parseDisconnectBody to validate the incoming object keys against the allowed set {connectionId, revokeRemote, tokenTypeHint} and throw a clear error if extra keys are present, and also explicitly validate types for revokeRemote (boolean | undefined) and tokenTypeHint (only "access_token" | "refresh_token" | undefined) before returning the Omit<OAuthFlowDisconnectInput,...> result; keep using asRequiredString(body.connectionId, "connectionId") for connectionId and refer to parseDisconnectBody, OAuthFlowDisconnectInput, and asRequiredString when implementing the checks.

packages/oauth-storage/src/adapters/memory.ts (1)
98-108: Redundant null check after confirmed non-null record.

On line 107, `record` is guaranteed non-null (the early return on lines 99-102 handles the null case), making the ternary redundant.

♻️ Suggested simplification

```diff
   if (record.connectionId !== connectionId) {
     this.tokenByConnection.delete(connectionId);
     return null;
   }
-  return record ? cloneTokenRecord(record) : null;
+  return cloneTokenRecord(record);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/adapters/memory.ts` around lines 98 - 108, The final ternary in the method that retrieves a token by connection (the block using this.tokensById.get(tokenId), this.tokenByConnection.delete(connectionId), and cloneTokenRecord) is redundant because record is already checked for null earlier; replace the ternary return `return record ? cloneTokenRecord(record) : null;` with a direct `return cloneTokenRecord(record);` (or an explicit non-null assertion/cast if TypeScript complains) to simplify the code while keeping the existing null-early returns and tokenByConnection cleanup.

packages/oauth-core/src/state.ts (1)
38-46: Exact string matching for redirect URI may cause issues with trailing slashes.
`allowlistedRedirectUris.includes(redirectUri)` performs exact string comparison. URLs like `https://app.example/callback` and `https://app.example/callback/` would be treated as different. Consider whether this strictness is intentional (security benefit) or could cause developer confusion.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-core/src/state.ts` around lines 38 - 46, The check in assertRedirectUriAllowed uses exact string matching (allowlistedRedirectUris.includes(redirectUri)) which treats URIs with/without trailing slashes as different; update the function to normalize comparisons instead of raw string equality: parse redirectUri and each entry in allowlistedRedirectUris (use the URL constructor), compare normalized components (e.g., origin + pathname after trimming a trailing slash) and consider whether to include search/hash, and then throw the same OAuthCoreError if no normalized match is found; alternatively, if strictness is desired, add a comment documenting the intentional exact-match behavior referencing assertRedirectUriAllowed, allowlistedRedirectUris and OAuthCoreError.
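If normalization were preferred over strict matching, a sketch might trim only a trailing slash before comparing origin and path; whether query and fragment should participate is a policy decision, and the OAuth security BCP generally favors exact matching, so loosening should be deliberate:

```typescript
// Compare redirect URIs on origin + pathname with the trailing slash
// trimmed, instead of exact string equality.
function normalizeRedirectUri(uri: string): string {
  const parsed = new URL(uri);
  const path = parsed.pathname.replace(/\/+$/, "") || "/";
  return `${parsed.origin}${path}`;
}

function isRedirectUriAllowed(uri: string, allowlist: string[]): boolean {
  const target = normalizeRedirectUri(uri);
  return allowlist.some((entry) => normalizeRedirectUri(entry) === target);
}
```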
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/oauth-flow/src/service.ts`:
- Around line 435-438: The preview read (stateStore.get and toFlowSubject) must
be moved inside the try/catch so storage or transformation failures are
normalized and routed through mapToOAuthFlowError() and the
oauth.flow.callback.failed telemetry path; keep requestId creation as-is
(requestId = input.requestId ?? randomUUID()), then inside the existing try
block perform const statePreview = await this.config.stateStore.get(input.state)
and const subjectPreview = statePreview ? toFlowSubject(statePreview) :
undefined, and derive providerId = input.providerId ?? statePreview?.providerId
there; ensure the catch still calls mapToOAuthFlowError(...) and emits
oauth.flow.callback.failed so preview lookup failures are handled consistently.
- Around line 1031-1037: The current getTokenStorageMode() returns
"plaintext-unsafe" when tokenStorageMode is unset and a plain TokenVault is
injected; change the default behavior to require encryption by returning
"encrypted-required" when this.config.tokenStorageMode is not set so plaintext
storage is not chosen implicitly. Update the getTokenStorageMode()
implementation (referencing getTokenStorageMode, OAuthTokenStorageMode,
this.config.tokenStorageMode, this.config.tokenVault, isEncryptedTokenVault) to
return "encrypted-required" as the default and preserve any explicitly provided
tokenStorageMode values; optionally add a comment or a warning log where
tokenStorageMode is unset to surface that plaintext must be opted into
explicitly.
In `@packages/oauth-http/src/errors.ts`:
- Around line 61-72: The OAuthHttpError construction is copying the resolver's
raw error text into details.causeMessage which may leak secrets; remove the
causeMessage field from the details object and instead rely on the existing
cause property to carry the original Error for debugging. Update the code that
builds the error (the OAuthHttpError call in this file) to omit
details.causeMessage (you can keep providerId, operation, grantType,
hasResolver) and, if you need a non-sensitive indicator, add a boolean flag like
details.resolverFailed = true or a sanitized message token instead of the raw
cause.message.
- Around line 81-92: The current code returns the entire upstream payload as
error details (payload) which can leak sensitive fields; instead, construct a
safe allow-list object containing only specific OAuth-safe keys (for example:
error, error_description, error_uri, status, code, provider) and map values
using asString(...) where appropriate; replace the return's details: payload
with details: { error: asString(payload.error), error_description: description,
error_uri: asString(payload.error_uri), status: asString(payload.status), code:
asString(payload.code), provider: asString(payload.provider) } (omit undefined
entries) so only the allow-listed fields from payload are forwarded.
In `@packages/oauth-http/src/retry-policy.ts`:
- Around line 40-43: The code reads Retry-After via parseRetryAfter into
retryAfterMs and returns it without bounds-checking, so very large headers
bypass maxDelayMs; update the logic where retryAfterMs is handled (the block
using parseRetryAfter and returning { retry: true, delayMs: retryAfterMs }) to
clamp the value to the configured maxDelayMs (use Math.min or equivalent) before
returning so the returned delayMs never exceeds maxDelayMs; reference
parseRetryAfter, retryAfterMs and maxDelayMs when making the change.
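The clamp described above can be sketched as follows; the function and parameter names are assumptions rather than the real retry-policy internals:

```typescript
// Illustrative sketch: the delay derived from Retry-After is clamped so a
// hostile or misconfigured header can never exceed the policy's maxDelayMs.
function clampRetryDelay(
  retryAfterMs: number | undefined,
  backoffMs: number,
  maxDelayMs: number
): number {
  // Fall back to the computed backoff when no Retry-After header was parsed.
  const candidate = retryAfterMs ?? backoffMs;
  return Math.min(candidate, maxDelayMs);
}
```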
In `@packages/oauth-providers/README.md`:
- Around line 51-52: The README contains broken links to
docs/content/docs/authentication/oauth-providers.mdx and
docs/content/docs/architecture/oauth-flow-architecture.mdx; update README.md to
either remove those two markdown links or replace them with the correct existing
documentation paths (or point to high-level docs or a placeholder note) so the
links resolve; look for and use the canonical doc names "oauth-providers.mdx"
and "oauth-flow-architecture.mdx" as references when searching the repo to find
the correct targets or decide to remove the links.
In `@packages/oauth-providers/src/policy-registry.ts`:
- Around line 234-258: assertOperationAllowed currently only checks
operationPolicies but doesn't block caller-supplied featureMode values that are
not allowed by the provider; add a provider-level featureMode check right after
retrieving policy (in assertOperationAllowed / after const policy =
this.getPolicy(input.providerId)) that, if input.featureMode is provided and not
contained in policy.capabilities.allowedFeatureModes, throws a
ProviderPolicyError (same "PROVIDER_POLICY_BLOCKED" shape) including providerId,
operation, policyVersion, attempted featureMode and allowedFeatureModes; keep
the existing operationPolicy.requiredFeatureMode check afterward so
operation-specific requirements still apply.
In `@packages/oauth-providers/src/providers/github.ts`:
- Around line 4-14: The revocationUrl in the GitHub provider contains a
`{client_id}` placeholder that is never substituted before the HTTP call; update
the HTTP transport used by revokeToken (in
packages/oauth-http/src/token-client.ts — look for the TokenClient/revokeToken
implementation and the fetch/executor call around line 116) to perform template
variable substitution on the provider's revocationUrl before calling the
fetcher: accept a context with client_id (from client configuration) and replace
`{client_id}` (and any similar `{...}` placeholders) in the revocationUrl
string, then pass the resulting concrete URL to the fetcher so GitHub revocation
requests use the actual client id. Ensure this substitution is generic (string
replace for `{key}` tokens) so other provider-specific placeholders are
supported.
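The generic placeholder substitution could look like this sketch; the function name and the choice to URL-encode substituted values are assumptions:

```typescript
// Illustrative sketch of generic `{key}` substitution for provider URL
// templates such as GitHub's revocation endpoint.
function substituteUrlTemplate(template: string, context: Record<string, string>): string {
  return template.replace(/\{([A-Za-z0-9_]+)\}/g, (_match, key: string) => {
    const value: string | undefined = context[key];
    if (value === undefined) {
      // Fail loudly rather than sending a literal "{client_id}" to the provider.
      throw new Error(`Missing template variable: ${key}`);
    }
    return encodeURIComponent(value);
  });
}
```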
In `@packages/oauth-providers/src/providers/microsoft.ts`:
- Around line 8-59: The revocation behavior is incorrect:
microsoftOAuthProviderDescriptor.revocationUrl points to the logout endpoint
which doesn't revoke tokens, so update microsoftProviderPolicy and the
descriptor to stop claiming revoke support—remove or clear revocationUrl in
microsoftOAuthProviderDescriptor (or document it as logout-only), set
capabilities.supportsRevoke to false, and change
operationPolicies["oauth.revoke"] to disallow (allowed: false) or remove that
entry; optionally add a comment in the descriptor/policy noting that full
server-side revocation requires Microsoft Graph (e.g., revokeSignInSessions) and
must be implemented separately if desired.
In `@packages/oauth-router/src/connection-routes.ts`:
- Around line 164-177: The parseDisconnectBody function currently casts
request.json() directly and doesn't validate types; update parseDisconnectBody
to safely parse JSON (catch JSON parse errors) and validate each field: use
asRequiredString for connectionId, ensure revokeRemote is either undefined or a
boolean (coerce only from true/false, reject strings), and enforce tokenTypeHint
is either "access_token" or "refresh_token" or undefined; on invalid input throw
a deterministic validation error (400/BadRequest) before calling disconnect() so
malformed payloads are rejected.
- Around line 46-50: The spread order currently overwrites a resolver-provided
requestId from config.resolveStartInput() with the local requestId variable
(which is undefined when the header is absent); update the payload passed to
config.flowService.start so you don't clobber the input.requestId — e.g., only
include the header-derived requestId when it's defined (use a conditional spread
or merge with ...input after header fields) so that config.resolveStartInput()'s
requestId is preserved; relevant symbols: requestIdHeader,
request.headers.get(...), config.resolveStartInput, requestId variable, and
config.flowService.start.
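A sketch of the conditional spread, with StartInput as a simplified stand-in for the real input type:

```typescript
// Illustrative sketch: the header value is spread only when present, so a
// requestId supplied by resolveStartInput() survives an absent header.
interface StartInput {
  providerId: string;
  requestId?: string;
}

function buildStartPayload(input: StartInput, headerRequestId: string | null): StartInput {
  return {
    ...input,
    // Conditional spread: an absent header contributes nothing, instead of
    // overwriting input.requestId with undefined.
    ...(headerRequestId !== null ? { requestId: headerRequestId } : {})
  };
}
```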
In `@packages/oauth-storage/README.md`:
- Around line 67-70: The README references two broken doc links
(`../../docs/content/docs/authentication/oauth-storage.mdx` and
`../../docs/content/docs/architecture/oauth-flow-architecture.mdx`); fix by
either adding the missing MDX files at those paths with the appropriate content,
or update the two link targets in packages/oauth-storage/README.md to point to
existing documentation pages (or remove the two list items entirely); ensure the
"Canonical docs" and "Stack overview" entries are consistent and use valid repo
doc URLs.
In `@packages/oauth-storage/src/adapters/db.ts`:
- Around line 509-515: parseJsonValue<T> currently lets JSON.parse throw raw
SyntaxErrors for malformed persisted JSON; update it to catch JSON.parse
exceptions and rethrow an OAuthDbAdapterError with classification INTERNAL_ERROR
so callers get consistent adapter errors. Specifically, wrap the JSON.parse call
inside a try/catch in parseJsonValue, and in the catch create and throw new
OAuthDbAdapterError("Malformed persisted JSON", { classification:
"INTERNAL_ERROR", cause: err, meta: { raw } }) (or the project's
OAuthDbAdapterError constructor shape) so parsing failures for fields like
subject_payload, requested_scopes, and metadata are normalized into the adapter
error type.
- Around line 211-223: rotateKey currently swallows all errors with a bare catch
which hides real failures; change rotateKey (the method) to only ignore a "row
not found" outcome and surface other failures: either use the return value of
this.adapter.update (if it returns affected rows) to return silently when 0 rows
were updated and otherwise throw, or catch the adapter-specific not-found error
(e.g., check error.name or error.code from this.adapter.update) and rethrow any
other errors so connectivity/write failures propagate; keep the
updated_at/key_ref update logic and only suppress the specific missing-row case.
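The JSON normalization from the first db.ts comment above can be sketched like this, with a simplified stand-in for OAuthDbAdapterError:

```typescript
// Illustrative sketch: JSON.parse failures on persisted columns such as
// subject_payload, requested_scopes, or metadata are rethrown as a single
// adapter error type instead of leaking raw SyntaxErrors to callers.
class OAuthDbAdapterError extends Error {
  readonly classification: string;
  constructor(message: string, classification: string) {
    super(message);
    this.name = "OAuthDbAdapterError";
    this.classification = classification;
  }
}

function parseJsonValue<T>(raw: string, field: string): T {
  try {
    return JSON.parse(raw) as T;
  } catch {
    throw new OAuthDbAdapterError(`Malformed persisted JSON in ${field}`, "INTERNAL_ERROR");
  }
}
```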
In `@packages/oauth-storage/src/adapters/sql.ts`:
- Around line 146-149: The constructor assigns tableName to this.tableName and
that identifier (and the schema helper parameters used across methods referenced
around lines for functions that build SQL) is interpolated directly into
SQL—validate and sanitize these identifiers before interpolation by allowing
only a safe pattern (e.g., letters, digits, underscores, and optional dot for
schema) or by using the SqlClient's identifier-quoting/escaping helper if
available; update the constructor or add a private validateIdentifier(name:
string) helper and call it for tableName and every schema/name parameter used in
SQL-building functions so only vetted identifiers are embedded in the query
strings (reject or throw on invalid values).
- Around line 644-706: The mappers (mapStateRow, mapConsentRow,
mapRevocationFailureRow, parseSubject) currently call JSON.parse with unchecked
casts and pass row.retryable directly, which can leak SyntaxError or wrong
types; update these mappers to safely parse and validate decoded fields: wrap
JSON.parse for requested_scopes and scope_set in a try/catch, verify the result
is an array of strings (fallback or throw a controlled error if not), parse
metadata and confirm it's an object before casting (or set undefined), and
convert row.retryable from integer to boolean (e.g., retryable = row.retryable
=== 1). Use the existing validateOAuthStoredSubject/cloneOAuthStoredSubject in
parseSubject as-is but ensure parseSubject itself catches JSON errors and
rethrows a clearer validation error. Ensure
mapTokenRow/mapStateRow/mapConsentRow/mapRevocationFailureRow all perform these
checks so malformed DB rows cannot leak raw SyntaxError or incorrect runtime
shapes.
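The identifier validation from the first sql.ts comment above can be sketched as a small allow-list check:

```typescript
// Illustrative sketch: only letters, digits, underscores, and one optional
// dot-separated schema prefix are accepted before the name is embedded in SQL.
const IDENTIFIER_PATTERN = /^[A-Za-z_][A-Za-z0-9_]*(\.[A-Za-z_][A-Za-z0-9_]*)?$/;

function validateIdentifier(name: string): string {
  if (!IDENTIFIER_PATTERN.test(name)) {
    throw new Error(`Unsafe SQL identifier: ${name}`);
  }
  return name;
}
```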
In `@packages/oauth-storage/src/index.ts`:
- Around line 82-87: The rotateKey implementations in SqlTokenVault,
DbTokenVault, and MemoryTokenVault currently only update key_ref/updated_at
leaving encryptedPayload encrypted with the old key; update each rotateKey to
perform full re-encryption: locate the TokenRecord (via get or getByConnection
as appropriate), decrypt the existing encryptedPayload using the current key,
re-encrypt the plaintext with the newKeyRef, update encryptedPayload and
updated_at, and persist the whole TokenRecord (or call the existing put(record)
path that performs encryption) so the TokenVault contract is honored; reference
TokenVault.rotateKey, TokenRecord, SqlTokenVault.rotateKey,
DbTokenVault.rotateKey, MemoryTokenVault.rotateKey, and use the same
encryption/decryption helpers used by put().
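A sketch of that rotation contract; the store and cipher shapes are illustrative, and the sketch is synchronous where the real vault methods are async:

```typescript
// Illustrative sketch: decrypt with the record's current keyRef, re-encrypt
// under newKeyRef, and persist the whole record, so ciphertext never stays
// bound to a retired key after rotation.
interface TokenCipher {
  encrypt(plaintext: string, keyRef: string): string;
  decrypt(ciphertext: string, keyRef: string): string;
}

interface TokenRecord {
  tokenId: string;
  keyRef: string;
  encryptedPayload: string;
  updatedAt: string;
}

interface TokenStore {
  get(tokenId: string): TokenRecord | null;
  put(record: TokenRecord): void;
}

function rotateKey(store: TokenStore, cipher: TokenCipher, tokenId: string, newKeyRef: string): void {
  const record = store.get(tokenId);
  if (record === null) {
    return; // only the missing-row case is ignored; store errors propagate
  }
  const plaintext = cipher.decrypt(record.encryptedPayload, record.keyRef);
  store.put({
    ...record,
    keyRef: newKeyRef,
    encryptedPayload: cipher.encrypt(plaintext, newKeyRef),
    updatedAt: new Date().toISOString()
  });
}
```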
In `@packages/oauth-storage/src/token-vault.ts`:
- Around line 68-85: The decrypt method can throw native crypto or parsing
errors (from Buffer.from, createDecipheriv, decipher.update/final) which bypass
the TokenVaultError contract; wrap the parsing and decryption block in a
try/catch inside async decrypt(ciphertext, keyRef) (covering the
split/Buffer.from calls, getKey(), createDecipheriv(), setAuthTag(),
update()/final()) and on any thrown error throw a TokenVaultError (e.g.
"DECRYPTION_ERROR" or "INTERNAL_ERROR") with a concise message and include the
original error.message as context; keep using the existing symbols (decrypt,
getKey, createDecipheriv, decipher.setAuthTag, decipher.update, decipher.final)
so callers always receive TokenVaultError for malformed/tampered/wrong-key
cases.
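A runnable sketch of the wrapped AES-256-GCM decrypt; the TokenVaultError shape and the iv.tag.body ciphertext encoding are assumptions, not the package's actual format:

```typescript
// Illustrative sketch: any failure inside decrypt (bad base64, wrong key,
// tampered auth tag) surfaces as TokenVaultError instead of a raw crypto error.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

class TokenVaultError extends Error {
  constructor(readonly code: string, message: string) {
    super(message);
  }
}

function encrypt(plaintext: string, key: Buffer): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const body = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return [iv, cipher.getAuthTag(), body].map((b) => b.toString("base64")).join(".");
}

function decrypt(ciphertext: string, key: Buffer): string {
  try {
    const [iv, tag, body] = ciphertext.split(".").map((part) => Buffer.from(part, "base64"));
    const decipher = createDecipheriv("aes-256-gcm", key, iv);
    decipher.setAuthTag(tag);
    return Buffer.concat([decipher.update(body), decipher.final()]).toString("utf8");
  } catch (error) {
    throw new TokenVaultError(
      "DECRYPTION_ERROR",
      `Token decryption failed: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}
```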
---
Minor comments:
In `@packages/oauth-flow/src/service.ts`:
- Around line 366-371: The code treats response.expiresIn with a truthy check so
an expiresIn of 0 is ignored; change the ternary that sets expiresAt to
explicitly check for null/undefined (e.g., response.expiresIn !== undefined &&
response.expiresIn !== null or Number.isFinite(response.expiresIn)) and compute
expiresAt as new Date(this.now().getTime() + response.expiresIn *
1000).toISOString() when that check passes; apply the same fix to the other
identical branch that uses response.expiresIn so 0 seconds becomes a real
expiry.
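A sketch of the corrected expiry computation; the helper name is illustrative:

```typescript
// Illustrative sketch: an explicit finite-number check treats expiresIn === 0
// as a real (already expired) expiry instead of dropping it via a truthy test.
function computeExpiresAt(expiresIn: number | null | undefined, now: Date): string | undefined {
  if (typeof expiresIn !== "number" || !Number.isFinite(expiresIn)) {
    return undefined;
  }
  return new Date(now.getTime() + expiresIn * 1000).toISOString();
}
```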
In `@packages/oauth-router/README.md`:
- Around line 47-48: The README links to non-existent docs files
(oauth-router.mdx and oauth-flow-architecture.mdx); update or remove those links
in packages/oauth-router/README.md by either creating the missing files at
docs/content/docs/authentication/oauth-router.mdx and
docs/content/docs/architecture/oauth-flow-architecture.mdx or point the links to
the correct existing doc locations or to a placeholder (e.g., TODO or a
top-level docs index); ensure the two referenced filenames (oauth-router.mdx and
oauth-flow-architecture.mdx) are corrected so the Markdown links resolve.
In `@packages/oauth-storage/src/state-store.ts`:
- Around line 144-157: In applySubjectToStateRecord, the tenantId uses a
redundant ternary that always returns subject.tenantId; update the tenantId
assignment to reflect intent by making it conditional for connection subjects
(tenantId: subject.kind === "connection" ? subject.tenantId : undefined) or, if
browser-auth always includes tenantId, simply use tenantId: subject.tenantId;
locate this change inside applySubjectToStateRecord (alongside
resolveOAuthStoredSubject, cloneOAuthStoredSubject, and cloneUnknownRecord) and
remove the duplicated ternary so tenantId behavior is explicit.
In `@packages/oauth-testing/src/__tests__/docs-inventory.test.ts`:
- Around line 27-31: The test reads and parses meta.json into nav (const nav)
but then calls expect(nav.pages).toContain(...) without handling nav.pages
possibly being undefined; update the test in __tests__/docs-inventory.test.ts to
first assert that nav.pages is defined (e.g., expect(nav.pages).toBeDefined() or
expect(Array.isArray(nav.pages)).toBe(true)) or coerce to an array before
calling toContain, so the assertion on "oauth-router" only runs when nav.pages
is a real array and yields a clear failure if pages is missing.
---
Nitpick comments:
In `@packages/http-openapi/package.json`:
- Around line 40-42: Update the dependency entry for "@superfunctions/http" in
the package.json to use the workspace protocol by changing the version from "*"
to "workspace:*" (i.e., update the "dependencies" key where
"@superfunctions/http" is declared); also scan other package.json files that
currently use "*" for internal packages (e.g., oauth-router, oauth-core) and
replace them with "workspace:*" to ensure explicit local resolution across the
monorepo.
In `@packages/http-openapi/src/__tests__/generate.test.ts`:
- Around line 5-184: Add three new tests in the same suite using createRouter
and generateOpenApiDocument: (1) "merges multiple routers" — create two routers
with different routes (e.g., POST /a and GET /b) and assert the produced
document contains both paths merged; (2) "handles empty routers array" — call
generateOpenApiDocument with routers: [] and assert paths is an empty object;
(3) "allows duplicate operationId values" — create two routes (in same or
different routers) that share the same operationId and assert generation does
not throw and both paths exist in the document (verifying current silent-accept
behavior). Reference functions/types: createRouter, generateOpenApiDocument, and
OpenApiGenerationError for pattern of error assertions used elsewhere.
In `@packages/http-openapi/src/generate.ts`:
- Around line 154-172: sortValue currently relies on short-circuiting to avoid
treating null as an object; make the null check explicit by updating the
conditional that decides to call sortObject so it only recurses when value !==
null && typeof value === "object" (keep the existing Array.isArray branch
intact) to make the intent clearer and avoid any ambiguity in sortValue and
sortObject.
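A sketch of sortValue/sortObject with the explicit null guard, simplified relative to the real generator:

```typescript
// Illustrative sketch: the null check is explicit rather than relying on
// short-circuiting, so intent is unambiguous when recursing into objects.
function sortValue(value: unknown): unknown {
  if (Array.isArray(value)) {
    return value.map(sortValue);
  }
  if (value !== null && typeof value === "object") {
    return sortObject(value as Record<string, unknown>);
  }
  return value; // primitives and null pass through untouched
}

function sortObject(obj: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.keys(obj).sort().map((key) => [key, sortValue(obj[key])])
  );
}
```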
In `@packages/oauth-core/src/state.ts`:
- Around line 38-46: The check in assertRedirectUriAllowed uses exact string
matching (allowlistedRedirectUris.includes(redirectUri)) which treats URIs
with/without trailing slashes as different; update the function to normalize
comparisons instead of raw string equality: parse redirectUri and each entry in
allowlistedRedirectUris (use the URL constructor), compare normalized components
(e.g., origin + pathname after trimming a trailing slash) and consider whether
to include search/hash, and then throw the same OAuthCoreError if no normalized
match is found; alternatively, if strictness is desired, add a comment
documenting the intentional exact-match behavior referencing
assertRedirectUriAllowed, allowlistedRedirectUris and OAuthCoreError.
In `@packages/oauth-core/tests/oauth-core.test.ts`:
- Around line 128-145: The entropy assertion in the "generates unique
high-entropy state values across requests" test is too weak; update the
assertion for result.stateId (produced by createAuthorizationRequest and
pkce.ts) to enforce the expected higher length (e.g., >= 48) and optionally
validate the expected prefix and base64url character set for stronger confidence
in entropy/format rather than just length >= 20. Ensure you reference
result.stateId in the loop and adjust the expect(...) call accordingly.
In `@packages/oauth-http/src/token-client.ts`:
- Around line 498-526: The getGlobalFetch wrapper currently uses
globalThis.fetch without timeout; update getGlobalFetch to create an
AbortController and race a configurable timeout against the fetch call (e.g.,
accept an optional timeoutMs parameter or read a module-level DEFAULT_TIMEOUT),
attach controller.signal to the fetch call inside the returned async function,
and ensure the controller is aborted when the timeout elapses so the promise
rejects; also propagate or translate AbortError into an OAuthHttpError with code
like "TIMEOUT" and retryable=false so callers of getGlobalFetch (and functions
that call it) can handle timeouts uniformly.
- Around line 350-373: The current parseResponseBody treats "text/plain" same as
urlencoded and hands it to parseFormBody, which can accidentally coerce
arbitrary text into a single form entry; update parseResponseBody so that only
"application/x-www-form-urlencoded" triggers parseFormBody, while "text/plain"
should first try safeParseJson and otherwise return the rawBodyText (or another
explicit raw-string fallback) instead of parseFormBody; adjust the content-type
checks in parseResponseBody and use safeParseJson, parseFormBody, and
rawBodyText appropriately to implement this behavior.
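A sketch of that content-type dispatch; the helper names mirror the comment but the shapes are assumptions:

```typescript
// Illustrative sketch: only urlencoded bodies go through form parsing;
// text/plain tries JSON first and otherwise returns the raw string.
function safeParseJson(text: string): unknown {
  try {
    return JSON.parse(text);
  } catch {
    return undefined;
  }
}

function parseResponseBody(contentType: string, rawBodyText: string): unknown {
  if (contentType.includes("application/json")) {
    return safeParseJson(rawBodyText) ?? rawBodyText;
  }
  if (contentType.includes("application/x-www-form-urlencoded")) {
    return Object.fromEntries(new URLSearchParams(rawBodyText));
  }
  if (contentType.includes("text/plain")) {
    // JSON-looking text still parses; anything else stays a raw string
    // instead of being coerced into a bogus single form entry.
    return safeParseJson(rawBodyText) ?? rawBodyText;
  }
  return rawBodyText;
}
```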
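The AbortController-based timeout suggested for getGlobalFetch above can be sketched as follows; the OAuthHttpError shape here is a stand-in for the package's real error class:

```typescript
// Illustrative sketch: a fetch wrapper that aborts after timeoutMs and maps
// the abort into a uniform, non-retryable "TIMEOUT" error.
class OAuthHttpError extends Error {
  constructor(
    readonly code: string,
    message: string,
    readonly retryable: boolean
  ) {
    super(message);
  }
}

function getGlobalFetch(timeoutMs = 30_000): (url: string, init?: RequestInit) => Promise<Response> {
  return async (url, init) => {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      return await globalThis.fetch(url, { ...init, signal: controller.signal });
    } catch (error) {
      // Abort errors are identified by name; DOMException is not reliably
      // an instanceof Error across runtimes.
      if ((error as { name?: string } | null)?.name === "AbortError") {
        throw new OAuthHttpError("TIMEOUT", `Token request timed out after ${timeoutMs}ms`, false);
      }
      throw error;
    } finally {
      clearTimeout(timer);
    }
  };
}
```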
In `@packages/oauth-providers/package.json`:
- Around line 40-42: The package.json dependency for the internal package
"@superfunctions/oauth-core" currently uses "*" — update the dependency
declaration in packages/oauth-providers/package.json to use the workspace
protocol by replacing "*" with "workspace:*" (and similarly audit other internal
packages in this monorepo for the same change) so internal workspace resolution
is enforced; locate the dependency entry for "@superfunctions/oauth-core" in the
package.json and make that single-value substitution.
In `@packages/oauth-providers/src/index.ts`:
- Around line 69-75: Replace the broad "export *" barrel in
packages/oauth-providers/src/index.ts with explicit named exports for each
module to reduce public surface and improve tree-shaking: inspect each module
("policy-registry.js", "stores.js", "providers/google.js",
"providers/microsoft.js", "providers/yahoo.js", "providers/apple.js",
"providers/github.js") and export only the intended public symbols (functions,
classes, types, constants) by name (e.g., export { PublicFunc, PublicType } from
"./providers/google.js") rather than using export *, ensuring internal helpers
are not re-exported and the package's public API is explicit and minimal.
In `@packages/oauth-providers/src/providers/google.ts`:
- Around line 45-48: Update the "mail.read.fullbody" entry to include a short
inline comment explaining why "https://mail.google.com/" is listed in both
allowedScopes and restrictedScopes (to indicate it's both required for
functionality and considered a sensitive/restricted scope that needs elevated
consent/audit); locate the "mail.read.fullbody" object and add the clarifying
comment adjacent to the allowedScopes/restrictedScopes arrays so future
maintainers understand the intentional dual-listing.
In `@packages/oauth-router/README.md`:
- Around line 21-24: The example uses an unconventional type assertion "as
never" in the async resolveStartInput(request) example which is confusing;
update the snippet to use a descriptive placeholder type (e.g., StartInput) or
add an inline comment explaining that "never" is a temporary placeholder and
should be replaced with the real input type, and change the assertion in
resolveStartInput to that descriptive type so readers know where to supply their
actual request body shape.
In `@packages/oauth-router/src/browser-routes.ts`:
- Around line 216-230: parseDisconnectBody currently casts request.json()
without rejecting unexpected or misspelled fields; update parseDisconnectBody to
validate the incoming object keys against the allowed set {connectionId,
revokeRemote, tokenTypeHint} and throw a clear error if extra keys are present,
and also explicitly validate types for revokeRemote (boolean | undefined) and
tokenTypeHint (only "access_token" | "refresh_token" | undefined) before
returning the Omit<OAuthFlowDisconnectInput,...> result; keep using
asRequiredString(body.connectionId, "connectionId") for connectionId and refer
to parseDisconnectBody, OAuthFlowDisconnectInput, and asRequiredString when
implementing the checks.
In `@packages/oauth-router/tsconfig.json`:
- Around line 2-24: Extract the duplicated compilerOptions into a shared
tsconfig.base.json and have this package's tsconfig.json extend it; create a
tsconfig.base.json that contains the common "compilerOptions" block (target,
module, lib, moduleResolution, resolveJsonModule, allowJs, checkJs, outDir,
rootDir, declaration, declarationMap, sourceMap, strict, noUnusedLocals,
noUnusedParameters, noImplicitReturns, noFallthroughCasesInSwitch,
esModuleInterop, skipLibCheck, forceConsistentCasingInFileNames,
isolatedModules) and then update packages/oauth-router/tsconfig.json to replace
that block with an "extends": "./tsconfig.base.json" (or a repo-level base path)
and only keep package-specific overrides if needed.
In `@packages/oauth-storage/README.md`:
- Around line 24-30: Add a short inline comment above the Buffer.alloc(32, 7) in
the AesGcmTokenCipher key provider to state that this is a placeholder/example
key and must not be used in production; mention that real usage should fetch a
secure 32-byte key (e.g., from KMS/secret store) and that keyRef handling should
be replaced with the appropriate key lookup logic for real deployments.
In `@packages/oauth-storage/src/adapters/memory.ts`:
- Around line 98-108: The final ternary in the method that retrieves a token by
connection (the block using this.tokensById.get(tokenId),
this.tokenByConnection.delete(connectionId), and cloneTokenRecord) is redundant
because record is already checked for null earlier; replace the ternary return
`return record ? cloneTokenRecord(record) : null;` with a direct `return
cloneTokenRecord(record);` (or an explicit non-null assertion/cast if TypeScript
complains) to simplify the code while keeping the existing null-early returns
and tokenByConnection cleanup.
In `@packages/oauth-storage/src/state-store.ts`:
- Around line 245-247: cloneUnknownRecord currently performs a shallow clone
using object spread which will not protect stored metadata from mutations when
the value contains nested objects; replace the shallow clone with a deep-clone
operation (e.g., use structuredClone(value) when available or a reliable
deep-clone utility such as lodash.clonedeep or JSON.parse(JSON.stringify(value))
as a fallback) while preserving the undefined handling so
cloneUnknownRecord(value: Record<string, unknown> | undefined) returns undefined
for undefined input and a deep-cloned object otherwise.
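A sketch of the deep clone with the structuredClone-first, JSON-round-trip fallback behavior:

```typescript
// Illustrative sketch: nested metadata objects are deep-cloned so later
// mutations of the returned value cannot leak back into the stored record.
function cloneUnknownRecord(
  value: Record<string, unknown> | undefined
): Record<string, unknown> | undefined {
  if (value === undefined) {
    return undefined;
  }
  if (typeof structuredClone === "function") {
    return structuredClone(value);
  }
  // Fallback for older runtimes; drops functions/undefined values.
  return JSON.parse(JSON.stringify(value)) as Record<string, unknown>;
}
```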
In `@packages/oauth-storage/tests/db-adapter.test.ts`:
- Around line 366-375: The helper function parseStoredJsonFields should
defensively handle malformed JSON: wrap each JSON.parse call (for keys
"subject_payload", "requested_scopes", "scope_set", "metadata") in a try-catch
inside parseStoredJsonFields, and on parse failure throw a new Error (or
rethrow) that includes the key name and the original value to make test failures
clearer and easier to debug.
In `@packages/oauth-storage/tests/shared-schema.test.ts`:
- Around line 22-29: The test repeatedly calls statements.join("\n") which is
redundant; after calling getOAuthStorageSchema() (the statements variable),
compute a single joined string (e.g., const sql = statements.join("\n")) and use
that cached sql variable in each expect call in the "renders deterministic SQL
schema statements for all shared OAuth tables" test so the join happens once and
improves readability.
In `@packages/oauth-testing/src/browser-auth-fixtures.ts`:
- Around line 78-100: redactSecrets currently only redacts based on key names
(shouldRedactKey) so secret tokens embedded in free-form strings (e.g., message,
error) are missed; update redactValue to also test primitive string values
against the same secret regex and replace matching strings with REDACTED (e.g.,
if typeof value === "string" and
/(token|secret|authorization|bearer|password)/i.test(value) then return
REDACTED) while preserving existing array/object recursion and key-based
redaction in redactValue and keeping shouldRedactKey as-is for key checks.
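A sketch of value-level redaction layered on the existing key-based checks; the regex matches the one quoted above:

```typescript
// Illustrative sketch: redact by key name AND by scanning primitive string
// values, so secrets embedded in free-form fields like "message" are caught.
const SECRET_PATTERN = /(token|secret|authorization|bearer|password)/i;
const REDACTED = "[REDACTED]";

function redactValue(value: unknown): unknown {
  if (typeof value === "string") {
    return SECRET_PATTERN.test(value) ? REDACTED : value;
  }
  if (Array.isArray(value)) {
    return value.map(redactValue);
  }
  if (typeof value === "object" && value !== null) {
    return Object.fromEntries(
      Object.entries(value).map(([key, entry]) =>
        SECRET_PATTERN.test(key) ? [key, REDACTED] : [key, redactValue(entry)]
      )
    );
  }
  return value;
}
```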
In `@packages/oauth-testing/src/index.ts`:
- Around line 12-26: The mock provider descriptor returned by
createMockOAuthProviderDescriptor is missing the revocationUrl property; update
the function to include a sensible default revocationUrl (e.g.,
"https://mock.example/revoke") in the returned OAuthProviderDescriptor object
while keeping the ...overrides spread so callers can override or omit it as
needed; ensure the property name exactly matches revocationUrl used in real
descriptors.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
private getTokenStorageMode(): OAuthTokenStorageMode {
  if (this.config.tokenStorageMode) {
    return this.config.tokenStorageMode;
  }

  return isEncryptedTokenVault(this.config.tokenVault) ? "encrypted-required" : "plaintext-unsafe";
}
Require explicit opt-in for plaintext token storage.
When tokenStorageMode is unset and a plain TokenVault is injected, getTokenStorageMode() returns plaintext-unsafe, so access and refresh tokens are silently persisted unencrypted. That makes the insecure path the default instead of an explicit override.
Suggested fix
private getTokenStorageMode(): OAuthTokenStorageMode {
- if (this.config.tokenStorageMode) {
- return this.config.tokenStorageMode;
- }
-
- return isEncryptedTokenVault(this.config.tokenVault) ? "encrypted-required" : "plaintext-unsafe";
+ return this.config.tokenStorageMode ?? "encrypted-required";
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/oauth-flow/src/service.ts` around lines 1031 - 1037, The current
getTokenStorageMode() returns "plaintext-unsafe" when tokenStorageMode is unset
and a plain TokenVault is injected; change the default behavior to require
encryption by returning "encrypted-required" when this.config.tokenStorageMode
is not set so plaintext storage is not chosen implicitly. Update the
getTokenStorageMode() implementation (referencing getTokenStorageMode,
OAuthTokenStorageMode, this.config.tokenStorageMode, this.config.tokenVault,
isEncryptedTokenVault) to return "encrypted-required" as the default and
preserve any explicitly provided tokenStorageMode values; optionally add a
comment or a warning log where tokenStorageMode is unset to surface that
plaintext must be opted into explicitly.
const payload = rawBody as Record<string, unknown>;
const description =
  asString(payload.error_description) ??
  asString(payload.message) ??
  asString(payload.error) ??
  asString(payload.detail) ??
  "OAuth provider request failed";

return {
  message: description,
  details: payload
};
Avoid attaching the full provider response as error details.
This forwards every upstream field into the structured error. If a provider echoes request fragments or internal identifiers, they will leak anywhere this error is logged or serialized. Allow-list the small set of safe OAuth fields instead.
Suggested fix
const payload = rawBody as Record<string, unknown>;
const description =
asString(payload.error_description) ??
asString(payload.message) ??
asString(payload.error) ??
asString(payload.detail) ??
"OAuth provider request failed";
+ const details = Object.fromEntries(
+ Object.entries({
+ error: asString(payload.error),
+ error_description: asString(payload.error_description),
+ error_uri: asString(payload.error_uri),
+ detail: asString(payload.detail)
+ }).filter(([, value]) => value !== undefined)
+ );
return {
message: description,
- details: payload
+ details: Object.keys(details).length > 0 ? details : undefined
};
}📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const payload = rawBody as Record<string, unknown>;
const description =
  asString(payload.error_description) ??
  asString(payload.message) ??
  asString(payload.error) ??
  asString(payload.detail) ??
  "OAuth provider request failed";
const details = Object.fromEntries(
  Object.entries({
    error: asString(payload.error),
    error_description: asString(payload.error_description),
    error_uri: asString(payload.error_uri),
    detail: asString(payload.detail)
  }).filter(([, value]) => value !== undefined)
);
return {
  message: description,
  details: Object.keys(details).length > 0 ? details : undefined
};
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/oauth-http/src/errors.ts` around lines 81 - 92, The current code
returns the entire upstream payload as error details (payload) which can leak
sensitive fields; instead, construct a safe allow-list object containing only
specific OAuth-safe keys (for example: error, error_description, error_uri,
status, code, provider) and map values using asString(...) where appropriate;
replace the return's details: payload with details: { error:
asString(payload.error), error_description: description, error_uri:
asString(payload.error_uri), status: asString(payload.status), code:
asString(payload.code), provider: asString(payload.provider) } (omit undefined
entries) so only the allow-listed fields from payload are forwarded.
```typescript
export interface TokenVault {
  put(record: TokenRecord): Promise<void>;
  get(tokenId: string): Promise<TokenRecord | null>;
  getByConnection(connectionId: string): Promise<TokenRecord | null>;
  rotateKey(tokenId: string, newKeyRef: string): Promise<void>;
  deleteByConnection(connectionId: string): Promise<void>;
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Inspect rotateKey implementations and whether callers also rewrite encryptedPayload.
# Expectation: a safe rotation path must either persist a new encryptedPayload or avoid this API entirely.
rg -n -C3 '\brotateKey\s*\(' packages/oauth-storage
rg -n -C3 '\bencryptedPayload\b|\bnewKeyRef\b' packages/oauth-storage
```

Repository: 21nCo/super-functions
Length of output: 18023

🏁 Script executed:

```bash
#!/bin/bash
# Check what's exported from index.ts (the public API)
cat -n packages/oauth-storage/src/index.ts | head -100
# Check if adapters are imported/exported
rg -n 'export.*adapter|export.*storage' packages/oauth-storage/src/index.ts
```

Repository: 21nCo/super-functions
Length of output: 3689
Storage adapter implementations of rotateKey() only update metadata without re-encrypting data.
The rotateKey() methods in SqlTokenVault, DbTokenVault, and MemoryTokenVault (lines 322, 211, and 110 respectively) update only key_ref and updated_at, leaving encryptedPayload encrypted with the old key. While the TokenVault class performs full re-encryption via put(), these adapters are exported directly as part of the public API and can be used independently, violating the semantic contract of key rotation.
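A rotation that honors the contract has to round-trip the ciphertext. The sketch below works against a hypothetical in-memory store and a `TokenCipher` stand-in for the vault's real encrypt/decrypt helpers; the names are illustrative, not the package's actual API. It shows the shape each adapter's `rotateKey` would need:

```typescript
// Assumed stand-in for the vault's encryption helpers (not the real API).
interface TokenCipher {
  encrypt(plaintext: string, keyRef: string): Promise<string>;
  decrypt(ciphertext: string, keyRef: string): Promise<string>;
}

// Simplified record shape for illustration.
interface StoredToken {
  tokenId: string;
  keyRef: string;
  encryptedPayload: string;
  updatedAt: string;
}

async function rotateKeyWithReencryption(
  store: Map<string, StoredToken>,
  cipher: TokenCipher,
  tokenId: string,
  newKeyRef: string
): Promise<void> {
  const record = store.get(tokenId);
  if (!record) {
    throw new Error(`token ${tokenId} not found`);
  }
  // Decrypt with the old key, re-encrypt with the new one, then persist the
  // whole record so key_ref and ciphertext never drift apart.
  const plaintext = await cipher.decrypt(record.encryptedPayload, record.keyRef);
  const encryptedPayload = await cipher.encrypt(plaintext, newKeyRef);
  store.set(tokenId, {
    ...record,
    keyRef: newKeyRef,
    encryptedPayload,
    updatedAt: new Date().toISOString()
  });
}
```

The same shape applies to the SQL and DB adapters: load the row, decrypt, re-encrypt, and write the full row back in one transaction.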
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/oauth-storage/src/index.ts` around lines 82 - 87, The rotateKey
implementations in SqlTokenVault, DbTokenVault, and MemoryTokenVault currently
only update key_ref/updated_at leaving encryptedPayload encrypted with the old
key; update each rotateKey to perform full re-encryption: locate the TokenRecord
(via get or getByConnection as appropriate), decrypt the existing
encryptedPayload using the current key, re-encrypt the plaintext with the
newKeyRef, update encryptedPayload and updated_at, and persist the whole
TokenRecord (or call the existing put(record) path that performs encryption) so
the TokenVault contract is honored; reference TokenVault.rotateKey, TokenRecord,
SqlTokenVault.rotateKey, DbTokenVault.rotateKey, MemoryTokenVault.rotateKey, and
use the same encryption/decryption helpers used by put().
```typescript
async decrypt(ciphertext: string, keyRef: string): Promise<string> {
  if (!keyRef) {
    throw new TokenVaultError("VALIDATION_ERROR", "keyRef is required");
  }

  const [version, ivPart, authTagPart, payloadPart] = ciphertext.split(":");
  if (version !== "v1" || !ivPart || !authTagPart || !payloadPart) {
    throw new TokenVaultError("INTERNAL_ERROR", "Invalid encrypted payload format");
  }

  const key = await this.getKey(keyRef);
  const iv = Buffer.from(ivPart, "base64url");
  const authTag = Buffer.from(authTagPart, "base64url");
  const payload = Buffer.from(payloadPart, "base64url");
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(authTag);
  const decrypted = Buffer.concat([decipher.update(payload), decipher.final()]);
  return decrypted.toString("utf8");
```
Normalize decrypt failures to TokenVaultError.
Wrong key material, tampered ciphertext, or malformed encoded parts currently escape as native crypto errors from createDecipheriv() / decipher.final(). That breaks the vault's typed error contract and makes corruption handling inconsistent for callers.
Suggested fix
```diff
 async decrypt(ciphertext: string, keyRef: string): Promise<string> {
   if (!keyRef) {
     throw new TokenVaultError("VALIDATION_ERROR", "keyRef is required");
   }
-  const [version, ivPart, authTagPart, payloadPart] = ciphertext.split(":");
-  if (version !== "v1" || !ivPart || !authTagPart || !payloadPart) {
-    throw new TokenVaultError("INTERNAL_ERROR", "Invalid encrypted payload format");
-  }
-
-  const key = await this.getKey(keyRef);
-  const iv = Buffer.from(ivPart, "base64url");
-  const authTag = Buffer.from(authTagPart, "base64url");
-  const payload = Buffer.from(payloadPart, "base64url");
-  const decipher = createDecipheriv("aes-256-gcm", key, iv);
-  decipher.setAuthTag(authTag);
-  const decrypted = Buffer.concat([decipher.update(payload), decipher.final()]);
-  return decrypted.toString("utf8");
+  try {
+    const [version, ivPart, authTagPart, payloadPart] = ciphertext.split(":");
+    if (version !== "v1" || !ivPart || !authTagPart || !payloadPart) {
+      throw new TokenVaultError("INTERNAL_ERROR", "Invalid encrypted payload format");
+    }
+
+    const key = await this.getKey(keyRef);
+    const iv = Buffer.from(ivPart, "base64url");
+    const authTag = Buffer.from(authTagPart, "base64url");
+    const payload = Buffer.from(payloadPart, "base64url");
+    const decipher = createDecipheriv("aes-256-gcm", key, iv);
+    decipher.setAuthTag(authTag);
+    const decrypted = Buffer.concat([decipher.update(payload), decipher.final()]);
+    return decrypted.toString("utf8");
+  } catch (error) {
+    if (error instanceof TokenVaultError) {
+      throw error;
+    }
+    throw new TokenVaultError("INTERNAL_ERROR", "Failed to decrypt token payload");
+  }
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/oauth-storage/src/token-vault.ts` around lines 68 - 85, The decrypt
method can throw native crypto or parsing errors (from Buffer.from,
createDecipheriv, decipher.update/final) which bypass the TokenVaultError
contract; wrap the parsing and decryption block in a try/catch inside async
decrypt(ciphertext, keyRef) (covering the split/Buffer.from calls, getKey(),
createDecipheriv(), setAuthTag(), update()/final()) and on any thrown error
throw a TokenVaultError (e.g. "DECRYPTION_ERROR" or "INTERNAL_ERROR") with a
concise message and include the original error.message as context; keep using
the existing symbols (decrypt, getKey, createDecipheriv, decipher.setAuthTag,
decipher.update, decipher.final) so callers always receive TokenVaultError for
malformed/tampered/wrong-key cases.
There was a problem hiding this comment.
♻️ Duplicate comments (2)
packages/oauth-storage/src/adapters/db.ts (1)

506-512: ⚠️ Potential issue | 🟠 Major
Wrap JSON.parse errors in OAuthDbAdapterError.

When `raw` is a malformed JSON string, `JSON.parse(raw)` throws a raw `SyntaxError` that escapes the adapter's error classification. This was flagged in a previous review. Callers expecting `OAuthDbAdapterError` will instead receive an untyped exception.

Suggested fix

```diff
 function parseJsonValue<T>(raw: unknown): T {
   if (typeof raw === "string") {
-    return JSON.parse(raw) as T;
+    try {
+      return JSON.parse(raw) as T;
+    } catch {
+      throw new OAuthDbAdapterError("INTERNAL_ERROR", "Malformed persisted JSON");
+    }
   }
   return raw as T;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/adapters/db.ts` around lines 506 - 512, The parseJsonValue function should catch JSON.parse errors and rethrow them as an OAuthDbAdapterError so callers always receive the adapter's error type; modify parseJsonValue<T> to try { JSON.parse(raw) as T } catch (err) { throw new OAuthDbAdapterError("Failed to parse JSON value", { cause: err }) } (or include the original error message) when raw is a string, keeping the same return for non-strings; reference the parseJsonValue function and the OAuthDbAdapterError class when making the change.

packages/oauth-flow/src/service.ts (1)

1023-1029: ⚠️ Potential issue | 🟠 Major
Plaintext token storage remains the implicit default.

When `tokenStorageMode` is unset and a plain `TokenVault` (not `EncryptedTokenVault`) is injected, `getTokenStorageMode()` returns "plaintext-unsafe", making unencrypted storage the default. This was flagged in a previous review but appears unchanged.

Consider requiring explicit opt-in for plaintext storage to prevent accidental insecure configurations:

Suggested fix to default to encrypted-required

```diff
 private getTokenStorageMode(): OAuthTokenStorageMode {
   if (this.config.tokenStorageMode) {
     return this.config.tokenStorageMode;
   }
-  return isEncryptedTokenVault(this.config.tokenVault) ? "encrypted-required" : "plaintext-unsafe";
+  return "encrypted-required";
 }
```

This would cause operations to fail with OAUTH_TOKEN_STORAGE_UNSAFE if a plain TokenVault is used without explicitly setting tokenStorageMode: "plaintext-unsafe", making the insecure path opt-in rather than implicit.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-flow/src/service.ts` around lines 1023 - 1029, getTokenStorageMode currently falls back to "plaintext-unsafe" when tokenStorageMode is unset, making plaintext the implicit default; change it so the default is "encrypted-required" unless the caller explicitly sets tokenStorageMode to "plaintext-unsafe". Update the method (getTokenStorageMode) to: if config.tokenStorageMode is present return it; otherwise return "encrypted-required" (or, if you prefer, check isEncryptedTokenVault and only allow "plaintext-unsafe" when config.tokenStorageMode === "plaintext-unsafe"); ensure references to OAuthTokenStorageMode and isEncryptedTokenVault are used to gate behavior so plaintext becomes an opt-in only.
🧹 Nitpick comments (3)
packages/oauth-http/tests/token-client.test.ts (1)

505-529: Consider adding a clone() method to the mock response.

The `createResponse` helper provides a minimal mock that works for current tests. However, the standard `Response` interface includes a `clone()` method that some code paths may use.

Optional enhancement

```diff
 return {
   ok: input.status >= 200 && input.status <= 299,
   status: input.status,
   headers: {
     get(name: string) {
       return headerMap.get(name.toLowerCase()) ?? null;
     }
   },
   async text() {
     return input.body;
+  },
+  clone() {
+    return createResponse(input);
   }
 };
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-http/tests/token-client.test.ts` around lines 505 - 529, The mock Response produced by createResponse is missing the standard clone() method; add a clone() function to the returned object that returns a new object with the same status, ok, headers.get behavior and text() behavior (i.e., a shallow copy of the response including a fresh headers map and the same body string) so code paths that call response.clone() behave like a real Response; update createResponse to construct and return this clone-capable object.

packages/oauth-storage/src/token-vault.ts (1)

246-254: ISO-8601 regex allows common variations but may reject valid formats.

The regex `/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d{3})?Z$/` accepts 2024-01-15T10:30:00Z and 2024-01-15T10:30:00.000Z but rejects:

- Higher precision: 2024-01-15T10:30:00.123456Z
- Timezone offsets: 2024-01-15T10:30:00+00:00

Since `new Date().toISOString()` produces the .000Z format, this is fine for internally generated timestamps. Just be aware that externally provided timestamps with different precision may be rejected.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/token-vault.ts` around lines 246 - 254, The assertIsoTimestamp validation is too strict: update the regex in assertIsoTimestamp to accept fractional seconds of any length and timezone offsets (e.g. +00:00) instead of only exactly three fractional digits and a trailing Z; specifically replace the current pattern with one that uses (?:\.\d+)? to allow variable fractional precision and (?:Z|[+-]\d{2}:\d{2}) to allow offsets, keep the subsequent Date.parse and TokenVaultError behavior unchanged so invalid strings still throw the same error.

packages/oauth-storage/src/adapters/sql.ts (1)

761-765: SQL identifier validation doesn't support schema-qualified names.

The current regex `^[A-Za-z_][A-Za-z0-9_]*$` rejects schema-qualified identifiers like "public.oauth_states". If cross-schema deployment is needed, consider extending the pattern:

Optional: Support schema-qualified names

```diff
 function validateSqlIdentifier(value: string, fieldName: string): string {
-  if (!/^[A-Za-z_][A-Za-z0-9_]*$/.test(value)) {
-    throw new Error(`${fieldName} must contain only letters, numbers, and underscores`);
+  // Allows: "table_name" or "schema.table_name"
+  if (!/^[A-Za-z_][A-Za-z0-9_]*(?:\.[A-Za-z_][A-Za-z0-9_]*)?$/.test(value)) {
+    throw new Error(`${fieldName} must be a valid SQL identifier (letters, numbers, underscores, optional schema prefix)`);
   }
   return value;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-storage/src/adapters/sql.ts` around lines 761 - 765, The validateSqlIdentifier function currently rejects schema-qualified names; update validateSqlIdentifier to accept an optional schema prefix by changing the validation to allow either a single identifier or a dot-separated schema.identifier (e.g. use a pattern like ^[A-Za-z_][A-Za-z0-9_]*(?:\.[A-Za-z_][A-Za-z0-9_]*)?$) and update the error message to reflect that schema-qualified names (schema.table) are permitted; keep the function name validateSqlIdentifier and the fieldName parameter unchanged.
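The extended pattern can be exercised standalone. This copy mirrors the suggested diff above but is illustrative rather than the package's actual source:

```typescript
// Validate a SQL identifier, optionally schema-qualified ("schema.table").
// Illustrative standalone version of the suggested validateSqlIdentifier fix.
function validateSqlIdentifier(value: string, fieldName: string): string {
  if (!/^[A-Za-z_][A-Za-z0-9_]*(?:\.[A-Za-z_][A-Za-z0-9_]*)?$/.test(value)) {
    throw new Error(
      `${fieldName} must be a valid SQL identifier (letters, numbers, underscores, optional schema prefix)`
    );
  }
  return value;
}
```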
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@packages/oauth-flow/src/service.ts`:
- Around line 1023-1029: getTokenStorageMode currently falls back to
"plaintext-unsafe" when tokenStorageMode is unset, making plaintext the implicit
default; change it so the default is "encrypted-required" unless the caller
explicitly sets tokenStorageMode to "plaintext-unsafe". Update the method
(getTokenStorageMode) to: if config.tokenStorageMode is present return it;
otherwise return "encrypted-required" (or, if you prefer, check
isEncryptedTokenVault and only allow "plaintext-unsafe" when
config.tokenStorageMode === "plaintext-unsafe"); ensure references to
OAuthTokenStorageMode and isEncryptedTokenVault are used to gate behavior so
plaintext becomes an opt-in only.
In `@packages/oauth-storage/src/adapters/db.ts`:
- Around line 506-512: The parseJsonValue function should catch JSON.parse
errors and rethrow them as an OAuthDbAdapterError so callers always receive the
adapter's error type; modify parseJsonValue<T> to try { JSON.parse(raw) as T }
catch (err) { throw new OAuthDbAdapterError("Failed to parse JSON value", {
cause: err }) } (or include the original error message) when raw is a string,
keeping the same return for non-strings; reference the parseJsonValue function
and the OAuthDbAdapterError class when making the change.
---
Nitpick comments:
In `@packages/oauth-http/tests/token-client.test.ts`:
- Around line 505-529: The mock Response produced by createResponse is missing
the standard clone() method; add a clone() function to the returned object that
returns a new object with the same status, ok, headers.get behavior and text()
behavior (i.e., a shallow copy of the response including a fresh headers map and
the same body string) so code paths that call response.clone() behave like a
real Response; update createResponse to construct and return this clone-capable
object.
In `@packages/oauth-storage/src/adapters/sql.ts`:
- Around line 761-765: The validateSqlIdentifier function currently rejects
schema-qualified names; update validateSqlIdentifier to accept an optional
schema prefix by changing the validation to allow either a single identifier or
a dot-separated schema.identifier (e.g. use a pattern like
^[A-Za-z_][A-Za-z0-9_]*(?:\.[A-Za-z_][A-Za-z0-9_]*)?$) and update the error
message to reflect that schema-qualified names (schema.table) are permitted;
keep the function name validateSqlIdentifier and the fieldName parameter
unchanged.
In `@packages/oauth-storage/src/token-vault.ts`:
- Around line 246-254: The assertIsoTimestamp validation is too strict: update
the regex in assertIsoTimestamp to accept fractional seconds of any length and
timezone offsets (e.g. +00:00) instead of only exactly three fractional digits
and a trailing Z; specifically replace the current pattern with one that uses
(?:\.\d+)? to allow variable fractional precision and (?:Z|[+-]\d{2}:\d{2}) to
allow offsets, keep the subsequent Date.parse and TokenVaultError behavior
unchanged so invalid strings still throw the same error.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 6280bf96-d4d3-453a-ae37-5534796b7999
📒 Files selected for processing (11)
- packages/oauth-flow/src/service.ts
- packages/oauth-http/src/errors.ts
- packages/oauth-http/src/retry-policy.ts
- packages/oauth-http/tests/token-client.test.ts
- packages/oauth-router/src/__tests__/routes.test.ts
- packages/oauth-router/src/connection-routes.ts
- packages/oauth-storage/src/adapters/db.ts
- packages/oauth-storage/src/adapters/memory.ts
- packages/oauth-storage/src/adapters/sql.ts
- packages/oauth-storage/src/token-vault.ts
- packages/oauth-storage/tests/db-adapter.test.ts
✅ Files skipped from review due to trivial changes (1)
- packages/oauth-router/src/__tests__/routes.test.ts
🚧 Files skipped from review as they are similar to previous changes (3)
- packages/oauth-storage/tests/db-adapter.test.ts
- packages/oauth-http/src/retry-policy.ts
- packages/oauth-storage/src/adapters/memory.ts
2 issues found across 11 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/oauth-http/src/retry-policy.ts">
<violation number="1" location="packages/oauth-http/src/retry-policy.ts:42">
P1: Do not cap `Retry-After` with `maxDelayMs`; it can force retries earlier than the server-requested backoff window.</violation>
</file>
<file name="packages/oauth-flow/src/service.ts">
<violation number="1" location="packages/oauth-flow/src/service.ts:441">
P2: `providerId` can be undefined in the callback failure event after moving the state lookup into the try block. If state lookup fails before `providerId` is resolved, the failure event now emits `providerId: undefined` instead of a safe fallback, which can break event consumers expecting a string.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 1a25f6181c
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```typescript
const provider = this.getProvider(input.providerId);
const existing = await this.getTokenSetByConnection(input.connectionId);
```
Verify token provider before refresh exchange
refresh() loads the stored token by connectionId but immediately uses input.providerId to choose the OAuth endpoint, without checking that the stored record belongs to the same provider. If a caller passes a mismatched provider, the method can send provider A’s refresh token to provider B’s token endpoint, causing cross-provider token leakage and hard-to-debug refresh failures; this should fail fast on provider mismatch before building the exchange request.
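A minimal fail-fast guard could be inserted before the exchange request is built. The record shape below is an assumption (a stored record that carries its originating providerId); the function name is hypothetical:

```typescript
// Assumed minimal shape of a stored token record for this check.
interface TokenRecordLike {
  connectionId: string;
  providerId: string;
}

// Throw before any network call if the stored record belongs to a
// different provider than the one the caller requested.
function assertProviderMatches(record: TokenRecordLike, requestedProviderId: string): void {
  if (record.providerId !== requestedProviderId) {
    throw new Error(
      `provider mismatch for connection ${record.connectionId}: ` +
        `stored=${record.providerId} requested=${requestedProviderId}`
    );
  }
}
```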
```typescript
id: "github",
authorizationUrl: "https://github.com/login/oauth/authorize",
tokenUrl: "https://github.com/login/oauth/access_token",
revocationUrl: "https://api.github.com/applications/{client_id}/token",
```
Use concrete GitHub revocation endpoint URL
The GitHub descriptor sets revocationUrl to a template containing {client_id}, but DefaultOAuthTokenHttpClient.revokeToken() sends provider.revocationUrl as-is. That means revoke calls for GitHub will target a literal placeholder path instead of a valid app-specific endpoint, so revokeRemote disconnects for GitHub will consistently fail.
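One way to make the descriptor usable is to expand placeholders before the HTTP client sends the request. This sketch assumes the `{client_id}`-style syntax shown in the descriptor; the helper is illustrative and not part of the existing client:

```typescript
// Expand "{name}" placeholders in a revocation URL template.
// Throws if a required parameter is missing instead of sending a
// literal "{client_id}" path to the provider.
function expandRevocationUrl(template: string, params: Record<string, string>): string {
  return template.replace(/\{([A-Za-z_]+)\}/g, (_match, name: string) => {
    const value = params[name];
    if (value === undefined) {
      throw new Error(`missing template parameter: ${name}`);
    }
    return encodeURIComponent(value);
  });
}
```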
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 13dd37e9ef
```typescript
if (subject.kind === "connection" && subject.connectionId) {
  this.pendingConnectionByState.set(result.stateId, subject.connectionId);
```
Bound pendingConnectionByState growth
Each connection start stores an entry in pendingConnectionByState, but eviction only happens on the successful connection callback path; abandoned logins or failed callbacks leave entries behind indefinitely. In long-running services this becomes an unbounded in-memory cache, and it is especially unnecessary because connectionId is already persisted in OAuth state and can be recovered from there.
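If an in-memory map is kept at all, a TTL-bounded variant avoids unbounded growth: stale entries are swept on insert and expire on read. This is an illustrative sketch, not the service's actual cache:

```typescript
// A small TTL-bounded map: set() sweeps expired entries, take() is a
// one-shot read that also deletes the entry (mirroring callback semantics).
class TtlMap<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  set(key: string, value: V): void {
    const t = this.now();
    // Sweep anything already expired so abandoned logins cannot accumulate.
    for (const [k, e] of this.entries) {
      if (e.expiresAt <= t) this.entries.delete(k);
    }
    this.entries.set(key, { value, expiresAt: t + this.ttlMs });
  }

  take(key: string): V | undefined {
    const entry = this.entries.get(key);
    this.entries.delete(key);
    if (!entry || entry.expiresAt <= this.now()) return undefined;
    return entry.value;
  }
}
```

As the comment notes, the simpler fix is to drop the map entirely and recover connectionId from the persisted OAuth state.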
```typescript
prompt: undefined,
loginHint: undefined
});
const tokenToRevoke = tokenRecord.tokenSet.refreshToken ?? tokenRecord.tokenSet.accessToken;
```
Honor tokenTypeHint when selecting revoke token
The disconnect flow always prefers refreshToken for remote revocation even when the caller explicitly sets tokenTypeHint: "access_token". That can send a token/hint mismatch to providers and revoke the wrong credential (or fail revocation entirely) for clients that intentionally request access-token revocation semantics.
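A selection helper that honors the hint could look like the sketch below. The type names are assumptions, not the flow package's real types; the fallback preserves the refresh-token-first default when no hint is given:

```typescript
interface TokenSetLike {
  accessToken: string;
  refreshToken?: string;
}

type TokenTypeHint = "access_token" | "refresh_token";

// Pick the credential to revoke: honor an explicit hint first, then fall
// back to refresh-token-first (the current behavior) when no hint is set.
function selectTokenToRevoke(
  tokens: TokenSetLike,
  tokenTypeHint?: TokenTypeHint
): { token: string; hint: TokenTypeHint } {
  if (tokenTypeHint === "access_token") {
    return { token: tokens.accessToken, hint: "access_token" };
  }
  if (tokenTypeHint === "refresh_token" && tokens.refreshToken) {
    return { token: tokens.refreshToken, hint: "refresh_token" };
  }
  return tokens.refreshToken
    ? { token: tokens.refreshToken, hint: "refresh_token" }
    : { token: tokens.accessToken, hint: "access_token" };
}
```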
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 0b638c0b35
```sql
  updated_at,
  expires_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(token_id) DO UPDATE SET
```
Upsert tokens by connection_id to avoid reconnect failures
SqlTokenVault.put() only resolves conflicts on token_id, but reconnect callbacks generate a new token id each time while reusing the same connectionId. Because the schema enforces connection_id uniqueness, a second connect for the same connection can throw a unique-constraint error instead of replacing the old token record, breaking normal re-link/rotate flows. The conflict target (or replacement strategy) needs to handle connection_id collisions.
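One possible shape for the statement, assuming a SQLite-style dialect and the column names quoted above (the exact schema here is an assumption, not the package's actual DDL): conflicting on connection_id lets a re-link replace the old row, with the new token_id written via the excluded pseudo-table.

```typescript
// Sketch of an upsert keyed on connection_id so reconnecting the same
// connection replaces the prior token row instead of hitting the unique
// constraint. Column names follow the review comment; schema is assumed.
const UPSERT_TOKEN_SQL = `
INSERT INTO oauth_tokens (
  token_id, connection_id, key_ref, encrypted_payload, updated_at
) VALUES (?, ?, ?, ?, ?)
ON CONFLICT(connection_id) DO UPDATE SET
  token_id = excluded.token_id,
  key_ref = excluded.key_ref,
  encrypted_payload = excluded.encrypted_payload,
  updated_at = excluded.updated_at
`;
```

Dialects without `ON CONFLICT` need a delete-then-insert inside a transaction instead.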
```typescript
revokeRemote: body.revokeRemote,
tokenTypeHint: body.tokenTypeHint
```
Validate browser disconnect payload types before forwarding
parseDisconnectBody() trusts request.json() typing and forwards revokeRemote directly without runtime checks. If a client sends "revokeRemote": "false" (string), downstream flow logic treats it as truthy (input.revokeRemote ?? false) and attempts remote revocation unexpectedly. This can revoke upstream tokens on malformed input instead of rejecting the request with validation errors.
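A runtime check along these lines would reject mis-typed payloads instead of coercing them. The field names mirror the route's documented payload; the implementation is a sketch, not the router's actual parser:

```typescript
interface DisconnectBody {
  revokeRemote?: boolean;
  tokenTypeHint?: "access_token" | "refresh_token";
}

// Validate the untyped JSON body at runtime so a string "false" is a
// 400-style validation failure rather than a truthy revokeRemote.
function parseDisconnectBody(raw: unknown): DisconnectBody {
  if (raw === null || typeof raw !== "object") {
    throw new Error("disconnect body must be a JSON object");
  }
  const body = raw as Record<string, unknown>;
  if (body.revokeRemote !== undefined && typeof body.revokeRemote !== "boolean") {
    throw new Error("revokeRemote must be a boolean");
  }
  if (
    body.tokenTypeHint !== undefined &&
    body.tokenTypeHint !== "access_token" &&
    body.tokenTypeHint !== "refresh_token"
  ) {
    throw new Error("tokenTypeHint must be 'access_token' or 'refresh_token'");
  }
  return { revokeRemote: body.revokeRemote, tokenTypeHint: body.tokenTypeHint } as DisconnectBody;
}
```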
```json
"dependencies": {
  "@superfunctions/oauth-core": "*",
  "@superfunctions/oauth-storage": "*"
},
```
Declare oauth-flow dependency for exported oauth-testing types
@superfunctions/oauth-testing exports interfaces that reference @superfunctions/oauth-flow types (via browser-auth-fixtures.ts), so the generated declaration files require that package to be present. Since package.json omits @superfunctions/oauth-flow from dependencies/peerDependencies, TypeScript consumers installing only @superfunctions/oauth-testing can hit unresolved module errors when reading its public types.
```json
{ "slug": "superfunctions-oauth-http", "path": "packages/oauth-http", "name": "@superfunctions/oauth-http" },
{ "slug": "superfunctions-oauth-providers", "path": "packages/oauth-providers", "name": "@superfunctions/oauth-providers" },
{ "slug": "superfunctions-oauth-router", "path": "packages/oauth-router", "name": "@superfunctions/oauth-router" },
{ "slug": "superfunctions-oauth-storage", "path": "packages/oauth-storage", "name": "@superfunctions/oauth-storage" }
```
Add oauth-testing to release target manifest
This commit adds @superfunctions/oauth-testing as a publishable package, but release-packages.json stops at oauth-storage and does not include an oauth-testing slug. The release-tag resolver only permits slugs listed in this manifest, so oauth-testing tags will fail resolution and the new package cannot be published through the existing release workflow.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: a1935171d9
```typescript
const result = await config.flowService.start({
  ...input,
  requestId
});
```
Preserve resolver requestId in browser start handler
When the browser start route calls flowService.start, it spreads input and then unconditionally sets requestId from the header, so a missing x-request-id overwrites any resolver-provided input.requestId with undefined. This drops caller-supplied correlation IDs for browser auth starts, which breaks tracing/event correlation unless every client always sends the header; the connection route already avoids this by using a fallback (requestId ?? input.requestId).
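The fallback is a one-liner; extracted as a sketch for clarity (the helper name is hypothetical, the semantics match the connection route's existing `requestId ?? input.requestId`):

```typescript
// A missing x-request-id header must not clobber a resolver-supplied id.
function resolveRequestId(
  headerValue: string | null,
  inputRequestId: string | undefined
): string | undefined {
  return headerValue ?? inputRequestId;
}
```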
```typescript
id: "microsoft",
authorizationUrl: "https://login.microsoftonline.com/common/oauth2/v2.0/authorize",
tokenUrl: "https://login.microsoftonline.com/common/oauth2/v2.0/token",
revocationUrl: "https://login.microsoftonline.com/common/oauth2/v2.0/logout",
```
Use a token revocation endpoint for Microsoft provider
The Microsoft descriptor points revocationUrl at /oauth2/v2.0/logout, but DefaultOAuthTokenHttpClient.revokeToken() performs RFC7009-style token revocation by POSTing token data to provider.revocationUrl. With the current URL, disconnect({ revokeRemote: true }) for Microsoft cannot revoke stored tokens remotely and will surface revoke failures after local cleanup, leaving upstream tokens active when callers explicitly requested remote revocation.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: c4bf4d92d0
```typescript
await this.adapter.upsert({
  model: this.model,
  where: [{ field: "token_id", operator: "eq", value: record.tokenId }],
  create: mapTokenRecordToRow(record),
  update: mapTokenRecordToRow(record),
  conflictTarget: "token_id"
});
```
Upsert DB token rows on connection_id collisions
The DB adapter only upserts on token_id, so reconnecting an existing connectionId with a newly generated token ID will create a second row (or violate a unique connection constraint, depending on backend schema) instead of replacing the old token record. This breaks refresh/disconnect flows because getByConnection can return stale credentials or the callback can fail with a uniqueness error during normal re-link operations.
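The collision can be shown with a minimal in-memory store keyed by connectionId (an illustrative sketch, not the adapter's real interface): upserting a re-link with a freshly generated tokenId then replaces the old row instead of adding a second one.

```typescript
interface TokenRecord {
  tokenId: string;
  connectionId: string;
  accessToken: string;
}

// Sketch: key upserts by connectionId so re-linking an existing connection
// replaces the stale token row rather than duplicating it.
class MemoryTokenStore {
  private rows = new Map<string, TokenRecord>(); // keyed by connectionId

  upsert(record: TokenRecord): void {
    this.rows.set(record.connectionId, record);
  }

  getByConnection(connectionId: string): TokenRecord | undefined {
    return this.rows.get(connectionId);
  }

  size(): number {
    return this.rows.size;
  }
}
```

For the SQL-backed adapter the analogous fix would be to target the connection uniqueness constraint in the upsert (or delete the old row first), so getByConnection never sees stale credentials.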
    if (!value || value.length === 0) {
      throw new Error(`${field} is required`);
Return 400-class errors for invalid OAuth route inputs
Input validation in the OAuth routes throws plain Error, which the default HTTP router treats as an unknown exception and converts to a 500 response. Missing code/state query params or malformed disconnect JSON are client mistakes and should not surface as internal server errors, otherwise callers receive misleading status codes and retry/alert behavior is distorted.
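A sketch of the suggested shape, assuming a typed error the router can map to a status (the class and code names are illustrative, not the package's actual API):

```typescript
// A 400-class error type plus a router-level mapping, so input validation
// failures surface as client errors instead of 500s.
class BadRequestError extends Error {
  readonly status = 400;
  constructor(message: string, readonly code: string) {
    super(message);
    this.name = "BadRequestError";
  }
}

function toHttpStatus(error: unknown): number {
  return error instanceof BadRequestError ? error.status : 500;
}

function asRequiredString(value: string | null | undefined, field: string): string {
  if (!value || value.length === 0) {
    throw new BadRequestError(`${field} is required`, "OAUTH_ROUTE_INVALID_INPUT");
  }
  return value;
}
```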
    function normalizeOpenApiPath(path: string): string {
      return path.replace(/:([A-Za-z0-9_]+)/g, "{$1}");
Emit path parameters when converting :params to OpenAPI
The generator rewrites /:param paths to /{param} but never emits required OpenAPI parameters entries for those path variables. For any route with params (for example /users/:userId), the produced document is incomplete and can fail validation or client generation because the templated variable is undocumented at the operation/path level.
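One way to close the gap is to derive the parameter entries alongside the path rewrite, a minimal sketch under the same `:param` convention:

```typescript
// Rewrite /:param segments to OpenAPI's {param} form.
function normalizeOpenApiPath(path: string): string {
  return path.replace(/:([A-Za-z0-9_]+)/g, "{$1}");
}

// Emit the required path-level parameter objects for every templated variable.
function extractPathParameters(path: string): Array<Record<string, unknown>> {
  const params: Array<Record<string, unknown>> = [];
  for (const match of path.matchAll(/:([A-Za-z0-9_]+)/g)) {
    params.push({
      name: match[1],
      in: "path",
      required: true, // path parameters MUST be required in OpenAPI
      schema: { type: "string" }
    });
  }
  return params;
}
```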
You're iterating quickly on this pull request. To help protect your rate limits, cubic has paused automatic reviews on new pushes for now. When you're ready for another review, comment
    let connectionCleanup: OAuthFlowConnectionCleanup = {
      attempted: false,
      deleted: false,
      reason: "not-configured"
    };
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: f902a87ec3
    function collectRoutes(routers: GenerateOpenApiDocumentInput["routers"]): Route[] {
      return routers.flatMap((router) => router.getRoutes());
Preserve router basePath when building OpenAPI paths
The generator currently flattens router.getRoutes() and uses each raw route.path directly, but createRouter applies basePath only during route matching and getRoutes() returns the original unprefixed definitions (packages/http/src/router.ts). For routers configured with basePath (for example /api), the document will emit wrong endpoints (e.g. /users instead of /api/users), which breaks generated clients and published API docs for those deployments.
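A sketch of the suggested collection step, assuming routers expose their basePath next to getRoutes() (the RouterLike shape here is illustrative):

```typescript
interface Route {
  method: string;
  path: string;
}

interface RouterLike {
  basePath?: string;
  getRoutes(): Route[];
}

// Prefix each route path with its router's basePath while collecting,
// so the generated document matches what the router actually serves.
function collectRoutes(routers: RouterLike[]): Route[] {
  return routers.flatMap((router) =>
    router.getRoutes().map((route) => ({
      ...route,
      path: joinBasePath(router.basePath, route.path)
    }))
  );
}

function joinBasePath(basePath: string | undefined, path: string): string {
  if (!basePath) return path;
  const prefix = basePath.endsWith("/") ? basePath.slice(0, -1) : basePath;
  return `${prefix}${path.startsWith("/") ? path : `/${path}`}`;
}
```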
    function asRequiredString(value: string | null | undefined, field: string): string {
      if (!value || value.length === 0) {
        throw new BadRequestError(`${field} is required`, "OAUTH_ROUTE_INVALID_INPUT");
Reject non-string required route fields
asRequiredString only checks truthiness and .length, so non-string JSON values can pass validation when request bodies are cast from request.json(). For example, {"connectionId":123} reaches flowService.disconnect as a non-string ID instead of returning a 400, leading to malformed-input no-ops (or backend-dependent unintended lookups/deletes) rather than deterministic client validation errors; the same helper pattern is also duplicated in browser-routes.ts.
Actionable comments posted: 2
♻️ Duplicate comments (1)
packages/oauth-flow/src/service.ts (1)
1040-1046: ⚠️ Potential issue | 🟠 Major

Plaintext token storage is still implicitly enabled when using a plain TokenVault.

When tokenStorageMode is unset and a plain TokenVault is injected, getTokenStorageMode() returns "plaintext-unsafe", which then passes the check in assertTokenStorageIsSafe(). This makes insecure plaintext storage the implicit default rather than requiring explicit opt-in. Consider defaulting to "encrypted-required" so that plaintext storage requires explicit configuration.

Suggested fix:

  private getTokenStorageMode(): OAuthTokenStorageMode {
-   if (this.config.tokenStorageMode) {
-     return this.config.tokenStorageMode;
-   }
-
-   return isEncryptedTokenVault(this.config.tokenVault) ? "encrypted-required" : "plaintext-unsafe";
+   return this.config.tokenStorageMode ?? "encrypted-required";
  }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-flow/src/service.ts` around lines 1040 - 1046, getTokenStorageMode currently returns "plaintext-unsafe" when tokenStorageMode is unset and a plain TokenVault is injected, making plaintext the implicit default; change it so that when this.config.tokenStorageMode is undefined the method returns "encrypted-required" (i.e., default to requiring encryption), using the existing helper isEncryptedTokenVault only if you need to validate/throw elsewhere; update getTokenStorageMode to return "encrypted-required" by default and ensure assertTokenStorageIsSafe still validates explicit config values accordingly (refer to getTokenStorageMode, isEncryptedTokenVault, and assertTokenStorageIsSafe).
🧹 Nitpick comments (1)
packages/oauth-router/src/connection-routes.ts (1)
104-151: Consider extracting shared helpers to reduce duplication.

The resolveRouteMeta, mergeAuthMeta, mergeOpenApiMeta, joinPath, and validation helpers are duplicated between connection-routes.ts and browser-routes.ts. Consider extracting them to a shared internal module.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-router/src/connection-routes.ts` around lines 104 - 151, Extract duplicated helpers (resolveRouteMeta, mergeAuthMeta, mergeOpenApiMeta, joinPath, and any validation helpers) into a new shared internal module and import them from both connection-routes.ts and browser-routes.ts; move the function implementations and any dependent types or small utility types into that module, export them with the same names, update both files to import instead of defining the duplicates, and run type checks to ensure AuthRouteMeta and OpenApiRouteMeta types are either re-exported or imported so behavior remains identical.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/oauth-router/src/browser-routes.ts`:
- Around line 200-215: resolveRedirectLocation currently returns
result.subject.returnTo without validation, allowing open redirects; update
resolveRedirectLocation to validate returnTo when used (only when callbackMode
is "redirect" or when result.subject.kind === "browser-auth") by reusing the
same allowlist/validation logic used for redirectUri during flow initialization
(e.g., the flow's redirect URI whitelist or helper function that validates
redirect URIs), and if returnTo fails validation or is missing, fall back to "/"
(or another safe default). Ensure you reference and call the existing validator
rather than duplicating rules, and keep the existing branch that honors
config.getRedirectLocation unchanged.
In `@packages/oauth-storage/src/adapters/db.ts`:
- Around line 485-498: In mapRevocationFailureRowToRecord, don't cast
row.token_type_hint directly; instead validate it against the allowed union
values ("access_token", "refresh_token") before assigning to tokenTypeHint so
invalid DB values don't pass through. Update mapRevocationFailureRowToRecord to
check row.token_type_hint (e.g., via a small helper or inline conditional) and
only set tokenTypeHint when it exactly equals one of the allowed strings;
otherwise set tokenTypeHint to undefined (preserve existing handling for
null/undefined). Keep the rest of the mapping (subject via parseSubject,
metadata via parseRecordValue, tokenId mapping) unchanged.
---
Duplicate comments:
In `@packages/oauth-flow/src/service.ts`:
- Around line 1040-1046: getTokenStorageMode currently returns
"plaintext-unsafe" when tokenStorageMode is unset and a plain TokenVault is
injected, making plaintext the implicit default; change it so that when
this.config.tokenStorageMode is undefined the method returns
"encrypted-required" (i.e., default to requiring encryption), using the existing
helper isEncryptedTokenVault only if you need to validate/throw elsewhere;
update getTokenStorageMode to return "encrypted-required" by default and ensure
assertTokenStorageIsSafe still validates explicit config values accordingly
(refer to getTokenStorageMode, isEncryptedTokenVault, and
assertTokenStorageIsSafe).
---
Nitpick comments:
In `@packages/oauth-router/src/connection-routes.ts`:
- Around line 104-151: Extract duplicated helpers (resolveRouteMeta,
mergeAuthMeta, mergeOpenApiMeta, joinPath, and any validation helpers) into a
new shared internal module and import them from both connection-routes.ts and
browser-routes.ts; move the function implementations and any dependent types or
small utility types into that module, export them with the same names, update
both files to import instead of defining the duplicates, and run type checks to
ensure AuthRouteMeta and OpenApiRouteMeta types are either re-exported or
imported so behavior remains identical.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2db6f9a4-26d5-4b35-83dd-80af883f125f
📒 Files selected for processing (10)
- packages/oauth-flow/src/__tests__/refresh-disconnect.test.ts
- packages/oauth-flow/src/service.ts
- packages/oauth-http/src/retry-policy.ts
- packages/oauth-http/tests/token-client.test.ts
- packages/oauth-router/src/__tests__/routes.test.ts
- packages/oauth-router/src/browser-routes.ts
- packages/oauth-router/src/connection-routes.ts
- packages/oauth-storage/src/adapters/db.ts
- packages/oauth-testing/package.json
- release-packages.json
✅ Files skipped from review due to trivial changes (4)
- packages/oauth-http/src/retry-policy.ts
- packages/oauth-router/src/__tests__/routes.test.ts
- packages/oauth-flow/src/__tests__/refresh-disconnect.test.ts
- packages/oauth-http/tests/token-client.test.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/oauth-testing/package.json
    async function resolveRedirectLocation(
      config: OAuthBrowserRouteConfig,
      result: OAuthFlowCallbackResult,
      request: Request,
      context: RouteContext
    ): Promise<string> {
      if (config.getRedirectLocation) {
        return config.getRedirectLocation(result, request, context);
      }

      if (result.subject.kind === "browser-auth" && result.subject.returnTo) {
        return result.subject.returnTo;
      }

      return "/";
    }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Search for returnTo validation in oauth-core and oauth-flow packages
echo "=== Searching for returnTo validation ==="
rg -n -C5 'returnTo' --type=ts -g '!**/node_modules/**' -g '!**/*.test.ts' packages/oauth-core packages/oauth-flow | head -100
echo ""
echo "=== Searching for URL validation patterns ==="
rg -n 'allowlist|whitelist|validat.*url|validat.*redirect|assertRedirect' --type=ts -g '!**/node_modules/**' packages/oauth-core packages/oauth-flow

Repository: 21nCo/super-functions
Length of output: 8949
🏁 Script executed:
#!/bin/bash
# Find where returnTo is assigned/set
echo "=== Where returnTo is assigned ==="
rg -n 'returnTo\s*[:=]' --type=ts -g '!**/node_modules/**' -g '!**/*.test.ts' packages/ | head -50
echo ""
echo "=== Check browser-routes.ts implementation details ==="
cat -n packages/oauth-router/src/browser-routes.ts | head -230 | tail -50
echo ""
echo "=== Search for getRedirectLocation usage ==="
rg -n 'getRedirectLocation' --type=ts -g '!**/node_modules/**' packages/

Repository: 21nCo/super-functions
Length of output: 2721
🏁 Script executed:
#!/bin/bash
# Find returnTo in API inputs/interfaces
echo "=== Search for returnTo in interfaces and types ==="
rg -n 'returnTo' --type=ts -g '!**/node_modules/**' packages/oauth-router packages/oauth-flow | grep -E '(interface|type|Input|Config)' -B2 -A2 | head -80
echo ""
echo "=== Check OAuthBrowserRouteConfig interface ==="
rg -n 'interface OAuthBrowserRouteConfig' --type=ts -g '!**/node_modules/**' -A30 packages/oauth-router
echo ""
echo "=== Check start/callback input interfaces ==="
rg -n 'OAuthFlowCallbackInput|OAuthFlowStartInput' --type=ts -g '!**/node_modules/**' -A15 packages/oauth-flow

Repository: 21nCo/super-functions
Length of output: 9803
🏁 Script executed:
#!/bin/bash
# Check how returnTo is used in tests
echo "=== returnTo in test files ==="
rg -n 'returnTo' --type=ts -g '**/node_modules/**' -g '**/*.test.ts' packages/ | head -50
echo ""
echo "=== Check OAuthIntentSubject definition ==="
rg -n 'OAuthIntentSubject' --type=ts -g '!**/node_modules/**' -B5 -A10 packages/oauth-flow/src/service.ts | head -60
echo ""
echo "=== Check if there's any documentation/comments about returnTo validation ==="
rg -n 'returnTo|redirect.*validat|open.*redirect' --type=ts -g '!**/node_modules/**' -B3 -A3 packages/oauth-core/src/state.ts

Repository: 21nCo/super-functions
Length of output: 2949
🏁 Script executed:
#!/bin/bash
# Find OAuthIntentSubject definition in oauth-core
echo "=== OAuthIntentSubject definition in oauth-core ==="
rg -n 'OAuthIntentSubject' --type=ts -g '!**/node_modules/**' -B5 -A15 packages/oauth-core/src/
echo ""
echo "=== Check if assertRedirectUriAllowed is also used for returnTo ==="
rg -n 'assertRedirectUriAllowed' --type=ts -g '!**/node_modules/**' -B2 -A2 packages/

Repository: 21nCo/super-functions
Length of output: 12245
Validate returnTo URLs to prevent open redirect attacks.
When callbackMode is "redirect" and no custom getRedirectLocation is provided, the function redirects to result.subject.returnTo. Unlike redirectUri which is validated against an allowlist during flow initialization, returnTo is developer-provided and receives no validation. If a developer passes untrusted user input to returnTo, it enables open redirect attacks.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/oauth-router/src/browser-routes.ts` around lines 200 - 215,
resolveRedirectLocation currently returns result.subject.returnTo without
validation, allowing open redirects; update resolveRedirectLocation to validate
returnTo when used (only when callbackMode is "redirect" or when
result.subject.kind === "browser-auth") by reusing the same allowlist/validation
logic used for redirectUri during flow initialization (e.g., the flow's redirect
URI whitelist or helper function that validates redirect URIs), and if returnTo
fails validation or is missing, fall back to "/" (or another safe default).
Ensure you reference and call the existing validator rather than duplicating
rules, and keep the existing branch that honors config.getRedirectLocation
unchanged.
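A minimal validator along the lines the comment suggests (a sketch under the assumption that returnTo should be restricted to same-site relative paths; a real fix would reuse the flow's redirect-URI allowlist):

```typescript
// Accept only same-site relative paths for returnTo; anything else falls
// back to a safe default. Rejects absolute URLs ("https://evil.com"),
// protocol-relative URLs ("//evil.com"), and backslash tricks ("/\evil.com").
function safeReturnTo(returnTo: string | undefined, fallback = "/"): string {
  if (!returnTo) return fallback;
  if (!returnTo.startsWith("/") || returnTo.startsWith("//") || returnTo.startsWith("/\\")) {
    return fallback;
  }
  return returnTo;
}
```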
🧹 Nitpick comments (1)
packages/http-openapi/src/__tests__/generate.test.ts (1)
5-240: Consider adding test coverage for implicit route inclusion.

The current tests cover include: false and include: true, but the common case where include is omitted (undefined) could be explicitly tested to document the default behavior. Additionally, a test with multiple routers would verify correct merging.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/http-openapi/src/__tests__/generate.test.ts` around lines 5 - 240, Add a new test that asserts routes with no openapi.include specified are included by default and that multiple routers merge correctly: createRouter with a route lacking meta.openapi.include (e.g., operationId "implicitInclude") and another router with a different route, call generateOpenApiDocument with both routers, and expect the resulting document.paths to contain both routes (using templated path/{param} forms and operationId checks). Ensure you reference the existing helpers generateOpenApiDocument and createRouter and reuse the assertion style from other tests to keep deterministic ordering of tags/parameters.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@packages/http-openapi/src/__tests__/generate.test.ts`:
- Around line 5-240: Add a new test that asserts routes with no openapi.include
specified are included by default and that multiple routers merge correctly:
createRouter with a route lacking meta.openapi.include (e.g., operationId
"implicitInclude") and another router with a different route, call
generateOpenApiDocument with both routers, and expect the resulting
document.paths to contain both routes (using templated path/{param} forms and
operationId checks). Ensure you reference the existing helpers
generateOpenApiDocument and createRouter and reuse the assertion style from
other tests to keep deterministic ordering of tags/parameters.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c05dabd9-f017-4ff3-8218-ca7d2d66b7bb
📒 Files selected for processing (2)
- packages/http-openapi/src/__tests__/generate.test.ts
- packages/http-openapi/src/generate.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/http-openapi/src/generate.ts
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 270a5af741
    export function getOAuthProviderDescriptor(providerId: OAuthProviderId): OAuthProviderDescriptor {
      return oauthProviderDescriptors[providerId];
Return cloned provider descriptors
getOAuthProviderDescriptor returns the shared object from oauthProviderDescriptors by reference, so any caller mutation (for example pushing to defaultScopes or editing URLs) mutates global provider config for the entire process. In multi-request services this can silently change scopes/endpoints for later OAuth operations and is very hard to debug; this API should return an immutable/deep-cloned descriptor (and similarly avoid exposing mutable shared references from list helpers).
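The defensive-copy pattern being asked for can be sketched with structuredClone (Node 17+); the descriptor shape and registry here are illustrative stand-ins:

```typescript
interface ProviderDescriptor {
  tokenUrl: string;
  defaultScopes: string[];
}

const descriptors: Record<string, ProviderDescriptor> = {
  example: { tokenUrl: "https://example.com/token", defaultScopes: ["openid"] }
};

// Return a deep clone so caller mutation cannot alter the shared registry.
function getDescriptor(id: string): ProviderDescriptor {
  return structuredClone(descriptors[id]);
}
```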
    const operations = pathEntries.get(normalizedPath) ?? new Map<string, Record<string, unknown>>();
    operations.set(normalizedMethod, buildOperation(meta, normalizedPath));
    pathEntries.set(normalizedPath, operations);
Detect duplicate OpenAPI operations before overwrite
When two routes normalize to the same OpenAPI path+method, operations.set silently overwrites the first operation with the later one instead of failing generation. This produces incomplete/incorrect API docs without any warning, so client generation and published docs can point to the wrong operation; the generator should raise a deterministic error on duplicate method/path pairs.
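A deterministic guard could look like this sketch (the registration helper is illustrative; the Map-of-Maps shape mirrors the quoted code):

```typescript
// Fail generation when two routes normalize to the same path+method
// instead of silently overwriting the earlier operation.
function registerOperation(
  pathEntries: Map<string, Map<string, Record<string, unknown>>>,
  path: string,
  method: string,
  operation: Record<string, unknown>
): void {
  const operations = pathEntries.get(path) ?? new Map<string, Record<string, unknown>>();
  if (operations.has(method)) {
    throw new Error(`Duplicate OpenAPI operation: ${method.toUpperCase()} ${path}`);
  }
  operations.set(method, operation);
  pathEntries.set(path, operations);
}
```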
    return new OAuthFlowError("INTERNAL_ERROR", "Unexpected OAuth flow error", {
      status: 500,
      retryable: false
    });
Preserve validation errors instead of mapping all to 500
The fallback branch collapses every non-OAuthCoreError/OAuthHttpError into a generic INTERNAL_ERROR 500, which misclassifies known validation failures coming from lower layers (for example state-store validation when malformed start input reaches the flow service). In that case clients receive a server error for bad input, which breaks retry semantics and observability; known domain validation errors should be mapped to a 4xx VALIDATION_ERROR instead.
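The error-mapping step could discriminate before falling back, as in this sketch (ValidationError is an assumed stand-in for whatever typed error the state store throws):

```typescript
class ValidationError extends Error {}

interface FlowError {
  code: string;
  status: number;
}

// Map known validation failures from lower layers to a 4xx flow error;
// only genuinely unknown errors fall through to 500.
function toFlowError(error: unknown): FlowError {
  if (error instanceof ValidationError) {
    return { code: "VALIDATION_ERROR", status: 400 };
  }
  return { code: "INTERNAL_ERROR", status: 500 };
}
```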
Actionable comments posted: 2
🧹 Nitpick comments (2)
packages/oauth-providers/src/index.ts (2)
66-75: Redundant default store assignments.

Lines 72-73 explicitly set default consentStore and auditStore values, but the ProviderPolicyRegistry constructor already applies identical defaults when these options are undefined (per the constructor in policy-registry.ts). The factory could simplify to:

  return new ProviderPolicyRegistry(listProviderPolicies(), resolvedOptions);

That said, keeping the explicit defaults here does make the factory's behavior self-documenting. This is a minor DRY consideration; no functional impact.

🤖 Prompt for AI Agents
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-providers/src/index.ts` around lines 66 - 75, The factory createDefaultProviderPolicyRegistry redundantly supplies consentStore and auditStore defaults already applied by the ProviderPolicyRegistry constructor; remove the explicit default assignments in createDefaultProviderPolicyRegistry and pass resolvedOptions directly to new ProviderPolicyRegistry(listProviderPolicies(), resolvedOptions) so the constructor-level defaults (MemoryProviderPolicyConsentStore, MemoryProviderPolicyAuditStore) are relied upon; ensure listProviderPolicies, ProviderPolicyRegistryOptions and the MemoryProviderPolicyConsentStore/MemoryProviderPolicyAuditStore references remain intact so behavior is unchanged.
54-64: Consider cloning policies for consistency with descriptor functions.

The policy getter/lister functions return references directly from the shared oauthProviderPolicies registry, unlike the descriptor functions which defensively clone. While ProviderPolicyRegistry clones policies internally during construction (per the constructor logic), direct consumers of getProviderPolicy() could inadvertently mutate the shared policy definitions.

This isn't critical since the primary use case routes through the registry, but adding cloning would provide consistent defensive behavior across the API.

🤖 Prompt for AI Agents
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/oauth-providers/src/index.ts` around lines 54 - 64, getProviderPolicy, listProviderPolicies and listAuthProviderPolicies currently return direct references from oauthProviderPolicies which allows callers to mutate shared ProviderPolicyDefinition objects; change these functions to return defensive clones instead (e.g., deep-clone the single object in getProviderPolicy, deep-clone each object in Object.values(oauthProviderPolicies) for listProviderPolicies, and deep-clone each mapped oauthProviderPolicies[providerId] in listAuthProviderPolicies) so callers receive independent copies of the ProviderPolicyDefinition instances.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/http-openapi/src/generate.ts`:
- Around line 167-169: The compareMethods function currently returns
getMethodOrder(left) - getMethodOrder(right), which leaves methods not in
METHOD_ORDER with equal rank and non-deterministic ordering; update
compareMethods (and keep getMethodOrder/METHOD_ORDER references) to detect when
the numeric ranks are equal and then return a deterministic tie-break using
lexical comparison (e.g., left.localeCompare(right)) so unknown or custom HTTP
methods sort stably.
- Around line 123-148: The createResponses function currently force-casts each
schema to Record<string, unknown> before calling sortObject, which can cause
uncaught TypeError on malformed metadata; update createResponses to validate
each entry of OpenApiRouteMeta["responseSchemas"] (check schema is a non-null
plain object and not an array) before calling sortObject, and if validation
fails throw a descriptive OpenApiGenerationError that includes the offending
statusCode and the bad value; perform this validation inside the map where
statusCode and schema are available so the error can be specific, and only call
sortObject(schema as Record<string, unknown>) after the check passes.
---
Nitpick comments:
In `@packages/oauth-providers/src/index.ts`:
- Around line 66-75: The factory createDefaultProviderPolicyRegistry redundantly
supplies consentStore and auditStore defaults already applied by the
ProviderPolicyRegistry constructor; remove the explicit default assignments in
createDefaultProviderPolicyRegistry and pass resolvedOptions directly to new
ProviderPolicyRegistry(listProviderPolicies(), resolvedOptions) so the
constructor-level defaults (MemoryProviderPolicyConsentStore,
MemoryProviderPolicyAuditStore) are relied upon; ensure listProviderPolicies,
ProviderPolicyRegistryOptions and the
MemoryProviderPolicyConsentStore/MemoryProviderPolicyAuditStore references
remain intact so behavior is unchanged.
- Around line 54-64: getProviderPolicy, listProviderPolicies and
listAuthProviderPolicies currently return direct references from
oauthProviderPolicies which allows callers to mutate shared
ProviderPolicyDefinition objects; change these functions to return defensive
clones instead (e.g., deep-clone the single object in getProviderPolicy,
deep-clone each object in Object.values(oauthProviderPolicies) for
listProviderPolicies, and deep-clone each mapped
oauthProviderPolicies[providerId] in listAuthProviderPolicies) so callers
receive independent copies of the ProviderPolicyDefinition instances.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 95517b39-9a63-42c5-9111-38700d9c580d
📒 Files selected for processing (4)
- packages/http-openapi/src/__tests__/generate.test.ts
- packages/http-openapi/src/generate.ts
- packages/oauth-providers/src/__tests__/exports.test.ts
- packages/oauth-providers/src/index.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/oauth-providers/src/__tests__/exports.test.ts
- packages/http-openapi/src/__tests__/generate.test.ts
Review the following changes in direct dependencies. Learn more about Socket for GitHub.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 3f7d66627d
    const consumedState = await this.requireState(input.state);
    const subject = toFlowSubject(consumedState);
Use consumed state payload instead of re-fetching callback state
After coreService.handleCallback succeeds, this code immediately does stateStore.get via requireState. The OAuthStateStore contract does not guarantee consumed states remain readable, so implementations that delete state rows on consume (a common replay-protection pattern) will fail here with OAUTH_STATE_INVALID after the auth code was already exchanged. That turns valid callbacks into failures and can leave token exchange side effects without local persistence.
      return oauthProviderPolicies[providerId];
    }

    export function listProviderPolicies(): ProviderPolicyDefinition[] {
      return Object.values(oauthProviderPolicies);
Return cloned provider policies from public accessors
These accessors return shared objects from oauthProviderPolicies by reference, so caller mutation can silently alter global policy state for the entire process (for example changing allowed scopes or operation flags before later registry creation). That makes policy behavior non-deterministic across requests and tests; the API should return defensive clones here just like descriptor accessors do.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 1ae0d73d7b
      return result.subject.returnTo;
    }

    return "/";
Return absolute fallback URL for redirect callbacks
When callbackMode is "redirect" and neither getRedirectLocation nor subject.returnTo is set, this falls back to "/". In Node/undici runtimes, Response.redirect("/") throws (Failed to parse URL from /), so otherwise-successful callbacks turn into 500s for browser-auth flows without returnTo (and any connection-subject callbacks using browser routes). Build the fallback as an absolute URL from the current request instead of returning a relative path.
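The WHATWG URL constructor makes the absolute fallback a one-liner, resolving the location against the incoming request URL; this sketch assumes the handler has the request URL string in hand:

```typescript
// Resolve a (possibly relative) redirect location against the request URL
// so Response.redirect always receives an absolute URL.
function absoluteRedirect(location: string, requestUrl: string): string {
  return new URL(location, requestUrl).toString();
}
```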
    };

    return {
      connectionId: asRequiredString(body.connectionId, "connectionId"),
Validate disconnect body object before dereferencing fields
request.json() can legally return null, but the disconnect parser immediately reads body.connectionId without checking that body is an object. A request body of literal null will therefore throw a TypeError and surface as a 500 instead of a client-facing validation error. Add a non-null object guard before reading fields (this same parser shape is also used in the sibling route factory).
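A guard of the suggested shape, sketched as a standalone parser (the error type and field checks are illustrative; they also cover the earlier non-string finding for connectionId):

```typescript
// Reject non-object JSON bodies (null, numbers, arrays) before reading
// fields, and require connectionId to be a non-empty string.
function parseDisconnectBody(body: unknown): { connectionId: string } {
  if (typeof body !== "object" || body === null || Array.isArray(body)) {
    throw new Error("request body must be a JSON object");
  }
  const connectionId = (body as Record<string, unknown>).connectionId;
  if (typeof connectionId !== "string" || connectionId.length === 0) {
    throw new Error("connectionId is required");
  }
  return { connectionId };
}
```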
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 7c6ccd74db
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
    const payload = parsedBody as Record<string, unknown>;
    const accessToken = asString(payload.access_token);
Handle malformed JSON token bodies before property access
When a token endpoint responds 200 with Content-Type: application/json but a malformed body (common during upstream outages/proxy errors), parseResponseBody returns null and normalizeTokenResponse immediately dereferences payload.access_token. That throws a raw TypeError instead of the expected OAuthHttpError, so callers lose deterministic OAuth error codes and structured details for exchange/refresh failures.
    private async executeWithRetry(url: string, init: RequestInitLike): Promise<ResponseLike> {
      let attempt = 1;
      let response = await this.fetcher(url, init);
Retry transient fetch exceptions in executeWithRetry
executeWithRetry only retries based on HTTP statuses, but the initial await this.fetcher(...) is outside any error handling. If fetcher throws (e.g., DNS hiccup, socket reset, timeout), the method exits on the first attempt and never applies the configured retry policy, which reduces resilience for the most common transient transport failures.
💡 Codex Review
Reviewed commit: 6f994e8a03
```ts
resolvedInput.redirectUri ??
  url.searchParams.get("redirectUri") ??
  `${url.origin}${url.pathname}`,
```
Preserve query string in inferred callback redirectUri
When redirectUri is not provided explicitly, this fallback rebuilds it from origin + pathname and drops any existing query component from the callback URL. OAuth clients that register redirect URIs with fixed query parameters (for example .../callback?tenant=acme) will then fail callback validation because handleCallback compares against the originally stored redirect URI, causing valid provider callbacks to be rejected. The same inference pattern is also used in the connection route factory.
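One possible shape for the fix, as a sketch: keep the query string but drop the OAuth response parameters (`code`, `state`), which were obviously not part of the registered redirect URI. The helper name is hypothetical.

```typescript
// Sketch: preserve pre-registered query parameters (e.g. ?tenant=acme) when
// inferring the redirect URI from the callback URL.
function inferRedirectUri(callbackUrl: string): string {
  const url = new URL(callbackUrl);
  // Strip only the provider's response parameters; keep everything else.
  url.searchParams.delete("code");
  url.searchParams.delete("state");
  return `${url.origin}${url.pathname}${url.search}`;
}
```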
```ts
params.set("client_id", credentials.clientId);
const authUser = encodeURIComponent(credentials.clientId);
const authPassword = encodeURIComponent(credentials.clientSecret);
const auth = Buffer.from(`${authUser}:${authPassword}`, "utf8").toString("base64");
headers.authorization = `Basic ${auth}`;
```
Remove body client_id for client_secret_basic requests
In the client_secret_basic branch, the request currently sends credentials in both places: Authorization header and form body (client_id). OAuth 2.0 forbids using multiple client authentication methods in the same token request, and strict providers reject this with invalid_client, so exchange/refresh/revoke can fail whenever a provider/runtime config uses client_secret_basic.
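A sketch of the corrected branch (RFC 6749 §2.3 forbids more than one client authentication method per request); the helper name and return shape are assumptions:

```typescript
// Sketch: for client_secret_basic, authenticate only via the Authorization
// header and keep client_id out of the form body.
function buildBasicAuthRequest(
  credentials: { clientId: string; clientSecret: string },
  params: URLSearchParams,
): { headers: Record<string, string>; body: string } {
  const user = encodeURIComponent(credentials.clientId);
  const password = encodeURIComponent(credentials.clientSecret);
  const auth = Buffer.from(`${user}:${password}`, "utf8").toString("base64");
  params.delete("client_id"); // header auth replaces body auth entirely
  return {
    headers: {
      authorization: `Basic ${auth}`,
      "content-type": "application/x-www-form-urlencoded",
    },
    body: params.toString(),
  };
}
```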
💡 Codex Review
Reviewed commit: 51e646ba43
```ts
const statePreview = await this.config.stateStore.get(input.state);
providerId = input.providerId ?? statePreview?.providerId;
subjectPreview = statePreview ? toFlowSubject(statePreview) : undefined;
```
Remove mandatory state preview read before callback exchange
handleCallback unconditionally calls stateStore.get before invoking coreService.handleCallback, even when input.providerId is already known from the route. If that preview read throws (for example, a transient read-path failure), the callback fails before state consumption/token exchange, turning an otherwise valid callback into an internal error. This makes callback success depend on an unnecessary extra storage read rather than the actual consume/exchange path.
```ts
if (!decision.retry) {
  throw error;
}
```
Convert exhausted fetch exceptions into OAuthHttpError
When fetcher throws and retry attempts are exhausted, executeWithRetry rethrows the raw exception. That causes exchangeToken/revokeToken to leak non-OAuthHttpError failures, so higher layers lose structured OAuth error codes/details and may misclassify retryability. Wrapping the terminal transport failure in OAuthHttpError keeps error handling deterministic for callers.
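A sketch of the wrapping step, again with a stand-in error class since the real `OAuthHttpError` constructor is not shown:

```typescript
// Sketch: convert a transport exception that exhausted the retry policy into
// the structured error type instead of rethrowing it raw.
class OAuthHttpErrorSketch extends Error {
  constructor(message: string, readonly code: string, readonly detail: unknown) {
    super(message);
  }
}

function toStructuredTransportError(error: unknown, code: string): OAuthHttpErrorSketch {
  if (error instanceof OAuthHttpErrorSketch) {
    return error; // already structured (e.g. built from an HTTP status)
  }
  return new OAuthHttpErrorSketch("OAuth provider request failed", code, error);
}
```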
💡 Codex Review
Reviewed commit: bb5bff4333
```ts
let revokeError: OAuthFlowError | null = null;
if (revokeRemote && provider.revocationUrl && tokenRecord) {
  remoteRevokeAttempted = true;
  const runtime = await this.getRuntimeConfig({
```
Continue local disconnect when revoke config resolution fails
In disconnect(), runtime credentials are resolved before entering the try/catch that downgrades revoke errors, so a resolver/runtime-config failure with revokeRemote: true exits early and skips deleteTokenByConnection. That leaves the local token attached even though a disconnect was requested, and it breaks the intended “clean up locally even if remote revoke fails” behavior for outages in secret/config backends.
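A sketch of the control flow, with the dependencies abstracted behind a hypothetical `deps` object:

```typescript
// Sketch: fold runtime-config resolution into the same try/catch that
// downgrades remote-revoke failures, so local token deletion always runs.
async function disconnectSketch(deps: {
  resolveRuntime: () => Promise<{ clientId: string }>;
  revokeRemote: (runtime: { clientId: string }) => Promise<void>;
  deleteTokenByConnection: () => Promise<void>;
}): Promise<{ revokeError: unknown }> {
  let revokeError: unknown = null;
  try {
    const runtime = await deps.resolveRuntime(); // may hit a secret/config backend
    await deps.revokeRemote(runtime);
  } catch (error) {
    revokeError = error; // downgrade: remote revoke is best-effort
  }
  await deps.deleteTokenByConnection(); // local cleanup happens regardless
  return { revokeError };
}
```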
```ts
throw new OAuthHttpError("OAuth provider request failed", {
  code: "OAUTH_TOKEN_EXCHANGE_FAILED",
  status: 502,
```
Preserve refresh error code on transport failures
executeWithRetry() hard-codes code: "OAUTH_TOKEN_EXCHANGE_FAILED" when fetch exceptions exhaust retries, but this helper is also used by refresh requests. As a result, network failures during grantType: "refresh_token" are misclassified as exchange failures instead of OAUTH_TOKEN_REFRESH_FAILED, which can break caller logic that keys retries/telemetry off refresh-specific error codes.
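One way to fix this is to derive the code from the grant type rather than hard-coding it; a minimal sketch (helper name assumed):

```typescript
// Sketch: refresh failures keep their refresh-specific code instead of being
// misclassified as exchange failures.
function transportFailureCode(grantType: string): string {
  return grantType === "refresh_token"
    ? "OAUTH_TOKEN_REFRESH_FAILED"
    : "OAUTH_TOKEN_EXCHANGE_FAILED";
}
```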
```ts
if (result.subject.kind === "browser-auth" && result.subject.returnTo) {
  return result.subject.returnTo;
```
Normalize browser returnTo before issuing redirect
resolveRedirectLocation() returns subject.returnTo as-is, so a relative value like /dashboard (valid per current types and common in app code) is forwarded to Response.redirect. In Node/undici this throws Failed to parse URL, causing successful OAuth callbacks in redirect mode to fail with 500 unless every returnTo is absolute.
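A sketch of the normalization: resolve `returnTo` against the request URL so relative paths become absolute before they reach `Response.redirect` (the function name mirrors the comment; the second parameter is an assumption):

```typescript
// Sketch: resolve a possibly-relative returnTo against the request origin so
// values like "/dashboard" become absolute URLs that undici can parse.
function resolveRedirectLocation(returnTo: string, requestUrl: string): string {
  // `new URL(relative, base)` performs standard URL resolution; an absolute
  // returnTo is passed through unchanged.
  return new URL(returnTo, requestUrl).toString();
}
```

Production code should likely also allow-list the resulting host to avoid turning this into an open redirect.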
Merge activity
## Summary
- add the missing shared OAuth and OpenAPI packages to `origin/dev`
- include the OAuth core, flow, http, provider, router, storage, and testing layers
- add the shared OpenAPI generation package used by auth-related packages

## Validation
- inspected the copied package set in a clean worktree created from `origin/dev`
- did not run a reliable monorepo validation pass in this worktree

## Notes
- no `.conduct` changes included
- opened directly against `origin/dev` with `GH_CONFIG_DIR=~/.config/gh-other`

---

## Summary by cubic
Adds shared OAuth and OpenAPI packages and hardens OAuth route validation and transport error handling. Completes SF-30 and aligns with router/token client review follow-ups.

**New Features**
- `@superfunctions/http-openapi`: Deterministic OpenAPI 3.1 generation from `@superfunctions/http` router metadata with stricter validation and a typed `OpenApiGenerationError`.
- OAuth platform: `@superfunctions/oauth-core`, `@superfunctions/oauth-flow`, `@superfunctions/oauth-http`, `@superfunctions/oauth-providers`, `@superfunctions/oauth-storage`, `@superfunctions/oauth-router`, `@superfunctions/oauth-testing` with PKCE/state/nonce, curated provider registry (Google/Apple/GitHub/Microsoft/Yahoo), retry/backoff transport with `Retry-After`, encrypted token vault + memory/DB/SQL adapters, and secure route factories with OpenAPI metadata.

**Bug Fixes**
- `@superfunctions/http-openapi`: Fixed path parameter emission; clearer metadata errors.
- Storage: Hardened adapter parsing/validation, deterministic SQL schema, robust NotFound handling.
- Router: Stricter input validation for start/callback/disconnect with consistent 400s, preserved `requestId` on browser start, tightened callback parsing, hardened packaging.
- Transport: Fixed token client encoding/header edge cases, robust non-JSON/empty body parsing, normalized errors, revocation honors `token_type_hint`.
<sup>Written for commit 0f41730. Summary will update on new commits.</sup>

## Summary by CodeRabbit

**New Features**
- OpenAPI 3.1 generator and published http-openapi package.
- Full OAuth platform: oauth-core, oauth-flow, oauth-http (token exchange/refresh/revoke with retry), oauth-providers catalogue, oauth-router route factories, oauth-storage adapters (memory/SQL/DB) and encrypted token vault with key rotation.
- Provider policy registry with scope validation, consent/audit persistence, and runtime config tooling.
- OAuth testing utilities and fixtures.

**Documentation**
- READMEs and changelogs added across OAuth packages.

**Tests**
- Extensive unit and integration test coverage for all new packages.
0f41730 to 12317f3 (Compare)
## Summary
- add the missing shared OAuth and OpenAPI packages to `origin/dev`

## Validation
- inspected the copied package set in a clean worktree created from `origin/dev`

## Notes
- no `.conduct` changes included
- opened directly against `origin/dev` with `GH_CONFIG_DIR=~/.config/gh-other`

## Summary by cubic
Adds shared OAuth and OpenAPI packages with deterministic OpenAPI generation and a secure, tested OAuth stack across transport, storage, and routes. Completes SF-30.

**New Features**
- `@superfunctions/http-openapi`: Generates OpenAPI 3.1 from `@superfunctions/http` router metadata with stricter validation and a typed `OpenApiGenerationError`.
- `@superfunctions/oauth-core`, `@superfunctions/oauth-flow`, `@superfunctions/oauth-http`, `@superfunctions/oauth-providers`, `@superfunctions/oauth-storage`, `@superfunctions/oauth-router`, `@superfunctions/oauth-testing` with PKCE/state/nonce, curated provider registry (Google/Apple/GitHub/Microsoft/Yahoo), retry/backoff transport honoring `Retry-After`, encrypted token vault + memory/DB/SQL adapters, and secure route factories with OpenAPI metadata.

**Bug Fixes**
- Router: preserved `requestId` on browser start, tighter callback parsing.
- Transport: revocation honors `token_type_hint`.

<sup>Written for commit 12317f3. Summary will update on new commits.</sup>