feat: mirror analytics events to stdout and OSLog #228

vegerot wants to merge 3 commits into JerryZLiu:main from
Conversation
Pull request overview
Adds local mirroring for analytics events and refactors LLM prompt/transcript/timeline parsing to shared utilities, while extending connection testing UI to be provider-aware.
Changes:
- Mirror `AnalyticsService` events to stdout and Apple Unified Logging alongside PostHog.
- Extract/centralize LLM JSON schemas, prompt templates, transcript decoding, and timeline validation utilities.
- Update onboarding/settings UI to support provider-specific connection testing and more robust card layout sizing.
Reviewed changes
Copilot reviewed 9 out of 9 changed files in this pull request and generated 6 comments.
Show a summary per file
| File | Description |
|---|---|
| Dayflow/Dayflow/Views/UI/Settings/SettingsProvidersTabView.swift | Passes explicit provider into the connection test view. |
| Dayflow/Dayflow/Views/UI/Settings/ProvidersSettingsViewModel.swift | Switches Gemini prompt override persistence to shared “Video” prompt preferences types. |
| Dayflow/Dayflow/Views/Onboarding/TestConnectionView.swift | Makes connection tests provider-aware and standardizes analytics properties. |
| Dayflow/Dayflow/Views/Onboarding/OnboardingLLMSelectionView.swift | Computes card sizing based on dynamic provider count. |
| Dayflow/Dayflow/System/AnalyticsService.swift | Adds stdout + OSLog mirroring for analytics captures. |
| Dayflow/Dayflow/Core/Analysis/TimeParsing.swift | Introduces shared LLM timestamp, transcript decoding, and timeline validation helpers. |
| Dayflow/Dayflow/Core/AI/LLMSchema.swift | Adds centralized JSON schema strings for LLM structured outputs. |
| Dayflow/Dayflow/Core/AI/GeminiPromptPreferences.swift | Renames prompt override types and adds shared prompt template helpers. |
| Dayflow/Dayflow/Core/AI/GeminiDirectProvider.swift | Replaces inline prompts/parsers with shared schemas/utilities and adds schema-constrained generation config. |
```swift
let transcriptionSchemaObject = try! JSONSerialization.jsonObject(
    with: Data(LLMSchema.screenRecordingTranscriptionSchema.utf8))
let generationConfig: [String: Any] = [
    "temperature": 0.3,
    "maxOutputTokens": 65536,
    "mediaResolution": "MEDIA_RESOLUTION_HIGH",
    "responseMimeType": "application/json",
    "responseSchema": transcriptionSchema,
    "responseJsonSchema": transcriptionSchemaObject,
]
```
`try! JSONSerialization.jsonObject(...)` will crash the app at runtime if the schema string is ever invalid. Prefer a non-crashing path (e.g., pre-parse once and fail gracefully or throw) so a schema change can't take the entire app down.
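One way the non-crashing path could look — a hedged sketch, not the PR's implementation. `SchemaError` and `makeGenerationConfig` are hypothetical names; `LLMSchema.screenRecordingTranscriptionSchema` and the config keys come from the PR:

```swift
import Foundation

enum SchemaError: Error {
    case invalidSchema(underlying: Error)
}

func makeGenerationConfig(schemaJSON: String) throws -> [String: Any] {
    let schemaObject: Any
    do {
        schemaObject = try JSONSerialization.jsonObject(with: Data(schemaJSON.utf8))
    } catch {
        // A malformed schema now fails this one request instead of crashing the app.
        throw SchemaError.invalidSchema(underlying: error)
    }
    return [
        "temperature": 0.3,
        "maxOutputTokens": 65536,
        "mediaResolution": "MEDIA_RESOLUTION_HIGH",
        "responseMimeType": "application/json",
        "responseJsonSchema": schemaObject,
    ]
}
```

Callers can then surface the error (or skip schema-constrained generation) instead of trapping on `try!`.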
```swift
let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
    with: Data(LLMSchema.activityCardsSchema.utf8))
let generationConfig: [String: Any] = [
    "temperature": 0.7,
    "maxOutputTokens": 8192,
    "responseMimeType": "application/json",
    "responseJsonSchema": activityCardsSchemaObject,
]
```
`activityCardsSchemaObject` is an `Any?`; storing it into a `[String: Any]` can embed an `Optional.none` value, which is not a valid JSON object and can break request serialization downstream. If schema parsing fails, omit `responseJsonSchema` entirely or throw early so the request body never contains an optional.
Suggested change:

```diff
-let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
-    with: Data(LLMSchema.activityCardsSchema.utf8))
-let generationConfig: [String: Any] = [
-    "temperature": 0.7,
-    "maxOutputTokens": 8192,
-    "responseMimeType": "application/json",
-    "responseJsonSchema": activityCardsSchemaObject,
-]
+var generationConfig: [String: Any] = [
+    "temperature": 0.7,
+    "maxOutputTokens": 8192,
+    "responseMimeType": "application/json",
+]
+if let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
+    with: Data(LLMSchema.activityCardsSchema.utf8))
+{
+    generationConfig["responseJsonSchema"] = activityCardsSchemaObject
+}
```
```swift
let json = jsonString(properties)
let line = truncate("[Analytics] \(event) \(json)")
print(line)
localLogger.info("\(line, privacy: .public)")
```
Logging full analytics properties as `.public` (and printing to stdout) can expose potentially sensitive user data in Console logs / sysdiagnose artifacts. Consider making the OSLog message `.private` (or logging only the event name plus a redacted subset), and/or gating the verbose property logging behind `#if DEBUG` or a dedicated runtime flag.
Suggested change:

```diff
-let json = jsonString(properties)
-let line = truncate("[Analytics] \(event) \(json)")
-print(line)
-localLogger.info("\(line, privacy: .public)")
+#if DEBUG
+// In debug builds, log full analytics payload locally for diagnostics.
+let json = jsonString(properties)
+let line = truncate("[Analytics] \(event) \(json)")
+print(line)
+localLogger.info("\(line, privacy: .private)")
+#else
+// In non-debug builds, avoid logging full analytics properties.
+localLogger.info("[Analytics] \(event, privacy: .public)")
+#endif
```
```swift
/// Returns `0` for invalid input.
static func parseVideoTimestamp(_ timestamp: String) -> Int {
    let parts =
        timestamp
        .trimmingCharacters(in: .whitespacesAndNewlines)
        .components(separatedBy: ":")
    if parts.count == 3 {
        let h = Int(parts[0]) ?? 0
        let m = Int(parts[1]) ?? 0
        let s = Int(parts[2]) ?? 0
        return h * 3600 + m * 60 + s
    }
    if parts.count == 2 {
        let m = Int(parts[0]) ?? 0
        let s = Int(parts[1]) ?? 0
        return m * 60 + s
    }
    return 0
```
Returning `0` for invalid timestamps makes invalid input indistinguishable from a legitimate `00:00` value. This can silently produce incorrect observation times and avoid incrementing `invalidTimestampCount`. Consider returning `Int?` (or throwing) and treating unparseable timestamps as invalid in `LLMTranscriptUtilities.observations(...)` and timeline validation.
Suggested change:

```diff
-/// Returns `0` for invalid input.
-static func parseVideoTimestamp(_ timestamp: String) -> Int {
-    let parts =
-        timestamp
-        .trimmingCharacters(in: .whitespacesAndNewlines)
-        .components(separatedBy: ":")
-    if parts.count == 3 {
-        let h = Int(parts[0]) ?? 0
-        let m = Int(parts[1]) ?? 0
-        let s = Int(parts[2]) ?? 0
-        return h * 3600 + m * 60 + s
-    }
-    if parts.count == 2 {
-        let m = Int(parts[0]) ?? 0
-        let s = Int(parts[1]) ?? 0
-        return m * 60 + s
-    }
-    return 0
+/// Returns `nil` for invalid or unparseable input.
+static func parseVideoTimestamp(_ timestamp: String) -> Int? {
+    let trimmed = timestamp.trimmingCharacters(in: .whitespacesAndNewlines)
+    guard !trimmed.isEmpty else { return nil }
+    let parts = trimmed.components(separatedBy: ":")
+    switch parts.count {
+    case 3:
+        // HH:MM:SS – minutes/seconds must be in 0..<60, hours can be any non-negative integer.
+        guard
+            let h = Int(parts[0]), h >= 0,
+            let m = Int(parts[1]), (0..<60).contains(m),
+            let s = Int(parts[2]), (0..<60).contains(s)
+        else {
+            return nil
+        }
+        return h * 3600 + m * 60 + s
+    case 2:
+        // MM:SS – minutes can be any non-negative integer, seconds must be in 0..<60.
+        guard
+            let m = Int(parts[0]), m >= 0,
+            let s = Int(parts[1]), (0..<60).contains(s)
+        else {
+            return nil
+        }
+        return m * 60 + s
+    default:
+        return nil
+    }
```
```swift
} catch {
    let snippet = String(output.prefix(400))
    print(
        "🔎 LLM DEBUG: decodeTranscriptChunks JSON decode failed: \(error.localizedDescription) snippet=\(snippet)"
    )
```
Printing a raw snippet of the model output can leak user/activity content into logs in production. Prefer OSLog with privacy redaction and/or restrict this verbose output to `#if DEBUG`.
Suggested change:

```diff
 } catch {
+    #if DEBUG
     let snippet = String(output.prefix(400))
     print(
         "🔎 LLM DEBUG: decodeTranscriptChunks JSON decode failed: \(error.localizedDescription) snippet=\(snippet)"
     )
+    #endif
```
```diff
-enum GeminiPromptPreferences {
+enum VideoPromptPreferences {
     private static let overridesKey = "geminiPromptOverrides"
```
`VideoPromptPreferences` still persists under the `geminiPromptOverrides` key, which is surprising after the rename and makes future maintenance/migrations harder. If the old key is intentional for backward compatibility, add an explicit comment and/or implement a one-time migration to a new key (while still reading the old key).
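The one-time migration could be sketched roughly as follows. This is a hedged illustration, not the PR's code: `geminiPromptOverrides` is the legacy key from the diff, while `videoPromptOverrides` and `VideoPromptPreferencesMigration` are hypothetical names:

```swift
import Foundation

enum VideoPromptPreferencesMigration {
    private static let legacyKey = "geminiPromptOverrides"
    private static let currentKey = "videoPromptOverrides"

    /// Moves any value stored under the legacy key to the new key, exactly once.
    /// Safe to call on every launch: it no-ops once the new key is populated.
    static func migrateIfNeeded(defaults: UserDefaults = .standard) {
        guard defaults.object(forKey: currentKey) == nil,
              let legacyValue = defaults.object(forKey: legacyKey)
        else { return }
        defaults.set(legacyValue, forKey: currentKey)
        defaults.removeObject(forKey: legacyKey)
    }
}
```

Reads can then target only the new key, and the old key disappears after the first migrated launch.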
- Extract shared prompt templates into LLMPromptTemplates (GeminiPromptPreferences.swift)
- Add VideoPromptPreferences/VideoPromptOverrides/VideoPromptSections types, replacing GeminiPromptPreferences/GeminiPromptOverrides/GeminiPromptSections
- Centralize transcript JSON decoding and observation conversion in LLMTranscriptUtilities (TimeParsing.swift) for reuse across providers
- Refactor GeminiDirectProvider to use LLMPromptTemplates and LLMTranscriptUtilities
- Refactor TestConnectionView to accept a provider parameter with finishFailure/finishSuccess helpers for clean multi-provider support
- Fix OnboardingLLMSelectionView card-width calculation to be dynamic based on card count rather than hard-coded divisor of 3
- Update SettingsProvidersTabView and ProvidersSettingsViewModel to use new VideoPrompt* types

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Ensures every AnalyticsService event is sent to PostHog, printed to stdout, and emitted via Apple Unified Logging.
Stack created with Sapling. Best reviewed with ReviewStack.