feat: mirror analytics events to stdout and OSLog #228

Open
vegerot wants to merge 3 commits into JerryZLiu:main from vegerot:pr228

Conversation

@vegerot (Contributor) commented Mar 5, 2026

Ensures every AnalyticsService event is sent to PostHog, printed to stdout, and emitted via Apple Unified Logging.


Stack created with Sapling. Best reviewed with ReviewStack.


Copilot AI left a comment


Pull request overview

Adds local mirroring for analytics events and refactors LLM prompt/transcript/timeline parsing to shared utilities, while extending connection testing UI to be provider-aware.

Changes:

  • Mirror AnalyticsService events to stdout and Apple Unified Logging alongside PostHog.
  • Extract/centralize LLM JSON schemas, prompt templates, transcript decoding, and timeline validation utilities.
  • Update onboarding/settings UI to support provider-specific connection testing and more robust card layout sizing.

Reviewed changes

Copilot reviewed 9 out of 9 changed files in this pull request and generated 6 comments.

Summary per file:

  • Dayflow/Dayflow/Views/UI/Settings/SettingsProvidersTabView.swift: Passes explicit provider into the connection test view.
  • Dayflow/Dayflow/Views/UI/Settings/ProvidersSettingsViewModel.swift: Switches Gemini prompt override persistence to shared “Video” prompt preferences types.
  • Dayflow/Dayflow/Views/Onboarding/TestConnectionView.swift: Makes connection tests provider-aware and standardizes analytics properties.
  • Dayflow/Dayflow/Views/Onboarding/OnboardingLLMSelectionView.swift: Computes card sizing based on dynamic provider count.
  • Dayflow/Dayflow/System/AnalyticsService.swift: Adds stdout + OSLog mirroring for analytics captures.
  • Dayflow/Dayflow/Core/Analysis/TimeParsing.swift: Introduces shared LLM timestamp, transcript decoding, and timeline validation helpers.
  • Dayflow/Dayflow/Core/AI/LLMSchema.swift: Adds centralized JSON schema strings for LLM structured outputs.
  • Dayflow/Dayflow/Core/AI/GeminiPromptPreferences.swift: Renames prompt override types and adds shared prompt template helpers.
  • Dayflow/Dayflow/Core/AI/GeminiDirectProvider.swift: Replaces inline prompts/parsers with shared schemas/utilities and adds schema-constrained generation config.


Comment on lines +998 to 1006
let transcriptionSchemaObject = try! JSONSerialization.jsonObject(
with: Data(LLMSchema.screenRecordingTranscriptionSchema.utf8))
let generationConfig: [String: Any] = [
"temperature": 0.3,
"maxOutputTokens": 65536,
"mediaResolution": "MEDIA_RESOLUTION_HIGH",
"responseMimeType": "application/json",
"responseSchema": transcriptionSchema,
"responseJsonSchema": transcriptionSchemaObject,
]

Copilot AI Mar 5, 2026


try! JSONSerialization.jsonObject(...) will crash the app at runtime if the schema string is ever invalid. Prefer a non-crashing path (e.g., pre-parse once and fail gracefully/throw) so schema changes can’t take the entire app down.
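One non-crashing shape for this is to pre-parse the schema once into a cached optional and omit the key when parsing fails. A sketch only; `LLMSchema.screenRecordingTranscriptionSchema` is the identifier from the diff above, while the `LLMSchemaCache` helper is hypothetical:

```swift
import Foundation

enum LLMSchemaCache {
    // Parsed once at first use; nil if the schema string is ever invalid JSON,
    // instead of crashing the app via `try!`.
    static let transcriptionSchemaObject: Any? = try? JSONSerialization.jsonObject(
        with: Data(LLMSchema.screenRecordingTranscriptionSchema.utf8))
}

// At the call site, omit the key rather than force-unwrapping:
var generationConfig: [String: Any] = [
    "temperature": 0.3,
    "maxOutputTokens": 65536,
    "mediaResolution": "MEDIA_RESOLUTION_HIGH",
    "responseMimeType": "application/json",
]
if let schemaObject = LLMSchemaCache.transcriptionSchemaObject {
    generationConfig["responseJsonSchema"] = schemaObject
} else {
    // Surfaces the bad schema in development builds without crashing release builds.
    assertionFailure("screenRecordingTranscriptionSchema is not valid JSON")
}
```

This keeps a schema typo from taking the app down while still making the failure loud during development.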

Comment on lines +1680 to 1688
let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
with: Data(LLMSchema.activityCardsSchema.utf8))
let generationConfig: [String: Any] = [
"temperature": 0.7,
"maxOutputTokens": 8192,
"responseMimeType": "application/json",
"responseJsonSchema": activityCardsSchemaObject,
]


Copilot AI Mar 5, 2026


activityCardsSchemaObject is an Any?; storing it into [String: Any] can embed an Optional.none value, which is not a valid JSON object and can break request serialization downstream. If schema parsing fails, omit responseJsonSchema entirely or throw early so the request body never contains an optional.

Suggested change
let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
with: Data(LLMSchema.activityCardsSchema.utf8))
let generationConfig: [String: Any] = [
"temperature": 0.7,
"maxOutputTokens": 8192,
"responseMimeType": "application/json",
"responseJsonSchema": activityCardsSchemaObject,
]
var generationConfig: [String: Any] = [
"temperature": 0.7,
"maxOutputTokens": 8192,
"responseMimeType": "application/json",
]
if let activityCardsSchemaObject = try? JSONSerialization.jsonObject(
with: Data(LLMSchema.activityCardsSchema.utf8))
{
generationConfig["responseJsonSchema"] = activityCardsSchemaObject
}

Comment on lines +235 to +238
let json = jsonString(properties)
let line = truncate("[Analytics] \(event) \(json)")
print(line)
localLogger.info("\(line, privacy: .public)")

Copilot AI Mar 5, 2026


Logging full analytics properties as .public (and printing to stdout) can expose potentially sensitive user data in Console logs / sysdiagnose artifacts. Consider making the OSLog message .private (or logging only the event name + a redacted subset), and/or gating the verbose property logging behind #if DEBUG or a dedicated runtime flag.

Suggested change
let json = jsonString(properties)
let line = truncate("[Analytics] \(event) \(json)")
print(line)
localLogger.info("\(line, privacy: .public)")
#if DEBUG
// In debug builds, log full analytics payload locally for diagnostics.
let json = jsonString(properties)
let line = truncate("[Analytics] \(event) \(json)")
print(line)
localLogger.info("\(line, privacy: .private)")
#else
// In non-debug builds, avoid logging full analytics properties.
localLogger.info("[Analytics] \(event, privacy: .public)")
#endif

Comment on lines +45 to +63
/// Returns `0` for invalid input.
static func parseVideoTimestamp(_ timestamp: String) -> Int {
let parts =
timestamp
.trimmingCharacters(in: .whitespacesAndNewlines)
.components(separatedBy: ":")

if parts.count == 3 {
let h = Int(parts[0]) ?? 0
let m = Int(parts[1]) ?? 0
let s = Int(parts[2]) ?? 0
return h * 3600 + m * 60 + s
}
if parts.count == 2 {
let m = Int(parts[0]) ?? 0
let s = Int(parts[1]) ?? 0
return m * 60 + s
}
return 0

Copilot AI Mar 5, 2026


Returning 0 for invalid timestamps makes invalid input indistinguishable from a legitimate 00:00 value. This can silently produce incorrect observation times and avoid incrementing invalidTimestampCount. Consider returning Int? (or throwing) and treating unparseable timestamps as invalid in LLMTranscriptUtilities.observations(...) and timeline validation.

Suggested change
/// Returns `0` for invalid input.
static func parseVideoTimestamp(_ timestamp: String) -> Int {
let parts =
timestamp
.trimmingCharacters(in: .whitespacesAndNewlines)
.components(separatedBy: ":")
if parts.count == 3 {
let h = Int(parts[0]) ?? 0
let m = Int(parts[1]) ?? 0
let s = Int(parts[2]) ?? 0
return h * 3600 + m * 60 + s
}
if parts.count == 2 {
let m = Int(parts[0]) ?? 0
let s = Int(parts[1]) ?? 0
return m * 60 + s
}
return 0
/// Returns `nil` for invalid or unparseable input.
static func parseVideoTimestamp(_ timestamp: String) -> Int? {
let trimmed = timestamp.trimmingCharacters(in: .whitespacesAndNewlines)
guard !trimmed.isEmpty else { return nil }
let parts = trimmed.components(separatedBy: ":")
switch parts.count {
case 3:
// HH:MM:SS – minutes/seconds must be in 0..<60, hours can be any non-negative integer.
guard
let h = Int(parts[0]), h >= 0,
let m = Int(parts[1]), (0..<60).contains(m),
let s = Int(parts[2]), (0..<60).contains(s)
else {
return nil
}
return h * 3600 + m * 60 + s
case 2:
// MM:SS – minutes can be any non-negative integer, seconds must be in 0..<60.
guard
let m = Int(parts[0]), m >= 0,
let s = Int(parts[1]), (0..<60).contains(s)
else {
return nil
}
return m * 60 + s
default:
return nil
}

Comment on lines +285 to +289
} catch {
let snippet = String(output.prefix(400))
print(
"🔎 LLM DEBUG: decodeTranscriptChunks JSON decode failed: \(error.localizedDescription) snippet=\(snippet)"
)

Copilot AI Mar 5, 2026


Printing a raw snippet of the model output can leak user/activity content into logs in production. Prefer OSLog with privacy redaction and/or restrict this verbose output to #if DEBUG.

Suggested change
} catch {
let snippet = String(output.prefix(400))
print(
"🔎 LLM DEBUG: decodeTranscriptChunks JSON decode failed: \(error.localizedDescription) snippet=\(snippet)"
)
} catch {
#if DEBUG
let snippet = String(output.prefix(400))
print(
"🔎 LLM DEBUG: decodeTranscriptChunks JSON decode failed: \(error.localizedDescription) snippet=\(snippet)"
)
#endif


enum GeminiPromptPreferences {
enum VideoPromptPreferences {
private static let overridesKey = "geminiPromptOverrides"

Copilot AI Mar 5, 2026


VideoPromptPreferences still persists under the geminiPromptOverrides key, which is surprising after the rename and makes future maintenance/migrations harder. If the old key is intentional for backward compatibility, add an explicit comment and/or implement a one-time migration to a new key (while still reading the old key).
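If migrating, a one-time move between UserDefaults keys could look like the following. A sketch under assumptions: the new key name `videoPromptOverrides` and the `VideoPromptPreferencesMigration` type are hypothetical, and the overrides are assumed to be stored as a plist-compatible value:

```swift
import Foundation

enum VideoPromptPreferencesMigration {
    private static let legacyKey = "geminiPromptOverrides"
    private static let currentKey = "videoPromptOverrides" // hypothetical new key

    /// Moves persisted overrides from the pre-rename key to the new one.
    /// Safe to call on every launch: it only acts when legacy data exists
    /// and nothing has been written under the new key yet.
    static func migrateIfNeeded(defaults: UserDefaults = .standard) {
        guard defaults.object(forKey: currentKey) == nil,
              let legacyValue = defaults.object(forKey: legacyKey) else { return }
        defaults.set(legacyValue, forKey: currentKey)
        defaults.removeObject(forKey: legacyKey)
    }
}
```

Reading the old key as a fallback (without deleting it) is an alternative if downgrades to older builds need to keep working.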

vegerot and others added 3 commits March 6, 2026 16:12
- Extract shared prompt templates into LLMPromptTemplates (GeminiPromptPreferences.swift)
- Add VideoPromptPreferences/VideoPromptOverrides/VideoPromptSections types,
  replacing GeminiPromptPreferences/GeminiPromptOverrides/GeminiPromptSections
- Centralize transcript JSON decoding and observation conversion in
  LLMTranscriptUtilities (TimeParsing.swift) for reuse across providers
- Refactor GeminiDirectProvider to use LLMPromptTemplates and LLMTranscriptUtilities
- Refactor TestConnectionView to accept a provider parameter with
  finishFailure/finishSuccess helpers for clean multi-provider support
- Fix OnboardingLLMSelectionView card-width calculation to be dynamic
  based on card count rather than hard-coded divisor of 3
- Update SettingsProvidersTabView and ProvidersSettingsViewModel to use
  new VideoPrompt* types

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Ensures every AnalyticsService event is sent to PostHog, printed to stdout, and emitted via Apple Unified Logging.
