perf: Eliminate duplicate data merging in RobotWriter #704

@oboehmer

Description

Problem

Data files are currently merged twice during test execution:

  1. CLI (main.py) — merges data and writes merged_data_model_test_variables.yaml
  2. RobotWriter — re-merges the same files when initialized

Current Flow

CLI (lines 398-399):

merged_data = DataMerger.merge_data_files(data)
DataMerger.write_merged_data_model(merged_data, output, merged_data_filename)

RobotWriter __init__ (line 70):

self.data = DataMerger.merge_data_files(data_paths)

RobotOrchestrator then calls write_merged_data_model() which writes the file again.

Proposed Solution

Refactor RobotWriter to accept the path to the merged data model instead of the raw data_paths, and have it read the file the CLI already created:

# Current
RobotWriter(data_paths: list[Path], ...)

# Proposed  
RobotWriter(merged_data_path: Path, ...)

This aligns with how PyATS orchestrator handles merged data.

Related

Identified during #677 work.
