
[dead-code] chore: remove dead functions — 7 functions removed #24727

Merged
pelikhan merged 1 commit into main from dead-code/remove-batch-6-564babdfb9496302 on Apr 5, 2026

Conversation

Contributor

github-actions bot commented Apr 5, 2026

Dead Code Removal

This PR removes unreachable Go functions identified by the deadcode static analyzer.

Functions Removed

  • Miner.Match — pkg/agentdrain/miner.go
  • LoadMinerJSON — pkg/agentdrain/persist.go
  • NormalizeExpressionForComparison — pkg/workflow/expression_parser.go
  • GetSafeOutputsToolsJSON — pkg/workflow/safe_outputs_tools.go
  • checkAllEnabledToolsPresent — pkg/workflow/safe_outputs_state.go
  • generateFilteredToolsJSON — pkg/workflow/safe_outputs_tools_generation.go
  • filterToolSchemaFields — pkg/workflow/safe_outputs_tools_generation.go

Tests Removed

Test functions removed because they exclusively exercised the deleted functions:

  • TestMatch_InferenceOnly (tested Miner.Match)
  • TestSaveLoadJSON (tested LoadMinerJSON)
  • TestNormalizeExpressionForComparison (tested NormalizeExpressionForComparison)
  • TestGetSafeOutputsToolsJSON (tested GetSafeOutputsToolsJSON)
  • TestCheckAllEnabledToolsPresentAllMatch, TestCheckAllEnabledToolsPresentMissing, TestCheckAllEnabledToolsPresentEmpty, TestCheckAllEnabledToolsPresentMultipleMissing (tested checkAllEnabledToolsPresent)
  • 16 functions testing generateFilteredToolsJSON and filterToolSchemaFields across safe_outputs_tools_test.go, safe_outputs_tools_generation_test.go, safe_outputs_custom_job_tools_test.go, safe_outputs_tools_schema_test.go, safe_outputs_tools_update_issue_test.go, safe_scripts_test.go, safe_outputs_tools_meta_integration_test.go

Three test files deleted entirely (all functions removed): safe_outputs_custom_job_tools_test.go, safe_outputs_tools_schema_test.go, safe_outputs_tools_update_issue_test.go.

NormalizeExpressionForComparison call sites in tests were replaced with the equivalent inline expression strings.Join(strings.Fields(x), " ").
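For reference, the inline replacement behaves as follows. This is a minimal, self-contained sketch; the helper name normalizeExpression is hypothetical (the tests inline the expression directly):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeExpression collapses every run of whitespace (spaces, tabs,
// newlines) into a single space and trims both ends, matching the behavior
// the tests previously got from NormalizeExpressionForComparison.
func normalizeExpression(x string) string {
	// strings.Fields splits on any whitespace and drops empty tokens,
	// so rejoining with a single space normalizes all formatting.
	return strings.Join(strings.Fields(x), " ")
}

func main() {
	a := "github.event_name == 'push'  &&\n\tgithub.ref == 'refs/heads/main'"
	fmt.Println(normalizeExpression(a))
	// prints: github.event_name == 'push' && github.ref == 'refs/heads/main'
}
```

Because normalization is applied to both sides before comparison, two expressions that differ only in line breaks or indentation compare equal.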

Note: generateFilteredToolsJSON was the legacy approach replaced by generateToolsMetaJSON. Its callee functions (GetSafeOutputsToolsJSON, checkAllEnabledToolsPresent, filterToolSchemaFields) were also dead because their only caller was generateFilteredToolsJSON.

Verification

  • go build ./... — passes
  • go vet ./... — passes
  • go vet -tags=integration ./... — passes
  • make fmt — no changes needed
  • go test ./pkg/agentdrain/... ./pkg/workflow/... — passes

Dead Function Count

  • Before this batch: ~14 functions
  • Removed in this PR: 7 functions
  • Remaining: ~7 functions (NewCompilerWithVersion, Miner.ClusterCount, and others)

Automated by Dead Code Removal workflow — https://github.com/github/gh-aw/actions/runs/24000944639


Remove unreachable functions identified by the deadcode static analyzer:
- Miner.Match (pkg/agentdrain/miner.go)
- LoadMinerJSON (pkg/agentdrain/persist.go)
- NormalizeExpressionForComparison (pkg/workflow/expression_parser.go)
- GetSafeOutputsToolsJSON (pkg/workflow/safe_outputs_tools.go)
- checkAllEnabledToolsPresent (pkg/workflow/safe_outputs_state.go)
- generateFilteredToolsJSON (pkg/workflow/safe_outputs_tools_generation.go)
- filterToolSchemaFields (pkg/workflow/safe_outputs_tools_generation.go)

Also remove test functions that exclusively tested these dead functions,
and inline the NormalizeExpressionForComparison logic as strings.Join(strings.Fields(x), " ").

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
@pelikhan pelikhan marked this pull request as ready for review April 5, 2026 14:52
Copilot AI review requested due to automatic review settings April 5, 2026 14:52
@pelikhan pelikhan merged commit 35bf516 into main Apr 5, 2026
64 of 65 checks passed
@pelikhan pelikhan deleted the dead-code/remove-batch-6-564babdfb9496302 branch April 5, 2026 14:56

Copilot AI left a comment


Pull request overview

Removes dead Go code paths (and their exclusive tests) identified by static analysis, primarily around the legacy safe-outputs tools JSON generation and some agentdrain miner helpers.

Changes:

  • Deleted 7 unreachable functions (legacy safe-outputs filtered-tools generation helpers, expression normalization helper, agentdrain miner helpers).
  • Removed/updated tests that only exercised the deleted code paths (including deleting several test files entirely).
  • Replaced test-only expression normalization helper usage with an inline strings.Fields-based normalization.
Summary per file:

  • pkg/workflow/safe_scripts_test.go — Removes a test that depended on the deleted legacy generateFilteredToolsJSON.
  • pkg/workflow/safe_outputs_tools.go — Removes the dead GetSafeOutputsToolsJSON accessor.
  • pkg/workflow/safe_outputs_tools_update_issue_test.go — Deletes tests that depended on GetSafeOutputsToolsJSON.
  • pkg/workflow/safe_outputs_tools_test.go — Deletes tests for the removed legacy filtered-tools generator; keeps description-enhancement tests.
  • pkg/workflow/safe_outputs_tools_schema_test.go — Deletes schema-validation tests that depended on GetSafeOutputsToolsJSON.
  • pkg/workflow/safe_outputs_tools_meta_integration_test.go — Removes legacy-approach regression tests; keeps meta embedding tests (but leaves some stale comments).
  • pkg/workflow/safe_outputs_tools_generation.go — Deletes the legacy generateFilteredToolsJSON and its helpers; retains tools_meta generation.
  • pkg/workflow/safe_outputs_tools_generation_test.go — Removes tests that depended on the deleted legacy generator and check helper.
  • pkg/workflow/safe_outputs_state.go — Removes the dead checkAllEnabledToolsPresent.
  • pkg/workflow/safe_outputs_custom_job_tools_test.go — Deletes tests that depended on the deleted legacy generator.
  • pkg/workflow/expressions_test.go — Replaces the removed normalization helper with inline normalization.
  • pkg/workflow/expression_parser.go — Removes the dead NormalizeExpressionForComparison.
  • pkg/workflow/compiler_events_test.go — Replaces the removed normalization helper with inline normalization.
  • pkg/workflow/compiler_draft_test.go — Replaces the removed normalization helper with inline normalization.
  • pkg/agentdrain/persist.go — Removes the dead LoadMinerJSON constructor helper.
  • pkg/agentdrain/miner.go — Removes the dead inference-only Miner.Match.
  • pkg/agentdrain/miner_test.go — Removes tests that depended on the deleted Match/LoadMinerJSON helpers (but also drops persistence round-trip coverage).

Copilot's findings


Comments suppressed due to low confidence (2)

pkg/workflow/safe_outputs_tools_meta_integration_test.go:82

  • This comment block mentions TestToolsMetaJSONPushRepoMemory, but the next test is TestToolsMetaJSONEmptyWhenNoSafeOutputs (the push_repo_memory test was removed). Please update/remove this comment to avoid confusion when scanning the integration tests.
// TestToolsMetaJSONPushRepoMemory verifies that push_repo_memory appears in
// description_suffixes when repo-memory is configured.
func TestToolsMetaJSONEmptyWhenNoSafeOutputs(t *testing.T) {
	data := &WorkflowData{SafeOutputs: nil}

pkg/workflow/safe_outputs_tools_meta_integration_test.go:131

  • There is an orphaned trailing comment at the end of the file that no longer attaches to any test/function. Please remove it or attach it to an actual test so the file doesn’t end with a confusing fragment.

// constraint-bearing configurations to ensure no tool type regresses silently.

  • Files reviewed: 17/17 changed files
  • Comments generated: 3

Comment on lines 15 to 19
// This file handles tool JSON generation: it takes the full set of
// safe-output tool definitions (from safe-output-tools.json) and produces a
// filtered subset containing only those tools enabled by the workflow's
// SafeOutputsConfig. Dynamic tools (dispatch-workflow, custom jobs) are also
// generated here.

Copilot AI Apr 5, 2026


The file-level comment still says this code “produces a filtered subset” from “safe-output-tools.json”, but the legacy Go-side filtering has been removed and this file now generates tools_meta.json (description suffixes / repo params / dynamic tools). Please update the header comment (and file name reference) to reflect the current generation strategy to avoid misleading future readers.

See below for a potential fix:

// This file generates workflow-specific tool metadata to be written to
// tools_meta.json at compile time. The metadata captures description suffixes,
// repo parameter definitions, and dynamic tool definitions derived from the
// workflow configuration.
//
// At runtime, generate_safe_outputs_tools.cjs reads the source
// safe_outputs_tools.json from the actions folder, applies the overrides from
// tools_meta.json, and writes the final tools.json—avoiding inlining the
// entire source tool definition file into the compiled workflow YAML.

Comment on lines 18 to 22
// TestToolsMetaJSONContainsDescriptionSuffixes verifies that the compiled workflow YAML
// embeds a tools_meta.json with the correct description suffixes instead of the large
// inlined tools.json used in the old approach.
func TestToolsMetaJSONContainsDescriptionSuffixes(t *testing.T) {
	tmpDir := testutil.TempDir(t, "tools-meta-desc-test")

	testContent := `---
on: push
name: Test Tools Meta Description
engine: copilot
safe-outputs:
  create-issue:
    max: 5
    title-prefix: "[bot] "
    labels:
      - automation
      - testing
  add-comment:
    max: 10
---

Test workflow for tools meta description suffixes.
`

	testFile := filepath.Join(tmpDir, "test-tools-meta-desc.md")
	require.NoError(t, os.WriteFile(testFile, []byte(testContent), 0644), "should write test file")

	compiler := NewCompiler()
	require.NoError(t, compiler.CompileWorkflow(testFile), "compilation should succeed")

	lockFile := filepath.Join(tmpDir, "test-tools-meta-desc.lock.yml")
	yamlBytes, err := os.ReadFile(lockFile)
	require.NoError(t, err, "should read lock file")
	yamlStr := string(yamlBytes)

	// Must write tools_meta.json (new approach), NOT tools.json
	assert.Contains(t, yamlStr, "cat > ${RUNNER_TEMP}/gh-aw/safeoutputs/tools_meta.json", "should write tools_meta.json")
	assert.NotContains(t, yamlStr, "cat > ${RUNNER_TEMP}/gh-aw/safeoutputs/tools.json", "should NOT inline tools.json")

	// Must invoke the JavaScript generator at runtime
	assert.Contains(t, yamlStr, "node ${RUNNER_TEMP}/gh-aw/actions/generate_safe_outputs_tools.cjs", "should invoke JS generator")

	// Extract the tools_meta.json content from the lock file
	meta := extractToolsMetaFromLockFile(t, yamlStr)

	// Verify description suffixes are present and correct
	createIssueSuffix, ok := meta.DescriptionSuffixes["create_issue"]
	require.True(t, ok, "create_issue should have a description suffix")
	assert.Contains(t, createIssueSuffix, "CONSTRAINTS:", "suffix should contain CONSTRAINTS")
	assert.Contains(t, createIssueSuffix, "Maximum 5 issue(s)", "suffix should include max constraint")
	assert.Contains(t, createIssueSuffix, `Title will be prefixed with "[bot] "`, "suffix should include title prefix")
	assert.Contains(t, createIssueSuffix, `Labels ["automation" "testing"]`, "suffix should include labels")

	addCommentSuffix, ok := meta.DescriptionSuffixes["add_comment"]
	require.True(t, ok, "add_comment should have a description suffix")
	assert.Contains(t, addCommentSuffix, "Maximum 10 comment(s)", "suffix should include max constraint")
}

// TestToolsMetaJSONDescriptionsMatchFilteredTools verifies that the description suffixes
// in tools_meta.json match what generateFilteredToolsJSON would have embedded directly.
// This is the primary regression test for the strategy change.
func TestToolsMetaJSONDescriptionsMatchFilteredTools(t *testing.T) {
	tests := []struct {
		name        string
		safeOutputs *SafeOutputsConfig
		toolName    string
	}{
		{
			name: "create_issue with max and labels",
			safeOutputs: &SafeOutputsConfig{
				CreateIssues: &CreateIssuesConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("3")},
					TitlePrefix:          "[automated] ",
					Labels:               []string{"bot", "enhancement"},
					AllowedLabels:        []string{"bug", "feature"},
				},
			},
			toolName: "create_issue",
		},
		{
			name: "add_comment with max and target",
			safeOutputs: &SafeOutputsConfig{
				AddComments: &AddCommentsConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("5")},
					Target:               "trigger",
				},
			},
			toolName: "add_comment",
		},
		{
			name: "create_pull_request with draft and reviewers",
			safeOutputs: &SafeOutputsConfig{
				CreatePullRequests: &CreatePullRequestsConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("2")},
					TitlePrefix:          "[pr] ",
					Labels:               []string{"ci"},
					Draft:                strPtr("true"),
					Reviewers:            []string{"reviewer1"},
				},
			},
			toolName: "create_pull_request",
		},
		{
			name: "upload_asset with max and size limit",
			safeOutputs: &SafeOutputsConfig{
				UploadAssets: &UploadAssetsConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("1")},
					MaxSizeKB:            1024,
					AllowedExts:          []string{".zip", ".tar.gz"},
				},
			},
			toolName: "upload_asset",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			data := &WorkflowData{SafeOutputs: tt.safeOutputs}

			// Old approach: full tool JSON with description already embedded
			filteredJSON, err := generateFilteredToolsJSON(data, ".github/workflows/test.md")
			require.NoError(t, err, "generateFilteredToolsJSON should not error")

			var filteredTools []map[string]any
			require.NoError(t, json.Unmarshal([]byte(filteredJSON), &filteredTools), "filtered JSON should parse")

			var oldTool map[string]any
			for _, tool := range filteredTools {
				if tool["name"] == tt.toolName {
					oldTool = tool
					break
				}
			}
			require.NotNil(t, oldTool, "old approach should include tool %s", tt.toolName)
			oldDescription, ok := oldTool["description"].(string)
			require.True(t, ok, "old tool should have string description")

			// New approach: tools_meta.json with description suffix
			metaJSON, err := generateToolsMetaJSON(data, ".github/workflows/test.md")
			require.NoError(t, err, "generateToolsMetaJSON should not error")

			var meta ToolsMeta
			require.NoError(t, json.Unmarshal([]byte(metaJSON), &meta), "meta JSON should parse")

			// The suffix should equal the constraint portion of the old description.
			// Old description = baseDescription + suffix
			// Example: "Create a new GitHub issue..." + " CONSTRAINTS: Maximum 3 issue(s) can be created."
			suffix, hasSuffix := meta.DescriptionSuffixes[tt.toolName]
			if strings.Contains(oldDescription, "CONSTRAINTS:") {
				require.True(t, hasSuffix, "tools_meta should have suffix for %s when old description has constraints", tt.toolName)
				assert.True(t, strings.HasSuffix(oldDescription, suffix),
					"old description should end with the suffix from tools_meta\n  old: %q\n  suffix: %q", oldDescription, suffix)
			} else {
				assert.False(t, hasSuffix, "tools_meta should NOT have suffix for %s when there are no constraints", tt.toolName)
			}
		})
	}
}

// TestToolsMetaJSONRepoParamsMatchFilteredTools verifies that repo parameter definitions
// in tools_meta.json match the repo parameters embedded in the old full tools.json.
func TestToolsMetaJSONRepoParamsMatchFilteredTools(t *testing.T) {
	tests := []struct {
		name            string
		safeOutputs     *SafeOutputsConfig
		toolName        string
		expectRepoParam bool
		expectWildcard  bool
	}{
		{
			name: "create_issue with allowed-repos",
			safeOutputs: &SafeOutputsConfig{
				CreateIssues: &CreateIssuesConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("1")},
					TargetRepoSlug:       "org/default-repo",
					AllowedRepos:         []string{"org/other-repo"},
				},
			},
			toolName:        "create_issue",
			expectRepoParam: true,
		},
		{
			name: "update_issue with wildcard target-repo",
			safeOutputs: &SafeOutputsConfig{
				UpdateIssues: &UpdateIssuesConfig{
					UpdateEntityConfig: UpdateEntityConfig{
						SafeOutputTargetConfig: SafeOutputTargetConfig{
							TargetRepoSlug: "*",
						},
					},
				},
			},
			toolName:        "update_issue",
			expectRepoParam: true,
			expectWildcard:  true,
		},
		{
			name: "create_issue without allowed-repos",
			safeOutputs: &SafeOutputsConfig{
				CreateIssues: &CreateIssuesConfig{
					BaseSafeOutputConfig: BaseSafeOutputConfig{Max: strPtr("1")},
					TargetRepoSlug:       "org/target",
				},
			},
			toolName:        "create_issue",
			expectRepoParam: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			data := &WorkflowData{SafeOutputs: tt.safeOutputs}

			// Old approach: get repo param from embedded tools.json
			filteredJSON, err := generateFilteredToolsJSON(data, ".github/workflows/test.md")
			require.NoError(t, err, "generateFilteredToolsJSON should not error")

			var filteredTools []map[string]any
			require.NoError(t, json.Unmarshal([]byte(filteredJSON), &filteredTools), "filtered JSON should parse")

			var oldTool map[string]any
			for _, tool := range filteredTools {
				if tool["name"] == tt.toolName {
					oldTool = tool
					break
				}
			}
			require.NotNil(t, oldTool, "old approach should include tool %s", tt.toolName)

			oldInputSchema, _ := oldTool["inputSchema"].(map[string]any)
			oldProperties, _ := oldInputSchema["properties"].(map[string]any)
			oldRepoParam, oldHasRepo := oldProperties["repo"]

			// New approach
			metaJSON, err := generateToolsMetaJSON(data, ".github/workflows/test.md")
			require.NoError(t, err, "generateToolsMetaJSON should not error")

			var meta ToolsMeta
			require.NoError(t, json.Unmarshal([]byte(metaJSON), &meta), "meta JSON should parse")

			newRepoParam, newHasRepo := meta.RepoParams[tt.toolName]

			// Both old and new approaches should agree on whether repo param is present
			assert.Equal(t, oldHasRepo, newHasRepo,
				"old and new approaches should agree on repo param presence for %s", tt.toolName)

			if tt.expectRepoParam {
				require.True(t, newHasRepo, "tools_meta should include repo_param for %s", tt.toolName)

				// Description should be consistent
				oldRepoMap, _ := oldRepoParam.(map[string]any)
				oldRepoDesc, _ := oldRepoMap["description"].(string)
				newRepoDesc, _ := newRepoParam["description"].(string)
				assert.Equal(t, oldRepoDesc, newRepoDesc, "repo param description should match between old and new")

				if tt.expectWildcard {
					assert.Contains(t, newRepoDesc, "Any repository can be targeted",
						"wildcard repo param description should mention any repo")
				}
			} else {
				assert.False(t, newHasRepo, "tools_meta should NOT include repo_param for %s", tt.toolName)
			}
		})
	}
}

// TestToolsMetaJSONDynamicToolsFromCustomJobs verifies that custom safe-job tool
// definitions are placed in dynamic_tools in tools_meta.json.
func TestToolsMetaJSONDynamicToolsFromCustomJobs(t *testing.T) {
	data := &WorkflowData{
		SafeOutputs: &SafeOutputsConfig{
			Jobs: map[string]*SafeJobConfig{
				"deploy_app": {
					Description: "Deploy the application",
					Inputs: map[string]*InputDefinition{
						"env": {
							Type:        "choice",
							Options:     []string{"staging", "production"},
							Description: "Target environment",
							Required:    true,
						},
					},
				},
			},
		},
	}

	// Old approach produces a single custom-job tool in filtered JSON
	filteredJSON, err := generateFilteredToolsJSON(data, ".github/workflows/test.md")
	require.NoError(t, err, "generateFilteredToolsJSON should not error")
	var filteredTools []map[string]any
	require.NoError(t, json.Unmarshal([]byte(filteredJSON), &filteredTools), "filtered JSON should parse")
	require.Len(t, filteredTools, 1, "old approach should have exactly 1 tool")
	assert.Equal(t, "deploy_app", filteredTools[0]["name"], "tool should be named deploy_app")

	// New approach places it in dynamic_tools
	metaJSON, err := generateToolsMetaJSON(data, ".github/workflows/test.md")
	require.NoError(t, err, "generateToolsMetaJSON should not error")
	var meta ToolsMeta
	require.NoError(t, json.Unmarshal([]byte(metaJSON), &meta), "meta JSON should parse")

	require.Len(t, meta.DynamicTools, 1, "tools_meta should have exactly 1 dynamic tool")
	dynTool := meta.DynamicTools[0]
	assert.Equal(t, "deploy_app", dynTool["name"], "dynamic tool name should match")
	assert.Equal(t, "Deploy the application", dynTool["description"], "dynamic tool description should match")

	inputSchema, ok := dynTool["inputSchema"].(map[string]any)
	require.True(t, ok, "dynamic tool should have inputSchema")
	properties, ok := inputSchema["properties"].(map[string]any)
	require.True(t, ok, "inputSchema should have properties")
	envProp, ok := properties["env"].(map[string]any)
	require.True(t, ok, "env property should exist")
	assert.Equal(t, "string", envProp["type"], "choice type should be mapped to string in JSON Schema")
	assert.Equal(t, []any{"staging", "production"}, envProp["enum"], "enum values should match options")
}

// TestToolsMetaJSONCompiledWorkflowEmbedsMeta verifies end-to-end that a compiled lock file
// contains tools_meta.json (not an inlined tools.json) with the right description constraints.
func TestToolsMetaJSONCompiledWorkflowEmbedsMeta(t *testing.T) {
	tmpDir := testutil.TempDir(t, "tools-meta-compiled-test")

Copilot AI Apr 5, 2026


This comment refers to TestToolsMetaJSONContainsDescriptionSuffixes, but the function below is TestToolsMetaJSONCompiledWorkflowEmbedsMeta. Please rename or rewrite the comment so it matches the current test name/intent (tools_meta embedding verification).

This issue also appears on line 79 of the same file.

@@ -154,38 +132,6 @@ func TestFlattenEvent(t *testing.T) {
}
}


Copilot AI Apr 5, 2026


Removing TestSaveLoadJSON also removes the only round-trip coverage for Miner.SaveJSON/Miner.LoadJSON, which are still used by Coordinator.SaveSnapshots/LoadSnapshots. Consider keeping a persistence round-trip test by instantiating m2 := NewMiner(cfg) and calling m2.LoadJSON(data) (instead of relying on the deleted LoadMinerJSON helper).

Suggested change
func TestSaveLoadJSON(t *testing.T) {
	cfg := DefaultConfig()
	m, err := NewMiner(cfg)
	if err != nil {
		t.Fatalf("NewMiner: %v", err)
	}
	lines := []string{
		"stage=tool_call tool=search query=foo",
		"stage=tool_call tool=search query=bar",
		"stage=tool_call tool=fetch query=foo",
	}
	for _, line := range lines {
		if _, err := m.Train(line); err != nil {
			t.Fatalf("Train(%q): %v", line, err)
		}
	}
	data, err := m.SaveJSON()
	if err != nil {
		t.Fatalf("SaveJSON: %v", err)
	}
	m2, err := NewMiner(cfg)
	if err != nil {
		t.Fatalf("NewMiner (load target): %v", err)
	}
	if err := m2.LoadJSON(data); err != nil {
		t.Fatalf("LoadJSON: %v", err)
	}
	if m.ClusterCount() == 0 {
		t.Fatal("expected source miner to have clusters after training")
	}
	if got, want := m2.ClusterCount(), m.ClusterCount(); got != want {
		t.Fatalf("ClusterCount after load: got %d, want %d", got, want)
	}
}

