feat: add copilot-token-audit and copilot-token-optimizer workflows #24528
Add two new daily Copilot token management workflows:

copilot-token-audit (runs daily ~12:00 UTC weekdays):
- Downloads 30 days of Copilot logs via `gh aw logs --json`
- Computes per-workflow token usage, cost, turns, and action minutes
- Persists daily snapshots to the `memory/token-audit` repo-memory branch
- Maintains a 90-day rolling summary for trend analysis
- Generates charts and publishes an audit discussion

copilot-token-optimizer (runs daily ~14:00 UTC weekdays):
- Reads audit snapshots from repo-memory to select the heaviest consumer
- Tracks optimization history to avoid re-analyzing recent targets
- Uses agentic-workflows MCP tools (logs, audit) for deep per-run analysis
- Downloads firewall data via `gh aw logs --firewall` for token breakdowns
- Produces conservative, evidence-based optimization recommendations
- Publishes a discussion with a tool usage matrix and estimated savings

Both workflows install gh-aw via `gh extension install` (no source builds).

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Pull request overview
Adds two new scheduled Agentic Workflows jobs to track Copilot token usage over time and to generate daily optimization recommendations based on the audit history.
Changes:
- Introduces the `copilot-token-audit` workflow to download 30 days of Copilot run logs, compute per-workflow aggregates, persist daily snapshots to repo-memory, and publish an audit discussion (with charts).
- Introduces the `copilot-token-optimizer` workflow to pick a high-usage workflow from repo-memory snapshots, perform deeper per-run analysis (incl. firewall breakdowns), and publish optimization recommendations.
- Adds the compiled `.lock.yml` workflow artifacts for both new workflows.
Summary per file:
| File | Description |
|---|---|
| .github/workflows/copilot-token-audit.md | New workflow prompt + steps to download logs and instruct the agent to compute/persist daily snapshots and publish an audit discussion. |
| .github/workflows/copilot-token-audit.lock.yml | Compiled/locked GitHub Actions workflow for the audit job. |
| .github/workflows/copilot-token-optimizer.md | New workflow prompt + steps to read audit history, analyze recent runs, and publish optimization recommendations. |
| .github/workflows/copilot-token-optimizer.lock.yml | Compiled/locked GitHub Actions workflow for the optimizer job. |
Copilot's findings
Comments suppressed due to low confidence (3)
.github/workflows/copilot-token-optimizer.md:97
- optimization-log.json is referenced at /tmp/gh-aw/repo-memory/default/optimization-log.json, but the repo-memory-standard file-glob only allows memory/token-audit/* paths. To ensure it is persisted (and to avoid colliding with other repo-memory uses), write/read it under /tmp/gh-aw/repo-memory/default/memory/token-audit/optimization-log.json and update the surrounding instructions accordingly.
```bash
# Check if optimization log exists
OPT_LOG="/tmp/gh-aw/repo-memory/default/optimization-log.json"
if [ -f "$OPT_LOG" ]; then
  echo "Previous optimizations:"
  cat "$OPT_LOG" | jq -r '.[] | "\(.date): \(.workflow_name)"'
else
  echo "No previous optimization history found."
fi
```
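Following the reviewer's suggestion, the log would move under the `memory/token-audit/` subtree so the `repo-memory-standard` glob persists it. A self-contained sketch (the repo-memory root is simulated with a temp dir here, and the seeded entry is illustrative data only):

```shell
# Sketch: read the optimization log from a path matched by the
# repo-memory-standard glob (memory/token-audit/*). In the workflow the
# root would be /tmp/gh-aw/repo-memory/default; a temp dir stands in here.
ROOT="$(mktemp -d)"
mkdir -p "$ROOT/memory/token-audit"
OPT_LOG="$ROOT/memory/token-audit/optimization-log.json"

# Seed one sample entry so the read path can be demonstrated
echo '[{"date":"2025-01-01","workflow_name":"daily-plan"}]' > "$OPT_LOG"

if [ -f "$OPT_LOG" ]; then
  echo "Previous optimizations:"
  jq -r '.[] | "\(.date): \(.workflow_name)"' "$OPT_LOG"
else
  echo "No previous optimization history found."
fi
```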
.github/workflows/copilot-token-optimizer.md:123
- The gh aw logs command writes to /tmp/gh-aw/token-audit/target-runs.json but the instructions never create /tmp/gh-aw/token-audit, so a straightforward execution will fail with “No such file or directory”. Add a mkdir -p /tmp/gh-aw/token-audit before redirecting output.
```bash
# Download last 7 days of runs for the selected workflow, with firewall data
gh aw logs \
  --engine copilot \
  --start-date -7d \
  --json \
  --firewall \
  -c 20 \
  > /tmp/gh-aw/token-audit/target-runs.json
```
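The minimal fix per the comment above is to create the output directory before redirecting. Sketched here with the real `gh aw logs` download replaced by a placeholder write, since the CLI is only available inside the workflow:

```shell
# Ensure the output directory exists before redirecting; without this the
# redirection fails with "No such file or directory".
OUT_DIR="/tmp/gh-aw/token-audit"
mkdir -p "$OUT_DIR"

# Placeholder for the real `gh aw logs ... > target-runs.json` download
echo '{"runs":[]}' > "$OUT_DIR/target-runs.json"
```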
.github/workflows/copilot-token-optimizer.md:132
- Step 2.1 says “Download … runs for the selected workflow”, but the gh aw logs command doesn’t filter by the selected workflow (only engine + date + count). That can produce mixed-workflow data and make the summary/audit misleading. Either add an explicit workflow filter to the command (if supported) or post-filter the downloaded JSON to the chosen workflow before computing metrics.
```bash
# Download last 7 days of runs for the selected workflow, with firewall data
gh aw logs \
  --engine copilot \
  --start-date -7d \
  --json \
  --firewall \
  -c 20 \
  > /tmp/gh-aw/token-audit/target-runs.json

# Show summary
jq '{
  workflow: .runs[0].workflow_name,
  total_runs: (.runs | length),
  total_tokens: [.runs[].token_usage // 0] | add,
  avg_tokens: ([.runs[].token_usage // 0] | add) / ([.runs[].token_usage // 0] | length),
  tool_usage: .tool_usage
}' /tmp/gh-aw/token-audit/target-runs.json
```
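One way to address the mixed-workflow concern is to post-filter the downloaded JSON to the chosen workflow before computing metrics. A self-contained jq sketch (inline sample data stands in for `target-runs.json`, and the `.runs[].workflow_name` field is assumed to match the audit output shape):

```shell
# Inline stand-in for the downloaded target-runs.json
RUNS='{"runs":[
  {"workflow_name":"daily-plan","token_usage":100},
  {"workflow_name":"weekly-research","token_usage":50},
  {"workflow_name":"daily-plan","token_usage":300}
]}'
TARGET="daily-plan"

# Keep only runs for the selected workflow, then recompute the summary
echo "$RUNS" | jq --arg wf "$TARGET" '{
  workflow: $wf,
  runs: [.runs[] | select(.workflow_name == $wf)]
} | . + {
  total_runs: (.runs | length),
  total_tokens: ([.runs[].token_usage // 0] | add),
  avg_tokens: (([.runs[].token_usage // 0] | add) / (.runs | length))
}'
```

With the sample above, only the two `daily-plan` runs survive the filter, so the totals no longer mix in the `weekly-research` run.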
- **Files reviewed:** 4/4 changed files
- **Comments generated:** 2
2. Copy it to `/tmp/gh-aw/repo-memory/default/YYYY-MM-DD.json` (today's UTC date).
3. This file is what the optimizer workflow reads to identify high-usage workflows.

Also maintain a rolling summary file at `/tmp/gh-aw/repo-memory/default/rolling-summary.json` that contains an array of daily overall totals (date, total_tokens, total_cost, total_runs, total_action_minutes) for the last 90 entries. Load the existing file, append today's entry, trim to 90, and save.
Repo-memory paths in the instructions don’t match the repo-memory-standard file-glob (memory/token-audit/*). Writing snapshots to /tmp/gh-aw/repo-memory/default/YYYY-MM-DD.json and /tmp/gh-aw/repo-memory/default/rolling-summary.json will likely be ignored by the push_repo_memory filter; these should be placed under /tmp/gh-aw/repo-memory/default/memory/token-audit/ instead (e.g., .../memory/token-audit/YYYY-MM-DD.json and .../memory/token-audit/rolling-summary.json).
Suggested change — current:

> 2. Copy it to `/tmp/gh-aw/repo-memory/default/YYYY-MM-DD.json` (today's UTC date).
> 3. This file is what the optimizer workflow reads to identify high-usage workflows.
> Also maintain a rolling summary file at `/tmp/gh-aw/repo-memory/default/rolling-summary.json` that contains an array of daily overall totals (date, total_tokens, total_cost, total_runs, total_action_minutes) for the last 90 entries. Load the existing file, append today's entry, trim to 90, and save.

Suggested:

> 2. Copy it to `/tmp/gh-aw/repo-memory/default/memory/token-audit/YYYY-MM-DD.json` (today's UTC date).
> 3. This file is what the optimizer workflow reads to identify high-usage workflows.
> Also maintain a rolling summary file at `/tmp/gh-aw/repo-memory/default/memory/token-audit/rolling-summary.json` that contains an array of daily overall totals (date, total_tokens, total_cost, total_runs, total_action_minutes) for the last 90 entries. Load the existing file, append today's entry, trim to 90, and save.
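The load/append/trim-to-90 logic described above can be sketched with jq. A minimal self-contained version (temp-dir path and entry values are illustrative; the workflow would compute today's totals from the audit data):

```shell
# Rolling summary file, simulated in a temp dir for the sketch
SUMMARY="$(mktemp -d)/rolling-summary.json"

# Today's entry (illustrative values)
ENTRY='{"date":"2025-01-02","total_tokens":1200,"total_cost":0.42,"total_runs":6,"total_action_minutes":18}'

# Load the existing array (or start empty), append, trim to the last 90, save
[ -f "$SUMMARY" ] || echo '[]' > "$SUMMARY"
jq --argjson e "$ENTRY" '. + [$e] | .[-90:]' "$SUMMARY" > "$SUMMARY.tmp" \
  && mv "$SUMMARY.tmp" "$SUMMARY"

jq 'length' "$SUMMARY"
```

The `.[-90:]` slice keeps at most the 90 most recent entries, so the file stays bounded as the workflow appends one entry per day.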
```bash
# Find the most recent snapshot
LATEST=$(ls -1 /tmp/gh-aw/repo-memory/default/*.json 2>/dev/null | grep -v rolling | grep -v optimization | sort -r | head -1)
if [ -z "$LATEST" ]; then
  echo "⚠️ No audit snapshots found. The copilot-token-audit workflow may not have run yet."
  echo "Falling back to live data collection..."
fi
echo "Latest snapshot: $LATEST"
cat "$LATEST" | jq '.workflows[:10]'
```
The snapshot discovery script is looking in `/tmp/gh-aw/repo-memory/default/*.json`, but repo-memory-standard constrains writes/reads to the `memory/token-audit/` subdirectory. As written, LATEST will never find snapshots persisted under `memory/token-audit/`, and the subsequent cat/jq will fail when LATEST is empty. Update the glob/pathing to `/tmp/gh-aw/repo-memory/default/memory/token-audit/*.json` and guard the cat/jq behind a non-empty LATEST check.
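A guarded version of the discovery snippet incorporating both fixes (the corrected subdirectory plus a non-empty check before reading). The repo-memory root is simulated with a temp dir and sample snapshots so the sketch is self-contained:

```shell
# Simulated repo-memory checkout; in the workflow this would be
# /tmp/gh-aw/repo-memory/default
ROOT="$(mktemp -d)"
SNAP_DIR="$ROOT/memory/token-audit"
mkdir -p "$SNAP_DIR"
echo '{"workflows":[{"name":"daily-plan","tokens":100}]}' > "$SNAP_DIR/2025-01-01.json"
echo '{"workflows":[{"name":"daily-plan","tokens":200}]}' > "$SNAP_DIR/2025-01-02.json"

# Find the most recent snapshot under the glob-allowed subdirectory
LATEST=$(ls -1 "$SNAP_DIR"/*.json 2>/dev/null | grep -v rolling | grep -v optimization | sort -r | head -1)
if [ -z "$LATEST" ]; then
  echo "⚠️ No audit snapshots found. Falling back to live data collection..."
else
  # Only read the file once we know a snapshot exists
  echo "Latest snapshot: $LATEST"
  jq '.workflows[:10]' "$LATEST"
fi
```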
This issue also appears in the following locations of the same file:
- line 89
- line 114
Summary
Two new daily workflows for tracking and optimizing Copilot token usage:
`copilot-token-audit` (daily ~12:00 UTC weekdays)
- Downloads 30 days of Copilot logs via `gh aw logs --json`
- Persists daily snapshots to the `memory/token-audit` repo-memory branch

`copilot-token-optimizer` (daily ~14:00 UTC weekdays)
- Uses `agentic-workflows` MCP tools (`logs`, `audit`) for deep per-run analysis
- Downloads firewall data via `gh aw logs --firewall` for token breakdowns

Design decisions
- No `./gh-aw` source build — installs via `gh extension install github/gh-aw`
- Shared `memory/token-audit` branch; audit writes daily snapshots, optimizer writes `optimization-log.json`
- Reuses `daily-audit-discussion`, `repo-memory-standard`, `reporting`, `python-dataviz`
- `agentic-workflows` MCP for per-run audits + `github` MCP to read workflow source files