P1: Support bundle command (zip logs+config+doctor+paths+status) #35
Kinin-Code-Offical merged 2 commits into main from
Conversation
Reviewer's Guide

Adds a new support bundle command. Sequence diagram for the support bundle command execution:

sequenceDiagram
actor User
participant CLI as cli_ts
participant Support as supportCommand
participant FS as fs_extra
participant Config as core_config
participant Proxy as core_proxy
participant GCloud as core_gcloud
participant Env as system_env
participant Service as system_service
participant PS as powershell_runPs
participant HTTP as axios
participant Log as core_logger
User->>CLI: cloudsqlctl support bundle [--output] [--keep]
CLI->>Support: execute bundle action(options)
Support->>Support: formatTimestamp
Support->>FS: ensureDir(stagingDir)
Support->>FS: ensureDir(dirname(outputPath))
Support->>Proxy: isRunning
Proxy-->>Support: processRunning
Support->>Service: isServiceInstalled
Service-->>Support: serviceInstalled
alt serviceInstalled
Support->>Service: isServiceRunning
Service-->>Support: serviceRunning
else
Support-->>Support: serviceRunning = false
end
Support->>Config: readConfig
Config-->>Support: config
Support->>GCloud: checkGcloudInstalled
GCloud-->>Support: gcloudInstalled
Support->>GCloud: getActiveAccount
GCloud-->>Support: account
Support->>GCloud: checkAdc
GCloud-->>Support: adc
Support->>GCloud: listInstances
GCloud-->>Support: instances or error
Support->>Env: checkEnvironmentDetailed(Machine)
Env-->>Support: machineEnv
Support->>Env: checkEnvironmentDetailed(User)
Env-->>Support: userEnv
Support->>FS: pathExists(PATHS.PROXY_EXE)
FS-->>Support: proxyExists
Support->>Service: isServiceInstalled
Service-->>Support: serviceInstalled2
alt serviceInstalled2
Support->>PS: getEnvVar(GOOGLE_CREDS, Machine)
PS-->>Support: serviceCreds
end
Support->>HTTP: get(https://api.github.com)
HTTP-->>Support: response or error
Support->>FS: writeFile(paths.txt, buildPathsReport)
Support->>FS: writeFile(status.txt, buildStatusReport)
Support->>FS: writeFile(doctor.txt, buildDoctorReport)
Support->>FS: pathExists(PATHS.CONFIG_FILE)
alt config exists
Support->>FS: copy(config.json -> staging/config.json)
else
Support->>FS: writeFile(config-missing.txt)
end
Support->>FS: pathExists(PATHS.LOGS)
alt logs exist
Support->>FS: copy(logs -> staging/logs)
else
Support->>FS: writeFile(logs-missing.txt)
end
Support->>PS: runPs(Compress-Archive, [staging/*, outputPath])
PS-->>Support: zip created
alt keep not set
Support->>FS: remove(stagingDir)
end
Support->>Log: info(Support bundle created)
Log-->>User: message displayed
rect rgb(255,230,230)
Support->>Log: error(Failed to create support bundle)
Support->>CLI: process.exit(1)
end
Hey - I've found 2 issues, and left some high level feedback:

- In `buildDoctorReport`, several diagnostics (e.g., `checkGcloudInstalled`, `getActiveAccount`, `checkAdc`, `checkEnvironmentDetailed`) are called without individual `try/catch` guards, so a failure in any one of them will abort the whole support bundle; consider wrapping each diagnostic in its own error handling and downgrading failures to WARN lines so the bundle is still generated with partial data.
- The `support bundle` action currently mixes CLI wiring with non-trivial business logic (building reports, copying files, calling PowerShell); consider extracting the report-building and bundling logic into separate reusable functions/modules to keep the command handler thin and easier to test and extend.
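To illustrate the first point, here is a minimal sketch (hypothetical names, not the PR's actual code) of running each diagnostic inside its own guard so a throwing check becomes a WARN line and the remaining checks still run:

```typescript
// Sketch: each check is isolated; one failure degrades to a WARN line
// instead of aborting the whole report, so the bundle keeps partial data.
type DiagnosticCheck = { name: string; run: () => Promise<string> };

async function runDiagnostics(checks: DiagnosticCheck[]): Promise<string[]> {
  const lines: string[] = [];
  for (const check of checks) {
    try {
      lines.push(`${check.name}: ${await check.run()}`);
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);
      lines.push(`${check.name}: WARN (${message})`);
    }
  }
  return lines;
}
```

With this shape, `buildDoctorReport` would reduce to declaring the check list and joining the returned lines.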
## Individual Comments
### Comment 1
<location> `src/commands/support.ts:45` </location>
<code_context>
+ `Service: ${serviceInstalled ? (serviceRunning ? 'RUNNING' : 'STOPPED') : 'NOT INSTALLED'}`,
+ `Process: ${processRunning ? 'RUNNING' : 'STOPPED'}`,
+ `Instance: ${config.selectedInstance || 'Unknown'}`,
+ `Port: ${config.proxyPort || 5432}`,
+ ];
+
</code_context>
<issue_to_address>
**issue:** Using `||` to default the port will treat `0` as falsy, which may be unintended.
If `config.proxyPort` can validly be `0` (or another falsy value like `NaN`), this will incorrectly fall back to `5432`. If you only want a default when the value is `null` or `undefined`, prefer `config.proxyPort ?? 5432`.
</issue_to_address>
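A standalone snippet showing the difference (the config value is illustrative):

```typescript
// `||` falls back on any falsy value (0, '', NaN); `??` falls back only on
// null or undefined, so a valid port of 0 survives.
const config: { proxyPort?: number } = { proxyPort: 0 };

const withOr = config.proxyPort || 5432;      // 0 is falsy, so this yields 5432
const withNullish = config.proxyPort ?? 5432; // 0 is kept
```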
### Comment 2
<location> `src/commands/support.ts:132-133` </location>
<code_context>
+ await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
+ await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());
+
+ if (await fs.pathExists(PATHS.CONFIG_FILE)) {
+ await fs.copy(PATHS.CONFIG_FILE, path.join(stagingDir, 'config.json'));
+ } else {
+ await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
</code_context>
<issue_to_address>
**🚨 suggestion (security):** Copying the raw config into the bundle may leak sensitive information.
`config.json` can contain secrets (tokens, credentials, project IDs, etc.), so copying it as-is into the bundle is risky. Consider redacting known sensitive fields or generating a minimal, non-sensitive config view for the bundle, and make an explicit decision about which fields are safe to include.
Suggested implementation:
```typescript
await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());
if (await fs.pathExists(PATHS.CONFIG_FILE)) {
const SENSITIVE_KEY_PATTERN = /(token|secret|password|key|credential|project[_-]?id|client[_-]?id)/i;
const redactConfig = (value: unknown): unknown => {
if (Array.isArray(value)) {
return value.map(redactConfig);
}
if (value && typeof value === 'object') {
const result: Record<string, unknown> = {};
for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
if (SENSITIVE_KEY_PATTERN.test(k)) {
result[k] = '[REDACTED]';
} else {
result[k] = redactConfig(v);
}
}
return result;
}
// Primitive values are left as-is unless their key was flagged above.
return value;
};
try {
const rawConfig = await fs.readFile(PATHS.CONFIG_FILE, 'utf8');
const parsedConfig = JSON.parse(rawConfig);
const redactedConfig = redactConfig(parsedConfig);
await fs.writeFile(
path.join(stagingDir, 'config-redacted.json'),
JSON.stringify(redactedConfig, null, 2),
'utf8'
);
} catch (err) {
await fs.writeFile(
path.join(stagingDir, 'config-error.txt'),
`Failed to process config.json: ${(err as Error).message}`,
'utf8'
);
}
} else {
await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
}
```
- If there are other places in the codebase or documentation that reference `config.json` in the support bundle, update them to point to `config-redacted.json` and/or describe the new redaction behavior.
- You may want to tune `SENSITIVE_KEY_PATTERN` based on your actual config schema (e.g., adding/removing field names) to better balance usefulness vs. privacy.
</issue_to_address>
Pull request overview
This PR adds a support bundle command that collects diagnostic information, logs, and configuration files into a zip archive to facilitate troubleshooting and support scenarios.
Key Changes:
- Introduces a new `support` command with a `bundle` subcommand that generates a comprehensive support bundle
- Collects system paths, status, diagnostics, configuration, and logs into a timestamped zip file
- Uses PowerShell's Compress-Archive for creating the bundle on Windows
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| src/commands/support.ts | New command that implements support bundle creation with helper functions to generate reports for paths, status, and diagnostics |
| src/cli.ts | Registers the new support command with the CLI program |
```typescript
import { PATHS, PATHS_REASON, PATHS_SOURCE, ENV_VARS } from '../system/paths.js';

function formatTimestamp(): string {
  return new Date().toISOString().replace(/[:.]/g, '-');
```

The formatTimestamp function replaces colons and periods with hyphens, which results in timestamps like "2025-12-22T15-30-45-123Z". This format may be confusing as it mixes date separators (hyphens) with time separators (also hyphens). Consider using a more standard filesystem-safe format like "YYYYMMDD-HHMMSS" or keeping underscores for time parts to improve readability (e.g., "20251222_153045").

Suggested change:

```typescript
const now = new Date();
const pad = (value: number): string => value.toString().padStart(2, '0');
const year = now.getFullYear();
const month = pad(now.getMonth() + 1);
const day = pad(now.getDate());
const hours = pad(now.getHours());
const minutes = pad(now.getMinutes());
const seconds = pad(now.getSeconds());
// Format: YYYYMMDD-HHMMSS (filesystem-safe and unambiguous)
return `${year}${month}${day}-${hours}${minutes}${seconds}`;
```
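As a standalone check of what the current implementation produces (using a fixed date so the output is deterministic):

```typescript
// Current approach: replace ':' and '.' with '-' in the ISO string.
// Date and time components end up with identical separators.
const fixed = new Date('2025-12-22T15:30:45.123Z');
const current = fixed.toISOString().replace(/[:.]/g, '-');
```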
```typescript
if (await fs.pathExists(PATHS.LOGS)) {
  await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
```

When copying the entire logs directory, there is no size limit check. If the logs directory has grown very large over time, the support bundle creation could fail or produce an unexpectedly large zip file. Consider adding a check for the logs directory size and either warning the user or only including recent log files (e.g., last N MB or last N days of logs).

Suggested change:

```typescript
const MAX_LOG_DIR_BYTES = 50 * 1024 * 1024; // 50 MB

const walkLogFiles = async (dir: string): Promise<Array<{ path: string; size: number; mtimeMs: number }>> => {
  const entries = await fs.readdir(dir);
  const files: Array<{ path: string; size: number; mtimeMs: number }> = [];
  for (const entry of entries) {
    const fullPath = path.join(dir, entry);
    const stat = await fs.stat(fullPath);
    if (stat.isDirectory()) {
      const subFiles = await walkLogFiles(fullPath);
      files.push(...subFiles);
    } else {
      files.push({ path: fullPath, size: stat.size, mtimeMs: stat.mtimeMs });
    }
  }
  return files;
};

if (await fs.pathExists(PATHS.LOGS)) {
  const logFiles = await walkLogFiles(PATHS.LOGS);
  const totalLogBytes = logFiles.reduce((sum, f) => sum + f.size, 0);
  if (totalLogBytes <= MAX_LOG_DIR_BYTES) {
    await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
  } else {
    logger.warn(
      `Logs directory size (${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB) exceeds ` +
      `the support bundle limit of ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB. ` +
      `Including only the most recent logs up to the limit.`
    );
    const logsDestRoot = path.join(stagingDir, 'logs');
    await fs.ensureDir(logsDestRoot);
    // Copy most recent log files first until we reach the size limit.
    logFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
    let copiedBytes = 0;
    for (const file of logFiles) {
      if (copiedBytes >= MAX_LOG_DIR_BYTES) {
        break;
      }
      const remaining = MAX_LOG_DIR_BYTES - copiedBytes;
      if (file.size > remaining && copiedBytes > 0) {
        // Skip files that would substantially exceed the limit once we already have some logs.
        continue;
      }
      const relativePath = path.relative(PATHS.LOGS, file.path);
      const destPath = path.join(logsDestRoot, relativePath);
      await fs.ensureDir(path.dirname(destPath));
      await fs.copy(file.path, destPath);
      copiedBytes += file.size;
    }
    await fs.writeFile(
      path.join(stagingDir, 'logs-truncated.txt'),
      `Logs directory was larger than ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB.\n` +
      `Included only the most recent logs, totaling approximately ` +
      `${(copiedBytes / (1024 * 1024)).toFixed(1)} MB out of ` +
      `${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB.\n`
    );
  }
}
```
```typescript
await runPs('& { Compress-Archive -Path $args[0] -DestinationPath $args[1] -Force }', [
  path.join(stagingDir, '*'),
  outputPath
]);

if (!options.keep) {
  await fs.remove(stagingDir);
}
```

If the PowerShell Compress-Archive command fails, the staging directory may not be cleaned up even without the --keep flag, as the cleanup happens after the compression. Consider wrapping the cleanup in a finally block or using a try-catch around the compression to ensure the staging directory is cleaned up on error (unless --keep is specified).
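A minimal sketch of the suggested pattern, with the actual fs/PowerShell calls stood in by callbacks (generic names, not the PR's code):

```typescript
// Cleanup runs whether or not the compression step throws,
// while a keep flag still lets the user retain the staging directory.
async function compressWithCleanup(
  keep: boolean,
  compress: () => Promise<void>,
  removeStaging: () => Promise<void>
): Promise<void> {
  try {
    await compress();
  } finally {
    if (!keep) {
      await removeStaging();
    }
  }
}
```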
```typescript
async function buildDoctorReport(): Promise<string> {
  const lines: string[] = [];
  lines.push('CloudSQLCTL Diagnostics');

  const gcloudInstalled = await checkGcloudInstalled();
  lines.push(`gcloud: ${gcloudInstalled ? 'OK' : 'FAIL'}`);

  const account = await getActiveAccount();
  lines.push(`gcloud account: ${account || 'none'}`);

  const adc = await checkAdc();
  lines.push(`ADC: ${adc ? 'OK' : 'WARN'}`);

  try {
    await listInstances();
    lines.push('list instances: OK');
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    lines.push(`list instances: FAIL (${message})`);
  }

  const machineEnv = await checkEnvironmentDetailed('Machine');
  if (machineEnv.ok) {
    lines.push('env (machine): OK');
  } else {
    lines.push('env (machine): WARN');
    machineEnv.problems.forEach(p => lines.push(`  - ${p}`));
  }

  const userEnv = await checkEnvironmentDetailed('User');
  if (userEnv.ok) {
    lines.push('env (user): OK');
  } else {
    lines.push('env (user): WARN');
    userEnv.problems.forEach(p => lines.push(`  - ${p}`));
  }

  const proxyExists = await fs.pathExists(PATHS.PROXY_EXE);
  lines.push(`proxy binary: ${proxyExists ? 'OK' : 'FAIL'}`);

  const serviceInstalled = await isServiceInstalled();
  lines.push(`service installed: ${serviceInstalled ? 'yes' : 'no'}`);

  if (serviceInstalled) {
    const serviceCreds = await getEnvVar(ENV_VARS.GOOGLE_CREDS, 'Machine');
    lines.push(`service creds: ${serviceCreds ? 'set' : 'not set'}`);
  }

  try {
    await axios.get('https://api.github.com', { timeout: 5000 });
    lines.push('github api: OK');
  } catch {
    lines.push('github api: FAIL');
  }

  return `${lines.join('\n')}\n`;
}
```

The buildDoctorReport function duplicates logic from the existing doctor command (src/commands/doctor.ts). This creates a maintenance burden as changes to diagnostic checks would need to be made in two places. Consider extracting the shared diagnostic logic into a reusable function that both the doctor command and the support bundle can use.
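One way to de-duplicate, sketched with hypothetical names (the real `doctor.ts` internals aren't shown in this diff): register the checks once and let both commands render them.

```typescript
// A single check registry that both `doctor` and `support bundle` could consume.
type NamedCheck = { name: string; run: () => Promise<string> };

// Dependencies are injected so each command (and tests) can supply its own.
function buildChecks(deps: { gcloudInstalled: () => Promise<boolean> }): NamedCheck[] {
  return [
    { name: 'gcloud', run: async () => (await deps.gcloudInstalled()) ? 'OK' : 'FAIL' },
    // ...the remaining checks registered here once, instead of in two files.
  ];
}

async function renderReport(title: string, checks: NamedCheck[]): Promise<string> {
  const lines = [title];
  for (const check of checks) {
    lines.push(`${check.name}: ${await check.run()}`);
  }
  return `${lines.join('\n')}\n`;
}
```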
```typescript
if (await fs.pathExists(PATHS.CONFIG_FILE)) {
  await fs.copy(PATHS.CONFIG_FILE, path.join(stagingDir, 'config.json'));
} else {
```

The support bundle may contain sensitive information from the config.json file (such as instance connection names, credentials paths, or other configuration details). Consider adding a warning message to the user when creating the bundle, or sanitizing sensitive fields before including the config in the bundle to prevent accidental exposure of sensitive data.
```typescript
try {
  await axios.get('https://api.github.com', { timeout: 5000 });
```

The GitHub API check uses a hardcoded timeout of 5000ms. This should be extracted as a constant at the top of the file for easier configuration and consistency. For example, define a constant like `GITHUB_API_TIMEOUT_MS = 5000`.
Closes #18

Summary:
- `support bundle` command that collects logs/config/doctor/paths/status into a zip.

Tests:

Rollback:
Summary by Sourcery

Add a CLI support bundle command to collect diagnostic information into a zipped archive for troubleshooting.

New Features:
- `support bundle` subcommand that packages logs, config, diagnostics, paths, and status into a zip file.

Enhancements: