
P1: Support bundle command (zip logs+config+doctor+paths+status)#35

Merged
Kinin-Code-Offical merged 2 commits into main from p1/support-bundle
Dec 22, 2025

Conversation


@Kinin-Code-Offical Kinin-Code-Offical commented Dec 22, 2025

Closes #18

Summary:

  • Add support bundle command that collects logs/config/doctor/paths/status into a zip.
  • Use a temp staging dir and PowerShell Compress-Archive for bundling.
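
The bundling step described above can be sketched as follows. `buildCompressArchiveArgs` is a hypothetical helper for illustration; the script string mirrors the Compress-Archive invocation used in the PR, but the helper's name and shape are assumptions:

```typescript
// Hypothetical helper illustrating the Compress-Archive invocation built by
// the bundle command. The script string mirrors the one in the PR; the helper
// itself is an illustrative assumption, not code from this repository.
function buildCompressArchiveArgs(
    stagingDir: string,
    outputPath: string
): { script: string; args: string[] } {
    return {
        // -Force overwrites an existing archive at the destination path.
        script: '& { Compress-Archive -Path $args[0] -DestinationPath $args[1] -Force }',
        // Zip the staged files (staging\*) into the final bundle path.
        args: [`${stagingDir}\\*`, outputPath],
    };
}
```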

Tests:

  • npm ci (PASS)
  • npm run lint (PASS)
  • npm test (PASS)
  • npm run build (PASS)

Rollback:

  • Revert this commit to remove support bundle functionality.

Summary by Sourcery

Add a CLI support bundle command to collect diagnostic information into a zipped archive for troubleshooting.

New Features:

  • Introduce a support bundle subcommand that packages logs, config, diagnostics, paths, and status into a zip file.

Enhancements:

  • Generate structured paths, status, and diagnostic reports to include in the support bundle output.

Copilot AI review requested due to automatic review settings December 22, 2025 00:07

sourcery-ai bot commented Dec 22, 2025

Reviewer's Guide

Adds a new support bundle CLI subcommand that stages diagnostic data (paths, status, doctor output, config, logs) into a temporary directory and produces a zip via PowerShell Compress-Archive, wiring the command into the main CLI entrypoint.

Sequence diagram for the new support bundle command execution

```mermaid
sequenceDiagram
    actor User
    participant CLI as cli_ts
    participant Support as supportCommand
    participant FS as fs_extra
    participant Config as core_config
    participant Proxy as core_proxy
    participant GCloud as core_gcloud
    participant Env as system_env
    participant Service as system_service
    participant PS as powershell_runPs
    participant HTTP as axios
    participant Log as core_logger

    User->>CLI: cloudsqlctl support bundle [--output] [--keep]
    CLI->>Support: execute bundle action(options)

    Support->>Support: formatTimestamp
    Support->>FS: ensureDir(stagingDir)
    Support->>FS: ensureDir(dirname(outputPath))

    Support->>Proxy: isRunning
    Proxy-->>Support: processRunning
    Support->>Service: isServiceInstalled
    Service-->>Support: serviceInstalled
    alt serviceInstalled
        Support->>Service: isServiceRunning
        Service-->>Support: serviceRunning
    else
        Support-->>Support: serviceRunning = false
    end
    Support->>Config: readConfig
    Config-->>Support: config

    Support->>GCloud: checkGcloudInstalled
    GCloud-->>Support: gcloudInstalled
    Support->>GCloud: getActiveAccount
    GCloud-->>Support: account
    Support->>GCloud: checkAdc
    GCloud-->>Support: adc
    Support->>GCloud: listInstances
    GCloud-->>Support: instances or error

    Support->>Env: checkEnvironmentDetailed(Machine)
    Env-->>Support: machineEnv
    Support->>Env: checkEnvironmentDetailed(User)
    Env-->>Support: userEnv

    Support->>FS: pathExists(PATHS.PROXY_EXE)
    FS-->>Support: proxyExists

    Support->>Service: isServiceInstalled
    Service-->>Support: serviceInstalled2
    alt serviceInstalled2
        Support->>PS: getEnvVar(GOOGLE_CREDS, Machine)
        PS-->>Support: serviceCreds
    end

    Support->>HTTP: get(https://api.github.com)
    HTTP-->>Support: response or error

    Support->>FS: writeFile(paths.txt, buildPathsReport)
    Support->>FS: writeFile(status.txt, buildStatusReport)
    Support->>FS: writeFile(doctor.txt, buildDoctorReport)

    Support->>FS: pathExists(PATHS.CONFIG_FILE)
    alt config exists
        Support->>FS: copy(config.json -> staging/config.json)
    else
        Support->>FS: writeFile(config-missing.txt)
    end

    Support->>FS: pathExists(PATHS.LOGS)
    alt logs exist
        Support->>FS: copy(logs -> staging/logs)
    else
        Support->>FS: writeFile(logs-missing.txt)
    end

    Support->>PS: runPs(Compress-Archive, [staging/*, outputPath])
    PS-->>Support: zip created

    alt keep not set
        Support->>FS: remove(stagingDir)
    end

    Support->>Log: info(Support bundle created)
    Log-->>User: message displayed

    rect rgb(255,230,230)
        Support->>Log: error(Failed to create support bundle)
        Support->>CLI: process.exit(1)
    end
```

File-Level Changes

Change Details Files
Wire the new support command into the main CLI program so it is available as a subcommand.
  • Import the support command from the new support module into the CLI entrypoint.
  • Register the support command with the Commander program alongside existing commands.
src/cli.ts
Introduce a support command with a bundle subcommand that collects diagnostic information and packages it into a zip support bundle.
  • Define helper functions to build text reports for paths, status, and diagnostics using existing config, environment, gcloud, service, and filesystem utilities.
  • Implement the support bundle subcommand with options for output path and keeping the staging directory.
  • Stage generated reports, config (or a placeholder if missing), and logs (or a placeholder if missing) into a temporary directory under PATHS.TEMP.
  • Use PowerShell Compress-Archive via the existing runPs helper to zip the staged files into the final bundle and optionally clean up the staging directory.
  • Log success with the final bundle path, and on failure log the error and exit with a non-zero code.
src/commands/support.ts

Assessment against linked issues

Issue Objective Addressed Explanation
#18 Add a CLI command to create a support bundle that zips logs, config, doctor output, paths, and status into a single archive.
#18 Wire the new support bundle functionality into the main CLI so it is accessible to users.

Possibly linked issues

  • #P1: PR adds the support bundle command and wiring that exactly implements the support bundle issue request.

@sourcery-ai sourcery-ai bot left a comment

Hey - I've found 2 issues, and left some high level feedback:

  • In buildDoctorReport, several diagnostics (e.g., checkGcloudInstalled, getActiveAccount, checkAdc, checkEnvironmentDetailed) are called without individual try/catch guards, so a failure in any one of them will abort the whole support bundle; consider wrapping each diagnostic in its own error handling and downgrading failures to WARN lines so the bundle is still generated with partial data.
  • The support bundle action currently mixes CLI wiring with non-trivial business logic (building reports, copying files, calling PowerShell); consider extracting the report-building and bundling logic into separate reusable functions/modules to keep the command handler thin and easier to test and extend.
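
The first point can be sketched as a small guard helper. `safeCheck` and the WARN wording are assumptions for illustration, not code from this PR:

```typescript
// Sketch of the per-check guard suggested above; `safeCheck` and the WARN
// wording are illustrative assumptions, not code from this PR.
async function safeCheck(label: string, check: () => Promise<string>): Promise<string> {
    try {
        return await check();
    } catch (error) {
        const message = error instanceof Error ? error.message : String(error);
        // Downgrade a thrown diagnostic to a WARN line so the bundle still builds.
        return `${label}: WARN (check failed: ${message})`;
    }
}

// Usage inside a report builder: each diagnostic is isolated, so one
// failure no longer aborts the whole support bundle.
async function buildGuardedLines(): Promise<string[]> {
    return Promise.all([
        safeCheck('gcloud', async () => 'gcloud: OK'),
        safeCheck('adc', async () => { throw new Error('no credentials'); }),
    ]);
}
```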
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- In `buildDoctorReport`, several diagnostics (e.g., `checkGcloudInstalled`, `getActiveAccount`, `checkAdc`, `checkEnvironmentDetailed`) are called without individual `try/catch` guards, so a failure in any one of them will abort the whole support bundle; consider wrapping each diagnostic in its own error handling and downgrading failures to WARN lines so the bundle is still generated with partial data.
- The `support bundle` action currently mixes CLI wiring with non-trivial business logic (building reports, copying files, calling PowerShell); consider extracting the report-building and bundling logic into separate reusable functions/modules to keep the command handler thin and easier to test and extend.

## Individual Comments

### Comment 1
<location> `src/commands/support.ts:45` </location>
<code_context>
+        `Service: ${serviceInstalled ? (serviceRunning ? 'RUNNING' : 'STOPPED') : 'NOT INSTALLED'}`,
+        `Process: ${processRunning ? 'RUNNING' : 'STOPPED'}`,
+        `Instance: ${config.selectedInstance || 'Unknown'}`,
+        `Port: ${config.proxyPort || 5432}`,
+    ];
+
</code_context>

<issue_to_address>
**issue:** Using `||` to default the port will treat `0` as falsy, which may be unintended.

If `config.proxyPort` can validly be `0` (or another falsy value like `NaN`), this will incorrectly fall back to `5432`. If you only want a default when the value is `null` or `undefined`, prefer `config.proxyPort ?? 5432`.
</issue_to_address>
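
A small illustration of the difference, using a hypothetical config shape (a sketch, assuming `proxyPort` could legitimately be `0` in some setups):

```typescript
// Hypothetical config shape for illustration only.
interface ProxyConfig {
    proxyPort?: number | null;
}

function portWithOr(config: ProxyConfig): number {
    // `||` falls back for every falsy value, so a valid 0 becomes 5432.
    return config.proxyPort || 5432;
}

function portWithNullish(config: ProxyConfig): number {
    // `??` falls back only when the value is null or undefined.
    return config.proxyPort ?? 5432;
}
```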

### Comment 2
<location> `src/commands/support.ts:132-133` </location>
<code_context>
+            await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
+            await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());
+
+            if (await fs.pathExists(PATHS.CONFIG_FILE)) {
+                await fs.copy(PATHS.CONFIG_FILE, path.join(stagingDir, 'config.json'));
+            } else {
+                await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
</code_context>

<issue_to_address>
**🚨 suggestion (security):** Copying the raw config into the bundle may leak sensitive information.

`config.json` can contain secrets (tokens, credentials, project IDs, etc.), so copying it as-is into the bundle is risky. Consider redacting known sensitive fields or generating a minimal, non-sensitive config view for the bundle, and make an explicit decision about which fields are safe to include.

Suggested implementation:

```typescript
            await fs.writeFile(path.join(stagingDir, 'status.txt'), await buildStatusReport());
            await fs.writeFile(path.join(stagingDir, 'doctor.txt'), await buildDoctorReport());

            if (await fs.pathExists(PATHS.CONFIG_FILE)) {
                const SENSITIVE_KEY_PATTERN = /(token|secret|password|key|credential|project[_-]?id|client[_-]?id)/i;

                const redactConfig = (value: unknown): unknown => {
                    if (Array.isArray(value)) {
                        return value.map(redactConfig);
                    }

                    if (value && typeof value === 'object') {
                        const result: Record<string, unknown> = {};
                        for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
                            if (SENSITIVE_KEY_PATTERN.test(k)) {
                                result[k] = '[REDACTED]';
                            } else {
                                result[k] = redactConfig(v);
                            }
                        }
                        return result;
                    }

                    // Primitive values are left as-is unless their key was flagged above.
                    return value;
                };

                try {
                    const rawConfig = await fs.readFile(PATHS.CONFIG_FILE, 'utf8');
                    const parsedConfig = JSON.parse(rawConfig);
                    const redactedConfig = redactConfig(parsedConfig);

                    await fs.writeFile(
                        path.join(stagingDir, 'config-redacted.json'),
                        JSON.stringify(redactedConfig, null, 2),
                        'utf8'
                    );
                } catch (err) {
                    await fs.writeFile(
                        path.join(stagingDir, 'config-error.txt'),
                        `Failed to process config.json: ${(err as Error).message}`,
                        'utf8'
                    );
                }
            } else {
                await fs.writeFile(path.join(stagingDir, 'config-missing.txt'), 'config.json not found');
            }

```

- If there are other places in the codebase or documentation that reference `config.json` in the support bundle, update them to point to `config-redacted.json` and/or describe the new redaction behavior.
- You may want to tune `SENSITIVE_KEY_PATTERN` based on your actual config schema (e.g., adding/removing field names) to better balance usefulness vs. privacy.
</issue_to_address>



Copilot AI left a comment

Pull request overview

This PR adds a support bundle command that collects diagnostic information, logs, and configuration files into a zip archive to facilitate troubleshooting and support scenarios.

Key Changes:

  • Introduces a new support command with a bundle subcommand that generates a comprehensive support bundle
  • Collects system paths, status, diagnostics, configuration, and logs into a timestamped zip file
  • Uses PowerShell's Compress-Archive for creating the bundle on Windows

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 6 comments.

File Description
src/commands/support.ts New command that implements support bundle creation with helper functions to generate reports for paths, status, and diagnostics
src/cli.ts Registers the new support command with the CLI program


```typescript
import { PATHS, PATHS_REASON, PATHS_SOURCE, ENV_VARS } from '../system/paths.js';

function formatTimestamp(): string {
    return new Date().toISOString().replace(/[:.]/g, '-');
```
Copilot AI Dec 22, 2025

The formatTimestamp function replaces colons and periods with hyphens, which results in timestamps like "2025-12-22T15-30-45-123Z". This format may be confusing as it mixes date separators (hyphens) with time separators (also hyphens). Consider using a more standard filesystem-safe format like "YYYYMMDD-HHMMSS" or keeping underscores for time parts to improve readability (e.g., "20251222_153045").

Suggested change

```diff
-    return new Date().toISOString().replace(/[:.]/g, '-');
+    const now = new Date();
+    const pad = (value: number): string => value.toString().padStart(2, '0');
+    const year = now.getFullYear();
+    const month = pad(now.getMonth() + 1);
+    const day = pad(now.getDate());
+    const hours = pad(now.getHours());
+    const minutes = pad(now.getMinutes());
+    const seconds = pad(now.getSeconds());
+    // Format: YYYYMMDD-HHMMSS (filesystem-safe and unambiguous)
+    return `${year}${month}${day}-${hours}${minutes}${seconds}`;
```

Comment on lines +138 to +139

```typescript
            if (await fs.pathExists(PATHS.LOGS)) {
                await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
```
Copilot AI Dec 22, 2025

When copying the entire logs directory, there is no size limit check. If the logs directory has grown very large over time, the support bundle creation could fail or produce an unexpectedly large zip file. Consider adding a check for the logs directory size and either warning the user or only including recent log files (e.g., last N MB or last N days of logs).

Suggested change

```diff
-            if (await fs.pathExists(PATHS.LOGS)) {
-                await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
+            const MAX_LOG_DIR_BYTES = 50 * 1024 * 1024; // 50 MB
+            const walkLogFiles = async (dir: string): Promise<Array<{ path: string; size: number; mtimeMs: number }>> => {
+                const entries = await fs.readdir(dir);
+                const files: Array<{ path: string; size: number; mtimeMs: number }> = [];
+                for (const entry of entries) {
+                    const fullPath = path.join(dir, entry);
+                    const stat = await fs.stat(fullPath);
+                    if (stat.isDirectory()) {
+                        const subFiles = await walkLogFiles(fullPath);
+                        files.push(...subFiles);
+                    } else {
+                        files.push({
+                            path: fullPath,
+                            size: stat.size,
+                            mtimeMs: stat.mtimeMs
+                        });
+                    }
+                }
+                return files;
+            };
+            if (await fs.pathExists(PATHS.LOGS)) {
+                const logFiles = await walkLogFiles(PATHS.LOGS);
+                const totalLogBytes = logFiles.reduce((sum, f) => sum + f.size, 0);
+                if (totalLogBytes <= MAX_LOG_DIR_BYTES) {
+                    await fs.copy(PATHS.LOGS, path.join(stagingDir, 'logs'));
+                } else {
+                    logger.warn(
+                        `Logs directory size (${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB) exceeds ` +
+                        `the support bundle limit of ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB. ` +
+                        `Including only the most recent logs up to the limit.`
+                    );
+                    const logsDestRoot = path.join(stagingDir, 'logs');
+                    await fs.ensureDir(logsDestRoot);
+                    // Copy most recent log files first until we reach the size limit.
+                    logFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
+                    let copiedBytes = 0;
+                    for (const file of logFiles) {
+                        if (copiedBytes >= MAX_LOG_DIR_BYTES) {
+                            break;
+                        }
+                        const remaining = MAX_LOG_DIR_BYTES - copiedBytes;
+                        if (file.size > remaining && copiedBytes > 0) {
+                            // Skip files that would substantially exceed the limit once we already have some logs.
+                            continue;
+                        }
+                        const relativePath = path.relative(PATHS.LOGS, file.path);
+                        const destPath = path.join(logsDestRoot, relativePath);
+                        await fs.ensureDir(path.dirname(destPath));
+                        await fs.copy(file.path, destPath);
+                        copiedBytes += file.size;
+                    }
+                    await fs.writeFile(
+                        path.join(stagingDir, 'logs-truncated.txt'),
+                        `Logs directory was larger than ${(MAX_LOG_DIR_BYTES / (1024 * 1024)).toFixed(1)} MB.\n` +
+                        `Included only the most recent logs, totaling approximately ` +
+                        `${(copiedBytes / (1024 * 1024)).toFixed(1)} MB out of ` +
+                        `${(totalLogBytes / (1024 * 1024)).toFixed(1)} MB.\n`
+                    );
+                }
```
Comment on lines +144 to +151

```typescript
            await runPs('& { Compress-Archive -Path $args[0] -DestinationPath $args[1] -Force }', [
                path.join(stagingDir, '*'),
                outputPath
            ]);

            if (!options.keep) {
                await fs.remove(stagingDir);
            }
```
Copilot AI Dec 22, 2025

If the PowerShell Compress-Archive command fails, the staging directory may not be cleaned up even without the --keep flag, as the cleanup happens after the compression. Consider wrapping the cleanup in a finally block or using a try-catch around the compression to ensure the staging directory is cleaned up on error (unless --keep is specified).
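
A minimal sketch of that suggestion, with the dependencies injected so the control flow is visible; `createBundleWithCleanup`, `compress`, and `remove` are illustrative stand-ins for the PR's `runPs` and `fs.remove` calls, not actual code:

```typescript
// Sketch of the suggested try/finally cleanup. The injected `compress` and
// `remove` functions stand in for the real runPs and fs.remove calls; all
// names here are assumptions for illustration.
async function createBundleWithCleanup(
    stagingDir: string,
    keep: boolean,
    compress: (dir: string) => Promise<void>,
    remove: (dir: string) => Promise<void>
): Promise<void> {
    try {
        await compress(stagingDir);
    } finally {
        // Runs whether compression succeeded or threw, so a failed
        // Compress-Archive no longer leaks the staging directory.
        if (!keep) {
            await remove(stagingDir);
        }
    }
}
```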

Copilot uses AI. Check for mistakes.
Comment on lines +51 to +107

```typescript
async function buildDoctorReport(): Promise<string> {
    const lines: string[] = [];
    lines.push('CloudSQLCTL Diagnostics');

    const gcloudInstalled = await checkGcloudInstalled();
    lines.push(`gcloud: ${gcloudInstalled ? 'OK' : 'FAIL'}`);

    const account = await getActiveAccount();
    lines.push(`gcloud account: ${account || 'none'}`);

    const adc = await checkAdc();
    lines.push(`ADC: ${adc ? 'OK' : 'WARN'}`);

    try {
        await listInstances();
        lines.push('list instances: OK');
    } catch (error) {
        const message = error instanceof Error ? error.message : String(error);
        lines.push(`list instances: FAIL (${message})`);
    }

    const machineEnv = await checkEnvironmentDetailed('Machine');
    if (machineEnv.ok) {
        lines.push('env (machine): OK');
    } else {
        lines.push('env (machine): WARN');
        machineEnv.problems.forEach(p => lines.push(` - ${p}`));
    }

    const userEnv = await checkEnvironmentDetailed('User');
    if (userEnv.ok) {
        lines.push('env (user): OK');
    } else {
        lines.push('env (user): WARN');
        userEnv.problems.forEach(p => lines.push(` - ${p}`));
    }

    const proxyExists = await fs.pathExists(PATHS.PROXY_EXE);
    lines.push(`proxy binary: ${proxyExists ? 'OK' : 'FAIL'}`);

    const serviceInstalled = await isServiceInstalled();
    lines.push(`service installed: ${serviceInstalled ? 'yes' : 'no'}`);

    if (serviceInstalled) {
        const serviceCreds = await getEnvVar(ENV_VARS.GOOGLE_CREDS, 'Machine');
        lines.push(`service creds: ${serviceCreds ? 'set' : 'not set'}`);
    }

    try {
        await axios.get('https://api.github.com', { timeout: 5000 });
        lines.push('github api: OK');
    } catch {
        lines.push('github api: FAIL');
    }

    return `${lines.join('\n')}\n`;
}
```
Copilot AI Dec 22, 2025

The buildDoctorReport function duplicates logic from the existing doctor command (src/commands/doctor.ts). This creates a maintenance burden as changes to diagnostic checks would need to be made in two places. Consider extracting the shared diagnostic logic into a reusable function that both the doctor command and the support bundle can use.
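
A sketch of the extraction being suggested: a shared diagnostic runner that both the doctor command and the support bundle could consume. All names and types here are assumptions for illustration, not existing code from this repository:

```typescript
// Shared diagnostic runner both commands could build on; the types and names
// are illustrative assumptions, not existing code from this repository.
interface DiagnosticResult {
    name: string;
    status: 'OK' | 'WARN' | 'FAIL';
    detail?: string;
}

interface DiagnosticCheck {
    name: string;
    run: () => Promise<DiagnosticResult>;
}

async function runDiagnostics(checks: DiagnosticCheck[]): Promise<DiagnosticResult[]> {
    const results: DiagnosticResult[] = [];
    for (const check of checks) {
        try {
            results.push(await check.run());
        } catch (error) {
            // A throwing check degrades to FAIL instead of aborting the run.
            const detail = error instanceof Error ? error.message : String(error);
            results.push({ name: check.name, status: 'FAIL', detail });
        }
    }
    return results;
}

// Both the doctor command and the support bundle can render the same results.
function formatDiagnosticReport(results: DiagnosticResult[]): string {
    return results
        .map(r => `${r.name}: ${r.status}${r.detail ? ` (${r.detail})` : ''}`)
        .join('\n');
}
```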

Comment on lines +132 to +134

```typescript
            if (await fs.pathExists(PATHS.CONFIG_FILE)) {
                await fs.copy(PATHS.CONFIG_FILE, path.join(stagingDir, 'config.json'));
            } else {
```
Copilot AI Dec 22, 2025

The support bundle may contain sensitive information from the config.json file (such as instance connection names, credentials paths, or other configuration details). Consider adding a warning message to the user when creating the bundle, or sanitizing sensitive fields before including the config in the bundle to prevent accidental exposure of sensitive data.

```typescript
    }

    try {
        await axios.get('https://api.github.com', { timeout: 5000 });
```
Copilot AI Dec 22, 2025

The GitHub API check uses a hardcoded timeout of 5000ms. This should be extracted as a constant at the top of the file for easier configuration and consistency. For example, define a constant like GITHUB_API_TIMEOUT_MS = 5000.

@Kinin-Code-Offical Kinin-Code-Offical merged commit b5ebf20 into main Dec 22, 2025
5 checks passed
@Kinin-Code-Offical Kinin-Code-Offical deleted the p1/support-bundle branch December 22, 2025 00:13

Development

Successfully merging this pull request may close these issues.

P1: Support bundle command (zip logs+config+doctor+paths+status)
