diff --git a/docs/cli/index.md b/docs/cli/index.md index 3e34ff12..d1b7aa78 100644 --- a/docs/cli/index.md +++ b/docs/cli/index.md @@ -26,6 +26,7 @@ These flags are available on all commands that interact with B2C instances: ## Command Topics - [Code Commands](./code) - Deploy cartridges and manage code versions +- [Job Commands](./jobs) - Execute and monitor jobs, import/export site archives - [Sites Commands](./sites) - List and manage sites - [Sandbox Commands](./sandbox) - Create and manage sandboxes - [MRT Commands](./mrt) - Manage Managed Runtime environments diff --git a/docs/cli/jobs.md b/docs/cli/jobs.md new file mode 100644 index 00000000..bed75c0a --- /dev/null +++ b/docs/cli/jobs.md @@ -0,0 +1,303 @@ +# Job Commands + +Commands for executing and monitoring jobs on B2C Commerce instances. + +## b2c job run + +Execute a job on a B2C Commerce instance. + +### Usage + +```bash +b2c job run JOBID +``` + +### Arguments + +| Argument | Description | Required | +|----------|-------------|----------| +| `JOBID` | Job ID to execute | Yes | + +### Flags + +In addition to [global flags](./index#global-flags): + +| Flag | Short | Description | Default | +|------|-------|-------------|---------| +| `--wait` | `-w` | Wait for job to complete | `false` | +| `--timeout` | `-t` | Timeout in seconds when waiting | No timeout | +| `--param` | `-P` | Job parameter in format "name=value" (repeatable) | | +| `--no-wait-running` | | Do not wait for running job to finish before starting | `false` | +| `--show-log` | | Show job log on failure | `true` | + +### Examples + +```bash +# Execute a job +b2c job run my-custom-job + +# Execute and wait for completion +b2c job run my-custom-job --wait + +# Execute with timeout +b2c job run my-custom-job --wait --timeout 600 + +# Execute with parameters +b2c job run my-custom-job -P "SiteScope={\"all_storefront_sites\":true}" -P OtherParam=value + +# Output as JSON +b2c job run my-custom-job --wait --json +``` + +### Authentication + 
+This command requires OAuth authentication with OCAPI permissions for the `/jobs` resource. + +--- + +## b2c job wait + +Wait for a job execution to complete. + +### Usage + +```bash +b2c job wait JOBID EXECUTIONID +``` + +### Arguments + +| Argument | Description | Required | +|----------|-------------|----------| +| `JOBID` | Job ID | Yes | +| `EXECUTIONID` | Execution ID to wait for | Yes | + +### Flags + +In addition to [global flags](./index#global-flags): + +| Flag | Short | Description | Default | +|------|-------|-------------|---------| +| `--timeout` | `-t` | Timeout in seconds | No timeout | +| `--poll-interval` | | Polling interval in seconds | `3` | +| `--show-log` | | Show job log on failure | `true` | + +### Examples + +```bash +# Wait for a job execution +b2c job wait my-job abc123-def456 + +# Wait with timeout +b2c job wait my-job abc123-def456 --timeout 600 + +# Wait with custom polling interval +b2c job wait my-job abc123-def456 --poll-interval 5 +``` + +### Authentication + +This command requires OAuth authentication with OCAPI permissions for the `/jobs` resource. + +--- + +## b2c job search + +Search for job executions on a B2C Commerce instance. 
+ +### Usage + +```bash +b2c job search +``` + +### Flags + +In addition to [global flags](./index#global-flags): + +| Flag | Short | Description | Default | +|------|-------|-------------|---------| +| `--job-id` | `-j` | Filter by job ID | | +| `--status` | | Filter by status (comma-separated: RUNNING,PENDING,OK,ERROR) | | +| `--count` | `-n` | Maximum number of results | `25` | +| `--start` | | Starting index for pagination | `0` | +| `--sort-by` | | Sort by field (start_time, end_time, job_id, status) | `start_time` | +| `--sort-order` | | Sort order (asc, desc) | `desc` | + +### Examples + +```bash +# Search all recent job executions +b2c job search + +# Search for a specific job +b2c job search --job-id my-custom-job + +# Search for running or pending jobs +b2c job search --status RUNNING,PENDING + +# Get more results +b2c job search --count 50 + +# Output as JSON +b2c job search --json +``` + +### Output + +The command displays a table of job executions with: + +- Execution ID +- Job ID +- Status +- Start Time + +### Authentication + +This command requires OAuth authentication with OCAPI permissions for the `/job_execution_search` resource. + +--- + +## b2c job import + +Import a site archive to a B2C Commerce instance using the `sfcc-site-archive-import` system job. 
+ +### Usage + +```bash +b2c job import TARGET +``` + +### Arguments + +| Argument | Description | Required | +|----------|-------------|----------| +| `TARGET` | Directory, zip file, or remote filename to import | Yes | + +### Flags + +In addition to [global flags](./index#global-flags): + +| Flag | Short | Description | Default | +|------|-------|-------------|---------| +| `--keep-archive` | `-k` | Keep archive on instance after import | `false` | +| `--remote` | `-r` | Target is a filename already on the instance (in Impex/src/instance/) | `false` | +| `--timeout` | `-t` | Timeout in seconds | No timeout | +| `--show-log` | | Show job log on failure | `true` | + +### Examples + +```bash +# Import from a local directory (will be zipped automatically) +b2c job import ./my-site-data + +# Import from a zip file +b2c job import ./export.zip + +# Keep archive on instance after import +b2c job import ./my-site-data --keep-archive + +# Import from existing file on instance +b2c job import existing-archive.zip --remote + +# With timeout +b2c job import ./my-site-data --timeout 300 +``` + +### Notes + +- When importing a directory, it will be automatically zipped before upload +- The archive is uploaded to `Impex/src/instance/` on the instance +- By default, the archive is deleted after successful import (use `--keep-archive` to retain) + +### Authentication + +This command requires OAuth authentication with OCAPI permissions for the `/jobs` resource and WebDAV access for file upload. + +--- + +## b2c job export + +Export a site archive from a B2C Commerce instance using the `sfcc-site-archive-export` system job. 
+ +### Usage + +```bash +b2c job export +``` + +### Flags + +In addition to [global flags](./index#global-flags): + +| Flag | Short | Description | Default | +|------|-------|-------------|---------| +| `--output` | `-o` | Output path for the export | `./export` | +| `--data-units` | `-d` | Data units JSON configuration | | +| `--site` | | Site ID(s) to export (comma-separated, repeatable) | | +| `--site-data` | | Site data types to export (comma-separated) | | +| `--global-data` | | Global data types to export (comma-separated) | | +| `--catalog` | | Catalog ID(s) to export (comma-separated) | | +| `--price-book` | | Price book ID(s) to export (comma-separated) | | +| `--library` | | Library ID(s) to export (comma-separated) | | +| `--inventory-list` | | Inventory list ID(s) to export (comma-separated) | | +| `--keep-archive` | `-k` | Keep archive on instance after download | `false` | +| `--no-download` | | Do not download archive (implies --keep-archive) | `false` | +| `--zip-only` | | Save as zip file without extracting | `false` | +| `--timeout` | `-t` | Timeout in seconds | No timeout | +| `--show-log` | | Show job log on failure | `true` | + +### Examples + +```bash +# Export global metadata +b2c job export --global-data meta_data + +# Export a site's content and preferences +b2c job export --site RefArch --site-data content,site_preferences + +# Export catalogs +b2c job export --catalog storefront-catalog + +# Export with custom data units JSON +b2c job export --data-units '{"global_data":{"meta_data":true}}' + +# Export to a specific directory +b2c job export --output ./exports + +# Keep archive on instance +b2c job export --global-data meta_data --keep-archive + +# Output as JSON +b2c job export --global-data meta_data --json +``` + +### Data Units + +The export is configured using "data units" which specify what data to export. You can use convenience flags (`--site`, `--global-data`, etc.) or provide a full JSON configuration with `--data-units`.
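+The convenience flags are shorthand for the same JSON structure that `--data-units` accepts. For example, `--site RefArch --site-data content --catalog storefront-catalog --global-data meta_data` corresponds roughly to the following configuration (a sketch; see the OCAPI job documentation for the full schema):
+
+```json
+{
+  "sites": {
+    "RefArch": {"content": true}
+  },
+  "catalogs": {"storefront-catalog": true},
+  "global_data": {"meta_data": true}
+}
+```
+
+When `--data-units` is provided, it takes precedence and the convenience flags are ignored.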
+ +#### Site Data Types + +When using `--site-data`, available types include: +- `all` - Export all site data +- `content` - Content assets and slots +- `site_preferences` - Site preferences +- `campaigns_and_promotions` - Marketing campaigns +- `customer_groups` - Customer groups +- `payment_methods` - Payment configurations +- And more (see OCAPI documentation) + +#### Global Data Types + +When using `--global-data`, available types include: +- `all` - Export all global data +- `meta_data` - System and custom object metadata +- `custom_types` - Custom object type definitions +- `preferences` - Global preferences +- `locales` - Locale configurations +- `services` - Service configurations +- And more (see OCAPI documentation) + +### Authentication + +This command requires OAuth authentication with OCAPI permissions for the `/jobs` resource and WebDAV access for file download. diff --git a/packages/b2c-cli/.gitignore b/packages/b2c-cli/.gitignore index de7f3175..c2161849 100644 --- a/packages/b2c-cli/.gitignore +++ b/packages/b2c-cli/.gitignore @@ -12,3 +12,5 @@ yarn.lock package-lock.json dw.json + +export/ diff --git a/packages/b2c-cli/package.json b/packages/b2c-cli/package.json index a6b9f278..0bd3d2fb 100644 --- a/packages/b2c-cli/package.json +++ b/packages/b2c-cli/package.json @@ -70,22 +70,35 @@ "topicSeparator": " ", "topics": { "auth": { - "description": "Authentication and token management" - }, - "hello": { - "description": "Say hello to the world and others" + "description": "Manage authentication credentials and tokens" }, "code": { - "description": "Manage cartridge code on instances" - }, - "sites": { - "description": "Manage sites on instances" + "description": "Deploy and manage code versions on instances" }, - "sandbox": { - "description": "Manage on-demand sandboxes" + "job": { + "description": "Run jobs and import/export site archives" }, "mrt": { - "description": "Managed Runtime operations" + "description": "Manage Managed Runtime projects and 
deployments", + "subtopics": { + "env-var": { + "description": "Manage environment variables on MRT projects" + } + } + }, + "ods": { + "description": "Manage On-Demand Sandboxes" + }, + "sites": { + "description": "List and inspect storefront sites" + }, + "slas": { + "description": "Manage SLAS API clients and credentials", + "subtopics": { + "client": { + "description": "Manage SLAS client configurations" + } + } + } } } }, diff --git a/packages/b2c-cli/src/commands/job/export.ts b/packages/b2c-cli/src/commands/job/export.ts new file mode 100644 index 00000000..377704a4 --- /dev/null +++ b/packages/b2c-cli/src/commands/job/export.ts @@ -0,0 +1,314 @@ +import {Flags} from '@oclif/core'; +import {JobCommand} from '@salesforce/b2c-tooling/cli'; +import { + siteArchiveExportToPath, + JobExecutionError, + type SiteArchiveExportResult, + type ExportDataUnitsConfiguration, +} from '@salesforce/b2c-tooling/operations/jobs'; +import {t} from '../../i18n/index.js'; + +export default class JobExport extends JobCommand { + static description = t('commands.job.export.description', 'Export a site archive from a B2C Commerce instance using sfcc-site-archive-export job'); + + static enableJsonFlag = true; + + static examples = [ + '<%= config.bin %> <%= command.id %> --global-data meta_data', + '<%= config.bin %> <%= command.id %> --site RefArch --site-data content,site_preferences', + '<%= config.bin %> <%= command.id %> --catalog storefront-catalog', + '<%= config.bin %> <%= command.id %> --data-units \'{"global_data":{"meta_data":true}}\'', + '<%= config.bin %> <%= command.id %> --output ./exports --no-download', + ]; + + static flags = { + ...JobCommand.baseFlags, + output: Flags.string({ + char: 'o', + description: 'Output path (directory or .zip file)', + default: './export', + }), + site: Flags.string({ + description: 'Site IDs to export (comma-separated)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + 'site-data': Flags.string({ + description: 'Site data units to export
(comma-separated: content,site_preferences,etc.)', + }), + catalog: Flags.string({ + description: 'Catalog IDs to export (comma-separated)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + library: Flags.string({ + description: 'Library IDs to export (comma-separated)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + 'inventory-list': Flags.string({ + description: 'Inventory list IDs to export (comma-separated)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + 'price-book': Flags.string({ + description: 'Price book IDs to export (comma-separated)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + 'global-data': Flags.string({ + description: 'Global data units to export (comma-separated: meta_data,custom_types,etc.)', + }), + 'data-units': Flags.string({ + char: 'd', + description: 'Full data units configuration as JSON string', + }), + 'keep-archive': Flags.boolean({ + char: 'k', + description: 'Keep archive on instance after download', + default: false, + }), + 'no-download': Flags.boolean({ + description: 'Do not download the archive (leave on instance)', + default: false, + }), + 'zip-only': Flags.boolean({ + description: 'Save as zip file without extracting', + default: false, + }), + timeout: Flags.integer({ + char: 't', + description: 'Timeout in seconds (default: no timeout)', + }), + 'show-log': Flags.boolean({ + description: 'Show job log on failure', + default: true, + }), + }; + + async run(): Promise<SiteArchiveExportResult> { + this.requireOAuthCredentials(); + this.requireWebDavCredentials(); + + const { + output, + site, + 'site-data': siteData, + catalog, + library, + 'inventory-list': inventoryList, + 'price-book': priceBook, + 'global-data': globalData, + 'data-units': dataUnitsJson, + 'keep-archive': keepArchive, + 'no-download': noDownload, + 'zip-only': zipOnly, + timeout, + 'show-log': showLog, + } = this.flags; + + const hostname = this.resolvedConfig.hostname!; + + // Build
data units configuration + const dataUnits = this.buildDataUnits({ + dataUnitsJson, + site, + siteData, + catalog, + library, + inventoryList, + priceBook, + globalData, + }); + + if (Object.keys(dataUnits).length === 0) { + this.error( + t( + 'commands.job.export.noDataUnits', + 'No data units specified. Use --global-data, --site, --catalog, etc. or --data-units', + ), + ); + } + + this.log( + t('commands.job.export.exporting', 'Exporting data from {{hostname}}...', { + hostname, + }), + ); + + this.log(t('commands.job.export.dataUnits', 'Data units: {{dataUnits}}', {dataUnits: JSON.stringify(dataUnits)})); + + try { + const result = await siteArchiveExportToPath(this.instance, dataUnits, output, { + keepArchive: keepArchive || noDownload, + extractZip: !zipOnly, + waitOptions: { + timeout: timeout ? timeout * 1000 : undefined, + onProgress: (exec, elapsed) => { + if (!this.jsonEnabled()) { + const elapsedSec = Math.floor(elapsed / 1000); + this.log( + t('commands.job.export.progress', ' Status: {{status}} ({{elapsed}}s elapsed)', { + status: exec.execution_status, + elapsed: elapsedSec.toString(), + }), + ); + } + }, + }, + }); + + const durationSec = result.execution.duration ? 
(result.execution.duration / 1000).toFixed(1) : 'N/A'; + this.log( + t('commands.job.export.completed', 'Export completed: {{status}} (duration: {{duration}}s)', { + status: result.execution.exit_status?.code || result.execution.execution_status, + duration: durationSec, + }), + ); + + if (result.localPath) { + this.log( + t('commands.job.export.savedTo', 'Saved to: {{path}}', { + path: result.localPath, + }), + ); + } + + if (result.archiveKept) { + this.log( + t('commands.job.export.archiveKept', 'Archive kept at: Impex/src/instance/{{filename}}', { + filename: result.archiveFilename, + }), + ); + } + + return result; + } catch (error) { + if (error instanceof JobExecutionError) { + if (showLog) { + await this.showJobLog(error.execution); + } + this.error( + t('commands.job.export.failed', 'Export failed: {{status}}', { + status: error.execution.exit_status?.code || 'ERROR', + }), + ); + } + if (error instanceof Error) { + this.error( + t('commands.job.export.error', 'Export error: {{message}}', { + message: error.message, + }), + ); + } + throw error; + } + } + + private buildDataUnits(params: { + dataUnitsJson?: string; + site?: string[]; + siteData?: string; + catalog?: string[]; + library?: string[]; + inventoryList?: string[]; + priceBook?: string[]; + globalData?: string; + }): Partial<ExportDataUnitsConfiguration> { + // If JSON is provided, use it directly + if (params.dataUnitsJson) { + try { + return JSON.parse(params.dataUnitsJson) as Partial<ExportDataUnitsConfiguration>; + } catch { + this.error( + t('commands.job.export.invalidJson', 'Invalid JSON for --data-units: {{json}}', { + json: params.dataUnitsJson, + }), + ); + } + } + + const dataUnits: Partial<ExportDataUnitsConfiguration> = {}; + + // Sites + if (params.site && params.site.length > 0) { + dataUnits.sites = {}; + const siteDataUnits = this.parseSiteDataUnits(params.siteData); + + for (const siteId of params.site) { + dataUnits.sites[siteId] = siteDataUnits || {all: true}; + } + } + + // Catalogs + if (params.catalog && params.catalog.length > 0) { + dataUnits.catalogs = {};
for (const catalogId of params.catalog) { + dataUnits.catalogs[catalogId] = true; + } + } + + // Libraries + if (params.library && params.library.length > 0) { + dataUnits.libraries = {}; + for (const libraryId of params.library) { + dataUnits.libraries[libraryId] = true; + } + } + + // Inventory lists (API uses snake_case keys) + if (params.inventoryList && params.inventoryList.length > 0) { + // eslint-disable-next-line camelcase + dataUnits.inventory_lists = {}; + for (const listId of params.inventoryList) { + dataUnits.inventory_lists[listId] = true; + } + } + + // Price books (API uses snake_case keys) + if (params.priceBook && params.priceBook.length > 0) { + // eslint-disable-next-line camelcase + dataUnits.price_books = {}; + for (const bookId of params.priceBook) { + dataUnits.price_books[bookId] = true; + } + } + + // Global data (API uses snake_case keys) + if (params.globalData) { + // eslint-disable-next-line camelcase + dataUnits.global_data = this.parseGlobalDataUnits(params.globalData); + } + + return dataUnits; + } + + private parseGlobalDataUnits(globalData: string): Record<string, boolean> { + const units = globalData.split(',').map((s) => s.trim()); + const result: Record<string, boolean> = {}; + + for (const unit of units) { + result[unit] = true; + } + + return result; + } + + private parseSiteDataUnits(siteData?: string): Record<string, boolean> | undefined { + if (!siteData) return undefined; + + const units = siteData.split(',').map((s) => s.trim()); + const result: Record<string, boolean> = {}; + + for (const unit of units) { + result[unit] = true; + } + + return result; + } +} diff --git a/packages/b2c-cli/src/commands/job/import.ts b/packages/b2c-cli/src/commands/job/import.ts new file mode 100644 index 00000000..73e6ab0c --- /dev/null +++ b/packages/b2c-cli/src/commands/job/import.ts @@ -0,0 +1,138 @@ +import {Args, Flags} from '@oclif/core'; +import {JobCommand} from '@salesforce/b2c-tooling/cli'; +import { + siteArchiveImport, + JobExecutionError, + type SiteArchiveImportResult, +} from
'@salesforce/b2c-tooling/operations/jobs'; +import {t} from '../../i18n/index.js'; + +export default class JobImport extends JobCommand { + static args = { + target: Args.string({ + description: 'Directory, zip file, or remote filename to import', + required: true, + }), + }; + + static description = t( + 'commands.job.import.description', + 'Import a site archive to a B2C Commerce instance using sfcc-site-archive-import job', + ); + + static enableJsonFlag = true; + + static examples = [ + '<%= config.bin %> <%= command.id %> ./my-site-data', + '<%= config.bin %> <%= command.id %> ./export.zip', + '<%= config.bin %> <%= command.id %> ./my-site-data --keep-archive', + '<%= config.bin %> <%= command.id %> existing-archive.zip --remote', + ]; + + static flags = { + ...JobCommand.baseFlags, + 'keep-archive': Flags.boolean({ + char: 'k', + description: 'Keep archive on instance after import', + default: false, + }), + remote: Flags.boolean({ + char: 'r', + description: 'Target is a filename already on the instance (in Impex/src/instance/)', + default: false, + }), + timeout: Flags.integer({ + char: 't', + description: 'Timeout in seconds (default: no timeout)', + }), + 'show-log': Flags.boolean({ + description: 'Show job log on failure', + default: true, + }), + }; + + async run(): Promise<SiteArchiveImportResult> { + this.requireOAuthCredentials(); + this.requireWebDavCredentials(); + + const {target} = this.args; + const {'keep-archive': keepArchive, remote, timeout, 'show-log': showLog} = this.flags; + + const hostname = this.resolvedConfig.hostname!; + + if (remote) { + this.log( + t('commands.job.import.importingRemote', 'Importing {{target}} from {{hostname}}...', { + target, + hostname, + }), + ); + } else { + this.log( + t('commands.job.import.importing', 'Importing {{target}} to {{hostname}}...', { + target, + hostname, + }), + ); + } + + try { + const importTarget = remote ?
{remoteFilename: target} : target; + + const result = await siteArchiveImport(this.instance, importTarget, { + keepArchive, + waitOptions: { + timeout: timeout ? timeout * 1000 : undefined, + onProgress: (exec, elapsed) => { + if (!this.jsonEnabled()) { + const elapsedSec = Math.floor(elapsed / 1000); + this.log( + t('commands.job.import.progress', ' Status: {{status}} ({{elapsed}}s elapsed)', { + status: exec.execution_status, + elapsed: elapsedSec.toString(), + }), + ); + } + }, + }, + }); + + const durationSec = result.execution.duration ? (result.execution.duration / 1000).toFixed(1) : 'N/A'; + this.log( + t('commands.job.import.completed', 'Import completed: {{status}} (duration: {{duration}}s)', { + status: result.execution.exit_status?.code || result.execution.execution_status, + duration: durationSec, + }), + ); + + if (result.archiveKept) { + this.log( + t('commands.job.import.archiveKept', 'Archive kept at: Impex/src/instance/{{filename}}', { + filename: result.archiveFilename, + }), + ); + } + + return result; + } catch (error) { + if (error instanceof JobExecutionError) { + if (showLog) { + await this.showJobLog(error.execution); + } + this.error( + t('commands.job.import.failed', 'Import failed: {{status}}', { + status: error.execution.exit_status?.code || 'ERROR', + }), + ); + } + if (error instanceof Error) { + this.error( + t('commands.job.import.error', 'Import error: {{message}}', { + message: error.message, + }), + ); + } + throw error; + } + } +} diff --git a/packages/b2c-cli/src/commands/job/run.ts b/packages/b2c-cli/src/commands/job/run.ts new file mode 100644 index 00000000..f01f4ad9 --- /dev/null +++ b/packages/b2c-cli/src/commands/job/run.ts @@ -0,0 +1,154 @@ +import {Args, Flags} from '@oclif/core'; +import {JobCommand} from '@salesforce/b2c-tooling/cli'; +import {executeJob, waitForJob, JobExecutionError, type JobExecution} from '@salesforce/b2c-tooling/operations/jobs'; +import {t} from '../../i18n/index.js'; + +export default class
JobRun extends JobCommand { + static args = { + jobId: Args.string({ + description: 'Job ID to execute', + required: true, + }), + }; + + static description = t('commands.job.run.description', 'Execute a job on a B2C Commerce instance'); + + static enableJsonFlag = true; + + static examples = [ + '<%= config.bin %> <%= command.id %> my-custom-job', + '<%= config.bin %> <%= command.id %> my-custom-job --wait', + String.raw`<%= config.bin %> <%= command.id %> my-custom-job -P "SiteScope={\"all_storefront_sites\":true}" -P OtherParam=value`, + '<%= config.bin %> <%= command.id %> my-custom-job --wait --timeout 600', + ]; + + static flags = { + ...JobCommand.baseFlags, + wait: Flags.boolean({ + char: 'w', + description: 'Wait for job to complete', + default: false, + }), + timeout: Flags.integer({ + char: 't', + description: 'Timeout in seconds when waiting (default: no timeout)', + }), + param: Flags.string({ + char: 'P', + description: 'Job parameter in format "name=value" (use -P multiple times for multiple params)', + multiple: true, + multipleNonGreedy: true, + }), + 'no-wait-running': Flags.boolean({ + description: 'Do not wait for running job to finish before starting', + default: false, + }), + 'show-log': Flags.boolean({ + description: 'Show job log on failure', + default: true, + }), + }; + + async run(): Promise<JobExecution> { + this.requireOAuthCredentials(); + + const {jobId} = this.args; + const {wait, timeout, param, 'no-wait-running': noWaitRunning, 'show-log': showLog} = this.flags; + + // Parse parameters + const parameters = this.parseParameters(param || []); + + this.log( + t('commands.job.run.executing', 'Executing job {{jobId}} on {{hostname}}...', { + jobId, + hostname: this.resolvedConfig.hostname!, + }), + ); + + let execution: JobExecution; + try { + execution = await executeJob(this.instance, jobId, { + parameters, + waitForRunning: !noWaitRunning, + }); + } catch (error) { + if (error instanceof Error) { + this.error(
t('commands.job.run.executionFailed', 'Failed to execute job: {{message}}', {message: error.message}), + ); + } + throw error; + } + + this.log( + t('commands.job.run.started', 'Job started: {{executionId}} (status: {{status}})', { + executionId: execution.id, + status: execution.execution_status, + }), + ); + + // Wait for completion if requested + if (wait) { + this.log(t('commands.job.run.waiting', 'Waiting for job to complete...')); + + try { + execution = await waitForJob(this.instance, jobId, execution.id!, { + timeout: timeout ? timeout * 1000 : undefined, + onProgress: (exec, elapsed) => { + if (!this.jsonEnabled()) { + const elapsedSec = Math.floor(elapsed / 1000); + this.log( + t('commands.job.run.progress', ' Status: {{status}} ({{elapsed}}s elapsed)', { + status: exec.execution_status, + elapsed: elapsedSec.toString(), + }), + ); + } + }, + }); + + const durationSec = execution.duration ? (execution.duration / 1000).toFixed(1) : 'N/A'; + this.log( + t('commands.job.run.completed', 'Job completed: {{status}} (duration: {{duration}}s)', { + status: execution.exit_status?.code || execution.execution_status, + duration: durationSec, + }), + ); + } catch (error) { + if (error instanceof JobExecutionError) { + if (showLog) { + await this.showJobLog(error.execution); + } + this.error( + t('commands.job.run.jobFailed', 'Job failed: {{status}}', { + status: error.execution.exit_status?.code || 'ERROR', + }), + ); + } + throw error; + } + } + + // JSON output handled by oclif; the execution is the command result + return execution; + } + + private parseParameters(params: string[]): Array<{name: string; value: string}> { + return params.map((p) => { + const eqIndex = p.indexOf('='); + if (eqIndex === -1) { + this.error( + t('commands.job.run.invalidParam', 'Invalid parameter format: {{param}}.
Expected "name=value"', {param: p}), + ); + } + return { + name: p.slice(0, eqIndex), + value: p.slice(eqIndex + 1), + }; + }); + } +} diff --git a/packages/b2c-cli/src/commands/job/search.ts b/packages/b2c-cli/src/commands/job/search.ts new file mode 100644 index 00000000..ef4a8196 --- /dev/null +++ b/packages/b2c-cli/src/commands/job/search.ts @@ -0,0 +1,185 @@ +import {Flags, ux} from '@oclif/core'; +import cliui from 'cliui'; +import {InstanceCommand} from '@salesforce/b2c-tooling/cli'; +import { + searchJobExecutions, + type JobExecutionSearchResult, + type JobExecution, +} from '@salesforce/b2c-tooling/operations/jobs'; +import {t} from '../../i18n/index.js'; + +/** + * Column definition for table output. + */ +interface ColumnDef { + header: string; + get: (e: JobExecution) => string; +} + +/** + * Available columns for job execution list output. + */ +const COLUMNS: Record<string, ColumnDef> = { + id: { + header: 'Execution ID', + get: (e) => e.id ?? '-', + }, + jobId: { + header: 'Job ID', + get: (e) => e.job_id ?? '-', + }, + status: { + header: 'Status', + get: (e) => e.exit_status?.code || e.execution_status || '-', + }, + startTime: { + header: 'Start Time', + get: (e) => (e.start_time ?
new Date(e.start_time).toISOString().replace('T', ' ').slice(0, 19) : '-'), + }, +}; + +const DEFAULT_COLUMNS = ['id', 'jobId', 'status', 'startTime']; + +export default class JobSearch extends InstanceCommand { + static description = t('commands.job.search.description', 'Search for job executions on a B2C Commerce instance'); + + static enableJsonFlag = true; + + static examples = [ + '<%= config.bin %> <%= command.id %>', + '<%= config.bin %> <%= command.id %> --job-id my-custom-job', + '<%= config.bin %> <%= command.id %> --status RUNNING,PENDING', + '<%= config.bin %> <%= command.id %> --count 50', + '<%= config.bin %> <%= command.id %> --json', + ]; + + static flags = { + ...InstanceCommand.baseFlags, + 'job-id': Flags.string({ + char: 'j', + description: 'Filter by job ID', + }), + status: Flags.string({ + description: 'Filter by status (comma-separated: RUNNING,PENDING,OK,ERROR)', + multiple: true, + multipleNonGreedy: true, + delimiter: ',', + }), + count: Flags.integer({ + char: 'n', + description: 'Maximum number of results', + default: 25, + }), + start: Flags.integer({ + description: 'Starting index for pagination', + default: 0, + }), + 'sort-by': Flags.string({ + description: 'Sort by field', + options: ['start_time', 'end_time', 'job_id', 'status'], + default: 'start_time', + }), + 'sort-order': Flags.string({ + description: 'Sort order', + options: ['asc', 'desc'], + default: 'desc', + }), + }; + + async run(): Promise<JobExecutionSearchResult> { + this.requireOAuthCredentials(); + + const {'job-id': jobId, status, count, start, 'sort-by': sortBy, 'sort-order': sortOrder} = this.flags; + + this.log( + t('commands.job.search.searching', 'Searching job executions on {{hostname}}...', { + hostname: this.resolvedConfig.hostname!, + }), + ); + + const results = await searchJobExecutions(this.instance, { + jobId, + status, + count, + start, + sortBy, + sortOrder: sortOrder as 'asc' | 'desc', + }); + + // JSON output handled by oclif + if (this.jsonEnabled()) { + return results;
} + + // Human-readable output + if (results.total === 0) { + ux.stdout(t('commands.job.search.noResults', 'No job executions found.')); + return results; + } + + this.log( + t('commands.job.search.found', 'Found {{total}} job execution(s) (showing {{count}})', { + total: results.total, + count: results.hits.length, + }), + ); + + this.printExecutionsTable(results.hits); + + return results; + } + + /** + * Calculate dynamic column widths based on content. + */ + private calculateColumnWidths(executions: JobExecution[], columnKeys: string[]): Map<string, number> { + const widths = new Map<string, number>(); + const padding = 2; + + for (const key of columnKeys) { + const col = COLUMNS[key]; + let maxWidth = col.header.length; + + for (const exec of executions) { + const value = col.get(exec); + maxWidth = Math.max(maxWidth, value.length); + } + + widths.set(key, maxWidth + padding); + } + + return widths; + } + + private printExecutionsTable(executions: JobExecution[]): void { + const termWidth = process.stdout.columns || 120; + const ui = cliui({width: termWidth}); + const columnKeys = DEFAULT_COLUMNS; + + const widths = this.calculateColumnWidths(executions, columnKeys); + + // Header + const headerCols = columnKeys.map((key) => ({ + text: COLUMNS[key].header, + width: widths.get(key), + padding: [0, 1, 0, 0] as [number, number, number, number], + })); + ui.div(...headerCols); + + // Separator + const totalWidth = [...widths.values()].reduce((sum, w) => sum + w, 0); + ui.div({text: '─'.repeat(Math.min(totalWidth, termWidth)), padding: [0, 0, 0, 0]}); + + // Rows + for (const exec of executions) { + const rowCols = columnKeys.map((key) => ({ + text: COLUMNS[key].get(exec), + width: widths.get(key), + padding: [0, 1, 0, 0] as [number, number, number, number], + })); + ui.div(...rowCols); + } + + ux.stdout(ui.toString()); + } +} diff --git a/packages/b2c-cli/src/commands/job/wait.ts b/packages/b2c-cli/src/commands/job/wait.ts new file mode 100644 index 00000000..fafde227 --- /dev/null +++
b/packages/b2c-cli/src/commands/job/wait.ts
@@ -0,0 +1,97 @@
+import {Args, Flags} from '@oclif/core';
+import {JobCommand} from '@salesforce/b2c-tooling/cli';
+import {waitForJob, JobExecutionError, type JobExecution} from '@salesforce/b2c-tooling/operations/jobs';
+import {t} from '../../i18n/index.js';
+
+export default class JobWait extends JobCommand {
+  static args = {
+    jobId: Args.string({
+      description: 'Job ID',
+      required: true,
+    }),
+    executionId: Args.string({
+      description: 'Execution ID to wait for',
+      required: true,
+    }),
+  };
+
+  static description = t('commands.job.wait.description', 'Wait for a job execution to complete');
+
+  static enableJsonFlag = true;
+
+  static examples = [
+    '<%= config.bin %> <%= command.id %> my-job abc123-def456',
+    '<%= config.bin %> <%= command.id %> my-job abc123-def456 --timeout 600',
+    '<%= config.bin %> <%= command.id %> my-job abc123-def456 --poll-interval 5',
+  ];
+
+  static flags = {
+    ...JobCommand.baseFlags,
+    timeout: Flags.integer({
+      char: 't',
+      description: 'Timeout in seconds (default: no timeout)',
+    }),
+    'poll-interval': Flags.integer({
+      description: 'Polling interval in seconds',
+      default: 3,
+    }),
+    'show-log': Flags.boolean({
+      description: 'Show job log on failure',
+      default: true,
+    }),
+  };
+
+  async run(): Promise<JobExecution> {
+    this.requireOAuthCredentials();
+
+    const {jobId, executionId} = this.args;
+    const {timeout, 'poll-interval': pollInterval, 'show-log': showLog} = this.flags;
+
+    this.log(
+      t('commands.job.wait.waiting', 'Waiting for job {{jobId}} execution {{executionId}}...', {
+        jobId,
+        executionId,
+      }),
+    );
+
+    try {
+      const execution = await waitForJob(this.instance, jobId, executionId, {
+        timeout: timeout ?
timeout * 1000 : undefined, + pollInterval: pollInterval * 1000, + onProgress: (exec, elapsed) => { + if (!this.jsonEnabled()) { + const elapsedSec = Math.floor(elapsed / 1000); + this.log( + t('commands.job.wait.progress', ' Status: {{status}} ({{elapsed}}s elapsed)', { + status: exec.execution_status, + elapsed: elapsedSec.toString(), + }), + ); + } + }, + }); + + const durationSec = execution.duration ? (execution.duration / 1000).toFixed(1) : 'N/A'; + this.log( + t('commands.job.wait.completed', 'Job completed: {{status}} (duration: {{duration}}s)', { + status: execution.exit_status?.code || execution.execution_status, + duration: durationSec, + }), + ); + + return execution; + } catch (error) { + if (error instanceof JobExecutionError) { + if (showLog) { + await this.showJobLog(error.execution); + } + this.error( + t('commands.job.wait.jobFailed', 'Job failed: {{status}}', { + status: error.execution.exit_status?.code || 'ERROR', + }), + ); + } + throw error; + } + } +} diff --git a/packages/b2c-cli/src/commands/slas/client/create.ts b/packages/b2c-cli/src/commands/slas/client/create.ts index 04ff621b..ce98ca26 100644 --- a/packages/b2c-cli/src/commands/slas/client/create.ts +++ b/packages/b2c-cli/src/commands/slas/client/create.ts @@ -4,7 +4,6 @@ import { SlasClientCommand, type Client, type ClientOutput, - parseMultiple, normalizeClientResponse, printClientDetails, formatApiError, @@ -35,23 +34,31 @@ export default class SlasClientCreate extends SlasClientCommand v.split(',').map((s) => s.trim())); -} - /** * JSON output structure for SLAS client commands */ diff --git a/packages/b2c-tooling/package.json b/packages/b2c-tooling/package.json index 5e879960..01e3f947 100644 --- a/packages/b2c-tooling/package.json +++ b/packages/b2c-tooling/package.json @@ -137,7 +137,6 @@ "@oclif/core": "^4", "@oclif/prettier-config": "^0.2.1", "@salesforce/dev-config": "^4.3.2", - "@types/archiver": "^7.0.0", "@types/node": "^18.19.130", "eslint": "^9", 
"eslint-config-prettier": "^10", @@ -160,10 +159,10 @@ "node": ">=22.16.0" }, "dependencies": { - "archiver": "^7.0.1", "chokidar": "^5.0.0", "glob": "^13.0.0", "i18next": "^25.6.3", + "jszip": "^3.10.1", "open": "^11.0.0", "openapi-fetch": "^0.15.0", "pino": "^10.1.0", diff --git a/packages/b2c-tooling/src/cli/cartridge-command.ts b/packages/b2c-tooling/src/cli/cartridge-command.ts index c5da5147..3dec3f69 100644 --- a/packages/b2c-tooling/src/cli/cartridge-command.ts +++ b/packages/b2c-tooling/src/cli/cartridge-command.ts @@ -28,13 +28,17 @@ export abstract class CartridgeCommand extends Instanc static cartridgeFlags = { cartridge: Flags.string({ char: 'c', - description: 'Include specific cartridge(s) (can be specified multiple times)', + description: 'Include specific cartridge(s) (comma-separated)', multiple: true, + multipleNonGreedy: true, + delimiter: ',', }), 'exclude-cartridge': Flags.string({ char: 'x', - description: 'Exclude specific cartridge(s) (can be specified multiple times)', + description: 'Exclude specific cartridge(s) (comma-separated)', multiple: true, + multipleNonGreedy: true, + delimiter: ',', }), }; diff --git a/packages/b2c-tooling/src/cli/index.ts b/packages/b2c-tooling/src/cli/index.ts index add0cb47..1adfcdf1 100644 --- a/packages/b2c-tooling/src/cli/index.ts +++ b/packages/b2c-tooling/src/cli/index.ts @@ -4,6 +4,7 @@ export type {Flags, Args} from './base-command.js'; export {OAuthCommand} from './oauth-command.js'; export {InstanceCommand} from './instance-command.js'; export {CartridgeCommand} from './cartridge-command.js'; +export {JobCommand} from './job-command.js'; export {MrtCommand} from './mrt-command.js'; export {OdsCommand} from './ods-command.js'; diff --git a/packages/b2c-tooling/src/cli/job-command.ts b/packages/b2c-tooling/src/cli/job-command.ts new file mode 100644 index 00000000..d4a35e0b --- /dev/null +++ b/packages/b2c-tooling/src/cli/job-command.ts @@ -0,0 +1,64 @@ +import {Command} from '@oclif/core'; +import 
{InstanceCommand} from './instance-command.js';
+import {getJobLog, getJobErrorMessage, type JobExecution} from '../operations/jobs/index.js';
+import {t} from '../i18n/index.js';
+
+/**
+ * Base command for job operations.
+ *
+ * Extends InstanceCommand with job-specific functionality like
+ * displaying job logs on failure.
+ *
+ * @example
+ * export default class MyJobCommand extends JobCommand {
+ *   async run(): Promise<void> {
+ *     try {
+ *       await executeJob(this.instance, 'my-job');
+ *     } catch (error) {
+ *       if (error instanceof JobExecutionError) {
+ *         await this.showJobLog(error.execution);
+ *       }
+ *       throw error;
+ *     }
+ *   }
+ * }
+ */
+export abstract class JobCommand extends InstanceCommand {
+  /**
+   * Display a job's log file content and error message if available.
+   * Outputs to stderr since this is typically shown for failed jobs.
+   *
+   * @param execution - Job execution with log file info
+   */
+  protected async showJobLog(execution: JobExecution): Promise<void> {
+    // Extract error message from failed step executions
+    const errorMessage = getJobErrorMessage(execution);
+
+    if (!execution.is_log_file_existing) {
+      // No log file, but we may still have an error message
+      if (errorMessage) {
+        this.logger.error({errorMessage}, errorMessage);
+      }
+      return;
+    }
+
+    try {
+      const log = await getJobLog(this.instance, execution);
+      const logFileName = execution.log_file_path?.split('/').pop() ??
'job.log'; + + const header = t('cli.job.logHeader', 'Job log ({{logFileName}}):', {logFileName}); + this.logger.error({log, errorMessage}, `${header}\n${log}`); + + // Log the error message separately if available + if (errorMessage) { + this.logger.error(t('cli.job.errorMessage', 'Error: {{message}}', {message: errorMessage})); + } + } catch { + this.warn(t('cli.job.logFetchFailed', 'Could not retrieve job log')); + // Still try to show error message even if log fetch failed + if (errorMessage) { + this.logger.error({errorMessage}, errorMessage); + } + } + } +} diff --git a/packages/b2c-tooling/src/cli/oauth-command.ts b/packages/b2c-tooling/src/cli/oauth-command.ts index 27871d6a..5a446659 100644 --- a/packages/b2c-tooling/src/cli/oauth-command.ts +++ b/packages/b2c-tooling/src/cli/oauth-command.ts @@ -31,9 +31,11 @@ export abstract class OAuthCommand extends BaseCommand helpGroup: 'AUTH', }), scope: Flags.string({ - description: 'OAuth scopes to request (can be specified multiple times)', + description: 'OAuth scopes to request (comma-separated)', env: 'SFCC_OAUTH_SCOPES', multiple: true, + multipleNonGreedy: true, + delimiter: ',', helpGroup: 'AUTH', }), 'short-code': Flags.string({ @@ -42,9 +44,11 @@ export abstract class OAuthCommand extends BaseCommand helpGroup: 'AUTH', }), 'auth-methods': Flags.string({ - description: 'Allowed auth methods in priority order (comma-separated or multiple flags)', + description: 'Allowed auth methods in priority order (comma-separated)', env: 'SFCC_AUTH_METHODS', multiple: true, + multipleNonGreedy: true, + delimiter: ',', options: ALL_AUTH_METHODS, helpGroup: 'AUTH', }), @@ -57,7 +61,7 @@ export abstract class OAuthCommand extends BaseCommand }; /** - * Parses auth methods from flags, supporting both comma-separated values and multiple flags. + * Parses auth methods from flags. * Returns methods in the order specified (priority order). 
*/ protected parseAuthMethods(): AuthMethod[] | undefined { @@ -66,16 +70,10 @@ export abstract class OAuthCommand extends BaseCommand return undefined; } - // Flatten comma-separated values while preserving order - const methods: AuthMethod[] = []; - for (const value of flagValues) { - const parts = value.split(',').map((s) => s.trim()); - for (const part of parts) { - if (part && ALL_AUTH_METHODS.includes(part as AuthMethod)) { - methods.push(part as AuthMethod); - } - } - } + // Filter to valid auth methods (oclif handles comma splitting via delimiter) + const methods = flagValues + .map((s) => s.trim()) + .filter((s): s is AuthMethod => ALL_AUTH_METHODS.includes(s as AuthMethod)); return methods.length > 0 ? methods : undefined; } diff --git a/packages/b2c-tooling/src/index.ts b/packages/b2c-tooling/src/index.ts index 3410ddbc..bcb38fe0 100644 --- a/packages/b2c-tooling/src/index.ts +++ b/packages/b2c-tooling/src/index.ts @@ -106,8 +106,36 @@ export type { } from './operations/code/index.js'; // Operations - Jobs -export {runJob, getJobStatus} from './operations/jobs/index.js'; -export type {JobExecutionResult} from './operations/jobs/index.js'; +export { + executeJob, + getJobExecution, + waitForJob, + searchJobExecutions, + findRunningJobExecution, + getJobLog, + getJobErrorMessage, + JobExecutionError, + siteArchiveImport, + siteArchiveExport, + siteArchiveExportToPath, +} from './operations/jobs/index.js'; +export type { + JobExecution, + JobStepExecution, + JobExecutionStatus, + JobExecutionParameter, + ExecuteJobOptions, + WaitForJobOptions, + SearchJobExecutionsOptions, + JobExecutionSearchResult, + SiteArchiveImportOptions, + SiteArchiveImportResult, + SiteArchiveExportOptions, + SiteArchiveExportResult, + ExportDataUnitsConfiguration, + ExportSitesConfiguration, + ExportGlobalDataConfiguration, +} from './operations/jobs/index.js'; // Operations - Sites export {listSites, getSite} from './operations/sites/index.js'; diff --git 
a/packages/b2c-tooling/src/operations/code/deploy.ts b/packages/b2c-tooling/src/operations/code/deploy.ts index 18ace63c..41d5043f 100644 --- a/packages/b2c-tooling/src/operations/code/deploy.ts +++ b/packages/b2c-tooling/src/operations/code/deploy.ts @@ -1,5 +1,6 @@ import path from 'node:path'; -import archiver from 'archiver'; +import fs from 'node:fs'; +import JSZip from 'jszip'; import type {B2CInstance} from '../../instance/index.js'; import {getLogger} from '../../logging/logger.js'; import {findCartridges, type CartridgeMapping, type FindCartridgesOptions} from './cartridges.js'; @@ -30,18 +31,22 @@ export interface DeployResult { } /** - * Converts an archiver stream to a Buffer. + * Recursively adds a directory to a JSZip instance. */ -async function archiverToBuffer(archive: archiver.Archiver): Promise { - return new Promise((resolve, reject) => { - const chunks: Buffer[] = []; - - archive.on('data', (chunk: Buffer) => chunks.push(chunk)); - archive.on('end', () => resolve(Buffer.concat(chunks))); - archive.on('error', reject); - - archive.finalize(); - }); +async function addDirectoryToZip(zip: JSZip, dirPath: string, zipPath: string): Promise { + const entries = await fs.promises.readdir(dirPath, {withFileTypes: true}); + + for (const entry of entries) { + const fullPath = path.join(dirPath, entry.name); + const entryZipPath = path.join(zipPath, entry.name); + + if (entry.isDirectory()) { + await addDirectoryToZip(zip, fullPath, entryZipPath); + } else if (entry.isFile()) { + const content = await fs.promises.readFile(fullPath); + zip.file(entryZipPath, content); + } + } } /** @@ -129,15 +134,17 @@ export async function uploadCartridges(instance: B2CInstance, cartridges: Cartri // Create zip archive logger.debug('Creating cartridge archive...'); - const archive = archiver('zip', { - zlib: {level: 9}, - }); + const zip = new JSZip(); for (const c of cartridges) { - archive.directory(c.src, path.join(codeVersion, c.dest)); + await addDirectoryToZip(zip, 
c.src, path.join(codeVersion, c.dest)); } - const buffer = await archiverToBuffer(archive); + const buffer = await zip.generateAsync({ + type: 'nodebuffer', + compression: 'DEFLATE', + compressionOptions: {level: 9}, + }); logger.debug({size: buffer.length}, `Archive created: ${buffer.length} bytes`); // Upload archive diff --git a/packages/b2c-tooling/src/operations/code/watch.ts b/packages/b2c-tooling/src/operations/code/watch.ts index 47817a3f..41c3f677 100644 --- a/packages/b2c-tooling/src/operations/code/watch.ts +++ b/packages/b2c-tooling/src/operations/code/watch.ts @@ -1,7 +1,7 @@ import path from 'node:path'; import fs from 'node:fs'; import {watch, type FSWatcher} from 'chokidar'; -import archiver from 'archiver'; +import JSZip from 'jszip'; import type {B2CInstance} from '../../instance/index.js'; import {getLogger} from '../../logging/logger.js'; import {findCartridges, type CartridgeMapping, type FindCartridgesOptions} from './cartridges.js'; @@ -80,21 +80,6 @@ function debounce void>(fn: T, delay: number): T { }) as T; } -/** - * Converts an archiver stream to a Buffer. - */ -async function archiverToBuffer(archive: archiver.Archiver): Promise { - return new Promise((resolve, reject) => { - const chunks: Buffer[] = []; - - archive.on('data', (chunk: Buffer) => chunks.push(chunk)); - archive.on('end', () => resolve(Buffer.concat(chunks))); - archive.on('error', reject); - - archive.finalize(); - }); -} - /** * Watches cartridge directories and syncs changes to an instance. 
* @@ -204,19 +189,22 @@ export async function watchCartridges( const uploadPath = `${webdavLocation}/_upload-${now}.zip`; try { - const archive = archiver('zip', { - zlib: {level: 5}, - }); + const zip = new JSZip(); for (const f of validUploadFiles) { try { - archive.file(f.src, {name: f.dest}); + const content = await fs.promises.readFile(f.src); + zip.file(f.dest, content); } catch (error) { logger.debug({file: f.src, error}, 'Failed to add file to archive'); } } - const buffer = await archiverToBuffer(archive); + const buffer = await zip.generateAsync({ + type: 'nodebuffer', + compression: 'DEFLATE', + compressionOptions: {level: 5}, + }); await webdav.put(uploadPath, buffer, 'application/zip'); logger.debug({uploadPath}, 'Archive uploaded'); diff --git a/packages/b2c-tooling/src/operations/jobs/index.ts b/packages/b2c-tooling/src/operations/jobs/index.ts index 324f9161..7f30f14c 100644 --- a/packages/b2c-tooling/src/operations/jobs/index.ts +++ b/packages/b2c-tooling/src/operations/jobs/index.ts @@ -4,42 +4,94 @@ * This module provides functions for running and monitoring jobs * on B2C Commerce instances via OCAPI. 
* - * ## Functions + * ## Core Job Functions * - * - {@link runJob} - Start a job execution - * - {@link getJobStatus} - Check the status of a running job + * - {@link executeJob} - Start a job execution + * - {@link getJobExecution} - Get the status of a job execution + * - {@link waitForJob} - Wait for a job to complete + * - {@link searchJobExecutions} - Search for job executions + * - {@link findRunningJobExecution} - Find a running execution + * - {@link getJobLog} - Retrieve job log file content + * + * ## System Jobs + * + * - {@link siteArchiveImport} - Import a site archive + * - {@link siteArchiveExport} - Export a site archive + * - {@link siteArchiveExportToPath} - Export and save to local path * * ## Usage * * ```typescript - * import { runJob, getJobStatus } from '@salesforce/b2c-tooling/operations/jobs'; - * import { B2CInstance, OAuthStrategy } from '@salesforce/b2c-tooling'; + * import { + * executeJob, + * waitForJob, + * searchJobExecutions, + * siteArchiveImport, + * siteArchiveExport, + * } from '@salesforce/b2c-tooling/operations/jobs'; + * import { B2CInstance } from '@salesforce/b2c-tooling'; + * + * const instance = B2CInstance.fromDwJson(); + * + * // Run a custom job and wait for completion + * const execution = await executeJob(instance, 'my-job-id'); + * const result = await waitForJob(instance, 'my-job-id', execution.id); + * + * // Search for recent job executions + * const results = await searchJobExecutions(instance, { + * jobId: 'my-job-id', + * count: 10 + * }); + * + * // Import a site archive + * await siteArchiveImport(instance, './my-import-data'); * - * const auth = new OAuthStrategy({ - * clientId: 'your-client-id', - * clientSecret: 'your-client-secret', + * // Export site data + * const exportResult = await siteArchiveExport(instance, { + * global_data: { meta_data: true } * }); - * const instance = new B2CInstance( - * { hostname: 'your-sandbox.demandware.net' }, - * auth - * ); - * - * // Start a job - * const result = 
await runJob(instance, 'my-job-id'); - * - * // Poll for completion - * let status = await getJobStatus(instance, result.jobId, result.executionId); - * while (status.status === 'running') { - * await new Promise(resolve => setTimeout(resolve, 5000)); - * status = await getJobStatus(instance, result.jobId, result.executionId); - * } * ``` * * ## Authentication * - * Job operations require OAuth authentication with appropriate OCAPI permissions. + * Job operations require OAuth authentication with appropriate OCAPI permissions + * for the /jobs and /job_execution_search resources. * * @module operations/jobs */ -export {runJob, getJobStatus} from './run.js'; -export type {JobExecutionResult} from './run.js'; + +// Core job execution +export { + executeJob, + getJobExecution, + waitForJob, + searchJobExecutions, + findRunningJobExecution, + getJobLog, + getJobErrorMessage, + JobExecutionError, +} from './run.js'; + +export type { + JobExecution, + JobStepExecution, + JobExecutionStatus, + JobExecutionParameter, + ExecuteJobOptions, + WaitForJobOptions, + SearchJobExecutionsOptions, + JobExecutionSearchResult, +} from './run.js'; + +// Site archive import/export +export {siteArchiveImport, siteArchiveExport, siteArchiveExportToPath} from './site-archive.js'; + +export type { + SiteArchiveImportOptions, + SiteArchiveImportResult, + SiteArchiveExportOptions, + SiteArchiveExportResult, + ExportDataUnitsConfiguration, + ExportSitesConfiguration, + ExportGlobalDataConfiguration, +} from './site-archive.js'; diff --git a/packages/b2c-tooling/src/operations/jobs/run.ts b/packages/b2c-tooling/src/operations/jobs/run.ts index 698c79f8..56e28903 100644 --- a/packages/b2c-tooling/src/operations/jobs/run.ts +++ b/packages/b2c-tooling/src/operations/jobs/run.ts @@ -1,45 +1,458 @@ +/** + * Job execution operations for B2C Commerce. + * + * Provides functions for executing and monitoring jobs on B2C Commerce instances. 
 import {B2CInstance} from '../../instance/index.js';
+import type {components} from '../../clients/ocapi.generated.js';
+import {getLogger} from '../../logging/logger.js';
+
+/**
+ * Job execution from OCAPI.
+ * Type alias to the generated schema.
+ */
+export type JobExecution = components['schemas']['job_execution'];
+
+/**
+ * Job step execution from OCAPI.
+ * Type alias to the generated schema.
+ */
+export type JobStepExecution = components['schemas']['job_step_execution'];
+
+/**
+ * Job execution status from OCAPI.
+ * Type alias to the generated schema's execution_status field.
+ */
+export type JobExecutionStatus = NonNullable<JobExecution['execution_status']>;
+
+/**
+ * Job execution parameter for starting jobs.
+ * Type alias to the generated schema.
+ */
+export type JobExecutionParameter = components['schemas']['job_execution_parameter'];
 
-export interface JobExecutionResult {
-  jobId: string;
-  status: 'running' | 'completed' | 'failed';
-  startTime: Date;
-  endTime?: Date;
+/**
+ * Options for executing a job.
+ */
+export interface ExecuteJobOptions {
+  /** Job parameters to pass */
+  parameters?: JobExecutionParameter[];
+  /** Wait for running jobs to finish before starting (default: true) */
+  waitForRunning?: boolean;
 }
 
 /**
- * Runs a job on an instance.
+ * Options for waiting on a job.
  */
-export async function runJob(instance: B2CInstance, jobId: string): Promise<JobExecutionResult> {
-  console.log(`Running job ${jobId} on ${instance.config.hostname}...`);
+export interface WaitForJobOptions {
+  /** Polling interval in milliseconds (default: 3000) */
+  pollInterval?: number;
+  /** Maximum time to wait in milliseconds (default: no limit) */
+  timeout?: number;
+  /** Callback for progress updates */
+  onProgress?: (execution: JobExecution, elapsedMs: number) => void;
+}
 
-  // TODO: Implement actual job execution via OCAPI
-  // POST /s/-/dw/data/v21_10/jobs/{job_id}/executions
+/**
+ * Executes a job on a B2C Commerce instance.
+ *
+ * Starts a job execution and returns immediately with the execution details.
+ * Use {@link waitForJob} to wait for completion.
+ *
+ * @param instance - B2C instance to execute on
+ * @param jobId - Job ID to execute
+ * @param options - Execution options
+ * @returns Job execution details
+ * @throws Error if job is already running (when waitForRunning is false)
+ * @throws Error if job not found or cannot be executed
+ *
+ * @example
+ * ```typescript
+ * // Execute a simple job
+ * const execution = await executeJob(instance, 'my-job-id');
+ *
+ * // Execute with parameters
+ * const execution = await executeJob(instance, 'CustomerImportJob', {
+ *   parameters: [
+ *     { name: 'SiteScope', value: '{"all_storefront_sites":true}' }
+ *   ]
+ * });
+ * ```
+ */
+export async function executeJob(
+  instance: B2CInstance,
+  jobId: string,
+  options: ExecuteJobOptions = {},
+): Promise<JobExecution> {
+  const logger = getLogger();
+  const {parameters = [], waitForRunning = true} = options;
 
-  return {
-    jobId,
-    status: 'running',
-    startTime: new Date(),
-  };
+  logger.debug({jobId, parameters}, `Executing job ${jobId}`);
+
+  // Build request body - OCAPI accepts either parameters array or job-specific fields
+  const body = parameters.length > 0 ?
{parameters} : undefined; + + const {data, error, response} = await instance.ocapi.POST('/jobs/{job_id}/executions', { + params: {path: {job_id: jobId}}, + body: body as unknown as string, + }); + + // Handle JobAlreadyRunningException + if (response.status === 400) { + // Need to check fault type - read raw response + const errorBody = await response.text().catch(() => ''); + if (errorBody.includes('JobAlreadyRunningException')) { + if (waitForRunning) { + logger.warn(`Job ${jobId} already running, waiting for it to finish...`); + + // Search for the running execution + const runningExecution = await findRunningJobExecution(instance, jobId); + if (runningExecution) { + logger.debug({executionId: runningExecution.id}, `Found running execution ${runningExecution.id}`); + await waitForJob(instance, jobId, runningExecution.id!); + // Retry execution after the running job finishes + return executeJob(instance, jobId, {...options, waitForRunning: false}); + } + // Couldn't find running job, try again + return executeJob(instance, jobId, {...options, waitForRunning: false}); + } + throw new Error(`Job ${jobId} is already running`); + } + } + + if (error || !data) { + const message = error?.fault?.message ?? `Failed to execute job ${jobId}`; + throw new Error(message); + } + + logger.debug({executionId: data.id, status: data.execution_status}, `Job ${jobId} started: ${data.id}`); + + return data; +} + +/** + * Gets the current status of a job execution. 
+ *
+ * @param instance - B2C instance
+ * @param jobId - Job ID
+ * @param executionId - Execution ID
+ * @returns Current execution status
+ * @throws Error if execution not found
+ *
+ * @example
+ * ```typescript
+ * const status = await getJobExecution(instance, 'my-job', 'exec-123');
+ * console.log(`Status: ${status.execution_status}`);
+ * ```
+ */
+export async function getJobExecution(
+  instance: B2CInstance,
+  jobId: string,
+  executionId: string,
+): Promise<JobExecution> {
+  const {data, error} = await instance.ocapi.GET('/jobs/{job_id}/executions/{id}', {
+    params: {path: {job_id: jobId, id: executionId}},
+  });
+
+  if (error || !data) {
+    const message = error?.fault?.message ?? `Failed to get job execution ${executionId}`;
+    throw new Error(message);
+  }
+
+  return data;
 }
 
 /**
- * Gets the status of a job execution.
+ * Waits for a job execution to complete.
+ *
+ * Polls the job status until it reaches a terminal state (finished or aborted).
+ *
+ * @param instance - B2C instance
+ * @param jobId - Job ID
+ * @param executionId - Execution ID to wait for
+ * @param options - Wait options
+ * @returns Final execution status
+ * @throws Error if job fails (status ERROR or aborted)
+ * @throws Error if timeout is exceeded
+ *
+ * @example
+ * ```typescript
+ * // Simple wait
+ * const result = await waitForJob(instance, 'my-job', 'exec-123');
+ *
+ * // With progress callback
+ * const result = await waitForJob(instance, 'my-job', 'exec-123', {
+ *   onProgress: (exec, elapsed) => {
+ *     console.log(`Status: ${exec.execution_status}, elapsed: ${elapsed}ms`);
+ *   }
+ * });
+ * ```
  */
-export async function getJobStatus(
+export async function waitForJob(
   instance: B2CInstance,
   jobId: string,
   executionId: string,
-): Promise<JobExecutionResult> {
-  console.log(`Getting status of job ${jobId} execution ${executionId}...`);
+  options: WaitForJobOptions = {},
+): Promise<JobExecution> {
+  const logger = getLogger();
+  const {pollInterval = 3000, timeout, onProgress} = options;
+
+  const startTime =
Date.now(); + let ticks = 0; + + while (true) { + await sleep(pollInterval); - // TODO: Implement actual status check via OCAPI - // GET /s/-/dw/data/v21_10/jobs/{job_id}/executions/{execution_id} + const elapsed = Date.now() - startTime; + if (timeout && elapsed > timeout) { + throw new Error(`Timeout waiting for job ${jobId} execution ${executionId}`); + } + + const execution = await getJobExecution(instance, jobId, executionId); + + // Call progress callback + if (onProgress) { + onProgress(execution, elapsed); + } + + // Check for terminal states + if (execution.execution_status === 'aborted' || execution.exit_status?.code === 'ERROR') { + logger.debug({execution}, `Job ${jobId} failed`); + throw new JobExecutionError(`Job ${jobId} failed`, execution); + } + + if (execution.execution_status === 'finished') { + const durationSec = (execution.duration ?? 0) / 1000; + logger.debug( + {executionId, status: execution.exit_status?.code, duration: durationSec}, + `Job ${jobId} finished. Status: ${execution.exit_status?.code} (duration: ${durationSec}s)`, + ); + return execution; + } + + // Log periodic updates + if (ticks % 5 === 0) { + logger.debug( + {executionId, status: execution.execution_status, elapsed: elapsed / 1000}, + `Waiting for job ${jobId} to finish (${(elapsed / 1000).toFixed(0)}s elapsed)...`, + ); + } + + ticks++; + } +} + +/** + * Error thrown when a job execution fails. + */ +export class JobExecutionError extends Error { + constructor( + message: string, + public readonly execution: JobExecution, + ) { + super(message); + this.name = 'JobExecutionError'; + } +} + +/** + * Extracts the error message from a failed job execution. + * + * Looks for the last step execution with exit_status code 'ERROR' and returns its message. 
+ * + * @param execution - The job execution to extract the error message from + * @returns The error message if found, undefined otherwise + * + * @example + * ```typescript + * const errorMsg = getJobErrorMessage(execution); + * if (errorMsg) { + * console.error(`Job failed: ${errorMsg}`); + * } + * ``` + */ +export function getJobErrorMessage(execution: JobExecution): string | undefined { + if (!execution.step_executions || execution.step_executions.length === 0) { + return undefined; + } + + // Find the last step with ERROR status that has a message + for (let i = execution.step_executions.length - 1; i >= 0; i--) { + const step = execution.step_executions[i]; + if (step.exit_status?.code === 'ERROR' && step.exit_status?.message) { + return step.exit_status.message; + } + } + + return undefined; +} + +/** + * Search options for job executions. + */ +export interface SearchJobExecutionsOptions { + /** Filter by job ID */ + jobId?: string; + /** Filter by status (RUNNING, PENDING, OK, ERROR, etc.) */ + status?: string | string[]; + /** Maximum results to return (default: 25) */ + count?: number; + /** Starting index for pagination */ + start?: number; + /** Sort by field (default: start_time desc) */ + sortBy?: string; + /** Sort order */ + sortOrder?: 'asc' | 'desc'; +} + +/** + * Search results for job executions. + */ +export interface JobExecutionSearchResult { + /** Total matching executions */ + total: number; + /** Number of results returned */ + count: number; + /** Starting index */ + start: number; + /** Job executions */ + hits: JobExecution[]; +} + +/** + * Searches for job executions. 
+ *
+ * @param instance - B2C instance
+ * @param options - Search options
+ * @returns Search results
+ *
+ * @example
+ * ```typescript
+ * // Search for all running jobs
+ * const results = await searchJobExecutions(instance, {
+ *   status: ['RUNNING', 'PENDING']
+ * });
+ *
+ * // Search for a specific job's recent executions
+ * const results = await searchJobExecutions(instance, {
+ *   jobId: 'my-job',
+ *   count: 10
+ * });
+ * ```
+ */
+export async function searchJobExecutions(
+  instance: B2CInstance,
+  options: SearchJobExecutionsOptions = {},
+): Promise<JobExecutionSearchResult> {
+  const {jobId, status, count = 25, start = 0, sortBy = 'start_time', sortOrder = 'desc'} = options;
+
+  // Build query
+  const queries: unknown[] = [];
+
+  if (jobId) {
+    queries.push({
+      term_query: {fields: ['job_id'], operator: 'is', values: [jobId]},
+    });
+  }
+
+  if (status) {
+    const statusValues = Array.isArray(status) ? status : [status];
+    queries.push({
+      term_query: {fields: ['status'], operator: 'one_of', values: statusValues},
+    });
+  }
+
+  // Build the query object
+  let query: unknown;
+  if (queries.length === 0) {
+    query = {match_all_query: {}};
+  } else if (queries.length === 1) {
+    query = queries[0];
+  } else {
+    query = {bool_query: {must: queries}};
+  }
+
+  const {data, error} = await instance.ocapi.POST('/job_execution_search', {
+    body: {
+      query,
+      count,
+      start,
+      sorts: [{field: sortBy, sort_order: sortOrder}],
+    } as unknown as components['schemas']['search_request'],
+  });
+
+  if (error || !data) {
+    const message = error?.fault?.message ?? 'Failed to search job executions';
+    throw new Error(message);
+  }
 
   return {
-    jobId,
-    status: 'completed',
-    startTime: new Date(),
-    endTime: new Date(),
+    total: data.total ?? 0,
+    count: data.count ?? 0,
+    start: data.start ?? 0,
+    hits: (data.hits ?? []) as JobExecution[],
   };
 }
+
+/**
+ * Finds a currently running job execution.
+ *
+ * @param instance - B2C instance
+ * @param jobId - Job ID to search for
+ * @returns Running execution or undefined if none found
+ */
+export async function findRunningJobExecution(instance: B2CInstance, jobId: string): Promise<JobExecution | undefined> {
+  const results = await searchJobExecutions(instance, {
+    jobId,
+    status: ['RUNNING', 'PENDING'],
+    sortBy: 'start_time',
+    sortOrder: 'asc',
+    count: 1,
+  });
+
+  return results.hits[0];
+}
+
+/**
+ * Gets the log file content for a job execution.
+ *
+ * @param instance - B2C instance
+ * @param execution - Job execution with log file path
+ * @returns Log file content as string
+ * @throws Error if log file doesn't exist or cannot be retrieved
+ *
+ * @example
+ * ```typescript
+ * try {
+ *   const result = await waitForJob(instance, 'my-job', 'exec-123');
+ * } catch (error) {
+ *   if (error instanceof JobExecutionError && error.execution.is_log_file_existing) {
+ *     const log = await getJobLog(instance, error.execution);
+ *     console.error('Job log:', log);
+ *   }
+ * }
+ * ```
+ */
+export async function getJobLog(instance: B2CInstance, execution: JobExecution): Promise<string> {
+  if (!execution.log_file_path) {
+    throw new Error('No log file path available');
+  }
+
+  if (!execution.is_log_file_existing) {
+    throw new Error('Log file does not exist');
+  }
+
+  // log_file_path from OCAPI is "/Sites/LOGS/jobs/..."
+  // WebDAV client base is /webdav/Sites, so strip the leading /Sites/
+  const logPath = execution.log_file_path.replace(/^\/Sites\//, '');
+
+  const content = await instance.webdav.get(logPath);
+  return new TextDecoder().decode(content);
+}
+
+/**
+ * Helper function for sleeping.
+ */
+function sleep(ms: number): Promise<void> {
+  return new Promise((resolve) => setTimeout(resolve, ms));
+}
diff --git a/packages/b2c-tooling/src/operations/jobs/site-archive.ts b/packages/b2c-tooling/src/operations/jobs/site-archive.ts
new file mode 100644
index 00000000..6a57f1e8
--- /dev/null
+++ b/packages/b2c-tooling/src/operations/jobs/site-archive.ts
@@ -0,0 +1,567 @@
+/**
+ * Site archive import/export operations for B2C Commerce.
+ *
+ * Provides functions for importing and exporting site archives using
+ * the sfcc-site-archive-import and sfcc-site-archive-export system jobs.
+ */
+import * as fs from 'node:fs';
+import * as path from 'node:path';
+import JSZip from 'jszip';
+import {B2CInstance} from '../../instance/index.js';
+import {getLogger} from '../../logging/logger.js';
+import {
+  executeJob,
+  waitForJob,
+  JobExecutionError,
+  getJobLog,
+  type JobExecution,
+  type WaitForJobOptions,
+} from './run.js';
+
+const IMPORT_JOB_ID = 'sfcc-site-archive-import';
+const EXPORT_JOB_ID = 'sfcc-site-archive-export';
+
+/**
+ * Options for site archive import.
+ */
+export interface SiteArchiveImportOptions {
+  /** Keep archive on instance after import (default: false) */
+  keepArchive?: boolean;
+  /** Wait options for job completion */
+  waitOptions?: WaitForJobOptions;
+}
+
+/**
+ * Result of a site archive import.
+ */
+export interface SiteArchiveImportResult {
+  /** Job execution details */
+  execution: JobExecution;
+  /** Archive filename on instance */
+  archiveFilename: string;
+  /** Whether archive was kept on instance */
+  archiveKept: boolean;
+}
+
+/**
+ * Imports a site archive to a B2C Commerce instance.
+ *
+ * Supports importing from:
+ * - A local directory (will be zipped automatically)
+ * - A local zip file
+ * - A Buffer containing zip data
+ * - A filename already on the instance (in Impex/src/instance/)
+ *
+ * @param instance - B2C instance to import to
+ * @param target - Source to import (directory path, zip file path, Buffer, or remote filename)
+ * @param options - Import options
+ * @returns Import result with execution details
+ * @throws JobExecutionError if import job fails
+ *
+ * @example
+ * ```typescript
+ * // Import from a local directory
+ * const result = await siteArchiveImport(instance, './my-site-data');
+ *
+ * // Import from a zip file
+ * const result = await siteArchiveImport(instance, './export.zip');
+ *
+ * // Import from a buffer
+ * const zipBuffer = await fs.promises.readFile('./export.zip');
+ * const result = await siteArchiveImport(instance, zipBuffer, {
+ *   archiveName: 'my-import'
+ * });
+ *
+ * // Import from existing file on instance
+ * const result = await siteArchiveImport(instance, {
+ *   remoteFilename: 'existing-archive.zip'
+ * });
+ * ```
+ */
+export async function siteArchiveImport(
+  instance: B2CInstance,
+  target: string | Buffer | {remoteFilename: string; archiveName?: string},
+  options: SiteArchiveImportOptions & {archiveName?: string} = {},
+): Promise<SiteArchiveImportResult> {
+  const logger = getLogger();
+  const {keepArchive = false, waitOptions, archiveName} = options;
+
+  let zipFilename: string;
+  let needsUpload = true;
+  let archiveContent: Buffer | NodeJS.ReadableStream | undefined;
+
+  // Handle different input types
+  if (typeof target === 'object' && 'remoteFilename' in target) {
+    // Remote filename - no upload needed
+    zipFilename = target.remoteFilename;
+    needsUpload = false;
+  } else if (Buffer.isBuffer(target)) {
+    // Buffer - use provided archive name
+    if (!archiveName) {
+      throw new Error('archiveName is required when importing from a Buffer');
+    }
+    zipFilename = archiveName.endsWith('.zip') ?
archiveName : `${archiveName}.zip`;
+    archiveContent = target;
+  } else {
+    // File path - check if directory or zip file
+    const targetPath = target as string;
+
+    if (!fs.existsSync(targetPath)) {
+      throw new Error(`Target not found: ${targetPath}`);
+    }
+
+    const stat = await fs.promises.stat(targetPath);
+
+    if (stat.isFile()) {
+      // Existing zip file
+      archiveContent = await fs.promises.readFile(targetPath);
+      zipFilename = path.basename(targetPath);
+    } else if (stat.isDirectory()) {
+      // Directory - create zip archive
+      const timestamp = Date.now();
+      const archiveDirName = archiveName || `import-${timestamp}`;
+      zipFilename = `${archiveDirName}.zip`;
+
+      logger.debug({directory: targetPath}, `Creating archive from directory: ${targetPath}`);
+      archiveContent = await createArchiveFromDirectory(targetPath, archiveDirName);
+    } else {
+      throw new Error(`Target must be a file or directory: ${targetPath}`);
+    }
+  }
+
+  // Upload archive if needed
+  const uploadPath = `Impex/src/instance/${zipFilename}`;
+
+  if (needsUpload && archiveContent) {
+    logger.debug({path: uploadPath}, `Uploading archive to ${uploadPath}`);
+    await instance.webdav.put(uploadPath, archiveContent as Buffer, 'application/zip');
+    logger.debug(`Archive uploaded: ${uploadPath}`);
+  }
+
+  // Execute the import job
+  logger.debug(`Executing ${IMPORT_JOB_ID} job`);
+
+  let execution: JobExecution;
+
+  // Different SFCC versions accept different request formats - try the
+  // file_name form first, then fall back to the parameters form.
+  try {
+    const {data, error} = await instance.ocapi.POST('/jobs/{job_id}/executions', {
+      params: {path: {job_id: IMPORT_JOB_ID}},
+      body: {file_name: zipFilename} as unknown as string,
+    });
+
+    if (error || !data) {
+      throw new Error(error?.fault?.message ?? 'Failed to execute import job');
+    }
+
+    execution = data;
+  } catch {
+    // Try with parameters format for internal users
+    logger.warn('Retrying with parameters format for internal users');
+
+    const {data, error} = await instance.ocapi.POST('/jobs/{job_id}/executions', {
+      params: {path: {job_id: IMPORT_JOB_ID}},
+      body: {
+        parameters: [{name: 'ImportFile', value: zipFilename}],
+      } as unknown as string,
+    });
+
+    if (error || !data) {
+      throw new Error(error?.fault?.message ?? 'Failed to execute import job');
+    }
+
+    execution = data;
+  }
+
+  logger.debug({executionId: execution.id}, `Import job started: ${execution.id}`);
+
+  // Wait for completion
+  try {
+    execution = await waitForJob(instance, IMPORT_JOB_ID, execution.id!, waitOptions);
+  } catch (error) {
+    if (error instanceof JobExecutionError) {
+      // Try to get log file
+      try {
+        const log = await getJobLog(instance, error.execution);
+        logger.error({logFile: error.execution.log_file_path}, `Job log:\n${log}`);
+      } catch {
+        logger.error('Could not retrieve job log');
+      }
+    }
+    throw error;
+  }
+
+  // Clean up archive if not keeping
+  if (!keepArchive && needsUpload) {
+    await instance.webdav.delete(uploadPath);
+    logger.debug(`Archive deleted: ${uploadPath}`);
+  }
+
+  return {
+    execution,
+    archiveFilename: zipFilename,
+    archiveKept: keepArchive,
+  };
+}
+
+/**
+ * Creates a zip archive from a directory.
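+ *
+ * Illustrative sketch (the directory path and archive name are hypothetical);
+ * the directory contents end up nested under a top-level folder named after
+ * the archive:
+ *
+ * @example
+ * ```typescript
+ * const buffer = await createArchiveFromDirectory('./site-data', 'my-import');
+ * // entries in the resulting zip all live under "my-import/"
+ * ```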
+ */
+async function createArchiveFromDirectory(dirPath: string, archiveDirName: string): Promise<Buffer> {
+  const zip = new JSZip();
+  const rootFolder = zip.folder(archiveDirName)!;
+
+  await addDirectoryToZip(rootFolder, dirPath);
+
+  return zip.generateAsync({
+    type: 'nodebuffer',
+    compression: 'DEFLATE',
+    compressionOptions: {level: 9},
+  });
+}
+
+/**
+ * Recursively adds directory contents to a JSZip folder.
+ */
+async function addDirectoryToZip(zipFolder: JSZip, dirPath: string): Promise<void> {
+  const entries = await fs.promises.readdir(dirPath, {withFileTypes: true});
+
+  for (const entry of entries) {
+    const fullPath = path.join(dirPath, entry.name);
+
+    if (entry.isDirectory()) {
+      const subFolder = zipFolder.folder(entry.name)!;
+      await addDirectoryToZip(subFolder, fullPath);
+    } else if (entry.isFile()) {
+      const content = await fs.promises.readFile(fullPath);
+      zipFolder.file(entry.name, content);
+    }
+  }
+}
+
+/**
+ * Configuration for sites in export.
+ */
+export interface ExportSitesConfiguration {
+  ab_tests?: boolean;
+  active_data_feeds?: boolean;
+  all?: boolean;
+  cache_settings?: boolean;
+  campaigns_and_promotions?: boolean;
+  content?: boolean;
+  coupons?: boolean;
+  custom_objects?: boolean;
+  customer_cdn_settings?: boolean;
+  customer_groups?: boolean;
+  distributed_commerce_extensions?: boolean;
+  dynamic_file_resources?: boolean;
+  gift_certificates?: boolean;
+  ocapi_settings?: boolean;
+  payment_methods?: boolean;
+  payment_processors?: boolean;
+  redirect_urls?: boolean;
+  search_settings?: boolean;
+  shipping?: boolean;
+  site_descriptor?: boolean;
+  site_preferences?: boolean;
+  sitemap_settings?: boolean;
+  slots?: boolean;
+  sorting_rules?: boolean;
+  source_codes?: boolean;
+  static_dynamic_alias_mappings?: boolean;
+  stores?: boolean;
+  tax?: boolean;
+  url_rules?: boolean;
+}
+
+/**
+ * Configuration for global data in export.
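+ *
+ * For illustration, a configuration that selects only system metadata and
+ * job schedules:
+ *
+ * @example
+ * ```typescript
+ * const globalData: ExportGlobalDataConfiguration = {
+ *   meta_data: true,
+ *   job_schedules: true,
+ * };
+ * ```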
+ */
+export interface ExportGlobalDataConfiguration {
+  access_roles?: boolean;
+  all?: boolean;
+  csc_settings?: boolean;
+  csrf_whitelists?: boolean;
+  custom_preference_groups?: boolean;
+  custom_quota_settings?: boolean;
+  custom_types?: boolean;
+  geolocations?: boolean;
+  global_custom_objects?: boolean;
+  job_schedules?: boolean;
+  job_schedules_deprecated?: boolean;
+  locales?: boolean;
+  meta_data?: boolean;
+  oauth_providers?: boolean;
+  ocapi_settings?: boolean;
+  page_meta_tags?: boolean;
+  preferences?: boolean;
+  price_adjustment_limits?: boolean;
+  services?: boolean;
+  sorting_rules?: boolean;
+  static_resources?: boolean;
+  system_type_definitions?: boolean;
+  users?: boolean;
+  webdav_client_permissions?: boolean;
+}
+
+/**
+ * Data units configuration for export.
+ */
+export interface ExportDataUnitsConfiguration {
+  /** Catalog static resources to export (catalog_id: true) */
+  catalog_static_resources?: Record<string, boolean>;
+  /** Catalogs to export (catalog_id: true) */
+  catalogs?: Record<string, boolean>;
+  /** Customer lists to export (list_id: true) */
+  customer_lists?: Record<string, boolean>;
+  /** Inventory lists to export (list_id: true) */
+  inventory_lists?: Record<string, boolean>;
+  /** Library static resources to export (library_id: true) */
+  library_static_resources?: Record<string, boolean>;
+  /** Libraries to export (library_id: true) */
+  libraries?: Record<string, boolean>;
+  /** Price books to export (pricebook_id: true) */
+  price_books?: Record<string, boolean>;
+  /** Sites to export (site_id: ExportSitesConfiguration) */
+  sites?: Record<string, Partial<ExportSitesConfiguration> | boolean>;
+  /** Global data to export */
+  global_data?: Partial<ExportGlobalDataConfiguration>;
+}
+
+/**
+ * Options for site archive export.
+ */
+export interface SiteArchiveExportOptions {
+  /** Keep archive on instance after download (default: false) */
+  keepArchive?: boolean;
+  /** Wait options for job completion */
+  waitOptions?: WaitForJobOptions;
+}
+
+/**
+ * Result of a site archive export.
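+ *
+ * A hypothetical sketch of consuming the result; assumes the Node `fs`
+ * module imported in this file:
+ *
+ * @example
+ * ```typescript
+ * const result = await siteArchiveExport(instance, { global_data: { meta_data: true } });
+ * if (result.data) {
+ *   await fs.promises.writeFile(result.archiveFilename, result.data);
+ * }
+ * ```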
+ */
+export interface SiteArchiveExportResult {
+  /** Job execution details */
+  execution: JobExecution;
+  /** Archive filename on instance */
+  archiveFilename: string;
+  /** Archive content as buffer (if downloaded) */
+  data?: Buffer;
+  /** Whether archive was kept on instance */
+  archiveKept: boolean;
+}
+
+/**
+ * Exports a site archive from a B2C Commerce instance.
+ *
+ * @param instance - B2C instance to export from
+ * @param dataUnits - Data units configuration specifying what to export
+ * @param options - Export options
+ * @returns Export result with archive data
+ * @throws JobExecutionError if export job fails
+ *
+ * @example
+ * ```typescript
+ * // Export global meta data
+ * const result = await siteArchiveExport(instance, {
+ *   global_data: { meta_data: true }
+ * });
+ *
+ * // Export a site's content
+ * const result = await siteArchiveExport(instance, {
+ *   sites: {
+ *     'RefArch': { content: true, site_preferences: true }
+ *   }
+ * });
+ *
+ * // Export catalogs
+ * const result = await siteArchiveExport(instance, {
+ *   catalogs: { 'storefront-catalog': true }
+ * });
+ * ```
+ */
+export async function siteArchiveExport(
+  instance: B2CInstance,
+  dataUnits: Partial<ExportDataUnitsConfiguration>,
+  options: SiteArchiveExportOptions = {},
+): Promise<SiteArchiveExportResult> {
+  const logger = getLogger();
+  const {keepArchive = false, waitOptions} = options;
+
+  // Generate archive filename
+  const timestamp = new Date().toISOString().replace(/[:.-]+/g, '');
+  const archiveDirName = `${timestamp}_export`;
+  const zipFilename = `${archiveDirName}.zip`;
+  const webdavPath = `Impex/src/instance/${zipFilename}`;
+
+  logger.debug(`Executing ${EXPORT_JOB_ID} job`);
+  logger.debug({dataUnits}, 'Export data units');
+
+  let execution: JobExecution;
+
+  // Execute export job - try export_file format first
+  try {
+    const {data, error} = await instance.ocapi.POST('/jobs/{job_id}/executions', {
+      params: {path: {job_id: EXPORT_JOB_ID}},
+      body: {
+        export_file: zipFilename,
+        data_units: dataUnits,
+      }
as unknown as string, + }); + + if (error || !data) { + throw new Error(error?.fault?.message ?? 'Failed to execute export job'); + } + + execution = data; + } catch { + // Try parameters format for internal users + logger.warn('Retrying with parameters format for internal users'); + + const {data, error} = await instance.ocapi.POST('/jobs/{job_id}/executions', { + params: {path: {job_id: EXPORT_JOB_ID}}, + body: { + parameters: [ + {name: 'ExportFile', value: zipFilename}, + {name: 'DataUnits', value: JSON.stringify(dataUnits)}, + ], + } as unknown as string, + }); + + if (error || !data) { + throw new Error(error?.fault?.message ?? 'Failed to execute export job'); + } + + execution = data; + } + + logger.debug({executionId: execution.id}, `Export job started: ${execution.id}`); + + // Wait for completion + try { + execution = await waitForJob(instance, EXPORT_JOB_ID, execution.id!, waitOptions); + } catch (error) { + if (error instanceof JobExecutionError) { + // Try to get log file + try { + const log = await getJobLog(instance, error.execution); + logger.error({logFile: error.execution.log_file_path}, `Job log:\n${log}`); + } catch { + logger.error('Could not retrieve job log'); + } + } + throw error; + } + + // Download archive + logger.debug(`Downloading archive: ${webdavPath}`); + const archiveData = await instance.webdav.get(webdavPath); + + // Clean up if not keeping + if (!keepArchive) { + await instance.webdav.delete(webdavPath); + logger.debug(`Archive deleted: ${webdavPath}`); + } + + return { + execution, + archiveFilename: zipFilename, + data: Buffer.from(archiveData), + archiveKept: keepArchive, + }; +} + +/** + * Exports a site archive and saves it to a local path. 
+ *
+ * @param instance - B2C instance to export from
+ * @param dataUnits - Data units configuration
+ * @param outputPath - Local path to save the archive
+ * @param options - Export options
+ * @returns Export result
+ *
+ * @example
+ * ```typescript
+ * // Export and save to a directory (extracts zip)
+ * await siteArchiveExportToPath(instance, { global_data: { meta_data: true } }, './exports');
+ *
+ * // Export and save as zip
+ * await siteArchiveExportToPath(instance, { global_data: { meta_data: true } }, './exports/archive.zip');
+ * ```
+ */
+export async function siteArchiveExportToPath(
+  instance: B2CInstance,
+  dataUnits: Partial<ExportDataUnitsConfiguration>,
+  outputPath: string,
+  options: SiteArchiveExportOptions & {extractZip?: boolean} = {},
+): Promise<SiteArchiveExportResult & {localPath: string}> {
+  const logger = getLogger();
+  const {extractZip = true, ...exportOptions} = options;
+
+  const result = await siteArchiveExport(instance, dataUnits, exportOptions);
+
+  if (!result.data) {
+    throw new Error('No archive data returned');
+  }
+
+  // Determine output handling
+  const isZipPath = outputPath.endsWith('.zip');
+
+  if (isZipPath || !extractZip) {
+    // Save as zip file
+    const zipPath = isZipPath ?
outputPath : path.join(outputPath, result.archiveFilename); + + // Ensure directory exists + await fs.promises.mkdir(path.dirname(zipPath), {recursive: true}); + await fs.promises.writeFile(zipPath, result.data); + + logger.debug(`Archive saved to: ${zipPath}`); + + return { + ...result, + localPath: zipPath, + }; + } else { + // Extract to directory + await fs.promises.mkdir(outputPath, {recursive: true}); + + const zip = await JSZip.loadAsync(result.data); + + for (const [relativePath, zipEntry] of Object.entries(zip.files)) { + const fullPath = path.join(outputPath, relativePath); + + if (zipEntry.dir) { + await fs.promises.mkdir(fullPath, {recursive: true}); + } else { + // Ensure parent directory exists + await fs.promises.mkdir(path.dirname(fullPath), {recursive: true}); + const content = await zipEntry.async('nodebuffer'); + await fs.promises.writeFile(fullPath, content); + } + } + + logger.debug(`Archive extracted to: ${outputPath}`); + + return { + ...result, + localPath: outputPath, + }; + } +} diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 04fb19cb..e8ac89c6 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -105,9 +105,6 @@ importers: packages/b2c-tooling: dependencies: - archiver: - specifier: ^7.0.1 - version: 7.0.1 chokidar: specifier: ^5.0.0 version: 5.0.0 @@ -117,6 +114,9 @@ importers: i18next: specifier: ^25.6.3 version: 25.6.3(typescript@5.9.3) + jszip: + specifier: ^3.10.1 + version: 3.10.1 open: specifier: ^11.0.0 version: 11.0.0 @@ -142,9 +142,6 @@ importers: '@salesforce/dev-config': specifier: ^4.3.2 version: 4.3.2 - '@types/archiver': - specifier: ^7.0.0 - version: 7.0.0 '@types/node': specifier: ^18.19.130 version: 18.19.130 @@ -1031,10 +1028,6 @@ packages: resolution: {integrity: sha512-ZT55BDLV0yv0RBm2czMiZ+SqCGO7AvmOM3G/w2xhVPH+te0aKgFjmBvGlL1dH+ql2tgGO3MVrbb3jCKyvpgnxA==} engines: {node: 20 || >=22} - '@isaacs/cliui@8.0.2': - resolution: {integrity: 
sha512-O8jcjabXaleOG9DQ0+ARXWZBTfnP4WNAqzuiJK7ll44AmxGKv/J2M4TPjxjY3znBCfvBXFzucm1twdyFybFqEA==} - engines: {node: '>=12'} - '@jridgewell/sourcemap-codec@1.5.5': resolution: {integrity: sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==} @@ -1097,10 +1090,6 @@ packages: '@pinojs/redact@0.4.0': resolution: {integrity: sha512-k2ENnmBugE/rzQfEcdWHcCY+/FM3VLzH9cYEsbdsoqrvzAKRhUZeRNhAZvB8OitQJ1TBed3yqWtdjzS6wJKBwg==} - '@pkgjs/parseargs@0.11.0': - resolution: {integrity: sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==} - engines: {node: '>=14'} - '@pkgr/core@0.2.9': resolution: {integrity: sha512-QNqXyfVS2wm9hweSYD2O7F0G06uurj9kZ96TRQE5Y9hU7+tgdZwIkbAKc5Ocy1HxEY2kuDQa6cQ1WRs/O5LFKA==} engines: {node: ^12.20.0 || ^14.18.0 || >=16.0.0} @@ -1529,9 +1518,6 @@ packages: '@tybys/wasm-util@0.10.1': resolution: {integrity: sha512-9tTaPJLSiejZKx+Bmog4uSubteqTvFrVrURwkmHixBo0G4seD0zUxp98E1DzUBJxLQ3NPwXrGKDiVjwx/DpPsg==} - '@types/archiver@7.0.0': - resolution: {integrity: sha512-/3vwGwx9n+mCQdYZ2IKGGHEFL30I96UgBlk8EtRDDFQ9uxM1l4O5Ci6r00EMAkiDaTqD9DQ6nVrWRICnBPtzzg==} - '@types/chai@4.3.20': resolution: {integrity: sha512-/pC9HAB5I/xMlc5FP77qjCnI16ChlJfW0tGa0IUcFn38VJrTV6DeZ60NU5KZBtaOZqjdpwTWohz5HU1RrhiYxQ==} @@ -1577,9 +1563,6 @@ packages: '@types/normalize-package-data@2.4.4': resolution: {integrity: sha512-37i+OaWTh9qeK4LSHPsyRC7NahnGotNuZvjLSgcPzblpHB3rrCJxAOgI5gCdKm7coonsaX1Of0ILiTcnZjbfxA==} - '@types/readdir-glob@1.1.5': - resolution: {integrity: sha512-raiuEPUYqXu+nvtY2Pe8s8FEmZ3x5yAH4VkLdihcPdalvsHltomrRC9BzuStrJ9yk06470hS0Crw0f1pXqD+Hg==} - '@types/unist@3.0.3': resolution: {integrity: sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==} @@ -1849,10 +1832,6 @@ packages: '@vueuse/shared@12.8.2': resolution: {integrity: sha512-dznP38YzxZoNloI0qpEfpkms8knDtaoQ6Y/sfS0L7Yki4zh40LFHEhur0odJC6xTHG5dxWVPiUWBXn+wCG2s5w==} - 
abort-controller@3.0.0: - resolution: {integrity: sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==} - engines: {node: '>=6.5'} - acorn-jsx@5.3.2: resolution: {integrity: sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==} peerDependencies: @@ -1906,14 +1885,6 @@ packages: resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==} engines: {node: '>= 8'} - archiver-utils@5.0.2: - resolution: {integrity: sha512-wuLJMmIBQYCsGZgYLTy5FIB2pF6Lfb6cXMSF8Qywwk3t20zWnAi7zLcQFdKQmIB8wyZpY5ER38x08GbwtR2cLA==} - engines: {node: '>= 14'} - - archiver@7.0.1: - resolution: {integrity: sha512-ZcbTaIqJOfCc03QwD468Unz/5Ir8ATtvAHsK+FdXbDIbGfihqh9mrvdcYunQzqn4HrvWWaFyaxJhGZagaJJpPQ==} - engines: {node: '>= 14'} - are-docs-informative@0.0.2: resolution: {integrity: sha512-ixiS0nLNNG5jNQzgZJNoUpBKdo9yTYZMGJ+QgT2jmjR7G7+QHRCc4v6LQ3NgE7EBJq+o0ams3waJwkrlBom8Ig==} engines: {node: '>=14'} @@ -1966,28 +1937,9 @@ packages: resolution: {integrity: sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ==} engines: {node: '>= 0.4'} - b4a@1.7.3: - resolution: {integrity: sha512-5Q2mfq2WfGuFp3uS//0s6baOJLMoVduPYVeNmDYxu5OUA1/cBfvr2RIS7vi62LdNj/urk1hfmj867I3qt6uZ7Q==} - peerDependencies: - react-native-b4a: '*' - peerDependenciesMeta: - react-native-b4a: - optional: true - balanced-match@1.0.2: resolution: {integrity: sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==} - bare-events@2.8.2: - resolution: {integrity: sha512-riJjyv1/mHLIPX4RwiK+oW9/4c3TEUeORHKefKAKnZ5kyslbN+HXowtbaVEqt4IMUB7OXlfixcs6gsFeo/jhiQ==} - peerDependencies: - bare-abort-controller: '*' - peerDependenciesMeta: - bare-abort-controller: - optional: true - - base64-js@1.5.1: - resolution: {integrity: sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==} - 
baseline-browser-mapping@2.8.26: resolution: {integrity: sha512-73lC1ugzwoaWCLJ1LvOgrR5xsMLTqSKIEoMHVtL9E/HNk0PXtTM76ZIm84856/SF7Nv8mPZxKoBsgpm0tR1u1Q==} hasBin: true @@ -2020,13 +1972,6 @@ packages: engines: {node: ^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7} hasBin: true - buffer-crc32@1.0.0: - resolution: {integrity: sha512-Db1SbgBS/fg/392AblrMJk97KggmvYhr4pB5ZIMTWtaivCPMWLkmb7m21cJvpvgK+J3nsU2CmmixNBZx4vFj/w==} - engines: {node: '>=8.0.0'} - - buffer@6.0.3: - resolution: {integrity: sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==} - builtin-modules@3.3.0: resolution: {integrity: sha512-zhaCDicdLuWN5UbN5IMnFqNMhNfo919sH85y2/ea+5Yg9TsTkeZxpL+JLbp6cgYFS4sRLp3YV4S6yDuqVWHYOw==} engines: {node: '>=6'} @@ -2163,10 +2108,6 @@ packages: resolution: {integrity: sha512-buhp5kePrmda3vhc5B9t7pUQXAb2Tnd0qgpkIhPhkHXxJpiPJ11H0ZEU0oBpJ2QztSbzG/ZxMj/CHsYJqRHmyg==} engines: {node: '>= 12.0.0'} - compress-commons@6.0.2: - resolution: {integrity: sha512-6FqVXeETqWPoGcfzrXb37E50NP0LXT8kAMu5ooZayhWWdgEY4lBEEcbQNXtkuKQsGduxiIcI4gOTsxTmuq/bSg==} - engines: {node: '>= 14'} - concat-map@0.0.1: resolution: {integrity: sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==} @@ -2193,15 +2134,6 @@ packages: core-util-is@1.0.3: resolution: {integrity: sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==} - crc-32@1.2.2: - resolution: {integrity: sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==} - engines: {node: '>=0.8'} - hasBin: true - - crc32-stream@6.0.0: - resolution: {integrity: sha512-piICUB6ei4IlTv1+653yq5+KoqfBYmj9bw6LqXoOneTMDXk5nM1qt12mFW1caG3LlJXEKW1Bp0WggEmIfQB34g==} - engines: {node: '>= 14'} - cross-spawn@7.0.6: resolution: {integrity: sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==} engines: {node: '>= 8'} @@ -2310,9 +2242,6 @@ packages: 
resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==} engines: {node: '>= 0.4'} - eastasianwidth@0.2.0: - resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==} - ejs@3.1.10: resolution: {integrity: sha512-UeJmFfOrAQS8OJWPZ4qtgHyWExa088/MtK5UEyoJGFH67cDEXkZSviOiKRCZ4Xij0zxI3JECgYs3oKx+AizQBA==} engines: {node: '>=0.10.0'} @@ -2330,9 +2259,6 @@ packages: emoji-regex@8.0.0: resolution: {integrity: sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==} - emoji-regex@9.2.2: - resolution: {integrity: sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==} - end-of-stream@1.4.5: resolution: {integrity: sha512-ooEGc6HP26xXq/N+GCGOT0JKCLDGrq2bQUZrQ7gyrJiZANJ/8YDTxTpQBXGMn+WbIQXNVpyWymm7KYVICQnyOg==} @@ -2613,17 +2539,6 @@ packages: resolution: {integrity: sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==} engines: {node: '>=0.10.0'} - event-target-shim@5.0.1: - resolution: {integrity: sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==} - engines: {node: '>=6'} - - events-universal@1.0.1: - resolution: {integrity: sha512-LUd5euvbMLpwOF8m6ivPCbhQeSiYVNb8Vs0fQ8QjXo0JTkEHpz8pxdQf0gStltaPpw0Cca8b39KxvK9cfKRiAw==} - - events@3.3.0: - resolution: {integrity: sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q==} - engines: {node: '>=0.8.x'} - fast-copy@3.0.2: resolution: {integrity: sha512-dl0O9Vhju8IrcLndv2eU4ldt1ftXMqqfgN4H1cpmGV7P6jeB9FwpN9a2c8DPGE1Ys88rNUJVYDHq73CGAGOPfQ==} @@ -2633,9 +2548,6 @@ packages: fast-diff@1.3.0: resolution: {integrity: sha512-VxPP4NqbUjj6MaAOafWeUn2cXWLcCtljklUtZf0Ind4XQ+QPtmA0b18zZy0jIQx+ExRVCR/ZQpBmik5lXshNsw==} - fast-fifo@1.3.2: - resolution: {integrity: 
sha512-/d9sfos4yxzpwkDkuN7k2SqFKtYNmCTzgfEpz82x34IM9/zc8KGxQoXg1liNC/izpRM/MBdt44Nmx41ZWqk+FQ==} - fast-glob@3.3.3: resolution: {integrity: sha512-7MptL8U0cqcFdzIzwOTHoilX9x5BrNqye7Z/LuC7kCMRio1EMSyqRK3BEAUD7sXRq4iT4AzTVuZdhgQ2TCvYLg==} engines: {node: '>=8.6.0'} @@ -2715,10 +2627,6 @@ packages: resolution: {integrity: sha512-dKx12eRCVIzqCxFGplyFKJMPvLEWgmNtUrpTiJIR5u97zEhRG8ySrtboPHZXx7daLxQVrl643cTzbab2tkQjxg==} engines: {node: '>= 0.4'} - foreground-child@3.3.1: - resolution: {integrity: sha512-gIXjKqtFuWEgzFRJA9WCQeSJLZDjgJUOMCMzxtvFq/37KojM1BFGufqsCy0r4qSQmYLsZYMeyRqzIWOMup03sw==} - engines: {node: '>=14'} - form-data-encoder@2.1.4: resolution: {integrity: sha512-yDYSgNMraqvnxiEXO4hi88+YZxaHC6QKzb5N84iRCTDeRO7ZALpir/lVmf/uXUhnwUr2O4HU8s/n6x+yNjQkHw==} engines: {node: '>= 14.17'} @@ -2801,10 +2709,6 @@ packages: resolution: {integrity: sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==} engines: {node: '>=10.13.0'} - glob@10.5.0: - resolution: {integrity: sha512-DfXN8DfhJ7NH3Oe7cFmu3NCu1wKbkReJ8TorzSAFbSKrlNaQSKfIzqYqVY8zlbs2NLBbWpRiU52GX2PbaBVNkg==} - hasBin: true - glob@13.0.0: resolution: {integrity: sha512-tvZgpqk6fz4BaNZ66ZsRaZnbHvP/jG3uKJvAZOwEVUL4RTA5nJeeLYfyN9/VA8NX/V3IBG+hkeuGpKjvELkVhA==} engines: {node: 20 || >=22} @@ -2941,9 +2845,6 @@ packages: resolution: {integrity: sha512-cf6L2Ds3h57VVmkZe+Pn+5APsT7FpqJtEhhieDCvrE2MK5Qk9MyffgQyuxQTm6BChfeZNtcOLHp9IcWRVcIcBQ==} engines: {node: '>=0.10.0'} - ieee754@1.2.1: - resolution: {integrity: sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==} - ignore@5.3.2: resolution: {integrity: sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g==} engines: {node: '>= 4'} @@ -2952,6 +2853,9 @@ packages: resolution: {integrity: sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==} engines: {node: '>= 4'} + immediate@3.0.6: + resolution: {integrity: 
sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==} + import-fresh@3.3.1: resolution: {integrity: sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==} engines: {node: '>=6'} @@ -3168,9 +3072,6 @@ packages: resolution: {integrity: sha512-LpB/54B+/2J5hqQ7imZHfdU31OlgQqx7ZicVlkm9kzg9/w8GKLEcFfJl/t7DCEDueOyBAD6zCCwTO6Fzs0NoEQ==} engines: {node: '>=16'} - jackspeak@3.4.3: - resolution: {integrity: sha512-OGlZQpz2yfahA/Rd1Y8Cd9SIEsqvXkLVoSw/cgwhnhFMDbsQFeZYoJJ7bIZBS9BcamUW96asq/npPWugM+RQBw==} - jake@10.9.4: resolution: {integrity: sha512-wpHYzhxiVQL+IV05BLE2Xn34zW1S223hvjtqk0+gsPrwd/8JNLXJgZZM/iPFsYc1xyphF+6M6EvdE5E9MBGkDA==} engines: {node: '>=10'} @@ -3229,17 +3130,19 @@ packages: jsonfile@4.0.0: resolution: {integrity: sha512-m6F1R3z8jjlf2imQHS2Qez5sjKWQzbuuhuJ/FKYFRZvPE3PuHcSMVZzfsLhGVOkfd20obL5SWEBew5ShlquNxg==} + jszip@3.10.1: + resolution: {integrity: sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==} + keyv@4.5.4: resolution: {integrity: sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==} - lazystream@1.0.1: - resolution: {integrity: sha512-b94GiNHQNy6JNTrt5w6zNyffMrNkXZb3KTkCZJb2V1xaEGCk093vkZ2jk3tpaeP33/OiXC+WvK9AxUebnf5nbw==} - engines: {node: '>= 0.6.3'} - levn@0.4.1: resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==} engines: {node: '>= 0.8.0'} + lie@3.3.0: + resolution: {integrity: sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ==} + lilconfig@3.1.3: resolution: {integrity: sha512-/vlFKAoH5Cgt3Ie+JLhRbwOsCQePABiU3tJ1egGvyQ+33R/vcwM2Zl2QR/LzjsBeItPt3oSVXapn+m4nQDvpzw==} engines: {node: '>=14'} @@ -3601,8 +3504,8 @@ packages: resolution: {integrity: sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==} engines: {node: '>=6'} 
- package-json-from-dist@1.0.1: - resolution: {integrity: sha512-UEZIS3/by4OC8vL3P2dTXRETpebLI2NiI5vIrjaD/5UtrkFX/tNbwjTSRAGC/+7CAo2pIcBaRgWmcBBHcsaCIw==} + pako@1.0.11: + resolution: {integrity: sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==} param-case@3.0.4: resolution: {integrity: sha512-RXlj7zCYokReqWpOPH9oYivUzLYZ5vAPIfEmCTNViosC78F8F0H9y7T7gG2M39ymgutxF5gcFEsyZQSph9Bp3A==} @@ -3654,10 +3557,6 @@ packages: path-parse@1.0.7: resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==} - path-scurry@1.11.1: - resolution: {integrity: sha512-Xa4Nw17FS9ApQFJ9umLiJS4orGjm7ZzwUrwamcGQuHSzDyth9boKDaycYdDcZDuqYATXw4HFXgaqWTctW/v1HA==} - engines: {node: '>=16 || 14 >=14.18'} - path-scurry@2.0.1: resolution: {integrity: sha512-oWyT4gICAu+kaA7QWk/jvCHWarMKNs6pXOGWKDTr7cw4IGcUbW+PeTfbaQiLGheFRpjo6O9J0PmyMfQPjH71oA==} engines: {node: 20 || >=22} @@ -3735,10 +3634,6 @@ packages: process-warning@5.0.0: resolution: {integrity: sha512-a39t9ApHNx2L4+HBnQKqxxHNs1r7KF+Intd8Q/g1bUh6q0WIp9voPXJ/x0j+ZL45KF1pJd9+q2jLIRMfvEshkA==} - process@0.11.10: - resolution: {integrity: sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A==} - engines: {node: '>= 0.6.0'} - property-information@7.1.0: resolution: {integrity: sha512-TwEZ+X+yCJmYfL7TPUOcvBZ4QfoT5YenQiJuX//0th53DE6w0xxLEtfK3iyryQFddXuvkIk51EEgrJQ0WJkOmQ==} @@ -3783,13 +3678,6 @@ packages: readable-stream@2.3.8: resolution: {integrity: sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==} - readable-stream@4.7.0: - resolution: {integrity: sha512-oIGGmcpTLwPga8Bn6/Z75SVaH1z5dUut2ibSyAMVhmUggWpmDn2dapB0n7f8nwaSiRtepAsfJyfXIO5DCVAODg==} - engines: {node: ^12.22.0 || ^14.17.0 || >=16.0.0} - - readdir-glob@1.1.3: - resolution: {integrity: sha512-v05I2k7xN8zXvPD9N+z/uhXPaj0sUFCe2rcWZIpBsqxfP7xXFQ0tipAd/wjj1YxWyWtUS5IDJpOG82JKt2EAVA==} - 
readdirp@3.6.0: resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==} engines: {node: '>=8.10.0'} @@ -3951,6 +3839,9 @@ packages: resolution: {integrity: sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw==} engines: {node: '>= 0.4'} + setimmediate@1.0.5: + resolution: {integrity: sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA==} + shebang-command@2.0.0: resolution: {integrity: sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==} engines: {node: '>=8'} @@ -4042,17 +3933,10 @@ packages: resolution: {integrity: sha512-eLoXW/DHyl62zxY4SCaIgnRhuMr6ri4juEYARS8E6sCEqzKpOiE521Ucofdx+KnDZl5xmvGYaaKCk5FEOxJCoQ==} engines: {node: '>= 0.4'} - streamx@2.23.0: - resolution: {integrity: sha512-kn+e44esVfn2Fa/O0CPFcex27fjIL6MkVae0Mm6q+E6f0hWv578YCERbv+4m02cjxvDsPKLnmxral/rR6lBMAg==} - string-width@4.2.3: resolution: {integrity: sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==} engines: {node: '>=8'} - string-width@5.1.2: - resolution: {integrity: sha512-HnLOCR3vjcY8beoNLtcjZ5/nxn2afmME6lhrDrebokqMap+XbeW8n9TXpPDOqdGK5qcI3oT0GKTW6wC7EMiVqA==} - engines: {node: '>=12'} - string-width@7.2.0: resolution: {integrity: sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ==} engines: {node: '>=18'} @@ -4072,9 +3956,6 @@ packages: string_decoder@1.1.1: resolution: {integrity: sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==} - string_decoder@1.3.0: - resolution: {integrity: sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==} - stringify-entities@4.0.4: resolution: {integrity: sha512-IwfBptatlO+QCJUo19AqvrPNqlVMpW9YEL2LIVY+Rpv2qsjCGxaDLNRgeGsQWJhfItebuJhsGSLjaBbNSQ+ieg==} @@ -4136,12 +4017,6 @@ packages: resolution: 
{integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==} engines: {node: '>=6'} - tar-stream@3.1.7: - resolution: {integrity: sha512-qJj60CXt7IU1Ffyc3NJMjh6EkuCFej46zUqJ4J7pqYlThyd9bO0XBTmcOIhSzZJVWfsLks0+nle/j538YAW9RQ==} - - text-decoder@1.2.3: - resolution: {integrity: sha512-3/o9z3X0X0fTupwsYvR03pJ/DjWuqqrfwBgTQzdWDiQSm9KitAyz/9WqsT2JQW7KV2m+bC2ol/zqpW37NHxLaA==} - thread-stream@3.1.0: resolution: {integrity: sha512-OqyPZ9u96VohAyMfJykzmivOrY2wfMSf3C5TtFJVgN+Hm6aj+voFhlK+kZEIv2FBh1X6Xp3DlnCOfEQ3B2J86A==} @@ -4423,10 +4298,6 @@ packages: resolution: {integrity: sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==} engines: {node: '>=10'} - wrap-ansi@8.1.0: - resolution: {integrity: sha512-si7QWI6zUMq56bESFvagtmzMdGOtoxfR+Sez11Mobfc7tm+VkUckk9bW2UeffTGVUbOksxmSw0AA2gs8g71NCQ==} - engines: {node: '>=12'} - wrap-ansi@9.0.2: resolution: {integrity: sha512-42AtmgqjV+X1VpdOfyTGOYRi0/zsoLqtXQckTmqTeybT+BDIbM/Guxo7x3pE2vtpr1ok6xRqM9OpBe+Jyoqyww==} engines: {node: '>=18'} @@ -4479,10 +4350,6 @@ packages: resolution: {integrity: sha512-U/PBtDf35ff0D8X8D0jfdzHYEPFxAI7jJlxZXwCSez5M3190m+QobIfh+sWDWSHMCWWJN2AWamkegn6vr6YBTw==} engines: {node: '>=18'} - zip-stream@6.0.1: - resolution: {integrity: sha512-zK7YHHz4ZXpW89AHXUPbQVGKI7uvkd3hzusTdotCg1UxyaVtg0zFJSTfW/Dq5f7OBBVnq6cZIaC8Ti4hb6dtCA==} - engines: {node: '>= 14'} - zwitch@2.0.4: resolution: {integrity: sha512-bXE4cR/kVZhKZX/RjPEflHaKVhUVl85noU3v6b8apfQEc1x4A+zBxjZ4lN8LqGd6WZ3dl98pY4o717VFmoPp+A==} @@ -5612,15 +5479,6 @@ snapshots: dependencies: '@isaacs/balanced-match': 4.0.1 - '@isaacs/cliui@8.0.2': - dependencies: - string-width: 5.1.2 - string-width-cjs: string-width@4.2.3 - strip-ansi: 7.1.2 - strip-ansi-cjs: strip-ansi@6.0.1 - wrap-ansi: 8.1.0 - wrap-ansi-cjs: wrap-ansi@7.0.0 - '@jridgewell/sourcemap-codec@1.5.5': {} '@napi-rs/wasm-runtime@0.2.12': @@ -5731,9 +5589,6 @@ snapshots: '@pinojs/redact@0.4.0': {} - 
'@pkgjs/parseargs@0.11.0': - optional: true - '@pkgr/core@0.2.9': {} '@pnpm/config.env-replace@1.1.0': {} @@ -6270,10 +6125,6 @@ snapshots: tslib: 2.8.1 optional: true - '@types/archiver@7.0.0': - dependencies: - '@types/readdir-glob': 1.1.5 - '@types/chai@4.3.20': {} '@types/estree@1.0.8': {} @@ -6317,10 +6168,6 @@ snapshots: '@types/normalize-package-data@2.4.4': {} - '@types/readdir-glob@1.1.5': - dependencies: - '@types/node': 18.19.130 - '@types/unist@3.0.3': {} '@types/web-bluetooth@0.0.21': {} @@ -6586,10 +6433,6 @@ snapshots: transitivePeerDependencies: - typescript - abort-controller@3.0.0: - dependencies: - event-target-shim: 5.0.1 - acorn-jsx@5.3.2(acorn@8.15.0): dependencies: acorn: 8.15.0 @@ -6645,29 +6488,6 @@ snapshots: normalize-path: 3.0.0 picomatch: 2.3.1 - archiver-utils@5.0.2: - dependencies: - glob: 10.5.0 - graceful-fs: 4.2.11 - is-stream: 2.0.1 - lazystream: 1.0.1 - lodash: 4.17.21 - normalize-path: 3.0.0 - readable-stream: 4.7.0 - - archiver@7.0.1: - dependencies: - archiver-utils: 5.0.2 - async: 3.2.6 - buffer-crc32: 1.0.0 - readable-stream: 4.7.0 - readdir-glob: 1.1.3 - tar-stream: 3.1.7 - zip-stream: 6.0.1 - transitivePeerDependencies: - - bare-abort-controller - - react-native-b4a - are-docs-informative@0.0.2: {} argparse@2.0.1: {} @@ -6738,14 +6558,8 @@ snapshots: dependencies: possible-typed-array-names: 1.1.0 - b4a@1.7.3: {} - balanced-match@1.0.2: {} - bare-events@2.8.2: {} - - base64-js@1.5.1: {} - baseline-browser-mapping@2.8.26: {} binary-extensions@2.3.0: {} @@ -6777,13 +6591,6 @@ snapshots: node-releases: 2.0.27 update-browserslist-db: 1.1.4(browserslist@4.28.0) - buffer-crc32@1.0.0: {} - - buffer@6.0.3: - dependencies: - base64-js: 1.5.1 - ieee754: 1.2.1 - builtin-modules@3.3.0: {} builtins@5.1.0: @@ -6942,14 +6749,6 @@ snapshots: comment-parser@1.4.1: {} - compress-commons@6.0.2: - dependencies: - crc-32: 1.2.2 - crc32-stream: 6.0.0 - is-stream: 2.0.1 - normalize-path: 3.0.0 - readable-stream: 4.7.0 - concat-map@0.0.1: {} 
config-chain@1.1.13: @@ -6977,13 +6776,6 @@ snapshots: core-util-is@1.0.3: {} - crc-32@1.2.2: {} - - crc32-stream@6.0.0: - dependencies: - crc-32: 1.2.2 - readable-stream: 4.7.0 - cross-spawn@7.0.6: dependencies: path-key: 3.1.1 @@ -7090,8 +6882,6 @@ snapshots: es-errors: 1.3.0 gopd: 1.2.0 - eastasianwidth@0.2.0: {} - ejs@3.1.10: dependencies: jake: 10.9.4 @@ -7104,8 +6894,6 @@ snapshots: emoji-regex@8.0.0: {} - emoji-regex@9.2.2: {} - end-of-stream@1.4.5: dependencies: once: 1.4.0 @@ -7592,24 +7380,12 @@ snapshots: esutils@2.0.3: {} - event-target-shim@5.0.1: {} - - events-universal@1.0.1: - dependencies: - bare-events: 2.8.2 - transitivePeerDependencies: - - bare-abort-controller - - events@3.3.0: {} - fast-copy@3.0.2: {} fast-deep-equal@3.1.3: {} fast-diff@1.3.0: {} - fast-fifo@1.3.2: {} - fast-glob@3.3.3: dependencies: '@nodelib/fs.stat': 2.0.5 @@ -7687,11 +7463,6 @@ snapshots: dependencies: is-callable: 1.2.7 - foreground-child@3.3.1: - dependencies: - cross-spawn: 7.0.6 - signal-exit: 4.1.0 - form-data-encoder@2.1.4: {} fs-extra@8.1.0: @@ -7772,15 +7543,6 @@ snapshots: dependencies: is-glob: 4.0.3 - glob@10.5.0: - dependencies: - foreground-child: 3.3.1 - jackspeak: 3.4.3 - minimatch: 9.0.5 - minipass: 7.1.2 - package-json-from-dist: 1.0.1 - path-scurry: 1.11.1 - glob@13.0.0: dependencies: minimatch: 10.1.1 @@ -7937,12 +7699,12 @@ snapshots: dependencies: safer-buffer: 2.1.2 - ieee754@1.2.1: {} - ignore@5.3.2: {} ignore@7.0.5: {} + immediate@3.0.6: {} + import-fresh@3.3.1: dependencies: parent-module: 1.0.1 @@ -8133,12 +7895,6 @@ snapshots: isexe@3.1.1: {} - jackspeak@3.4.3: - dependencies: - '@isaacs/cliui': 8.0.2 - optionalDependencies: - '@pkgjs/parseargs': 0.11.0 - jake@10.9.4: dependencies: async: 3.2.6 @@ -8181,19 +7937,26 @@ snapshots: optionalDependencies: graceful-fs: 4.2.11 - keyv@4.5.4: + jszip@3.10.1: dependencies: - json-buffer: 3.0.1 + lie: 3.3.0 + pako: 1.0.11 + readable-stream: 2.3.8 + setimmediate: 1.0.5 - lazystream@1.0.1: + keyv@4.5.4: 
dependencies: - readable-stream: 2.3.8 + json-buffer: 3.0.1 levn@0.4.1: dependencies: prelude-ls: 1.2.1 type-check: 0.4.0 + lie@3.3.0: + dependencies: + immediate: 3.0.6 + lilconfig@3.1.3: {} lines-and-columns@1.2.4: {} @@ -8534,7 +8297,7 @@ snapshots: p-try@2.2.0: {} - package-json-from-dist@1.0.1: {} + pako@1.0.11: {} param-case@3.0.4: dependencies: @@ -8589,11 +8352,6 @@ snapshots: path-parse@1.0.7: {} - path-scurry@1.11.1: - dependencies: - lru-cache: 10.4.3 - minipass: 7.1.2 - path-scurry@2.0.1: dependencies: lru-cache: 11.2.2 @@ -8673,8 +8431,6 @@ snapshots: process-warning@5.0.0: {} - process@0.11.10: {} - property-information@7.1.0: {} proto-list@1.2.4: {} @@ -8723,18 +8479,6 @@ snapshots: string_decoder: 1.1.1 util-deprecate: 1.0.2 - readable-stream@4.7.0: - dependencies: - abort-controller: 3.0.0 - buffer: 6.0.3 - events: 3.3.0 - process: 0.11.10 - string_decoder: 1.3.0 - - readdir-glob@1.1.3: - dependencies: - minimatch: 5.1.6 - readdirp@3.6.0: dependencies: picomatch: 2.3.1 @@ -8918,6 +8662,8 @@ snapshots: es-errors: 1.3.0 es-object-atoms: 1.1.1 + setimmediate@1.0.5: {} + shebang-command@2.0.0: dependencies: shebang-regex: 3.0.0 @@ -9032,27 +8778,12 @@ snapshots: es-errors: 1.3.0 internal-slot: 1.1.0 - streamx@2.23.0: - dependencies: - events-universal: 1.0.1 - fast-fifo: 1.3.2 - text-decoder: 1.2.3 - transitivePeerDependencies: - - bare-abort-controller - - react-native-b4a - string-width@4.2.3: dependencies: emoji-regex: 8.0.0 is-fullwidth-code-point: 3.0.0 strip-ansi: 6.0.1 - string-width@5.1.2: - dependencies: - eastasianwidth: 0.2.0 - emoji-regex: 9.2.2 - strip-ansi: 7.1.2 - string-width@7.2.0: dependencies: emoji-regex: 10.6.0 @@ -9086,10 +8817,6 @@ snapshots: dependencies: safe-buffer: 5.1.2 - string_decoder@1.3.0: - dependencies: - safe-buffer: 5.2.1 - stringify-entities@4.0.4: dependencies: character-entities-html4: 2.1.0 @@ -9139,21 +8866,6 @@ snapshots: tapable@2.3.0: {} - tar-stream@3.1.7: - dependencies: - b4a: 1.7.3 - fast-fifo: 1.3.2 - 
streamx: 2.23.0 - transitivePeerDependencies: - - bare-abort-controller - - react-native-b4a - - text-decoder@1.2.3: - dependencies: - b4a: 1.7.3 - transitivePeerDependencies: - - react-native-b4a - thread-stream@3.1.0: dependencies: real-require: 0.2.0 @@ -9517,12 +9229,6 @@ snapshots: string-width: 4.2.3 strip-ansi: 6.0.1 - wrap-ansi@8.1.0: - dependencies: - ansi-styles: 6.2.3 - string-width: 5.1.2 - strip-ansi: 7.1.2 - wrap-ansi@9.0.2: dependencies: ansi-styles: 6.2.3 @@ -9569,10 +9275,4 @@ snapshots: yoctocolors-cjs@2.1.3: {} - zip-stream@6.0.1: - dependencies: - archiver-utils: 5.0.2 - compress-commons: 6.0.2 - readable-stream: 4.7.0 - zwitch@2.0.4: {}