Batch jobs (Public preview)

Process multiple API requests asynchronously with a single file upload.

The Batch Jobs API allows you to perform bulk operations on Stripe resources. Instead of making individual API calls for each operation that could trigger rate limits, you can upload a file with all of your operations and let Stripe process them asynchronously. Use this for one-time migrations, bulk updates, or any operation that requires processing many resources.

When to use batch jobs

Batch jobs work well for:

  • Bulk migrations: Move large numbers of subscriptions to new billing modes.
  • Mass updates: Update many accounts or subscriptions at once.

Batch jobs don’t work well for:

  • Operations that require an immediate synchronous response.
  • Real-time processing with tight timing requirements.
  • A single request, which is simpler to make as a direct API call.

To process a batch job, follow these steps:

  1. Create a batch job and specify the target API endpoint.
  2. Upload the input file with your batch requests.
  3. Monitor job status through webhooks or polling.
  4. Download the results.

Supported endpoints

The Batch Jobs API supports the following endpoints. Each batch job targets a single endpoint, and all requests in the batch go to that endpoint.

Endpoint | Path
Update a customer | POST /v1/customers/:id
Create a promotion code | POST /v1/promotion_codes
Update a promotion code | POST /v1/promotion_codes/:id
Migrate a subscription | POST /v1/subscriptions/:id/migrate
Update a subscription | POST /v1/subscriptions/:id

Limitations

Review the following limitations:

  • Batch files are limited to 5 GB. If you need to process a larger file for a higher volume of requests, split it into multiple batches.

  • Batch jobs only support JSONL (newline-delimited JSON) files. Batch jobs don’t accept CSV or other formats.

  • Requests in a batch can only use POST or DELETE. Batch jobs don’t support GET.

  • All requests in a batch must target the same API endpoint.

  • Batch jobs don’t guarantee the order of request processing.

  • Batch jobs have a maximum processing duration of 24 hours. Jobs that exceed this limit transition to timeout status, with partial results available.

  • Results are available for download for 7 days after the job completes.

  • The upload URL expires 5 minutes after job creation. After that period, the job transitions to upload_timeout and you need to create a new one.

  • Upload the file with a direct HTTP PUT request to the presigned URL.

Create a batch job

To start, create a batch job by sending a POST request to /v2/core/batch_jobs. Specify the target endpoint and any processing options:

Command Line
curl https://api.stripe.com/v2/core/batch_jobs \
  -u sk_test_BQokikJOvBiI2HlWgH4olfQ2: \
  -H "Content-Type: application/json" \
  -H "Stripe-Version: 2026-03-25.preview" \
  -d '{
    "endpoint": {
      "path": "/v1/subscriptions/:id/migrate",
      "http_method": "post"
    },
    "maximum_rps": 10,
    "skip_validation": false
  }'

The request body is JSON. The call returns a batch job object with a ready_for_upload status. The upload URL and its expiration time are in the status_details field:

{
  "id": "batchv2_AbCdEfGhIjKlMnOpQrStUvWxYz",
  "object": "v2.core.batch_job",
  "created": "2026-03-09T20:55:31.000Z",
  "maximum_rps": 10,
  "skip_validation": false,
  "status": "ready_for_upload",
  "status_details": {
    "ready_for_upload": {
      "upload_url": {
        "expires_at": "2026-03-09T21:00:31.000Z",
        "url": "https://stripeusercontent.com/files/upload/..."
      }
    }
  }
}

The status_details object changes based on the current status. When the job is ready_for_upload, it contains the presigned upload URL and its expiration timestamp.

Parameters

Parameter | Required | Description
endpoint.path | Yes | The API endpoint to target (for example, /v1/subscriptions/:id/migrate). See Supported endpoints.
endpoint.http_method | Yes | The HTTP method for the endpoint. Currently only post is supported.
maximum_rps | No | Maximum requests processed per second (1–100). Defaults to 10.
skip_validation | No | Set to true to skip input file validation and start processing immediately. Defaults to false.
notification_suppression | No | Controls whether webhooks from the underlying API operations are delivered. Set {"scope": "all"} to suppress operation-level webhooks. Batch-level events are always delivered regardless of this setting. Defaults to {"scope": "none"}.
metadata | No | Key-value pairs for your internal tracking. Metadata is included in batch job events, including failure events.

Upload the input file

After creating the batch job, upload your input file to the URL in status_details.ready_for_upload.upload_url.url. Use a PUT request with the file contents:

Command Line
curl {UPLOAD_URL} \
  -X PUT \
  -T input.jsonl \
  -H "Content-Type: application/octet-stream"

The input file for this request must be a JSONL file, and the content type must be application/octet-stream. After the upload completes, Stripe automatically starts processing. There’s no separate start step.

The upload URL expires 5 minutes after batch job creation. Check the expires_at field for the exact deadline. If the URL expires before you upload the file, the job status changes to upload_timeout, and you must create a new batch job. Generate the input file before you create the batch job so you can upload it promptly.
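Before attempting an upload, you can check the deadline locally. A minimal sketch, assuming only that the timestamp uses the ISO 8601 format shown in the example responses in this guide (upload_url_expired is a hypothetical helper, not part of any Stripe SDK):

```python
from datetime import datetime, timezone

def upload_url_expired(expires_at, now=None):
    """Return True if the presigned upload URL's expires_at has passed.

    expires_at uses the format shown in batch job responses,
    e.g. "2026-03-09T21:00:31.000Z".
    """
    deadline = datetime.strptime(
        expires_at, "%Y-%m-%dT%H:%M:%S.%fZ"
    ).replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return now >= deadline
```

If this returns True before you've uploaded, skip the upload and create a fresh batch job instead.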

Input file format

The file must be UTF-8 encoded and use JSONL format (newline-delimited JSON, one object per line). Each line represents a single API request to the target endpoint. CSV and other formats aren’t supported.

Each JSON object supports these fields:

Field | Required | Description
id | Yes | A unique identifier used to correlate this request with its result. You're free to choose the value, but it must match /^[A-Za-z0-9_-]+$/.
path_params | Conditional | Path parameters for the endpoint. Required when the endpoint path includes placeholders (for example, :id). The keys in path_params must match the placeholders in the endpoint path exactly.
params | No | Request body parameters for the API call. Accepted parameters vary by endpoint.
context | No | A Stripe account ID. Use this to execute the request against a specific account, such as a connected account.

Example input file

For the POST /v1/customers/:id endpoint:

{"id": "req_001", "path_params": {"id": "cus_1AbCdEfGhIjKlMn"}, "params": {"name": "Jenny Rosen", "email": "jenny@example.com"}}
{"id": "req_002", "path_params": {"id": "cus_2BcDeFgHiJkLmNo"}, "params": {"name": "John Smith", "metadata": {"tier": "premium"}}}
{"id": "req_003", "context": "acct_1234567890", "path_params": {"id": "cus_3CdEfGhIjKlMnOp"}, "params": {"description": "Updated by batch"}}

Each id must be unique within the file. Stripe uses it to correlate requests with results, because the results file isn’t ordered the same way as the input file.
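An input file like the one above can be generated programmatically. A minimal sketch in Python (build_input_lines is a hypothetical helper; the endpoint and IDs are illustrative):

```python
import json
import re

# The id field must match /^[A-Za-z0-9_-]+$/.
ID_PATTERN = re.compile(r"^[A-Za-z0-9_-]+$")

def build_input_lines(updates):
    """Build JSONL content for a batch targeting POST /v1/customers/:id.

    `updates` maps a customer ID to the params for that request.
    Each line gets a unique `id` used to correlate results later.
    """
    lines = []
    for i, (customer_id, params) in enumerate(updates.items(), start=1):
        request_id = f"req_{i:03d}"
        assert ID_PATTERN.match(request_id)
        lines.append(json.dumps({
            "id": request_id,
            "path_params": {"id": customer_id},
            "params": params,
        }))
    return "\n".join(lines) + "\n"
```

Write the returned string to input.jsonl with UTF-8 encoding before creating the batch job.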

Monitor job status

You can track your batch job by polling the retrieve endpoint or by listening for webhook events. We recommend using webhook events for production integrations.

Poll for status

Command Line
curl https://api.stripe.com/v2/core/batch_jobs/{BATCH_JOB_ID} \
  -u sk_test_BQokikJOvBiI2HlWgH4olfQ2: \
  -H "Stripe-Version: 2026-03-25.preview"

The response is a JSON object. While the job is running, status_details includes real-time progress counts:

{
  "status": "in_progress",
  "status_details": {
    "in_progress": {
      "success_count": "1",
      "failure_count": "0"
    }
  }
}

During the validating phase, status_details includes a validated_count field that shows how many rows Stripe has validated so far.

Calls to the Batch Jobs API itself (create, retrieve, cancel) appear in the Stripe Dashboard and Workbench request logs, but the underlying per-request API calls don't. Use the retrieve endpoint or webhook events to monitor progress, and check the results file to debug individual request failures.
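A polling loop can be sketched as follows. This shows only the control flow: fetch_job is a hypothetical callable you'd implement with your HTTP client to retrieve the batch job, and the terminal statuses are taken from the lifecycle described in this guide:

```python
import time

# Statuses after which the job will no longer change.
TERMINAL_STATUSES = {
    "complete", "validation_failed", "batch_failed",
    "cancelled", "upload_timeout", "timeout",
}

def wait_for_batch_job(fetch_job, interval_seconds=30, sleep=time.sleep):
    """Poll `fetch_job()` until the batch job reaches a terminal status.

    `fetch_job` returns the batch job object as a dict. `sleep` is
    injectable so the loop can be tested without real waiting.
    """
    while True:
        job = fetch_job()
        if job["status"] in TERMINAL_STATUSES:
            return job
        sleep(interval_seconds)
```

For production integrations, prefer webhook events over polling; this loop suits one-off scripts and migrations.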

Job lifecycle

After you upload the input file, the batch job progresses through these statuses:

Status | Description
ready_for_upload | The batch job was created and is waiting for the input file.
validating | The input file was uploaded and Stripe is validating it. Skipped when skip_validation is true.
in_progress | Validation passed (or was skipped) and Stripe is processing requests.
complete | All requests have been processed. Results are available for download.
cancelling | A cancelation was requested. Stripe is finishing in-flight requests.

Terminal statuses

Status | Description
validation_failed | The input file contains errors. No requests were processed. Check the batch job object for error details. Only applicable when skip_validation is false.
batch_failed | An unexpected error occurred during processing.
cancelled | The batch job was canceled. Partial results might be available.
upload_timeout | The upload URL expired before the file was uploaded. Create a new batch job.
timeout | The batch job exceeded the maximum processing duration of 24 hours. Partial results might be available.

Validation

When skip_validation is false (the default) Stripe validates the entire input file before processing any requests. This validation catches errors such as:

  • Invalid JSON in any row.
  • Missing or invalid id fields.
  • Duplicate IDs.
  • Missing required path_params for the target endpoint.
  • Malformed parameters.

If validation fails, the status changes to validation_failed, and Stripe doesn’t attempt any requests. The batch job object includes details about the first error it encounters.

When skip_validation is true, the job transitions directly from ready_for_upload to in_progress after upload. Errors in individual requests appear in the results file instead of blocking the entire batch.
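You can mirror the file-level checks locally before uploading, which avoids burning a job on a validation_failed result. A rough pre-validation sketch (not Stripe's actual validator; prevalidate_jsonl is a hypothetical helper covering only the rules listed above):

```python
import json
import re

ID_PATTERN = re.compile(r"^[A-Za-z0-9_-]+$")

def prevalidate_jsonl(text, required_path_params=("id",)):
    """Return a list of (line_number, error) tuples.

    An empty list means the file passes these local checks:
    valid JSON per row, unique well-formed ids, required path_params.
    """
    errors = []
    seen_ids = set()
    for line_no, line in enumerate(text.splitlines(), start=1):
        try:
            row = json.loads(line)
        except json.JSONDecodeError:
            errors.append((line_no, "invalid JSON"))
            continue
        row_id = row.get("id")
        if not isinstance(row_id, str) or not ID_PATTERN.match(row_id):
            errors.append((line_no, "missing or invalid id"))
        elif row_id in seen_ids:
            errors.append((line_no, "duplicate id"))
        else:
            seen_ids.add(row_id)
        missing = [p for p in required_path_params
                   if p not in row.get("path_params", {})]
        if missing:
            errors.append((line_no, f"missing path_params: {missing}"))
    return errors
```

Even with a clean local pass, keep skip_validation false unless partial processing is acceptable, since Stripe's own validation may check more than this sketch does.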

Download the results

When the batch job reaches complete status, the status_details field includes a summary of successes and failures, along with a presigned download URL for the output file:

{
  "id": "batchv2_AbCdEfGhIjKlMnOpQrStUvWxYz",
  "object": "v2.core.batch_job",
  "created": "2026-03-09T20:55:31.000Z",
  "maximum_rps": 10,
  "skip_validation": true,
  "status": "complete",
  "status_details": {
    "complete": {
      "success_count": "2",
      "failure_count": "0",
      "output_file": {
        "content_type": "application/jsonlines",
        "size": "8514",
        "download_url": {
          "expires_at": "2026-03-09T22:05:31.000Z",
          "url": "https://stripeusercontent.com/files/download/..."
        }
      }
    }
  }
}

Download the file using the URL in status_details.complete.output_file.download_url.url. Stripe provides an output file when the batch job reaches any of these states:

  • complete
  • cancelled
  • timeout
  • validation_failed

To see when the download URL expires, check the expires_at field for the deadline.

The results file contains both successful and failed requests in a single file. To find failures, filter for rows where status isn’t 200.

Results file format

The output file uses JSONL format (one JSON object per line). Each line contains these fields:

Field | Description
id | The request ID from the input file. Use this to correlate results with requests.
response | The full API response object. Contains the resource on success, or an error object on failure.
status | The HTTP status code as an integer (for example, 200, 402).

Example results file

Successful requests return the full API resource in the response field:

{"id": "req_001", "response": {"id": "sub_1AbCdEfGhIjKlMn", "object": "subscription", "status": "active", "billing_cycle_anchor": 1710021331, "current_period_end": 1712613331, "current_period_start": 1710021331}, "status": 200}
{"id": "req_002", "response": {"id": "sub_2BcDeFgHiJkLmNo", "object": "subscription", "status": "active", "billing_cycle_anchor": 1710021331, "current_period_end": 1712613331, "current_period_start": 1710021331}, "status": 200}

Failed requests return an error object:

{"id": "req_003", "response": {"error": {"message": "This subscription cannot be migrated because it is not active. Current status is canceled.", "type": "invalid_request_error", "code": "resource_invalid_state"}}, "status": 400}

Results aren’t returned in the same order as the input file. Use the id field to match each result to its corresponding request.
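Correlating results back to requests can be sketched like this (split_results is a hypothetical helper; the field names come from the results file format above):

```python
import json

def split_results(results_jsonl):
    """Parse a results file and split rows into successes and failures.

    Returns two dicts keyed by the request `id` from the input file,
    so results can be matched to requests regardless of order.
    """
    successes, failures = {}, {}
    for line in results_jsonl.splitlines():
        if not line.strip():
            continue
        row = json.loads(line)
        bucket = successes if row["status"] == 200 else failures
        bucket[row["id"]] = row["response"]
    return successes, failures
```

Keying by id rather than by position is what makes the unordered results file safe to process.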

Cancel a batch job

You can cancel a batch job that hasn’t completed yet by sending a POST request:

Command Line
curl https://api.stripe.com/v2/core/batch_jobs/{BATCH_JOB_ID}/cancel \
  -u sk_test_BQokikJOvBiI2HlWgH4olfQ2: \
  -X POST \
  -H "Stripe-Version: 2026-03-25.preview"

Cancelation is asynchronous. The job first transitions to cancelling while in-flight requests finish, then to cancelled. Any partial results from requests processed before cancelation are available in the results file.

Webhook events

Batch jobs emit v2 thin events for every lifecycle transition. To receive these events, you must configure a v2 event destination.

Batch job events require v2 event destinations. They aren’t delivered to v1 webhook endpoints.

The following events are available:

Event type | Description
v2.core.batch_job.created | A batch job was created.
v2.core.batch_job.ready_for_upload | The batch job is ready for file upload.
v2.core.batch_job.validating | File upload complete, validation in progress.
v2.core.batch_job.validation_failed | Input file validation failed.
v2.core.batch_job.completed | All requests have been processed.
v2.core.batch_job.batch_failed | The batch job failed unexpectedly.
v2.core.batch_job.canceled | The batch job was canceled.
v2.core.batch_job.timeout | The batch job exceeded maximum processing duration.
v2.core.batch_job.upload_timeout | The upload URL expired before the file was uploaded.
v2.core.batch_job.updated | The batch job status or progress changed.

All batch job events include the metadata you provided when creating the job. Use this to correlate events with your internal systems.

When notification_suppression is set to {"scope": "all"}, webhooks from the underlying API operations (for example, subscription update events) are suppressed. Batch-level events listed above are always delivered regardless of this setting.

Common errors

Upload URL expired

If you don’t upload the input file before the expires_at timestamp (5 minutes after job creation), the batch job transitions to upload_timeout status. Create a new batch job and upload the file promptly. Generate your input file before creating the batch job to avoid this.

Invalid resource state

Individual requests can fail if the target resource isn’t in the expected state. For example, when using /v1/subscriptions/:id/migrate:

  • Subscription isn’t active: The subscription must be in an active state before you can migrate it. Canceled or incomplete subscriptions return a 400 error with resource_invalid_state.
  • Subscription already migrated: Attempting to migrate a subscription that has already been migrated returns an error.

These per-request errors appear in the results file with a non-200 status code. The batch job itself still completes successfully (the batch continues processing if individual lines fail).

Path parameter mismatches

The keys in path_params must exactly match the placeholders in the endpoint path. For example, if your endpoint path is /v1/subscriptions/:id/migrate, your path_params must use {"id": "sub_..."}. A mismatch between the placeholder name and the key causes a validation error or a 400 status in the results file.
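A quick local check for this mismatch (a sketch; check_path_params is a hypothetical helper, and the `:name` placeholder extraction assumes the path syntax shown in this guide):

```python
import re

def check_path_params(endpoint_path, path_params):
    """Return the set of names that appear as a placeholder in
    `endpoint_path` or as a key in `path_params`, but not both.
    An empty set means the row is consistent."""
    placeholders = set(re.findall(r":(\w+)", endpoint_path))
    return placeholders ^ set(path_params)
```

Running this over every row before upload catches the mismatch shown above before it becomes a validation error or a 400 in the results file.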

Upload content type

The upload PUT request must use Content-Type: application/octet-stream. Other content types are rejected.

File format errors

When skip_validation is false, these errors cause the entire batch to fail with validation_failed status:

  • Rows that aren’t valid JSON
  • Missing id field on any row
  • Duplicate id values across rows
  • IDs containing characters outside A-Za-z0-9_-

When skip_validation is true, file-level format errors can cause individual rows to fail rather than blocking the entire batch.

Job processing timeout

Batch jobs that run longer than 24 hours transition to timeout status. Partial results from requests that completed before the timeout are available in the results file.

Best practices

Choose the right maximum_rps

The maximum_rps parameter controls how fast Stripe processes requests in your batch. Batch processing uses a separate rate limit pool from your main API requests, so batch jobs don’t affect your account’s regular API traffic.

  • Lower values: 1–10 are suitable for non-urgent bulk operations.
  • Higher values: 50–100 process batches faster and are suitable for independent operations across different resources.
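Because requests are processed at no more than maximum_rps, you can estimate a batch's minimum runtime and check it against the 24-hour processing limit. A rough back-of-the-envelope sketch (both function names are hypothetical):

```python
MAX_DURATION_SECONDS = 24 * 60 * 60  # the 24-hour processing limit

def estimated_runtime_seconds(request_count, maximum_rps):
    """Lower-bound estimate of processing time at a steady maximum_rps."""
    return request_count / maximum_rps

def fits_in_time_limit(request_count, maximum_rps):
    """True if the batch should finish within the 24-hour limit."""
    return estimated_runtime_seconds(request_count, maximum_rps) <= MAX_DURATION_SECONDS
```

For example, 100,000 requests at 10 rps take at least 10,000 seconds (just under 3 hours); at 10 rps the 24-hour limit caps a single job at roughly 864,000 requests.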

Use validation for critical operations

Keep skip_validation set to false (the default) for operations where partial processing would cause issues. Validation ensures your entire file is well-formed before any requests are executed.

Set skip_validation to true when you’ve already validated your input data and want faster job startup, or when processing partial results is acceptable.

Split large workloads

If your input file would exceed the 5 GB limit, split the workload into multiple batch jobs. You can run multiple batch jobs concurrently.
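Splitting can be sketched as a line-based chunker so no JSONL row is ever broken across files (split_jsonl is a hypothetical helper):

```python
def split_jsonl(lines, max_bytes):
    """Group JSONL lines into chunks whose UTF-8 size (counting one
    newline per line) stays at or under `max_bytes`.

    Each chunk becomes one batch job's input file. A single line
    larger than `max_bytes` gets a chunk of its own.
    """
    chunks, current, current_size = [], [], 0
    for line in lines:
        size = len(line.encode("utf-8")) + 1  # +1 for the newline
        if current and current_size + size > max_bytes:
            chunks.append(current)
            current, current_size = [], 0
        current.append(line)
        current_size += size
    if current:
        chunks.append(current)
    return chunks
```

In practice you'd call this with max_bytes comfortably below 5 GB and remember that request ids only need to be unique within each file.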

Verify resource state before batching

Confirm that all target resources are in the required state before submitting a batch. For example, subscriptions must be active before they can be migrated with /v1/subscriptions/:id/migrate. Batch jobs execute the target operation directly and don’t change resource state as a prerequisite.

Handle errors in results

Always check the status field in each result line. Individual requests within a successful batch can still fail (for example, because of insufficient funds or invalid parameters). Build your integration to filter the results file for non-200 statuses and handle failures accordingly.

Prepare your file before creating the job

The upload URL expires 5 minutes after batch job creation. Generate and validate your input file before calling the create endpoint. If you need to prepare data dynamically, complete all data retrieval and file generation first, then create the batch job and upload immediately.
