`docs/api/client/rest.md`

```
npm install @feathersjs/rest-client --save
```

`@feathersjs/rest-client` allows connecting to a service exposed through a REST HTTP transport (e.g. with [Koa](../koa.md#rest) or [Express](../express.md#rest)) using [fetch](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API), [Superagent](https://github.com/ladjs/superagent) or [Axios](https://github.com/mzabriskie/axios).

<BlockQuote type="info">

Expand Down Expand Up @@ -226,6 +226,91 @@ File uploads use the native `Request.formData()` API which buffers the entire re

</BlockQuote>

### Streaming Uploads

The REST client supports streaming data to services using `ReadableStream`. This is useful for large file uploads, real-time data ingestion, or piping data directly to storage without buffering.

```ts
// Stream a file to a service
const file = fileInput.files[0]
const stream = file.stream()

const result = await app.service('uploads').create(stream, {
headers: {
'Content-Type': file.type,
'X-Filename': file.name
}
})
```
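
In a Node.js client, a file on disk can be streamed the same way by converting a Node `Readable` into a web `ReadableStream`. A minimal sketch, assuming Node 17+ for `Readable.toWeb()`, an underlying fetch implementation that supports streaming request bodies, and a hypothetical file path:

```ts
import { createReadStream } from 'node:fs'
import { Readable } from 'node:stream'

// Convert the Node stream to a web ReadableStream (Node 17+)
const nodeStream = createReadStream('./video.mp4')
const webStream = Readable.toWeb(nodeStream) as ReadableStream

await app.service('uploads').create(webStream, {
  headers: {
    'Content-Type': 'video/mp4',
    'X-Filename': 'video.mp4'
  }
})
```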

On the server, the service receives the `ReadableStream` directly:

```ts
class UploadService {
async create(stream: ReadableStream, params: Params) {
const filename = params.headers['x-filename']
const contentType = params.headers['content-type']

// Pipe directly to storage - no buffering
await storage.upload(filename, stream, { contentType })

return { filename, uploaded: true }
}
}
```
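
A sketch of wiring this service into an application, assuming a standard `@feathersjs/koa` setup with the REST transport configured:

```ts
import { feathers } from '@feathersjs/feathers'
import { koa, rest, errorHandler } from '@feathersjs/koa'

const app = koa(feathers())

app.use(errorHandler())
app.configure(rest())

// Register the streaming upload service from above
app.use('uploads', new UploadService())

await app.listen(3030)
```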

The stream can be piped directly to cloud storage (S3, R2, etc.) without loading the entire file into memory:

```ts
async create(stream: ReadableStream, params: Params) {
// Stream directly to R2/S3
await env.MY_BUCKET.put(params.headers['x-filename'], stream)
return { success: true }
}
```
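
The same approach works for local storage. A minimal sketch that writes the upload to disk with Node's stream utilities, assuming Node 17+ for `Writable.toWeb()` and an existing `uploads/` directory:

```ts
import { createWriteStream } from 'node:fs'
import { Writable } from 'node:stream'

// Service method sketch: stream the request body straight to a file
async create(stream: ReadableStream, params: Params) {
  const filename = params.headers?.['x-filename'] ?? 'upload.bin'

  // pipeTo resolves once every chunk has been flushed to disk
  await stream.pipeTo(Writable.toWeb(createWriteStream(`uploads/${filename}`)))

  return { filename }
}
```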

For more complex metadata, you can stringify an object into a header:

```ts
// Client
const file = fileInput.files[0]

await app.service('csv-import').create(file.stream(), {
headers: {
'Content-Type': 'text/csv',
'X-Import-Options': JSON.stringify({
filename: file.name,
tableName: 'products',
skipHeader: true
})
}
})

// Server
async create(stream: ReadableStream, params: Params) {
const options = JSON.parse(params.headers['x-import-options'])
// options.filename, options.tableName, options.skipHeader
}
```
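
Because the header value is client-supplied, it is worth parsing it defensively. One possible sketch, assuming `BadRequest` from `@feathersjs/errors` and hypothetical defaults:

```ts
import { BadRequest } from '@feathersjs/errors'

// Server: reject malformed metadata up front instead of failing mid-stream
async create(stream: ReadableStream, params: Params) {
  let options: { filename?: string; tableName?: string; skipHeader?: boolean }

  try {
    options = JSON.parse(params.headers?.['x-import-options'] ?? '{}')
  } catch {
    throw new BadRequest('Invalid X-Import-Options header')
  }

  // Fall back to sensible defaults for missing fields
  const { filename = 'import.csv', tableName = 'products', skipHeader = false } = options
  // ... consume the stream using the parsed options
}
```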

<BlockQuote type="warning" label="Header size limits">

HTTP headers are typically limited to 8KB total. Keep metadata small - use headers for filenames, options, and IDs, not large data payloads.

</BlockQuote>
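
Header values are also effectively limited to ASCII, so filenames with non-ASCII characters need encoding. A common approach (an assumption, not something the transport does for you) is `encodeURIComponent`:

```ts
// Client: encode the filename before putting it in a header
await app.service('uploads').create(file.stream(), {
  headers: {
    'Content-Type': file.type,
    'X-Filename': encodeURIComponent(file.name)
  }
})

// Server: decode it back
const filename = decodeURIComponent(params.headers?.['x-filename'] ?? '')
```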

<BlockQuote type="info" label="Content-Type">

If no `Content-Type` header is specified, streaming requests default to `application/octet-stream`. Any content type not recognized as JSON, form-urlencoded, or multipart will be streamed through to the service.

</BlockQuote>

<BlockQuote type="warning" label="REST only">

Streaming uploads are only supported with the REST/HTTP transport. Socket.io does not support streaming request bodies.

</BlockQuote>

### Custom Methods

On the client, [custom service methods](../services.md#custom-methods) have to be declared using the `methods` option when registering the service via `restClient.service()`:
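A sketch of the registration and call, assuming a hypothetical `myCustomMethod` on a `myservice` endpoint:

```ts
import { feathers } from '@feathersjs/feathers'
import rest from '@feathersjs/rest-client'

const client = feathers()
const connection = rest('http://localhost:3030')

client.configure(connection.fetch(window.fetch.bind(window)))

// Declare custom methods alongside the standard ones
client.use('myservice', connection.service('myservice'), {
  methods: ['find', 'get', 'create', 'patch', 'remove', 'myCustomMethod']
})

// Sends POST /myservice with the X-Service-Method header set
const result = await client.service('myservice').myCustomMethod({ message: 'Hello' })
```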
`docs/api/hooks.md`

#### Working with Streams

When using [streaming uploads](./client/rest.md#streaming-uploads), `context.data` will be a `ReadableStream`. Since streams can only be consumed once, around hooks are the recommended way to work with streaming data. Here are common patterns:

**Passing streams through unchanged:**

If you only need to validate metadata or check permissions, you can let the stream pass through to the service:

```ts
app.service('uploads').hooks({
around: {
create: [
async (context: HookContext, next: NextFunction) => {
// Validate using headers - don't consume the stream
const contentType = context.params.headers?.['content-type']
if (!contentType?.startsWith('image/')) {
throw new BadRequest('Only images are allowed')
}

// Stream passes through unchanged
await next()
}
]
}
})
```

**Wrapping streams with transforms:**

You can wrap the incoming stream with a transform stream for processing:

```ts
import { TransformStream } from 'node:stream/web'

app.service('uploads').hooks({
around: {
create: [
async (context: HookContext, next: NextFunction) => {
const originalStream = context.data as ReadableStream

// Create a transform that tracks bytes
let totalBytes = 0
const countingTransform = new TransformStream({
transform(chunk, controller) {
totalBytes += chunk.length
controller.enqueue(chunk)
}
})

// Replace with transformed stream
context.data = originalStream.pipeThrough(countingTransform)

await next()

// After service completes, totalBytes is available
context.result.size = totalBytes
}
]
}
})
```

<BlockQuote type="warning" label="Important">

Streams can only be consumed once. If you need to read the stream content in a hook (e.g., for validation), you must either buffer the entire stream or use a tee/transform approach. For large files, prefer validating metadata from headers rather than consuming the stream.

</BlockQuote>
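
For example, `ReadableStream.tee()` can split the stream so that a hook inspects the first bytes while the service still receives the full body. A sketch assuming a PNG-only upload policy; the signature check is only illustrative:

```ts
app.service('uploads').hooks({
  around: {
    create: [
      async (context: HookContext, next: NextFunction) => {
        const [inspectBranch, serviceBranch] = (context.data as ReadableStream<Uint8Array>).tee()

        // Read only the first chunk, then release that branch
        const reader = inspectBranch.getReader()
        const { value } = await reader.read()
        await reader.cancel()

        // PNG files start with the bytes 0x89 'P' 'N' 'G'
        const isPng =
          value && value.length >= 4 &&
          value[0] === 0x89 && value[1] === 0x50 && value[2] === 0x4e && value[3] === 0x47

        if (!isPng) {
          throw new BadRequest('Only PNG uploads are allowed')
        }

        // The untouched branch goes on to the service
        context.data = serviceBranch
        await next()
      }
    ]
  }
})
```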

### `context.error`

`context.error` is a **writeable** property with the error object that was thrown in a failed method call. It can be modified to change the error that is returned at the end.