48 changes: 39 additions & 9 deletions lib/buffer.js
@@ -614,25 +614,55 @@ Buffer.concat = function concat(list, length) {
if (length === undefined) {
length = 0;
for (let i = 0; i < list.length; i++) {
if (list[i].length) {
length += list[i].length;
const buf = list[i];
if (!isUint8Array(buf)) {
// TODO(BridgeAR): This should not be of type ERR_INVALID_ARG_TYPE.
// Instead, find the proper error code for this.
throw new ERR_INVALID_ARG_TYPE(
`list[${i}]`, ['Buffer', 'Uint8Array'], buf);
}
length += TypedArrayPrototypeGetByteLength(buf);
}
} else {
validateOffset(length, 'length');

const buffer = allocate(length);
let pos = 0;
for (let i = 0; i < list.length; i++) {
const buf = list[i];
const bufLength = TypedArrayPrototypeGetByteLength(buf);
TypedArrayPrototypeSet(buffer, buf, pos);
pos += bufLength;
}

if (pos < length) {
TypedArrayPrototypeFill(buffer, 0, pos, length);
}
Comment on lines +636 to +638
gurgunday (Member) commented on Feb 7, 2026:

Can we even hit this part? I think it's unreachable now that you've moved type validation to the beginning; won't pos always reach length after going through all of them?

Member Author:

It's still reachable: if a buffer gets detached between the validation loop and the copy loop, its byte length becomes 0, so pos won't advance, and that would leave uninitialized bytes without the zero-fill.

Member:

Not really sure how it can be detached unless a length getter is compromised. It's all synchronous code here?

In any case, it would be nice to have coverage for it

Other than coverage, the PR LGTM

Member:

How about an assert here?

(Attempting to read from a detached buffer should throw an error from the engine in any case.)

Member:

If the buffer gets detached between these points, that should actually cause the Uint8Array.prototype.set operation to fail.

Consider the example:

const u8_1 = new Uint8Array([1,2,3,4]);
const u8_2 = new Uint8Array([5,6,7,8]);

let called = false;

Object.defineProperty(u8_1, 'length', {
  get() {
    // The first time this is called, return the actual length of the array.
    // The second time, also transfer (detach) the ArrayBuffer backing the
    // second array before returning.

    if (!called) {
      called = true;
    } else {
      u8_2.buffer.transfer();
    }
    return 4;
  },
});

const buf = Buffer.concat([u8_1, u8_2]);
console.log(buf);
Running it fails inside the copy loop:

node:buffer:631
      TypedArrayPrototypeSet(buffer, buf, pos);
      ^

TypeError: Cannot perform %TypedArray%.prototype.set on a detached ArrayBuffer
    at Buffer.set (<anonymous>)
    at Buffer.concat (node:buffer:631:7)
    at Object.<anonymous> (/home/jsnell/tmp/fubar.js:20:20)
    at Module._compile (node:internal/modules/cjs/loader:1811:14)
    at Object..js (node:internal/modules/cjs/loader:1942:10)
    at Module.load (node:internal/modules/cjs/loader:1532:32)
    at Module._load (node:internal/modules/cjs/loader:1334:12)
    at wrapModuleLoad (node:internal/modules/cjs/loader:255:19)
    at Module.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:154:5)
    at node:internal/main/run_main_module:33:47

Node.js v26.0.0-pre

I'm all for being defensive here, though.

jasnell (Member) commented on Feb 8, 2026:

I think the bigger challenge here is that this does introduce a breaking security risk. Consider the following case:

const u8_1 = new Uint8Array([1,2,3,4]);
const u8_2 = new Uint8Array([5,6,7,8]);

let called = false;

Object.defineProperty(u8_1, 'length', {
  get() {
    return 100;
  },
});

const buf = Buffer.concat([u8_1, u8_2]);
console.log(buf);

Then, comparing the output between current Node.js and this PR:

// This PR
jsnell@james-cloudflare-build:~/projects/node/node$ ./node ~/tmp/fubar.js 
<Buffer 01 02 03 04 30 70 00 00 f0 b3 61 64 30 70 00 00 00 2d 8b 1d 77 5e 00 00 00 2d 8b 1d 77 5e 00 00 e9 1e 4a 3e ed 01 00 00 21 1f 4a 3e ed 01 00 00 71 1f ... 54 more bytes>

// Original
jsnell@james-cloudflare-build:~/projects/node/node$ node ~/tmp/fubar.js 
<Buffer 01 02 03 04 05 06 07 08 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... 54 more bytes>
jsnell@james-cloudflare-build:~/projects/node/node$ 

Member:

TypedArrayPrototypeGetLength would be the tonic there, presumably.
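A quick sketch (plain JS rather than the primordials helpers, just to illustrate the point) of how going through the prototype getter sidesteps a spoofed own property:

const u8 = new Uint8Array([1, 2, 3, 4]);

// An own 'length' property shadows the prototype getter for ordinary reads.
Object.defineProperty(u8, 'length', {
  get() { return 100; },
});
console.log(u8.length); // 100 (spoofed)

// Calling the %TypedArray%.prototype 'length' getter directly (which is
// roughly what the primordials helper boils down to) reads the internal
// slot instead.
const getRealLength = Object.getOwnPropertyDescriptor(
  Object.getPrototypeOf(Uint8Array.prototype), 'length').get;
console.log(getRealLength.call(u8)); // 4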

jasnell (Member) commented on Feb 8, 2026:

Probably. Also, just make sure that Uint8Array.prototype.set itself does not call the user override getter (it shouldn't... but let's confirm).
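For what it's worth, a quick local check (my own sketch, not from this PR) suggests it doesn't: with a typed-array source, set() reads the internal length rather than the property.

const src = new Uint8Array([1, 2, 3, 4]);
let getterCalls = 0;
Object.defineProperty(src, 'length', {
  get() { getterCalls++; return 100; },
});

const dst = new Uint8Array(8);
dst.set(src, 0); // copies the real 4 bytes

console.log(dst);         // Uint8Array(8) [ 1, 2, 3, 4, 0, 0, 0, 0 ]
console.log(getterCalls); // 0 -- the own 'length' getter was never invoked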

Member Author:

Thanks, switched to TypedArrayPrototypeGetByteLength to avoid the spoofed length getter. The zero-fill is kept as a defensive measure.

Member:

I would be sure to also add a test that covers the spoofed length getter.
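Something along these lines, maybe (a rough sketch in core-test style, not the actual test from this PR; the expected values assume the fixed byte-length-based implementation):

'use strict';
require('../common');
const assert = require('node:assert');

// A spoofed own 'length' getter must not make Buffer.concat over-allocate
// or copy past the real contents of the source arrays.
const u8_1 = new Uint8Array([1, 2, 3, 4]);
const u8_2 = new Uint8Array([5, 6, 7, 8]);

Object.defineProperty(u8_1, 'length', {
  get() { return 100; },
});

const buf = Buffer.concat([u8_1, u8_2]);
assert.strictEqual(buf.length, 8);
assert.deepStrictEqual([...buf], [1, 2, 3, 4, 5, 6, 7, 8]);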

Comment on lines +636 to +638
Member:

Nit: my preference would still be for an assert here, since pos !== length is de facto an invalid state.
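i.e. roughly this in place of the fill (a sketch only; it assumes an assert helper is available or imported in lib/buffer.js):

// pos !== length should be impossible once every entry has been validated
// and copied, so treat it as an internal invariant rather than papering
// over it with a zero-fill.
assert(pos === length,
       `Buffer.concat: copied ${pos} bytes but expected ${length}`);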

return buffer;
}

const buffer = Buffer.allocUnsafe(length);
let pos = 0;
validateOffset(length, 'length');
for (let i = 0; i < list.length; i++) {
const buf = list[i];
if (!isUint8Array(buf)) {
if (!isUint8Array(list[i])) {
// TODO(BridgeAR): This should not be of type ERR_INVALID_ARG_TYPE.
// Instead, find the proper error code for this.
throw new ERR_INVALID_ARG_TYPE(
`list[${i}]`, ['Buffer', 'Uint8Array'], list[i]);
}
pos += _copyActual(buf, buffer, pos, 0, buf.length, true);
}

const buffer = allocate(length);
let pos = 0;
for (let i = 0; i < list.length; i++) {
const buf = list[i];
const bufLength = TypedArrayPrototypeGetByteLength(buf);
if (pos + bufLength > length) {
TypedArrayPrototypeSet(buffer,
TypedArrayPrototypeSlice(buf, 0, length - pos),
Member:

Nit: TypedArrayPrototypeSubarray is a zero-copy operation that would presumably be a lot faster for large chunks.
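i.e. something like this (a sketch; it assumes TypedArrayPrototypeSubarray is added to the primordials imports at the top of the file):

if (pos + bufLength > length) {
  // subarray() returns a view over the same memory instead of copying,
  // so the truncated tail chunk is only written once, by set().
  TypedArrayPrototypeSet(buffer,
                         TypedArrayPrototypeSubarray(buf, 0, length - pos),
                         pos);
  pos = length;
  break;
}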

pos);
pos = length;
break;
}
TypedArrayPrototypeSet(buffer, buf, pos);
pos += bufLength;
}

// Note: `length` is always equal to `buffer.length` at this point