How to load an RGBA image so it passes the GLTF CompareAlphaCoverage unit test? #5494
The GLTF Sample Asset repository has this excellent unit test for Compare Alpha Coverage behavior: https://github.com/KhronosGroup/glTF-Sample-Assets/blob/main/Models/CompareAlphaCoverage/README.md.
When testing my JS WebGPU viewer, I get this result:
The obvious problem is in the left-side rendering, which is not supposed to use any kind of alpha blending but should render fully opaque: it displays white pixels instead of the brownish ones in the expected render. When I zoom in to view the source .png image for that object in Paint.NET, I see that
the transparent top-left pixels have an RGBA of (130, 101, 73, 1). Running ImageMagick's convert-to-TXT raw dump on the file confirms the same values. So the brown RGBA = (130, 101, 73, 1) pixels are definitely there in the file. But in my own WebGPU program, the pixels show up as white, not brown. When I troubleshoot, I find the culprit to be somewhere in how the image is loaded. Here is a small extracted example of how I try to get the brown pixels to show up: <html><body><script>
function loadRawRGBA(url) {
  return fetch(url)
    .then(r => r.blob())
    .then(blob => createImageBitmap(blob, {
      colorSpaceConversion: "none",
      premultiplyAlpha: "none"
    }))
    .then(bmp => {
      const canvas = new OffscreenCanvas(bmp.width, bmp.height);
      const ctx = canvas.getContext("2d");
      ctx.drawImage(bmp, 0, 0);
      const img = ctx.getImageData(0, 0, bmp.width, bmp.height);
      return new Uint8Array(img.data.buffer);
    });
}

loadRawRGBA("FurBaseColorAlpha.png").then(data => {
  console.log("Top-left RGBA:", data[0], data[1], data[2], data[3]);
});
</script></body></html>
but when I run this page, it prints Top-left RGBA: 255 255 255 1 instead of the expected 130 101 73 1. Then, thinking that it might be an issue with using a 2D canvas as an intermediate step, I tried reading the pixels back through WebGPU directly: <html><body><script>
async function readPNG_RGBA(url) {
  // 1. Init WebGPU
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();

  // 2. Fetch + decode PNG (NO color conversion, NO premultiply)
  const blob = await fetch(url).then(r => r.blob());
  const bmp = await createImageBitmap(blob, {
    colorSpaceConversion: "none",
    premultiplyAlpha: "none"
  });
  const w = bmp.width;
  const h = bmp.height;

  // 3. Create texture
  const texture = device.createTexture({
    size: [w, h],
    format: "rgba8unorm-srgb",
    usage: GPUTextureUsage.COPY_DST | GPUTextureUsage.COPY_SRC
  });

  // 4. Upload bitmap to texture
  device.queue.copyExternalImageToTexture(
    { source: bmp },
    { texture },
    [w, h]
  );

  // 5. Create readback buffer
  const bytesPerRow = (w * 4 + 255) & ~255; // WebGPU row alignment
  const buffer = device.createBuffer({
    size: bytesPerRow * h,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ
  });

  // 6. Copy texture → buffer
  const encoder = device.createCommandEncoder();
  encoder.copyTextureToBuffer(
    { texture },
    { buffer, bytesPerRow },
    [w, h]
  );
  device.queue.submit([encoder.finish()]);

  // 7. Map + read data
  await buffer.mapAsync(GPUMapMode.READ);
  const mapped = new Uint8Array(buffer.getMappedRange());
  console.log("Top-left RGBA:", mapped[0], mapped[1], mapped[2], mapped[3]);
}
readPNG_RGBA("FurBaseColorAlpha.png");
</script></body></html>
but that too fails to get me the brown RGB pixels in the alpha area: it prints Top-left RGBA: 255 255 255 1 instead of the expected 130 101 73 1. Does anyone know how to properly load a PNG image containing an alpha channel using browser APIs, so that the GLTF CompareAlphaCoverage test renders correctly? It seems like I might be missing something simple here, but I can't quite figure out what.
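(As an aside on step 5 of the WebGPU version above: the `(w * 4 + 255) & ~255` expression implements WebGPU's rule that bytesPerRow in texture/buffer copies must be a multiple of 256 bytes. A minimal sketch of the same rounding as a standalone helper; the function name is mine, not a WebGPU API:)

```javascript
// WebGPU requires bytesPerRow in copyTextureToBuffer / copyBufferToTexture
// to be a multiple of 256 bytes. This helper (a hypothetical name, not part
// of the WebGPU API) rounds a tightly packed row size up to that alignment.
function alignBytesPerRow(width, bytesPerPixel = 4) {
  const unpadded = width * bytesPerPixel;
  return Math.ceil(unpadded / 256) * 256;
}

console.log(alignBytesPerRow(64)); // 256  (64 * 4 = 256, already aligned)
console.log(alignBytesPerRow(65)); // 512  (65 * 4 = 260, rounds up)
// Note: when reading the mapped buffer back, pixel (x, y) lives at
// y * bytesPerRow + x * 4, so rows after the first must skip the padding.
```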
For your first example, the problem is that you're loading through the canvas, and the canvas premultiplies values. The value in the file you were checking is 130 101 73 1. That value gets premultiplied when it's put into the canvas, so it becomes 1 1 1 1. It's then un-premultiplied when you call getImageData, and the result is 255 255 255 1. For your second example, you can't use copyExternalImageToTexture unless you set the RENDER_ATTACHMENT usage on the destination texture. You should have gotten an error. With that added it works for me: https://jsgist.org/?src=e9261816382d2637f29e8e7f105c1c40 It's strange that you saw 255 255 255 1, because since it failed you should have gotten 0 0 0 0 (which is what I got before I added the RENDER_ATTACHMENT usage). You generally shouldn't load through the canvas.
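The information loss from the canvas round trip can be reproduced with plain arithmetic. This is a sketch assuming the canvas backing store keeps pixels premultiplied at 8 bits per channel with round-to-nearest (real browsers may round slightly differently, but the loss at alpha = 1 is the same):

```javascript
// Simulates the canvas premultiply / un-premultiply round trip at 8 bits.
// Assumption: premultiplied storage with round-to-nearest; browsers may
// round differently, but the destruction of RGB data at alpha=1 is the same.
function premultiply(channel, alpha) {
  return Math.round(channel * alpha / 255);
}
function unpremultiply(channel, alpha) {
  return alpha === 0 ? 0 : Math.min(255, Math.round(channel * 255 / alpha));
}

const alpha = 1;   // nearly transparent pixel from the fur texture
const red = 130;   // the "brown" red channel from the file
const stored = premultiply(red, alpha);        // 130 * 1/255 ≈ 0.51 → 1
const readBack = unpremultiply(stored, alpha); // 1 * 255/1 → 255
console.log(stored, readBack); // 1 255
```

Only 255 distinct premultiplied values exist per channel at alpha = 1 after quantization, so almost all of the original RGB information is unrecoverable once it has passed through the canvas.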
Of course there might be cases where using the canvas is appropriate, maybe you want to manipulate the image beforehand or something. But, in general, in most cases, it's best to use copyExternalImageToTexture directly.
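To spell out the fix: copyExternalImageToTexture requires the destination texture's usage to include both COPY_DST and RENDER_ATTACHMENT. A sketch of the corrected descriptor; GPUTextureUsage is stubbed here with the numeric flag values from the WebGPU spec so the snippet runs outside a browser:

```javascript
// copyExternalImageToTexture requires the destination texture to have both
// COPY_DST and RENDER_ATTACHMENT usage. GPUTextureUsage is stubbed with the
// spec-defined flag values so this flag arithmetic runs outside a browser.
const GPUTextureUsage = {
  COPY_SRC: 0x01,
  COPY_DST: 0x02,
  TEXTURE_BINDING: 0x04,
  STORAGE_BINDING: 0x08,
  RENDER_ATTACHMENT: 0x10,
};

const usage = GPUTextureUsage.COPY_DST |
              GPUTextureUsage.COPY_SRC |
              GPUTextureUsage.RENDER_ATTACHMENT;

// In the browser this replaces the descriptor in step 3 of the question:
// const texture = device.createTexture({
//   size: [w, h], format: "rgba8unorm-srgb", usage
// });
console.log("0x" + usage.toString(16)); // 0x13
```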
@juj to work around the image import issues in Safari you could use GPUQueue.writeTexture directly, but we would like to fix this issue at some point.
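To illustrate the suggested workaround: GPUQueue.writeTexture uploads raw bytes you decoded yourself, bypassing the browser's image import path entirely. A sketch under the assumption that the pixel bytes come from a JS-side PNG decoder producing non-premultiplied RGBA (the decoder itself is not shown); only the data-layout arithmetic runs outside a browser:

```javascript
// Sketch of the writeTexture workaround: upload raw RGBA bytes decoded in
// JS (e.g. by a pure-JS PNG decoder), skipping createImageBitmap and the
// canvas entirely. Unlike buffer copies, writeTexture does not require
// 256-byte row alignment, so a tightly packed layout is fine.
function rgbaLayout(w, h) {
  return { offset: 0, bytesPerRow: w * 4, rowsPerImage: h };
}

const w = 2, h = 2;
const pixels = new Uint8Array([
  130, 101, 73, 1,   255, 255, 255, 255, // row 0
  0, 0, 0, 0,        130, 101, 73, 255,  // row 1
]);
console.log(pixels.byteLength === w * h * 4); // true

// In the browser:
// device.queue.writeTexture({ texture }, pixels, rgbaLayout(w, h), [w, h]);
```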
Thanks for triaging and for the links to the browser bug trackers, much appreciated. @mwyrzykowski I would like to expand our CI to cover Safari as well. Currently https://webkit.org/b/219005 is another issue I ran into with functionality that is missing in Safari; it would be useful to get that resolved, so that WebGPU-rendered content can be guaranteed to render 1:1 without resampling aliasing issues.



For your first example, through the canvas. The issue is the canvas premultiplies values
The value in the file you were checking is 130 101 73 1. That value gets premultiplied when put in the canvas so it becomes 1 1 1 1. It's then un-premultiplied when you call getImageData, the result is 255 255 255 1.
For your second example you can't use
copyExternalImageToTextureunless you set theRENDER_ATTACHMENTusage. You should have gotten an error.With that added it works for me
https://jsgist.org/?src=e9261816382d2637f29e8e7f105c1c40
It's strange that you saw 255, 255, 255, 1 because since it failed you should have gotten 0,0,0,0 (which is what I got before I added
RENDER_ATTACHMENTusage.Y…