Every major web browser now has built-in support for the DecompressionStream API. Even so, I'm struggling to use it with fetch() to decompress a gzip file directly in the browser.
The code below successfully decompresses a gzipped payload embedded in a base64 data: URL:
const decompress = async (url) => {
  const ds = new DecompressionStream('gzip');
  const response = await fetch(url);
  const blob_in = await response.blob();
  const stream_in = blob_in.stream().pipeThrough(ds);
  const blob_out = await new Response(stream_in).blob();
  return await blob_out.text();
};

decompress(
  'data:application/octet-stream;base64,H4sIAAAAAAAAE/NIzcnJVyjPL8pJAQBSntaLCwAAAA=='
).then((result) => {
  console.log(result);
});
However, when I use the same function on a hello.txt.gz file created with gzip hello.txt on macOS (where hello.txt contains "hello world"), an error is thrown:
decompress('/hello.txt.gz').then((result) => {
  console.log(result);
});
# Firefox
Failed to read data from the ReadableStream: “TypeError: The input data is corrupted: incorrect header check”.
Uncaught (in promise) DOMException: The operation was aborted.
# Chrome
Uncaught (in promise) TypeError: Failed to fetch
# Safari
[Error] Unhandled Promise Rejection: TypeError: TypeError: Failed to Decode Data.
[Error] Unhandled Promise Rejection: TypeError: Failed to Decode Data.
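In case it helps narrow things down, here is a minimal diagnostic sketch I can run (it assumes /hello.txt.gz is served from the same origin; the inspect helper is just for illustration). It only logs the response headers and the first bytes of the body, since a real gzip stream should begin with the magic bytes 0x1f 0x8b:

// Log headers and the leading bytes of the raw response body,
// to see what the server actually delivers before it is piped
// through DecompressionStream.
const inspect = async (url) => {
  const response = await fetch(url);
  console.log('content-type:', response.headers.get('content-type'));
  console.log('content-encoding:', response.headers.get('content-encoding'));
  const buffer = await response.arrayBuffer();
  // A gzip stream starts with the magic bytes 0x1f 0x8b.
  console.log('first bytes:', new Uint8Array(buffer).slice(0, 4));
};

inspect('/hello.txt.gz');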
Edit
A demonstration of the issue can be found at Stackblitz.