Is it safe to decode a UTF-8 string that has been split into arbitrary byte chunks, decoding chunk by chunk?
And what about an arbitrary encoding?
The scenario involves the following method:
async getFileAsync(fileName: string, encoding: string): Promise<string>
{
const textDecoder = new TextDecoder(encoding);
const response = await fetch(fileName);
console.log(response.ok);
console.log(response.status);
console.log(response.statusText);
const reader = response.body!.getReader(); // body is only null for bodyless responses
let result: ReadableStreamReadResult<Uint8Array>;
let chunks: Uint8Array[] = [];
do
{
    result = await reader.read();
    // The final read reports done with an undefined value, so guard before pushing
    if (result.value)
    {
        chunks.push(result.value);
        // Decoding each chunk in isolation, just for logging:
        let partN = textDecoder.decode(result.value);
        console.log("result: ", result.value, partN);
    }
} while (!result.done)
let chunkLength: number = chunks.reduce((total, chunk) => total + chunk.length, 0);
let mergedArray = new Uint8Array(chunkLength);
let currentPosition = 0;
for (let i = 0; i < chunks.length; ++i)
{
    mergedArray.set(chunks[i], currentPosition);
    currentPosition += chunks[i].length;
}
let file: string = textDecoder.decode(mergedArray);
return file;
} // End Function getFileAsync
Now, my question is: when dealing with an arbitrary encoding, is it safe to decode each chunk as it arrives, collecting strings instead of bytes, like this:
result = await reader.read();
// would this be safe?
chunks.push(textDecoder.decode(result.value));
By "safe" I mean: will concatenating the decoded chunks yield the correctly decoded overall string?
I suspect it will not, but I would appreciate confirmation.
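To illustrate the suspicion with a minimal, self-contained sketch (the string "héllo" here is just an example; `TextDecoder` and `TextEncoder` are the standard WHATWG Encoding APIs, available in browsers and Node): a two-byte UTF-8 character split across a chunk boundary is mangled when each chunk is decoded independently, but survives when the decoder is told the input is streamed.

```typescript
// "é" is two bytes in UTF-8 (0xC3 0xA9), so this split lands mid-character.
const bytes = new TextEncoder().encode("héllo");
const chunk1 = bytes.slice(0, 2); // ends with the lead byte 0xC3
const chunk2 = bytes.slice(2);    // starts with the continuation byte 0xA9

// Decoding each chunk in isolation: the incomplete sequence at the end of
// chunk1 and the orphaned continuation byte at the start of chunk2 each
// become a U+FFFD replacement character, so the result is not "héllo".
const naive = new TextDecoder("utf-8").decode(chunk1)
            + new TextDecoder("utf-8").decode(chunk2);

// With { stream: true }, one decoder instance buffers the incomplete
// sequence until the next call; the final call (no options) flushes it.
const streaming = new TextDecoder("utf-8");
const safe = streaming.decode(chunk1, { stream: true })
           + streaming.decode(chunk2);
console.log(naive, safe); // only `safe` equals "héllo"
```

The same buffering applies to any multi-byte encoding that `TextDecoder` supports, which is why per-chunk `decode()` calls without `{ stream: true }` are unsafe in general.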
Since I have to wait until the end to merge the array of chunks anyway, I thought I could simply use:
let responseBuffer: ArrayBuffer = await response.arrayBuffer();
let text: string = textDecoder.decode(responseBuffer);
instead.
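For completeness, if incremental decoding were actually needed (e.g. to process text while it downloads), `TextDecoder`'s `stream` option is designed for exactly that. A hypothetical helper (`decodeStream` is my own name, not part of the code above) sketching the approach:

```typescript
// Decodes a ReadableStream<Uint8Array> chunk by chunk without buffering
// all the bytes first. Incomplete multi-byte sequences at chunk boundaries
// are carried over by the decoder thanks to { stream: true }.
async function decodeStream(stream: ReadableStream<Uint8Array>, encoding: string): Promise<string>
{
    const textDecoder = new TextDecoder(encoding);
    const reader = stream.getReader();
    let text = "";
    let result: ReadableStreamReadResult<Uint8Array>;
    do
    {
        result = await reader.read();
        // On the final read, stream is false, which flushes any pending bytes.
        text += textDecoder.decode(result.value, { stream: !result.done });
    } while (!result.done);
    return text;
}
```

With this, the body of `getFileAsync` would reduce to `return decodeStream(response.body!, encoding);`, and no merge step is needed.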