My JavaScript skills are a bit rusty, especially when it comes to bit arithmetic.
The task at hand is converting a Uint8Array into 11-bit numbers. I need these numbers to index a BIP39 wordlist, so that I can convert a libsodium box private key into a mnemonic. This is all part of a small peer-to-peer chat app I'm working on.
Here's my plan:
- Get a Uint8Array from libsodium.crypto_box_keypair()
- Convert the Uint8Array into a 256-bit (boolean) array
- Split the 256-bit array into 11-bit buckets (a 2D array of roughly 24 x 11 bits)
- Convert each 11-bit array into a base-10 number (ranging from 0 to 2047)
I believe steps 2, 3, and 4 can be combined into a single loop.
All this is meant to transform a Uint8Array into an array of 11-bit numbers efficiently; efficient for the computer, at least, since it has been quite a workout for me personally.
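Steps 2-4, combined into one loop, can be sketched like this (a sketch only, assuming MSB-first bit order; the function name is a placeholder, and trailing bits that don't fill a full 11-bit group are simply dropped):

```typescript
// A sketch of steps 2-4 combined (my own, untested against a real
// wordlist): read bits MSB-first and emit an 11-bit number each time
// 11 bits have accumulated. Trailing bits that don't fill a full
// group are dropped here.
function bytesTo11Bit(input: Uint8Array): number[] {
    const result: number[] = [];
    let acc = 0;   // bits collected so far, packed into an integer
    let bits = 0;  // number of bits currently held in acc
    for (const byte of input) {
        acc = (acc << 8) | byte; // append this byte's 8 bits on the right
        bits += 8;
        while (bits >= 11) {
            bits -= 11;
            result.push((acc >> bits) & 0x7ff); // take the top 11 bits
            acc &= (1 << bits) - 1;             // keep only the remainder
        }
    }
    return result;
}
```

Since `acc` never holds more than 18 bits, plain 32-bit bitwise operations are safe here.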
Currently, I have this code snippet which attempts to create the 11-bit buckets:
// inspired by: https://github.com/pvorb/node-md5/issues/25
export function toUint11Array(input: Uint8Array): boolean[][] {
    let result: boolean[][] = [];
    let currentChunk: boolean[] = [];
    input.forEach(byte => {
        for (var j = 7; j >= 0; j--) {
            var b = ((byte >> j) & 0x1) > 0;
            if (currentChunk.length === 11) {
                result.push(currentChunk);
                currentChunk = [];
            }
            currentChunk.push(b);
        }
    });
    return result;
}
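One detail worth flagging independent of the endianness question: the snippet never pushes the last chunk after the loop, so any trailing bits (fewer than 11) are silently dropped. A minimal variant with that flush added, otherwise keeping the same MSB-first reading, might look like this:

```typescript
// Same shape as the snippet above, with one fix that is independent of
// the endianness question: flush the final (possibly partial) chunk
// after the loop, otherwise trailing bits are silently dropped.
function toUint11Array(input: Uint8Array): boolean[][] {
    const result: boolean[][] = [];
    let currentChunk: boolean[] = [];
    input.forEach(byte => {
        for (let j = 7; j >= 0; j--) {
            currentChunk.push(((byte >> j) & 0x1) > 0);
            if (currentChunk.length === 11) {
                result.push(currentChunk);
                currentChunk = [];
            }
        }
    });
    if (currentChunk.length > 0) {
        result.push(currentChunk); // flush the remainder
    }
    return result;
}
```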
When testing with 2048, I get two 11-bit arrays as expected, but the content and order are not what I anticipated.
[
    false, false, false, false,
    false, false, false, false,
    false, false, false
],
[
    false, true, false, false,
    false, false, false, false,
    false, false, false
]
For 2048, the 12th binary digit from the right should be 1, so something seems off in terms of endianness, plus a possible off-by-one issue. I run into similar confusion when testing with 4096.
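As a sanity check on the bit layout (assuming big-endian byte order and MSB-first bits, which is how the snippet reads them), 2048 as the two bytes [8, 0] looks like this:

```typescript
// Sanity check of the bit layout (assuming big-endian byte order and
// MSB-first bits, which is how the snippet reads them): 2048 as [8, 0].
const bytes = new Uint8Array([8, 0]);
const bits = Array.from(bytes)
    .map(b => b.toString(2).padStart(8, "0"))
    .join(""); // "0000100000000000"
// The single 1 sits 12 digits from the right, as expected for 2^11:
console.log(bits.length - bits.indexOf("1")); // → 12
```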
Update
The endianness concern has been resolved, thanks to the community's support.
Update 2
Kudos to @harold, who helped solve the problem. Initial tests look good:
const numbers = {
    ['32']: new Uint8Array([32]),
    ['64']: new Uint8Array([64]),
    ['2048']: new Uint8Array([8, 0]),
    ['4096']: new Uint8Array([16, 0]),
    ['7331']: new Uint8Array([28, 163])
}
test('toUint11Array | converts | 32 (8 bits)', function(assert) {
    const result = toUint11Array(numbers['32']);
    const expected = [32];
    assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 2048 (12 bits)', function(assert) {
    const result = toUint11Array(numbers['2048']);
    const expected = [8, 0];
    assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 4096 (13 bits)', function(assert) {
    const result = toUint11Array(numbers['4096']);
    const expected = [16, 0];
    assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 7331 (13 bits)', function(assert) {
    const result = toUint11Array(numbers['7331']);
    const expected = [3, 1187];
    assert.deepEqual(result, expected);
});
While the first three tests pass, the last one does not produce the expected output when converting Uint8Array([28, 163]). Further refinement is needed to match the desired outcome.
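For what it's worth, one self-consistent reading that produces [3, 1187] for [28, 163] is to treat the whole array as a single big-endian integer and peel 11-bit groups off the low end (a sketch of my own, not @harold's solution). Under that reading, though, [8, 0] becomes [1, 0] rather than [8, 0], so the expectations in the tests above imply two different alignments, which may be where the remaining mismatch comes from:

```typescript
// A sketch (mine, not the accepted fix): treat the whole Uint8Array as
// one big-endian integer, then peel 11-bit groups off the low end.
// BigInt is used because a real key is 256 bits.
function toUint11ArrayRightAligned(input: Uint8Array): number[] {
    let value = 0n;
    for (const byte of input) {
        value = (value << 8n) | BigInt(byte);
    }
    const result: number[] = [];
    while (value > 0n) {
        result.unshift(Number(value & 0x7ffn)); // low 11 bits
        value >>= 11n;
    }
    return result.length > 0 ? result : [0];
}

// Under this interpretation [28, 163] = 7331 -> [3, 1187],
// but [8, 0] = 2048 -> [1, 0], not [8, 0].
```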