Your limitations are self-imposed
You correctly noticed that the types do not line up: both compose(f,g) and compose(g,f) require a [k,v] pair as input, but neither of them outputs one, hence the mismatch.
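To make the mismatch concrete (assuming compose is ordinary right-to-left function composition, and reusing the pair helpers that appear a little further down), we might see something like this:

const compose = (f, g) => x => f (g (x))

const flip = ([ key, value ]) => [ value, key ]
const pairToObject = ([ key, value ]) => ({ [key]: value })

compose (pairToObject, flip) (['one', 1]) // => { 1: 'one' }, an object, not a [k,v] pair
compose (flip, pairToObject) (['one', 1]) // => TypeError, because pairToObject's output is not a pair that flip can destructure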
Transducers, however, are versatile precisely because they need no knowledge of the data types they handle.
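The examples below assume a Transducer builder roughly along these lines. This is only a minimal sketch of the chainable interface they rely on (map, filter, log, reduce), written here as an assumption for illustration, not necessarily the original implementation:

const identity = x => x

// hypothetical sketch: each chained step wraps the reducer transform `t`;
// nothing here knows or cares what type of value flows through
const Transducer = (t = identity) =>
  ({ map: f =>
       Transducer (k => t ((acc, x) => k (acc, f (x))))
   , filter: f =>
       Transducer (k => t ((acc, x) => f (x) ? k (acc, x) : acc))
   , log: label =>
       Transducer (k => t ((acc, x) => (console.log (label, x), k (acc, x))))
     // one possible reduce; alternatives are discussed at the end of this section
   , reduce: (f = (a, b) => a, acc = null, xs = []) => {
       const step = t (f)
       let result = acc
       for (const x of xs)
         result = step (result, x)
       return result
     }
   })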
const flip = ([ key, value ]) =>
  [ value, key ]

const double = ([ key, value ]) =>
  [ key, value * 2 ]

const pairToObject = ([ key, value ]) =>
  ({ [key]: value })

const entriesToObject = (iterable) =>
  Transducer ()
    .log ('begin:')
    .map (double)
    .log ('double:')
    .map (flip)
    .log ('flip:')
    .map (pairToObject)
    .log ('obj:')
    .reduce (Object.assign, {}, Object.entries (iterable))
console.log (entriesToObject ({one: 1, two: 2}))
// begin: [ 'one', 1 ]
// double: [ 'one', 2 ]
// flip: [ 2, 'one' ]
// obj: { 2: 'one' }
// begin: [ 'two', 2 ]
// double: [ 'two', 4 ]
// flip: [ 4, 'two' ]
// obj: { 4: 'two' }
// => { 2: 'one', 4: 'two' }
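The final object accumulates because the reducer here, Object.assign, merges each single-entry object into the accumulator:

// each reduction step is effectively
Object.assign ({ 2: 'one' }, { 4: 'two' }) // => { 2: 'one', 4: 'two' }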
We can just as easily work with a boring array of numbers, or, as the example after this one shows, even return a Set.
const main = nums =>
  Transducer ()
    .log ('begin:')
    .filter (x => x > 2)
    .log ('greater than 2:')
    .map (x => x * x)
    .log ('square:')
    .filter (x => x < 30)
    .log ('less than 30:')
    .reduce ((acc, x) => [...acc, x], [], nums)
console.log (main ([ 1, 2, 3, 4, 5, 6, 7 ]))
// begin: 1
// begin: 2
// begin: 3
// greater than 2: 3
// square: 9
// less than 30: 9
// begin: 4
// greater than 2: 4
// square: 16
// less than 30: 16
// begin: 5
// greater than 2: 5
// square: 25
// less than 30: 25
// begin: 6
// greater than 2: 6
// square: 36
// begin: 7
// greater than 2: 7
// square: 49
// => [ 9, 16, 25 ]
Or we can process an input array of objects and return a Set.
const main2 = (people = []) =>
  Transducer ()
    .log ('begin:')
    .filter (p => p.age > 13)
    .log ('age over 13:')
    .map (p => p.name)
    .log ('name:')
    .filter (name => name.length > 3)
    .log ('name is long enough:')
    .reduce ((acc, x) => acc.add (x), new Set, people)

const data =
  [ { name: "alice", age: 55 }
  , { name: "bob", age: 16 }
  , { name: "alice", age: 12 }
  , { name: "margaret", age: 66 }
  , { name: "alice", age: 91 }
  ]
console.log (main2 (data))
// begin: { name: 'alice', age: 55 }
// age over 13: { name: 'alice', age: 55 }
// name: alice
// name is long enough: alice
// begin: { name: 'bob', age: 16 }
// age over 13: { name: 'bob', age: 16 }
// name: bob
// begin: { name: 'alice', age: 12 }
// begin: { name: 'margaret', age: 66 }
// age over 13: { name: 'margaret', age: 66 }
// name: margaret
// name is long enough: margaret
// begin: { name: 'alice', age: 91 }
// age over 13: { name: 'alice', age: 91 }
// name: alice
// name is long enough: alice
// => Set { 'alice', 'margaret' }
Choosing how to implement reduce highlights the trade-offs between functional programming and practical functional programs. There are at least three options, sketched just after this list:

1. A thin wrapper around the native Array.prototype.reduce, which restricts input to arrays.
2. A recursive implementation that works on any iterable, but can overflow the stack on large inputs.
3. A more imperatively styled loop that is stack-safe, trading some elegance for practicality.
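Hedged sketches of those three strategies might look like this (illustrative names and code, not the original):

// 1. thin wrapper over Array.prototype.reduce: arrays only
const reduceArray = (f, acc, xs) =>
  xs.reduce (f, acc)

// 2. recursion over the iterator protocol: any iterable, but deep inputs can overflow the stack
const reduceRecur = (f, acc, it) => {
  const { value, done } = it.next ()
  return done ? acc : reduceRecur (f, f (acc, value), it)
}

const reduceIterable = (f, acc, xs) =>
  reduceRecur (f, acc, xs[Symbol.iterator] ())

// 3. imperative loop: any iterable, stack-safe, less elegant
const reduceLoop = (f, acc, xs) => {
  let result = acc
  for (const x of xs)
    result = f (result, x)
  return result
}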
In the end, understanding these trade-offs allows you to make informed decisions when designing your own modules and utilities.
One simple option is to use Array.from to convert any iterable into an array before handing it to Array.prototype.reduce. This keeps the implementation small, at the cost of building an intermediate array.
const Transducer = (t = identity) =>
  ({ ... // other methods (map, filter, log) unchanged
   , reduce: (f = (a, b) => a, acc = null, xs = []) =>
       Array.from (xs)
         .reduce (t (f), acc)
   })
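With that reduce, any iterable can be consumed, so (reusing the hypothetical sketch from earlier) even a generator works as input, at the price of one intermediate array copy:

const countdown = function* () { yield 3; yield 2; yield 1 }

console.log (
  Transducer ()
    .map (x => x * 10)
    .reduce ((acc, x) => [...acc, x], [], countdown ())
)
// => [ 30, 20, 10 ]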
By exploring different possibilities, you can find the right balance between flexibility, performance, and simplicity in your functional programming endeavors.