I'm currently refactoring a function that uses array.map to iterate over an array and return an array of promises. The task at hand is to switch from .map to .reduce. However, after implementing .reduce without altering the business logic, I noticed a significant increase in processing time: the asynchronous call for each element ran one after another instead of concurrently.
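For context, I believe the version that got slow looked like the sketch below (processUsersSequentially is just an illustrative name): because the reducer callback is async, the accumulator is a promise that has to be awaited, and that await forces the calls to run one at a time.

// Minimal sketch of the slow pattern, assuming it mirrors my first attempt.
async function processUsersSequentially(): Promise<IUser[]> {
  return users.reduce(async (accPromise, currUser) => {
    const acc = await accPromise;                  // waits for every previous iteration
    const user = await calculateBalance(currUser); // ~5s each, strictly sequential
    return [...acc, user];
  }, Promise.resolve([] as IUser[]));
}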
Is there a way to use reduce to build an array of promises that can later be resolved efficiently with Promise.all()?
In an attempt to address this, I ran some experiments; the snippet below shows the solutions I have tried along with the desired outcome. If anyone could demonstrate how to achieve this, it would be greatly appreciated.
interface IUser { name: string; balance: number | undefined; }

const user1: IUser = { name: 'user1', balance: undefined }
const user2: IUser = { name: 'user2', balance: undefined }
const user3: IUser = { name: 'user3', balance: undefined }
const users = [user1, user2, user3]
// Simulates a slow asynchronous lookup (~5s) that fills in a user's balance.
async function calculateBalance(user: IUser): Promise<IUser> {
  await new Promise((resolve) => setTimeout(resolve, 5000));
  user.balance = Math.floor(Math.random() * 5000) + 1000;
  return user;
}
async function processUsers() {
  // With an untyped [] initial value, TypeScript infers the accumulator as never[],
  // which is what the attempts below trip over.
  return users.reduce(async (acc, currUser) => {
    return [...acc, calculateBalance(currUser)] // attempt 1: not working
    // return acc.push(calculateBalance(currUser)) // attempt 2: Argument of type 'Promise<IUser>' is not assignable to parameter of type 'never'
    // return acc.concat(calculateBalance(currUser)) // attempt 3: same error as above
  }, [])
}
// Note: top-level await requires an ES module context.
const processedUsersArray = await processUsers();
await Promise.all(processedUsersArray) // desired: start all calls up front and resolve them together for performance
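To make the desired outcome concrete, here is the shape I'm after, sketched with an explicitly typed accumulator (collectBalancePromises is an illustrative name, and I'm assuming the generic parameter on reduce is the right way to type the initial value). The reducer stays synchronous and only collects promises, so every calculateBalance call starts immediately:

function collectBalancePromises(): Promise<IUser>[] {
  return users.reduce<Promise<IUser>[]>((acc, currUser) => {
    acc.push(calculateBalance(currUser)); // kick the call off, but don't await it here
    return acc;
  }, []);
}

const settledUsers = await Promise.all(collectBalancePromises()); // ~5s total instead of ~15s

Is this the idiomatic way to type the accumulator, or is there a better approach?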