In a unique case I'm dealing with, certain validation logic needs to occur in the UI for specific business reasons[...]. The array can contain anywhere from a handful of items up to hundreds of thousands (1-400K). This frontend operation is Angular-based.
One of the initial steps involves checking for duplicates within the array and storing them separately[...]. To achieve this, the following code snippet is used:
validateTargets(targets: string[]): ValidationResultObject[] {
  let result: ValidationResultObject[] = [];
  const dups: string[] = [];
  // O(n^2): indexOf rescans the array from the start for every element
  const uniques = targets.filter((item, index) => {
    if (targets.indexOf(item) === index) {
      return true; // first occurrence: keep it
    }
    dups.push(targets[index]); // later occurrence: record the duplicate
    return false;
  });
  // additional validation procedures are carried out here
  return result;
}
The main issue arises when processing an array above 50K items: the UI noticeably freezes. As a temporary fix, the call has been wrapped in a setTimeout, allowing a spinner to indicate activity while the page hangs :)
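For context, the workaround is roughly the following sketch (the helper and parameter names here are mine, not the real component's): flip a loading flag, then defer the heavy call by one tick so the spinner has a chance to render before the long synchronous run blocks the main thread.

```typescript
// Sketch of the temporary workaround, under the assumption that `setLoading`
// toggles an *ngIf-controlled spinner in the template.
function runWithSpinner<T>(
  work: () => T,
  setLoading: (on: boolean) => void
): Promise<T> {
  setLoading(true); // spinner becomes visible on the next render
  return new Promise(resolve => {
    setTimeout(() => {
      const result = work(); // the UI still freezes here, but the spinner is up
      setLoading(false);
      resolve(result);
    }, 0);
  });
}
```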
Various suggestions exist on how to structure code to ensure UI responsiveness, but my scenario presents a challenge due to duplicate handling.
One potential solution considered was breaking the array into chunks and running the Array.filter step on each chunk in a loop with setTimeout for UI responsiveness. However, eventually comparing these chunks against each other would only prolong the process! And with some browser limitations within the organization, experimenting with Web Workers isn't currently feasible.
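For what it's worth, the chunked idea I considered looks roughly like the sketch below. It sidesteps the chunk-vs-chunk comparison problem by accumulating counts in a shared Map instead of filtering with indexOf (so the names and the count-based approach are my own variation, not the code above; note it also collects each duplicated value once rather than every duplicate occurrence):

```typescript
// Process the array in slices, yielding to the event loop between slices so
// the UI can repaint. A shared Map of occurrence counts means chunks never
// need to be compared against each other afterwards.
function findDuplicatesChunked(
  targets: string[],
  chunkSize = 10000
): Promise<{ uniques: string[]; dups: string[] }> {
  const counts = new Map<string, number>();
  let index = 0;

  return new Promise(resolve => {
    function processChunk(): void {
      const end = Math.min(index + chunkSize, targets.length);
      for (; index < end; index++) {
        const item = targets[index];
        counts.set(item, (counts.get(item) ?? 0) + 1);
      }
      if (index < targets.length) {
        setTimeout(processChunk, 0); // yield so the spinner keeps animating
      } else {
        const uniques: string[] = [];
        const dups: string[] = [];
        for (const [item, n] of counts) {
          (n > 1 ? dups : uniques).push(item);
        }
        resolve({ uniques, dups });
      }
    }
    processChunk();
  });
}
```

Each pass over a chunk is O(chunk size) with O(1) Map lookups, so the whole thing is linear, but I'm not sure the repeated setTimeout hops are acceptable latency-wise for 400K items.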
If anyone has insight or suggestions on how to address this dilemma, it would be greatly appreciated. Unfortunately, migrating this functionality to the backend is not an option :(
Best Regards