It seems that your struggle lies in attempting to blend imperative and reactive approaches. It is often more effective to stick with just one approach rather than combining both. When dealing with complex asynchronous operations, following a reactive programming paradigm usually yields better results.
Here is an alternative approach:
```typescript
import { Injectable } from '@angular/core';
import {
  HttpEvent,
  HttpHandler,
  HttpInterceptor,
  HttpRequest,
} from '@angular/common/http';
import {
  EMPTY,
  Observable,
  Subject,
  concatMap,
  defer,
  dematerialize,
  filter,
  first,
  map,
  materialize,
  merge,
  share,
} from 'rxjs';

interface Intercepted {
  request: HttpRequest<unknown>;
  next: HttpHandler;
}

@Injectable()
export class QueueRequestsInterceptor implements HttpInterceptor {
  private intercepted$$ = new Subject<Intercepted>();

  private queue$ = this.intercepted$$.pipe(
    // Execute the queued requests one at a time, in order.
    concatMap(({ request, next }) =>
      next.handle(request).pipe(
        // Wrap every emission (errors included) into a Notification
        // so a failing request cannot tear down the queue.
        materialize(),
        map((materializedResponse) => ({
          request,
          materializedResponse,
        }))
      )
    ),
    // One shared subscription to the queue for all intercepted requests.
    share()
  );

  public intercept(
    request: HttpRequest<unknown>,
    next: HttpHandler
  ): Observable<HttpEvent<unknown>> {
    return merge(
      this.queue$.pipe(
        // Only react to the response belonging to this request.
        filter((res) => res.request === request),
        first(),
        map(({ materializedResponse }) => materializedResponse),
        // Unwrap the Notification, re-throwing the error if there was one.
        dematerialize()
      ),
      // Subscribe to the queue first, then push the request into it.
      defer(() => {
        this.intercepted$$.next({ request, next });
        return EMPTY;
      })
    );
  }
}
```
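For completeness, here is a sketch of how such an interceptor is typically registered, assuming a classic NgModule setup (in a standalone app you would use `provideHttpClient(withInterceptorsFromDi())` instead):

```typescript
import { NgModule } from '@angular/core';
import { HTTP_INTERCEPTORS, HttpClientModule } from '@angular/common/http';

@NgModule({
  imports: [HttpClientModule],
  providers: [
    // multi: true appends the interceptor to the chain instead of
    // replacing any other registered interceptors.
    {
      provide: HTTP_INTERCEPTORS,
      useClass: QueueRequestsInterceptor,
      multi: true,
    },
  ],
})
export class AppModule {}
```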
To start off, I create a subject into which intercepted requests are pushed. The queue is then built with `concatMap` to ensure the requests are executed sequentially. If a request fails, `materialize` prevents the entire queue from failing: it wraps every emission, errors included, into a `Notification` object, so the queue observable itself never throws. `map` keeps both the request (for filtering later on) and the materialized response. Finally, `share` prevents a separate subscription to the queue for each intercepted request.
In the `intercept` method, the `merge` and `defer` functions come into play. We listen to the queue and filter for the current request only. `first` completes the stream after that single response; without it, the subscription would stay open forever and leak. In case of a failed HTTP call, `dematerialize` restores the original behavior by unwrapping the notification and re-throwing the error. This ensures that the app receives the appropriate response without the queue itself being disrupted.
You may wonder why I did not simply write:
```typescript
this.intercepted$$.next({ request, next });
return this.queue$.pipe(
  filter((res) => res.request === request),
  map(({ materializedResponse }) => materializedResponse),
  dematerialize()
);
```
While this might work in most scenarios, a request could resolve or fail synchronously, before the returned observable has even been subscribed, and its response would then be missed. To cover all cases, I use `merge` to start listening to the queue first and only then subscribe to the `defer`, which triggers the side effect of pushing the current request into the queue.
I have created a live demo with a second interceptor that mocks the HTTP responses. Each request is delayed by 3 seconds, which produces the following output:
Initially, we observe:

```
Putting request http://mocked-request-anyway/0 in queue
Putting request http://mocked-request-anyway/1 in queue
```

After 3 seconds:

```
Received response for request http://mocked-request-anyway/0
{count: 1}
```

After 3 more seconds:

```
Received response for request http://mocked-request-anyway/1
{count: 2}
```