The code below runs without issue locally, but after deploying to Vercel it intermittently hangs. Monitoring Redis in the Upstash console shows that inserts sometimes succeed and sometimes fail: the code often passes the await statement normally, but at other times it gets stuck there.
The setup uses Upstash Redis on the Pay as You Go plan and Vercel on the Pro plan.
Upstash's QStash shows the same symptom: some triggers succeed while others hang on the initial call. All of these operations work reliably when run locally.
import { Redis } from '@upstash/redis'

const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL!,
  token: process.env.UPSTASH_REDIS_REST_TOKEN!,
  enableAutoPipelining: true,
  retry: {
    retries: 5,
    backoff: (retryCount) => Math.exp(retryCount) * 50,
  },
})
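As an aside (my own illustration, not part of the original code): with this retry configuration, the cumulative backoff alone can add over ten seconds of waiting, which is enough to exceed a typical serverless function timeout and make a failing-but-retrying call look like a hang:

```typescript
// Illustration only: the backoff above waits e^retryCount * 50 ms before each retry.
const backoff = (retryCount: number): number => Math.exp(retryCount) * 50;

// Delays for retries 1..5, in milliseconds.
const delays = [1, 2, 3, 4, 5].map(backoff);
const totalMs = delays.reduce((sum, d) => sum + d, 0);
// totalMs is roughly 11.7 seconds of pure waiting, on top of the requests themselves.
```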
const result = await Promise.all([
  redis.hset(taskId + '_info', { platform, searchId, collectionId }),
  redis.rpush(taskId, ...keywords),
  redis.expire(taskId, TASK_EXPIRY_TIME),
  redis.expire(taskId + '_info', TASK_EXPIRY_TIME),
]);
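One diagnostic worth trying (a sketch of my own; `withTimeout` is a hypothetical helper, not part of @upstash/redis): race each command against a timeout so a stuck REST call surfaces as an error instead of a silent hang:

```typescript
// Hypothetical helper (not part of @upstash/redis): fail loudly instead of hanging.
function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms} ms`)), ms);
  });
  // Clear the timer either way so no stray rejection fires after the race settles.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch against the calls from the question:
// const result = await Promise.all([
//   withTimeout(redis.hset(taskId + '_info', { platform, searchId, collectionId }), 5000, 'hset'),
//   withTimeout(redis.rpush(taskId, ...keywords), 5000, 'rpush'),
// ]);
```

If the error fires, the label tells you which command is stalling, which narrows the problem down to a specific request rather than "the function hangs".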
Several fixes have been attempted:
- Increasing Vercel's function execution time: raising the function timeout in Vercel's settings did not stop the hanging.
- Using a Redis pipeline: batching the commands into a single request, as shown below:
const pipeline = redis.pipeline();
pipeline.hset(taskId + '_info', { /* data */ });
pipeline.rpush(taskId, ...keywords);
pipeline.expire(taskId, TASK_EXPIRY_TIME);
pipeline.expire(taskId + '_info', TASK_EXPIRY_TIME);
const results = await pipeline.exec();
However, this approach still produced the same hanging behavior in some cases.
- Switching to ioredis: replacing Upstash's Redis client with the ioredis library showed the same inconsistencies in the serverless environment.
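One pattern worth ruling out (my own simulation; the function names are invented and no real Redis is involved): on serverless platforms, any promise still pending when the handler returns can be frozen along with the invocation, which looks exactly like an intermittent hang, since the write completes only if the instance is later thawed. A minimal sketch of the difference:

```typescript
// Simulation only: invented names, no real network calls.
let written = false;

async function fakeRedisWrite(): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, 50)); // pretend network latency
  written = true;
}

// Fire-and-forget: the response is returned while the write is still pending.
// A serverless platform may freeze the invocation before the write finishes.
async function badHandler(): Promise<string> {
  void fakeRedisWrite(); // not awaited
  return "response sent";
}

// Awaited: the write is guaranteed to complete before the response goes out.
async function goodHandler(): Promise<string> {
  await fakeRedisWrite();
  return "response sent";
}
```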