Timeouts, often caused by slow third-party API calls, can be solved by offloading work to background functions.
Next.js enables TypeScript and JavaScript developers, from front end to back end, to quickly build fast React applications.
Next.js apps are most often deployed on Serverless platforms such as Vercel or Cloudflare Workers.
While most use cases fit within Serverless execution limits of 30-60 seconds, others, such as third-party integrations or AI features, do not.
Such timeouts are solved by “offloading”: moving the computation out of your Functions/Workers and running it in the background, while the front end polls for the results.
Timeouts happen when a long-running task is happening at the HTTP layer:
Timeouts can be avoided by moving the long-running task out of the HTTP layer:
Having the front end trigger a long-running execution and poll for its results means the task is no longer bound by Serverless (1 min) or HTTP (5 min) limits.
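The trigger-then-poll pattern described above can be sketched as a small framework-agnostic helper. This is a minimal sketch, not Defer's API: `fetchStatus` is a hypothetical callback standing in for whatever endpoint returns the execution's status, and the state names (`"started"`, `"succeed"`, `"failed"`) are assumptions for illustration.

```typescript
type ExecutionState = "created" | "started" | "succeed" | "failed";

interface ExecutionStatus {
  state: ExecutionState;
  result?: unknown;
}

// Poll an execution's status until it completes, fails, or we give up.
async function pollForResult(
  fetchStatus: (id: string) => Promise<ExecutionStatus>,
  executionId: string,
  intervalMs = 500,
  maxAttempts = 60
): Promise<unknown> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await fetchStatus(executionId);
    if (status.state === "succeed") return status.result;
    if (status.state === "failed") throw new Error("background execution failed");
    // wait before polling again
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("polling timed out");
}
```

Because the HTTP response only carries an execution ID, each request stays well under any Serverless or HTTP timeout; only the polling loop spans the task's full duration.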
While it might look complicated to set up, we will see that such a pattern is simple with Defer.
Let's consider the following Next.js app:
.
|-- pages/
| |-- api/
| |-- longRunning.ts
|-- styles/
|-- next-env.d.ts
|-- next.config.js
|-- package.json
|-- tsconfig.json
|-- yarn.lock
with an /api/longRunning Next.js API Route:
```ts
import type { NextApiRequest, NextApiResponse } from "next";

type Response = {
  ret: any;
};

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Response>
) {
  // ... doing some long-running stuff, producing `ret`...
  res.status(200).json({ ret });
}
```
To avoid timeouts, we will move the long-running code in the background.
After setting up a Defer application, let's create a defer/longRunning.ts background function that will contain the long-running code:
```ts
import { defer } from "@defer/client";

async function longRunning() {
  // ... doing some long-running stuff...
}

// wrapping the function with `defer()` turns it into a background function
export default defer(longRunning);
```
Then update our api/longRunning.ts Next.js API Route to trigger this background function and return the execution ID to the front end:
```ts
import type { NextApiRequest, NextApiResponse } from "next";
import longRunning from "../../defer/longRunning";

type Data = {
  id: string;
};

export default async function handler(
  _req: NextApiRequest,
  res: NextApiResponse<Data>
) {
  // calling `longRunning()` triggers its execution on the Defer Platform
  const ret = await longRunning();
  // return the Defer execution ID to the front end
  res.status(200).json(ret);
}
```
Let's finally add a new Next.js API Route enabling the front-end to poll an execution's status and result (/api/longRunning/[id].ts):
```ts
import { type FetchExecutionResponse, getExecution } from "@defer/client";
import type { NextApiRequest, NextApiResponse } from "next";

type Response = {
  res: FetchExecutionResponse;
};

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Response>
) {
  const executionId = req.query.id;
  const ret = await getExecution(executionId as string);
  res.status(200).json({ res: ret });
}
```
You can find a complete Next.js working example here: https://github.com/defer-run/defer.demo/tree/master/nextjs.
Can I offload my Serverless Functions/Workers using Vercel/Cloudflare?
Vercel does not allow calling Functions in the background.
Cloudflare provides a Queues mechanism; however, it is not designed for such use cases.
What about QStash or Inngest?
QStash and Inngest are headless queueing solutions that might appear, at first, better suited for Serverless applications.
However, such solutions come with some limitations:
What about AWS Lambda?
Yes, AWS Lambdas can run for up to 15 minutes.
However: