What is waitUntil (Vercel, Cloudflare) and when should I use it?

Recently, Vercel announced that its waitUntil utility is now available for all serverless functions, covering both Node.js and Edge functions. The utility is not unique to Vercel -- Cloudflare Workers offers an equivalent.
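
On Vercel, waitUntil is imported from the @vercel/functions package (at the time of writing) and called directly, as in the examples later in this post. On Cloudflare Workers, the equivalent is a method on the handler's execution context. Here is a minimal sketch of the Cloudflare form, where logAnalytics is a hypothetical helper:

export default {
  async fetch(request, env, ctx) {
    // ctx.waitUntil keeps the Worker alive until this promise settles,
    // without delaying the response below
    ctx.waitUntil(logAnalytics(request))
    return new Response("ok")
  }
}

The rest of this post uses the Vercel-style call, but everything about when to use it applies to both platforms.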

This is a very useful tool, so let's dive into what waitUntil is and what it does, when to use it, and when it's better not to.

What does waitUntil do?

To understand what waitUntil does and why it exists, let's talk about how serverless functions work.

Serverless functions are designed to be stateless and short-lived. Unlike traditional servers, serverless functions are not kept alive between requests. Instead, they are spun up on-demand to handle incoming requests and then shut down once the response is sent.

With JavaScript serverless functions, you have to await all asynchronous operations to ensure that the function doesn't return before they are complete. If you don't await an asynchronous operation, the serverless function might be shut down before the operation finishes; the operation might then be canceled or fail, which results in unpredictable behavior. This is why it's important to await all asynchronous operations in serverless functions.

// ❌ BAD: If sendMetrics is an async function (a Promise), there is no guarantee it will succeed
async function POST(request) {
  const data = await createPost(request.body)
  sendMetrics({ metric: 'posts.created', increment: 1 })
  return new Response(data)
}
// ✅ GOOD: Awaiting will ensure it completes before the function returns
async function POST(request) {
  const data = await createPost(request.body)
  await sendMetrics({ metric: 'posts.created', increment: 1 })
  return new Response(data)
}

The downside of using await is that it can slow down the response time of your serverless functions. Ideally, you want to keep the critical path of the request minimal so that you return the fastest possible response to your users. This is where waitUntil comes in.

waitUntil allows you to ensure that an async function (a Promise) completes before the serverless function is shut down. As the example code above doesn't require any data returned from sendMetrics, we can use waitUntil to ensure that the async function completes in the background without blocking the response from returning to the user.

// ✅ GOOD: The sendMetrics function will execute in the background,
// but not block the response from returning to the user
async function POST(request) {
  const data = await createPost(request.body)
  waitUntil(sendMetrics({ metric: 'posts.created', increment: 1 }))
  return new Response(data)
}

So, is waitUntil a solution to all possible problems? Well, it is a very useful tool and it ensures completion. However, it does not ensure success. For example, if the promise passed to waitUntil fails, the serverless function will still complete and return a response. So when should we use it?

When to use waitUntil

waitUntil is useful when you need to perform asynchronous work that is not critical to the response of the serverless function. This is useful for things like sending metrics, logging, updating caches, or other work that should not block the user's response. The user should not pay a latency penalty for these operations, so they can run in the background.
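
For example, a handler can serve the currently cached data and refresh the cache in the background; a minimal sketch, where getCachedPosts and refreshPostsCache are hypothetical helpers:

async function GET(request) {
  // Serve whatever is currently cached without making the user wait
  const posts = await getCachedPosts()
  // Refresh the cache in the background for the next request
  waitUntil(refreshPostsCache())
  return new Response(JSON.stringify(posts))
}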

When not to use waitUntil

As mentioned above, waitUntil does not ensure success. If the promise passed to waitUntil fails, the serverless function will still complete and return a response. This is important to keep in mind when deciding when to use waitUntil.

In other words, waitUntil is not the right tool when the asynchronous operation is part of your application's critical business logic. It shines for logging and metrics precisely because a failure there is not the end of the world. But if the operation is critical to the response of the serverless function, await is the way to go.
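
Even for non-critical work, you may still want failures to show up somewhere. One option is to attach a catch handler before passing the promise to waitUntil; a minimal sketch, reusing the sendMetrics example from above:

async function POST(request) {
  const data = await createPost(request.body)
  waitUntil(
    sendMetrics({ metric: 'posts.created', increment: 1 })
      // The response is returned regardless; this handler is the only
      // place the failure becomes visible (e.g. in your logs)
      .catch((error) => console.error('sendMetrics failed', error))
  )
  return new Response(data)
}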

Can I run background jobs with waitUntil?

Short answer: No.

Long answer: waitUntil is not designed for running background jobs. There are no retries or tools for handling failures. Also, waitUntil only keeps the function alive up to its configured timeout, so anything that runs longer than the timeout will be canceled. For running background jobs, you should use a system that handles queueing, retries, and failure logging.
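
To make the timeout point concrete, here is a sketch of the kind of call that can silently get cut off, where generateMonthlyReport is a hypothetical long-running helper:

async function POST(request) {
  const data = await createPost(request.body)
  // ❌ RISKY: if generateMonthlyReport runs longer than the function's
  // configured timeout, it is terminated partway through and never retried
  waitUntil(generateMonthlyReport(data))
  return new Response(data)
}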

Inngest is designed to handle background jobs reliably. It automatically handles retries, logs, and failures. If you need to run background jobs, you should consider using Inngest instead of waitUntil.

Should I use waitUntil with Inngest?

Inngest functions are truly asynchronous background jobs. To trigger them, you use inngest.send(), which sends the data to the Inngest Event API; Inngest then triggers the function and handles retries on its end. Because inngest.send() returns a Promise, it is possible to use it with waitUntil. Should you, though?

import { inngest } from "src/inngest";

async function POST(request) {
  const data = await createPost(request.body)
  waitUntil(inngest.send({ name: "post.created", data: data }))
  return new Response(data)
}

Typically, Inngest functions contain core business logic, so you need to ensure that the functions are triggered successfully. The key question is the reliability of waitUntil and the function it calls. Let's break down some things to consider:

  • The Inngest Event API has very high uptime and is designed to be reliable.
  • The inngest.send() function is very simple - mostly just a POST request to the Event API.
  • The waitUntil utility from Vercel and Cloudflare is designed to be reliable.
  • Networking issues or blips can always cause issues with outbound requests from any platform.
  • waitUntil will only last as long as the timeout for the function.

Given these points, it is safe to use waitUntil with inngest.send(), with the caveat that if there is a networking issue, the event may not reach Inngest. While this is a rare occurrence, it is a risk to be aware of and a trade-off you need to weigh for yourself.

If you don't want to take this risk, you can use inngest.send() without waitUntil, but it will be blocking:

import { inngest } from "src/inngest";

async function POST(request) {
  const data = await createPost(request.body)
  await inngest.send({ name: "post.created", data: data })
  return new Response(data)
}

If you want the best of both worlds, you can add a recovery strategy for when the event doesn't reach Inngest. This could be as simple as logging the failure and re-sending the event later. Below is an incomplete example of how you might handle a failure; the logic for reading that log and re-sending the event later, using whatever logging backend you have, is left out.

import { inngest } from "src/inngest";

async function POST(request) {
  const data = await createPost(request.body)
  waitUntil(
    inngest.send({ name: "post.created", data: data })
      .catch((error) => {
        // Log the event somewhere where you could read and re-send it later
      })
  )
  return new Response(data)
}

Conclusion

waitUntil is a highly useful tool to have in your serverless toolkit. It helps keep asynchronous work out of the critical path of a request so the response reaches the user as fast as possible. It is very important to understand when to use it and when not to, and to understand the risks of using it with other services.

Ideally, in the future, these services might offer some sort of out-of-the-box logging and retry mechanism for waitUntil. Until then, you should be aware of the risks and decide if it's right for your use case.
