Generating Feeds with Next.js Route Handlers

Since I started collecting notes and highlights here, I’ve been meaning to publish them as formatted feeds, RSS being the main one. Well, I finally got around to it. It was far easier than I remembered, and I even got bonus Atom and JSON feeds out of it.

I’m using Next 13.2 and its new App Directory to generate the site, and this made feeds delightfully simple to implement. In fact, it may be the best experience I’ve ever had developing content feeds like these. I want to share my walkthrough and results, since this is a pretty common task when setting up a new project with Next, and all the existing examples were based on Next’s older Pages Router.

How to Generate RSS, Atom, and JSON Feeds with Markdown content using Next.js App Directory Route Handlers

I started out already having data-fetching functions for my notes in my CMS (the aptly named getAllNotes and getNoteTitle).

The new feed-generating function simply has to set the top-level properties, then iterate over the notes to add them as entries. I author and store all my notes as Markdown, so for each note I render its body into HTML. Each feed format then gets its own Route Handler, which calls the generator function and serializes the feed into that format. Finally, I update the top-level metadata to include links to the newly added feeds.
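
For reference, here’s the rough shape of a note that the rest of the walkthrough assumes. This interface is my own reconstruction from the fields the feed code reads below (_id, _createdAt, body, and an optional source.url), not the actual CMS schema:

```typescript
// Hypothetical Note shape, inferred from the fields used by the feed code.
interface Note {
  _id: string; // unique ID from the CMS
  _createdAt: string; // ISO 8601 timestamp
  body: string; // the note's Markdown source
  source?: { url?: string }; // optional link to the thing the note is about
}

const sample: Note = {
  _id: "abc123",
  _createdAt: "2023-03-01T12:00:00.000Z",
  body: "A note with **bold** Markdown.",
};
```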

Create a Site URL

I quickly realized I needed a little utility function to get the canonical site URL. Since I build and host on Vercel, I want the site URL to match each preview deploy’s URL. To figure that out, I combine a dedicated SITE_URL variable with Vercel’s system environment variables, which identify the build’s context and its deploy URL.

export default function getSiteUrl() {
  let protocol = "https";
  let domain = process.env.SITE_URL;
  switch (process.env.VERCEL_ENV) {
    case "preview":
      domain = process.env.VERCEL_URL;
      break;
    case "development":
    case undefined:
      protocol = "http";
      break;
  }
  return `${protocol}://${domain}`;
}
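
To sanity-check that switch, it helps to factor the logic into a pure function that takes the environment as an argument instead of reading process.env. This is a testing convenience of my own, not code from the site:

```typescript
// Pure variant of the getSiteUrl logic (hypothetical helper for testing).
type Env = { SITE_URL?: string; VERCEL_ENV?: string; VERCEL_URL?: string };

function siteUrlFor(env: Env): string {
  let protocol = "https";
  let domain = env.SITE_URL;
  switch (env.VERCEL_ENV) {
    case "preview":
      // Preview builds serve from their generated Vercel URL over HTTPS.
      domain = env.VERCEL_URL;
      break;
    case "development":
    case undefined:
      // Local development runs over plain HTTP.
      protocol = "http";
      break;
  }
  return `${protocol}://${domain}`;
}
```

Production falls through the switch untouched, so it keeps https and the SITE_URL domain.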

Render Markdown to HTML

To render Markdown into HTML, I used the unified library with these plugins:

  1. remark-parse to parse the Markdown string into an AST
  2. remark-rehype to transform the Markdown AST into an HTML AST
  3. rehype-sanitize to ensure the HTML is safe to render
  4. rehype-stringify to turn the HTML AST back into a string

This string was then passed as the content value for each feed item.

import { unified } from "unified";
import remarkParse from "remark-parse";
import remarkRehype from "remark-rehype";
import rehypeSanitize from "rehype-sanitize";
import rehypeStringify from "rehype-stringify";

export default async function markdownToHtml(input: string) {
  const file = await unified()
    .use(remarkParse)
    .use(remarkRehype)
    .use(rehypeSanitize)
    .use(rehypeStringify)
    .process(input);

  return file;
}

Create the Feed

With other site generation frameworks I’ve used, generating feeds has meant writing a template XML file and filling in dynamic values with curly-braced variables, usually with that format’s spec open alongside. This time, I was able to use the feed package for all the XML authoring. As a result, generating multiple feed formats became a matter of making a function call.

The generateFeed function is based on an example provided by Ashlee M Boyer. It creates a feed with the proper metadata, then adds each note as an entry. Since the Markdown rendering runs asynchronously, adding entries happens inside a Promise.all call. This way, generateFeed doesn’t return the feed object until all content has finished rendering.

import { Feed } from "feed";
import smartquotes from "smartquotes";
import getAllNotes from "src/data/getAllNotes";
import getNoteTitle from "src/data/getNoteTitle";
import markdownToHtml from "./markdownToHtml";
import getSiteUrl from "./getSiteUrl";

export default async function generateFeed() {
  const notes = await getAllNotes();
  const siteURL = getSiteUrl();
  const date = new Date();
  const author = {
    name: "Allan Lasser",
    email: "",
    link: "",
  };
  const feed = new Feed({
    title: "Allan Lasser",
    description: "Thoughts, reading notes, and highlights",
    id: siteURL,
    link: siteURL,
    image: `${siteURL}/logo.svg`,
    favicon: `${siteURL}/favicon.png`,
    copyright: `All rights reserved ${date.getFullYear()}, Allan Lasser`,
    updated: date,
    generator: "Feed for Node.js",
    feedLinks: {
      rss2: `${siteURL}/feeds/rss.xml`,
      json: `${siteURL}/feeds/feed.json`,
      atom: `${siteURL}/feeds/atom.xml`,
    },
  });
  await Promise.all(
    notes.map(
      async (note) =>
        new Promise<void>(async (resolve) => {
          const id = `${siteURL}/notes/${note._id}`;
          const url = note.source?.url ? note.source.url : id;
          const content = String(await markdownToHtml(smartquotes(note.body)));
          feed.addItem({
            title: smartquotes(getNoteTitle(note)),
            id,
            link: url,
            date: new Date(note._createdAt),
            content,
            author: [author],
          });
          resolve();
        })
    )
  );
  return feed;
}
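
The pattern generalizes: map each item to an async transform and await the whole batch. Here’s a stripped-down sketch of the same idea, with a stand-in for markdownToHtml:

```typescript
// Minimal sketch of the Promise.all pattern generateFeed uses: each entry
// is rendered asynchronously, and we wait for all of them before the feed
// is considered complete. `fakeRender` stands in for markdownToHtml.
async function fakeRender(markdown: string): Promise<string> {
  return `<p>${markdown}</p>`;
}

async function renderAll(bodies: string[]): Promise<string[]> {
  // Promise.all preserves input order, even if renders finish out of order.
  return Promise.all(bodies.map((body) => fakeRender(body)));
}
```

Because an async arrow already returns a promise, the explicit new Promise wrapper in generateFeed could also be dropped in favor of plain async callbacks like these.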

Create the Feed Endpoints

Now here comes the fun part. Creating feed endpoints becomes so simple it’s silly. Using Route Handlers, introduced in Next.js 13.2, adding a new endpoint just means creating a folder in the App Directory named after the feed file, then creating a route.ts file inside it.

So, to add the RSS feed, I create the folder src/app/feeds/rss.xml and then create route.ts inside it.

import generateFeed from "src/utils/generateFeed";

export async function GET() {
  const feed = await generateFeed();
  return new Response(feed.rss2(), {
    headers: { "Content-Type": "application/rss+xml" },
  });
}

To create the Atom and JSON feeds, I follow the same process, making sure each format’s route handler uses the matching serializer method and content type. In src/app/feeds/atom.xml/route.ts:

import generateFeed from "src/utils/generateFeed";

export async function GET() {
  const feed = await generateFeed();
  return new Response(feed.atom1(), {
    headers: { "Content-Type": "application/atom+xml" },
  });
}

And in src/app/feeds/feed.json/route.ts:

import generateFeed from "src/utils/generateFeed";

export async function GET() {
  const feed = await generateFeed();
  return new Response(feed.json1(), {
    headers: { "Content-Type": "application/json" },
  });
}
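
Altogether, the three route handlers live in a directory layout like this (a sketch; the folder names come from the feed URLs):

```
src/app/feeds/
├── rss.xml/
│   └── route.ts
├── atom.xml/
│   └── route.ts
└── feed.json/
    └── route.ts
```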

Add Alternates to Site Metadata

The last step is updating the site’s <head> to reference these feeds, making them more discoverable to readers. This is made even easier by the App Directory’s Metadata API, also new in Next.js 13.2. In the top-most page or layout file in my app directory, I add an alternates property to the exported metadata object:

import { Metadata } from "next";
import getSiteUrl from "src/utils/getSiteUrl";

export const metadata: Metadata = {
  title: "Allan Lasser",
  viewport: { width: "device-width", initialScale: 1 },
  icons: [{ type: "image/x-icon", url: "/static/favicon.ico" }],
  alternates: {
    canonical: getSiteUrl(),
    types: {
      "application/rss+xml": `${getSiteUrl()}/feeds/rss.xml`,
      "application/atom+xml": `${getSiteUrl()}/feeds/atom.xml`,
      "application/json": `${getSiteUrl()}/feeds/feed.json`,
    },
  },
};
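
Next.js renders those alternates as <link> elements in the document’s <head>, roughly like this (domain shown as a placeholder):

```
<link rel="canonical" href="https://example.com" />
<link rel="alternate" type="application/rss+xml" href="https://example.com/feeds/rss.xml" />
<link rel="alternate" type="application/atom+xml" href="https://example.com/feeds/atom.xml" />
<link rel="alternate" type="application/json" href="https://example.com/feeds/feed.json" />
```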

That’s it!

Now, after running next dev, I can see feed files served at /feeds/rss.xml, /feeds/atom.xml, and /feeds/feed.json. That’s feeds in three different formats from only a couple of libraries and a few simple, easily testable functions.

After deploying to production, you can now follow my new notes via any of those three feeds.

The flourishing, decentralized Web

The level of productivity I feel when using Next.js, Vercel, and GitHub together is really hard to beat. It feels like the tools get out of my way and let me develop smaller PRs faster.

I’m still a daily RSS user. It’s my preferred way to read on the web. I’m glad to see that there’s still robust library support for RSS and feed generation, at least within the Node ecosystem. I don’t think RSS is going anywhere, especially since it powers the entire podcasting ecosystem. It’s great to see the longevity of these open standards.

Speaking of open standards, integrating an ActivityPub server into a Next.js application is something I’m interested in exploring next. It’d be very cool to have a site generated from an aggregation of one’s own ActivityPub feeds, for example combining posts from personal Mastodon and Pixelfed accounts into a single syndicated feed.

Seeing all of the recent progress in decentralizing important services has been exciting. We can still keep the Web wild and weird, empower individuals with more tools for expressing themselves online, and have it all be user-friendly. Content feeds are an important force for good here, so I’m glad that it’s now this easy for even a novice developer to publish them.