When you have slow, non-essential data loading on your page, streaming that data can be a wonderful way to keep the page fast and interactive.
We build and analyze a streaming example in all major frameworks where streaming is available.
When a webpage contains data that takes longer to load than the essential data, it can slow down the whole page. Streaming allows us to load the important data first and the non-essential data later.
Streaming has proven to help with SEO, including a faster First Contentful Paint, better Largest Contentful Paint and Time to First Byte.
Streaming loads the slow data last, and JavaScript then inserts it into the correct place on the page once it arrives. That’s the simplest explanation.
Imagine you are viewing a blog post with many comments. You need to be able to view the blog post data first, so that items like the title and description can be immediately shown in the <head> and header of your article.
If you have several comments, it would be better to separate your database query so that you can get the comments later. The comments are less important and can be displayed after the essential data has loaded.
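To make that concrete, here is a minimal hand-rolled sketch (plain Node and TypeScript, not any of the frameworks below) of what a streamed response can look like: the essential HTML and a placeholder are flushed immediately, and a small script that fills in the slow data is appended once it resolves. The getComments helper and the port are assumptions for illustration only.

// streaming-sketch.ts — a hand-rolled illustration, not how the frameworks below do it internally
import { createServer } from "node:http";

// Hypothetical slow query, standing in for "the comments"
const getComments = () =>
  new Promise<string[]>((resolve) =>
    setTimeout(() => resolve(["First!", "Great post"]), 2000)
  );

createServer(async (_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });

  // 1. Flush the essential HTML right away, with a placeholder for the slow part
  res.write(`<html><body><h1>My blog post</h1><div id="comments">Loading…</div>`);

  // 2. Once the slow data resolves, stream a script that swaps it into place
  const comments = await getComments();
  const html = comments.map((c) => `<p>${c}</p>`).join("");
  res.write(`<script>document.getElementById("comments").innerHTML = ${JSON.stringify(html)};</script>`);

  res.end(`</body></html>`);
}).listen(3000);

The frameworks below do essentially this for you, with proper escaping, out-of-order flushing and error handling built in.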
In our applications, we will be fetching a random todo item (its title is Latin placeholder text) from an external example API.
type Todo = {
  title: string
};

export const getTodo = async () => {
  const randomTodo = Math.floor(Math.random() * 200) + 1;
  return await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`)
    .then(r => r.json()) as Todo;
};
This API has 200 different todo items, so we will select one at random each time the page is loaded. If you refresh the page, you should get a new item.
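For reference, the helper can be called from any async context; the log line below is only illustrative.

// Hypothetical standalone usage of getTodo (e.g., in a quick script)
const todo = await getTodo();
console.log(todo.title); // a random Latin placeholder title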
While I personally hate React and JSX, I give a lot of props to the Next.js team at Vercel. Next.js has been controversial lately, but streaming is definitely a strong suit.
// todo.tsx
'use server';

type Todo = {
  title: string
};

export const getTodo = async () => {
  const randomTodo = Math.floor(Math.random() * 200) + 1;
  return await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`, {
    cache: 'no-cache'
  })
    .then(r => r.json()) as Todo;
};

export default async function Todo() {
  const todo = await getTodo();
  return (
    <h2>{todo.title}</h2>
  );
}
We are using a server component with use server. Notice we must add the no-cache option to prevent Next.js from caching our page.
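If you would rather opt the whole route out of caching instead of configuring each fetch, the App Router's route segment config should work as well; a minimal sketch:

// page.tsx — force this route segment to be rendered dynamically on every request
export const dynamic = 'force-dynamic';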
I am also using a loading spinner. Keep in mind, this is the beauty of streaming: the page is rendered and sent from the server first, and the slow data is then streamed into the browser.
export default function Loading() {
  return (
    <div role="status">
      <svg
        aria-hidden="true"
        className="w-8 h-8 text-gray-200 animate-spin dark:text-gray-600 fill-blue-600"
        viewBox="0 0 100 101"
        fill="none"
        xmlns="http://www.w3.org/2000/svg"
      >
        <path
          d="M100 50.5908C100 78.2051 77.6142 100.591 50 100.591C22.3858 100.591 0 78.2051 0 50.5908C0 22.9766 22.3858 0.59082 50 0.59082C77.6142 0.59082 100 22.9766 100 50.5908ZM9.08144 50.5908C9.08144 73.1895 27.4013 91.5094 50 91.5094C72.5987 91.5094 90.9186 73.1895 90.9186 50.5908C90.9186 27.9921 72.5987 9.67226 50 9.67226C27.4013 9.67226 9.08144 27.9921 9.08144 50.5908Z"
          fill="currentColor"
        />
        <path
          d="M93.9676 39.0409C96.393 38.4038 97.8624 35.9116 97.0079 33.5539C95.2932 28.8227 92.871 24.3692 89.8167 20.348C85.8452 15.1192 80.8826 10.7238 75.2124 7.41289C69.5422 4.10194 63.2754 1.94025 56.7698 1.05124C51.7666 0.367541 46.6976 0.446843 41.7345 1.27873C39.2613 1.69328 37.813 4.19778 38.4501 6.62326C39.0873 9.04874 41.5694 10.4717 44.0505 10.1071C47.8511 9.54855 51.7191 9.52689 55.5402 10.0491C60.8642 10.7766 65.9928 12.5457 70.6331 15.2552C75.2735 17.9648 79.3347 21.5619 82.5849 25.841C84.9175 28.9121 86.7997 32.2913 88.1811 35.8758C89.083 38.2158 91.5421 39.6781 93.9676 39.0409Z"
          fill="currentFill"
        />
      </svg>
      <span className="sr-only">Loading...</span>
    </div>
  );
}
Note: This example spinner was taken from Flowbite and requires Tailwind.
You could try the Loader component available with KendoReact Free.
React already has an element for loading slow data. It is called <Suspense />.
import { Suspense } from "react";
import Todo from "./todo";
import Loading from "./loading";

export default function Home() {
  return (
    <main className="flex flex-col justify-center items-center mt-5 gap-3">
      <h1 className="text-2xl">Todo</h1>
      <Suspense fallback={<Loading />}>
        <Todo />
      </Suspense>
    </main>
  );
}
By wrapping our component in Suspense and using our loader as the fallback, everything works perfectly.
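If the streamed fetch can fail, Next.js also lets you add an error.tsx boundary alongside loading.tsx; a minimal sketch:

// error.tsx — error boundaries must be client components in Next.js
'use client';

export default function Error({ error }: { error: Error }) {
  return <p>{error.message}</p>;
}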
Demo: Vercel
Repo: GitHub
Docs: Loading UI and Streaming
SvelteKit does not have server components; instead, it uses page load functions.
import type { PageServerLoad } from "./$types";

type Todo = {
  title: string
};

const getTodo = async () => {
  const randomTodo = Math.floor(Math.random() * 200) + 1;
  return await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`)
    .then(r => r.json()) as Todo;
};

export const load: PageServerLoad = () => {
  return {
    todos: getTodo()
  };
};
When we return the promise directly, without awaiting it, the data will be streamed to the browser as it resolves.
export const load: PageServerLoad = () => {
  return {
    post: getPost()
  };
};
That is, unless we have essential data that needs to be loaded first, in which case we await it:
export const load: PageServerLoad = async () => {
  const post = await getPost();
  return {
    post
  };
};

// OR

export const load: PageServerLoad = async () => {
  return {
    post: await getPost()
  };
};
Note: To handle errors properly, make sure a promise cannot be rejected before it is returned (or attach a rejection handler to it).
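One way to satisfy that note, using the same load function as above, is to attach a no-op catch to the promise before returning it, so an early rejection never goes unhandled; a sketch:

export const load: PageServerLoad = () => {
  const todos = getTodo();

  // Prevent an unhandled rejection if the promise fails before the page renders;
  // the {#await} block in the component still receives and displays the error.
  todos.catch(() => {});

  return { todos };
};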
We can consume the page data with {#await} inside our component, displaying the loader while it is pending and catching the error if necessary. Beautiful!
<script lang="ts">
  import type { PageProps } from "./$types";
  import Loading from "./loading.svelte";

  let { data }: PageProps = $props();
</script>

<main class="flex flex-col justify-center items-center mt-5 gap-3">
  <h1 class="text-2xl">Todo</h1>
  {#await data.todos}
    <Loading />
  {:then todo}
    <h2>{todo.title}</h2>
  {:catch error}
    <p>{error.message}</p>
  {/await}
</main>
Svelte is my personal favorite framework by far, so I’m glad this feature is available!
Demo: Vercel
Repo: GitHub
Docs: Streaming with Promises
Qwik follows the same pattern, but the feature is not fully ready yet.
import { routeLoader$ } from '@builder.io/qwik-city';

type Todo = {
  title: string
};

export const useTodo = routeLoader$(() => {
  return async () => {
    const randomTodo = Math.floor(Math.random() * 200) + 1;
    return await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`)
      .then(r => r.json()) as Todo;
  };
});
Similar to <Suspense />, Qwik has a <Resource /> component which loads data as expected.
import { component$, Resource } from '@builder.io/qwik';
import Loading from './loading'; // local spinner component (path assumed)

export default component$(() => {
  const todo = useTodo();
  return (
    <main class="flex flex-col justify-center items-center mt-5 gap-3">
      <h1 class="text-2xl">Todo</h1>
      <Resource
        value={todo}
        onPending={() => <Loading />}
        onResolved={(todo) => <h2>{todo.title}</h2>}
      />
    </main>
  );
});
Note: Currently, onPending does not work correctly to display a loading state, and there is a GitHub Issue for this. Qwik technically streams the response, but still waits for it to finish. V2 will fix this.
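For comparison, the non-deferred form below (awaiting inside the loader rather than returning a function) simply blocks the page until the data is ready, with no pending state; a sketch reusing the same fetch logic:

// Non-streaming variant: routeLoader$ resolves on the server before the page is sent
export const useTodo = routeLoader$(async () => {
  const randomTodo = Math.floor(Math.random() * 200) + 1;
  return await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`)
    .then(r => r.json()) as Todo;
});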
SolidJS’s server framework is built for streaming, literally.
The query must be loaded with use server to function correctly.
import { query, type RouteDefinition } from '@solidjs/router';

type Todo = {
  title: string
};

const getTodo = query(async () => {
  'use server';
  const randomTodo = Math.floor(Math.random() * 200) + 1;
  const todo = await fetch(`https://jsonplaceholder.typicode.com/todos/${randomTodo}`);
  return await todo.json() as Todo;
}, 'todo');

export const route = {
  preload: () => getTodo()
} satisfies RouteDefinition;
And the function must be added to preload.
import { createAsync } from '@solidjs/router';
import { ErrorBoundary, Show, Suspense } from 'solid-js';
import Loading from './loading'; // local spinner component (path assumed)

export default function Home() {
  const todo = createAsync(() => getTodo());
  return (
    <main class="flex flex-col justify-center items-center mt-5 gap-3">
      <h1 class="text-2xl">Todo</h1>
      <ErrorBoundary fallback={<div>Something went wrong!</div>}>
        <Suspense fallback={<Loading />}>
          <Show when={todo()}>
            {(data) => (
              <h2>{data().title}</h2>
            )}
          </Show>
        </Suspense>
      </ErrorBoundary>
    </main>
  );
}
We must get the data with createAsync first. We can put it inside an ErrorBoundary to catch errors. Suspense determines when we are loading, and Show will display the component when it is available.
I find SolidJS to have too much boilerplate for my personal tastes, but it has worked with Streaming from day one.
Demo: Vercel
Repo: GitHub
Docs: Data loading always on the server
Nuxt currently has an open issue and plans to add the feature soon!
Angular will probably never implement it, even though it is a semi-server framework with @angular/ssr. There was an old feature request that was denied. Analog has a newer feature request, but no confirmation yet.
Streaming is an amazing tool to add to your repertoire. When you need to speed up your page, it will definitely help you outshine the competition with better page load speeds and SEO.
Jonathan Gamble has been an avid web programmer for more than 20 years. He has been building web applications as a hobby since he was 16 years old, and he received a post-bachelor’s in Computer Science from Oregon State. His real passions are language learning and playing rock piano, but he never gets away from coding. Read more from him at https://code.build/.