How to use React’s concurrent mode

React’s new concurrent mode allows your interface to be rendered while data fetching is in progress, providing an improved render lifecycle and a simple way to achieve parallel data fetching for multiple components.


This article introduces you to the idea behind React’s concurrent mode as well as some of its usage and benefits. React’s concurrent mode is an innovative set of features designed to improve the handling of asynchronous rendering. These improvements make for a better end-user experience.

One of the perennial issues that has dogged web clients since time immemorial is dealing with rendering of asynchronous updates. The React team continues its tradition of introducing ambitious solutions into the framework by adding concurrent mode support to the React 16.x release line.

There are a variety of cases where naively rendering changing state leads to less-than-desirable behavior, including tedious loading screens, choppy input handling, and unnecessary spinners, to name a few.

Addressing such issues piecemeal is error-prone and inconsistent. React’s concurrent mode represents a wholesale, baked-into-the-framework solution. The core idea: React now draws updates concurrently in memory, supports interruptible rendering, and offers ways for application code to interact with that support.

Enabling concurrent mode in React

The API for harnessing these capabilities is still in flux, and you have to install it explicitly, like so:

npm install react@experimental react-dom@experimental

Concurrent mode is a global change to the way React works, and it requires that the root-level node be passed through the concurrent engine. This is done by calling createRoot on the app root instead of ReactDOM.render(), as seen in Listing 1.

Listing 1. Using the concurrent renderer

ReactDOM.createRoot(
  document.getElementById('root')
).render(<App />);

Note that createRoot is available only if you’ve installed the experimental package. And because it is a fundamental change, existing codebases and libraries are likely not compatible with it. In particular, the lifecycle methods now prefixed with UNSAFE_ are not compatible.

Because of this, React introduces a middle step between the old-school render engine in use today and concurrent mode. This step is called “blocking mode,” and it is more backward compatible but offers fewer concurrent features.

In the long term, concurrent mode will become the default. In the interim, React will support the following three modes:

  1. Legacy mode: ReactDOM.render(<App />, rootNode). The existing legacy mode.
  2. Blocking mode: ReactDOM.createBlockingRoot(rootNode).render(<App />). Fewer breaking changes, fewer features.
  3. Concurrent mode: ReactDOM.createRoot(rootNode).render(<App />). Full concurrent mode with many breaking changes.

A new rendering model in React

Concurrent mode fundamentally alters the way React renders the interface to allow the interface to be rendered while fetching of data is in progress. This means that React must know something about your components. Specifically, React must now know about the data-fetching status of your components.

React’s new Suspense component

The most prominent feature is the new Suspense component. You use this component to inform React that a given area of the UI depends on asynchronous data loading, and to tell React the status of that loading.

This capability acts at the framework level, which means that your data-fetching library must alert React to its status by implementing the Suspense API. Currently, Relay does this for GraphQL, and the react-suspense-fetch project is tackling REST data fetching.

To reiterate, you are now required to use a more intelligent data fetching library that is capable of telling React what its status is, thereby allowing React to optimize the way your UI renders.
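To make that contract concrete, here is a minimal sketch of how a Suspense-compatible resource can be built by hand. The wrapPromise name is a hypothetical helper (a pattern seen in React’s own demos, not a published API): while the data is pending, read() throws the promise itself, which is the signal Suspense uses to show the fallback.

```javascript
// Minimal sketch of a Suspense-compatible "resource".
// wrapPromise is a hypothetical helper, not a React API:
// read() throws the pending promise so Suspense can catch it
// and render the fallback until the data resolves.
function wrapPromise(promise) {
  let status = "pending";
  let result;
  const suspender = promise.then(
    (value) => { status = "success"; result = value; },
    (error) => { status = "error"; result = error; }
  );
  return {
    read() {
      if (status === "pending") throw suspender; // Suspense catches this
      if (status === "error") throw result;      // error boundaries catch this
      return result;                             // data is ready
    },
  };
}
```

A component can then call resource.read() and simply return markup, with no loading logic of its own; the throw-while-pending behavior is what lets React suspend that part of the tree.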

Consider the profile page example from the React documentation. Listing 2 has the important view template details.

Listing 2. Using Suspense in the view

<Suspense fallback={<h1>Loading profile...</h1>}>
  <ProfileDetails />
  <Suspense fallback={<h1>Loading posts...</h1>}>
    <ProfileTimeline />
  </Suspense>
</Suspense>

Notice that Suspense allows for the definition of alternate loading content. This is analogous to how, in the old rendering engine, you might return different markup from a component based on its loading status in order to render a placeholder until the data is ready.

Inside the components used by this view template, no special code is required to deal with the loading state. This is all now handled behind the scenes by the framework and data fetching library.

For example, the ProfileDetails component can innocently load its data and return its markup as in Listing 3. Again, this is dependent on the data store (in Listing 3 the resource object) implementing the Suspense API.

Listing 3. ProfileDetails

function ProfileDetails() {
  const user = resource.user.read();
  return <h1>{user.name}</h1>;
}

Concurrency in data fetching

An important benefit of this setup that bears repeating is that all data fetching will occur concurrently. So your UI benefits from both an improved render lifecycle and a simple and automatic way to achieve parallel data fetching for multiple components.
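This render-as-you-fetch pattern can be sketched in plain JavaScript. The fetchProfileData, fetchUser, and fetchPosts names below are illustrative stand-ins, not a library API: the key point is that both requests start immediately and in parallel, before any component reads them.

```javascript
// Illustrative stand-ins for network calls; in a real app these
// would be fetch() requests to your API.
const fetchUser = (id) => Promise.resolve({ id, name: "Ada" });
const fetchPosts = (id) => Promise.resolve([{ id: 1, text: "Hello" }]);

// Minimal stand-in for a Suspense-compatible wrapper: read() throws
// the pending promise, and returns the value once it resolves.
function wrapPromise(promise) {
  let status = "pending", result;
  const suspender = promise.then(
    (v) => { status = "success"; result = v; },
    (e) => { status = "error"; result = e; }
  );
  return {
    read() {
      if (status === "pending") throw suspender;
      if (status === "error") throw result;
      return result;
    },
  };
}

// Kick off both requests at once; rendering proceeds while they load,
// and each component reads only the resource it needs.
function fetchProfileData(userId) {
  return {
    user: wrapPromise(fetchUser(userId)),
    posts: wrapPromise(fetchPosts(userId)),
  };
}
```

Because fetchProfileData starts both promises before returning, neither request waits on the other, and neither waits for a component to mount.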

React’s useTransition hook

The next major tool in your new concurrent React kit is the useTransition hook. This is a more fine-grained tool that allows you to tune how UI transitions occur. Listing 4 has an example of wrapping a transition with useTransition.

Listing 4. useTransition

function App() {
  const [resource, setResource] = useState(initialResource);
  const [startTransition, isPending] = useTransition({ timeoutMs: 3000 });
  return (
    <>
      <button
        onClick={() => {
          startTransition(() => {
            const nextUserId = getNextId(resource.userId);
            setResource(fetchProfileData(nextUserId));
          });
        }}
      >
        Next
      </button>
      {isPending ? " Loading..." : null}
      <ProfilePage resource={resource} />
    </>
  );
}

What the code in Listing 4 says is, “Delay displaying the new state for up to three seconds.” This code works because the ProfilePage, when used, is wrapped by a Suspense component. React begins fetching the data and, instead of displaying the placeholder, displays the existing content for as long as the defined timeoutMs. As soon as fetching is complete, React displays the updated content. This is a simple mechanism for improving the perceived performance of transitions.

The startTransition function exposed by useTransition allows you to wrap the state update that triggers fetching, while isPending is a boolean flag you can use to conditionally display loading feedback.

All of this magic is possible because React’s concurrent mode implements a kind of background rendering mechanism: React renders your updated UI state in memory, in the background, while fetching is happening.

React’s useDeferredValue hook

Our last example addresses the problem of choppy typing when each keystroke triggers data loading. This is a fairly canonical problem that is often solved with debounce/throttle on the input. Concurrent mode opens up a more consistent and smoother solution: the useDeferredValue hook.
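For contrast, here is a minimal sketch of the traditional debounce workaround (a generic implementation, not a React API). It keeps typing smooth, but it delays every update by a fixed interval, even when the data is already available, which is the trade-off useDeferredValue avoids.

```javascript
// Classic debounce: fn runs only after `wait` ms pass with no
// new calls. Typing stays responsive, but every list update is
// delayed by at least `wait` ms.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Usage sketch: only the last keystroke in a rapid burst
// triggers the expensive list update.
let updates = 0;
const updateList = debounce(() => { updates += 1; }, 300);
updateList();
updateList();
updateList();
// Immediately after the burst, nothing has run yet.
```

With useDeferredValue, by contrast, the list can update as soon as new data is ready rather than after an arbitrary fixed delay.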

Listing 5 shows an example. The genius of this solution is that it gets you the best of both worlds: the input remains responsive, and the list updates as soon as the data is available.

Listing 5. useDeferredValue in action

const [text, setText] = useState("hello");
const deferredText = useDeferredValue(text, { timeoutMs: 5000 });
// ...
<MySlowList text={deferredText} />

Similar to how we wrapped a transition with useTransition, here we wrap a value with useDeferredValue. This allows the old value to remain on screen for up to the timeoutMs value while the new value loads. All the complexity of managing this improved render is handled behind the curtain by React and the data store.

Solution to race conditions

Another benefit of using Suspense and concurrent mode is that the race conditions introduced by manually loading data in lifecycle hooks and methods are avoided. Data is guaranteed to arrive and be applied in the order it is requested. (This is similar to how Redux fixes race conditions.) The new mode therefore obviates the need to manually check for stale data caused by interleaved request responses.
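To see what concurrent mode saves you from, here is a sketch of the manual guard such code otherwise needs. makeLatestGuard is a hypothetical helper: each new request invalidates the previous one, so a slow, stale response can be detected and discarded instead of overwriting newer data.

```javascript
// Manual staleness guard for out-of-order responses.
// Each call to start() invalidates all earlier requests; the
// returned isCurrent() tells a response handler whether its
// data is still the most recently requested.
function makeLatestGuard() {
  let latest = 0;
  return function start() {
    const id = ++latest;
    return function isCurrent() {
      return id === latest;
    };
  };
}

// Usage sketch: the first (slower) request becomes stale the
// moment a second one starts, so its response can be ignored.
const start = makeLatestGuard();
const firstIsCurrent = start();  // request A begins
const secondIsCurrent = start(); // request B begins; A is now stale
```

Every component that fetches in a lifecycle hook needs some version of this bookkeeping; with Suspense it simply disappears.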

These are some of the highlights of the new concurrent mode. They offer compelling benefits that will become the norm going forward.

Copyright © 2021 IDG Communications, Inc.