javascript, reactjs, react-hooks, async-await, streaming

React 18 streaming chat messages updating with stale state


i’m building a small AI chat ui in react (next.js app router, react 18) that streams tokens from my backend (openai-style stream). basic flow: push the user message, POST the whole history to /api/chat, then read the response body and append tokens as they arrive.

it “kind of” works locally, but when i click fast / send multiple prompts, or in a production build, the ui goes crazy: assistant messages keep getting overwritten or duplicated.

i know about react 18 concurrent rendering / strict mode double invoking, stale closures, etc. my gut says i’m closing over messages inside the async function and then using it in setMessages([...messages, ...]) after new renders have already happened, but i’m not 100% sure what the idiomatic pattern is here for streaming ai tokens.

here’s a simplified version of what i’m doing right now (this is the broken one). where exactly is the bug and how would you structure this properly for streaming ai responses?

import { useState } from "react";

type Message = {
  id: string;
  role: "user" | "assistant";
  content: string;
};

export default function Chat() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [loading, setLoading] = useState(false);

  const handleSend = async (userInput: string) => {
    if (!userInput.trim()) return;

    // push user message
    const userMsg: Message = {
      id: crypto.randomUUID(),
      role: "user",
      content: userInput,
    };
    setMessages([...messages, userMsg]);

    setLoading(true);

    try {
      const res = await fetch("/api/chat", {
        method: "POST",
        body: JSON.stringify({
          messages: messages, // send whole history
          input: userInput,
        }),
        headers: { "Content-Type": "application/json" },
      });

      const reader = res.body?.getReader();
      let assistantMsg: Message = {
        id: crypto.randomUUID(),
        role: "assistant",
        content: "",
      };

      if (!reader) {
        setLoading(false);
        return;
      }

      // stream chunks
      while (true) {
        const { value, done } = await reader.read();
        if (done) break;

        const chunk = new TextDecoder().decode(value || new Uint8Array());
        assistantMsg.content += chunk;

        // ❌ this is where things go wrong when sending multiple messages fast
        // messages here is not the latest one and strict mode makes it worse
        setMessages([
          ...messages,
          userMsg,
          assistantMsg, // keeps getting overwritten / duplicated
        ]);
      }
    } catch (e) {
      console.error(e);
    } finally {
      setLoading(false);
    }
  };

  return (
    <div>
      {/* imagine there is an input that calls handleSend */}
      {messages.map((m) => (
        <div key={m.id}>
          <b>{m.role}:</b> {m.content}
        </div>
      ))}
      {loading && <div>Thinking…</div>}
    </div>
  );
}

Solution

  • Yes, you are closing over the outer messages state value when you call setMessages([...messages, userMsg]);, and that value easily goes stale. The problem is most visible in the while-loop: React never re-binds variables inside an already-running function. Within the handleSend body, messages is a const captured from the render that created the handler, so it does not reflect any of the state updates you enqueue, and every iteration spreads the same stale array.

    You should use the functional update form of the state setter, whose callback receives the current state value.

    Example:

    setMessages(messages => [...messages, userMsg]);
    
    // inside the stream loop: append the assistant message once before the
    // loop, then replace its content by id on each chunk — re-spreading
    // [...messages, userMsg, assistantMsg] per chunk would append duplicates
    setMessages(prev =>
      prev.map(m =>
        m.id === assistantMsg.id
          ? { ...m, content: assistantMsg.content }
          : m
      )
    );
    

    This way you are always updating from the previous messages state value, not from whatever happened to be closed over when the handler was created.
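    The chunk-merge step can also be factored into a pure helper (the `upsertMessage` name here is just illustrative, not part of any library) that either appends the assistant message or replaces its content by id. Because it is pure and keyed by id, it is idempotent per chunk state and safe to call from a functional updater, including under Strict Mode's double invocation:

```typescript
type Message = {
  id: string;
  role: "user" | "assistant";
  content: string;
};

// Pure helper: "upsert" a message by id.
// If a message with the same id already exists, replace its content;
// otherwise append it. Returns a new array either way.
function upsertMessage(prev: Message[], msg: Message): Message[] {
  const exists = prev.some((m) => m.id === msg.id);
  return exists
    ? prev.map((m) => (m.id === msg.id ? { ...m, content: msg.content } : m))
    : [...prev, msg];
}

// inside the stream loop you would accumulate the text locally and call:
// setMessages(prev => upsertMessage(prev, { ...assistantMsg }));
```

    A sketch under the assumptions above; the key design choice is that the streaming loop only owns the accumulated string, while the list shape is always derived from the latest state React hands to the updater.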