Tags: javascript, typescript, asynchronous, streaming

How to split chunks into individual JSON objects


I am fetching from an API which sends data in a stream. Unfortunately, when I process the data in chunks, the chunks come through as multiple concatenated JSON strings like this:

         { 
           "productName": "bag",
           "code": "BGX-112"
         } 
         { 
           "productName": "purse",
           "code": "PUSR-112"
         } 
         etc.


Here is the code that processes the data:


            const getProductData = async (url: string) => {
                let chunks: any[] = [];
                try {
                    const response = await fetch(url)
                    const stream = response?.body.pipeThrough(new TextDecoderStream());
                    const reader = stream.getReader()
                    while (true) {
                        const {value, done} = await reader.read();
                        if (done) {
                            // Flush any buffered characters.
                            const stringArray = value?.split(/\b\s/);
                            chunks.push(value);
                            return chunks;
                        }
                        if (value) {
                            const stringArray = value?.split(/\b\s/);
                            chunks.push(value);
                        }
                    }
                } catch (e) {
                    console.error("Error ", e)
                }
            }



I want to get the chunks as individual JSON objects so I can parse them, but I don't know how. How can I achieve this?


Solution


    To handle streamed JSON data, use TextDecoder to assemble the incoming chunks into a single string, look for delimiters (e.g. the }\n{ boundary between objects) to identify complete JSON objects, split the string at those points, and parse each segment with JSON.parse(). The examples below make the delimiter unambiguous by having the server write one object per line.
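
    As a minimal sketch of just that buffering-and-splitting step (a hypothetical helper, not part of the examples below; it assumes the runtime supports regex lookbehind and that string values never contain a "} {" boundary), a small splitter could look roughly like this:

    class JsonChunkSplitter {
        private buffer = '';

        // Feed one decoded text chunk; returns the complete objects found so far.
        push(chunk: string): any[] {
            this.buffer += chunk;
            // Split wherever a "}" is followed (possibly after whitespace) by a "{".
            const parts = this.buffer.split(/(?<=\})\s*(?=\{)/);
            // The last part may still be incomplete, so keep buffering it.
            this.buffer = parts.pop() ?? '';
            return parts.map(part => JSON.parse(part));
        }

        // Call once the stream is done to parse whatever is left in the buffer.
        flush(): any[] {
            const rest = this.buffer.trim();
            this.buffer = '';
            return rest ? [JSON.parse(rest)] : [];
        }
    }

    Feeding each decoded chunk to push() yields complete objects as they arrive, and flush() parses the final buffered object once the stream ends.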

    Server-Side:

    const express = require('express');
    const app = express();
    const port = 3000;
    
    app.get('/large-json', (req, res) => {
        res.setHeader('Content-Type', 'application/json');
    
        for (let i = 0; i < 1000; i++) {
            const jsonObject = JSON.stringify({ productName: `item${i}`, code: `CODE-${i}` });
            res.write(jsonObject + '\n');  // Separate objects with a newline
        }
    
        res.end();
    });
    
    app.listen(port, () => {
        console.log(`Server running at http://localhost:${port}/`);
    });
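
    With that change the streamed body is newline-delimited JSON, one complete object per line, for example:

    {"productName":"item0","code":"CODE-0"}
    {"productName":"item1","code":"CODE-1"}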
    

    Client-Side:

    const getProductData = async (url: string) => {
        try {
            const response = await fetch(url);
            const reader = response.body?.getReader();
    
            let decoder = new TextDecoder();
            let partialData = '';
    
            if (!reader) throw new Error("Stream reader not available");
    
            while (true) {
                const { value, done } = await reader.read();
                if (done) break;
    
                // Decode the binary chunk and append it to the text buffered so far.
                partialData += decoder.decode(value, { stream: true });

                // Each complete line holds one JSON object; the last element may be
                // a partial object, so keep it in the buffer for the next chunk.
                let lines = partialData.split('\n');
                partialData = lines.pop() || '';
    
                for (let line of lines) {
                    if (line.trim()) {
                        try {
                            const jsonObject = JSON.parse(line);
                            console.log(jsonObject);
                        } catch (e) {
                            console.error("JSON parsing error: ", e);
                        }
                    }
                }
            }
    
            // Handle anything still buffered after the stream ends
            // (e.g. when the last object has no trailing newline).
            if (partialData.trim()) {
                try {
                    const jsonObject = JSON.parse(partialData);
                    console.log(jsonObject);
                } catch (e) {
                    console.error("JSON parsing error in final chunk: ", e);
                }
            }
        } catch (e) {
            console.error("Error ", e);
        }
    };
    getProductData("http://localhost:3000/large-json");
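
    If the server cannot be changed to write one object per line, the same loop shape also works with the JsonChunkSplitter sketched earlier (a hypothetical helper, same assumptions as above), keeping the pipeThrough(new TextDecoderStream()) approach from the question and returning the parsed objects instead of logging them:

    const getProducts = async (url: string): Promise<any[]> => {
        const splitter = new JsonChunkSplitter();
        const products: any[] = [];

        const response = await fetch(url);
        // Decode bytes to text, as in the original question.
        const stream = response.body?.pipeThrough(new TextDecoderStream());
        if (!stream) throw new Error("Response has no body");

        const reader = stream.getReader();
        while (true) {
            const { value, done } = await reader.read();
            if (done) break;
            // Collect every complete object found so far.
            products.push(...splitter.push(value));
        }
        // Parse whatever is still buffered once the stream ends.
        products.push(...splitter.flush());
        return products;
    };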