database, time-series, questdb

"Unexpected Symbol" when ingesting CSV into QuestDB using Java


I am trying to ingest one of QuestDB's sample datasets. The whole file can be seen at the link, but the first lines look like this:

"symbol","side","price","amount","timestamp"
"BTC-USD","sell",25741.02,0.02188038,"2023-09-05T16:00:01.281719Z"
"BTC-USD","buy",25741.03,0.00184646,"2023-09-05T16:00:01.775613Z"
"BTC-USD","buy",25741.03,3.844E-5,"2023-09-05T16:00:02.722748Z"

I want to ingest the data using Java and the REST API. I have this code, but it is giving me an error:

import org.springframework.core.io.FileSystemResource;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

import java.io.File;
import java.util.Map;

public class Test {
    public static void main(String[] arg) {
        try {
            String url = "http://127.0.0.1:9000/imp?fmt=json&overwrite=true&forceHeader=true";
            String filePath = "/Users/admin/btc_trades.csv";

            RestTemplate restTemplate = new RestTemplate();
            HttpHeaders headers = new HttpHeaders();
            headers.setContentType(MediaType.MULTIPART_FORM_DATA);
            MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
            String schema = "[" +
                    "{\"name\":\"symbol\", \"type\": \"varchar\"}," +
                    "{\"name\":\"side\", \"type\": \"varchar\"}" +
                    "{\"name\":\"price\", \"type\": \"double\"}" +
                    "{\"name\":\"amount\", \"type\": \"double\"}" +
                    "{\"name\":\"timestamp\", \"type\": \"timestamp\"}" +
                    "]";
            body.add("schema", schema);
            body.add("data", new FileSystemResource(new File(filePath)));
            HttpEntity<MultiValueMap<String, Object>> requestEntity = new HttpEntity<>(body, headers);
            ResponseEntity<Map> response = restTemplate.postForEntity(url, requestEntity, Map.class);

            System.out.println("Response Status Code: " + response.getStatusCode());
            System.out.println("Response Headers:" + response.getHeaders());
            System.out.println("Response Body: " + response.getBody());
        } catch (Exception e) {
            System.out.println("Exception:" + e.getMessage());
            e.printStackTrace();
        }
    }
}

I got the following output:

Response Status Code: 200 OK
Response Headers:[Server:"questDB/1.0", Date:"Mon, 18 Nov 2024 05:53:23 GMT", Transfer-Encoding:"chunked", Content-Type:"application/json; charset=utf-8"]
Response Body: {status=Unexpected symbol}

Can anyone tell me what's wrong or missing in the above program?


Solution

  • There are two problems here. The first is that the JSON string for the schema is not well formed: commas are missing between the column definitions. The second issue, after fixing the commas, is that when passing a timestamp column in the schema, we need to include the pattern of the timestamp. If we change the schema definition to the following, it should work.

     String schema = "[" +
                        "{\"name\":\"symbol\", \"type\": \"varchar\"}," +
                        "{\"name\":\"side\", \"type\": \"varchar\"}," +
                        "{\"name\":\"price\", \"type\": \"double\"}," +
                        "{\"name\":\"amount\", \"type\": \"double\"}," +
                        "{\"name\":\"timestamp\", \"type\": \"timestamp\", \"pattern\": \"yyyy-MM-ddTHH:mm:ss.SSSUUUZ\"}" +
                        "]";
    

    An alternative would be to create the table beforehand (using the REST API, the pgwire interface, or interactively from the web console), so we don't have to pass a schema when importing. Defining the table manually also allows configuring important things like symbol capacity or deduplication.
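
    A minimal sketch of that alternative, creating the table up front through the REST `/exec` endpoint before importing. The `btc_trades` table name, the `SYMBOL` capacity, the partitioning, and the dedup keys below are illustrative assumptions, not requirements from the question; adjust them to your data.

    ```java
    import org.springframework.web.client.RestTemplate;

    import java.net.URLEncoder;
    import java.nio.charset.StandardCharsets;

    public class CreateTable {
        // Builds the /exec URL with the SQL statement URL-encoded as a query parameter.
        static String buildExecUrl(String host, String sql) {
            return host + "/exec?query=" + URLEncoder.encode(sql, StandardCharsets.UTF_8);
        }

        public static void main(String[] args) {
            // Illustrative DDL: symbol capacity, partitioning, and dedup keys
            // are assumptions here, not taken from the original question.
            String sql = "CREATE TABLE IF NOT EXISTS btc_trades (" +
                    "symbol SYMBOL CAPACITY 256, " +
                    "side SYMBOL, " +
                    "price DOUBLE, " +
                    "amount DOUBLE, " +
                    "timestamp TIMESTAMP" +
                    ") TIMESTAMP(timestamp) PARTITION BY DAY WAL " +
                    "DEDUP UPSERT KEYS(timestamp, symbol)";
            try {
                RestTemplate restTemplate = new RestTemplate();
                String response = restTemplate.getForObject(
                        buildExecUrl("http://127.0.0.1:9000", sql), String.class);
                System.out.println("Response: " + response);
            } catch (Exception e) {
                // The server may not be running; the request is best-effort here.
                System.out.println("Exception: " + e.getMessage());
            }
        }
    }
    ```

    With the table created, the import call no longer needs the `schema` part: POST the file to `/imp?name=btc_trades&fmt=json` and the rows are appended using the existing column types.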