This has almost certainly been asked. I have time-series data that looks like this in a .csv file. The file has no header row; the fields are: date,time,open,high,low,close,volume,0
2017-09-22,21:50:00,131.36,131.415,131.35,131.39,78489,0
2017-09-22,21:55:00,131.39,131.4,131.37,131.38,95322,0
2017-09-22,22:00:00,131.38,131.6,131.31,131.39,1212804,0
I am attempting to import this data into InfluxDB from Python. I see in the documentation that you have to build the JSON body for it line by line, but I am uncertain how this is done:
This is close, but no cigar. Assume the line is already split and the datetime timestamp is already in the correct format. When I print the json_body below, I get e.g.
[{'fields': {'Int_value': '1212804', 'Float_value': '131.39'}, 'time': datetime.datetime(2017, 9, 22, 22, 0), 'measurement': 'quote'}]
which doesn't seem right: I seem to be getting only one float value.
What is the correct JSON for input to InfluxDB?
import datetime
from influxdb import client as influxdb

db = influxdb.InfluxDBClient("127.0.0.1", 8086, "", "", "stocks")

def read_data(filename):
    print(filename)
    with open(filename) as f:
        lines = f.readlines()[1:]
    return lines

if __name__ == '__main__':
    filename = r"jnj.us.csv"
    lines = read_data(filename)
    for rawline in lines:
        line = rawline.split(",")
        d = getPythonDateTimeFromStr(line[0], line[1])  # helper defined elsewhere
        # EVERYTHING UP TO HERE WORKS. Not sure how to create the json below
        # ====================================
        json_body = [
            {
                "measurement": "quote",
                "time": d,
                "fields": {
                    "Float_value": line[2],
                    "Float_value": line[3],
                    "Float_value": line[4],
                    "Float_value": line[5],
                    "Int_value": line[6]
                }
            }
        ]
        print(json_body)
        db.write_points(json_body)
Duh,
"fields": {
    "Float_value": line[2],
    "Float_value": line[3],
    "Float_value": line[4],
    "Float_value": line[5],
    "Int_value": line[6]
}
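The reason this fails: a Python dict literal cannot hold duplicate keys, so each repeated "Float_value" entry silently overwrites the previous one and only the last value survives. A quick sketch using the values from the sample row shows the collapse:

```python
# Duplicate keys in a dict literal: only the last value per key is kept.
fields = {
    "Float_value": "131.38",
    "Float_value": "131.6",
    "Float_value": "131.31",
    "Float_value": "131.39",
    "Int_value": "1212804",
}
print(fields)       # {'Float_value': '131.39', 'Int_value': '1212804'}
print(len(fields))  # 2
```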
Should be
"fields": {
    "Open": line[2],
    "High": line[3],
    "Low": line[4],
    "Close": line[5],
    "Volume": line[6]
}
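Putting it together, a corrected point builder might look like the sketch below. Two assumptions beyond the answer above: it parses the date and time columns with datetime.strptime (in place of the poster's getPythonDateTimeFromStr helper, whose definition is not shown), and it casts the values with float()/int() so InfluxDB stores them as numbers rather than strings.

```python
import datetime

def row_to_point(line):
    """Convert one split CSV row into an InfluxDB point dict."""
    # Combine the date and time columns into a single timestamp
    # (assumes the YYYY-MM-DD and HH:MM:SS formats shown in the sample data).
    d = datetime.datetime.strptime(line[0] + " " + line[1],
                                   "%Y-%m-%d %H:%M:%S")
    return {
        "measurement": "quote",
        "time": d,
        "fields": {
            # Distinct keys, cast to numeric types.
            "Open": float(line[2]),
            "High": float(line[3]),
            "Low": float(line[4]),
            "Close": float(line[5]),
            "Volume": int(line[6]),
        },
    }

# One of the sample rows from the question:
row = "2017-09-22,22:00:00,131.38,131.6,131.31,131.39,1212804,0".split(",")
point = row_to_point(row)
print(point)
```

The list of points built this way can then be passed to db.write_points() exactly as in the question.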