python · pandas · kdb · qpython

How can I commit a dataframe to kdb as a new table?


I'm aware that this question has already been answered here; however, as a new q-bie I am still quite confused about how to create and insert a new table using q.

My problem: I have a dataframe and I am using qPython to connect to kdb. I want to write this dataframe to kdb as a new table.

My question: it seems like I could do something like this:

import pandas as pd
from qpython import qconnection

df = pd.DataFrame({'sym': ['abc', 'def', 'ghi'], 'price': [10.1, 10.2, 10.3]})
with qconnection.QConnection(host='localhost', port=5001, pandas=True) as q:
    q.sync('{t::x}', df)

But what is this {t::x} here? Would t be my table name, and what would x be?

How do I specify column types and size restrictions, or are they not needed? Thanks!


Solution

  • It's important to note that if you are using qPython 2.x, it should be q.sendSync instead of q.sync:

    q.sendSync(kdbquery, parameters)

    q.sendSync("{t::x}", df)

    {} — function (lambda) syntax
    : — assignment, like = in Python
    :: — global assignment, so that the table is defined outside of the function
    t — the variable name / table name
    x — the parameter; q uses implicit notation for the parameters x, y, z. You could use explicit notation like so: "{[x]t::x}"
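    If you want to write to tables other than a hardcoded t, you can build the lambda string from a table name. A minimal sketch; the helper name and the name validation are my own additions, not part of qPython:

    ```python
    import re

    def make_assign_query(table_name):
        # Build a q lambda that globally assigns its argument to table_name,
        # e.g. make_assign_query('trades') -> '{trades::x}'.
        # Validate the name first so arbitrary q code can't be injected.
        if not re.fullmatch(r'[A-Za-z][A-Za-z0-9_]*', table_name):
            raise ValueError('invalid q table name: %s' % table_name)
        return '{%s::x}' % table_name

    # Usage with an open QConnection q (assumed, not run here):
    # q.sync(make_assign_query('trades'), df)
    ```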
    

    No need to worry about column types, as kdb will interpret these automatically; they can be changed afterwards with a qSQL update statement. Here's a handy kx reference card for datatypes.
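    Since kdb infers types from what qPython serializes, it can help to inspect the DataFrame's dtypes before sending. A sketch; exactly how each pandas dtype arrives on the kdb side depends on qPython's serializer and metadata, so the mapping comments are assumptions to verify against your server:

    ```python
    import pandas as pd

    df = pd.DataFrame({'sym': ['abc', 'def', 'ghi'],
                       'price': [10.1, 10.2, 10.3]})

    # float64 columns typically arrive as kdb floats; how string (object)
    # columns arrive (symbols vs. character lists) depends on qPython's
    # metadata, so check `meta t` on the kdb side after sending.
    dtypes = {col: str(dt) for col, dt in df.dtypes.items()}

    # If a column lands with the wrong type, cast it with a qSQL update,
    # e.g. force price to float ("f") on an open connection (not run here):
    # q.sync('update "f"$price from `t')
    ```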

    There is no column size limit as such, but there is a 2GB limit on data sent over IPC. Edit: kdb+ 3.4 and later raises the IPC limit to 1TB.
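    If a single DataFrame risks exceeding the IPC message limit, one option is to send it in row-wise chunks and append after the first message. A sketch; the chunking helper is my own, and the upsert usage assumes the chunks all share the same schema:

    ```python
    import pandas as pd

    def chunk_frame(df, rows_per_chunk):
        # Split a DataFrame into row-wise chunks so each IPC message stays
        # well under the limit; size rows_per_chunk from your row width.
        return [df.iloc[i:i + rows_per_chunk]
                for i in range(0, len(df), rows_per_chunk)]

    # Usage with an open QConnection q (assumed, not run here): create the
    # table with the first chunk, then upsert the rest.
    # chunks = chunk_frame(big_df, 100_000)
    # q.sync('{t::x}', chunks[0])
    # for c in chunks[1:]:
    #     q.sync('{`t upsert x}', c)
    ```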