python, sql-server, python-3.x, pypyodbc

How do I call a SQL Server stored procedure with many parameters in Python?


I have to insert records into a db table with more than 50 columns. I know how to call a stored procedure with parameters in Python like this:

    self.__createConnection()
    proc = "{call faftech..Hns_HolidayCheck (?)}"
    params = [dateTime]
    self.__cursor.execute(proc, params)
    val = self.__cursor.fetchone()
    self.__closeConnection()

I want to know the best way to insert data into a table with this many columns. I am using pypyodbc.


Solution

  • There's sadly no easy way to do this. The best approach is to pass in the table name, the column names, and the values, and build up the SQL yourself:

    def insert_row(self, table_name, column_names, values):
        # One '?' placeholder per value; the driver binds them safely.
        params = ','.join(['?'] * len(values))
        # Bracket-quote identifiers for SQL Server.
        column_names = ','.join(f'[{x}]' for x in column_names)
        sql = f'insert into [{table_name}] ({column_names}) values ({params})'
        conn = self.getConnection()
        with conn.cursor() as cursor:
            cursor.execute(sql, values)
        # Needed unless the connection is in autocommit mode.
        conn.commit()
    
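    To see the string-building in isolation, here is a self-contained sketch. It uses an in-memory SQLite database purely for illustration (SQLite happens to accept the same `?` placeholders and `[bracket]` identifier quoting as SQL Server); the table and column names are made up:

    ```python
    import sqlite3

    def build_insert(table_name, column_names):
        # One '?' placeholder per column; values are bound by the driver,
        # never interpolated into the SQL string.
        params = ','.join(['?'] * len(column_names))
        cols = ','.join(f'[{c}]' for c in column_names)
        return f'insert into [{table_name}] ({cols}) values ({params})'

    conn = sqlite3.connect(':memory:')
    conn.execute('create table [holidays] ([name] text, [day] text)')

    sql = build_insert('holidays', ['name', 'day'])
    # sql is: insert into [holidays] ([name],[day]) values (?,?)
    conn.execute(sql, ['New Year', '2024-01-01'])
    row = conn.execute('select [name], [day] from [holidays]').fetchone()
    ```

    The same statement string works unchanged with a pypyodbc cursor, since both drivers use the DB-API `qmark` parameter style.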

    Alternatively, you can send in a dict of column_names => values:

    def insert_row(self, table_name, values):
        # values is a dict mapping column name -> value.
        params = ','.join(['?'] * len(values))
        q, column_names = [], []
        for column_name, value in values.items():
            q.append(value)
            column_names.append(f'[{column_name}]')
        column_names = ','.join(column_names)
        sql = f'insert into [{table_name}] ({column_names}) values ({params})'
        conn = self.getConnection()
        with conn.cursor() as cursor:
            cursor.execute(sql, q)
        # Needed unless the connection is in autocommit mode.
        conn.commit()
    
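    For the dict variant, the SQL-building part can be pulled out and checked on its own (the table and column names here are again hypothetical):

    ```python
    def build_insert_from_dict(table_name, values):
        # Since Python 3.7, dicts preserve insertion order, so iterating
        # the keys and the values separately keeps columns and parameters
        # lined up.
        column_names = ','.join(f'[{c}]' for c in values)
        params = ','.join(['?'] * len(values))
        sql = f'insert into [{table_name}] ({column_names}) values ({params})'
        return sql, list(values.values())

    sql, args = build_insert_from_dict(
        'holidays', {'name': 'New Year', 'day': '2024-01-01'})
    # sql is: insert into [holidays] ([name],[day]) values (?,?)
    # args is: ['New Year', '2024-01-01']
    ```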

    Alternatively, you can use an ORM such as Django's, which will make this much easier to manage.