
I'm trying to import large Excel files with multiple sheets into my SQL Server database, but the problem is that it takes too much time. Is there any way I can make it more efficient, or do it in a better way? I'm kind of new to this language, so any help would be appreciated. Here is my code:

    import pandas as pd

    filename = input("Input the Filename: ")
    # sheet_name=None reads every sheet into a dict of DataFrames keyed by sheet name
    dfs = pd.read_excel(filename, usecols=['SR_NO', 'NTN'], sheet_name=None)

    # Convert each sheet to a list of rows.  (The original loop concatenated
    # every sheet again for every key, repeating the same work once per sheet.)
    d = {}
    for k, v in dfs.items():
        d[k] = v.to_numpy().tolist()

    # to test sheet names
    print(d.keys())
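Since `executemany` later expects one flat `records` list, here is a minimal sketch (with made-up sheet data standing in for the result of `pd.read_excel(..., sheet_name=None)`) of flattening all sheets into plain row tuples in a single pass:

```python
import pandas as pd

# Made-up sheets standing in for pd.read_excel(..., sheet_name=None)
dfs = {
    "Sheet1": pd.DataFrame({"SR_NO": [1, 2], "NTN": ["A1", "A2"]}),
    "Sheet2": pd.DataFrame({"SR_NO": [3], "NTN": ["B1"]}),
}

# pd.concat stitches every sheet together exactly once; itertuples with
# name=None yields plain tuples, which is what executemany wants
records = list(
    pd.concat(dfs.values(), ignore_index=True).itertuples(index=False, name=None)
)

print(records)  # [(1, 'A1'), (2, 'A2'), (3, 'B1')]
```

This avoids both the repeated concatenation and the numpy round-trip of `.to_numpy().tolist()`.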

    # cursor connection to insert records
    try:
        cursor = conn.cursor()
        cursor.executemany(sql_insert, records)
        conn.commit()
    except Exception as e:
        conn.rollback()
        print(str(e))  # exception objects are not subscriptable, so e[1] would fail
    finally:
        print('Task is complete.')
        cursor.close()
        conn.close()
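On the insert side, two things usually help: if you are connecting through `pyodbc`, setting `cursor.fast_executemany = True` before calling `executemany` sends the rows as a bulk parameter array instead of one round trip per row; and splitting a very large `records` list into batches keeps memory bounded. A small sketch of a batching helper (the `chunked` name and the batch size are just for illustration):

```python
def chunked(rows, size):
    """Yield successive slices of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# With a pyodbc cursor this would be used roughly as:
#     cursor.fast_executemany = True
#     for batch in chunked(records, 10_000):
#         cursor.executemany(sql_insert, batch)
#     conn.commit()

print(list(chunked([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```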
