
I need help getting this working. I have a pandas DataFrame (df) that I need to load into a MySQL database. I don't understand what the error message means or how to fix it.

Any help will be highly appreciated.

This is what I tried:

    import MySQLdb
    from pandas.io import sql

    #METHOD 1 
    db=MySQLdb.connect(host="***",port=***,user="***",passwd="***",db="***")
    df.to_sql(con=db, name='forecast', if_exists='replace', flavor='mysql')
    ##Also tried
    sql.write_frame(df, con=db, name='forecast', if_exists='replace', flavor='mysql')

    DatabaseError: Execution failed on sql: SHOW TABLES LIKE %s
    (2006, 'MySQL server has gone away')
    unable to rollback


    #METHOD 2: using sqlalchemy
    from sqlalchemy import create_engine

    engine = create_engine("mysql+mysqldb://**username***:**passwd**@***host***:3306/**dbname**")
    conn = engine.raw_connection()
    df.to_sql(name='demand_forecast_t', con=conn, if_exists='replace', flavor='mysql', index=False, index_label='rowID')
    conn.close()

The error message is:

    OperationalError: DatabaseError: Execution failed on sql: SHOW TABLES LIKE %s
    (2006, 'MySQL server has gone away') unable to rollback
Franck Dernoncourt
Amrita Sawant

4 Answers


When using sqlalchemy, you should pass the engine and not the raw connection:

    engine = create_engine("mysql+mysqldb://...")
    df.to_sql('demand_forecast_t', engine, if_exists='replace', index=False)

Writing to MySQL without sqlalchemy (that is, specifying flavor='mysql') is deprecated.

If the problem is that the frame is too large to write at once, you can use the chunksize keyword (see the docstring), e.g.:

    df.to_sql('demand_forecast_t', engine, if_exists='replace', chunksize=10000)
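For reference, here is a self-contained sketch of the engine-based call. It uses an in-memory SQLite engine so it runs without a MySQL server; for MySQL you would substitute a mysql+mysqldb URL (credentials below are placeholders):

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for MySQL here; for MySQL the URL would be
# something like "mysql+mysqldb://username:password@host:3306/dbname".
engine = create_engine("sqlite://")

df = pd.DataFrame({"forecast": [1.5, 2.0, 2.5]})

# Pass the engine itself, not engine.raw_connection().
df.to_sql("demand_forecast_t", engine, if_exists="replace", index=False, chunksize=2)

# Read the rows back to confirm the write succeeded.
out = pd.read_sql("SELECT forecast FROM demand_forecast_t ORDER BY forecast", engine)
```

The key point is the second argument to to_sql: the engine, which lets pandas manage its own connections, rather than a raw DBAPI connection.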
joris
  • 121,165
  • 35
  • 238
  • 198

I was able to resolve this issue. I was trying to load a large table into MySQL, and that was what caused the error. A simple for-loop to upload the data in chunks solved it! Many thanks to everyone who replied.
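The for-loop itself isn't shown above; a minimal sketch of the idea (function, table, and column names here are made up) writes the first slice with if_exists='replace' and appends the rest, using an in-memory SQLite engine so the sketch runs without a MySQL server:

```python
import pandas as pd
from sqlalchemy import create_engine

def upload_in_chunks(df, engine, table, chunksize=10000):
    """Write df to `table` one slice at a time, so no single INSERT
    grows past the server's max_allowed_packet limit."""
    for start in range(0, len(df), chunksize):
        chunk = df.iloc[start:start + chunksize]
        chunk.to_sql(table, engine,
                     if_exists="replace" if start == 0 else "append",
                     index=False)

# Usage (SQLite shown for a self-contained run; for MySQL, build the
# engine from a mysql+mysqldb URL instead):
engine = create_engine("sqlite://")
upload_in_chunks(pd.DataFrame({"demand": range(5)}), engine, "forecast", chunksize=2)
```

Note that newer pandas versions do this for you via the chunksize argument to to_sql, so the explicit loop is only needed on old versions.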

Amrita Sawant

For me this was fixed using

    MySQLdb.connect("127.0.0.1", "root", "", "db")

instead of

    MySQLdb.connect("localhost", "root", "", "db")

and then

    df.to_sql('df', sql_cnxn, flavor='mysql', if_exists='replace', chunksize=100)
citynorman

You can write a pandas DataFrame to a MySQL table using the mysql flavor (with a DBAPI connection) as follows.

Step 1: install the MySQLdb module:

    $ sudo apt-get install python-dev libmysqlclient-dev
    $ pip install MySQL-python

Step 2: make a connection with MySQL:

    import MySQLdb
    con = MySQLdb.connect("hostname", "username", "password", "databasename")

Step 3: write the pandas DataFrame to the MySQL table using df.to_sql:

    df.to_sql('TableName', con=con, flavor='mysql', if_exists='replace', chunksize=100)

pyAddict