I'm trying to merge very large CSV files together using pandas and keep running out of memory. Is there a more memory-efficient way to do this?
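For reference, this is roughly what the plain pandas version looks like (the CSV file names below are just placeholders, but the join keys are the ones I'm actually using):

import pandas as pd

# read both large CSVs fully into memory, then inner-join on the shared keys
gcs_df = pd.read_csv('gcs.csv')
dias_fio_df = pd.read_csv('dias_fio.csv')
temp_df = pd.merge(gcs_df, dias_fio_df, how='inner', on=['charttime', 'subject_id'])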
I've also tried using Dask DataFrames instead of plain pandas, but I still ran into the same problem:
import dask.dataframe as dd

temp_df = dd.merge(gcs_df, dias_fio_df, how='inner', on=['charttime', 'subject_id'])
I get a MemoryError: [screenshot of the error]