
I don't know the most efficient way to write table records to CSV using Python. Is there a best method for writing a CSV file with more than 1 million records from a Postgres DB? I thought of using Celery with RabbitMQ for this, but I have no idea how to generate the tasks in the broker.

Any ideas on this would be very helpful.

Is the Postgres COPY command more effective than Celery? If so, won't the server crash if we are writing more than 10 million records to CSV?
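
For reference, this is roughly the direction I was considering with COPY. It is just a sketch, assuming psycopg2; the connection string, table name, and output path are placeholders for my real ones:

```python
# Sketch only: the connection string, table name, and file path below are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=mydb user=myuser password=secret host=localhost")
try:
    with conn.cursor() as cur, open("export.csv", "w", newline="") as f:
        # COPY ... TO STDOUT streams rows from the server straight into the local file
        cur.copy_expert("COPY my_table TO STDOUT WITH (FORMAT CSV, HEADER)", f)
finally:
    conn.close()
```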

I am using FastAPI as my backend.

Thank you in advance.

Ramu
  • Does this answer your question? [How to export table as CSV with headings on Postgresql?](https://stackoverflow.com/questions/1120109/how-to-export-table-as-csv-with-headings-on-postgresql) – 404 Jan 08 '22 at 11:32
  • What if the COPY command takes a long time to write 10 million records to CSV? The server will crash, right? – Ramu Jan 08 '22 at 11:36
  • It does not take very long to export data with the COPY command. I don't think anything is faster than that. There is no reason to expect the server to crash even if it takes a long time. – 404 Jan 08 '22 at 11:39
  • So basically using Celery would be unnecessary, right? How can I integrate this COPY command into my controller so that it writes the file automatically? – Ramu Jan 08 '22 at 11:45
  • The COPY command is the fastest way to export; I don't see why PostgreSQL would crash if you export data. – mshabou Jan 08 '22 at 12:19
  • I want to include the export logic in one of my controllers (API endpoint), but there will be a delay in the response, right? If there are a lot of records, I'm not sure whether there is a better way to implement this logic from the API endpoint's point of view (see the sketch after these comments). – Ramu Jan 08 '22 at 13:06
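
For illustration, here is a minimal sketch of one way to wire the COPY export into a FastAPI endpoint without blocking the response, using FastAPI's BackgroundTasks. The connection string, table name, route path, and output file are placeholders, not a definitive implementation:

```python
# Sketch only: connection string, table name, and output path are placeholders.
import psycopg2
from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

def export_table_to_csv(path: str) -> None:
    # Runs after the response has been sent, so the endpoint returns immediately.
    conn = psycopg2.connect("dbname=mydb user=myuser password=secret host=localhost")
    try:
        with conn.cursor() as cur, open(path, "w", newline="") as f:
            cur.copy_expert("COPY my_table TO STDOUT WITH (FORMAT CSV, HEADER)", f)
    finally:
        conn.close()

@app.post("/export")
def trigger_export(background_tasks: BackgroundTasks):
    # Queue the export and respond right away instead of waiting for the file to finish.
    background_tasks.add_task(export_table_to_csv, "/tmp/export.csv")
    return {"status": "export started"}
```

A background task like this avoids the response delay for long exports; a job queue such as Celery would only be needed if the work has to survive server restarts or run on separate workers.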

0 Answers