I don't know the most efficient way to write table records to a CSV file using Python. Is there a best method for writing more than 1 million records from a Postgres DB to CSV? I thought of using Celery with RabbitMQ for this, but I have no idea how to generate the tasks in the broker.
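Here is roughly what I had in mind for the Celery side (just a sketch, not working code; the broker URL, connection string, table name, and paths are placeholders I made up):

```python
# tasks.py -- roughly what I had in mind; broker URL, DSN, and paths
# are placeholders, not real values.
import csv

import psycopg2
from celery import Celery
from psycopg2 import sql

app = Celery("exports", broker="amqp://guest:guest@localhost:5672//")

@app.task
def export_table_to_csv(table_name: str, out_path: str) -> str:
    """Stream rows out of Postgres in batches and write them to a CSV file."""
    conn = psycopg2.connect("dbname=mydb user=me password=secret host=localhost")
    try:
        # A named (server-side) cursor fetches rows in chunks of itersize,
        # so the worker never holds a million rows in memory at once.
        with conn.cursor(name="csv_export") as cur:
            cur.itersize = 10_000
            cur.execute(
                sql.SQL("SELECT * FROM {}").format(sql.Identifier(table_name))
            )
            with open(out_path, "w", newline="") as f:
                writer = csv.writer(f)
                for row in cur:  # header row omitted for brevity
                    writer.writerow(row)
    finally:
        conn.close()
    return out_path
```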
Any ideas would be very helpful.
Is the Postgres COPY command more effective than Celery? If so, can I be sure the server won't crash when writing more than 10 million records to CSV?
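For reference, this is how I understand the COPY approach would look using psycopg2's copy_expert (again only a sketch, with placeholder connection details):

```python
# The COPY variant I was asking about, via psycopg2's copy_expert;
# connection details are placeholders again.
import psycopg2
from psycopg2 import sql

def copy_table_to_csv(table_name: str, out_path: str) -> None:
    """Let Postgres serialise the rows itself and stream them into a file."""
    conn = psycopg2.connect("dbname=mydb user=me password=secret host=localhost")
    try:
        with conn.cursor() as cur, open(out_path, "w", newline="") as f:
            # COPY ... TO STDOUT streams straight into the file object,
            # so memory use stays flat regardless of row count.
            cur.copy_expert(
                sql.SQL("COPY {} TO STDOUT WITH (FORMAT csv, HEADER)").format(
                    sql.Identifier(table_name)
                ),
                f,
            )
    finally:
        conn.close()
```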
I am using FastAPI as my backend.
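This is roughly how I imagine triggering the export from FastAPI, reusing the Celery task sketched above so the HTTP request doesn't block while millions of rows are written (the endpoint path and output path are made up):

```python
# How I imagine wiring it into FastAPI: the endpoint only enqueues the
# Celery task from the sketch above and returns straight away.
from fastapi import FastAPI

from tasks import export_table_to_csv  # the task sketched earlier

app = FastAPI()

@app.post("/exports/{table_name}")
def start_export(table_name: str) -> dict:
    # .delay() just publishes a message to RabbitMQ; the actual CSV
    # writing happens in a worker, so this request won't time out.
    result = export_table_to_csv.delay(table_name, f"/tmp/{table_name}.csv")
    return {"task_id": result.id}
```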
Thank you in advance.