I have been trying to write a large amount (>800 MB) of data to a JSON file; it took a fair amount of trial and error to arrive at this code:
import json

def write_to_cube(data):
    # load the existing file into memory
    with open('test.json') as file1:
        temp_data = json.load(file1)

    # merge in the new data
    temp_data.update(data)

    # write everything back out (the with blocks close the files automatically)
    with open('test.json', 'w') as f:
        json.dump(temp_data, f)
To run it, just call the function: write_to_cube({"some_data": data})
Now the problem with this code is that it's fast for small amounts of data, but it slows down badly once test.json grows past about 800 MB. When I try to update or add data to it, it takes ages.
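For context, here is a minimal timing sketch (same test.json as above, nothing else assumed) that shows both halves touch the entire file: the load has to parse everything already on disk, and the dump has to re-serialise all of it again:

import json, time

start = time.time()
with open('test.json') as f:
    temp_data = json.load(f)      # parses the whole existing file into memory
print('load took', time.time() - start, 'seconds')

start = time.time()
with open('test.json', 'w') as f:
    json.dump(temp_data, f)       # re-serialises the whole structure back to disk
print('dump took', time.time() - start, 'seconds')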
I know there are external libraries such as simplejson or jsonpickle, but I am not entirely sure how to use them.
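As far as I can tell, simplejson exposes the same load/dump interface as the standard-library json module, so trying it would only be an import change; this is just a sketch of how it would slot into the function above, not a claim that it fixes the speed:

import simplejson as json  # drop-in replacement for the stdlib json module

def write_to_cube(data):
    with open('test.json') as file1:
        temp_data = json.load(file1)   # same call signature as the stdlib json.load
    temp_data.update(data)
    with open('test.json', 'w') as f:
        json.dump(temp_data, f)        # same call signature as the stdlib json.dump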
Is there another way to approach this problem?
Update:
I am not sure how this can be a duplicate; the other questions say nothing about writing or updating a large JSON file, only about parsing one:
Is there a memory efficient and fast way to load big json files in python?
Reading rather large json files in Python
Neither of the above resolves my question as a duplicate; they don't say anything about writing or updating.