
I have been trying to write a large amount (>800 MB) of data to a JSON file; it took a fair amount of trial and error to arrive at this code:

import json

def write_to_cube(data):
    # Read the whole existing file, merge the new data into it, then rewrite it.
    with open('test.json') as file1:
        temp_data = json.load(file1)

    temp_data.update(data)

    # The with-blocks close the files automatically, so no explicit close() is needed.
    with open('test.json', 'w') as f:
        json.dump(temp_data, f)

To run it, just call the function `write_to_cube({"some_data": data})`.

Now the problem with this code is that it's fast for small amounts of data, but the problem comes when the test.json file has more than 800 MB in it. When I try to update or add data to it, it takes ages.

I know there are external libraries such as simplejson or jsonpickle; I am not quite sure how to use them.

Is there any other way to approach this problem?

Update:

I am not sure how this can be a duplicate; the other questions say nothing about writing or updating a large JSON file, only about parsing:

Is there a memory efficient and fast way to load big json files in python?

Reading rather large json files in Python

Neither of the above makes this question a duplicate. They don't say anything about writing or updating.

  • Possible duplicate of [Is there a memory efficient and fast way to load big json files in python?](http://stackoverflow.com/questions/2400643/is-there-a-memory-efficient-and-fast-way-to-load-big-json-files-in-python) – BPL Sep 06 '16 at 00:36
  • Basically JSON is not a good choice of format when it comes to the serialization of large amounts of data. – Klaus D. Sep 06 '16 at 00:39
  • Possible duplicate of [Reading rather large json files in Python](http://stackoverflow.com/questions/10382253/reading-rather-large-json-files-in-python) – Sreejith Menon Sep 06 '16 at 00:53
  • @BPL But none of them say anything about writing large data or updating it. – Akshay Sep 06 '16 at 02:39
  • Your question has been asked and answered in the below threads: http://stackoverflow.com/questions/10382253/reading-rather-large-json-files-in-python and http://stackoverflow.com/questions/2400643/is-there-a-memory-efficient-and-fast-way-to-load-big-json-files-in-python – Sreejith Menon Sep 06 '16 at 00:52
  • @SreejithMenon But that's only to parse not to write and update. How can it be a duplicate? – Akshay Sep 06 '16 at 02:43
  • @KlausD. What do you recommend? – Akshay Sep 06 '16 at 02:50
  • Are you sure the time is being taken in the writing, rather than the reading? How have you proved that to yourself? – GreenAsJade Sep 06 '16 at 03:14
  • @GreenAsJade I have been testing this for over a month now; whenever there is a small amount of data, this procedure works fine. But when you read a large file, try to update it and then write it back, it slows down quite a lot. – Akshay Sep 06 '16 at 03:20
  • Sure, but the code you listed above could be slow in reading, or in writing, or in both. Which is it? – GreenAsJade Sep 06 '16 at 03:23
  • @GreenAsJade Both. If the JSON file has large data. – Akshay Sep 06 '16 at 03:24
  • There are other json encoders on https://pypi.python.org like ultrajson or ujson that tend to be faster than the default implementation. Try them out. They tend to serialize/deserialize to memory and then read/write the file, so the memory overhead is 800 MB plus well over a gig for the in-memory object (its size depends on what the data is... it could easily be multiple gigs). The thing is, a json file of that size is going to be slow unless you have a machine with significant RAM and fast storage. You should rethink how this data is being stored. A JSON file for this data seems like a really bad idea. – tdelaney Sep 06 '16 at 03:33
  • To try using simplejson, install it from PyPI (`pip install simplejson`), then `import simplejson as json` instead of `import json`. – Quan To Sep 06 '16 at 04:30
  • Usually, for a long operation, I'll just go with threads and callbacks – Quan To Sep 06 '16 at 04:33
  • @tdelaney that's a good option. – Akshay Sep 06 '16 at 04:43
  • @btquanto i'll give it a try – Akshay Sep 06 '16 at 04:43
  • The reason I marked it a duplicate is that to update the JSON file you have to first read the data, then update the JSON object, and then write it. For both reading and writing the data, the links in my comment should be able to help. – Sreejith Menon Sep 06 '16 at 15:25
  • @SreejithMenon No, it doesn't. If it had, I wouldn't have even asked this question. It just doesn't make sense. And reading data is only a PART of it. – Akshay Sep 06 '16 at 23:15
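
A minimal sketch of the drop-in swap suggested in the comments above, assuming `ujson` (or `simplejson`) has been installed with pip; everything else is the same read-update-write cycle as in the question:

import ujson as json  # drop-in replacement for the stdlib json module
# import simplejson as json  # alternative, as suggested in the comments

def write_to_cube(data):
    # Same flow as the original function; only the encoder module has changed.
    with open('test.json') as f:
        temp_data = json.load(f)
    temp_data.update(data)
    with open('test.json', 'w') as f:
        json.dump(temp_data, f)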

1 Answer


So the problem is that you have a long-running operation. Here are a few approaches I usually take:

  1. Optimize the operation: this rarely works. I can't recommend any superb library that would parse the JSON more than a few seconds faster.
  2. Change your logic: if the purpose is simply to save and load data, you could try something else, such as storing your object in a binary file instead; see the first sketch after this list.
  3. Threads and callbacks, or deferred objects in some web frameworks. In web applications the operation sometimes takes longer than a request can wait, so we do the work in the background (typical cases: zipping files and then sending the zip to the user's email, sending SMS by calling another third party's API...); see the second sketch after this list.
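
For point 2, a minimal sketch of moving away from a single JSON file, using the standard-library `shelve` module purely as an illustration (the store name `test_store` is made up); a shelf behaves like a persistent dict, so an update only re-serializes the keys you touch instead of rewriting the whole file:

import shelve

def write_to_cube(data):
    # Sketch only: persist to a dbm/pickle-backed store instead of one big JSON file.
    with shelve.open('test_store') as db:
        db.update(data)  # e.g. data = {"some_data": [...]}

For point 3, one common shape of "do the work in the background" is a worker thread plus a callback; a rough sketch (the helper name and the callback are illustrative, not tied to any framework):

import threading

def run_in_background(func, args, callback):
    # Run the slow write on a worker thread and invoke the callback when done,
    # so the caller is not blocked while the large file is rewritten.
    def worker():
        func(*args)
        callback()
    threading.Thread(target=worker, daemon=True).start()

# Usage sketch:
# run_in_background(write_to_cube, ({"some_data": data},), lambda: print("saved"))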
Quan To