
I'm working with a large number of binary files. After a recent change to a local git repo, I tried to push my changes back up to the remote, only to receive the following error.

`remote: fatal: pack exceeds maximum allowed size`

Unfortunately I can't use the strategy described here, since all the changes are contained in a single commit. Any suggestions? How can I get around this pack size restriction?

Madison May
  • Is there a huge file involved? Or just a giant commit with lots of smaller files changed in it? – VonC Jul 12 '14 at 08:48
  • A lot of serialized files are generated on code modification and rerun (so one giant commit with lots of smaller files). – Madison May Jul 13 '14 at 02:59
  • For GitHub, I would keep each push size below 2 GB. Also consider setting `http.postBuffer` to as high as [2000000000](https://stackoverflow.com/a/64565533/). – Asclepius Oct 28 '20 at 02:58
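
For reference, the setting mentioned in that last comment can be applied with a single command; the value below is simply the one suggested in the comment:

```sh
# Raise Git's HTTP POST buffer to ~2 GB so large pushes over HTTP(S)
# are sent in one request; the value comes from the comment above.
git config http.postBuffer 2000000000
```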

1 Answer


> A lot of serialized files are generated on code modification and rerun (so one giant commit with lots of smaller files)

That means you can split that huge commit into several smaller ones.

  • A `git reset HEAD~` will be enough to "un-commit" all the files (they remain in your working tree).
  • Then add a subset of the files, and commit.
  • Repeat for the remaining files.
  • Push the resulting collection of commits (see the sketch just below this list).
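
A minimal sketch of that workflow, assuming the generated files live under a hypothetical `data/` directory and committing them 500 at a time (both the path and the batch size are placeholders to adjust):

```sh
# Un-commit the last (huge) commit; the files stay in the working tree.
git reset HEAD~

# Stage and commit the generated files in batches of 500.
# "data/" and the batch size are placeholders; filenames are assumed
# to contain no spaces or newlines.
find data -type f | split -l 500 - /tmp/batch_
for batch in /tmp/batch_*; do
    xargs git add < "$batch"
    git commit -m "Add serialized files ($(basename "$batch"))"
done
rm -f /tmp/batch_*

# Push the whole series, or push after each commit if a single push
# would still produce a pack that is too big.
git push origin master
```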

Finally, modify your script (which by default adds and commits everything after that "serialized files" generation) so that it adds and commits only a batch of files at a time (instead of everything), as sketched below.
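
As a sketch of that change, assuming the script currently ends with something like `git add -A && git commit`, it could call a small helper along these lines instead (the function name, path handling, and batch size are illustrative):

```sh
# Commit whatever the generation step changed, 500 paths per commit,
# pushing after each commit so no single pack grows too large.
# (Assumes paths without spaces; renames are not handled.)
commit_in_batches() {
    git status --porcelain | awk '{print $2}' | split -l 500 - /tmp/chunk_
    for chunk in /tmp/chunk_*; do
        xargs git add < "$chunk"
        git commit -m "Regenerate serialized files ($(basename "$chunk"))"
        git push
    done
    rm -f /tmp/chunk_*
}
```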

VonC