
I use PHP's copy() function to copy files. To my horror, I have discovered that it copied only 2 588 396 096 bytes of my 15 473 297 984-byte .7z file.

It's being copied between two local disks. Both use NTFS. PHP 7.4.11. Windows 10 Pro 2004.

Please don't tell me to use exec('xcopy blabla') or something like that, since that breaks cross-platform compatibility, which is a must for me to retain. I'd like to understand why copy(), like so many other built-in PHP functions, apparently has a serious, undocumented limitation. This was unexpected, because I didn't think that something as basic as copying a file could possibly be broken.

I did read a user-submitted comment saying something relevant to this, but he claims it stops at 4 GB (which is not the case for me), and he doesn't explain why.

This is an important task and I can't have it just silently "cut off" files like this. It's especially worrying that it didn't even hit the rumored 4 GB limit.

My Windows is 64-bit and so is all my other software, BTW, so it cannot be explained by me using a 32-bit legacy OS.

Minimal code example:

var_dump(copy('C:\test\big.7z', 'D:\test\big.7z'));

Result (after a few seconds):

bool(true)

The target file is 2.41 GB, a fraction of the actual file. And copy() falsely returns "true" even though it was not successful? It apparently thinks that it succeeded.

No error whatsoever is logged. Yes, error logging is turned on.
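
For now, the only guard I can think of is to compare sizes after the fact. A minimal sketch of that check (assuming filesize() itself reports correct 64-bit sizes on my build, which I haven't verified):

$src = 'C:\test\big.7z';
$dst = 'D:\test\big.7z';

$ok = copy($src, $dst);

clearstatcache(true, $dst); // don't trust a cached size for the fresh target
if ($ok && filesize($src) !== filesize($dst)) {
    // copy() reported success, but the target is truncated
    error_log(sprintf(
        'copy() returned true but sizes differ: %d vs %d bytes',
        filesize($src),
        filesize($dst)
    ));
    $ok = false;
}

var_dump($ok);

This doesn't fix anything, of course; it only stops the truncation from going unnoticed.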

  • If the limit is in PHP, you can use shell_exec() with two different commands depending on whether the host OS is Unix or Windows. P.S. a 4 GB limit depends on the filesystem. – Giacomo M Oct 12 '20 at 06:24
  • @GiacomoM But *is* the limit in PHP? And why would it be? I predicted this comment; that's why I addressed it in the question, so I won't repeat myself here. – user14430182 Oct 12 '20 at 06:31
  • I know what you wrote, but if that is your ONLY solution... well, then that is the solution you have :D – Giacomo M Oct 12 '20 at 06:32
  • @GiacomoM If I have to keep writing custom shell commands for everything, the point of using PHP in the first place seems unclear. It's supposed to be cross-platform. And every such "hack" I have to maintain forever as things change and new OSes are added, etc. It's just not a solution. I'm still in disbelief that this copy() function fails to copy the file while thinking that it succeeded. And it doesn't log any error. It makes no sense to me. – user14430182 Oct 12 '20 at 06:37
  • You may be right, but if the problem is a PHP bug (or a feature, who knows?), we do not have any other solution. Try asking the PHP maintainers. – Giacomo M Oct 12 '20 at 06:39
  • check this post https://stackoverflow.com/questions/6564643/copy-large-files-over-2-gb-in-php – Giacomo M Oct 12 '20 at 06:41
  • @GiacomoM In regards to asking PHP maintainers, I already have multiple questions "pending" for months and months on their mailing lists which haven't received a single reply. They seem to have stopped caring, frankly. (Although PHP 8 is coming up.) – user14430182 Oct 12 '20 at 06:42
  • @GiacomoM I assume you mean the top answer, because everything else on that page says nothing new (or even correct). Well, I suppose that "could work", but it feels bad, to say the least, to have to rely on some "hack" that somebody threw together instead of being able to rely on the built-in copy() function. I haven't tried it yet, but I think it will work. Still, it will introduce worry into my code, as there could be bugs in that snippet... I really don't know what to do anymore about anything. It doesn't feel "fun" anymore when literally everything is broken and buggy. – user14430182 Oct 12 '20 at 06:47
  • @GiacomoM Well, I've now verified that it appears to work, at least in my test case. But it returns the number of bytes transferred instead of true/false, which is already a difference from copy(), and my copy() call is already inside other advanced logic, so this is going to get messy... (A rough sketch of what I'm testing is below these comments.) – user14430182 Oct 12 '20 at 07:09
  • The logic of your software is up to you. You can easily create a my_copy() function that returns true if the number of bytes copied is > 0. – Giacomo M Oct 12 '20 at 07:21
  • "Literally everything is broken and buggy" is a bit of an overstatement. If you are frequently moving files of that magnitude (eg: working on a DAM) you shouldnt really be relying on brute force commands like copy that are subject to the limitations of allocated memory in a virtual space. Stream/chunk copying would be a far more reliable solution in this case. – Dave Oct 12 '20 at 08:17

0 Answers