Hi guys, after this discussion we decided to run some tests on archive libraries.
Here are the results:
I suggest you run the same tests with a larger data set, e.g. 5GB of files (like a real site with graphics and thumbnails). Please also run them on a 32-bit server.
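To make the test repeatable, a data set like that can be generated synthetically. This is just a sketch (the directory name, file count, and sizes are placeholders I chose, scaled down here); random bytes stand in for image data because, like JPEGs, they don't compress:

```shell
# Sketch: build a synthetic "site" of incompressible image-like files
# for archiver stress tests. Scale COUNT and SIZE_KB up toward a 5GB
# total for the real test; they are kept tiny here.
OUTDIR=testdata
COUNT=20        # number of fake thumbnails
SIZE_KB=64      # size of each file in KB
mkdir -p "$OUTDIR"
for i in $(seq 1 "$COUNT"); do
  # /dev/urandom output does not compress, much like JPEG/PNG payloads
  dd if=/dev/urandom of="$OUTDIR/img_$i.jpg" bs=1024 count="$SIZE_KB" 2>/dev/null
done
du -sh "$OUTDIR"
```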
I think you'll find that Phar fails completely, which is my whole point about the choice of archiver. 44MB of data is not representative of a real site: a standard site, once installed, is on the order of 350MB, and even that is really a skeletal environment.
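The arithmetic behind the 32-bit concern is simple: on a 32-bit PHP build, PHP_INT_MAX is 2^31 - 1 (about 2GB), so file sizes and offsets inside a multi-GB archive can't be represented in the native integer type. A quick sketch of the numbers (the exact failure mode inside Phar is my reading, not something the tests above showed):

```python
# PHP_INT_MAX on a 32-bit build is 2**31 - 1 (~2GB).
INT32_MAX = 2**31 - 1

small_site = 44 * 1024**2   # the 44MB test set: fits comfortably
real_site = 5 * 1024**3     # a 5GB data set: does not fit

print(small_site <= INT32_MAX)  # True
print(real_site <= INT32_MAX)   # False
```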
A 5GB site would be much more representative of a 1K-product site with high-density imagery and all thumbnails already generated. Note too that having lots of images reflects the compression cost more accurately, since the archiver will be trying to compress data that is already compressed (far more realistic).
But my real point was that if you had just stuck with Tar_Archive in the first place, you wouldn't have all of these upgrade failures. Phar didn't provide any functionality (that you used) or any performance improvement, so it was yet another change for change's sake. Sometimes leaving things alone is the best approach, and for archives/backups, stability is of the utmost importance.