
The Scenario

I work with Laravel frequently, and anyone who has worked with the framework will know that the "vendor" folder contains 6,000+ files spread across many subfolders.

Something I've noticed is that copying a Laravel project to another internal directory is EXTREMELY slow (at least 10 minutes per project). At first I thought there was a problem with my SSD, but after zipping a project folder and copying the archive to a new destination, I realised the problem was the sheer number of files and folders in my projects.

The Question

Is there a special type of SSD or hard drive that lets me copy vast numbers of files and folders in nested directories extremely fast, without needing to zip them first? Or, if there is a way to zip/unzip this many files at high speed, that would also be helpful.

I can zip them, but it takes just as long to zip, compress, move and then uncompress as it does to copy and paste the files to a new destination, which makes zipping useless for me.

I just want to be able to make new copies of the project on the fly for testing/backup purposes!

1 Answer


There are several approaches to this, assuming you have no access/permission problems, long-path problems or other such issues.

Before anything else, you could defragment your disk.
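If you want to do that from the command line, the built-in Windows tool can be run like this (a minimal sketch, assuming the projects live on drive C: and an elevated command prompt):

    defrag C: /O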

  1. Use a decent file manager or a dedicated file-copy utility (such as TeraCopy). That should improve on the standard copy operation, and a file manager supports many additional features besides.

  2. Use a good archiver with the STORE option (no compression, just store the files), such as 7-Zip, and archive the files directly to the destination (see the first sketch after this list). That way you save some of the copy time, but that time is then spent unpacking the files, so this is good if your objective is a backup; if your objective is simply to copy them as they are, it is not. Make sure you have enough network bandwidth when attempting this.

  3. Use a command-line copy tool (xcopy, xxcopy, robocopy, etc.) and add a parameter to copy only newer files (see the second sketch after this list). If most of your files do not change, there is no reason to overwrite all of them every time; it is better to add only the new files and refresh the modified ones (xcopy's /d parameter does exactly that).
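A minimal sketch of option 2, assuming 7-Zip is installed and 7z.exe is on the PATH; the source and destination paths here are hypothetical:

    :: create an archive at the destination with no compression (store only)
    7z a -mx=0 "D:\backups\myproject.7z" "C:\projects\myproject\*"

    :: later, unpack it wherever the copy is needed
    7z x "D:\backups\myproject.7z" -o"C:\projects\myproject-copy"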
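And a sketch of option 3, again with hypothetical paths: xcopy's /d copies only files newer than the existing destination copy, while robocopy skips unchanged files by default and /mt adds multithreaded copying:

    :: copy only new/updated files, including subdirectories
    xcopy "C:\projects\myproject" "D:\copies\myproject" /d /e /i /y

    :: roughly equivalent with robocopy, using 8 copy threads
    robocopy "C:\projects\myproject" "D:\copies\myproject" /e /mt:8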

  • Thanks, I've used 7-Zip for near-instant archiving of my files by setting the compression method to LZMA2 and allowing up to 4 CPU threads (normally 2). The only problem is that extracting these archives still takes just as long, around 10 minutes, even though compression is near-instant. Commented Sep 4, 2017 at 11:42
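For reference, the kind of 7-Zip invocation the comment describes (LZMA2 with a capped thread count; the paths are hypothetical) might look like:

    7z a -t7z -m0=lzma2 -mmt=4 "D:\backups\myproject.7z" "C:\projects\myproject\*"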
