
I have been using the following command to pull back files from a web server and store them on my local machine:

# Download a single file from the web server to a local path
$client = New-Object System.Net.WebClient
$client.DownloadFile('http://www.example.com/style/test.php','C:\Users\Me\Documents\style.css')

Is there a way to modify this to pull back a whole directory? This is my first time using PowerShell, and the only other exposure I've had to anything like this is wget on Linux.

Any help would be greatly appreciated.


2 Answers


Use a tool that is appropriate for this, like wget. Even that does not guarantee you will download an entire directory; it will just give you every page linked from your starting URL. Judging from the URLs in your question, you do not know the remote directory layout anyway, so the term is somewhat meaningless: the site could be spread over many directories.
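For example, something along these lines (real GNU Wget flags, but tune them to your site) mirrors everything reachable from a starting URL:

wget --recursive --no-parent --page-requisites --convert-links http://www.example.com/style/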

If you do know the server and directory name, use an FTP program, e.g. FileZilla.
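If you would rather stay in PowerShell for individual FTP downloads, the same WebClient approach works with an ftp:// URL; a minimal sketch, with placeholder host and credentials:

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('user', 'password')
$client.DownloadFile('ftp://ftp.example.com/style/style.css', 'C:\Users\Me\Documents\style.css')

A GUI client like FileZilla is still the easier option for grabbing whole directories.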


Another good utility (besides wget) is HTTrack.

It can 'rip' a complete website and even rewrites the links so they work locally. You could try to do all of that in PowerShell (see the sketch below), but an advanced tool like HTTrack does it all for you, especially the link translation.
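For reference, a minimal PowerShell sketch of the do-it-yourself route. It assumes the server exposes a plain HTML index listing for the directory (many servers don't), and the URL and paths are placeholders:

$base = 'http://www.example.com/style/'
$dest = 'C:\Users\Me\Documents\style'
New-Item -ItemType Directory -Force -Path $dest | Out-Null

$client = New-Object System.Net.WebClient
$index = $client.DownloadString($base)   # fetch the directory listing page

# Pull href targets out of the listing and download each file
foreach ($m in [regex]::Matches($index, 'href="([^"]+)"')) {
    $name = $m.Groups[1].Value
    if ($name -notmatch '^[/?]|/$|^https?:') {   # skip sort links, parents, subdirectories, absolute URLs
        $client.DownloadFile($base + $name, (Join-Path $dest $name))
    }
}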

As Jan Doggen said, it depends on what you are using it for. If you want a backup of your own website, use FTP. If you want a local rip, check out HTTrack.
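HTTrack also has a command-line mode if you want to script it; a sketch, with placeholder URL and output path:

httrack "http://www.example.com/" -O "C:\Users\Me\Documents\example-mirror" "+www.example.com/*" -v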

HTTrack homepage
HTTrack info page on Wikipedia

