
I have been trying to save this page:

http://www.geopostcodes.com/Bolton?loc=Bolton

(and others like it from the same domain), but I always get an error about NOCAPTCHA not working, and the JavaScript behind the "Show next..." part of the page does not work either.

I tried ScrapBook on Firefox 56 and got that error. SiteSucker gave the same error.

DownThemAll! for Firefox handled multiple URLs, but still produced the same NOCAPTCHA error on every page.

I haven't tried HTTrack because I'm on OS X and haven't installed MacPorts yet.

I haven't yet tried saving pages conventionally via Save Page As..., because I was looking for a way to save several URLs from the site at once.

Basically, I want to try to save all pages from within

http://www.geopostcodes.com/UK (and subpages)

for posterity

I am on a 2011 Mac Mini running macOS Sierra.

I would appreciate any help finding a workable way to preserve these pages for offline use.

1 Answer


Those pages are generated using JavaScript and AJAX calls to the server as you view them, so copying the HTML or saving the page from a browser just isn't going to work.

Your best bet may be to copy and paste the data you want from each page into a text file or spreadsheet. Either that, or write a browser plugin that will extract the data for you.
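A related approach, if scripting is an option, is to drive a real browser from a script and save the DOM after the JavaScript has run. Here is a minimal sketch in Python using Selenium; it assumes you have installed the selenium package plus Firefox and geckodriver (none of which are mentioned in the question), and it uses the URLs from the question as examples:

    # Minimal sketch: render JavaScript-heavy pages in a real browser,
    # then save the resulting DOM to disk.
    # Assumes: pip install selenium, plus Firefox and geckodriver on PATH.
    import time
    from selenium import webdriver

    urls = [
        "http://www.geopostcodes.com/Bolton?loc=Bolton",
        "http://www.geopostcodes.com/UK",
    ]

    driver = webdriver.Firefox()
    try:
        for url in urls:
            driver.get(url)
            time.sleep(5)  # crude wait for the AJAX calls to finish
            # Derive a filename from the last path segment, dropping the query string
            name = url.rstrip("/").split("/")[-1].split("?")[0] + ".html"
            with open(name, "w", encoding="utf-8") as f:
                f.write(driver.page_source)  # rendered DOM, not the raw HTML
    finally:
        driver.quit()

Whether the site's NOCAPTCHA check still blocks an automated browser is something you would have to test; this only captures what the page looks like after its scripts have run.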

