
Downloading a list of URLs - How?

  • 17-11-2003 11:44pm
    #1
    Registered Users, Registered Users 2 Posts: 199 ✭✭


    Hello Everybody,

    I hope this is the most appropriate place to post this. I've looked through the other forums and this seems like the closest match, so here goes...

    I have a piece of software that lets me input a single URL and it will download it for browsing offline (all the publicly available stuff, that is).

    However, when doing research for writing and other things, I usually have a loooooong list of pages to visit and, since I am limited to a slow dial-up, it takes an extraordinary amount of time to visit each.

    It would be so much faster to be able to paste a list into a program and have it automatically download each one in sequence.

    Is there such a thing available? (Preferably free or low cost, or at least available as shareware to try before buying).

    Thank you for any help.

    Best regards,
    Tommy.


Comments

  • Closed Accounts Posts: 660 ✭✭✭naitkris


    Getleft 1.1.1 from http://personal1.iddeo.es/andresgarci/getleft/english/index.html and WinHTTrack 3.30 from http://www.httrack.com/ are both licensed under the GPL and are free software for downloading web pages to be viewed offline.

    For shareware alternatives, try http://www.webattack.com/shareware/downloader/swoffline.html, which lists various options. Some of them may be better suited to what you need in terms of inputting a list of URLs, as I don't know whether the two free tools above support that.
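
    HTTrack also has a command-line version (httrack) if you prefer that. Something along these lines should do a basic mirror; the site address, output folder and depth are just example values, and the exact options may differ between versions, so check the HTTrack documentation:

        httrack "http://www.example.com/" -O "./mirror" -r2 "+*.example.com/*"

    -O sets the folder the copy is saved into, -r2 limits how deep it follows links, and the "+..." pattern keeps it from wandering off the one site.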


  • Closed Accounts Posts: 1,502 ✭✭✭MrPinK


    It's pretty easy to do in Unix with wget: just run 'wget -i filename' and it will download all the URLs listed in the file.

    There is a version of wget for Windows, but I've never used it. I presume it works just as well as it does in Unix, and it's free. Try googling it, or someone else may be able to give you a link for it.
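
    Something like this should do it (urls.txt is just an example name for your list file, one address per line):

        wget -i urls.txt -p -k

    -i reads the URLs from the file, -p also grabs the images and stylesheets each page needs, and -k rewrites the links so the saved copies work offline in your browser.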


  • Registered Users, Registered Users 2 Posts: 199 ✭✭TommyK


    Some of those shareware programs look like just the ticket!

    I appreciate the pointers! - Thank you.

    Tommy.


  • Registered Users, Registered Users 2 Posts: 3,177 ✭✭✭oneweb


    Also consider HTTPWeazel; it's shareware, but you can filter by size/filetype etc.

    It is what it is.


