
wget with sub directories

  • 07-01-2008 09:38PM
    #1
    Registered Users, Registered Users 2 Posts: 719 ✭✭✭


    Can someone tell me the command-line option to download the following site to my local machine, please?
    http://www.linux.org/lessons/beginner/toc.html

    I'd like it to download the HTML pages linked from this page, but when I try just
    wget http://www.linux.org/lessons/beginner/toc.html it only downloads that one HTML file.

    Should I first be extracting the links from the HTML page into a txt file to feed into wget? Or can I download the site from a directory (with all files) point of view?

    thanks in advance

    Fionn
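
    The "links into a txt file" idea from the post can be sketched in Python with the standard library. The HTML snippet and the links.txt filename below are made-up stand-ins; wget's real -i option reads URLs from such a file, one per line:

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collects href targets from <a> tags, resolved against a base URL."""
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        # Relative links are resolved against the page's own URL.
                        self.links.append(urljoin(self.base_url, value))

    def extract_links(html, base_url):
        collector = LinkCollector(base_url)
        collector.feed(html)
        return collector.links

    if __name__ == "__main__":
        sample = '<a href="lesson1.html">1</a> <a href="lesson2.html">2</a>'
        base = "http://www.linux.org/lessons/beginner/toc.html"
        links = extract_links(sample, base)
        # One URL per line; "wget -i links.txt" would then fetch them all.
        with open("links.txt", "w") as f:
            f.write("\n".join(links))
        print(links)
    ```

    As the replies below show, wget's recursive mode makes this manual step unnecessary, but the same approach is handy when you only want a hand-picked subset of pages.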


Comments

  • Registered Users, Registered Users 2, Paid Member Posts: 2,427 ✭✭✭ressem


    wget -r -l 4 http://www.linux.org/lessons/beginner/toc.html

    Downloads the page and all files on it to a depth of 4 links.
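
    What the depth limit does can be illustrated with a toy breadth-first walk over a made-up in-memory link graph (the page names are invented; wget does the real fetching):

    ```python
    from collections import deque

    # Toy link graph standing in for a site: page -> pages it links to.
    SITE = {
        "toc.html": ["lesson1.html", "lesson2.html"],
        "lesson1.html": ["lesson1b.html"],
        "lesson1b.html": ["deep.html"],
        "deep.html": [],
        "lesson2.html": [],
    }

    def crawl(start, max_depth):
        """Breadth-first walk keeping pages within max_depth links of start,
        roughly what 'wget -r -l max_depth' retrieves."""
        seen = {start}
        queue = deque([(start, 0)])
        while queue:
            page, depth = queue.popleft()
            if depth == max_depth:
                continue  # don't follow links beyond the depth limit
            for link in SITE.get(page, []):
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
        return seen

    print(sorted(crawl("toc.html", 1)))  # the start page plus pages one link away
    ```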


  • Registered Users, Registered Users 2 Posts: 719 ✭✭✭Fionn101


    thanks for the reply ressem. I tried this but I get a permission denied error (fair enough)

    Today I was thinking it might be best to visit each page on their site and save off the text. That way I get the end result I'd like (an offline copy of this howto), and they (linux.org) get to serve out their fair share of Google AdWords and keep the site alive.

    A win win situation I reckon. (ahem..)


  • Registered Users, Registered Users 2 Posts: 1,110 ✭✭✭Skrynesaver


    If you use the -k (--convert-links) switch, the links in the local copy are rewritten so it can be browsed offline
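
    As a rough sketch of the rewriting -k performs, here is a toy version that maps absolute links under the mirrored site back to local filenames. It handles only one simple case; real wget's conversion is far more thorough:

    ```python
    import re

    def convert_links(html, site_prefix):
        """Rewrite absolute href values under site_prefix to bare local
        filenames, a simplified imitation of wget -k's link conversion."""
        def repl(match):
            url = match.group(2)
            if url.startswith(site_prefix):
                # Keep only the path relative to the mirrored directory.
                url = url[len(site_prefix):].lstrip("/")
            return match.group(1) + url + match.group(3)
        return re.sub(r'(href=")([^"]*)(")', repl, html)

    page = '<a href="http://www.linux.org/lessons/beginner/lesson1.html">1</a>'
    local = convert_links(page, "http://www.linux.org/lessons/beginner/")
    print(local)  # the href now points at the local lesson1.html
    ```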


  • Registered Users, Registered Users 2 Posts: 719 ✭✭✭Fionn101


    Worked a charm, thanks.

    Only annoying part was stripping out the Google syndication ads.

    Thanks all

