
download website for offline use

  • 15-01-2009 1:15pm
    #1
    Registered Users, Registered Users 2 Posts: 302 ✭✭


    Hi guys

    I have a one-year subscription to a website with useful tutorials and a question bank. It expires within 15 days. I just wanted to save everything for future reference :D, in case I need it. The site requires a username and password. I tried WinHTTrack's capture URL feature, but it didn't work. Would any techie please tell me how to do this? Thanks in advance.


Comments

  • Registered Users, Registered Users 2 Posts: 2,859 ✭✭✭Duckjob


    It depends on what browser you're using, but the most straightforward way I can think of is to just save the individual pages you want to keep to your hard drive.

    In Firefox, for example, you can just go to File >> Save Page As and save it as "Web Page, complete", which automatically downloads the page along with all referenced graphics etc.

    Other posters on here might know a better way to do it using more advanced tools.


  • Registered Users, Registered Users 2 Posts: 2,793 ✭✭✭oeb


    http://www.websnake.com/

    There are free applications that do the same thing as this; I just can't think of any of them offhand.

    wget can do it on Linux.
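For reference, wget's mirroring mode can be sketched like this. The flags are standard wget options; the URL in the commented-out usage line is a placeholder, not the OP's actual site:

```shell
#!/bin/sh
# Hedged sketch of a typical wget site-mirroring invocation.
mirror_site() {
    # --mirror: recursive download with timestamping
    # --convert-links: rewrite links so the copy browses locally
    # --page-requisites: also fetch CSS, images, and other page assets
    # --no-parent: don't wander above the starting directory
    # --wait=1: pause between requests to be polite to the server
    wget --mirror \
         --convert-links \
         --page-requisites \
         --no-parent \
         --wait=1 \
         "$1"
}

# Usage (placeholder URL, commented out so the sketch is safe to source):
# mirror_site "https://example.com/tutorials/"
```

Note this won't get past a login form on its own; a password-protected site needs session cookies passed along as well.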


  • Registered Users, Registered Users 2 Posts: 4,387 ✭✭✭EKRIUQ


    BackStreet Browser 3.1 - Free Offline Browser / WebSite Downloader

    http://www.spadixbd.com/backstreet/

    Or for larger websites

    RafaBot 1.5

    http://www.spadixbd.com/rafabot/index.htm
    RafaBot is a high-speed, multi-threaded, large-scale web spidering robot and a powerful bulk website downloader. It can download websites from a starting URL, search engine results, or web directories, and can follow external links. It can download a single website or many thousands of websites in one session.


  • Registered Users, Registered Users 2 Posts: 302 ✭✭confuzed


    Thanks guys, but none of these worked. As I said, I have to log in to browse the site, and none of these tools can get past the password step, even though I tried entering my login details into them. :confused:
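When these tools stall at the login step, one common workaround with wget is to submit the login form manually, save the session cookie, and then mirror while presenting that cookie. A hedged sketch follows: the `/login` path and the `username`/`password` field names are assumptions, so inspect the site's actual login form to find the real ones:

```shell
#!/bin/sh
# Hedged sketch: authenticate first, then mirror using the session cookie.
login_and_mirror() {
    # Step 1: POST the login form and keep the session cookie.
    # "username" and "password" are assumed form field names; the real
    # site's login form may use different ones.
    wget --save-cookies cookies.txt \
         --keep-session-cookies \
         --post-data "username=$2&password=$3" \
         -O /dev/null \
         "$1/login"
    # Step 2: mirror the protected area, sending the cookie with every request.
    wget --load-cookies cookies.txt \
         --mirror --convert-links --page-requisites --no-parent \
         "$1/tutorials/"
}

# Usage (all three arguments are placeholders):
# login_and_mirror "https://example.com" "myuser" "mypass"
```

The same idea works in WinHTTrack, which also accepts a cookies file captured from a logged-in browser session.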


  • Moderators, Technology & Internet Moderators Posts: 11,017 Mod ✭✭✭✭yoyo


    http://www.httrack.com/

    I've used this before and it seemed to work well :) Free too.

    Nick


  • Registered Users, Registered Users 2 Posts: 7,501 ✭✭✭BrokenArrows


    confuzed wrote: »
    Hi guys

    I have a one-year subscription to a website with useful tutorials and a question bank. It expires within 15 days. I just wanted to save everything for future reference :D, in case I need it. The site requires a username and password. I tried WinHTTrack's capture URL feature, but it didn't work. Would any techie please tell me how to do this? Thanks in advance.

    You said WinHTTrack didn't work, but did you configure it properly?

    I've used it before, and it took me a good few tries before I got it working properly.
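For anyone configuring HTTrack by hand, the command-line version takes the same core options as WinHTTrack's GUI. A minimal sketch, where the URL, output directory, and domain filter are all placeholders:

```shell
#!/bin/sh
# Hedged sketch of a basic HTTrack command-line invocation.
mirror_with_httrack() {
    # $1: starting URL   $2: output directory   $3: domain to stay within
    # -O: where to write the mirror
    # "+*$3*": a link filter that keeps the crawl inside the given domain
    # -v: verbose progress output
    httrack "$1" -O "$2" "+*$3*" -v
}

# Usage (placeholders only):
# mirror_with_httrack "https://example.com/tutorials/" "./mirror" "example.com"
```

Getting the link filters right is usually the part that takes a few tries, which matches the experience described above.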

