
Wget Help please

Comments

  • Registered Users, Registered Users 2 Posts: 604 ✭✭✭Kai


    How about using a batch file to get each one individually, with a counter that increments for each image?
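
    A rough sketch of what such a batch file might look like on Windows (assuming the win32 wget.exe is on the PATH, and using the example domain from later in the thread):

    @echo off
    rem Loop a counter from 1 to 250 and fetch one image per pass
    for /L %%i in (1,1,250) do (
        wget http://www.domainname.net/images/%%i.jpg
    )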


  • Registered Users, Registered Users 2 Posts: 1,268 ✭✭✭hostyle


    Put all the URLs in a txt file (easy enough to copy and paste, then change the number in each of the 250 lines). Then use wget -i filelist.txt
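
    If you'd rather not paste and edit 250 lines by hand, a bash one-liner could generate the list (a rough sketch, using the example domain from later in the thread):

    # write one URL per line into filelist.txt
    for ((i = 1; i <= 250; i++)); do echo "http://www.domainname.net/images/$i.jpg"; done > filelist.txt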


  • Closed Accounts Posts: 7,562 ✭✭✭leeroybrown


    Assuming a bash shell, save this as a shell script and run it:
    #!/bin/bash
    
    # Fetch 1.jpg through 250.jpg one at a time
    for ((i = 1; i <= 250; i++))
    do
        wget http://www.domainname.net/images/$i.jpg
    done
    

    This will need minor modification if the names are '001.jpg' and not '1.jpg'.
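
    A sketch of that zero-padded variant, using printf to format the counter:

    #!/bin/bash
    
    # Same loop, but zero-pad the counter to three digits (001.jpg ... 250.jpg)
    for ((i = 1; i <= 250; i++))
    do
        wget http://www.domainname.net/images/$(printf "%03d" $i).jpg
    done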


  • Registered Users, Registered Users 2 Posts: 1,186 ✭✭✭davej


    Yes, you forgot to take leading zeros into account; they can be inserted using (s)printf.

    This should work on the command line:

    perl -e 'for ($i=1;$i<251;$i++){$command = sprintf ("wget http://www.domainname.net/images/%03d.jpg",$i);$cmd= `$command`;}'

    davej


  • Registered Users, Registered Users 2 Posts: 604 ✭✭✭Kai


    He said he was using the Win32 version, though.


  • Closed Accounts Posts: 825 ✭✭✭MarcusGarvey


    I ended up using the Win32 version of curl, which allowed me to use the following command:

    curl -f -O "http://www.domainname.net/images/[001-240].jpg"

    Thanks for the suggestions though.

