
php copy image script

  • 04-08-2006 12:01pm
    #1
    Registered Users, Registered Users 2 Posts: 1,086 ✭✭✭


    I need to copy an image (I have its URL) and save it on my server using PHP.

    Probably a simple exercise. Any help?

    Thanks


Comments

  • Registered Users, Registered Users 2 Posts: 68,317 ✭✭✭✭seamus


    Go to www.php.net and look up the fopen() command. It's easy-peasy.

    You will need to check your PHP configuration and ensure that the allow_url_fopen directive is enabled.
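    For instance, a minimal sketch of that approach: with allow_url_fopen enabled, file_get_contents() can read the remote image and file_put_contents() can save it. The URL and filename below are just placeholders.

```php
<?php
// Hypothetical source URL and local filename, for illustration only.
$url  = "http://www.example.com/images/logo.gif";
$dest = "logo.gif";

// file_get_contents() can read a URL when allow_url_fopen is On.
$data = file_get_contents($url);

if ($data !== false) {
    // Write the raw bytes to a file on the server.
    file_put_contents($dest, $data);
}
```

    The same pair of calls works for any file type, since the bytes are copied as-is.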


  • Registered Users, Registered Users 2 Posts: 1,086 ✭✭✭Peter B


    Does anyone have some sample code?

    I cannot make head or tail of the examples I googled.

    So far I have managed to create a JPG file on my server containing its own URL, but no image.
    [PHP]
    $file_address = "http://www.boards.ie/vbulletin/images/misc/vbulletin3_logo_white.gif";
    // The source is a GIF, so save it with a .gif extension
    // (changing the extension does not convert the format).
    $newfile = "new_picture.gif";

    // Open the remote image for binary reading (needs allow_url_fopen).
    $file = fopen($file_address, "rb");
    if (!$file)
    {
        echo "File failed to open<br>";
    }

    // Open the local file for binary writing ("wb"), not appending.
    if (!$file2 = fopen($newfile, "wb"))
    {
        echo "could not open file";
    }

    // fwrite() needs the data itself, not the file handle,
    // so copy the stream contents across instead.
    if (stream_copy_to_stream($file, $file2) === false)
    {
        echo "<br>could not write to file";
    }

    fclose($file);
    fclose($file2);
    [/PHP]

    Any help?


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    Do you want to write a php script that retrieves images from around the net to your server?


  • Registered Users, Registered Users 2 Posts: 1,086 ✭✭✭Peter B


    Not from around the net, just one website. It would save me having to right-click on them, click "Save As", and then FTP them up to my server.

    I have all the image addresses because I display the images already. I just want to load the images locally.


  • Closed Accounts Posts: 22,479 ✭✭✭✭philologos


    You can do that, but that is definitely not the right way to do it.
    Look up the PHP GD library:
    imagecreatefromjpeg();
    imagecreatefromgif(); in your case
    etc.

    And you need to use headers to make it a certain file type:
    Header("Content-type: image/jpeg");

    I can't help you any more because there is a rule that we aren't allowed to just give you the code.

    My sig uses that function, btw.


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    There are free site-grabber programs which will fetch a site to your PC; if all the images are in, say, an images directory, you can delete logos, buttons, etc. and bulk-upload the rest. Some of them may have the ability to grab images only.

    Be careful if you're fetching from MySpace or Flickr though, as they are big sites, but I seem to recall the spider can be set to stay within the given URL fragment.

    Afaik if you point one of these at a page it will fetch the HTML, parse out hyperlinks and image tags, grab the images, follow any links to other pages with the same URL, and repeat until it's got images off all the pages for that URL. This won't work as a filter on all sites, as some do not have search-engine-friendly URLs (e.g. every single page has index.php?whatever as a front processor).
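    The parse-and-fetch loop described above could be sketched in PHP along these lines (the page URL is a placeholder, it assumes absolute image URLs, and a real crawler would use a proper HTML parser rather than a regex):

```php
<?php
// Hypothetical page URL, for illustration only.
$page = "http://www.example.com/gallery.html";

$html = file_get_contents($page);           // fetch the page (needs allow_url_fopen)
if ($html !== false) {
    // Crude regex for <img src="..."> tags; fine for a sketch,
    // but fragile on real-world markup.
    preg_match_all('/<img[^>]+src="([^"]+)"/i', $html, $matches);

    foreach ($matches[1] as $src) {
        $name = basename($src);             // derive a local filename from the URL
        $data = file_get_contents($src);    // fetch the image itself
        if ($data !== false) {
            file_put_contents($name, $data);
        }
    }
}
```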

    If you can view all the images on the one page in your browser, you can save the page to your PC, and all the images will be put in a sub-directory.

    Alternatively, your web browser cache may still have local copies if you've viewed the images.

    But I'll continue to look into a PHP solution and revert; it's something I'd like to have myself: an admin page where I can enter a URL, get back a page of images, and select the ones I want to keep locally, adding titles, dates, subjects, and eventually tags.

    Edit: OK, curl seems the way to go...


  • Closed Accounts Posts: 22,479 ✭✭✭✭philologos


    Couldn't imagejpeg($im, $filepath); export the file to the server?


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    Jakkass wrote:
    couldn't imagejpeg($im, $filepath); export the file to the server
    I've just started looking at this so I may have the wrong end of the stick altogether.

    So far it looks to me like the GD functions like imagejpeg etc. are for creating an image stream on the server side to send to a requesting browser or write to file. I've used them to read a database schema and generate a rudimentary image of the tables for viewing in Firefox. That's fine, but how do we get the images from another server in the first place?

    The curl functions seem to be the ones that mimic a browser, and can send HTTP requests to other servers and handle the response. Google reveals curl being used as a basis for web crawlers, and the manual docs look promising at first glance, though text examples abound rather than image crawlers. When I get a chance I'll try to suss the key bit: how to request an image URL by HTTP and save the response stream as a file. It could well be that imagejpeg will do the final write to disk; in fact it may be possible to fetch a GIF and save it as a JPG. Early days...
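    For what it's worth, the key curl bit looks something like this (the URL and filename are placeholders, and this is just a sketch of the idea): curl fetches the raw bytes, and an ordinary file write saves them, so GD is only needed if you want to re-encode.

```php
<?php
// Hypothetical URL and filename, for illustration only.
$url  = "http://www.example.com/images/photo.gif";
$dest = "photo.gif";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    file_put_contents($dest, $data); // save the raw bytes; no GD needed
}
```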


  • Closed Accounts Posts: 22,479 ✭✭✭✭philologos


    I use it for my signature from ImageShack: take the image with imagecreatefromjpeg("link"); and then export it using imagejpeg($im, $filepath);
    or imagejpeg($im);
    for just displaying it.
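    For anyone following along, that round trip might look roughly like this (the URL and path are placeholders); note that GD re-encodes the image as a JPEG rather than copying the original bytes:

```php
<?php
// Hypothetical remote image and local path, for illustration only.
$src  = "http://www.example.com/images/photo.jpg";
$dest = "photo_copy.jpg";

// Load the remote JPEG into a GD image resource
// (use imagecreatefromgif() for GIFs instead).
$im = imagecreatefromjpeg($src);

if ($im !== false) {
    imagejpeg($im, $dest);   // write a JPEG copy to disk on the server
    // imagejpeg($im);       // or, after a Content-type header, stream it to the browser
    imagedestroy($im);       // free the image resource
}
```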


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    Jakkass wrote:
    I use it for my signature from imageshack take the image imagecreatefromjpeg("link"); and then export it using imagejpeg($im, $filepath);
    or imagejpeg($im);
    for just displaying
    Bang on, forget libcurl. I got it working as you described, as far as fetching and then sending a stream to the browser; next stop, writing the file. I hate opening write access to Apache, even on my dev box. I know, paranoid.


  • Closed Accounts Posts: 22,479 ✭✭✭✭philologos


    Hopefully the OP can see what we're on about from what we've said, and won't ask us to paste a load of code down.


  • Closed Accounts Posts: 2,046 ✭✭✭democrates


    Gas, I have to admit that once the internet arrived I learned HTML, Perl, JavaScript, and now PHP through copy, paste, examine, modify. It takes longer than formal tuition, but it's cheaper.

    Now I tend to hit the online reference manuals (often lacking full coverage) and books, as well as websites with code snippets, full FLOSS application code, and discussions, to find answers. This thread is a perfect example: I could have spent ages on curl, but you pointed out the appropriate method and saved me wasted effort on an over-complicated solution. And people learn soon enough not to expect to be spoon-fed. So thanks, Jakkass. :)

