
Cloud storage, AWS, Gdrive, Dropbox - what to use?

  • 23-10-2019 8:40am
    #1
    Registered Users Posts: 1,244 ✭✭✭MrCostington


    Hi,

    I'm speccing a web application that will take in about 100 GB of data (in uploaded files) per year, and it's going to work out too expensive to host that on my web server.

    My plan is to build it on my server which will handle the front end, database and all logic, but to store the files (PDFs & images) on the cloud. I'll be using PHP.

    Looking at the Amazon S3 PHP SDK, it seems I can upload my file to my server (as a temp file) and then send it to S3:

    https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-examples-creating-buckets.html
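
    Something like this is what I have in mind for the upload side - just a sketch, and the bucket/key/region names are placeholders I've made up:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    // Credentials come from the environment or ~/.aws/credentials
    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'eu-west-1',
    ]);

    // $_FILES['document']['tmp_name'] is the temp file PHP creates for the upload
    $s3->putObject([
        'Bucket'     => 'my-app-uploads',
        'Key'        => 'invoices/2019/invoice-1234.pdf',
        'SourceFile' => $_FILES['document']['tmp_name'],
    ]);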

    Then I can get a link for it when it needs to be downloaded again:

    https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-presigned-url.html
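
    And for the download link, something like this (again just a sketch, reusing the same made-up bucket and key):

    // $s3 is the same S3Client as in the upload sketch above
    // Sign a GetObject request so the link only works for a limited time
    $cmd = $s3->getCommand('GetObject', [
        'Bucket' => 'my-app-uploads',
        'Key'    => 'invoices/2019/invoice-1234.pdf',
    ]);
    $request = $s3->createPresignedRequest($cmd, '+20 minutes');
    $downloadUrl = (string) $request->getUri();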

    Is bucket and key equivalent to directory and filename? Or what is a bucket?

    AWS seems to have many storage products. Given that I'll be adding about 100 GB per annum but there will not be a very high frequency of read or write access, what is the best product?

    https://aws.amazon.com/products/storage/

    Also, does AWS have a Windows program/plugin so that my client could access the files from Windows Explorer (read-only access)?

    I also quickly looked at Dropbox but they do not have an official PHP SDK, just two userland ones.

    Any advice on this?


Comments

  • Registered Users Posts: 6,010 ✭✭✭Talisman


    You can upload directly to S3 from the web browser. Your proposed solution involves two uploads (frontend -> server, server -> S3 bucket), so your server is doing the heavy lifting on behalf of the client. That's not advisable, as it makes your server a single point of failure.

    It's been a while since I used AWS, but I recall using AWS Security Tokens to allow the client to use temporary credentials to upload to S3. The access rights were defined so that the client had limited access for a limited time period.

    EDIT: I just looked at the AWS documentation and things are simpler now - Amazon S3 Pre-Signed URL with AWS SDK for PHP Version 3. With this approach, your server will generate a pre-signed POST URL that the client will use to upload the file. You don't have to generate any credentials that are exposed on the frontend and the upload is done by the client.
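
    Off the top of my head the server side would look something like this, using the SDK's PostObjectV4 helper - treat it as an untested sketch, and the bucket/prefix names are placeholders:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use Aws\S3\PostObjectV4;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'eu-west-1']);

    $bucket = 'my-app-uploads';

    // Values the HTML form will submit along with the file
    $formInputs = ['acl' => 'private', 'key' => 'incoming/${filename}'];

    // Policy conditions: restrict where the client can write and the file size
    $options = [
        ['acl' => 'private'],
        ['bucket' => $bucket],
        ['starts-with', '$key', 'incoming/'],
        ['content-length-range', 0, 8 * 1024 * 1024],
    ];

    $postObject = new PostObjectV4($s3, $bucket, $formInputs, $options, '+15 minutes');

    // Hand these to the frontend, which builds the upload form / POST request from them
    $formAttributes = $postObject->getFormAttributes(); // action, method, enctype
    $formInputs     = $postObject->getFormInputs();     // key, policy, signature, etc.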

    In answer to one of your questions:

    A bucket is equivalent to a share name on a network drive - it has to be unique. There will only be a single bucket on S3 that uses that name. When you upload a file to S3 the bucket name is part of the URL, e.g. https://bucketname.s3.amazonaws.com/


  • Registered Users Posts: 1,244 ✭✭✭MrCostington


    Talisman, many thanks for the info.

    I did not see the upload option, so I guess it's the reverse of the GET URL thing I linked to. I may still upload to my server first, as there are a few things I need to do with the files before moving them to S3.

    Is S3 the best solution for my use case? EFS also looks like a candidate.

    I assume that when uploading/downloading, a one-time token is also included in the example URL you gave?

    Ah, get you re buckets. Can I then create a file structure within my bucket?

    That got me thinking: I guess this is not possible, but can AWS storage or another solution be mounted on a Unix filesystem?

    How did you find AWS re speed and reliability?

    Many thanks!


  • Registered Users Posts: 6,010 ✭✭✭Talisman


    You can create a file structure within the S3 bucket - in reality the structure isn't actually created on the S3 servers. Instead, the filename is prefixed with the file path; this page explains it: https://docs.aws.amazon.com/AmazonS3/latest/user-guide/using-folders.html
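
    For example (bucket and key names made up), the "folders" are just part of the object key, and you can list a "directory" by prefix:

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'eu-west-1']);

    // The path is simply baked into the key
    $s3->putObject([
        'Bucket'     => 'my-app-uploads',
        'Key'        => 'clients/acme/2019/report.pdf',
        'SourceFile' => '/tmp/report.pdf',
    ]);

    // List everything "inside" clients/acme/2019/ as if it were a directory
    $result = $s3->listObjectsV2([
        'Bucket'    => 'my-app-uploads',
        'Prefix'    => 'clients/acme/2019/',
        'Delimiter' => '/',
    ]);
    foreach ($result['Contents'] ?? [] as $object) {
        echo $object['Key'], PHP_EOL;
    }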

    I would think that S3 is better suited to your use case than EFS - my understanding is that EFS is a network file system that you mount on Amazon EC2 instances, so it only really makes sense when the servers using it are hosted on the Amazon cloud.

    S3 client tools to check out:
    S3 Browser - freeware Windows GUI for browsing your S3 buckets

    S3cmd - Command Line S3 Client for Linux


  • Registered Users Posts: 6,010 ✭✭✭Talisman


    MrCostington wrote: »
    How did you find AWS re speed and reliability?

    Where I ran into problems was on the user side - people living in Ballygobackwards attempting to upload large files over very poor connections. The token would time out before their upload could complete. My solution was to split the file into chunks and upload each one in turn, with a separate token for each chunk. Afterwards a server-side job stitched the chunks together. The days of such poor internet speeds now seem to be behind us, so uploads shouldn't be a problem.
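
    These days the SDK will do the chunking and stitching for you via the S3 multipart upload API, so you wouldn't need to roll your own - an untested sketch, with placeholder names:

    use Aws\S3\S3Client;
    use Aws\S3\MultipartUploader;
    use Aws\Exception\MultipartUploadException;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'eu-west-1']);

    // Uploads the file in parts and completes (stitches) the object on the S3 side
    $uploader = new MultipartUploader($s3, '/tmp/large-file.pdf', [
        'bucket' => 'my-app-uploads',
        'key'    => 'uploads/large-file.pdf',
    ]);

    try {
        $uploader->upload();
        echo 'Upload complete', PHP_EOL;
    } catch (MultipartUploadException $e) {
        // A failed upload can be resumed rather than started from scratch
        echo $e->getMessage(), PHP_EOL;
    }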

    S3 was perfect for my use case and I never encountered problems with accessing the files afterwards.

    And if it puts your mind at ease further, Dropbox built its storage solution on top of S3 for years.


  • Registered Users Posts: 1,244 ✭✭✭MrCostington


    Thanks again, lots of useful info there.

    Re the file structure, that sounds good and it looks like it would mimic a network drive when used with the S3 Browser client?

    S3 Browser client - I assume one has to log in to use it, and that I can set bucket access to read-only for all bar my application?

    Is the browser "safe" to download - I'm always wary of freeware.

    Yes I thought S3 was the best product for my needs, thanks for confirming.

    Each file is max 8 MB and will be used in a B2B situation, so I assume I don't need to worry about Ballygobackwards uploading from the thatched cottage!


  • Registered Users Posts: 895 ✭✭✭Dubba


    MrCostington wrote: »
    Thanks again, lots of useful info there.
    Is the browser "safe" to download - I'm always wary of freeware.

    Just use the AWS Console (available when signed in) and the CLI - no need for third-party tools.


  • Registered Users Posts: 27,037 ✭✭✭✭GreeBo


    OP, are you likely to have millions of small files, or what is the makeup of that 100 GB?

    Editing and moving lots of small files in S3 can be slow and painful.
    But I would certainly start there.


  • Registered Users Posts: 1,244 ✭✭✭MrCostington


    GreeBo wrote: »
    OP, are you likely to have millions of small files, or what is the makeup of that 100 GB?

    Editing and moving lots of small files in S3 can be slow and painful.
    But I would certainly start there.

    Min 0.5 MB and max 8 MB, average 4 MB. As I said above, I think I'll do any processing locally on my server and then send the files to S3, where they will only ever be downloaded from. Would that fall into the case you mention?


  • Registered Users Posts: 81,223 ✭✭✭✭biko


    Essentially you are looking to use some sort of CDN.
    Start with a simple S3 bucket for now for your CSS, JS, images and PDFs, and maybe look at Amazon CloudFront: https://aws.amazon.com/cloudfront/

    Google is getting into the cloud market with their offering https://cloud.google.com/free/

