Upload to Amazon S3 using AngularJS

In this post I’m going to detail the steps to upload files to Amazon S3 using AngularJS. A friend contacted me as he needed a simple way for his users to upload large (multi-GB) files to his Amazon S3 bucket.

Before, he was using a mix of Dropbox sharing and manually uploading the files once he had possession of them. Obviously this doesn’t scale as more users try to upload files – he only has a certain number of hours in a day and (at the time) had a really slow internet connection – think dial-up speeds.

I created a simple single page which allows people to upload to his bucket. From there he then transfers the files to another Amazon S3 bucket – which allows downloads.

The Buckets

We have two buckets – one which allows anyone to upload (we’ve not implemented user authentication before uploading) and one which allows anyone to download – given a specific link.

The upload Amazon S3 bucket is public – users only have permission to upload, so they cannot delete, read or update other users’ files that may happen to be in there at the time. Once the upload is complete, we move the file manually into a download bucket – which only allows public users to download individual files. They cannot delete, create, update or list the contents of the bucket.

Amazon S3 configuration

The configuration for the public Amazon S3 upload bucket looks like this (it’s editable from the Amazon S3 Bucket console):

{
   "Version": "2012-10-17",
   "Statement": [{
      "Sid": "mySIDhere",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [ "s3:PutObject" ],
      "Resource": [ "arn:aws:s3:::myTempBucket/*" ]
   }]
}

The JSON above tells Amazon S3 that we want to explicitly allow the PutObject action on the bucket called “myTempBucket”.

This JSON config can also be created from the AWS Policy Generator. As security is very important, make sure you have a good read of the docs.
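For completeness, the matching policy on the download bucket might look something like this – a sketch only (the Sid and the bucket name “myDownloadBucket” are placeholders, not the real values) – allowing the public to read individual objects and nothing else:

```json
{
   "Version": "2012-10-17",
   "Statement": [{
      "Sid": "myDownloadSIDhere",
      "Effect": "Allow",
      "Principal": "*",
      "Action": [ "s3:GetObject" ],
      "Resource": [ "arn:aws:s3:::myDownloadBucket/*" ]
   }]
}
```

Because the policy grants s3:GetObject only (and not s3:ListBucket), users who know a file’s exact link can download it, but nobody can browse the bucket’s contents.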

The Page

Now onto the page – I’m using AngularJS (to get started nice and quick).

I’ve stripped the HTML down (for the purposes of this example) – but in reality there’s a bit of Bootstrap in there to keep things looking presentable (I’ve left in the Bootstrap for the alerts and progress bar).

Gist here… HTML below

In the HTML we’re providing a button which will open the file selector, and we’re recommending 7-Zip (it’s awesome).
Then we’ve got a section to show the current status of the files and whether they uploaded successfully or had errors.
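To give an idea of the shape of the page, here’s a minimal sketch of the markup – the app, controller and function names (uploadApp, UploadCtrl, uploadFile, progress) are illustrative placeholders, not the names from the original gist:

```html
<!-- Sketch of the upload page. ngf-select comes from ng-file-upload;
     the progress bar and alerts use standard Bootstrap classes. -->
<div ng-app="uploadApp" ng-controller="UploadCtrl">
  <!-- Opens the file selector and hands the chosen file to the controller -->
  <button class="btn btn-primary" ngf-select="uploadFile($file)">
    Select file to upload
  </button>

  <!-- Bootstrap progress bar; the controller drives the width -->
  <div class="progress">
    <div class="progress-bar" role="progressbar"
         ng-style="{ width: progress + '%' }">{{ progress }}%</div>
  </div>

  <div class="alert alert-success" ng-show="uploadComplete">Upload complete!</div>
  <div class="alert alert-danger" ng-show="uploadError">Something went wrong.</div>
</div>
```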

The Controller

The controller is where we link the button to the function that will accept the file and send it off to an UploaderService which will do just that!

I’m using the AngularJS ng-file-upload module.

NOTE: I’m only using this plugin to get access to the file object the user selects. You can use this module to upload directly to the S3 bucket (an example is included on the module’s GitHub page). However, done that way the module will upload the file in one great big chunk – using a single PUT request. Amazon will only allow files up to 5GB to be uploaded through this method.
If you want to reap the benefits of chunked uploads – and uploads up to 5TB in size – then use the JavaScript AWS SDK: just use the links on the right to download the default build or customise your own.

As my friend was very likely going to be receiving files larger than 5GB it was a no-brainer – plus I can imagine uploading 4.5GB of a 5GB file, having it fail, and not being a happy bunny.

The progress function reports the progress back to the HTML; I’m using the Bootstrap progress bar component and just changing its width.

Gist here and JavaScript below
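As the gist isn’t inlined here, the controller can be sketched roughly like this – a hedged outline, not the original code. The module, controller and service names (uploadApp, UploadCtrl, UploaderService) are assumptions, as is the shape of the progress callback:

```javascript
// Pure helper: turn a progress event into a whole-number percentage.
function toPercent(loaded, total) {
  if (!total) { return 0; }
  return Math.min(100, Math.round((loaded / total) * 100));
}

// Only register with Angular when it is actually loaded (i.e. in the browser).
if (typeof angular !== "undefined") {
  angular
    .module("uploadApp", ["ngFileUpload"])
    .controller("UploadCtrl", ["$scope", "UploaderService",
      function ($scope, UploaderService) {
        $scope.progress = 0;

        // Called by ngf-select in the template with the chosen file.
        $scope.uploadFile = function (file) {
          if (!file) { return; }
          UploaderService.upload(file, function (loaded, total) {
            // Report progress back to the page.
            $scope.$applyAsync(function () {
              $scope.progress = toPercent(loaded, total);
            });
          }).then(function () {
            $scope.uploadComplete = true;
          }, function () {
            $scope.uploadError = true;
          });
        };
      }]);
}
```

The controller itself stays thin: it just hands the file object to the service and updates the scope as progress events arrive.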

The S3 Upload Service

This simple AngularJS service initialises the Amazon S3 bucket we will be uploading to, the credentials we will be uploading under, and the region.
We give some chunking defaults (10MB), a timeout value, the bucket name, an ACL and an upload-progress function (to show the user the progress of the file upload).
The JavaScript Amazon S3 SDK will automatically split the file and perform the chunked upload.

Gist here and JavaScript below
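With the gist not inlined, a sketch of such a service might look like the following – assuming the AWS SDK for JavaScript (v2) is loaded on the page, and with the credentials, region and timeout values as obvious placeholders you’d swap for your own. The SDK’s s3.upload() performs the multipart (chunked) upload for us:

```javascript
var PART_SIZE = 10 * 1024 * 1024; // 10MB chunks, matching the default above

// Pure helper: how many parts a file of the given size will produce.
function partCount(fileSize, partSize) {
  return Math.max(1, Math.ceil(fileSize / partSize));
}

// Only register with Angular when it is loaded; assumes an "uploadApp"
// module has been defined elsewhere on the page.
if (typeof angular !== "undefined") {
  angular.module("uploadApp").service("UploaderService", ["$q", function ($q) {
    AWS.config.update({
      accessKeyId: "YOUR_ACCESS_KEY",     // placeholder
      secretAccessKey: "YOUR_SECRET_KEY", // placeholder
      region: "eu-west-1"                 // placeholder
    });

    var s3 = new AWS.S3({
      params: { Bucket: "myTempBucket" },
      httpOptions: { timeout: 10 * 60 * 1000 } // generous timeout for big files
    });

    // Upload a file, reporting (loaded, total) bytes via onProgress.
    this.upload = function (file, onProgress) {
      var deferred = $q.defer();
      s3.upload(
        { Key: file.name, Body: file }, // add an ACL here if your policy allows it
        { partSize: PART_SIZE, queueSize: 1 },
        function (err, data) {
          if (err) { deferred.reject(err); } else { deferred.resolve(data); }
        }
      ).on("httpUploadProgress", function (evt) {
        if (onProgress) { onProgress(evt.loaded, evt.total); }
      });
      return deferred.promise;
    };
  }]);
}
```

With a 10MB part size, a 25MB file goes up as three parts, and a failed part can be retried without restarting the whole upload – which is exactly the property we wanted for multi-GB files.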

Hopefully this has shown you how to allow your users to upload to an Amazon S3 bucket (with large file size limits).

You could give each user their own bucket and also go as far as authenticating the users before they upload – this was outside the requirements of this uploader but can be added in with relative ease.

If you have any questions, suggestions or improvements, hit me up on Twitter or in the comments below.

ryansouthgate

Software developer, living in Coventry, loves .Net, JavaScript and learning new languages.
