S3 file upload stream using Node.js

#1
I am trying to find a solution to stream files to Amazon S3 from a Node.js server, with the following requirements:

- Don't store a temp file on the server or hold the complete file in memory; buffering up to some limit (but not the whole file) is acceptable during the upload.
- No restriction on uploaded file size.
- Don't block the server until the upload completes, because with a heavy file upload the waiting time of other requests would otherwise increase unexpectedly.

I don't want to upload the file directly from the browser, because the S3 credentials would have to be shared in that case. Another reason to upload from the Node.js server is that some authentication may also need to be applied before uploading the file.

I tried to achieve this using node-multiparty, but it was not working as expected. You can see my solution and issue at

[To see links please register here]

. It works fine for small files but fails for a file of around 15 MB.

Any solution or alternative ?

#2
Give

[To see links please register here]

a try.

I used it for uploading several big files in parallel (>500 MB), and it worked very well.
It is very configurable and also allows you to track upload statistics.
You don't need to know the total size of the object in advance, and nothing is written to disk.

#3
I'm using the [s3-upload-stream](

[To see links please register here]

) module in a working project [here](

[To see links please register here]

).
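
For anyone who can't follow the links, here is a minimal sketch of how s3-upload-stream is typically wired up, based on the module's README; the bucket, key, and tuning values are placeholders, not taken from the linked project:

var AWS = require('aws-sdk');
var fs = require('fs');
// s3-upload-stream wraps an AWS.S3 client and returns a writable stream
var s3Stream = require('s3-upload-stream')(new AWS.S3());

var read = fs.createReadStream('/path/to/a/file');
var upload = s3Stream.upload({
  Bucket: 'example-bucket', // placeholder
  Key: 'example-key'        // placeholder
});

// optional tuning of the multipart upload
upload.maxPartSize(20971520); // 20 MB parts
upload.concurrentParts(5);    // upload up to 5 parts at once

upload.on('error', function (error) { console.log(error); });
upload.on('part', function (details) { console.log(details); });     // progress per part
upload.on('uploaded', function (details) { console.log(details); }); // final S3 response

read.pipe(upload);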

There are also some good examples from @raynos in his [http-framework](

[To see links please register here]

) repository.

#4
Alternatively, you can look at

[To see links please register here]

. It has a minimal set of abstracted APIs implementing the most commonly used S3 calls.

Here is an example of a streaming upload:

$ npm install minio
$ cat >> put-object.js << EOF

var Minio = require('minio')
var fs = require('fs')

// find out your s3 end point here:
//

[To see links please register here]


var s3Client = new Minio({
  url: 'https://<your-s3-endpoint>',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var file = 'your_localfile.zip'
var fileStream = fs.createReadStream(file) // read the local file as a stream
fs.stat(file, function(e, stat) {
  if (e) {
    return console.log(e)
  }
  // stream the file to S3; the object size comes from fs.stat
  s3Client.putObject('mybucket', 'hello/remote_file.zip', 'application/octet-stream', stat.size, fileStream, function(e) {
    return console.log(e) // should be null
  })
})
EOF

putObject() here is a fully managed single function call; for file sizes over 5 MB it automatically does a multipart upload internally. You can also resume a failed upload, and it will start from where it left off by verifying the previously uploaded parts.

Additionally, this library is isomorphic, so it can be used in browsers as well.

#5
If it helps anyone, I was able to stream from the client to S3 successfully (without memory or disk storage):



The server endpoint assumes `req` is a stream object. I sent a File object from the client, which modern browsers can send as binary data, and added the file info in the headers.
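
For context, a minimal client-side counterpart might look like the sketch below; the `/upload` endpoint, the `id` value, and the use of `fetch` are assumptions, not from the original post:

// `file` is a File object, e.g. from an <input type="file">
fetch('/upload', {                 // hypothetical server endpoint
  method: 'POST',
  headers: {
    // file info JSON-encoded into a custom "body" header,
    // matching what the server reads with req.get('body')
    body: JSON.stringify({ id: 'some-folder-id', fn: file.name }),
  },
  body: file, // the browser sends the File as binary data
})
  .then(function (res) { return res.text(); })
  .then(function (key) { console.log('uploaded as', key); });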

const fileUploadStream = (req, res) => {
  // get "body" args from a custom header
  const { id, fn } = JSON.parse(req.get('body'));
  const Key = id + '/' + fn; // upload to s3 folder "id" with filename === fn
  const params = {
    Key,
    Bucket: bucketName, // set somewhere
    Body: req,          // req is a stream
  };
  // assumes `s3` is an AWS.S3 instance configured elsewhere
  s3.upload(params, (err, data) => {
    if (err) {
      res.send('Error Uploading Data: ' + JSON.stringify(err) + '\n' + JSON.stringify(err.stack));
    } else {
      res.send(Key);
    }
  });
};


Yes, putting the file info in the headers breaks convention, but if you look at the gist it's much cleaner than anything else I found using streaming libraries, multer, busboy, etc.

+1 for pragmatism and thanks to @SalehenRahman for his help.

#6
You can now use streaming with the [official Amazon SDK for Node.js][1]; see the section "Uploading a File to an Amazon S3 Bucket" or their [example on GitHub][2].

What's even more awesome is that you can finally do so __without knowing the file size in advance__. Simply pass the stream as the `Body`:


var AWS = require('aws-sdk');
var fs = require('fs');
var zlib = require('zlib');

// gzip the file on the fly and upload the resulting stream
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data); });
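
If you need to tune the multipart behaviour, `upload()` also accepts an optional second argument with part size and concurrency settings; the values below are only illustrative:

// 10 MB parts, up to 4 parts uploaded concurrently
s3obj.upload({Body: body}, {partSize: 10 * 1024 * 1024, queueSize: 4})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data); });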


[1]:

[To see links please register here]

[2]:

[To see links please register here]


#7
For your information, the v3 SDK was published with a dedicated module to handle that use case:

[To see links please register here]


Took me a while to find it.
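
A minimal sketch of that v3 approach, assuming the module behind the hidden link is @aws-sdk/lib-storage with its Upload helper (region, bucket, key, and file name are placeholders):

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const fs = require('fs');

const upload = new Upload({
  client: new S3Client({ region: 'us-east-1' }), // placeholder region
  params: {
    Bucket: 'myBucket',
    Key: 'myKey',
    Body: fs.createReadStream('bigfile'), // any readable stream; size not needed
  },
});

upload.on('httpUploadProgress', (progress) => console.log(progress));

upload.done()
  .then((data) => console.log('upload complete', data))
  .catch((err) => console.error(err));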