"s3:PutObject",
"s3:ListBucket",
"AllowedMethods": [
"PUT",
"POST",
],
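A minimal CORS configuration for the bucket could look like the following sketch; the wildcard AllowedOrigins and AllowedHeaders values are placeholder assumptions, so restrict them to your own domain in production:

[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT", "POST"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": []
  }
]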
Install the AWS SDK for Node.js:

npm install aws-sdk
Then create the S3 client:

const AWS = require('aws-sdk');

const s3bucket = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_KEY,
  signatureVersion: 'v4',
  region: process.env.AWS_REGION, // e.g. us-west-2
});
Call getSignedUrlPromise() to receive the generated upload URL.

const params = {
  Bucket: process.env.AWS_BUCKET_NAME,
  Expires: 3000, // URL expiration time in seconds
  Key, // the S3 object key, i.e. the full file path (ex: mnt/sample.txt)
};
// notice that this is the same method that we used for downloading,
// but using 'putObject' instead of 'getObject'
const url = await s3bucket
  .getSignedUrlPromise('putObject', params)
  .catch((err) => {
    logger.error(err);
  });
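For comparison, here is a minimal sketch of the download counterpart mentioned in the comment above, assuming the same s3bucket client and the same params shape:

const downloadUrl = await s3bucket
  .getSignedUrlPromise('getObject', params)
  .catch((err) => {
    logger.error(err);
  });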
const fs = require('fs');
const axios = require('axios');

// create a read stream with the file's full path, including file name and extension
const istream = fs.createReadStream(streamPath);
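// The axios call below needs the file's mime type and total byte size.
// A sketch of how they could be derived; the 'mime-types' package is an
// assumption and is not part of the original snippet:
const mime = require('mime-types');
const mimetype = mime.lookup(streamPath) || 'application/octet-stream';
const totalSize = fs.statSync(streamPath).size;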
// use the generated upload URL to upload the file
axios.put(url, istream, {
  headers: {
    'Content-Type': mimetype, // mime type of the file
    'Content-Length': totalSize, // file's total size in bytes
  },
}).then(() => {
  console.log('http upload success!');
}).catch((err) => {
  console.error(err);
});
You can also track the upload progress by listening to the read stream's events with the .on() function. The events that you may be interested in during the upload are close (when the stream is finished, meaning the upload is done), data (a data chunk is being sent, i.e. the upload is in progress), and error (when the upload failed).
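A minimal sketch of attaching these listeners to the read stream created above:

istream.on('data', (chunk) => {
  console.log(`sent ${chunk.length} bytes`); // upload in progress
});

istream.on('close', () => {
  console.log('stream finished, upload is done');
});

istream.on('error', (err) => {
  console.error('upload failed', err);
});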