JavaScript AWS SDK S3 upload method with Body stream generating empty file
I'm trying to use the upload method from S3 with a ReadableStream from the fs module.
The documentation says that a ReadableStream can be used as the Body param:
Body — (Buffer, Typed Array, Blob, String, ReadableStream) Object data.
Also, the upload method description says:
Uploads an arbitrarily sized buffer, blob, or stream, using intelligent concurrent handling of parts if the payload is large enough.
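For reference, that concurrent handling of parts can be tuned through upload's optional second argument; a minimal sketch (the part size and queue size values here are just illustrative):
const S3 = require('aws-sdk/clients/s3')
const s3 = new S3()

// 10 MB parts, up to 4 parts uploaded concurrently
s3.upload({
  Bucket: 'test-bucket',
  Key: 'output.txt',
  Body: 'something',
}, { partSize: 10 * 1024 * 1024, queueSize: 4 }, (err, data) => {
  if (err) throw err
  console.log(data)
})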
Also, here: Upload pdf generated to AWS S3 using nodejs aws sdk, @shivendra says he can use a ReadableStream and it works.
This is my code:
const fs = require('fs')
const S3 = require('aws-sdk/clients/s3')
const s3 = new S3()

const send = async () => {
  const rs = fs.createReadStream('/home/osman/Downloads/input.txt')

  rs.on('open', () => {
    console.log('OPEN')
  })
  rs.on('end', () => {
    console.log('END')
  })
  rs.on('close', () => {
    console.log('CLOSE')
  })
  rs.on('data', (chunk) => {
    console.log('DATA: ', chunk)
  })

  console.log('START UPLOAD')

  const response = await s3.upload({
    Bucket: 'test-bucket',
    Key: 'output.txt',
    Body: rs,
  }).promise()

  console.log('response:')
  console.log(response)
}

send().catch(err => { console.log(err) })
I'm getting this output:
START UPLOAD
OPEN
DATA: <Buffer 73 6f 6d 65 74 68 69 6e 67>
END
CLOSE
response:
{ ETag: '"d41d8cd98f00b204e9800998ecf8427e"',
  Location: 'https://test-bucket.s3.amazonaws.com/output.txt',
  key: 'output.txt',
  Key: 'output.txt',
  Bucket: 'test-bucket' }
The problem is that my file generated at S3 (output.txt) has 0 bytes.
Does anyone know what I'm doing wrong?
If I pass a Buffer as Body, it works:
Body: Buffer.alloc(8 * 1024 * 1024, 'something'),
But that's not what I want to do. I'd like to generate a file as a stream and pipe it to S3 as I generate it.
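For reference, this is a minimal sketch of what I'm aiming for, assuming a hypothetical producer writing into a PassThrough stream whose readable side is handed to S3 as the Body:
const { PassThrough } = require('stream')
const S3 = require('aws-sdk/clients/s3')
const s3 = new S3()

const pass = new PassThrough()

// Hand the readable side to S3 while the writable side is still being fed
const uploading = s3.upload({
  Bucket: 'test-bucket',
  Key: 'output.txt',
  Body: pass,
}).promise()

// Hypothetical producer: write data as it's generated, then end the stream
pass.write('something')
pass.end()

uploading.then(response => console.log(response))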
It's an API interface issue with Node.js ReadableStreams. Just commenting out the code that listens to the 'data' event solves the problem:
const fs = require('fs')
const S3 = require('aws-sdk/clients/s3')
const s3 = new S3()

const send = async () => {
  const rs = fs.createReadStream('/home/osman/Downloads/input.txt')

  rs.on('open', () => {
    console.log('OPEN')
  })
  rs.on('end', () => {
    console.log('END')
  })
  rs.on('close', () => {
    console.log('CLOSE')
  })
  // rs.on('data', (chunk) => {
  //   console.log('DATA: ', chunk)
  // })

  console.log('START UPLOAD')

  const response = await s3.upload({
    Bucket: 'test-bucket',
    Key: 'output.txt',
    Body: rs,
  }).promise()

  console.log('response:')
  console.log(response)
}

send().catch(err => { console.log(err) })
Though it's a strange API: when we listen to the 'data' event, the ReadableStream switches into flowing mode (a listener changing the publisher's/EventEmitter's state? Yes, very error-prone...). For some reason, S3 needs a paused ReadableStream. If we attach rs.on('data', ...) after await s3.upload(...), it works. If we call rs.pause() after rs.on('data', ...) and before await s3.upload(...), it works too, as the sketch below shows.
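A minimal sketch of that last variant, keeping the 'data' listener but pausing the stream again before the upload (inside the same async send() as above):
const rs = fs.createReadStream('/home/osman/Downloads/input.txt')

rs.on('data', (chunk) => {
  console.log('DATA: ', chunk)
})
// Listening to 'data' switched the stream to flowing mode;
// pause() returns it to paused mode so S3 receives the data
rs.pause()

const response = await s3.upload({
  Bucket: 'test-bucket',
  Key: 'output.txt',
  Body: rs,
}).promise()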
Now, why does this happen? I don't know yet... but the problem is solved, even if it isn't completely explained.
Check that /home/osman/Downloads/input.txt actually exists and is accessible by the Node.js process, then try the putObject method. Example:
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

s3.putObject({
  Bucket: 'test-bucket',
  Key: 'output.txt',
  Body: fs.createReadStream('/home/osman/Downloads/input.txt'),
}, (err, response) => {
  if (err) {
    throw err;
  }
  console.log('response:');
  console.log(response);
});
Not sure how this will work with async..await; better to make the upload to AWS S3 work first, then change the flow.
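If you do want the async..await flow, putObject also exposes a .promise() method in the v2 SDK, so a sketch along these lines (same bucket and path as above) should work once the callback version does:
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

const send = async () => {
  const response = await s3.putObject({
    Bucket: 'test-bucket',
    Key: 'output.txt',
    Body: fs.createReadStream('/home/osman/Downloads/input.txt'),
  }).promise();
  console.log('response:');
  console.log(response);
};

send().catch(err => { console.log(err); });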
UPDATE: Try implementing the upload directly via ManagedUpload:
const fs = require('fs');
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

const upload = new S3.ManagedUpload({
  service: s3,
  params: {
    Bucket: 'test-bucket',
    Key: 'output.txt',
    Body: fs.createReadStream('/home/osman/Downloads/input.txt'),
  },
});

upload.send((err, response) => {
  if (err) {
    throw err;
  }
  console.log('response:');
  console.log(response);
});
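As a side note, ManagedUpload also has a .promise() method in the v2 SDK, so once this works, the same upload should be awaitable:
const response = await upload.promise(); // inside an async function
console.log(response);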