Rackspace cloud taking too long to upload? - node.js

I'm following the Rackspace example for uploading a file to cloud storage from the docs. It works, but the upload takes far too long. No matter which region I use, a 17 kB file takes more than 3 seconds. Is this the actual behaviour of Rackspace cloud storage; is it really that slow?
I'm using Rackspace with Node.js, with the help of the pkgcloud package.
var fs = require('fs');
var pkgcloud = require('pkgcloud');

// storage client for Rackspace (fill in your own credentials and region)
var client = pkgcloud.storage.createClient({ provider: 'rackspace', /* username, apiKey, region */ });

// taken from pkgcloud docs
var readStream = fs.createReadStream('a-file.txt');
var writeStream = client.upload({
  container: 'a-container',
  remote: 'remote-file-name.txt'
});

writeStream.on('error', function(err) {
  // handle your error case
});

writeStream.on('success', function(file) {
  // success, file will be a File model
});

readStream.pipe(writeStream);
The purpose here is that I do image processing on the backend and then send the CDN URL back to the user, but a user cannot wait that long: a 2 MB file took forever to upload, the request timed out, and the pending streams held my server until it crashed because they never finished.
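Roughly, the flow I'm describing looks like this (a simplified sketch; the Express route, timeout value, and CDN base URL are placeholders, and `client` is the pkgcloud client from above):
app.post('/images', function(req, res) {
  var writeStream = client.upload({
    container: 'a-container',
    remote: 'processed-image.jpg'
  });

  // don't hold the HTTP request open forever if the upload stalls
  var timer = setTimeout(function() {
    if (!res.headersSent) res.status(504).send('upload to cloud storage timed out');
  }, 30000);

  writeStream.on('error', function(err) {
    clearTimeout(timer);
    if (!res.headersSent) res.status(500).send('upload failed');
  });

  writeStream.on('success', function(file) {
    clearTimeout(timer);
    // respond with the CDN URL as soon as the upload finishes
    if (!res.headersSent) res.send({ url: 'https://my-cdn-base-url/' + file.name });
  });

  // in the real code this would be the processed image stream, not the raw request
  req.pipe(writeStream);
});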

Related

Download image files from Google Cloud Storage bucket to iOS local storage using Meteor

I am using a Meteor project to upload images to Google Cloud Storage from an iOS device and to download the same images back to the device.
I don't have any issues while uploading the images; they get stored in the Google Storage bucket. The issue I am facing is while downloading the images. I am using the code below, which downloads the images to a path on the server.
bucket.file(srcFilename).download(options);
I want to download the images and store them on the iOS device.
When I tried to read the file using createReadStream, my app got stuck without any progress (I'm not getting any callback).
bucket.file(srcFilename).createReadStream()
  .on('error', function(err) {
    console.log("error");
  })
  .on('response', function(response) {
    // Server connected and responded with the specified status and headers.
    console.log("response");
  })
  .on('end', function() {
    // The file is fully downloaded.
    console.log("The file is fully downloaded.");
  })
I hope I am not missing anything while downloading the images to the iOS device. I looked around but was unable to find any other option for doing this.
Any help in this regard is really appreciated, as I am stuck at this very point.
I used the code below to get the file from Google Cloud and download it in chunks, which I concatenated into a single buffer and converted to base64. I then used that base64 string to display the image and store it in local storage on the client side.
var chunkNew = new Buffer('');

bucket.file(srcFilename).createReadStream()
  .on('data', function (chunk) {
    chunkNew = Buffer.concat([chunkNew, chunk]);
  })
  .on('end', function () {
    // The file is fully downloaded.
    callback(null, chunkNew.toString('base64'));
  })
More information can be found at this link: http://codewinds.com/blog/2013-08-04-nodejs-readable-streams.html, which uses data chunks to show the image as an array buffer.
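To round this out, here is a rough sketch (not from the original answer; the method name and client wiring are made up for illustration) of how that base64 result could be exposed to the Meteor client and shown on the device as a data URI, assuming `bucket` is the same Cloud Storage bucket object used above:
// server side: hypothetical Meteor method wrapping the download above
import { Meteor } from 'meteor/meteor';

Meteor.methods({
  'images.fetchBase64': function (srcFilename) {
    // turn the callback-style stream read into a synchronous-looking call
    var fetchBase64 = Meteor.wrapAsync(function (name, callback) {
      var chunkNew = new Buffer('');
      bucket.file(name).createReadStream()
        .on('error', function (err) { callback(err); })
        .on('data', function (chunk) { chunkNew = Buffer.concat([chunkNew, chunk]); })
        .on('end', function () { callback(null, chunkNew.toString('base64')); });
    });
    return fetchBase64(srcFilename);
  }
});

// client side: the base64 string can be used directly as an image source
Meteor.call('images.fetchBase64', 'photo.jpg', function (err, base64) {
  if (!err) {
    document.getElementById('photo').src = 'data:image/jpeg;base64,' + base64;
  }
});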

Streaming upload from NodeJS to Dropbox

Our system needs to run our internal security checks when interacting with Dropbox, so we cannot use the client-side SDK for Dropbox.
We would rather upload to our own endpoint, apply security checks, and then stream the incoming request to dropbox.
I am coming up short here as there was an older NodeJS Dropbox SDK which supported pipes, but the new SDK does not.
Old SDK:
https://www.npmjs.com/package/dropbox-node
We want to take the incoming upload request and forward it to Dropbox as it comes in (and thus prevent the upload from taking twice as long, which is what happens if we first upload the entire thing to our server and then upload it to Dropbox).
Is there any way to solve this?
My Dropbox npm module (dropbox-v2-api) supports streaming. It's based on the HTTP API, so you can take advantage of streams. Example? I see it this way:
const fs = require('fs');
const dropboxV2Api = require('dropbox-v2-api');

// authenticate once, then reuse the returned function for API calls
const dropbox = dropboxV2Api.authenticate({ token: 'your-access-token' });

const contentStream = fs.createReadStream('file.txt');
const securityChecks = ... // your security checks (see the sketch below)

const uploadStream = dropbox({
  resource: 'files/upload',
  parameters: { path: '/target/file/path' }
}, (err, result, response) => {
  // upload finished
});

contentStream
  .pipe(securityChecks)
  .pipe(uploadStream);
Full stream support example here.
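For context, a minimal sketch (my own illustration, not from the module's docs) of what that securityChecks stage could look like as a Node Transform stream, so the whole pipeline stays streaming end to end; the size-limit check is just a placeholder:
const { Transform } = require('stream');

// purely illustrative: a pass-through Transform that inspects each chunk
// (the actual check logic is whatever your internal policy requires)
const securityChecks = new Transform({
  transform(chunk, encoding, done) {
    this.bytesSeen = (this.bytesSeen || 0) + chunk.length;
    if (this.bytesSeen > 150 * 1024 * 1024) {
      // erroring the transform aborts the pipeline before anything reaches Dropbox
      return done(new Error('upload rejected by security checks'));
    }
    done(null, chunk); // pass the chunk through unchanged
  }
});
In the endpoint itself, the incoming request stream would simply take the place of the file stream, e.g. req.pipe(securityChecks).pipe(uploadStream), so nothing is buffered on the server first.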

Get upload progress using Azure Storage Node.JS SDK

Looking at this, we have some code that looks like
var azure = require('azure-storage');
var blobSvc = azure.createBlobService();

blobSvc.createBlockBlobFromLocalFile('mycontainer', 'giantFile', 'giantFile.bak', function(error, result, response) {
  if (!error) {
    // file uploaded
  }
});
This "work" but we have no idea about the status of each upload until it returns. We'd like to print the progress of the upload to the console since it's a sequence of very large files that can take several hours.
I'm afraid I don't think it's possible out of the box with the SDK (see this MSDN article). Apparently such an API hasn't been implemented yet.
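Even without per-byte progress from the SDK, it's at least possible to log coarse progress across the sequence of files from the existing callback. A rough sketch (the file list and naming are made up for illustration):
var azure = require('azure-storage');
var blobSvc = azure.createBlobService();

// illustrative list of backups to upload in sequence
var files = ['giantFile1.bak', 'giantFile2.bak', 'giantFile3.bak'];

function uploadNext(index) {
  if (index >= files.length) {
    console.log('All uploads finished.');
    return;
  }
  var name = files[index];
  console.log('[' + (index + 1) + '/' + files.length + '] uploading ' + name + '...');

  blobSvc.createBlockBlobFromLocalFile('mycontainer', name, name, function(error) {
    if (error) {
      console.error('Upload of ' + name + ' failed: ' + error.message);
    } else {
      console.log('[' + (index + 1) + '/' + files.length + '] ' + name + ' uploaded.');
    }
    uploadNext(index + 1); // move on to the next file either way
  });
}

uploadNext(0);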

Streaming files directly to Client from Amazon S3 (Node.js)

I am using Sails.js and am trying to stream files from Amazon S3 directly to the client.
To connect to S3, I use the s3 module: https://www.npmjs.org/package/s3
This module provides capabilities like client.downloadFile(params) and client.downloadBuffer(s3Params).
My current code looks like the following:
var view = client.downloadBuffer(params);

view.on('error', function(err) {
  cb({ success: 0, message: 'Could not open file.' }, null);
});

view.on('end', function(buffer) {
  cb(null, buffer);
});
I catch this buffer in a controller using:
User.showImage(params, function(err, buffer) {
  // this is where I can get the buffer
});
Is it possible to stream this data to the client as an image file? Using buffer.pipe(res) doesn't work, of course, since a buffer isn't a stream, but is there something similar that completely avoids saving the file to the server's disk first?
The other option, client.downloadFile(params), requires a local path (i.e. a server path in our case).
The GitHub issue contains the "official" answer to this question: https://github.com/andrewrk/node-s3-client/issues/53
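For reference, one common way to do this kind of disk-free pass-through is to use the aws-sdk package directly rather than the s3 wrapper module and pipe the object's read stream straight into the response. A minimal sketch (this is my own illustration, not the content of the linked issue; the bucket, key, and content type are placeholders):
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// inside a route/controller handler: stream the object straight to the client
function showImage(req, res) {
  var params = { Bucket: 'my-bucket', Key: 'images/' + req.params.id };

  var stream = s3.getObject(params).createReadStream();

  stream.on('error', function(err) {
    if (!res.headersSent) res.status(404).send('Could not open file.');
  });

  res.set('Content-Type', 'image/jpeg'); // or look the type up from object metadata
  stream.pipe(res); // nothing is ever written to the server's disk
}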

What "streams and pipe-capable" means in pkgcloud in NodeJS

My issue is getting image uploads to Amazon working.
I was looking for a solution that doesn't first save the file on the server and then upload it to Amazon.
Googling I found pkgcloud and on the README.md it says:
Special attention has been paid so that methods are streams and
pipe-capable.
Can someone explain what that means and if it is what I am looking for?
Yup, that means you've found the right kind of S3 library.
What it means is that this library exposes "streams". Here is the API that defines a stream: http://nodejs.org/api/stream.html
Using node's stream interface, you can pipe any readable stream (in this case the POST's body) to any writable stream (in this case the S3 upload).
Here is an example of how to pipe a file upload directly to another kind of library that supports streams: How to handle POSTed files in Express.js without doing a disk write
EDIT: Here is an example
var pkgcloud = require('pkgcloud'),
    fs = require('fs');

var s3client = pkgcloud.storage.createClient({ /* ... */ });

app.post('/upload', function(req, res) {
  var s3upload = s3client.upload({
    container: 'a-container',
    remote: 'remote-file-name.txt'
  });

  // pipe the image data directly to S3
  req.pipe(s3upload);
});
EDIT: To finish answering the questions that came up in the chat:
When req ends, piping will automatically call s3upload.end() thanks to stream magic. If the OP wants to do anything else when req ends, he can do so easily: req.on('end', function() { res.send("done!"); })
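A small extension of the example above (my own sketch, reusing the 'error' and 'success' events pkgcloud's upload stream emits, as shown at the top of this page): respond only once the storage provider has confirmed the upload.
app.post('/upload', function(req, res) {
  var s3upload = s3client.upload({
    container: 'a-container',
    remote: 'remote-file-name.txt'
  });

  s3upload.on('error', function(err) {
    res.status(500).send('upload failed');
  });

  s3upload.on('success', function(file) {
    // file is a pkgcloud File model describing what was stored
    res.send('done! stored as ' + file.name);
  });

  req.pipe(s3upload);
});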
