How to upload an image to an AWS server using node - node.js

I am trying to upload an image to an AWS server using multer with the following code:
// POST request
var path = require('path');
var multer = require('multer');
var upload = multer({ dest: path.resolve('./public/uploads/') });

/* POST saveblog router. */
router.post('/uploadMedia', upload.any(), function(req, res, next) {
    console.log(req.body, 'Body');
    console.log(req.files, 'files');
    var myJSON = JSON.stringify(req.files[0].filename);
    console.log("file name " + myJSON);
    if (typeof myJSON != 'undefined') {
        var obj = new Object();
        obj.status = true;
        obj.message = "File Uploaded Successfully";
        obj.type = req.files[0].mimetype;
        var fileExtensionArray = (req.files[0].mimetype).split("/");
        var fileExtension = fileExtensionArray[1];
        res.json(obj);
    } else {
        var obj = new Object();
        obj.status = false;
        obj.message = "Error while uploading file try again";
        res.json(obj);
    }
    // res.json() already ends the response, so the original res.end() here was redundant
});
It works fine locally, but when I deploy the code to the server and hit the endpoint through the API, the server stops with the following logs:
Change detected on path public/uploads/b690b296bfde62eb8ff527328bc8b463 for app www - restarting
PM2 | Stopping app:www id:0
As the log says, a change was detected, but I am not able to find any image.
I have searched Stack Overflow and Google, but I could not find a tutorial or help for doing this.
Everything I find is about S3, but I don't want to upload to it.
Is it not possible to upload an image to an AWS server like this? Do I have to use S3, or is something wrong with my code?
Edit: I am now able to get the image into the destination folder, but I am still not able to return the response.

You're likely running PM2 in watch mode, a development feature that restarts your application whenever a file changes; it is not intended for production. Since multer writes each upload into public/uploads/, a watched path, every upload triggers a restart and kills the request before the response is sent.
To fix this, either start pm2 without the --watch option, or keep watching but exclude the uploads directory. From the CLI use
--ignore-watch="public/uploads"
or in the process file:
ignore_watch: ["public/uploads"]
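For example, a minimal process file along those lines (a sketch; the script path is an assumption, and the app name is taken from the "app:www" in your logs):
// ecosystem.config.js -- a sketch; adjust name/script to your project
module.exports = {
  apps: [{
    name: 'www',
    script: './bin/www',
    watch: true,                       // keep watch mode in development if you want it
    ignore_watch: ['public/uploads']   // but never restart because of uploaded files
  }]
};
Start it with pm2 start ecosystem.config.js.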

Related

npm connect-multiparty 'TypeError: Cannot read properties of undefined (reading 'path')' on AWS Elastic Beanstalk

The connect-multiparty package is giving me some trouble. When I run my website locally it works perfectly fine and saves the uploaded multipart form data as intended, but when running on AWS Elastic Beanstalk I run into the error TypeError: Cannot read properties of undefined (reading 'path').
The first bit of my backend code is here:
const multipart = require('connect-multiparty')
const mpmw = multipart() // the multipart middleware instance implied by its use below

consts.mainRouter.post(`/*`, mpmw, (req, res, next) => {
    index.log(`mainRouter got posted in uploads: ${req.url}`)
    next()
})

consts.mainRouter.post(`${prefix}`, mpmw, async (req, res) => {
    index.log(`upload request received`)
    const account_data = await consts.accountExists(req.signedCookies)
    console.log(req.body)
    console.log(req.files)
    let { audio, thumbnail } = req.files
    let { name, collab } = req.body
    let imageblobin = fs.readFileSync(`${thumbnail.path}`)
    let imageblob = undefined
    let imagebloblarge = undefined
    let newtime = new Date().getTime()
On the "fs.readFileSync()" line, the error occurs; What could go wrong on AWS Elastic Beanstalk here that would work fine on my local machine?
Note: I am 100% sure that the request was correctly made, it previously worked, and only the backend was changed since
Turns out there was something wrong with the EB environment I was using (sadly don't know what); I swapped the backend to a different environment which isn't blocking my multipart/form-data requests and therefore doesn't bring this error.
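Either way, it's worth guarding against req.files being undefined before destructuring, so a stripped multipart body produces a clear 400 instead of a TypeError. A minimal sketch reusing the question's route and field names:
consts.mainRouter.post(`${prefix}`, mpmw, async (req, res) => {
    // If a proxy or environment strips the multipart body, req.files is undefined
    if (!req.files || !req.files.thumbnail || !req.files.audio) {
        return res.status(400).json({ error: 'multipart form data missing' })
    }
    const { audio, thumbnail } = req.files
    // ...continue with fs.readFileSync(thumbnail.path) as before
})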

Generating rss.xml for Angular 8 app locally works fine, but not on prod

I am trying to generate an rss.xml file for my Angular 8 app with SSR + Firebase + GCP, served from the same domain.
I've created an RssComponent which can be reached at the /rss route. There I call the getNews() method and receive an array of objects. Then I make an HTTP request to /api/rss, and in server.ts I handle that request:
app.post('/api/rss', (req, res) => {
  const data = req.body.data;
  const feedOptions = { /* options elided in the question */ };
  const feed = new RSS(feedOptions);
  data.forEach((item) => {
    feed.item({
      title: item.headingUa,
      description: item.data[0].dataUa,
      url: item.rssLink,
      guid: item.id,
      date: item.utcDate,
      // NOTE: the original replace('&', '&') is a no-op; it was most likely
      // an '&amp;' escape mangled by HTML encoding when the question was posted
      enclosure: {url: item.mainImg.url.toString().replace('&', '&'), type: 'image/jpeg'}
    });
  });
  const xml = feed.xml({indent: true});
  fs.chmod('dist/browser/rss.xml', 0o600, () => {
    fs.writeFile('dist/browser/rss.xml', xml, 'utf8', function() {
      res.status(200).end();
    });
  });
});
Finally, on response, I open the freshly generated rss.xml file in the RssComponent. Locally everything works fine, but on Google Cloud Platform the file is not generated.
As explained in the Cloud Functions docs:
The only writeable part of the filesystem is the /tmp directory
Try changing the file path to the /tmp directory.
Nevertheless, relying on local files in a serverless environment is a really bad idea. You should assume the instance handling the following request will not be the same one that handled the previous request.
The best way to handle this is to avoid writing local files altogether: store the generated file in GCP Cloud Storage or Firebase Storage, and retrieve it from there when needed.
This keeps your functions stateless and complies with Cloud Functions best practices.
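A minimal sketch of that approach, assuming the @google-cloud/storage client; the bucket name is a placeholder, and buildFeed is a hypothetical helper wrapping the feed-building code from the question:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('your-bucket-name'); // placeholder

app.post('/api/rss', async (req, res) => {
  const feed = buildFeed(req.body.data); // hypothetical helper: the RSS setup shown above
  const xml = feed.xml({ indent: true });
  // Write the XML straight to Cloud Storage instead of the local filesystem
  await bucket.file('rss.xml').save(xml, { contentType: 'application/rss+xml' });
  res.status(200).end();
});
The /rss route can then redirect to (or proxy) the file's Storage URL instead of reading it from disk.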

Get source url for image stored in Bluemix Object Storage container using Node.js app

I have an Object Storage instance on Bluemix where I am storing images in a container. I need a source URL for the images stored there so that I can use them. To do this, I'm thinking of creating a Node.js app with a POST endpoint that takes the name of an image in Object Storage as the request and returns the image URL as the response.
Is this possible? If so, can anyone suggest npm modules that provide this functionality? If not, are there other suggestions for getting the URL of an image?
Any help is appreciated. Thanks!
Start the server with node app.js. You also need the pkgcloud package to perform this operation. You can get the Object Storage credentials by creating a key on the IBM console inside the Storage module.
Inside app.js, insert a new route for the download:
var objectStorageHandler = require("./lib/objectStorageHandler.js");

app.get('/download', function(req, res) {
    (new objectStorageHandler()).download('YourContainerName', 'imagenamewithextension', function(download) {
        download.pipe(res);
    });
});
Inside the lib folder, create a module named objectStorageHandler.js with the following code:
var pkgcloud = require('pkgcloud');

var objectStorageHandler = function() {
};

objectStorageHandler.prototype.download = function(container, file, callback) {
    var config = {
        provider: 'openstack',
        useServiceCatalog: true,
        useInternal: false,
        keystoneAuthVersion: 'v3',
        authUrl: 'https://identity.open.softlayer.com',
        tenantId: 'YOURPROJECTID', // projectId from credentials
        domainId: 'YOURDOMAINID',
        username: 'YOURUSRNAME',
        password: 'YOURPASSWORD',
        region: 'dallas' // dallas or london region
    };

    var client = pkgcloud.storage.createClient(config);

    client.auth(function(error) {
        if (error) {
            console.error("Authorization error for storage client (pkgcloud): ", error);
        } else {
            var request = client.download({
                container: container,
                remote: file
            });
            callback(request);
        }
    });
};

module.exports = objectStorageHandler;
After the server has started (let's assume on port 3000), simply call localhost:3000/download, which will download the image. We can also pass the image name as a parameter to download images dynamically, as sketched below.
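For example, a parameterized version of the same route (a sketch; the container name is still a placeholder):
// GET /download/myimage.jpg streams that object from the container to the client
app.get('/download/:imageName', function(req, res) {
    (new objectStorageHandler()).download('YourContainerName', req.params.imageName, function(download) {
        download.pipe(res);
    });
});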

Can't connect to the storage server in the Google Compute Engine Tutorial for Node.js

I'm following Google's tutorial on integrating Cloud Storage with Node.js, and I'm having problems connecting to the Cloud Storage server.
In the tutorial, you create the storage bucket using
$ gsutil mb gs://<your-project-id>
$ gsutil defacl set public-read gs://<your-project-id>
Afterwards, you configure the config.js file in the sample project to use the created bucket. Then the following function (in the image.js file) is used as a request preprocessor to upload files to Cloud Storage:
function processUploads(req, resp, next) {
  var numFiles = Object.keys(req.files).length;
  if (!numFiles) return next();

  function checkNext(err) {
    numFiles--;
    if (numFiles === 0) next();
  }

  Object.keys(req.files).forEach(function(key) {
    var uploadedFile = req.files[key];
    var file = bucket.file(uploadedFile.name);
    var stream = file.createWriteStream();

    stream.on('error', function(err) {
      uploadedFile.cloudStorageError = err;
      checkNext();
    });

    stream.on('complete', function() {
      uploadedFile.cloudStorageObject = uploadedFile.name;
      uploadedFile.cloudStoragePublicUrl = getPublicUrl(uploadedFile.name);
      file.makePublic(checkNext);
    });

    stream.end(uploadedFile.buffer);
  });
}
The problem is that when I upload a file through the form, neither the complete nor the error callback is called. It seems the storage server is simply not responding.
Can anyone help me make it work?
After fiddling around a little more, I found out the files were actually being stored in the cloud. I was able to use the browse function in the Storage section of the Google Cloud Console to see the files I had uploaded.
But my callbacks weren't being called. After reading the source code, it appears that the sample code is outdated. Once I replaced the complete event with finish, as shown below, everything started working perfectly.
function processUploads(req, resp, next) {
  var numFiles = Object.keys(req.files).length;
  if (!numFiles) return next();

  function checkNext(err) {
    numFiles--;
    if (numFiles === 0) next();
  }

  Object.keys(req.files).forEach(function(key) {
    var uploadedFile = req.files[key];
    var file = bucket.file(uploadedFile.name);
    var stream = file.createWriteStream();

    stream.on('error', function(err) {
      uploadedFile.cloudStorageError = err;
      checkNext();
    });

    stream.on('finish', function() {
      uploadedFile.cloudStorageObject = uploadedFile.name;
      uploadedFile.cloudStoragePublicUrl = getPublicUrl(uploadedFile.name);
      file.makePublic(checkNext);
    });

    stream.write(uploadedFile.buffer);
    stream.end();
  });
}
I've also found a gcloud package issue confirming this was a breaking change; the documentation is outdated, as it still tells you to use complete rather than finish at the time of this writing.

Use Cordova's Filetransfer plugin with express/blob storage

I am using TypeScript and Cordova 4.0.
I have the following sample code:
uploadImages(imageUris: Array<string>): any {
    var fileTransfer = new FileTransfer();
    for (var i = 0; i < imageUris.length; i++) {
        fileTransfer.upload(imageUris[i], encodeURI('http://3187cf3.ngrok.com/test/photos'), (success) => {
            alert('success');
        }, (err) => {
            alert('error');
        });
    }
}
This corresponds to an express route:
var router = express.Router(),
    test = test.controller;

router
    .post('/test/photos', bind(test.uploadPhotos, test));
Which corresponds to a controller method:
uploadPhotos(req: express.Request, res: express.Response) {
    console.log(req);
}
I can't seem to figure out how to grab, inside of my controller, the "file" or image I'm posting to my server using FileTransfer. It's not on req.body or req.query, and when I look through the entire req I can't locate the file. The app flow works well enough to actually make the POST request to /test/photos; I just don't know how, or whether, I can access the file at that point.
How does FileTransfer work, and how can I access the data I need in my controller so that I can push it to Azure Blob Storage?
It looks like you have everything set up correctly to send the data through to your controller. The issue is that nothing is parsing the multipart body and putting the file on the request; Cordova's FileTransfer plugin posts multipart/form-data, which Express doesn't parse by default.
You can do that with the popular multer library.
Run npm install multer --save to install multer and save it to your package.json file.
In your express config file, add something like the following:
var multer = require('multer');
app.use(multer({ dest: path.resolve(config.root, 'public/img/') }));
'public/img/' is the path where you would like your images to be saved.
Now your req will have files on it. To access a single uploaded file, you would use req.files.file. You'll want to use this variable to send the file to Azure's Blob Storage, using something like blobSvc.createBlockBlobFromLocalFile(containerName, fileName, filePath).
Since you're using Azure for remote storage, chances are you will want to remove the local file that multer saved. I'd recommend using fs or rimraf to delete the file stored in public/img/ (or whatever path you set in your express config). With fs, the method you want is fs.unlink.
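Putting it together, a minimal sketch of the full route. Note that app.use(multer({...})) is the old multer 0.x API; with multer 1.x and later you attach it per route, as below. The container name and the 'file' field name are assumptions (FileTransfer uses 'file' as its default fileKey):
var multer = require('multer');
var azure = require('azure-storage');
var fs = require('fs');

var upload = multer({ dest: 'public/img/' }); // multer 1.x style
var blobSvc = azure.createBlobService(); // reads the AZURE_STORAGE_* env vars

// FileTransfer posts the file under the 'file' field by default
app.post('/test/photos', upload.single('file'), function(req, res) {
    var localPath = req.file.path;
    blobSvc.createBlockBlobFromLocalFile('photos', req.file.originalname, localPath, function(err) {
        // Remove the temporary local copy once the blob upload has finished
        fs.unlink(localPath, function() {
            if (err) return res.status(500).send('upload failed');
            res.status(200).send('uploaded');
        });
    });
});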
