Can't connect to the storage server in the Google Compute Engine Tutorial for Node.js - node.js

I'm following Google's tutorial to integrate Cloud Storage with Node.js, but I'm having problems connecting to the Cloud Storage server.
In the tutorial, you create the storage bucket using
$ gsutil mb gs://<your-project-id>
$ gsutil defacl set public-read gs://<your-project-id>
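(As a quick sanity check, and assuming gsutil is authenticated against the same project, you can list the bucket to confirm it was created:)
$ gsutil ls -b gs://<your-project-id>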
Afterwards, you configure the config.js file in the sample project to use the newly created bucket. Then, the following function (in the image.js file) is used as a request preprocessor to upload files to Cloud Storage.
function processUploads(req, resp, next) {
  var numFiles = Object.keys(req.files).length;
  if (!numFiles) return next();

  function checkNext(err) {
    numFiles--;
    if (numFiles === 0) next();
  }

  Object.keys(req.files).forEach(function(key) {
    var uploadedFile = req.files[key];
    var file = bucket.file(uploadedFile.name);
    var stream = file.createWriteStream();

    stream.on('error', function(err) {
      uploadedFile.cloudStorageError = err;
      checkNext();
    });

    stream.on('complete', function() {
      uploadedFile.cloudStorageObject = uploadedFile.name;
      uploadedFile.cloudStoragePublicUrl = getPublicUrl(uploadedFile.name);
      file.makePublic(checkNext);
    });

    stream.end(uploadedFile.buffer);
  });
}
The problem is that, when you upload a file through the form, neither the complete nor the error callback is called. It seems the storage server is simply not responding.
Can anyone help me make it work?

After fiddling around a little more, I found out the files were actually being stored in the cloud. I was able to use the browse function in the Storage section of the Google Compute Engine Console to see the files I uploaded.
But my callbacks weren't being called. After reading the source code, it appears the sample code is outdated. Once I replaced the complete event with finish, as shown below, everything started working perfectly.
function processUploads(req, resp, next) {
  var numFiles = Object.keys(req.files).length;
  if (!numFiles) return next();

  function checkNext(err) {
    numFiles--;
    if (numFiles === 0) next();
  }

  Object.keys(req.files).forEach(function(key) {
    var uploadedFile = req.files[key];
    var file = bucket.file(uploadedFile.name);
    var stream = file.createWriteStream();

    stream.on('error', function(err) {
      uploadedFile.cloudStorageError = err;
      checkNext();
    });

    stream.on('finish', function() {
      uploadedFile.cloudStorageObject = uploadedFile.name;
      uploadedFile.cloudStoragePublicUrl = getPublicUrl(uploadedFile.name);
      file.makePublic(checkNext);
    });

    stream.write(uploadedFile.buffer);
    stream.end();
  });
}
I've also found a gcloud package issue showing there was a breaking change; at the time of this writing the documentation is still outdated, as it tells you to use complete rather than finish.

Related

Generating rss.xml for Angular 8 app locally works fine, but not on prod

I am trying to generate an rss.xml file for my Angular 8 app with SSR + Firebase + GCP inside the domain.
I've created an RssComponent which can be reached at the /rss route. There I call the getNews() method and receive an array of objects. Then I make an HTTP request to /api/rss, and in server.ts I handle that request:
app.post('/api/rss', (req, res) => {
  const data = req.body.data;
  const feedOptions = // defining options here
  const feed = new RSS(feedOptions);
  data.forEach((item) => {
    feed.item({
      title: item.headingUa,
      description: item.data[0].dataUa,
      url: item.rssLink,
      guid: item.id,
      date: item.utcDate,
      enclosure: {url: item.mainImg.url.toString().replace('&amp;', '&'), type: 'image/jpeg'}
    });
  });
  const xml = feed.xml({indent: true});
  fs.chmod('dist/browser/rss.xml', 0o600, () => {
    fs.writeFile('dist/browser/rss.xml', xml, 'utf8', function() {
      res.status(200).end();
    });
  });
});
And finally, on response, I'm opening the recently generated rss.xml file in the RssComponent. Locally everything works fine, but on Google Cloud Platform the file is not generated.
As explained in the Cloud Functions docs:
The only writeable part of the filesystem is the /tmp directory
Try changing the path to the file to the /tmp directory.
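For example, a minimal sketch of that change, reusing the xml string and res from the handler above:
const os = require('os');
const path = require('path');

// /tmp is the only writeable location in the Cloud Functions filesystem
const tmpPath = path.join(os.tmpdir(), 'rss.xml');
fs.writeFile(tmpPath, xml, 'utf8', () => res.status(200).end());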
Nevertheless, using local files in a serverless environment is a really bad idea. You should assume that the instance handling the next request will not be the same one that handled the previous request.
The best way to handle this is to avoid writing local files altogether and instead store the generated file in GCP Storage or Firebase Storage, then retrieve it from there when needed.
This will ensure your functions are idempotent and will also comply with best practices.
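A minimal sketch of that approach with the @google-cloud/storage client, again reusing xml and res from the handler above; the bucket name is a placeholder:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// 'my-rss-bucket' is a placeholder; use your own bucket name
storage.bucket('my-rss-bucket')
  .file('rss.xml')
  .save(xml, {contentType: 'application/rss+xml'})
  .then(() => res.status(200).end());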

Firebase Storage - How to delete file from storage with node.js?

I want to delete a folder in Firebase Storage with Node.js (this is for a Firebase function).
For example:
storageRef.child(child1).child(child2).delete();
Something like this, but the Firebase documentation doesn't say anything about it.
One more question:
When initializing Storage, the Node.js documentation requires my admin JSON credentials, but the Realtime Database doesn't; I wonder why?
Have a look at the Node.js client API Reference for Google Cloud Storage and in particular at the delete() method for a File.
You can do it like this using Node.js:
const firebase = require('firebase-admin');

async function deleteImageFromFirebase(imageName) {
  await firebase.storage().bucket().file("folderName/" + imageName).delete();
}
And like this client side:
// Create a reference to the file to delete
var desertRef = storageRef.child('images/desert.jpg');

// Delete the file
desertRef.delete().then(function() {
  // File deleted successfully
}).catch(function(error) {
  // Uh-oh, an error occurred!
});
View this info on the Firebase website:
how to delete files Firebase-storage
This might be late, but at least on Web (so basically what you need) there is a new API to delete a whole folder.
I tested deleting a folder with 2 pictures inside, and it worked. I then tried a folder-A with contents: folder-B + picture-A, where folder-B also had a picture-B inside; it still deleted folder-A with all of its contents.
Solution:
const bucket = admin.storage().bucket();

return bucket.deleteFiles({
  prefix: `posts/${postId}`
});
I couldn't find this in the official documentation (perhaps it's a really new API), but here is a really cool article where I found the solution:
Automatically delete your Firebase Storage Files from Firestore with Cloud Functions for Firebase
import { storage } from "./firebaseClient";
import { bucket } from "./firebaseServer";

// Let's assume this is the URL of the image we want to delete
const downloadUrl = "https://storage.googleapis.com/storage/v1/b/<projectID>.appspot.com/o/<location>?"

// firebase delete function
const deleteImages = async ({ downloadUrl }) => {
  const httpsRef = storage.refFromURL(downloadUrl).fullPath;
  return await bucket
    .file(httpsRef)
    .delete()
    .then(() => "success")
    .catch(() => "error")
}

// call deleteImages inside an async function
const deleteStatus = await deleteImages({ downloadUrl: oldImage });
console.log(deleteStatus) //=> "success"

How to upload image on aws server using node

I am trying to upload an image to the AWS server using multer, with the following code:
// post request
var multer = require('multer');
var upload = multer({ dest: path.resolve('./public/uploads/') });

/* POST saveblog router. */
router.post('/uploadMedia', upload.any(), function(req, res, next) {
  //console.log("res", res);
  console.log(req.body, 'Body');
  console.log(req.files, 'files');
  var myJSON = JSON.stringify(req.files[0].filename);
  console.log("file name " + myJSON);
  if (typeof myJSON != 'undefined') {
    var obj = new Object();
    obj.status = true;
    obj.message = "File Uploaded Successfully";
    obj.type = req.files[0].mimetype;
    var fileExtensionArray = (req.files[0].mimetype).split("/");
    var fileExtension = fileExtensionArray[1];
    res.json(obj);
  } else {
    var obj = new Object();
    obj.status = false;
    obj.message = "Error while uploading file try again";
    res.json(obj);
  }
  res.end();
});
It works fine locally, but when I upload the code to the server and hit it through the API, the server stops with the following logs:
Change detected on path public/uploads/b690b296bfde62eb8ff527328bc8b463 for app www - restarting
PM2 | Stopping app:www id:0
As the log says, a change was detected, but I am not able to find any image.
I have searched Stack Overflow and Google, but I cannot find a tutorial or help for doing this.
I keep finding help regarding S3, but I don't want to upload to it.
Is it not possible to upload an image to the AWS server like this and do I have to use S3, or is something wrong with my code?
Edit: Now I am able to get the image to the destination folder, but I'm still not able to return the response.
You're likely running PM2 in watch mode, a feature intended for development that restarts your application whenever its files change. It is not intended for production use.
To fix this, if you're starting pm2 from the CLI, use the option:
--no-autorestart
or, in the YAML config:
autorestart: false
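For reference, a minimal sketch of a PM2 process file in YAML carrying that setting (the app name and script path below are placeholders, not from the question):
apps:
  - name: www              # placeholder app name
    script: ./bin/www      # placeholder entry point
    autorestart: false     # the option suggested above
    watch: false           # if the restarts come from watch mode, disabling watch also helps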

Retrieving a Firebase storage image link via Cloud Function

How do I retrieve the download links to stored images in Firebase via a cloud function?
I've tried all kinds of variations, including the following one:
exports.getImgs = functions.https.onRequest((req, res) => {
  var storage = require('@google-cloud/storage')();
  var storageRef = storage.ref;
  console.log(storageRef);
  storageRef.child('users/user1/avatar.jpg').getDownloadURL().then(function(url) {
  });
});
It annoyed me, so I will put the solution here with a straightforward explanation for those who are looking for it.
First, install the Google Cloud Storage package from the command line (inside your functions folder):
npm install --save @google-cloud/storage
Cloud function code:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
const bucket = gcs.bucket('name-of-bucket.appspot.com');
const file = bucket.file('users/user1/avatar.jpg');

return file.getSignedUrl({
  action: 'read',
  expires: '03-09-2491'
}).then(signedUrls => {
  console.log('signed URL', signedUrls[0]); // this will contain the picture's url
});
The name of your bucket can be found in the Firebase console under the Storage section.
The 'service-account.json' file can be created and downloaded from here:
https://console.firebase.google.com/project/_/settings/serviceaccounts/adminsdk
It should be stored locally in your Firebase project, under the functions folder (or elsewhere, as long as you change the path in the code above).
That's it.

Get source url for image stored in Bluemix Object Storage container using Node.js app

I have an Object Storage instance on Bluemix where I am storing images in a container. I need a source URL for the images stored there so that I can use them. To do this, I'm thinking of creating a Node.js app with a POST endpoint that takes the name of an image in Object Storage as the request and returns the image URL as the response.
Is this possible? If so, can anyone suggest any npm modules that provide this functionality? If not, are there other suggestions for getting the URL of an image?
Any help is appreciated. Thanks!
Start the server with the command node app.js. You also need the pkgcloud package to perform this operation. You can get the Object Storage credentials by creating a key in the IBM console inside the Storage module.
Inside app.js, insert a new route for download:
var objectStorageHandler = require("./lib/objectStorageHandler.js");

app.get('/download', function(req, res) {
  (new objectStorageHandler()).download('YourContainerName', 'imagenamewithextension', function(download) {
    console.log(res);
    download.pipe(res);
  });
});
Inside the lib folder, create a module named objectStorageHandler.js.
Inside objectStorageHandler.js, write the following code:
var pkgcloud = require('pkgcloud');

var objectStorageHandler = function() {
};

objectStorageHandler.prototype.download = function(container, file, callback) {
  var config = {
    provider: 'openstack',
    useServiceCatalog: true,
    useInternal: false,
    keystoneAuthVersion: 'v3',
    authUrl: 'https://identity.open.softlayer.com',
    tenantId: 'YOURPROJECTID', // projectId from credentials
    domainId: 'YOURDOMAINID',
    username: 'YOURUSERNAME',
    password: 'YOURPASSWORD',
    region: 'dallas' // dallas or london region
  };

  var client = pkgcloud.storage.createClient(config);

  client.auth(function(error) {
    if (error) {
      console.error("Authorization error for storage client (pkgcloud): ", error);
    } else {
      var request = client.download({
        container: container,
        remote: file
      });
      callback(request);
    }
  });
};

module.exports = objectStorageHandler;
After the server has started (let's assume on port 3000), simply call localhost:3000/download, which will download the image. We can also pass the image name as a parameter to download images dynamically, as sketched below.
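For example, a minimal sketch of such a parameterized route (the route path and parameter name are assumptions, not from the original answer):
app.get('/download/:imageName', function(req, res) {
  // e.g. GET /download/photo.jpg pipes photo.jpg from the container to the response
  (new objectStorageHandler()).download('YourContainerName', req.params.imageName, function(download) {
    download.pipe(res);
  });
});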
