application/octet-stream issue while using google moderate images trigger (blur image) - node.js

I'm using the moderate-images solution trigger from Google.
I took this solution from here.
I asked someone to upgrade this solution for me, and here is the code:
'use strict'
const gm = require('gm').subClass({ imageMagick: true })
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp()
const Vision = require('@google-cloud/vision')
const vision = new Vision.ImageAnnotatorClient()
const path = require('path')
const fs = require('fs')
const { Storage } = require('@google-cloud/storage')
const gcs = new Storage({
  projectId: 'xxxxxxxxxxx',
})
exports.blurOffensiveImages = functions.storage
  .object()
  .onFinalize(async (object) => {
    const file = gcs.bucket(object.bucket).file(object.name)
    const filePath = `gs://${object.bucket}/${object.name}`
    console.log(`Analyzing ${file.name}.`)
    try {
      const [result] = await vision.safeSearchDetection(filePath)
      const detections = result.safeSearchAnnotation || {}
      if (
        detections.adult === 'VERY_LIKELY' ||
        detections.violence === 'VERY_LIKELY'
      ) {
        console.log(`Detected ${file.name} as inappropriate.`)
        await blurImage(file, object.bucket, object.metadata)
        console.log('Blurred and re-uploaded', file.name)
        return null
      } else {
        console.log(`Detected ${file.name} as OK.`)
      }
    } catch (err) {
      console.error(`Failed to analyze ${file.name}.`, err)
      throw err
    }
  })
async function blurImage(file, bucketName, metadata) {
  const tempLocalPath = `/tmp/${path.parse(file.name).base}`
  const bucket = gcs.bucket(bucketName)
  await file.download({ destination: tempLocalPath })
  console.log('The file has been downloaded to', tempLocalPath)
  // Blur the image using ImageMagick.
  await new Promise((resolve, reject) => {
    gm(tempLocalPath)
      .blur(0, 20)
      .write(tempLocalPath, (err, stdout) => {
        if (err) {
          console.error('Failed to blur image.', err)
          reject(err)
        } else {
          console.log(`Blurred image: ${file.name}`)
          resolve(stdout)
        }
      })
  })
  console.log('Blurred image created at', tempLocalPath)
  await bucket.upload(tempLocalPath, {
    destination: file.name,
    metadata: { metadata: metadata },
  })
  console.log('Blurred image uploaded to Storage at', file.name)
  // Delete the temporary file so it doesn't linger in the function instance.
  fs.unlinkSync(tempLocalPath)
}
And it worked perfectly, with one bad issue.
Sometimes when a user uploads a list of photos, I get files with the "application/octet-stream" content type, but it should be "image/jpeg"; all media files in my project should be image/jpeg.
(screenshot: one user's post where the image got the wrong content type)
It looks like this trigger gets stuck while executing.
I added a delay to the image uploads in my project, but it doesn't help.
I tested it: when I delete this trigger, all photos upload fine with no issues at all.
Help me fix it.
P.S. I also want to note that after uploading, the image should keep all the same data as the original (destination, name, etc.).
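A likely culprit, for what it's worth: when the blurred file is re-uploaded with bucket.upload() and no explicit contentType, Cloud Storage can fall back to application/octet-stream. A minimal sketch of passing the original type through (the uploadBlurred helper and its parameters are illustrative additions, not part of the code above; the content type would come from object.contentType in the onFinalize event):
const { Storage } = require('@google-cloud/storage')
const gcs = new Storage()
// Sketch: re-upload the blurred file with the original content type so
// Cloud Storage does not fall back to application/octet-stream.
async function uploadBlurred(tempLocalPath, bucketName, destination, contentType, customMetadata) {
  await gcs.bucket(bucketName).upload(tempLocalPath, {
    destination: destination,
    metadata: {
      contentType: contentType, // e.g. 'image/jpeg', from object.contentType
      metadata: customMetadata, // custom key/value metadata, preserved as before
    },
  })
}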

Related

Resizing Images already on Google Storage locally through SHARPJS, and keeping/updating the downloadUrl

There were similar questions and answers, but none recently and none with these exact requirements.
I have many pictures for a dating app on Firebase Storage, uploaded by the users, with a downloadUrl saved in Firestore. I just noticed they are saved as very big pictures, which slows down loading for the users. Result: I need to resize and reformat to JPEG all the pictures on Firebase Storage.
My research and trials over the past 2 months brought me to the following conclusions:
It's not possible through Cloud Functions, as the 9-minute execution limit is too short to do the whole resizing.
Sharp is the best library for this, but it's better to run it locally.
I can use gsutil, as in this question here, to download all pictures while keeping their paths, resize them, and upload them back later.
I was blocked on figuring out how to resize/reformat with Sharp: since the name will be different and the metadata probably stripped, how can I upload the files back and at the same time get a new downloadUrl, so that I can in turn save it to the users collection in Firestore?
MY POTENTIAL SOLUTION (STEP 4):
Not sure if it will work, but I'd have a function listening for changed (finalized) objects, getting the info from the image, and writing it back to Firestore using a self-made downloadUrl.
MY NEW QUESTION: Is it going to work? I'm afraid of breaking the pictures of all my users...
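One hedged note before the steps: the self-made downloadUrl in step 4 only resolves if the object actually carries a firebaseStorageDownloadTokens metadata entry, and a plain gsutil cp re-upload does not add one. A sketch of setting a token explicitly per file (the uploadWithToken helper and the uuid dependency are assumptions for illustration):
const { Storage } = require('@google-cloud/storage')
const { v4: uuidv4 } = require('uuid')
const storage = new Storage()
// Sketch: upload with an explicit download token so the
// firebasestorage.googleapis.com URL built in step 4 actually resolves.
async function uploadWithToken(localPath, bucketName, destination) {
  const token = uuidv4()
  await storage.bucket(bucketName).upload(localPath, {
    destination: destination,
    metadata: {
      contentType: 'image/jpeg',
      metadata: { firebaseStorageDownloadTokens: token },
    },
  })
  return token
}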
For your better understanding, here is my process so far:
1. Download Images
gsutil cp -r gs://my-bucket/data [path where you want to download]
2. Script (TypeScript) to resize/reformat them.
import * as fs from "fs";
import sharp from "sharp";
import * as path from "path";
const walk = (dir: string, done) => {
  let results = [];
  fs.readdir(dir, (err, list) => {
    if (err) return done(err);
    let i = 0;
    (function next() {
      let file = list[i++];
      if (!file) return done(null, results);
      file = path.resolve(dir, file);
      fs.stat(file, (err, stat) => {
        if (stat && stat.isDirectory()) {
          walk(file, (err, res) => {
            results = results.concat(res);
            next();
          });
        } else {
          results.push(file);
          next();
        }
      });
    })();
  });
};
const reformatImage = async (filesPaths: string[]) => {
  let newFilesPaths: string[] = [];
  await Promise.all(
    filesPaths.map(async (filePath) => {
      let newFileName = changeExtensionName(filePath);
      let newFilePath = path.join(path.dirname(filePath), newFileName);
      if (filePath === newFilePath) {
        newFileName = "rszd-" + newFileName;
        newFilePath = path.join(path.dirname(filePath), newFileName);
      }
      newFilesPaths.push(newFilePath);
      try {
        await sharp(filePath)
          .withMetadata()
          .resize(600, 800, {
            fit: sharp.fit.inside,
          })
          .toFormat("jpeg")
          .jpeg({
            mozjpeg: true,
            force: true,
          })
          .toFile(newFilePath)
          .then(async (info) => {
            console.log("converted file...", info);
          })
          .catch((error) => {
            console.log("sharp error: ", error);
          });
      } catch (error) {
        console.error("error converting...", error);
      }
    })
  );
  console.log("THIS IS THE RESIZED IMAGES");
  console.log(newFilesPaths);
};
const changeExtensionName = (filePath: string) => {
  const ext = path.extname(filePath || "");
  const virginName = path.basename(filePath, ext);
  const newName = virginName + ".jpg";
  return newName;
};
walk("./xxxxxx.appspot.com", (err, results) => {
  if (err) throw err;
  console.log("THIS IS THE DOWNLOADED IMAGES");
  console.log(results);
  reformatImage(results);
});
3. Re-upload the files
gsutil cp -r [path your images] gs://my-bucket/data
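As an aside (and this is an assumption about your bucket layout, not something verified here): since step 2 re-encodes everything to JPEG, it may be worth having gsutil stamp the content type explicitly during the re-upload, e.g.:
gsutil -m -h "Content-Type:image/jpeg" cp -r [path your images] gs://my-bucket/data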
4. Listen for new file updates through a Firebase Function, and update the downloadUrl
export const onOldImageResizedUpdateDowloadUrl = functions.storage
  .object()
  .onFinalize(async (object: any) => {
    if (object) {
      functions.logger.log('OBJECT: ', object);
      const fileBucket = object.bucket;
      const filePath: string = object.name;
      const userId = path.basename(path.dirname(filePath));
      const fileName = path.basename(filePath);
      const isResized = fileName.startsWith('rszd-');
      if (!isResized) { return; }
      const token = object.metadata.firebaseStorageDownloadTokens;
      const downloadUrl = createDownloadUrl(
        fileBucket,
        token,
        userId,
        fileName
      );
      const pictureId = 'picture' + fileName.charAt(5); // pictures are named e.g. "rszd-" + "1.jpeg"
      await admin
        .firestore()
        .collection('users')
        .doc(userId)
        .update({ [pictureId]: downloadUrl });
    }
  });
function createDownloadUrl(
  bucketPath: string,
  downloadToken: string,
  uid: string,
  fileName: string) {
  return `https://firebasestorage.googleapis.com/v0/b/${bucketPath}/o/pictures-profil%2F${uid}%2F${fileName}?alt=media&token=${downloadToken}`;
}

Get progress of firebase admin file upload

I'm trying to get the progress of a 1-minute video uploading to a Firebase Storage bucket using the Admin SDK. I've seen a lot about using firebase.storage().ref.child....., but I'm unable to do that with the Admin SDK since it doesn't have the same functions. This is my file upload:
exports.uploadMedia = (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');
  const busboy = new BusBoy({ headers: req.headers, limits: { files: 1, fileSize: 200000000 } });
  let mediaFileName;
  let mediaToBeUploaded = {};
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    if (mimetype !== 'image/jpeg' && mimetype !== 'image/png' && mimetype !== 'video/quicktime' && mimetype !== 'video/mp4') {
      console.log(mimetype);
      return res.status(400).json({ error: 'Wrong file type submitted, only .png, .jpeg, .mov, and .mp4 files allowed' })
    }
    // my.image.png
    const imageExtension = filename.split('.')[filename.split('.').length - 1];
    // 43523451452345231234.png
    mediaFileName = `${Math.round(Math.random() * 100000000000)}.${imageExtension}`;
    const filepath = path.join(os.tmpdir(), mediaFileName);
    mediaToBeUploaded = { filepath, mimetype };
    file.pipe(fs.createWriteStream(filepath));
    file.on('limit', function () {
      fs.unlink(filepath, function () {
        return res.json({ 'Error': 'Max file size is 200 Mb, file size too large' });
      });
    });
  });
  busboy.on('finish', () => {
    admin
      .storage()
      .bucket()
      .upload(mediaToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          // contentType belongs at the top level of metadata,
          // not nested inside the custom metadata map.
          contentType: mediaToBeUploaded.mimetype
        }
      })
      .then(() => {
        const mediaUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${mediaFileName}?alt=media`;
        return res.json({ mediaUrl: mediaUrl });
      })
      .catch((err) => {
        console.error(err);
        return res.json({ 'Error': 'Error uploading media' });
      });
  });
  req.pipe(busboy);
}
This works okay right now, but the only problem is that the user can't see how far along their 1- or 2-minute video upload is. Currently it's just an activity indicator, and the user sits there waiting without any feedback. I'm using React Native on the frontend, if that helps. Would appreciate any help!
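For what it's worth, the Admin SDK's upload() exposes no progress events, but progress can be approximated server-side by streaming the temp file into the bucket and counting bytes. A rough sketch (uploadWithProgress, totalSize, and onProgress are hypothetical names, not Admin SDK APIs):
const fs = require('fs');
const admin = require('firebase-admin');
// Sketch: count bytes flowing from the local temp file into the
// bucket's write stream to approximate upload progress.
function uploadWithProgress(localPath, destination, totalSize, onProgress) {
  return new Promise((resolve, reject) => {
    const file = admin.storage().bucket().file(destination);
    let transferred = 0;
    fs.createReadStream(localPath)
      .on('error', reject)
      .on('data', (chunk) => {
        transferred += chunk.length;
        onProgress(transferred / totalSize); // fraction between 0 and 1
      })
      .pipe(file.createWriteStream({ resumable: false }))
      .on('error', reject)
      .on('finish', resolve);
  });
}
The catch is that the server still needs a channel (polling, a socket, etc.) to surface that number to the client, which is why the client-side route below is usually simpler.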
I was able to implement it much more easily on the client side, and it works perfectly for both image and video upload progress. On the backend I was using the Admin SDK, but on the frontend I was originally using the Firebase client SDK.
this.uploadingMedia = true;
const imageExtension = this.mediaFile.split('.')[this.mediaFile.split('.').length - 1];
const mediaFileName = `${Math.round(Math.random() * 100000000000)}.${imageExtension}`;
const response = await fetch(this.mediaFile);
const blob = await response.blob();
const storageRef = storage.ref(`${mediaFileName}`).put(blob);
storageRef.on(`state_changed`, snapshot => {
  this.uploadProgress = (snapshot.bytesTransferred / snapshot.totalBytes);
}, error => {
  this.error = error.message;
  this.submitting = false;
  this.uploadingMedia = false;
  return;
},
async () => {
  storageRef.snapshot.ref.getDownloadURL().then(async (url) => {
    imageUrl = [];
    videoUrl = [url];
    this.uploadingMedia = false;
    this.submitPost(imageUrl, videoUrl);
  });
});
export const uploadFile = (
  folderPath,
  fileName,
  file,
  generateDownloadURL = true,
  updateInformationUploadProgress
) => {
  return new Promise((resolve, reject) => {
    try {
      const storageRef = firebaseApp.storage().ref(`${folderPath}/${fileName}`)
      const uploadTask = storageRef.put(file)
      uploadTask.on(
        'state_changed',
        snapshot => {
          if (updateInformationUploadProgress) {
            const progress =
              (snapshot.bytesTransferred / snapshot.totalBytes) * 100
            updateInformationUploadProgress({
              name: fileName,
              progress: progress,
            })
          }
        },
        error => {
          console.log('upload error: ', error)
          reject(error)
        },
        () => {
          if (generateDownloadURL) {
            uploadTask.snapshot.ref
              .getDownloadURL()
              .then(url => {
                resolve(url)
              })
              .catch(error => {
                console.log('url error: ', error.message)
                reject(error)
              })
          } else {
            resolve(uploadTask.snapshot.metadata.fullPath)
          }
        }
      )
    } catch (error) {
      reject(error)
    }
  })
}
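For completeness, a usage sketch of the helper above (the file variable and callback are made-up examples):
uploadFile('videos', 'intro.mp4', file, true, ({ name, progress }) => {
  console.log(`${name}: ${progress.toFixed(0)}%`) // e.g. "intro.mp4: 42%"
}).then((url) => {
  console.log('download URL:', url)
})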

configuring the image file reducer Google Firestore

The following Google Cloud Function properly uploads an image, but I would also like to compress the image to avoid unnecessary charges from large files being uploaded. I am using the image resizer extension from Firebase and it works, but the issue is that the image file no longer shows up in my user table. Is there something I need to configure in the extension so that the image URL in the user table is overwritten with the reduced image?
exports.uploadImage = (req, res) => {
  const BusBoy = require("busboy")
  const path = require("path")
  const os = require("os")
  const fs = require("fs")
  const busboy = new BusBoy({ headers: req.headers })
  let imageToBeUploaded = {}
  let imageFileName
  busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
    if (mimetype !== `image/jpeg` && mimetype !== `image/png`) {
      return res.status(400).json({ error: `Not an acceptable file type` })
    }
    // my.image.png => ['my', 'image', 'png']
    const imageExtension = filename.split(".")[filename.split(".").length - 1]
    // 32756238461724837.png
    imageFileName = `${Math.round(
      Math.random() * 1000000000000
    ).toString()}.${imageExtension}`
    const filepath = path.join(os.tmpdir(), imageFileName)
    imageToBeUploaded = { filepath, mimetype }
    file.pipe(fs.createWriteStream(filepath))
  })
  busboy.on("finish", () => {
    admin
      .storage()
      .bucket(config.storageBucket)
      .upload(imageToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          // contentType belongs at the top level of metadata,
          // not nested inside the custom metadata map.
          contentType: imageToBeUploaded.mimetype
        }
      })
      .then(() => {
        const imageUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imageFileName}?alt=media`
        return db.doc(`/users/${req.user.uid}`).update({ imageUrl })
      })
      .then(() => {
        return res.json({ message: "image uploaded successfully" })
      })
      .catch(err => {
        console.error(err)
        return res.status(500).json({ error: "something went wrong" })
      })
  })
  busboy.end(req.rawBody)
}
The Resize Images extension only handles the resizing of the image in Cloud Storage. It does not update data in any other location, including Cloud Firestore. If you need such functionality, you'll need to build it yourself.
Also see:
this discussion on the extension's open-source repo about allowing a callback to be invoked after resizing.
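A minimal sketch of building that yourself, with two loud assumptions to verify against your extension config: that resized files keep the default naming (a size suffix before the extension, e.g. _200x200) and that the users documents store the original URL in imageUrl, as in the question's code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
// Sketch: when the Resize Images extension writes a resized file,
// swap any matching user imageUrl over to the resized version.
exports.onImageResized = functions.storage.object().onFinalize(async (object) => {
  const match = object.name.match(/^(.+)_200x200(\.\w+)$/); // default suffix naming; verify
  if (!match) return null; // not a resized file
  const originalName = match[1] + match[2];
  const base = `https://firebasestorage.googleapis.com/v0/b/${object.bucket}/o`;
  const originalUrl = `${base}/${originalName}?alt=media`;
  const resizedUrl = `${base}/${encodeURIComponent(object.name)}?alt=media`;
  const users = await admin.firestore().collection('users')
    .where('imageUrl', '==', originalUrl).get();
  await Promise.all(users.docs.map((doc) => doc.ref.update({ imageUrl: resizedUrl })));
  return null;
});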

Deleting original image once the image url is updated

I am using Firebase, and as you can see in the code, I am updating the user's image URL that is stored in the user table. Is there a way to delete the old image file that's still in my storage bucket once the image is updated?
exports.uploadImage = (req, res) => {
  const BusBoy = require("busboy")
  const path = require("path")
  const os = require("os")
  const fs = require("fs")
  const busboy = new BusBoy({ headers: req.headers })
  let imageToBeUploaded = {}
  let imageFileName
  busboy.on("file", (fieldname, file, filename, encoding, mimetype) => {
    if (mimetype !== `image/jpeg` && mimetype !== `image/png`) {
      return res.status(400).json({ error: `Not an acceptable file type` })
    }
    // my.image.png => ['my', 'image', 'png']
    const imageExtension = filename.split(".")[filename.split(".").length - 1]
    // 32756238461724837.png
    imageFileName = `${Math.round(
      Math.random() * 1000000000000
    ).toString()}.${imageExtension}`
    const filepath = path.join(os.tmpdir(), imageFileName)
    imageToBeUploaded = { filepath, mimetype }
    file.pipe(fs.createWriteStream(filepath))
  })
  busboy.on("finish", () => {
    admin
      .storage()
      .bucket(config.storageBucket)
      .upload(imageToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          // contentType belongs at the top level of metadata,
          // not nested inside the custom metadata map.
          contentType: imageToBeUploaded.mimetype
        }
      })
      .then(() => {
        const imageUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imageFileName}?alt=media`
        return db.doc(`/users/${req.user.uid}`).update({ imageUrl })
      })
      .then(() => {
        return res.json({ message: "image uploaded successfully" })
      })
      .catch(err => {
        console.error(err)
        return res.status(500).json({ error: "something went wrong" })
      })
  })
  busboy.end(req.rawBody)
}
Any suggestions would be greatly appreciated
The best way to achieve that is with a Cloud Function that runs once a new photo upload has completed.
I would recommend you take a look at the article Automatically delete your Firebase Storage Files from Firestore with Cloud Functions for Firebase for more information on how to perform these automatic deletions with Cloud Functions. Besides that, this post from the Community shows it done in Node.js.
These other two posts from the Community offer more ideas and insights for achieving this goal:
Firebase function (written in NodeJS) to delete file from Cloud Storage when an object is removed from Realtime Database
Firebase Storage - How to delete file from storage with node.js?
Let me know if the information helped you!
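If it helps, here is a minimal sketch of one way to do it inside the question's existing flow, assuming the stored URLs follow the firebasestorage.googleapis.com/v0/b/<bucket>/o/<object> shape used above (db, admin, and config are the same objects as in the question's code; deleteOldImage is an illustrative helper):
// Sketch: look up the user's current imageUrl, recover the object
// name from it, and delete that object before saving the new URL.
const deleteOldImage = async (uid) => {
  const snap = await db.doc(`/users/${uid}`).get();
  const oldUrl = snap.get('imageUrl');
  if (!oldUrl) return;
  const objectName = decodeURIComponent(new URL(oldUrl).pathname.split('/o/')[1]);
  await admin.storage().bucket(config.storageBucket).file(objectName).delete();
};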

Upload a file to Google Cloud, in a specific directory

How to upload a file on Google Cloud, in a specific bucket directory (e.g. foo)?
"use strict";
const gcloud = require("gcloud");
const PROJECT_ID = "<project-id>";
let storage = gcloud.storage({
projectId: PROJECT_ID,
keyFilename: 'auth.json'
});
let bucket = storage.bucket(`${PROJECT_ID}.appspot.com`)
bucket.upload("1.jpg", (err, file) => {
if (err) { return console.error(err); }
let publicUrl = `https://firebasestorage.googleapis.com/v0/b/${PROJECT_ID}.appspot.com/o/${file.metadata.name}?alt=media`;
console.log(publicUrl);
});
I tried:
bucket.file("foo/1.jpg").upload("1.jpg", ...)
But there's no upload method there.
How can I send 1.jpg to the foo directory?
In Firebase, on the client side, I do:
ref.child("foo").put(myFile);
bucket.upload("1.jpg", { destination: "YOUR_FOLDER_NAME_HERE/1.jpg" }, (err, file) => {
//Do something...
});
This will put 1.jpg in the YOUR_FOLDER_NAME_HERE folder.
Here is the documentation. By the way, gcloud is deprecated and you should use google-cloud instead.
UPDATE 2020
According to the Google documentation:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage()
const bucket = storage.bucket('YOUR_GCLOUD_STORAGE_BUCKET')
const blob = bucket.file('youFolder/' + 'youFileName.jpg')
const blobStream = blob.createWriteStream({
  resumable: false,
  gzip: true,
  public: true
})
blobStream.on('error', (err) => {
  console.log('Error blobStream: ', err)
});
blobStream.on('finish', () => {
  // The public URL can be used to directly access the file via HTTP.
  const publicUrl = ('https://storage.googleapis.com/' + bucket.name + '/' + blob.name)
  res.status(200).send(publicUrl);
});
blobStream.end(req.file.buffer) // req.file is your original file
Here you go...
const options = {
  destination: 'folder/new-image.png',
  resumable: true,
  validation: 'crc32c',
  metadata: {
    metadata: {
      event: 'Fall trip to the zoo'
    }
  }
};
bucket.upload('local-image.png', options, function(err, file) {
  // Your bucket now contains:
  // - "folder/new-image.png" (with the contents of `local-image.png`)
  // `file` is an instance of a File object that refers to your new file.
});
If you're accessing the bucket from the same project, projectId, keyFilename, etc. are not required. I use the code below for both upload and download; it works fine.
// Imports the Google Cloud client library
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
var destFilename = "./test";
var bucketName = 'cloudtesla';
var srcFilename = 'test';
const options = {
  destination: destFilename,
};
// Upload file
console.log("upload Started");
storage.bucket(bucketName).upload(srcFilename, {}, (err, file) => {
  if (!err)
    console.log("upload Completed");
  else
    console.log(err);
});
// Download file
console.log("Download Started");
storage
  .bucket(bucketName)
  .file(srcFilename)
  .download(options)
  .then(() => {
    console.log("Download Completed");
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
To upload inside a specific directory in .NET Core, use
var uploadResponse = await storageClient.UploadObjectAsync(bucketName, $"{foldername}/" + fileName, null, memoryStream);
This should upload your file 'fileName' inside the folder 'foldername' in the bucket.
I think just adding foo/ to the file name should work, like bucket.upload("foo/1.jpg", (err, file) => ...). In GCS, directories are just a matter of having a '/' in the file name.
If you want to use async/await while uploading files into storage buckets, callbacks won't do the job. Here's how I did it:
async function uploadFile() {
  // newFilePath is the local path of the file to upload
  const destPath = 'PATH_TO_STORAGE/filename.extension';
  await storage.bucket("PATH_TO_YOUR_BUCKET").upload(newFilePath, {
    gzip: true,
    destination: destPath,
  });
}
Hope it helps someone!
