I have a simple trigger set on file upload to Firebase. It reads the uploaded file, processes it and saves the results to the database. It works for a while, then it crashes and stops working. Usually re-uploading the function helps. Does anybody have an idea what the reason might be? Am I running out of memory, or ... ?
Here is the code:
const {Storage} = require('@google-cloud/storage');
const path = require('path');
const storage = new Storage();
exports.processLogs = functions
.region('europe-west1')
.storage
.object()
.onFinalize(async (object) => {
const filename = path.basename(object.name);
const bucket = storage.bucket(object.bucket);
try {
await bucket.file(filename).download(async (err, contents) => {
if (err) {
console.log('error', err);
return null
}
//Proces file and store into db
// (...)
bucket.file(filename).delete();
});
} catch(e){
console.log('error',e)
}
});
The error I am getting is:
Anonymous caller does not have storage.objects.get access to project-name.appspot.com/CrTwBuyNR2-log-2020-1-16-12-18.csv.
thanks
As you are using Firebase, I recommend initialising the bucket from firebase-admin instead of @google-cloud/storage directly. This sorts out permissions, as security rules are skipped for the Admin SDK.
You have also incorrectly mixed the callback and async/await APIs in your code. Because this code runs in a Cloud Function, I recommend using Promises and async/await exclusively.
The code below is a rewrite with the following changes:
Code has been split into logical blocks
No callback API usage (see File#download)
Each block will log and throw errors separately for easier debugging
One-line log messages (i.e. no stack trace)
Leaves full error logging to Cloud Functions (makes finding erroneous runs easier)
const admin = require('firebase-admin');
const functions = require('firebase-functions');
admin.initializeApp();
exports.processLogs = functions.region('europe-west1').storage.object()
.onFinalize(async (object) => {
const bucketRef = admin.storage().bucket(object.bucket);
const fileRef = bucketRef.file(object.name);
console.log('Processing "' + object.id + '"...');
// 1) DOWNLOAD
let [contents] = await fileRef.download()
.catch((err) => {
console.log('DOWNLOAD FAILED: ', (err.code ? err.code + ': ' : '') + err.message);
throw err;
});
// 2) PARSE
let dataToUpload = {};
try {
// Transform file contents
dataToUpload = JSON.parse(contents);
} catch (err) {
console.log('PARSE FAILED: ', (err.code ? err.code + ': ' : '') + err.message);
throw err;
}
// 3) DATABASE SET
const dbRef = admin.database().ref('path/to/data');
await dbRef.set(dataToUpload)
.catch((err) => {
console.log('DATABASE SET FAILED: ', (err.code ? err.code + ': ' : '') + err.message);
throw err;
});
// 4) CLEANUP
await fileRef.delete()
.catch((err) => {
console.log('CLEANUP FAILED: ', (err.code ? err.code + ': ' : '') + err.message);
throw err;
});
// 5) LOG SUCCESS
console.log('SUCCEEDED');
});
The log messages above can also be bundled into a helper function if desired:
function logAndRethrowError(err, name) {
console.log((name || 'ERROR') + ': ', ((err.code ? err.code + ': ' : '') + err.message) || err);
throw err;
}
// Usage:
let [contents] = await fileRef.download()
.catch(err => logAndRethrowError(err, 'DOWNLOAD FAILED'));
try {
// ...
} catch (err) { logAndRethrowError(err, 'PARSE FAILED') }
This looks like a problem with Security Rules not being set properly. Please have a look at the Firebase Security Rules documentation.
For starters try this:
service firebase.storage {
match /b/{bucket}/o {
match /{allPaths=**} {
allow read, write: if true;
}
}
}
And if it proves successful, set your Storage rules properly.
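For example, a commonly used tightened rule (just a sketch; adapt the condition to your own auth model) only allows authenticated users to read and write:
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}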
Related
I am new to Node.js, and I have done a lot of research but couldn't find any solution. This is my code below:
if(msg.body == 'Track ' + slug){
const str = msg.body;
const slug = str.substring(str.indexOf("Track") + 6); // 01-2020
var http = require('https');
var options = {
host: 'example.com',
path: '/example/example?id=' + slug
};
callback = function(response) {
var str = '';
response.on('data', function (chunk) {
str += chunk;
});
response.on('error', (err) => {
msg.reply('Error!');
})
response.on('end', function () {
var jsonObject = JSON.parse(str);
msg.reply('Current status of ' + slug + ': ' + jsonObject[0]['body']);
});
}
http.request(options, callback).end();
}
So if I enter Track plus a value, the value is captured and then sent in the JSON request. The JSON request works fine unless there is an error, in which case the app crashes, and that is a big problem. If I enter a wrong value, the app crashes with undefined in the log. I want it to msg.reply the error instead of crashing. Please help me out. Thank you in advance.
As commenter Joe says, you should wrap your response code in a try/catch block. That will let you reply with the error message properly instead of crashing.
try {
.. code you're expecting to hopefully not crash ..
} catch (error) {
msg.reply('Error!', error);
}
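Applied to the request handler from the question, it could look roughly like this (a sketch; msg, slug and options are assumed to be the same objects as in the original snippet):
const https = require('https');

https.request(options, (response) => {
  let str = '';
  response.on('data', (chunk) => {
    str += chunk;
  });
  response.on('error', () => {
    msg.reply('Error!');
  });
  response.on('end', () => {
    try {
      // Both JSON.parse on a bad body and jsonObject[0] being undefined
      // end up in the catch block instead of crashing the process.
      const jsonObject = JSON.parse(str);
      msg.reply('Current status of ' + slug + ': ' + jsonObject[0]['body']);
    } catch (error) {
      msg.reply('Error! Could not read a status for ' + slug);
    }
  });
}).on('error', () => {
  // network-level errors on the request itself
  msg.reply('Error!');
}).end();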
Here are some helpful error-handling examples for Node:
Handling Errors
try {
//lines of code
} catch (e) {
msg.reply('Error!');
console.log(e);
}
Handling uncaught exceptions
process.on('uncaughtException', err => {
console.error('There was an uncaught error', err)
process.exit(1) //mandatory (as per the Node.js docs)
})
Exceptions with promises
doSomething1()
.then(doSomething2)
.then(doSomething3)
.catch(err => console.error(err))
With async functions
async function someFunction() {
try {
await someOtherFunction()
} catch (err) {
console.error(err.message)
}
}
Learn more here https://nodejs.dev/learn/error-handling-in-nodejs
This is my connection code:
var sqlite3 = require('sqlite3').verbose()
const DBSOURCE = "db_path";
let db = new sqlite3.Database(DBSOURCE, (err) => {
if (err) {
console.error(err.message)
throw err
}else{
console.log('Connected to the SQLite database.');
}
});
module.exports = db
I found something here, but I don't know if it's the correct way to do it or how to make it work:
https://github.com/mapbox/node-sqlite3/wiki/Extensions#databaseloadextensionpath-callback
I tried this:
let params = [];
db.all(`select load_extension('./config/math.dll')`, params, (err, rows) => {
console.log(err);
if (err) {
res.status(400).json({ StatusCode: 400, error: err.message });
return;
}
console.log(rows)
})
and got this error: Error: SQLITE_ERROR: not authorized
I couldn't find any sources on loading extensions in Node.js.
Using SELECT load_extension(...) is disabled by default for security reasons. However, you can load an extension like this:
const db = new sqlite3.Database('db.sqlite3');
db.loadExtension('./lib/uuid.dll');
db.all('SELECT uuid()', (error, rows) => {
console.log(rows);
});
db.close();
Notice the file name uuid.dll is not accidental. In my case, the uuid extension contains an entry-point function called sqlite3_uuid_init, so the file must be called uuid.<extension> for it to work out of the box. You can read more in the SQLite documentation.
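Applied to the math extension from the question, a minimal sketch might look like the one below (it assumes the DLL's entry point follows the same naming convention, i.e. sqlite3_math_init, and sqrt() is just a hypothetical function your extension might export):
const sqlite3 = require('sqlite3').verbose();
const db = new sqlite3.Database('db_path');

// Database#loadExtension is not blocked the way SELECT load_extension(...) is.
db.loadExtension('./config/math.dll', (err) => {
  if (err) {
    console.error('Failed to load extension:', err.message);
    return;
  }
  // sqrt() is a placeholder; use whatever function your extension actually provides.
  db.all('SELECT sqrt(2) AS result', (err, rows) => {
    if (err) {
      console.error(err.message);
      return;
    }
    console.log(rows);
  });
});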
I've been working on an "artwall" bot with, basically, a write-to-.json-file command. I want the bot to get the first message attachment, get its URL and save it to save.json.
If there's an attachment present, everything works fine, but if the command was initiated with a URL or without any arguments at all, it gives this error:
TypeError: Cannot read property 'url' of undefined
Here's the command code:
const fs = require('fs');
// Export code for command.
module.exports = {
// In name type name of this command to execute it.
name: 'done',
// In description type description.
description: 'N/A',
// In execute() {...} circle brackets type execution parameters.
execute(client, message, args) {
// Type command code here.
const safety = JSON.parse(fs.readFileSync('./assets/save.json', 'utf8'));
const currentdeg = JSON.parse(fs.readFileSync('./assets/currentDEBUG.json', 'utf8'));
// const attache = JSON.parse(fs.readFileSync('./assets/attach.json', 'utf8'));
if(safety == 'no') {
if(currentdeg == 'not claimed') {
message.channel.send('The wall is not claimed yet. Claim it by using `wol claim`');
}
else if(currentdeg == message.author.id) {
const Attachment = (message.attachments).array();
console.log(Attachment);
if (Attachment == []) {
if (!args) {
message.reply('there\'s no image present. Make sure you attached one message or used url.');
return;
}
else {
fs.writeFile('assets/currentDEBUG.json', JSON.stringify('not claimed', null, 2), err => {
// Checking for errors
if (err) throw err;
console.log('Done writing (claim)');
});
fs.writeFile('assets/attach.json', JSON.stringify(args[0], null, 2), err => {
// Checking for errors
if (err) throw err;
console.log('Done writing (attache)');
});
}
}
// stuff
fs.writeFile('assets/currentDEBUG.json', JSON.stringify('not claimed', null, 2), err => {
// Checking for errors
if (err) throw err;
console.log('Done writing (claim)');
});
fs.writeFile('assets/attach.json', JSON.stringify(Attachment[0].url, null, 2), err => {
// Checking for errors
if (err) throw err;
console.log('Done writing (attache)');
});
}
else {
// stuff
message.channel.send('The artwall was claimed by someone else already. Wait for them to finish their work.');
}
}
else {
message.channel.send('The artwall is locked right now. Please wait for the next event!');
}
},
};
Thanks in advance!
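For what it's worth, a minimal guard against the missing-attachment case could look like this (a sketch, assuming discord.js, where message.attachments is a Collection and args is the array of command arguments, with fs required as in the command file above):
// Bail out early when there is neither an attachment nor a URL argument,
// and only touch .url when an attachment actually exists.
const attachment = message.attachments.first(); // undefined when nothing was attached

if (!attachment && args.length === 0) {
  message.reply('there\'s no image present. Make sure you attached one or used a url.');
  return;
}

// Prefer the attachment URL, otherwise fall back to the first argument.
const imageUrl = attachment ? attachment.url : args[0];
fs.writeFile('assets/attach.json', JSON.stringify(imageUrl, null, 2), err => {
  if (err) throw err;
  console.log('Done writing (attache)');
});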
I updated the function to create the CSV file but now I'm getting an error:
In upload function
internal/streams/legacy.js:57
throw er; // Unhandled stream error in pipe.
^
Error: ENOENT: no such file or directory, open 'C:\Users\shiv\WebstormProjects\slackAPIProject\billingData\CSV\1548963844106output.csv'
var csvFilePath = '';
var JSONFilePath = '';
function sendBillingData(){
var message = '';
axios.get(url, {
params: {
token: myToken
}
}).then(function (response) {
message = response.data;
fields = billingDataFields;
// saveFiles(message, fields, 'billingData/');
saveFilesNew(message, fields, 'billingData/');
var file = fs.createReadStream(__dirname + '/' + csvFilePath); // <--make sure this path is correct
console.log(__dirname + '/' + csvFilePath);
uploadFile(file);
})
.catch(function (error) {
console.log(error);
});
}
The saveFilesNew function is:
function saveFilesNew(message, options, folder){
try {
const passedData = message;
var relevantData='';
if (folder == 'accessLogs/'){
const loginsJSON = message.logins;
relevantData = loginsJSON;
console.log(loginsJSON);
}
if(folder == 'billingData/'){
relevantData = passedData.members;
const profile = passedData.members[0].profile;
}
//Save JSON to the output folder
var date = Date.now();
var directoryPath = folder + 'JSON/' + date + "output";
JSONFilePath = directoryPath + '.json';
fs.writeFileSync(JSONFilePath, JSON.stringify(message, null, 4), function(err) {
if (err) {
console.log(err);
}
});
//parse JSON onto the CSV
const json2csvParser = new Json2csvParser({ fields });
const csv = json2csvParser.parse(relevantData);
// console.log(csv);
//function to process the CSV onto the file
var directoryPath = folder + 'CSV/' + date + "output";
csvFilePath = directoryPath + '.csv';
let data = [];
let columns = {
real_name: 'real_name',
display_name: 'display_name',
email: 'email',
account_type: 'account_type'
};
var id = passedData.members[0].real_name;
console.log(id);
console.log("messageLength is" +Object.keys(message.members).length);
for (var i = 0; i < Object.keys(message.members).length; i++) {
console.log("value of i is" + i);
var display_name = passedData.members[i].profile.display_name;
var real_name = passedData.members[i].profile.real_name_normalized;
var email = passedData.members[i].profile.email;
var account_type = 'undefined';
console.log("name: " + real_name);
if(passedData.members[i].is_owner){
account_type = 'Org Owner';
}
else if(passedData.members[i].is_admin){
account_type = 'Org Admin';
}
else if(passedData.members[i].is_bot){
account_type = 'Bot'
}
else account_type = 'User';
data.push([real_name, display_name, email, account_type]);
}
console.log(data);
stringify(data, { header: true, columns: columns }, (err, output) => {
if (err) throw err;
fs.writeFileSync(csvFilePath, output, function(err) {
console.log(output);
if (err) {
console.log(err);
}
console.log('my.csv saved.');
});
});
} catch (err) {
console.error(err);
}
}
The upload file function is:
function uploadFile(file){
console.log("In upload function");
const form = new FormData();
form.append('token', botToken);
form.append('channels', 'testing');
form.append('file', file);
axios.post('https://slack.com/api/files.upload', form, {
headers: form.getHeaders()
}).then(function (response) {
var serverMessage = response.data;
console.log(serverMessage);
});
}
So I think the error is caused by Node trying to upload the file before it has been created. I feel like this has something to do with the asynchronous nature of Node.js, but I fail to comprehend how to rectify the code. Please let me know how to correct this, and mention any improvements to the code structure/design too.
Thanks!
You don't wait for the callback provided to stringify to be executed, and that's where you create the file. (Assuming this stringify function really does accept a callback.)
Using callbacks (you can make this cleaner with promises and these neat async/await controls, but let's just stick to callbacks here), it should be more like:
function sendBillingData() {
...
// this callback we'll use to know when the file writing is done, and to get the file path
saveFilesNew(message, fields, 'billingData/', function(err, csvFilePathArgument) {
// this we will execute when saveFilesNew calls it, not when saveFilesNew returns, see below
uploadFile(fs.createReadStream(__dirname + '/' + csvFilePathArgument))
});
}
// let's name this callback... "callback".
function saveFilesNew(message, options, folder, callback) {
...
var csvFilePath = ...; // local variable only instead of your global
...
stringify(data, { header: true, columns: columns }, (err, output) => {
if (err) throw err; // or return callback(err);
fs.writeFile(csvFilePath, output, function(err) { // NOT writeFileSync, or no callback would be needed
console.log(output);
if (err) {
console.log(err);
// callback(err); may be a useful approach for error-handling at a higher level
}
console.log('my.csv saved.'); // yes, NOW the CSV is saved, not before this executes! Hence:
callback(null, csvFilePath); // no error, clean process, pass the file path
});
});
console.log("This line is executed before stringify's callback is called!");
return; // implicitly, yes, yet still synchronous and that's why your version crashes
}
Using callbacks that are called only when the expected events happen (a file is done writing, a buffer/string is done transforming...) allows JS to keep executing code in the meantime. And it does keep executing code, so when you need data from async code, you need to tell JS that it must be done before your piece executes.
Also, since you can pass data when calling back (it's just a function), here I could avoid relying on a global csvFilePath. Using higher-level variables makes things monolithic; for example, you could not move saveFilesNew to a dedicated file where you keep your toolkit of file-related functions.
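For comparison, the same flow with promises/async-await (a sketch; it assumes stringify is csv-stringify, whose callback signature matches, and that url, myToken, billingDataFields and uploadFile are the same as in the original code):
const fs = require('fs');
const util = require('util');
const axios = require('axios');
const stringify = require('csv-stringify');

const stringifyAsync = util.promisify(stringify);

async function saveFilesNew(message, fields, folder) {
  const date = Date.now();

  // Save the raw JSON first, as in the original code.
  await fs.promises.writeFile(folder + 'JSON/' + date + 'output.json', JSON.stringify(message, null, 4));

  // ... build the `data` rows and `columns` map from message.members exactly as before ...
  const data = [];
  const columns = { real_name: 'real_name', display_name: 'display_name', email: 'email', account_type: 'account_type' };

  const csvFilePath = folder + 'CSV/' + date + 'output.csv';
  const output = await stringifyAsync(data, { header: true, columns: columns });
  await fs.promises.writeFile(csvFilePath, output); // resolves only once the CSV exists on disk
  return csvFilePath;
}

async function sendBillingData() {
  const response = await axios.get(url, { params: { token: myToken } });
  const csvFilePath = await saveFilesNew(response.data, billingDataFields, 'billingData/');
  uploadFile(fs.createReadStream(__dirname + '/' + csvFilePath));
}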
Finally, if your global process is like:
function aDayAtTheOffice() {
sendBillingData();
getCoffee();
}
then you don't need to wait for the billing data to be processed before starting to make coffee. However, if your boss told you that you could NOT get a coffee until the billing data was settled, then your process would look like:
function aDayAtTheOffice() {
sendBillingData(function (err) {
// if (err) let's do nothing here: you wanted a coffee anyway, right?
getCoffee();
});
}
(Note that callbacks having potential error as first arg and data as second arg is a convention, nothing mandatory.)
IMHO you should read about scope (the callback argument could be accessed at a time when the call to saveFilesNew was already done and forgotten!), and about the asynchronous nature of No... JavaScript. ;) (Sorry, probably not the best links, but they contain the meaningful keywords, and then Google is your buddy, your friend, your Big Brother.)
I have a Firebase (Google) Cloud Function as follows:
// Initialize the Auth0 client
var AuthenticationClient = require('auth0').AuthenticationClient;
var auth0 = new AuthenticationClient({
domain: 'familybank.auth0.com',
clientID: 'REDACTED'
});
function getAccountBalance(app) {
console.log('accessToken: ' + app.getUser().accessToken);
auth0.getProfile(app.getUser().accessToken, function (err, userInfo) {
if (err) {
console.error('Error getting userProfile from Auth0: ' + err);
}
console.log('getAccountBalance userInfo:' + userInfo)
let accountowner = app.getArgument(PARAM_ACCOUNT_OWNER);
// query firestore based on user
var transactions = db.collection('bank').doc(userInfo.email)
.db.collection('accounts').doc(accountowner)
.collection('transactions');
var accountbalance = transactions.get()
.then( snapshot => {
var workingbalance = 0
snapshot.forEach(doc => {
workingbalance = workingbalance + doc.data().amount;
});
app.tell(accountowner + " has a balance of $" + workingbalance)
})
.catch(err => {
console.log('Error getting transactions', err);
app.tell('I was unable to retrieve your balance at this time.')
});
});
}
actionMap.set(INTENT_ACCOUNT_BALANCE, getAccountBalance);
app.handleRequest(actionMap);
When this executes, I can see in the logs that parts of the function are being executed multiple times, and the second execution is failing. If I close out the auth0.getProfile call after logging userInfo, then the function works, but obviously doesn't have userInfo.
Any idea why parts of this function are executing multiple times and why some calls would fail?
The userInfo is undefined in the second execution because there has been an error (reported in the previous logged message). Your error block does not leave the function, so it continues to run with an invalid userInfo object.
But that doesn't explain why the callback is getting called twice - once with a valid userInfo and once with an err. The documentation (although not the example) for AuthenticationClient.getProfile() indicates that it returns a Promise (or undefined - although it doesn't say why it might return undefined), so I am wondering if this ends up calling the callback twice.
Since it returns a promise, you can omit the callback function and just handle it with something like this:
function getAccountBalance(app) {
let accountowner = app.getArgument(PARAM_ACCOUNT_OWNER);
console.log('accessToken: ' + app.getUser().accessToken);
var accessToken = app.getUser().accessToken;
auth0.getProfile( accessToken )
.then( userInfo => {
console.log('getAccountBalance userInfo:' + userInfo)
// query firestore based on user
var transactions = db.collection('bank').doc(userInfo.email)
.db.collection('accounts').doc(accountowner)
.collection('transactions');
return transactions.get();
})
.then( snapshot => {
var workingbalance = 0
snapshot.forEach(doc => {
workingbalance = workingbalance + doc.data().amount;
});
app.tell(accountowner + " has a balance of $" + workingbalance)
})
.catch( err => {
console.error('Error:', err );
app.tell('I was unable to retrieve your balance at this time.')
})
}