Does a Node.js worker never time out?

I have an iteration that can take up to several hours to complete.
Example:
let next;
do {
  // this is an API action
  const response = await fetch_some_data();
  // other database action
  await perform_operation();
  next = response.next;
} while (next);
I am assuming that the operation doesn't time out, but I don't know that for sure.
Any explanation of whether Node.js satisfies this condition is highly appreciated. Thanks.
Update:
The actual development code is as under:
const Shopify = require('shopify-api-node');
const shopServices = require('../../../../services/shop_services/shop');
const { create } = require('../../../../controllers/products/Products');
exports.initiate = async (redis_client) => {
  redis_client.lpop(['sync'], async function (err, reply) {
    if (reply === null) {
      console.log("Queue Empty");
      return true;
    }
    let data = JSON.parse(reply),
      shopservices = new shopServices(data),
      shop_data = await shopservices.get()
        .catch(error => {
          console.log(error);
        });
    const shopify = new Shopify({
      shopName: shop_data.name,
      accessToken: shop_data.access_token,
      apiVersion: '2020-04',
      autoLimit: false,
      timeout: 60 * 1000
    });
    let params = { limit: 250 };
    do {
      try {
        let response = await shopify.product.list(params);
        if (await create(response, shop_data)) {
          console.log(`${data.current}`);
        }
        data.current += data.offset;
        params = response.nextPageParameters;
      } catch (error) {
        console.log("here");
        console.log(error);
        params = false;
      }
    } while (params);
  });
}
Everything is working fine so far. I just want to make sure the loop will always run to completion in Node. This function is called by a cron job every minute, and the data to process is supplied by a queue.
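For context, here is a minimal sketch of how this might be wired up; node-cron and the redis client construction are assumptions, since the question only says a cron invokes the function every minute:
// Hypothetical wiring, not the OP's code: run initiate() every minute
// against the Redis-backed 'sync' queue. Assumes the v3-style callback
// redis client that the lpop call above uses.
const cron = require('node-cron');
const redis = require('redis');
const { initiate } = require('./initiate'); // hypothetical path to the file above

const redis_client = redis.createClient();

cron.schedule('* * * * *', () => {
  initiate(redis_client);
});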

Related

How to use await instead of then in a promise?

How to correctly resolve a Promise.all(...)? After resolving a promise that launches a set of asynchronous requests (simple database queries in Supabase pg-SQL), I iterate over the result in a forEach to make a new request with each of the results.
I try to save the results in a new array, which prints fine in the console, but in the response it doesn't work: it comes back empty. I understand that the response is being sent before the promises have finished resolving, but I don't understand why.
In an answer to a previous question I was told to use await before the then, but I didn't quite understand how to do that.
What am I doing wrong?
export const getReportMonthly = async (req: Request & any, res: Response, next: NextFunction) => {
  try {
    let usersxData: UsersxModalidadxRolxJob[] = [];
    let data_monthly: HoursActivityWeeklySummary[] = [];
    let attendance_schedule: AttendanceSchedule[] = [];
    let time_off_request: TimeOffRequestRpc[] = [];
    let configs: IndicatorConfigs[] = [];
    const supabaseService = new SupabaseService();
    const promises = [
      supabaseService.getSummaryWeekRpcWihoutFreelancers(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        data_monthly = dataFromDB as any;
      }),
      supabaseService.getUsersEntity(res).then(dataFromDB => {
        usersxData = dataFromDB as any;
      }),
      supabaseService.getAttendaceScheduleRpc(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        attendance_schedule = dataFromDB as any;
      }),
      supabaseService.getTimeOffRequestRpc(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        time_off_request = dataFromDB as any;
      }),
      supabaseService.getConfigs(res).then(dataFromDB => {
        configs = dataFromDB;
      }),
    ];
    let attendanceInMonthly = new Array();
    await Promise.all(promises).then(() => {
      attendance_schedule.forEach(element => {
        let start_date = element.date_start.toString();
        let end_date = element.date_end.toString();
        supabaseService.getTrackedByDateAndIDArray(start_date, end_date).then(item => {
          console.log(item);
          attendanceInMonthly.push(item);
        });
      });
    });
    res.json(attendanceInMonthly);
  } catch (error) {
    console.log(error);
    res.status(500).json({
      title: 'API-CIT Error',
      message: 'Internal server error'
    });
  }
};
If you await a promise, you can store its resolved value in a variable and work with it normally.
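A tiny illustration of the difference (fetchData here is a made-up placeholder, not something from your code):
// With .then(), the resolved value is only available inside the callback:
fetchData().then((data) => console.log(data));
// With await, the resolved value lands in a variable you can keep using:
const data = await fetchData();
console.log(data);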
So instead of your current code you could use the following changed code:
export const getReportMonthly = async (req: Request & any, res: Response, next: NextFunction) => {
  try {
    let usersxData: UsersxModalidadxRolxJob[] = [];
    let data_monthly: HoursActivityWeeklySummary[] = [];
    let attendance_schedule: AttendanceSchedule[] = [];
    let time_off_request: TimeOffRequestRpc[] = [];
    let configs: IndicatorConfigs[] = [];
    const supabaseService = new SupabaseService();
    const promises = [
      supabaseService.getSummaryWeekRpcWihoutFreelancers(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        data_monthly = dataFromDB as any;
      }),
      supabaseService.getUsersEntity(res).then(dataFromDB => {
        usersxData = dataFromDB as any;
      }),
      supabaseService.getAttendaceScheduleRpc(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        attendance_schedule = dataFromDB as any;
      }),
      supabaseService.getTimeOffRequestRpc(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
        time_off_request = dataFromDB as any;
      }),
      supabaseService.getConfigs(res).then(dataFromDB => {
        configs = dataFromDB;
      }),
    ];
    const resolvedPromises = await Promise.all(promises);
    const attendanceInMonthly = await Promise.all(
      resolvedPromises.map(
        async (element) => {
          let start_date = element.date_start.toString();
          let end_date = element.date_end.toString();
          return supabaseService.getTrackedByDateAndIDArray(start_date, end_date);
        }
      )
    );
    console.log(attendanceInMonthly); // this should be your finally resolved promise
    res.json(attendanceInMonthly);
  } catch (error) {
    console.log(error);
    res.status(500).json({
      title: 'API-CIT Error',
      message: 'Internal server error'
    });
  }
};
Something like this is what your code should look like. I am not sure whether this exactly solves your problem, because your code has some syntax errors which you will have to resolve yourself.
If I understand correctly, you launch a few requests, among which one (getAttendaceScheduleRpc, which assigns attendance_schedule) is used to launch some extra requests again, and you need to wait for all of these (including the extra requests) before returning?
In that case, the immediate issue is that you perform your extra requests in "subqueries", but you do not wait for them.
A very simple solution would be to properly separate those 2 steps, somewhat like in DerHerrGammler's answer, but using attendance_schedule instead of resolvedPromises as input for the 2nd step:
let attendanceInMonthly = new Array();
await Promise.all(promises);
await Promise.all(attendance_schedule.map(async (element) => {
  let start_date = element.date_start.toString();
  let end_date = element.date_end.toString();
  const item = await supabaseService.getTrackedByDateAndIDArray(start_date, end_date);
  console.log(item);
  attendanceInMonthly.push(item);
}));
res.json(attendanceInMonthly);
If you are really looking to fine-tune your performance, you could take advantage of the fact that your extra requests depend only on the result of one of your initial requests (getAttendaceScheduleRpc), so you could launch them as soon as the latter is fulfilled, instead of waiting for all the promises of the 1st step:
let attendance_schedule: AttendanceSchedule[] = [];
let attendanceInMonthly = new Array();
const promises = [
  supabaseService.getAttendaceScheduleRpc(req.query.fecha_inicio, req.query.fecha_final).then(dataFromDB => {
    attendance_schedule = dataFromDB as any;
    // Immediately launch your extra (2nd step) requests, without waiting
    // for the other 1st step requests. Make sure to return when all the
    // extra requests are done, or a Promise that fulfills when they are.
    return Promise.all(attendance_schedule.map(async (element) => {
      let start_date = element.date_start.toString();
      let end_date = element.date_end.toString();
      const item = await supabaseService.getTrackedByDateAndIDArray(start_date, end_date);
      console.log(item);
      attendanceInMonthly.push(item);
    }));
  }),
  // etc. for the rest of the 1st step requests
];
await Promise.all(promises);
res.json(attendanceInMonthly);

Get number of requisitions made by the user in a day

I have a POST endpoint in my API where I want to register all the work journal entries a worker has made.
export const registerPoint = async (req: Request, res: Response) => {
  const user = res.locals.decoded;
  const { id } = user;
  const ponto = new Ponto;
  ponto.datePoint = new Date();
  ponto.user_id = id;
  // this numPoint should restart every day for that user...
  ponto.numPoint = 1;
  res.json({ msg: "Register point Route", user, id });
  return;
}
How can I control how many times this request was made by a worker?
I want to control the variable numPoint: when the user makes this request it should increase by 1, and at the end of the day it should reset to 0.
Does anyone know of a solution, or an npm package, that can handle this?
EDIT: I'm storing all the data with Sequelize + MySQL.
As a starting point, you could use a simple database such as https://www.npmjs.com/package/sqlite3 or MySQL for storing the data.
For running jobs daily, you could consider https://www.npmjs.com/package/node-cron , for example a daily job (outside of an API call function):
var cron = require('node-cron');
cron.schedule('0 0 * * *', () => {
  console.log('running every day at midnight');
});
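Applied to the question, the daily reset of numPoint could then live in such a job. This is only a hedged sketch: the Ponto model comes from the question, but the update call itself is my illustration:
// Hypothetical daily reset at midnight, assuming the Sequelize model
// Ponto from the question with a numPoint column.
cron.schedule('0 0 * * *', async () => {
  await Ponto.update({ numPoint: 0 }, { where: {} }); // empty where = all rows
});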
From what I understand, you need a logging/audit trail mechanism. Since you are using MySQL you can create a new table with columns like (datetime, user_id, action). Every time the user does any action, it will be logged here. Then you can easily find out the count by aggregation. You won't need to reset any count for any user, if data doesn't exist for a given date then it's count will be 0.
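As a hedged sketch of that aggregation (the ActionLog model name and its columns are made up to match the description, not taken from the question):
// Count how many actions a user performed today, assuming a Sequelize
// model ActionLog with columns (datetime, user_id, action) as described.
const { Op } = require('sequelize');

async function countUserActionsToday(userId) {
  const todayStart = new Date();
  todayStart.setHours(0, 0, 0, 0);
  // COUNT(*) over today's rows for this user; 0 if no rows exist.
  return ActionLog.count({
    where: {
      user_id: userId,
      datetime: { [Op.gte]: todayStart },
    },
  });
}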
I've made a solution that works for what I want.
Here is the solution below:
export const registerPoint = async (req: Request, res: Response) => {
  const decodedUser = res.locals.decoded;
  const user = await User.findByPk(decodedUser.id);
  const TODAY_START = new Date().setHours(0, 0, 0, 0);
  const NOW = new Date();
  const markedPointsOfDay = await Ponto.findAll({
    where: {
      datePoint: {
        [Op.gt]: TODAY_START,
        [Op.lt]: NOW
      },
    }
  });
  const markedPointsOfDayByUser = await Ponto.findAll({
    where: {
      datePoint: {
        [Op.gt]: TODAY_START,
        [Op.lt]: NOW
      },
      UserId: (user !== null) ? user.id : decodedUser.id
    }
  });
  if (!markedPointsOfDay || markedPointsOfDay.length === 0) {
    const ponto = new Ponto;
    ponto.datePoint = new Date();
    if (user) {
      console.log(user);
      ponto.UserId = user.id as number;
      console.log(ponto.UserId);
    }
    if (markedPointsOfDayByUser) {
      ponto.numPoint = markedPointsOfDayByUser.length + 1;
    }
    const newPoint = await ponto.save();
    res.json({ msg: "Ponto registrado com sucesso", msg2: "Caiu no IF de quando nao encontrou ponto do DIA", newPoint });
    return;
  }
  if (markedPointsOfDay) {
    const ponto = new Ponto;
    ponto.datePoint = new Date();
    if (user) {
      ponto.UserId = user.id as number;
    }
    if (markedPointsOfDayByUser) {
      ponto.numPoint = markedPointsOfDayByUser.length + 1;
    }
    const newPoint = await ponto.save();
    res.json({ msg: "ponto registrado", markedPoint: newPoint, markedPointsOfDayByUser });
    return;
  }
  return;
}

Update a large number of documents

I am trying to update the contents of a large number of documents in Firebase. I have tried the following:
First, reading all documents on the client side and looping over them, updating each one by reference.
The problem here is that I am doing an intensive operation on the client side, which would be unpredictable, so I switched to Firebase Functions.
Second, reading all documents in Firebase Functions and then updating them using BulkWriter.
Here's the code:
exports.testingFunction1 = functions.runWith({
  timeoutSeconds: 540,
  memory: "8GB",
}).https.onCall(async (data, context) => {
  const storeId = data.text;
  if (!(typeof storeId === 'string') || storeId.length === 0) {
    throw new functions.https.HttpsError('invalid-argument', 'The function must be called with ' +
      'one argument containing the storeId.');
  }
  if (!context.auth) {
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called ' +
      'while authenticated.');
  }
  const uid = context.auth.uid;
  const name = context.auth.token.name || null;
  const picture = context.auth.token.picture || null;
  const email = context.auth.token.email || null;
  let bulk1 = admin.firestore().bulkWriter();
  let products = await admin.firestore().collection("Products").get(); // here's the problem's source
  products.forEach((document) => {
    bulk1.update(document.ref, { "IsStorePublished": true });
  });
  await bulk1.flush().then(() => {
    return { "result": "Success!" };
  })
    .catch((error) => {
      throw new functions.https.HttpsError('unknown', error.message, error);
    });
  return { "Result": "Success" };
});
The problem appears when I try to read more than about 8,000 documents at a single time: I get the following error, although I have raised the function's memory limit to the maximum possible:
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap
out of memory
Is there a good way to achieve this task?
For anyone interested, I have solved the issue as follows:
The problem with reading a large amount of data in order to update it in Firebase Functions is that memory fills up, so I made a recursive function that reads 500 items at a time and applies the BulkWriter to only those 500 documents, using the startAfter read method.
This is the main Firebase v1 function:
It reads the first 500 items, applies the operationToDo function to them, and then calls the recursive function to continue the process.
exports.testingFunction1 = functions.runWith({
  timeoutSeconds: 540,
  memory: "8GB",
}).https.onCall(async (data, context) => {
  const storeId = data.text;
  if (!(typeof storeId === 'string') || storeId.length === 0) {
    throw new functions.https.HttpsError('invalid-argument', 'The function must be called with ' +
      'one argument "storeId" containing the storeId.');
  }
  if (!context.auth) {
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called ' +
      'while authenticated.');
  }
  const uid = context.auth.uid;
  const name = context.auth.token.name || null;
  const picture = context.auth.token.picture || null;
  const email = context.auth.token.email || null;
  let first = admin.firestore()
    .collection("Products")
    .orderBy("Name").where("Store", '==', storeId)
    .limit(500);
  await first.get().then(
    async (documentSnapshots) => {
      if (documentSnapshots.docs.length == 0) {
        return;
      } else {
        await operationToDo(documentSnapshots, "update", { "IsStorePublished": false });
      }
      let lastVisible =
        documentSnapshots.docs[documentSnapshots.size - 1];
      await recursivePublishingTheStore(lastVisible, storeId);
    },
  );
  return { "Result": "Success" };
});
The recursive function:
async function recursivePublishingTheStore(lastVisible, storeId) {
  let next = admin.firestore()
    .collection("Products")
    .orderBy("Name").where("Store", '==', storeId)
    .startAfter(lastVisible)
    .limit(500);
  await next.get().then(
    async (documentSnapshots) => {
      if (documentSnapshots.docs.length == 0) {
        return;
      } else {
        await operationToDo(documentSnapshots, "update", { "IsStorePublished": false });
        let lastVisible =
          documentSnapshots.docs[documentSnapshots.size - 1];
        await recursivePublishingTheStore(lastVisible, storeId);
      }
    }
  );
}
The operation can be anything but in my case it would be "update":
async function operationToDo(documents, operation, value) {
  let bulk1 = admin.firestore().bulkWriter();
  documents.forEach((document) => {
    if (operation == 'update')
      bulk1.update(document.ref, value);
  });
  await bulk1.flush()
    .catch((error) => {
      throw new functions.https.HttpsError('unknown', error.message, error);
    });
}
The performance of the code above is pretty good: updating about 15k documents takes about 2 minutes.
Note: I chose the number 500 arbitrarily; a different number might work and might perform better, and I will be experimenting with it this week.
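As a hedged sketch of that experiment (the helper below is mine, not part of the answer; it only mirrors the query shape used above), the page size can be pulled out into a constant so different values are easy to try:
// Illustrative only: a tunable page size for the paged Products query.
const PAGE_SIZE = 500; // the answer's starting point; try other values

function productsPage(storeId, lastVisible) {
  let query = admin.firestore()
    .collection("Products")
    .orderBy("Name")
    .where("Store", "==", storeId)
    .limit(PAGE_SIZE);
  // The first page has no cursor; later pages resume after the last doc seen.
  return lastVisible ? query.startAfter(lastVisible) : query;
}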

Use headless chrome to intercept image request data

I have a use case that requires using the Headless Chrome Network domain (https://chromedevtools.github.io/devtools-protocol/tot/Network/) to intercept all image requests and find out the image size before saving it (basically to discard small images such as icons).
However, I am unable to figure out a way to load the image data in memory before saving it. I need to load it into an Img object to get the width and height. Network.getResponseBody takes a requestId, which I don't have access to in Network.requestIntercepted. Also, Network.loadingFinished always gives me "0" in the encodedDataLength variable, and I have no idea why. So my questions are:
How do I intercept all responses from jpg/png requests and get the image data, without saving the file to disk via the URL string and loading it back?
BEST: how do I get the image dimensions from the response headers? Then I wouldn't have to read the data into memory at all.
My code is below:
const chromeLauncher = require('chrome-launcher');
const CDP = require('chrome-remote-interface');
const file = require('fs');
(async function() {
  async function launchChrome() {
    return await chromeLauncher.launch({
      chromeFlags: [
        '--disable-gpu',
        '--headless'
      ]
    });
  }
  const chrome = await launchChrome();
  const protocol = await CDP({
    port: chrome.port
  });
  const {
    DOM,
    Network,
    Page,
    Emulation,
    Runtime
  } = protocol;
  await Promise.all([Network.enable(), Page.enable(), Runtime.enable(), DOM.enable()]);
  await Network.setRequestInterceptionEnabled({ enabled: true });
  Network.requestIntercepted(({ interceptionId, request, resourceType }) => {
    if ((request.url.indexOf('.jpg') >= 0) || (request.url.indexOf('.png') >= 0)) {
      console.log(JSON.stringify(request));
      console.log(resourceType);
      if (request.url.indexOf("/unspecified.jpg") >= 0) {
        console.log("FOUND unspecified.jpg");
        console.log(JSON.stringify(interceptionId));
        // console.log(JSON.stringify(Network.getResponseBody(interceptionId)));
      }
    }
    Network.continueInterceptedRequest({ interceptionId });
  });
  Network.loadingFinished(({ requestId, timestamp, encodedDataLength }) => {
    console.log(requestId);
    console.log(timestamp);
    console.log(encodedDataLength);
  });
  Page.navigate({
    url: 'https://www.yahoo.com/'
  });
  Page.loadEventFired(async () => {
    protocol.close();
    chrome.kill();
  });
})();
This should get you 90% of the way there. It gets the body of each image request; you'd still need to base64-decode it, check the size, save it, etc.
const CDP = require('chrome-remote-interface');
const sizeThreshold = 1024;

async function run() {
  try {
    var client = await CDP();
    const { Network, Page } = client;
    // enable events
    await Promise.all([Network.enable(), Page.enable()]);
    // commands
    const _url = "https://google.co.za";
    let _pics = [];
    Network.responseReceived(async ({ requestId, response }) => {
      let url = response ? response.url : null;
      if ((url.indexOf('.jpg') >= 0) || (url.indexOf('.png') >= 0)) {
        const { body, base64Encoded } = await Network.getResponseBody({ requestId }); // throws promise error returning null/undefined so can't destructure. Must be different in inspect shell to app?
        _pics.push({ url, body, base64Encoded });
        console.log(url, body, base64Encoded);
      }
    });
    await Page.navigate({ url: _url });
    await sleep(5000);
    // TODO: process _pics - base64Encoded, check body.length > sizeThreshold, save etc...
  } catch (err) {
    if (err.message && err.message === "No inspectable targets") {
      console.error("Either chrome isn't running or you already have another app connected to chrome - e.g. `chrome-remote-interface inspect`");
    } else {
      console.error(err);
    }
  } finally {
    if (client) {
      await client.close();
    }
  }
}

function sleep(miliseconds = 1000) {
  if (miliseconds == 0)
    return Promise.resolve();
  return new Promise(resolve => setTimeout(() => resolve(), miliseconds));
}

run();
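To sketch the remaining 10% (the decode-and-filter step left as a TODO above; the helper name and file-naming scheme are mine, not part of the answer):
// Illustrative sketch: decode a captured body, discard small images
// (likely icons), and write the rest to disk. Works on the { url, body,
// base64Encoded } entries pushed into _pics above.
const fs = require('fs');
const path = require('path');

function saveIfLargeEnough({ url, body, base64Encoded }) {
  const buffer = Buffer.from(body, base64Encoded ? 'base64' : 'utf8');
  if (buffer.length <= sizeThreshold) return; // skip icons and tiny images
  const name = path.basename(new URL(url).pathname) || 'image.bin';
  fs.writeFileSync(name, buffer);
}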

Why is Cloud Functions for Firebase taking 25 seconds?

For clarity: I have other cloud functions that all run intermittently (i.e. from 'cold' in around 2-6 seconds), and all use the same boilerplate setup of importing an admin instance and exporting the function as a module.
I've seen other similar posts, but this is really bugging me. I have a cloud function like so:
const admin = require('../AdminConfig');
const { reportError } = require('../ReportError');

module.exports = (event) => {
  const uid = event.params.uid;
  const snapshot = event.data;
  if (snapshot._newData === null) {
    return null;
  }
  console.log('Create org begin running: ', Date.now());
  const organisation = event.data.val();
  const rootRef = admin.database().ref();
  const ref = rootRef.child('/organisations').push();
  const oid = ref.key;
  const userData = {
    level: 'owner',
    name: organisation.name,
  };
  const orgShiftInfo = {
    name: organisation.name,
    startDay: organisation.startDay || 'Monday',
  };
  const updatedData = {};
  updatedData[`/users/${uid}/currentOrg`] = oid;
  updatedData[`/users/${uid}/organisations/${oid}`] = userData;
  updatedData[`/organisations/${oid}`] = organisation;
  updatedData[`/org_shift_info/${oid}`] = orgShiftInfo;
  rootRef.update(updatedData, (err) => {
    if (err) {
      return rootRef.child(`/users/${uid}/addOrgStatus`).set({ error: true })
        .then(() => {
          console.log(`error adding organisation for ${uid}: `, err);
          return reportError(err, { uid });
        });
    }
    console.log('Create org wrote succesfully: ', Date.now());
    return rootRef.child(`/users/${uid}/addOrgStatus`).set({ success: true });
  });
}
I understand the 'cold start' thing, but I think something is seriously wrong if it's taking 25 seconds. The logs don't show any error (log screenshot omitted).
Is there some deeper way I can debug this to try and figure out why it's taking so long? It's unusable at the moment. Thanks a lot.
Solved:
Sorry,
I misunderstood the API a bit. I should have watched the promise video first!
I needed to put
return rootRef.update...
instead of
rootRef.update...
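In other words (a hedged sketch of the fix, reusing the names from the function above; the exact chaining is my reconstruction, not the poster's final code):
// Returning the promise chain tells Cloud Functions when the work is
// finished, instead of leaving it as dangling background work.
return rootRef.update(updatedData)
  .then(() => {
    console.log('Create org wrote successfully: ', Date.now());
    return rootRef.child(`/users/${uid}/addOrgStatus`).set({ success: true });
  })
  .catch((err) => {
    console.log(`error adding organisation for ${uid}: `, err);
    return rootRef.child(`/users/${uid}/addOrgStatus`).set({ error: true })
      .then(() => reportError(err, { uid }));
  });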
