I have a NestJS cron problem.
I declared the cron function only once, but it runs 6 times per trigger. My code is below.
@Cron(CronExpression.EVERY_MINUTE)
async watcher() {
  const now = new Date();
  const openTravelRequests = await this.getOpenTravelRequestsByDate(now);
  if (openTravelRequests.length) {
    openTravelRequests.map(async (otr) => {
      await this.assignNewBoatToOpenTravelRequests(otr._id)
    })
  }
}
I have tried adding a counter, but it did not work.
I want to run a script every 6 hours.
const { IgApiClient } = require("instagram-private-api")
const ig = new IgApiClient()
const USERNAME = "abc"
const PASSWORD = "xyz"
ig.state.generateDevice(USERNAME)
const main = async () => {
  var birthday = new Date(2069, 05, 14);
  var today = new Date();
  birthday.setFullYear(today.getFullYear());
  if (today > birthday) {
    birthday.setFullYear(today.getFullYear() + 1);
  }
  var daystill = Math.floor((birthday - today) / (1000*60*60*24))
  await ig.simulate.preLoginFlow()
  await ig.account.login(USERNAME, PASSWORD)
  process.nextTick(async () => await ig.simulate.postLoginFlow())
  await ig.account.setBiography(`${daystill} Days till my Birthday, Today is ${new Date().getDate()}/${new Date().getMonth()}/${new Date().getFullYear()}. (AutoGenerated)`)
}
main()
About the script: it updates my Instagram bio with async/await, using instagram-private-api.
Problem / Goal:
I tried using node-cron, but it returns an error (I think the async code is causing the problem). I also tried while loops and setInterval().
I want this script/file to run every 6 hours. I have a Heroku account (if that helps).
Error when I use node-cron:
node:internal/process/promises:288
triggerUncaughtException(err, true /* fromPromise */);
Code for node-cron:
cron.schedule('* * * * *', () => { // this is not every 6hrs
  const main = async () => {
    //same as above
  }
  main()
})
Doing it the async/await way, as the title says:
// used to measure time
import { performance } from 'perf_hooks';
const interval = 1000; // in ms
(async function main(){
  let start_time = performance.now();
  // do stuff
  let stop_time = performance.now();
  let timeout = interval - (stop_time - start_time);
  setTimeout(main, timeout);
})();
Edit:
To explain the syntax behind the main function: ()(); is an immediately invoked function expression, so the function inside the first pair of parentheses is called automatically when the script starts.
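For reference, if node-cron is still preferred: the triggerUncaughtException(err, true /* fromPromise */) line is what Node prints when a promise rejection goes unhandled, so one hedged guess is that main() is rejecting inside the cron callback with nothing catching it. A minimal sketch under that assumption, with a cron expression that actually fires every 6 hours:

const cron = require("node-cron");

// '0 */6 * * *' fires at minute 0 of hours 0, 6, 12 and 18.
cron.schedule("0 */6 * * *", async () => {
  try {
    await main(); // the async main() defined in the script above
  } catch (err) {
    // Catching here prevents the unhandled-rejection crash seen with node-cron.
    console.error("bio update failed:", err);
  }
});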
I have an app built with Angular and Node.js (with the pg npm package, version 8.7.1).
The app is divided into microservices. Each server app has the pg package installed and has a connection to a Postgres DB.
The problem is that if I run an update query and then run the getList query, I get the old value instead of the updated object. If I add a setTimeout of 5 seconds, it works fine.
On my localhost everything works fine. The issue occurs only on Heroku (with Postgres in the cloud) on the server; sometimes I get the updated data and sometimes not.
Here is my code:
Client code (Angular), calling the update function and then the getList function with async/await:
async filter({ value }) {
  const list: any = await this.getList()
  const [myData]: any = await this.updateData(this.value)
  const updatedList: any = await this.getList() // Here is the issue !!
}
These functions call the server API like this:
getList(): Promise<any> {
  return this.http.get<any>(`${ENV.BASE_API}/doGetApiCalls`).toPromise();
}
updateData(value: any): Promise<any> {
  return this.http.put<any>(`${ENV.BASE_API}/doUpdateApiCalls`, value).toPromise();
}
The server code is:
BL code:
async function updateData(description, id) {
  let query = updateDataQuery(description, id);
  let results = await postgressQuery(query);
  return results;
}
DAL code:
function updateDataQuery(description: string, id: number) {
  const query = `UPDATE public.books
                 SET description='${description}'
                 WHERE book=${id}
                 RETURNING *`;
  return query;
}
And here is the connection to the Postgres DB (the BL calls this lib by importing it):
const { Pool } = require('pg');

const DATABASE_URL = process.env.DATABASE_URL;
const pool = new Pool({
  connectionString: DATABASE_URL,
  ssl: { rejectUnauthorized: false }
})

let openConnect = async () => {
  await pool.connect();
}

let postgressQuery = async (q) => {
  try {
    const result = await pool.query(q);
    return result.rows;
  }
  catch (e) {
    console.log(e);
  }
}
========================================================
If I add a wait on the client, then it works fine. Does the update just take a while?
async filter({ value }) {
  const list: any = await this.getList()
  const [myData]: any = await this.updateData(this.value) // gets the RETURNING data from the server with the correct values
  await new Promise(resolve => setTimeout(resolve, 5000)) // added to wait for 5 seconds
  const updatedList: any = await this.getList() // then the data is correct (after 5 sec)
}
What is wrong in the code above?
Thanks in advance.
I found a solution:
It was a caching issue. I just needed to install this package and everything works fine:
https://www.npmjs.com/package/nocache
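For anyone who hits the same thing, a minimal sketch of how that package is usually wired up, assuming the API is served by Express (the server framework is not shown in the question):

const express = require('express');
const nocache = require('nocache');

const app = express();

// nocache() sets Cache-Control, Expires and Pragma headers so the browser
// stops serving a stale cached response for the GET list endpoint.
app.use(nocache());

app.get('/doGetApiCalls', (req, res) => {
  // ...existing handler that runs the getList query
});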
I have a problem where, on the server, the cron task runs 4 times for each scheduled interval: not only do the logs appear 4 times, the API is also called 4 times.
const taskCron = nodeCron.schedule('*/1 * * * *', function () {
  const timeZone = moment.tz('America/Mexico_City').format();
  const currentDate = splitDate(timeZone);
  const currentTime = splitTime(timeZone);
  console.info(`EXECUTING CRON TASK ${currentDate} ${currentTime}`);
  campaign.find(
    { hour_to_send: currentTime, date_to_send: { $gte: currentDate, $lte: currentDate } }
  ).exec().then(data => {
    console.log('number of campaigns programmed: ' + data.length);
    const campaignsMapped = data.map(campaign => {
      return {
        id: campaign._id
      };
    });
    const secretBulk = 'secret-bulk';
    const bodyRequest = {
      campaignsList: campaignsMapped,
      key: secretBulk
    };
    if (campaignsMapped.length > 0) {
      axios.post(url_bulk, bodyRequest, {})
        .then(data => {
          console.info(data.status);
        })
        .catch(error => {
          console.error(error.code);
        });
    }
  }).catch(error => { console.log(error) });

  function splitDate(date) {
    return date.slice(0, 10);
  }

  function splitTime(date) {
    return date.slice(11, 16) + ':00';
  }
}, { scheduled: false });

taskCron.start();
I am using the node-cron npm library.
The server runs a CentOS derivative (Oracle Linux). The program runs in a Docker container based on the official Node.js version 10 image; Docker version 19.03.11, build 42e35e61f3. It is worth mentioning that this behavior does not happen on my local Windows 10 machine.
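A small diagnostic sketch that might help narrow this down (my assumption, not something from the post): if the same scheduling code runs in more than one Node process or container, each process registers its own copy of the job and the task fires once per process. Logging the pid and hostname on every tick shows whether one process fires 4 times or 4 separate processes fire once each:

const os = require('os');
const nodeCron = require('node-cron');

nodeCron.schedule('*/1 * * * *', () => {
  // If pid/host differ across the 4 duplicate log lines, the duplication comes from
  // multiple processes or containers, not from node-cron itself.
  console.info(`cron tick pid=${process.pid} host=${os.hostname()} at ${new Date().toISOString()}`);
});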
I have a fairly straightforward script that reads summary data from an api and then loops through the records to save the detail to a database.
The code runs without problems when I launch it from VS Code but when I move it into a Lambda function it only runs halfway through.
There are two api calls using axios. The first gets the summary and the second pulls the detail.
The first call works in Lambda. The second, which uses the same method, does not. I can tell through logging statements that the correct data is getting to the second method. The only real differences are that the second is in a loop and it also uses Bottleneck to prevent overloading a touchy api.
I have put logging statements all over the place but once the routine enters the second api call I get no response at all. The logging statement directly inside the routine shows that it is getting there but I don't get anything back from axios. No success or error.
Here is the code.
var Bottleneck = require("bottleneck");
const axios = require('axios');
const Sequelize = require('sequelize');
let apiKey = process.env.APIKEY;
var timeDelay = 1000;
const instance = axios.create({
baseURL: 'https://anapi.com/api/v1/',
headers: {
'Content-Type': "application/json",
'X-Auth-Key': apiKey,
}
});
const limiter = new Bottleneck({
maxConcurrent: 1,
minTime: timeDelay
});
const sequelize = new Sequelize(
  "postgres://postgres:reallystrongpassword@awsrdsdb.cluster-vms39sknjssk1.us-west-2.rds.amazonaws.com/targetdatabase"
);
const notes = sequelize.define(
"notes",
{
appointmentid: {
type: Sequelize.STRING,
}, ...
questions: {
type: Sequelize.JSONB,
},
},
{
tableName: "notes",
timestamps: false
}
);
async function notesInject(detailData) {
log.info("inside notesInject");
const injector = await notes.create({
appointmentid: detailData.AppointmentId,
...
questions: detailData.Questions,
}).then(function(){
log.info("created note ", detailData.Id)
}).catch(function(error){
log.info(error)
})
}
function getDetail(detailId) {
log.info(detailId)
try {
instance.get('notes/' + detailId)
.then ((resp) => {
try {
var detailData = (resp.data)
} catch {
log.info("detailData success", resp.status)
}
try {
notesInject(detailData)
} catch (error) {
log.info("notesInject catch", resp.status);
}
})
} catch (error) {
log.info("error in the detail instance")
}
}
function procDetail(apiData) {
for (let i = 0; i < apiData.length; i++) {
const element = apiData[i];
let detailId = element.Id;
getDetail(detailId)
}
}
function getTodayData() {
const pullDate = new Date();
const dateY = pullDate.getFullYear();
const dateM = pullDate.getMonth()+1;
const dateD = pullDate.getDate()-1;
const apiDate = (dateY+'-'+dateM+'-'+dateD)
try {
instance.get('notes/summary?startDate=' + apiDate)
.then ((resp) => {
try {
var apiData = (resp.data)
} catch {
log.info("set apiData", resp.status)
}
try {
procDetail(apiData)
} catch (error) {
log.info("saveDetail", resp.status);
}
})
} catch (error) {
log.info("in the summary instance")
}
}
exports.handler = async (event) => {
getTodayData();
};
I was thinking that the problem was with Bottleneck because that is the most significant difference between the first and second axios calls. When I isolated the database write code after the api pull, it had the same behavior. No error or response.
I'm stumped. I've logged everything I can think of. Lambda doesn't display console.log messages for this so I've been using Lambda-Log.
I'm sure it's something dumb, but it works just fine when I run it from VS Code.
If anyone has any idea what I'm doing wrong, please let me know.
Sorry if I posted too much code but I really don't know where the problem is.
Many thanks
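One hedged guess worth ruling out (an assumption, not something confirmed by the logs above): the handler is declared async but never awaits getTodayData(), and getTodayData() and getDetail() start their axios calls without returning the promises, so the invocation can end, and the execution environment be frozen, while the detail requests and inserts are still pending. That would look exactly like no success and no error. A minimal sketch of the same flow with the chain awaited end to end, reusing instance, notesInject and log from the code above:

async function getDetail(detailId) {
  log.info(detailId);
  const resp = await instance.get('notes/' + detailId);
  await notesInject(resp.data);
}

async function procDetail(apiData) {
  for (const element of apiData) {
    await getDetail(element.Id); // sequential: one detail call at a time
  }
}

async function getTodayData() {
  const pullDate = new Date();
  // date formatting kept as in the original code
  const apiDate = `${pullDate.getFullYear()}-${pullDate.getMonth() + 1}-${pullDate.getDate() - 1}`;
  const resp = await instance.get('notes/summary?startDate=' + apiDate);
  await procDetail(resp.data);
}

exports.handler = async (event) => {
  // Awaiting here keeps the Lambda invocation alive until every request and insert settles.
  await getTodayData();
};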
I have an iteration that can take up to hours to complete.
Example:
do {
  // this is an api action
  let response = await fetch_some_data;
  // other database action
  await perform_operation();
  next = response.next;
} while (next);
I am assuming that the operation doesn't time out, but I don't know that for sure.
Any explanation of how Node.js handles this kind of long-running loop is highly appreciated. Thanks.
Update:
The actual development code is as follows:
const Shopify = require('shopify-api-node');
const shopServices = require('../../../../services/shop_services/shop');
const { create } = require('../../../../controllers/products/Products');
exports.initiate = async (redis_client) => {
  redis_client.lpop(['sync'], async function (err, reply) {
    if (reply === null) {
      console.log("Queue Empty");
      return true;
    }
    let data = JSON.parse(reply),
      shopservices = new shopServices(data),
      shop_data = await shopservices.get()
        .catch(error => {
          console.log(error);
        });
    const shopify = new Shopify({
      shopName: shop_data.name,
      accessToken: shop_data.access_token,
      apiVersion: '2020-04',
      autoLimit: false,
      timeout: 60 * 1000
    });
    let params = { limit: 250 };
    do {
      try {
        let response = await shopify.product.list(params);
        if (await create(response, shop_data)) {
          console.log(`${data.current}`);
        };
        data.current += data.offset;
        params = response.nextPageParameters;
      } catch (error) {
        console.log("here");
        console.log(error);
        params = false;
      };
    } while (params);
  });
}
Everything is working fine so far. I just want to make sure whether this execution will ever complete in Node or not. This function is called by a cron every minute, and the data to process is provided by a queue.
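On the core question: Node itself puts no limit on how long an awaited do/while loop may run, as long as the process stays alive (a hosting platform may still impose its own timeout). The practical risk with a one-minute cron is starting a second sync while the first is still running, so here is a minimal sketch of an in-process guard, under the assumption that the sync can be wrapped in a function that resolves only when the loop finishes (runSync below is hypothetical, not part of the original code):

// Hypothetical wrapper: assumed to resolve only after the whole do/while sync completes.
async function runSync(redis_client) {
  // ...the lpop + do/while product sync from above, promisified so callers can await it
}

let syncInProgress = false;

// Called by the every-minute cron instead of initiate() directly.
exports.initiateGuarded = async (redis_client) => {
  if (syncInProgress) {
    console.log('Previous sync still running, skipping this tick');
    return;
  }
  syncInProgress = true;
  try {
    await runSync(redis_client);
  } catch (error) {
    console.log(error);
  } finally {
    syncInProgress = false;
  }
};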