I'm trying to make a remote call from my Node.js application to an external URL, then parse the response and perform an action based on the result. I'm using Express and MySQL.
I was able to get the remote URL content; however, I'm running into some kind of race condition where my output keeps changing and is not reliable. I tried to use async/await but couldn't get it to work.
This is the function called to run the app:
function lista(servidores) {
  return new Promise(function (resolve, reject) {
    var sql = ' SELECT sv.id as svid, sv.ip as svip' +
              ' FROM servidores sv';
    dbconfig.conexao.query(sql, function (err, result, fields) {
      Promise.all(
        result.map(row => {
          var ipsv = row.svip;
          var urlprobe = 'http://201.182.96.14:8000/PING/' + ipsv;
          fetch(urlprobe, {
            method: 'get',
            headers: {
              'Accept': 'application/json, application/xml, text/plain, text/html, *.*',
              'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8'
            }
          })
            .then(
              response => response.json(),
              error => console.log('An error occurred', error)
            )
            .then(
              json => console.log(json)
            );
        })
      ).then(result => result);
      return resolve();
    });
  });
}
When all these functions are called, everything works fine until the monitora() function. The output is random, depending on which fetch answers faster, so the result is not reliable. Ideally, monitora() would perform each fetch separately and then process the ifs based on each of the results.
Edit: I edited the code and made the fetch directly in the main function, but I'm still getting inconsistent results, as if there were some sort of caching somewhere.
I'm not sure if I understand correctly, but you can try using the Promise.all() function to wait for all the fetch requests to finish before taking an action. Note that each callback passed to map has to return a promise, and monitora() itself must return one, or Promise.all will have nothing to wait for:
Promise.all(
  result.map(row => {
    var urlMon = 'http://' + row.pip + ':8000/PING/' + ipsv;
    // return the promise so Promise.all actually waits for it
    return monitora(urlMon, idsv);
  })
).then(result => result /* Do something */);
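If you also need the results handled in a fixed order (each fetch fully processed before the next starts), a for...of loop with await is an alternative to the parallel Promise.all. A minimal sketch, assuming monitora() returns a promise; the row.pip, row.svip, and row.svid property names are taken from the snippets above and may need adjusting to your actual query:
async function processaEmOrdem(result) {
  // await inside for...of runs the calls strictly one at a time,
  // so the output no longer depends on which fetch answers faster
  for (const row of result) {
    const urlMon = 'http://' + row.pip + ':8000/PING/' + row.svip;
    const resposta = await monitora(urlMon, row.svid);
    console.log(resposta); // process the ifs for this server here, before moving on
  }
}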
First-time poster, excited to have a community to collaborate with here!
So here's my situation. I'm using Node to put together an automation for my company's compliance: taking reports from our MDM server and posting them to our compliance platform (Tugboat Logic). The idea is to have it deployed on a recurring basis via AWS Lambda. The basic logic is this: getToken fetches an auth token, which is then passed to getReports. getReports loops through an array of endpoints to get reports from the MDM and passes each one along to fileReport, which then posts that data to the endpoint.
The problem is that the final endpoint needs a file as the payload (sample POST request below). I managed to get the whole fetch chain working by using fs writeFile/readFile (and a delay), and while that worked, it doesn't translate well to a Lambda environment. Ideally, I want to take the payload from getReports (which comes through as JSON but can also be accepted as text) and push it straight to the endpoint. Any help cleaning up this code would be appreciated!
Here's the bit giving me the most trouble (from the last file):
form.append('file', x, `${reportsArray[i].name}.json`);
// Sample post request for final endpoint
curl -v --user <provided-username>:<given-password> \
-H "X-API-KEY: <given-x-api-key>" \
-F "collected=<date-of-evidence>" -F "file=#<local_filename_csv>;type=text/csv" \
<given-collector-url>
// getReports.js accepts a token from an earlier function and takes fileReport as the cb
function getReports(token, cb) {
  // fetch options (this object also carries the actual HTTP headers)
  const options = {
    method: 'GET',
    headers: {
      'accept': 'application/json',
      'Authorization': `Bearer ${token}`
    },
    redirect: 'follow'
  };
  for (let i = 0; i < reportsArray.length; i++) {
    fetch(reportsArray[i].source, options)
      .then(res => res.json())
      // writeFile leftover from the earlier working deploy
      /* .then(data => fs.writeFile(`./reports/${reportsArray[i].name}.json`, data, function (err) {
           if (err) throw err;
         })) */
      .then(res => cb(i, res))
      .catch(error => console.log('error', error));
  }
}
// fileReport.js - i identifies the right endpoint from the imported array and sets the filename; x is the JSON payload passed down from getReports
function fileReport(i, x) {
  const form = new FormData();
  form.append('collected', getTimestamp());
  form.append('file', x, `${reportsArray[i].name}.json`);
  fetch(`${reportsArray[i].dest}`, {
    method: 'POST',
    headers: {
      'X-API-KEY': `${process.env.TUGBOAT_X_API_KEY}`,
      'Authorization': 'Basic ' + btoa(`${process.env.TUGBOAT_USERNAME}:${process.env.TUGBOAT_PASSWORD}`)
    },
    body: form
  });
}
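One way to skip the filesystem entirely is to serialize the JSON payload into an in-memory Buffer and hand that to form-data, which accepts a buffer plus a filename/contentType options object. A minimal sketch of fileReport along those lines (an untested suggestion, assuming the same reportsArray, getTimestamp, and env vars as above, with the node-fetch and form-data packages):
const FormData = require('form-data');
const fetch = require('node-fetch');

function fileReport(i, x) {
  const form = new FormData();
  form.append('collected', getTimestamp());
  // Buffer the JSON payload in memory instead of writing a temp file
  form.append('file', Buffer.from(JSON.stringify(x)), {
    filename: `${reportsArray[i].name}.json`,
    contentType: 'application/json'
  });
  return fetch(reportsArray[i].dest, {
    method: 'POST',
    headers: {
      'X-API-KEY': process.env.TUGBOAT_X_API_KEY,
      // Buffer.from(...).toString('base64') replaces btoa, which isn't
      // available in older Node runtimes
      'Authorization': 'Basic ' +
        Buffer.from(`${process.env.TUGBOAT_USERNAME}:${process.env.TUGBOAT_PASSWORD}`).toString('base64')
    },
    body: form
  });
}
Returning the fetch promise also lets the caller await the upload, which matters in Lambda, where the function may freeze as soon as the handler returns.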
I'm having some issues when calling an external API to fetch information inside a loop in my Node/Express backend. I need information from the loop to get the correct data back from the endpoint. When I loop through the data and make the calls, I get this error:
{ message: 'Too Many Requests Limit 30. Reset time 1594218437315',
status: 429 }
I get the correct data back sometimes, and sometimes only this message. There will be about 10,000 or so calls I need to make. I've tried multiple throttling libs and a lot of the code here on SO, but it's pretty much always the same result: either the error message, or it doesn't work at all.
I think I need a way to send about 20-30 requests at a time, then wait a second or so and continue. Or is there another, better way? How would I achieve this?
const product = await NewProduct.find({});
product.map((item, index) => {
item.stores.map(async inner => {
inner.trackingUrl = await fetchRetry(inner.programId, inner.productUrl)
})
})
async function fetchRetry(id, urlToTrack) {
  const url = 'https://api.adtraction.com/v2/affiliate/tracking/link/?token=token';
  const data = {
    channelId,
    programId: id,
    shortLink: true,
    url: urlToTrack
  };
  const options = {
    method: 'POST',
    headers: {
      'Content-type': 'application/json',
      Accept: 'application/json',
      'Accept-Charset': 'utf-8',
    },
    body: JSON.stringify(data),
  };
  const res = await fetch(url, options);
  const json = await res.json();
  console.log(json); // this is where the 429 error shows up
  return json.trackingUrl;
}
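To send the requests in batches of 20-30 with a pause in between, one approach is to flatten the work into a single list, chunk it, and await each chunk with Promise.all. A rough sketch building on fetchRetry above; the batch size and delay are assumptions you'd tune against the API's limit of 30:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function processInBatches(items, batchSize = 25, delayMs = 1000) {
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Fire one batch in parallel, then pause before starting the next
    await Promise.all(
      batch.map(inner =>
        fetchRetry(inner.programId, inner.productUrl)
          .then(trackingUrl => { inner.trackingUrl = trackingUrl; })
      )
    );
    await sleep(delayMs);
  }
}
Flattening every store of every product into items first (rather than nesting maps) keeps the rate limiting in one place and makes the whole run awaitable.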
Well, I'll share something I've done in one of my projects. Keep making requests for all of your data, and whenever you hit an error or a throttled response, wait for five minutes (or whatever your API requires) and go back to the previous index (the data the failed call used):
async function callAPI(data) {
  for (let index = 0; index < data.length; index++) {
    try {
      // api call with your data; await it so errors land in the catch
      await yourApiCall(data[index]);
    } catch (e) {
      if (e.type == "RequestThrottled") {
        let oneMinute = 60000;
        // wait until the request quota is available again
        await sleep(5 * oneMinute);
        // because of the error, go back to the previous request (or data)
        index--;
      }
    }
  }
}
A helper that uses setTimeout to pause without blocking:
async function sleep(ms) {
await new Promise((resolve) => setTimeout(resolve, ms));
}
Also, you may want to run this as a background job, since thousands of sequential calls will take a long time to finish.
I have to make GET requests to a slow API and keep track of the response status. I can only fetch the data (with another GET request and a different URL) once the initial dispatch status is 'Done'. Some search queries are fetched faster than others, so I have to keep that in mind as well.
I used the JavaScript setTimeout function and waited 20 seconds for all the search queries to finish. This is a hit-or-miss approach, as some queries finish in under 20 seconds and some take longer.
async function get_dispatch_state(sid) {
let dispatchState = "";
let json = await axios.get(
`https://digitals.devfg.test.com:8089/services/search/jobs/${sid}?output_mode=json`,
{
method: "get",
auth: auth,
headers: options
}
);
dispatchState = json.data.entry[0]["content"].dispatchState;
return dispatchState;
}
function get__data() {
axios({
method: "get",
url: `https://digitalsp.devfg.test.com:8089/services/search/jobs/test_search_1/results?output_mode=json`,
auth: auth,
headers: options
})
.then(datax => {
fraud_line_1d = datax.data;
console.log("***Fraud line 1****" + JSON.stringify(fraud_line_1d));
})
.catch(error => {
console.log("second error is " + error);
});
// repeat other get requests
}
setTimeout(function() {
  get__data();
}, 20000);
All the data is eventually fetched, but at different intervals depending on how large the search query is. I need some advice on the best way to fetch the data once the dispatch status is Done.
You can use the Promise.all() method. It returns a single Promise that resolves when all of the promises passed in the iterable have resolved, or when the iterable contains no promises. It rejects with the reason of the first promise that rejects.
function get_dispatch_state(sid) {
return axios.get(
`https://digitals.devfg.test.com:8089/services/search/jobs/${sid}?output_mode=json`,
{
method: 'get',
auth: auth,
headers: options
}
)
.then(json => {
return json.data.entry[0]['content'].dispatchState;
});
}
function get__data() {
  // return the promise chain so callers (e.g. Promise.all) can wait on it
  return axios({
    method: 'get',
    url: `https://digitalsp.devfg.test.com:8089/services/search/jobs/test_search_1/results?output_mode=json`,
    auth: auth,
    headers: options
  })
    .then(datax => {
      fraud_line_1d = datax.data;
      return JSON.stringify(fraud_line_1d);
    });
  // repeat for the other get requests
}
Promise.all([get_dispatch_state(sid), get__data()])
  .then(data => {
    // data is an array with the responses of both requests
  });
This is a small sample of how you could implement it. More information on Promise.all is available in the MDN documentation.
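Note that Promise.all alone won't wait for the dispatch status to become 'Done'; it only waits for both requests to settle. If the real requirement is to poll until the job finishes before fetching results, a small polling loop on top of get_dispatch_state may fit better. A minimal sketch, reusing the functions above; the interval and retry cap are arbitrary assumptions:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function waitForDone(sid, intervalMs = 2000, maxTries = 30) {
  for (let attempt = 0; attempt < maxTries; attempt++) {
    const state = await get_dispatch_state(sid);
    if (state === 'Done') return; // job finished, safe to fetch results
    await sleep(intervalMs);      // otherwise wait and check again
  }
  throw new Error(`Job ${sid} did not finish in time`);
}

// usage: poll first, then fetch the results
waitForDone('test_search_1').then(() => get__data());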
I'm looking for a clear and complete example of a case where async is used to handle chained function steps. I have pseudocode below showing the intent, but I haven't found an example that clearly shows the actual code needed to call multiple steps from within the async function.
The function uses a basic async wrapper.
getUserById: asyncHandler((req, res, next) => {
  validateUser();
  SavetoDB();
  res.json({ "message": "TBD...success" });
})
Well, your question isn't very well-formed.
Your SaveToDB() method will presumably do some database work, which is asynchronous, so you'll want it to return a promise. SaveToDB() could be something like:
exports.submitPost = async () => {
const data = await fetch('/post', {
method: 'POST',
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify({
field: {
txt: this.state.newfield
}
})
})
const jsonData = await data.json();
return jsonData;
}
Then you can call it with .then:
submitPost().then(data => console.log(data))
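For completeness, the "basic async wrapper" mentioned in the question is commonly written as a function that catches rejections from an async route handler and forwards them to Express's error middleware. A minimal sketch of one such wrapper and how the chained steps could look inside it (validateUser and SavetoDB are assumed to return promises, as discussed above):
// Forward any rejection from the async handler to next() so Express can handle it
const asyncHandler = fn => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

const getUserById = asyncHandler(async (req, res, next) => {
  await validateUser(); // step 1: runs to completion first
  await SavetoDB();     // step 2: only starts after step 1 resolves
  res.json({ message: 'TBD...success' });
});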
I'm going to test a REST API using Cypress.io with chained requests. It should work like this: the JSON response body from the first API call is used in the next API call's headers for Authorization.
I've already tried doing this with Cypress commands and printing to console.log, but it doesn't seem to get captured in the log. Is there any clue for this, or should I just use another command like cy.route?
Cypress.Commands.add("session", () => {
return cy.request({
method: 'POST',
url: '/auth/',
headers: {
'Content-Type': 'application/json',
},
body: {
"client_secret" : ""+config.clientSecret_staging,
"username": ""+config.email_staging,
"password": ""+config.password_staging
}
}).then(response => {
const target = (response.body)
})
})
it('GET /capture', () => {
cy.session().then(abc =>{
cy.request({
method: 'GET',
url: '/capture/'+target
})
})
})
The goal is to capture and parse the JSON array from target = (response.body).
You have two options:
1. Leave the code as is, but be aware that the
.then(response => {
const target = (response.body)
})
code isn't returning anything, so the cy.session().then(abc => { ... code gets the whole response (abc is the response of the first .then).
2. Change the code into:
.then(response => {
const target = (response.body)
return target // I added this return
})
and then your abc param will be equal to response.body rather than to the whole response.
That's because if you don't return a subject from your chainable calls, the default one is passed to the next .then function.
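Putting it together, the whole chain could look like this. A sketch only: the exact field you pull out of response.body depends on what /auth/ returns, so body.target below is an assumption:
Cypress.Commands.add("session", () => {
  return cy.request({
    method: 'POST',
    url: '/auth/',
    headers: { 'Content-Type': 'application/json' },
    body: {
      "client_secret": "" + config.clientSecret_staging,
      "username": "" + config.email_staging,
      "password": "" + config.password_staging
    }
  }).then(response => response.body); // return the body so the next .then receives it
});

it('GET /capture', () => {
  cy.session().then(body => {
    cy.request({
      method: 'GET',
      url: '/capture/' + body.target // assumes the auth response exposes this field
    });
  });
});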
Let me know if this resolves your issue.
p.s. Welcome 👋