Experts,
It seems that Yelp recently changed their REST API to limit the number of requests you can make per second. I've tried using setTimeout and various sleep functions with no success; I believe the problem is in how I'm using setTimeout. I only get a few responses back and then a slew of TOO_MANY_REQUESTS_PER_SECOND errors. I'm using the Node.js Fusion API client. Any help would be appreciated. Thanks in advance.
Here is the code below. I'm getting the Yelp URL from my Parse Server, and I want to get the Yelp business name from the response:
'use strict';

var Parse = require('parse/node');
Parse.initialize("ServerName");
Parse.serverURL = 'ParseServerURL';

const yelp = require('yelp-fusion');
const client = yelp.client('Key');

var object;
var Business = Parse.Object.extend("Business");
var query = new Parse.Query(Business);
query.notEqualTo("YelpURL", "Bus");
query.find({
  success: function(results) {
    for (var i = 0; i < results.length; i++) {
      object = results[i];
      // I believe a setTimeout block needs to come somewhere in here.
      // Tried many places but with no success.
      client.business(object.get('YelpURL')).then(response => {
        console.log(response.jsonBody.name);
      }).catch(e => {
        console.log(e);
      });
    }
  },
  error: function(error) {
    alert("Error " + error.code + " " + error.message);
  }
});
Use query.each(), which will iterate over each object and perform the requests in sequence rather than all more or less at once:
query.each(function(object) {
  return client.business(object.get('YelpURL')).then(response => {
    console.log(response.jsonBody.name);
  });
}).catch(e => {
  // there is no Express 'res' in this context, so just log the error
  console.log(e);
});
One cool thing about this is that it will automatically propagate any error from the client.business() call to the catch block at the bottom. It iterates over the objects one at a time, and since we return the result of the client.business() call, it won't move on to the next object until the response has come back. query.each() also iterates over every object in the collection that meets your query criteria, so you don't have to worry about query limits.
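If serializing the requests still trips the per-second limit, you could add an explicit delay between requests inside the same callback. A minimal sketch, assuming a promise-wrapped setTimeout; the 500 ms pause is an arbitrary figure, not a documented Yelp limit:

query.each(function(object) {
  return client.business(object.get('YelpURL')).then(response => {
    console.log(response.jsonBody.name);
    // returning this promise makes query.each wait ~500 ms
    // before moving on to the next object
    return new Promise(resolve => setTimeout(resolve, 500));
  });
}).catch(e => {
  console.log(e);
});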
I'm not quite sure if this is what you're looking for, but you can retrieve up to 50 records per request. The example below returns 20 business names within the given zip code, or you can tweak it a little to return all of the data for those businesses. Does this help?
app.get('/:id', (req, res) => {
  let zipcode = req.params.id;
  let names = [];
  let searchRequest = {
    term: 'Business', // or, for example, 'food'
    limit: 20,        // the number of responses you want, up to 50
    radius: 20000,    // 20,000 meters (~12 miles); Yelp's radius parameter is in meters
    location: zipcode
  };
  client.search(searchRequest)
    .then(response => {
      response.jsonBody.businesses.map(elem => {
        names.push(elem.name);
      });
      res.json(names); // business names only
      // or:
      // res.json(response.jsonBody.businesses); // all details included with the business name
    }).catch(e => {
      res.json('error');
    });
});
There is a collection in the database with about 4.6 million documents.
(The data structure can be seen in the screenshot.)
When trying to pull even a single value from this collection, the response time reaches more than 3 seconds.
(And this is on my powerful local machine.)
The request and response handling code looks like this:
// server.js
const http = require('http');
const { getProducts } = require('./controllers/currencyController');

const server = http.createServer((req, res) => {
  if (req.url.match(/\/api\/currency.+/g) && req.method === 'GET') {
    getProducts(req, res);
  } else {
    res.writeHead(404, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ message: 'Route Not Found' }));
  }
});

const PORT = process.env.PORT || 5000;
server.listen(PORT, () => console.log(`Server running on port ${PORT}`));

module.exports = server;
// currencyModel.js
const { MongoClient } = require('mongodb');
const client = new MongoClient('mongodb://127.0.0.1:27017/');

const findAll = async (currencyPair) => {
  try {
    await client.connect();
    const testingData = client.db('Trading').collection(currencyPair);
    return testingData;
  } catch (e) {
    console.log(e);
  }
};

module.exports = {
  findAll,
};
// currencyController.js
const currencyData = require('../models/currencyModel');

async function getProducts(req, res) {
  try {
    const currency_data = await currencyData.findAll('EUR/USD');
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify(await currency_data.find({ id: 1 }).toArray()));
  } catch (error) {
    console.log(error);
  }
}

module.exports = {
  getProducts,
};
I suspect that the reason for such a slow response may be connected with a lack of indexes.
(The numbers that are present are custom created.)
Question:
How can I speed up the server's response time after receiving a request?
(Preferably without using an ORM or frameworks.)
And what should this time ideally be?
I'm editing my answer as per the insight gained through the comments.
The findAll method there is a bit of a misnomer, as it just selects a collection via the MongoDB driver. The problem is that you are triggering a collection scan; that is, MongoDB has to go through every document in the collection to see if it matches your filter id: 1.
You can cut down the execution time by using findOne instead of find: once a single document satisfying the filter is found, it is returned as the result. This is usually not enough, though; if the matching document is the last one in the collection, performance will be the same as with find.
You should instead create an index:
db.collection.createIndex({"id": 1})
Run this in the MongoDB console, or use MongoDB Compass to create an index. The example creates an index for queries using the id field in ascending order.
If you wanted to query for two fields, say vol and close, vol ascending and close descending, you would create an index via:
db.collection.createIndex({"vol": 1, "close":-1})
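To confirm that a query actually uses the index, you can ask MongoDB for the query plan. A minimal sketch for the mongosh console (the collection and field names mirror the examples above; the exact output shape varies between server versions):

// Inspect the winning plan for the query from the question
db.collection.find({ id: 1 }).explain("executionStats")
// Look for an IXSCAN stage (index scan) rather than COLLSCAN (full collection scan),
// and compare totalDocsExamined against the number of documents returned.

With 4.6 million documents, the difference between a COLLSCAN and an IXSCAN is typically the difference between multiple seconds and a few milliseconds.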
I'm working on an interview project for which I have to add an endpoint that lets me POST an array of products (listings); it should create them (MongoDB + Mongoose) or update them accordingly. The problem is that I'm clearly not dealing with Promises properly, and I'm getting a timeout on my test.
Here's the spec:
it.only('should create listings or update them if they already exist, incrementing the quantity with the difference ' +
    'between the current sold_quantity and the initial_sold_quantity', (done) => {
  var iPhone = { ... };
  var samsung = { ... };
  request(app).post('/listings/myEndpoint').send([iPhone, samsung]).expect(200, {
    created: 1,
    updated: 1
  }, done);
});
exports.myEndpoint = (req, res) => {
  var listings = req.body;
  var created, updated = 0;
  listings.forEach(reqListing => {
    Listing.findOne({ listing_id: reqListing.listing_id })
      .then(foundListing => {
        if (!foundListing) {
          var newListing = reqListing;
          newListing.initial_sold_quantity = newListing.sold_quantity;
          Listing.create(newListing);
          created++;
        } else {
          var newQuantity = reqListing.sold_quantity - foundListing._doc.initial_sold_quantity;
          if (foundListing._doc.quantity != newQuantity) {
            foundListing._doc.quantity = newQuantity;
            foundListing.save();
            updated++;
          }
        }
      });
    return {
      created: created,
      updated: updated
    };
  });
};
THINGS I'VE TRIED:
Giving it more time. I tried changing the default timeout for Mocha tests, but it doesn't really matter if it's 2 seconds or 20; it still times out.
Isolating the update vs. the creation. It really doesn't matter whether I'm only updating a product or only creating one; it still times out.
Removing the logic. As far as I've checked, it doesn't really matter what happens inside the if/else blocks, because it still gives me a timeout. So even if the code looks like this:
exports.myEndpoint = (req, res) => {
  var listings = req.body;
  var created, updated = 0;
  listings.forEach(reqListing => {
    Listing.findOne({ listing_id: reqListing.listing_id })
      .then(foundListing => {
        if (!foundListing) {
          console.log("creating");
        } else {
          console.log("updating");
        }
      });
    return {
      created: created,
      updated: updated
    };
  });
};
it will still time out.
After asking some questions in the Nodeiflux Discord server, I managed to find a solution. It's maybe not the prettiest, because it doesn't make use of async/await, but I'm not supposed to change the style of the project too much, so I'll leave it without async/await (an equivalent async/await sketch follows after the code below).
First, the silly problem, which came up after fixing the question's main issue:
var created = 0, updated = 0;
instead of not initializing created (in var created, updated = 0; only updated is initialized; created stays undefined, so created++ yields NaN).
Second, using forEach with Promises inside didn't make much sense, because forEach discards whatever the callback returns. So I moved the return outside the loop and changed the forEach iteration to a map. I also used Promise.all() so that all the promises resolve before returning:
exports.upsert = (req, res) => {
  var listings = req.body;
  var created = 0, updated = 0;
  var allowedArrayLength = 50;
  return Promise.all(listings.map(reqListing =>
    Listing.findOne({
      listing_id: reqListing.listing_id
    })
      .then(foundListing => {
        if (!foundListing) {
          createListing(reqListing);
          created++;
        } else {
          var prevQuantity = foundListing.quantity;
          if (updateListing(reqListing, foundListing).quantity > prevQuantity) {
            updated++;
          }
        }
      })
  )).then(() => ({ created: created, updated: updated }));
};
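For reference, an equivalent version with async/await might look like the sketch below. It is just the same logic rewritten, using the same createListing and updateListing helpers from the project; I haven't run it against the test suite:

exports.upsert = async (req, res) => {
  var listings = req.body;
  var created = 0, updated = 0;
  // wait for every findOne (and the create/update it triggers) to settle
  await Promise.all(listings.map(async reqListing => {
    const foundListing = await Listing.findOne({ listing_id: reqListing.listing_id });
    if (!foundListing) {
      createListing(reqListing);
      created++;
    } else {
      const prevQuantity = foundListing.quantity;
      if (updateListing(reqListing, foundListing).quantity > prevQuantity) {
        updated++;
      }
    }
  }));
  return { created: created, updated: updated };
};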
I'm making a website that tracks kill statistics in a game, but I can only access 50 key-value pairs per request, so to make sure my data is accurate I want to make a request about every 30 seconds.
I feel like I may have gone wrong at some stage in the implementation. Maybe there is a way to make requests that doesn't involve the
express.get('/route/', (req, res) => { //code })
syntax and I just don't know about it. In short, what I want is for the database to be updated every 30 seconds without having to refresh the browser. I've tried wrapping my GET request in a function and putting it in setInterval, but it still doesn't run.
const express = require('express');
const zkillRouter = express.Router();
const axios = require('axios');
const zkillDbInit = require('../utils/zkillTableInit');
const sqlite3 = require('sqlite3').verbose();

zkillDbInit();
let db = new sqlite3.Database('zkill.db');

setInterval(() => {
  zkillRouter.get('/', (req, res) => {
    axios.get('https://zkillboard.com/api/kills/w-space/', {
      headers: {
        'accept-encoding': 'gzip',
        'user-agent': 'me',
        'connection': 'close'
      }
    })
      .then(response => {
        // handle success
        const wormholeData = response.data;
        res.status(200);
        // probably not the best way to do this but it's fine for now
        for (let i = 0; i < Object.keys(wormholeData).length; i++) {
          const currentZKillId = Object.keys(wormholeData)[i];
          const currentHash = Object.values(wormholeData)[i];
          let values = {
            $zkill_id: currentZKillId,
            $hash: currentHash
          };
          // puts it in the database
          db.run(`INSERT INTO zkill (zkill_id, hash) VALUES ($zkill_id, $hash)`,
            values,
            function(err) {
              if (err) {
                throw(err);
              }
            });
        }
      });
  });
}, 1000);

module.exports = zkillRouter;
One thing I have considered is that I don't necessarily need this functionality to be part of the same program. All I need is the database, so if I have to, I could run this code separately as a little standalone Node program that makes requests to the API and updates the database. I don't think that would be ideal, but if it's the only way to do what I want, I'll consider it.
Clarification: the .get() method is called on zkillRouter, which is an instance of express.Router() declared on line two. This in turn links back to my app.js file through an apiRouter, so the full route is localhost:5000/api/zkill/. That was a big part of the problem; I didn't know you could call axios.get without specifying a route, so I was stuck on this for a while.
I fixed it myself. I was using setInterval wrong, and the router statement wasn't needed, so I just got rid of it. I definitely need to tweak the interval so that I don't get so many SQL errors for violating the unique constraint; every 5 minutes should be enough, I think (see also the sketch after the code below). And don't throw inside the db.run callback; just log the error.
function myfunction() {
  axios.get('https://zkillboard.com/api/kills/w-space/', {
    headers: {
      'accept-encoding': 'gzip',
      'user-agent': 'me lol',
      'connection': 'close'
    }
  })
    .then(response => {
      // handle success
      const wormholeData = response.data;
      for (let i = 0; i < Object.keys(wormholeData).length; i++) {
        const currentZKillId = Object.keys(wormholeData)[i];
        const currentHash = Object.values(wormholeData)[i];
        let values = {
          $zkill_id: currentZKillId,
          $hash: currentHash
        };
        db.run(`INSERT INTO zkill (zkill_id, hash) VALUES ($zkill_id, $hash)`,
          values,
          function(err) {
            if (err) {
              return console.log(i);
            }
          });
      }
    });
}
setInterval(myfunction, 1000 * 30)
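As an alternative to stretching the interval just to dodge the unique-constraint errors, SQLite can skip duplicate rows at insert time. A minimal sketch, assuming zkill_id is the column carrying the UNIQUE constraint:

// INSERT OR IGNORE silently skips rows that would violate a unique constraint,
// so re-fetching the same kills every 30 seconds no longer produces SQL errors
db.run(`INSERT OR IGNORE INTO zkill (zkill_id, hash) VALUES ($zkill_id, $hash)`,
  values,
  function(err) {
    if (err) {
      console.log(err); // anything that still lands here is a real error, not a duplicate
    }
  });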
I'm writing a text editing app with Node.js & Express and want to achieve Google Docs-esque auto-saving whenever the user edits their text.
Currently I'm doing this by saving to the database with AJAX whenever the user presses a key within the textarea. As soon as I start typing at any decent speed, the saving process freezes up and most of the content is not saved.
It works perfectly when typing slowly, however.
I'm currently using mLab for MongoDB hosting; could this be the problem?
In fact, what is the best way to handle this task?
edit.ejs (front-end js):
$(document).ready(function() {
  $('#board-lyrics').keyup(updateLyrics);
  $('#board-title').keyup(updateLyrics);

  function updateLyrics() {
    let boardData = {
      title: $('#board-title').val(),
      content: $('#board-lyrics').val()
    };
    $.ajax({
      type: "POST",
      url: `./<%= id %>`,
      data: boardData,
      success: function(data) {
      },
      error: function() {
        console.log('error');
      }
    });
  }
});
app.js
app.post('/edit/:id', urlencodedParser, (req, res) => {
  let user = req.user;
  let boardId = req.params.id;
  let query = { "_id": req.user.id };
  let update = { "$set": {} };
  let index;
  for (let i = 0; i < user.boards.length; i++) {
    if (user.boards[i]._id == boardId) {
      index = i;
    }
  }
  update.$set[`boards.${index}.title`] = req.body.title;
  update.$set[`boards.${index}.content`] = req.body.content;
  let options = { new: true };
  User.findOneAndUpdate(query, update, options, function(err, doc) {
    console.log(query);
    console.log(update);
    console.log(doc);
  });
});
Okay, two things here.
Firstly, it isn't a good idea to update your DB on every keystroke. Think about it: you are making a POST request to your server, with a payload, more than once per second, and touching the DB every time. Not ideal. So either keep the data cached and save it once it crosses a threshold (say, after one paragraph or X number of characters), or:
Secondly, you can make some tweaks on the front-end side. Make sure you catch only valid or wanted keystrokes. Use reactive programming: check out RxJS and filter out invalid characters, or emit only at a certain interval, as in the sketch below. Hope this helps.
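For instance, here is a minimal debounce sketch in plain jQuery (no RxJS; the 1000 ms delay is an arbitrary choice). The save fires only after the user has stopped typing for a second, rather than on every keystroke:

$(document).ready(function() {
  // debounce: postpone calling fn until `delay` ms pass without another call
  function debounce(fn, delay) {
    let timer;
    return function() {
      clearTimeout(timer);
      timer = setTimeout(fn, delay);
    };
  }

  // updateLyrics is the same save function from the question's edit.ejs
  const debouncedSave = debounce(updateLyrics, 1000);
  $('#board-lyrics').keyup(debouncedSave);
  $('#board-title').keyup(debouncedSave);
});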
I need to fetch from two different MongoDB collections (db.stats and db.tables) for the same request req.
Now, in the code below, I am nesting the queries within the callback function.
router.post('/', (req, res) => {
  let season = String(req.body.year);
  let resultData, resultTable;
  db.stats.findOne({ Year: season }, function(err, data) {
    if (data) {
      resultData = getResult(data);
      db.tables.findOne({ Year: season }, function(err, data) {
        if (data) {
          resultTable = getTable(data);
          res.render('index.html', {
            data: {
              result: resultData,
              message: "Working"
            }
          });
        } else {
          console.log("Error in Tables");
        }
      });
    } else {
      console.log("Error in Stats");
    }
  });
});
This code works, but a few things don't seem right. So my question is:
How do I avoid this nested structure? Not only does it look ugly, but while I am processing these requests the client side is unresponsive, and that is bad.
What you have right now is known as callback hell in JavaScript. This is where Promises come in handy.
Here's what you can do:
router.post('/', (req, res) => {
  let season = String(req.body.year);
  var queries = [
    db.stats.findOne({ Year: season }),
    db.tables.findOne({ Year: season })
  ];

  Promise.all(queries)
    .then(results => {
      if (!results[0]) {
        console.log("Error in Stats");
        return; // bad response; a better way is to return status 500 here
      } else if (!results[1]) {
        console.log("Error in Tables");
        return; // bad response; a better way is to return status 500 here
      }
      let resultData = getResult(results[0]);
      let resultTable = getTable(results[1]);
      res.render('index.html', {
        data: {
          result: resultData,
          message: "Working"
        }
      });
    })
    .catch(err => {
      console.log("Error in getting queries", err);
      // bad response; a better way is to return status 500 here
    });
});
It looks like you are using Mongoose as your ODM to access your MongoDB database. When you don't pass a callback as the second parameter, the value returned by the call (e.g. db.stats.findOne({ Year: season })) is a Promise. We put all of these unresolved Promises in an array and call Promise.all to resolve them. By using Promise.all, you wait until all of your database queries have executed before moving on to render your index.html view. The results of the database calls are stored in the results array in the same order as the queries array.
Also, I would recommend doing something like res.status(500).send("A descriptive error message here") whenever there is an error on the server side in addition to the console.log calls.
The above will solve your nested-structure problem, but the latter problem (the client side being unresponsive while processing these requests) will still be there. To solve it, you need to first identify your bottleneck: which function calls are taking up most of the time? Since you are using findOne, I do not think the queries themselves are the bottleneck, unless the connection between your server and the database has latency issues.
I am going to assume that the POST request is not done through AJAX, since you have res.render in it, so this problem shouldn't be caused by any client-side code. I suspect that one of getResult or getTable (or both) is taking up a significant amount of time, considering that it makes the client side unresponsive. How large is the data when you query your database? If it is so large that processing takes a significant amount of time, I would recommend changing the way the request is made: use AJAX on the front-end to make a POST request to the back-end, and have the back-end return the response as a JSON object (a sketch of this follows below). That way the page in the browser does not need to reload, and you get a better user experience.
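A minimal sketch of that change on the server side, reusing the route and helper names from the question (the front-end AJAX success callback would then update the DOM itself):

router.post('/', (req, res) => {
  let season = String(req.body.year);
  Promise.all([
    db.stats.findOne({ Year: season }),
    db.tables.findOne({ Year: season })
  ]).then(results => {
    // respond with JSON instead of rendering a page,
    // so the AJAX caller can update the DOM without a reload
    res.json({
      result: getResult(results[0]),
      table: getTable(results[1]),
      message: "Working"
    });
  }).catch(err => {
    res.status(500).send("A descriptive error message here");
  });
});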
The MongoDB driver returns a promise if you don't pass a callback, so you can use async/await:
router.post('/', async (req, res) => {
  let season = String(req.body.year);
  let resultData, resultTable;
  try {
    const [data1, data2] = await Promise.all([
      db.stats.findOne({ Year: season }),
      db.tables.findOne({ Year: season })
    ]);
    if (data1 && data2) {
      resultData = getResult(data1);
      resultTable = getTable(data2);
      return res.render('index.html', {
        data: {
          result: resultData,
          message: "Working"
        }
      });
    }
    res.send('error');
    console.log("Error");
  } catch (err) {
    res.send('error');
    console.log("Error");
  }
});