I have an Express API where one route makes several passes over a long JSON object to gather all the required data:
router.get(
  "/:server/:maxCraftPrice/:minBenef/:from/:to",
  checkJwt,
  async (req, res) => {
    const getAllAstuces = new Promise(async (resolve, reject) => {
      const { EQUIPMENTS_DIR, RESOURCES_DIR } = paths[req.params.server];
      const astuces = [];
      const { from, to, maxCraftPrice, minBenef } = req.params;
      const filteredEquipments = getItemByLevel(from, to);
      for (const equipment in filteredEquipments) {
        // parsing and push to astuces array
      }
      resolve(astuces);
    });
    const resource = await getAllAstuces;
    return res.json(resource);
  }
);
Now on my website, when someone goes to the page associated with this route, every other request is locked in a queue while the data is loading.
I tried to add a Promise to handle this, but nothing changed.
Is there a way to handle requests simultaneously, or should I refactor that route to make it faster?
If your request takes a long time to process, it will block all other requests until it is done. If you can make the request take less processing time, that's a good place to start, but you're probably going to need to take further steps to make multiple requests faster.
There are various methods for getting around this situation. This article describes a few approaches.
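One of those approaches, when the slowness is CPU-bound work like crunching a large JSON object, is to split the loop into batches and yield back to the event loop between them, so other pending requests get a chance to run. A rough sketch (`processInChunks` and the transform callback are made-up names, not from the question's code):

```javascript
// Process a large array in small batches, yielding to the event loop
// between batches so other incoming requests are not starved.
async function processInChunks(items, transform, chunkSize = 100) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(transform(item));
    }
    // let pending I/O callbacks (other requests) run before the next batch
    await new Promise((resolve) => setImmediate(resolve));
  }
  return results;
}
```

The route handler could then `await processInChunks(...)` instead of running one long synchronous loop. Note this doesn't make the total work any shorter; it just stops a single request from monopolizing the event loop.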
I'm new to Next.js and I'm trying to understand the suggested structure and dealing with data between pages or components.
For instance, inside my page home.js, I fetch an internal API called /api/user.js which returns some user data from MongoDB. I am doing this by using fetch() to call the API route from within getServerSideProps(), which passes various props to the page after some calculations.
From my understanding, this is good for SEO, since props get fetched/modified server-side and the page gets them ready to render. But then I read in the Next.js documentation that you should not use fetch() to call an API route in getServerSideProps(). So what am I supposed to do to comply with good practice and good SEO?
The reason I'm not doing the required calculations for home.js in the API route itself is that I need more generic data from this API route, as I will use it in other pages as well.
I also have to consider caching, which client-side is very straightforward using SWR to fetch an internal API, but server-side I'm not yet sure how to achieve it.
home.js:
export default function Page({ prop1, prop2, prop3 }) {
  // render etc.
}

export async function getServerSideProps(context) {
  const session = await getSession(context)
  let data = null
  var aArray = [], bArray = [], cArray = []
  const { db } = await connectToDatabase()

  function shuffle(array) {
    var currentIndex = array.length, temporaryValue, randomIndex;
    while (0 !== currentIndex) {
      randomIndex = Math.floor(Math.random() * currentIndex);
      currentIndex -= 1;
      temporaryValue = array[currentIndex];
      array[currentIndex] = array[randomIndex];
      array[randomIndex] = temporaryValue;
    }
    return array;
  }

  if (session) {
    const hostname = process.env.NEXT_PUBLIC_SITE_URL
    const options = { headers: { cookie: context.req.headers.cookie } }
    const res = await fetch(`${hostname}/api/user`, options)
    const json = await res.json()
    if (json.data) { data = json.data }
    // do some math with data ...
    // connect to MongoDB and do some comparisons, etc.
But then I read in the Next.js documentation that you should not use fetch() to call an API route in getServerSideProps().
You want to use the logic that's in your API route directly in getServerSideProps, rather than calling your internal API. That's because getServerSideProps runs on the server just like the API routes (making a request from the server to the server itself would be pointless). You can read from the filesystem or access a database directly from getServerSideProps. Note that this only applies to calls to internal API routes - it's perfectly fine to call external APIs from getServerSideProps.
From Next.js getServerSideProps documentation:
It can be tempting to reach for an API Route when you want to fetch
data from the server, then call that API route from
getServerSideProps. This is an unnecessary and inefficient approach,
as it will cause an extra request to be made due to both
getServerSideProps and API Routes running on the server.
(...) Instead, directly import the logic used inside your API Route
into getServerSideProps. This could mean calling a CMS, database, or
other API directly from inside getServerSideProps.
(Note that the same applies when using getStaticProps/getStaticPaths methods)
Here's a small refactor example that allows you to have logic from an API route reused in getServerSideProps.
Let's assume you have this simple API route.
// pages/api/user
export default async function handler(req, res) {
  // Using a fetch here but could be any async operation to an external source
  const response = await fetch(/* external API endpoint */)
  const jsonData = await response.json()
  res.status(200).json(jsonData)
}
You can extract the fetching logic to a separate function (can still keep it in api/user if you want), which is still usable in the API route.
// pages/api/user
export async function getData() {
  const response = await fetch(/* external API endpoint */)
  const jsonData = await response.json()
  return jsonData
}

export default async function handler(req, res) {
  const jsonData = await getData()
  res.status(200).json(jsonData)
}
But it also allows you to reuse the getData function in getServerSideProps.
// pages/home
import { getData } from './api/user'

//...

export async function getServerSideProps(context) {
  const jsonData = await getData()
  //...
}
You want to use the logic that's in your API route directly in getServerSideProps, rather than calling your internal API. That's because getServerSideProps runs on the server just like the API routes (making a request from the server to the server itself would be pointless). You can read from the filesystem or access a database directly from getServerSideProps.
I admit that what you say is correct, but the problem still exists. Assume your backend is already written and your APIs are secured: extracting the logic out of a secured, finished backend is annoying and wastes time and energy. Another disadvantage is that by extracting logic out of the backend, you must rewrite your own code to handle errors, authenticate users, and validate requests, all of which already exist in your backend. I wondered whether it is possible to call APIs within Next.js without extracting the logic from the middlewares. The answer is yes; here is my solution:
npm i node-mocks-http
import httpMocks from "node-mocks-http";
import newsController from "./api/news/newsController";
import logger from "../middlewares/logger";
import dbConnectMid from "../middlewares/dbconnect";
import NewsCard from "../components/newsCard";

export default function Home({ news }) {
  return (
    <section>
      <h2>Latest News</h2>
      <NewsCard news={news} />
    </section>
  );
}

export async function getServerSideProps() {
  let req = httpMocks.createRequest();
  let res = httpMocks.createResponse();

  async function callMids(req, res, index, ...mids) {
    index = index || 0;
    if (index <= mids.length - 1)
      await mids[index](req, res, () => callMids(req, res, ++index, ...mids));
  }

  await callMids(
    req,
    res,
    null,
    dbConnectMid,
    logger,
    newsController.sendAllNews
  );

  return {
    props: { news: res._getJSONData() },
  };
}
Important note: if you use my code, don't forget to use await next() instead of next() in all of your middlewares, or else you will get an error.
Another solution: next-connect has a run method that does something like my code, but I personally had some problems with it. Here is its link:
next-connect run method to call Next.js API routes in getServerSideProps
Just try to use useSWR, example below
import useSWR from 'swr'
import React from 'react';

// important to return only the result, not the Promise
const fetcher = (url) => fetch(url).then((res) => res.json());

const Categories = () => {
  // getting data and error
  const { data, error } = useSWR('/api/category/getCategories', fetcher)

  if (error) return <div>Failed to load</div>
  if (!data) return <div>Loading...</div>

  // {data} is complete at this point, it's ok!
  // your code here to make something with {data}
  return (
    <div>
      {/* something here, for example {data.name} */}
    </div>
  )
}

export default Categories
Please notice that fetch only supports absolute URLs, which is why I don't like to use it.
P.S. According to the docs, you can even use useSWR with SSR.
Environment: nodejs 17.2, expressjs 4.17
Task: Data arrives at a URL of the type "/user-actions" from different servers at a rate of about 2 requests per second. It is necessary to aggregate the requests and send them to another server once a second.
For example:
Request #1: {userId: 1, action: "hitOne"}
Request #2: {userId: 2, action: "hitFive"}
Request #3: {userId:1, action: "hitFive"}
It is necessary to get 2 objects
const data = [{userId: 1, action: "hitOne"}, {userId: 2, action: "hitFive"}]
and
const data = [{userId: 1, action: "hitFive"}]
Each of these objects is sent to another server 1 time per second, something like this
http.post('http://newserver.url/user-actions', {data});
I was thinking of making a variable that records everything that comes in with the requests, and sending that variable to the new server once a second on a timer.
But something tells me there will be problems with the variable (for example, due to concurrent requests), so it will not always hold the data I expect, or something strange will come out of the timer.
How to implement such a scenario correctly?
So you're creating some sort of a proxy service. You have two potential issues:
data persistence and
retries and pending requests.
I think your best bet would be to do something like this:
in this particular service (with the API route), you just receive requests, and store them somewhere like Redis or RabbitMQ or Amazon SQS.
in another service, you deal with retries, posting etc.
Even if you don't split it up into two services, you still want to put the data in a specialised storage service in cases like this. E.g. if your process crashes, you lose whatever data you are holding in memory. It also simplifies all the management details: storing, ordering what came first, and tracking which requests are pending are all super easy to deal with in a RabbitMQ-type service.
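For illustration, the hand-off from the route to such a store can be tiny. This sketch assumes an ioredis-style client with an rpush method, and the list name is made up:

```javascript
// Append each incoming payload to a Redis list; a separate consumer
// process can then pop and forward batches at its own pace.
async function enqueueUserAction(redisClient, payload) {
  await redisClient.rpush('user-actions-queue', JSON.stringify(payload));
}
```

The Express handler would just call `enqueueUserAction(client, req.body)` and respond immediately, leaving retries and pacing to the consumer.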
But let's simplify things and hold them in memory. Now you have to deal with all these things yourself.
So here's a naive proxy service.
const axios = require('axios');
const axiosRetry = require('axios-retry');

const REQUEST_INTERVAL = 1000; // every second
const MAX_PARALLEL_REQUESTS = 3;

axiosRetry(axios, { retries: 3 });

const bucket = [];
let exportingInterval;
let currentRequestsCount = 0;

const logRequest = (payload) => bucket.push(payload);

const makeRequest = (payload) => axios.post('http://remote-service/user-actions', payload);

const sendData = () => {
  // first, make sure you don't make more than X parallel requests
  if (currentRequestsCount >= MAX_PARALLEL_REQUESTS) {
    return;
  }

  // clear the bucket
  const data = bucket.splice(0, bucket.length);
  if (!data.length) {
    return;
  }

  // send the data, and make sure you handle the failure
  currentRequestsCount = currentRequestsCount + 1;
  makeRequest(data)
    .then(() => currentRequestsCount = currentRequestsCount - 1)
    .catch(() => {
      // What to do now? We failed three times.
      // Let's put everything back in the bucket and try in the next interval.
      bucket.splice(bucket.length, 0, ...data);
      currentRequestsCount = currentRequestsCount - 1;
    });
}

const startExporting = () => exportingInterval = setInterval(() => sendData(), REQUEST_INTERVAL);
const stopExporting = () => clearInterval(exportingInterval);

module.exports = {
  logRequest,
  startExporting,
  stopExporting,
}
Now, you would use it like this:

const proxyService = require('./proxy-service');

const app = express();
proxyService.startExporting();

// ...

app.post('/user-data', (req, res) => {
  proxyService.logRequest(req.body);
  res.end();
});
Now, this is just a simple example:
You do need to make sure the retry policy is ok. You have to make sure you don't DoS whatever service you're sending the data to.
You want to make sure you limit how many objects you send per call.
Maybe that 1-second interval is not a good thing: what if sending the data takes longer than a second?
What if requests start piling up? My simple counter only counts to 3; maybe it's more complicated than that.
Also, the calls to startExporting and stopExporting should go in some common place where you boot the app, and where you clean up in case of a graceful shutdown.
But it gives you an idea of how it can be done.
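On the batch-size point, capping how much the bucket drains per tick could look like this; `drainBatch` and `MAX_BATCH_SIZE` are illustrative names, and the splice leaves any excess in the bucket for the next interval:

```javascript
const MAX_BATCH_SIZE = 50; // illustrative cap; tune for the receiving server

// Take at most maxSize items out of the bucket; whatever
// remains is picked up on the next interval tick.
function drainBatch(bucket, maxSize = MAX_BATCH_SIZE) {
  return bucket.splice(0, maxSize);
}
```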
It is a trade-off between time and data.
If you want to ensure you have enough data, you can use the Promise.all() function. When both requests have responded, you call the API to send the data. This ensures you have enough data, but it won't ensure that data is sent to the other server once a second.
let pr1 = request1();
let pr2 = request2();
const data = await Promise.all([pr1, pr2]);
requestToAnotherServer(data);
If you want to ensure that the server sends data to the other server once a second, you can set a timer: when it fires, you send whatever data the server has received so far. But this won't ensure that you have enough data.
let sendData = [];

setInterval(() => {
  let pr1 = request1().then((data) => { sendData.push(data); });
  let pr2 = request2().then((data) => { sendData.push(data); });

  requestToAnotherServer(sendData);
  sendData = [];
}, 1000)
I have a sample app where users can access some dynamic data via different URLs.
The workflow is like this:
when a user requests get_data?id=1234567
it first checks the DB for existing data
if there is none, it generates a random value
then, if other users request the same URL within a short time (say 10 min), it returns the value that was already generated
if one of the users sends a clear request, the value is cleared from the DB.
The bug: if 2 users request the same URL at the same time, since the DB query takes time, both requests run steps 1 and 2 concurrently and create a different value for each user.
How can I make sure that, within a short period, the same value is always generated for all users?
Although NodeJS is single-threaded and does not have the problem of synchronization between multiple threads, its asynchronous event model can still require you to implement some kind of locking mechanism to synchronize concurrent async operations in certain situations (like in your case).
There are a number of libraries that provide this functionality, e.g. async-mutex. Here's a very basic example of what your code could look like:
const express = require('express');
const app = express();
const Mutex = require('async-mutex').Mutex;

const locks = new Map();

app.get('/get_data', async (req, res) => {
  const queryId = req.query.id;
  if (!queryId) {
    // handle empty queryId ...
  }

  if (!locks.has(queryId)) {
    locks.set(queryId, new Mutex());
  }

  const lockRelease = await locks
    .get(queryId)
    .acquire();

  try {
    // do the rest of your logic here
  } catch (error) {
    // handle error
  } finally {
    // always release the lock
    lockRelease();
  }
});

app.listen(4000, function () {
  console.log("Server is running at port 4000");
});
I am working on building a blog API for a practice project, but am using the data from an external API. (There is no authorization required; I am using the JSON data with the permission of the developer.)
The idea is that the user can enter multiple topic parameters into my API. Then, I make individual requests to the external API for the requested info.
For each topic query, I would like to:
Get the appropriate data from the external API based on the params entered (using a GET request to the URL)
Add the response data to my own array that will be displayed at the end.
Check if each object already exists in the array (to avoid duplicates).
res.send the array.
My main problem, I think, has to do with understanding scope and also promises in Axios. I have tried to read up on the concept of promise-based requests, but I can't seem to understand how to apply it to my code.
I know my code is an overall mess, but if anybody could explain how I can extract the data from the Axios function, I think it could help me get the ball rolling again.
Sorry if this is a super low-level or obvious question - I am self-taught and am still very much a newbie!~ (my code is a pretty big mess right now haha)
Here is a screenshot of the bit of code I need to fix:
router.get('/:tag', function(req, res){
  const tagString = req.params.tag;
  const tagArray = queryString.split(',');

  const displayPosts = tagArray.map(function(topic){
    const baseUrl = "https://info.io/api/blog/posts";
    return axios
      .get(baseUrl, {
        params: {
          tag: tag
        }
      })
      .then(function(response) {
        const responseData = response.data.posts;
        if (tag === (tagArray[0])){
          displayPosts.push(responseData);
        } else {
          responseData.forEach(function(post){
            // I will write function to check if post already exists in responseData array. Else, add to array
          });
        } // End if/else
      })
      .catch(function(err) {
        console.log(err.message);
      }); // End Axios
  }); // End Map Function

  res.send(displayPosts);
});
Node.js is single-threaded and non-blocking, and according to your code you respond with the result before fetching the data.
You are using .map, which fires off n requests.
Use Promise.all (or Promise.allSettled) to fetch all the requests.
After that, inside the .then of Promise.all (or Promise.allSettled), map your results.
Then respond to the user with the mapped data.
router.get('/:tag', function (req, res) {
  const tagString = req.params.tag;
  const tagArray = tagString.split(',');
  const baseUrl = "https://info.io/api/blog/posts";

  const topicsPromises = tagArray.map((topic) => {
    return axios.get(baseUrl, {
      params: {
        tag: topic
      }
    });
  });

  Promise.all(topicsPromises).then(topicsArr => {
    // all the data has been fetched successfully
    // loop through the array and handle your business logic for each topic
    // send the required data to the user using res.send()
  }).catch(err => {
    // error while fetching the data
  });
});
Your code will be something like this.
Note: first read up on Promise.all and how it works.
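To expand on the Promise.allSettled mention: unlike Promise.all, it never rejects early, so one failed topic request doesn't throw away the successful ones. A small sketch (fetchAllTolerant is a made-up helper):

```javascript
// Collect results from several async lookups, keeping the successes
// even when some of them fail (unlike Promise.all, which rejects early).
async function fetchAllTolerant(promises) {
  const settled = await Promise.allSettled(promises);
  return {
    fulfilled: settled.filter((s) => s.status === 'fulfilled').map((s) => s.value),
    errors: settled.filter((s) => s.status === 'rejected').map((s) => s.reason),
  };
}
```

In the route above, you could pass `topicsPromises` to this helper, send the fulfilled responses to the user, and log the errors.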
Just playing around with my first Node/Express App.
What I am trying to do:
On submitting a form to /wordsearch (POST) the Wikipedia API should be called with the submitted keyword. After getting back the response from Wikipedia, I want to present it back to the user in a view. Basic stuff.
But I am missing some basic understanding of how to arrange that in Node/JS. I have read about callbacks and promises lately and understand the concepts in theory, but I seem to mix things up when trying to put them into code. If someone could shed light on where I am going wrong, that would be highly appreciated.
Approach 1:
This is the controller function that is hit on submitting the form:
exports.searchSources = (req, res) => {
  const term = req.body.searchTerm
  const url = `https://en.wikipedia.org/w/api.php?action=opensearch&search=${term}&limit=10&namespace=0&format=json`
  const client = new Client()
  client.get(url, function (data, response) {
    // this causes the error
    res.json(data)
  })
}
=> Error: Can't set headers after they are sent.
I know that the error stems from trying to set response headers twice, or when the response is already in a certain state, but I don't see where that happens here. How can I wait for the result of the Wiki request and have it available in the controller function so that I can render it?
Approach 2:
Again, the controller function:
exports.searchSources = (req, res) => {
  const term = req.body.searchTerm
  const url = `https://en.wikipedia.org/w/api.php?action=opensearch&search=${term}&limit=10&namespace=0&format=json`
  const client = new Client()
  const data = client.get(url, function (data, response) {
    return data
  })
  res.json(data)
}
=> TypeError: Converting circular structure to JSON at JSON.stringify ()
This was just a try to make the response from Wiki available in the controller function.
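That error is a hint at what's going wrong: client.get returns the underlying request object immediately (which contains circular references), while the Wikipedia data only ever exists inside the callback. Tying this back to the promise reading mentioned above, one option is to wrap the callback API in a Promise so the controller can await the data. A sketch, with `client` standing in for whatever REST client is in use and `getJson` a made-up helper:

```javascript
// Wrap a callback-style client.get(url, cb) call in a Promise so the
// caller can await the parsed data instead of the raw request object.
function getJson(client, url) {
  return new Promise((resolve, reject) => {
    try {
      client.get(url, (data) => resolve(data));
    } catch (err) {
      reject(err);
    }
  });
}
```

The controller would then be declared async and do `const data = await getJson(client, url); res.json(data)`, so res.json only runs once the data has actually arrived.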