Superagent with absolute url prefix - node.js

I've noticed that I'm writing http://localhost every time I want to run a Node test with superagent.
import superagent from 'superagent';
const request = superagent.agent();
request
  .get('http://localhost/whatever')
  .end((err, res) => { ... });
Is there any way of avoiding the localhost part?
The furthest I've gone is to avoid hardcoding the host into the request:
const baseUrl = 'http://localhost:3030';
request
  .get(`${baseUrl}/whatever`)
But I still have to carry the baseUrl around with the agent every time.

While not as recently updated a package as superagent-absolute, superagent-prefix is officially recommended and has the highest adoption as of 2020.
It is such a simple package that I would not be concerned with the lack of updates.
Example usage:
import superagent from "superagent"
import prefix from "superagent-prefix"

const baseURL = "https://foo.bar/api/"
const client = superagent.agent().use(prefix(baseURL))
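For completeness, a quick usage sketch with that client (the /whatever path and the response handling are illustrative, not from the package docs):
client
  .get("/whatever") // the plugin prepends baseURL before the request is sent
  .then(res => console.log(res.status))
  .catch(err => console.error(err))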

TL;DR: superagent-absolute does exactly that.
Detailed:
You can create one abstraction layer on top of superagent.
function superagentAbsolute(agent) {
  return baseUrl => ({
    get: url => url.startsWith('/') ? agent.get(baseUrl + url) : agent.get(url),
  });
}
⬑ That would override the agent.get when called with a starting /
global.request = superagentAbsolute(agent)('http://localhost:3030');
Now you would need to do the same for: DELETE, HEAD, PATCH, POST and PUT.
https://github.com/zurfyx/superagent-absolute/blob/master/index.js
const OVERRIDE = 'delete,get,head,patch,post,put'.split(',');

const superagentAbsolute = agent => baseUrl => (
  new Proxy(agent, {
    get(target, propertyName) {
      return (...params) => {
        if (OVERRIDE.indexOf(propertyName) !== -1
            && params.length > 0
            && typeof params[0] === 'string'
            && params[0].startsWith('/')) {
          const absoluteUrl = baseUrl + params[0];
          return target[propertyName](absoluteUrl, ...params.slice(1));
        }
        return target[propertyName](...params);
      };
    },
  })
);
Or you can simply use superagent-absolute.
const superagent = require('superagent');
const superagentAbsolute = require('superagent-absolute');

const agent = superagent.agent();
const request = superagentAbsolute(agent)('http://localhost:3030');

it('should display "It works!"', (done) => {
  request
    .get('/') // Requests "http://localhost:3030/".
    .end((err, res) => {
      expect(res.status).to.equal(200);
      expect(res.body).to.eql({ msg: 'It works!' });
      done();
    });
});

Related

How to make a GET request for a single record with Axios and Node.js/Express

I'm trying to make a GET request to an external API (the Rick and Morty API). The objective is to set up a GET request for a single character, for example the character with id=3. At the moment my endpoint is:
Routes file:
import CharacterController from '../controllers/character_controller'
const routes = app.Router()
routes.get('/:id', new CharacterController().get)
export default routes
Controller file:
async get (req, res) {
  try {
    const { id } = req.params
    const oneChar = await axios.get(`https://rickandmortyapi.com/api/character/${id}`)
    const filteredOneChar = oneChar.data.results.map((item) => {
      return {
        name: item.name,
        status: item.status,
        species: item.species,
        origin: item.origin.name
      }
    })
    console.log(filteredOneChar)
    return super.Success(res, { message: 'Successfully GET Char request response', data: filteredOneChar })
  } catch (err) {
    console.log(err)
  }
}
The purpose of the map function is to retrieve only specific character data fields.
But the code above doesn't work. Please let me know any suggestions, thanks!
First of all, I don't know why your controller is a class. Revert that and export your function like so:
const axios = require('axios');

// "getCharacter" is more descriptive than "get"; I would suggest naming
// your functions with more descriptive text.
exports.getCharacter = async (req, res) => {
Then in your routes file you can easily import it and attach it to your route handler:
const { getCharacter } = require('../controllers/character_controller');
routes.get('/:id', getCharacter);
Your routes imports also seem off; why are you creating a new Router from app? You should be calling:
const express = require('express');
const routes = express.Router();
Next, go back to your controller. Your logic was off: if you check the API you will notice that the character/:id endpoint responds with a single character, so .results doesn't exist on the response data. The following will give you what you're looking for.
exports.getCharacter = async (req, res) => {
  try {
    const { id } = req.params;
    const oneChar = await axios.get(
      `https://rickandmortyapi.com/api/character/${id}`
    );
    console.log(oneChar.data);
    // Return the name, status, species, and origin keys from oneChar.
    const { name, status, species, origin } = oneChar.data;
    const filteredData = Object.assign({}, { name, status, species, origin });
    res.send(filteredData);
  } catch (err) {
    return res.status(400).json({ message: err.message });
  }
};

Pass query from Link to server: on first load the query value is undefined, after a reload the correct query arrives

I'm trying to create an API call to the external Adobe Stock service.
As in the title, on the first load the query I get from the Link router is undefined, but after reloading the page it works correctly. My main page:
<Link
  href={{
    pathname: "/kategoria-zdjec",
    query: images.zdjecia_kategoria
  }}
  as={`/kategoria-zdjec?temat=${images.zdjecia_kategoria}`}
  className={classes.button}>
</Link>
and my server
app
  .prepare()
  .then(() => {
    server.get("/kategoria-zdjec", async (req, res) => {
      const temat = await req.query.temat;
      console.log(temat)
      const url = `https://stock.adobe.io/Rest/Media/1/Search/Files?locale=pl_PL&search_parameters[words]=${temat}&search_parameters[limit]=24&search_parameters[offset]=1`;
      try {
        const fetchData = await fetch(url, {
          headers: { ... }
        });
        const objectAdobeStock = await fetchData.json();
        res.json(objectAdobeStock);
        const totalObj = await objectAdobeStock.nb_results;
        const adobeImages = await objectAdobeStock.files;
      } catch (error) {
        console.log(error);
      }
    });
and this is what getInitialProps looks like on the Next.js page:
Zdjecia.getInitialProps = async ({ req }) => {
  const res = await fetch("/kategoria-zdjec");
  const json = await res.json();
  return { total: json.nb_results, images: json.files };
}
I think it is a problem caused by asynchronicity.
I think this might be due to the fact that you are using fetch, which is actually part of the Web API, and the call fails when executed on the server.
You could either use isomorphic-fetch, which keeps the fetch API consistent between client and server, or use node-fetch when fetch is called on the server:
Zdjecia.getInitialProps = async ({ req, isServer }) => {
  const fetch = isServer ? require('node-fetch') : window.fetch;
  const res = await fetch("/kategoria-zdjec");
  const json = await res.json();
  return { total: json.nb_results, images: json.files };
}
This problem is solved; the issue was in another part of my app, in the state management. I just created new variables and passed the state value to the Link.

fs.writeFile is making a POST request loop infinitely within my Express app

I have this current server code:
const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")

const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))

router.get("/", async (req, res) => {
  res.send(todos)
})

router.post("/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
})
client:
console.log("Hello world!")
const somedata = {
title: "A new boy",
description: "Recieved from the client"
}
const main = async () => {
const response1 = await fetch("http://localhost:3000/todo", {
method: "GET",
})
const data1 = await response1.json()
const response2 = await fetch("http://localhost:3000/todo/new", {
method: "POST",
body: JSON.stringify(somedata),
headers: {
'Content-Type': 'application/json',
"Accept": "application/json"
}
})
const data2 = await response2.json()
return { data1, data2 }
}
main().then(data => console.log(data))
When I make a POST request to create a new entity, the browser just loops the request over and over until I have to quit the server manually. This does not happen if I use Postman, for some reason. Does anybody see any obvious error here with how the writeFile method is used, and why it continuously reloads the browser and keeps pushing POST requests?
Thanks! :)
I had the same problem! And it took me about an hour to understand what my problem was:
If you use the Live Server extension, the server will restart every time you write, change or delete a file in the project folder!
So, if your Node app writes a file, Live Server will restart and the app writes the file again => loop.
In my case, I write a PDF file. All I had to do was tell the Live Server extension to ignore PDF files.
So I just added this to "settings.json":
"liveServer.settings.ignoreFiles":["**/*.pdf"]
fs.writeFile is an asynchronous function. So, to send a response after the file is written, you must do it in the callback. And of course, don't forget about error checking. I.e.:
router.post("/new", async (req, res) => {
const { title, description } = req.body
const todoItem = {
id: "3",
title,
description
}
todos.todos.push(todoItem)
const data = JSON.stringify(todos, null, 2)
fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, (err) => {
if(err) {
throw err;
}
res.status(201).json(todoItem)
})
})
Or you can use fs.writeFileSync as Muhammad mentioned earlier.
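For reference, a minimal sketch of that synchronous variant (reusing the todos, path and router from the question; note that writeFileSync blocks the event loop while writing):
router.post("/new", (req, res) => {
  const { title, description } = req.body
  const todoItem = { id: "3", title, description }
  todos.todos.push(todoItem)
  // Blocks until the file has been written, then respond.
  fs.writeFileSync(path.join(__dirname, "../db", "todolist.json"), JSON.stringify(todos, null, 2))
  res.status(201).json(todoItem)
})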
I think I found the problem. It seemed that the Live Server extension was messing things up when I had the client and server on separate ports, somehow making the browser refresh for every request made. I switched back to them sharing a port, which makes it work. I still have to find a good way of separating them later without this bug happening, but that is for another time.
Thanks for your help :)
I'm sharing my working sample. The body-parser dependency is needed to get the body in a POST request. Please don't change the order in server.js. Check it and let me know.
Also check whether your client code is in a loop.
My server.js
const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")
const app = express();
const bodyParser = require("body-parser")

const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))

app.use(bodyParser.json());
app.use("/", router)

router.get("/todo", async (req, res) => {
  res.send(todos)
})

router.post("/todo/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
});

app.listen(3000, () => {
  console.log("Server running on port 3000");
});
todolist.json
{
  "todos": []
}
I think you should use fs.writeFileSync(), or send the response inside the writeFile callback.

How do I get data out of a Node http(s) request?

How do I get the data from a https request outside of its scope?
Update
I've seen Where is body in a nodejs http.get response?, but it doesn't answer this question. In fact, that question isn't answered accurately, either. In the accepted answer (posted by the asker), a third-party library is used. Since the library returns an object different from that returned by http.get(), it doesn't answer the question.
I tried to set a variable to the return value of http.get() using await, but that returns an http.ClientRequest and doesn't give me access to the response data that I need.
I'm using Node v8.9.4 with Express and the https module to request data from Google's Custom Search.
I have two routes. One for a GET request and one for a POST request used when submitting a form on the front page. They both basically serve the same purpose... request the data from CSE and present the data as a simple JSON string. Rather than repeat myself, I want to put my code for the CSE request into a function and just call the function within the callback for either route.
I thought about returning all the way up from the innermost callback, but that won't work because it wouldn't get to the request's error event handler or the necessary .end() call.
Here's a subset of the actual code:
app.get('/api/imagesearch/:query', newQuery)
app.post('/', newQuery)

function newQuery (req, res) {
  let query = req.body.query || req.params.query
  console.log(`Search Query: ${query}`)
  res.status(200)
  res.set('Content-Type', 'application/json')
  // This doesn't work
  let searchResults = JSON.stringify(cseSearch(req))
  res.end(searchResults)
}
function cseSearch (request) {
  let cseParams = '' +
    `?q=${request.params.query}` +
    `&cx=${process.env.CSE_ID}` +
    `&key=${process.env.API_KEY}` +
    '&num=10' +
    '&safe=high' +
    '&searchType=image' +
    `&start=${request.query.offset || 1}`

  let options = {
    hostname: 'www.googleapis.com',
    path: '/customsearch/v1' + encodeURI(cseParams)
  }

  let cseRequest = https.request(options, cseResponse => {
    let jsonString = ''
    let searchResults = []

    cseResponse.on('data', data => {
      jsonString += data
    })

    cseResponse.on('end', () => {
      let cseResult = JSON.parse(jsonString)
      let items = cseResult.items
      items.map(item => {
        let resultItem = {
          url: item.link,
          snippet: item.title,
          thumbnail: item.image.thumbnailLink,
          context: item.image.contextLink
        }
        searchResults.push(resultItem)
      })
      // This doesn't work... wrong scope, two callbacks deep
      return searchResults
    })
  })

  cseRequest.on('error', e => {
    console.log(e)
  })

  cseRequest.end()
}
If you're curious, it's for a freeCodeCamp project: Image Search Abstraction Layer
Using a promise solves this issue.
cseSearch(req).then(searchResults => {
  res.end(JSON.stringify(searchResults))
}).catch(err => {
  res.status(500).end()
})
function cseSearch (request) {
  return new Promise((resolve, reject) => {
    // ...your http request code
    cseResponse.on('end', () => {
      let cseResult = JSON.parse(jsonString)
      let items = cseResult.items
      items.map(item => {
        let resultItem = {
          url: item.link,
          snippet: item.title,
          thumbnail: item.image.thumbnailLink,
          context: item.image.contextLink
        }
        searchResults.push(resultItem)
      })
      resolve(searchResults);
    })
  })
}
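Since cseSearch now returns a promise, the route handler could equally use async/await; a minimal sketch, assuming the same Express routes as in the question:
async function newQuery (req, res) {
  try {
    // cseSearch resolves with the searchResults array built in the 'end' handler.
    const searchResults = await cseSearch(req)
    res.status(200).json(searchResults)
  } catch (err) {
    console.log(err)
    res.sendStatus(500)
  }
}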
Based on what I explained in the comments, to give you an idea of how compact your code could be using the request-promise library, here's what you could use:
const rp = require('request-promise-native');

app.get('/api/imagesearch/:query', newQuery)
app.post('/', newQuery)

function newQuery (req, res) {
  let query = req.body.query || req.params.query
  console.log(`Search Query: ${query}`)
  cseSearch(req).then(results => {
    res.json(results);
  }).catch(err => {
    console.log("newQueryError ", err);
    res.sendStatus(500);
  });
}
function cseSearch (request) {
  let cseParams = '' +
    `?q=${request.params.query}` +
    `&cx=${process.env.CSE_ID}` +
    `&key=${process.env.API_KEY}` +
    '&num=10' +
    '&safe=high' +
    '&searchType=image' +
    `&start=${request.query.offset || 1}`

  // request-promise expects a full "uri" rather than hostname/path options.
  let options = {
    uri: 'https://www.googleapis.com/customsearch/v1' + encodeURI(cseParams),
    json: true
  };

  return rp(options).then(data => {
    return data.items.map(item => {
      return {
        url: item.link,
        snippet: item.title,
        thumbnail: item.image.thumbnailLink,
        context: item.image.contextLink
      };
    });
  });
}

Fetch with absolute url prefix

Most of the time I prefix fetch or node-fetch with http://localhost (to make the URL absolute).
import fetch from 'node-fetch';
fetch('http://localhost/whatever')
Is there any way of avoiding the localhost part, other than simply placing localhost in a variable?
const baseUrl = 'http://localhost';
fetch(`${baseUrl}/whatever`)
Very related to Superagent with absolute url prefix
TL;DR: fetch-absolute does exactly that.
Detailed:
You can create one abstraction layer on top of fetch.
function fetchAbsolute(fetch) {
  return baseUrl => (url, ...otherParams) =>
    url.startsWith('/') ? fetch(baseUrl + url, ...otherParams) : fetch(url, ...otherParams)
}
Or you can simply use fetch-absolute.
const fetch = require('node-fetch');
const fetchAbsolute = require('fetch-absolute');

const fetchApi = fetchAbsolute(fetch)('http://localhost:3030');

it('should display "It works!"', async () => {
  const response = await fetchApi('/');
  const json = await response.json();
  expect(json).to.eql({ msg: 'It works!' });
});
You can override the fetch function:
import origFetch from 'node-fetch';

const fetch = (url, ...params) => {
  if (url.startsWith('/')) return origFetch('http://localhost' + url, ...params)
  else return origFetch(url, ...params);
}
The other answer creates a function that returns a function that returns a function; that's not necessary. You just need to return a function.
function fetchAbsolute(fetch, base_url) {
  return (url, ...params) => {
    if (url.startsWith('/')) return fetch(base_url + url, ...params)
    else return fetch(url, ...params);
  }
}
const fetch = fetchAbsolute(origFetch, 'http://localhost');
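A quick usage sketch of this wrapper (the paths are illustrative):
fetch('/whatever')             // requests http://localhost/whatever
fetch('https://example.com/x') // absolute URLs pass through unchanged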

Resources