Node.js - Axios not using cookie for POST request

I'm struggling with Axios: it seems that my POST request is not using my cookie.
First of all, I'm creating an Axios Instance as following:
const api = axios.create({
    baseURL: 'http://mylocalserver:myport/api/',
    headers: {
        'Content-Type': 'application/json',
    },
    withCredentials: true,
    responseType: 'json'
});
The API I'm trying to interact with requires a password, so I define a variable containing it:
const password = 'mybeautifulpassword';
First, I need to post a request to create a session, and get the cookie:
const createSession = async () => {
    const response = await api.post('session', { password: password });
    return response.headers['set-cookie'];
};
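Note: in Node, response.headers['set-cookie'] is an array of cookie strings; if a single Cookie header value is needed later, one way (a minimal sketch) is to keep only the name=value pairs and join them:
// set-cookie is an array; strip the attributes and join the name=value pairs
const cookieAuth = (response.headers['set-cookie'] || [])
    .map((cookie) => cookie.split(';')[0])
    .join('; ');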
Now, using the returned cookie (stored in the cookieAuth variable), I can interact with the API.
I know there is an endpoint allowing me to retrieve information:
const readInfo = async (cookieAuth) => {
    return await api.get('endpoint/a', {
        headers: {
            Cookie: cookieAuth,
        }
    });
};
This is working properly.
It's another story when I want to send a POST request.
const createInfo = async (cookieAuth, infoName) => {
    try {
        const data = JSON.stringify({
            name: infoName
        });
        return await api.post('endpoint/a', {
            headers: {
                Cookie: cookieAuth,
            },
            data: data,
        });
    } catch (error) {
        console.log(error);
    }
};
When I call the createInfo method, I get a 401 (Unauthorized) status. It looks like Axios is not using my cookieAuth for the POST request...
If I make the same request with Postman, it works...
What am I doing wrong in this code? Thanks a lot for your help.

I finally found my mistake.
As written in the Axios docs (https://axios-http.com/docs/instance):
The specified config will be merged with the instance config.
After creating the instance, I must follow this structure to perform a POST request:
axios#post(url[, data[, config]])
My request is now working:
await api.post('endpoint/a', { data: data }, {
    headers: {
        'Cookie': cookieAuth
    }
});
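Folded back into the original createInfo, the fix might look like this (a sketch keeping the same { data: data } body shape as above):
const createInfo = async (cookieAuth, infoName) => {
    try {
        const data = JSON.stringify({ name: infoName });
        // data is the second argument; the per-request config (merged with
        // the instance config) is the third
        return await api.post('endpoint/a', { data: data }, {
            headers: {
                Cookie: cookieAuth,
            },
        });
    } catch (error) {
        console.log(error);
    }
};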

Related

API call from https node application never reaches the destination

I have a Node.js application served over HTTPS. I would like to call an API from that application. The API is also served over HTTPS and was generated using the express-generator.
Unfortunately the call never works. There is no error message. The call never reaches the API.
Strangely enough, if I try to call another public API (e.g. https://api.publicapis.org/entries), that works perfectly.
Here is my call:
const requestBody = {
    'querystring': searchQuery,
};
const options = {
    rejectUnauthorized: false,
    keepAlive: false, // switch to true if you're making a lot of calls from this client
};
return new Promise(function (resolve, reject) {
    const sslConfiguredAgent = new https.Agent(options);
    const requestOptions = {
        method: 'POST',
        body: JSON.stringify(requestBody),
        agent: sslConfiguredAgent,
        redirect: 'follow',
    };
    fetch('https://192.168.112.34:3003/search', requestOptions)
        .then(response => response.text())
        .then(result => resolve(result))
        .catch(error => console.log('error', error));
});
};
And here is the API which I would like to call:
router.post('/', cors(), async function (req, res, next) {
    const queryString = req.body.querystring;
    let data = JSON.stringify({
        "query": {
            "match": {
                "phonetic": {
                    "query": queryString,
                    "fuzziness": "AUTO",
                    "operator": "and"
                }
            }
        }
    });
    const { body } = await client.search({
        index: 'phoneticindex',
        body: data
    });
    res.send(body.hits.hits);
});
What is wrong with my API and/or the way I am trying to communicate with it?
UPDATE: I receive the following error in the fetch catch block: 'TypeError: Failed to fetch'
When I create a request in Postman I receive the expected response.
UPDATE 2: This is most probably an SSL-related issue. The web app expects an API with a valid certificate. Obviously my API can only have a self-signed cert, which is not enough here. How can I generate a valid cert for an API which is running on the local network and not publicly available?
UPDATE 3: I managed to make it work by changing the fetch parameters like this:
fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    mode: 'cors',
    body: raw,
    agent: httpsAgent,
    redirect: 'follow',
})
and on the API side I added the following headers:
'Content-Type': 'application/json',
'Access-Control-Allow-Origin': 'https://localhost:2200',
'Access-Control-Allow-Methods': 'POST',
'Access-Control-Allow-Headers': 'Content-Type, Authorization'
I also added app.use(cors()) and regenerated the self-signed certificates.
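For reference, those API-side headers can also be expressed through the cors middleware options; a minimal sketch using the origin and headers from the update above:
const express = require('express');
const cors = require('cors');
const app = express();

// Mirrors the Access-Control-* headers from UPDATE 3
app.use(cors({
    origin: 'https://localhost:2200',
    methods: ['POST'],
    allowedHeaders: ['Content-Type', 'Authorization'],
}));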

AWS-Lambda 302 Not Redirecting after getting response from Axios (Frontend)

I'm trying to set up a Google OAuth flow using serverless and AWS Lambdas. To start, I have a button that kicks off the process by hitting a Lambda endpoint. However, the page never actually redirects to the authentication page. Instead I get an error on the frontend:
Request failed with status code 302
Frontend logic:
const redirectToGoogleOAuth = async (user) => {
    try {
        const endpoint = process.env.GOOGLE_PATH_ENDPOINT;
        const response = await axios.get(endpoint, {
            responseType: 'text',
            headers: {
                'Content-Type': 'application/json',
                Authorization: `Bearer ${user}`,
            },
        });
        // Expect redirect at this point
        return response.data.data;
    } catch (err) {
        throw new Error(err.message);
    }
};
Lambda Endpoint:
module.exports = async (event, context) => {
    const responseType = 'code';
    const googleAuthorizeURL = 'https://accounts.google.com/o/oauth2/v2/auth';
    const scope = 'openid email https://www.googleapis.com/auth/contacts.readonly';
    const accessType = 'offline';
    try {
        const params = [
            `response_type=${responseType}`,
            `client_id=${googleClientId}`,
            `redirect_uri=${baseURL}`,
            `scope=${scope}`,
            `state="state"`,
            `access_type=${accessType}`
        ];
        const googleOAuthEndPath = `${googleAuthorizeURL}?${params.join('&')}`;
        const response = {
            statusCode: 302,
            body: '',
            headers: {
                location: googleOAuthEndPath
            }
        };
        return response;
    } catch (err) {
        return response(400, err.message);
    }
};
In the Lambda response, I've added a location header with the Google path. However, the frontend does not seem to consume the response correctly: it interprets the 302 as an error instead of redirecting to the specified page. Any ideas on how I can make it actually redirect?
Axios uses XHR, which always follows redirects by itself and therefore Axios can't do anything about it (unless you rely on hacks, discussed in the same link).
You might have to use something other than Axios for this part, such as the Fetch API, which supports manual redirects.
GitHub user parties suggested the fetch() equivalent in the same Axios issue linked above:
fetch("/api/user", {
redirect: "manual"
}).then((res) => {
if (res.type === "opaqueredirect") {
window.location.href = res.url;
} else {
return res;
}
}).catch(handleFetchError);
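Another common workaround (a sketch, reusing googleOAuthEndPath and endpoint from the code above) is to avoid the 302 entirely: have the Lambda return the OAuth URL in a 200 body and let the frontend navigate itself:
// Lambda side: return the URL in the body instead of a 302
return {
    statusCode: 200,
    body: JSON.stringify({ url: googleOAuthEndPath }),
};

// Frontend side: axios parses the JSON body by default, so just navigate
const { data } = await axios.get(endpoint);
window.location.href = data.url;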

Nodejs googleAPI how to authenticate JWT client with only GET request

It's known that the Google API is pretty much just built on GET requests; the helper Node.js module googleapis seemingly just lets you send those GET requests in an easier way. For example:
fetch("https://www.googleapis.com/drive/v3/files/MYID/export?mimeType=text/plain",
{
method:"GET",
headers: {
'Accept-Encoding': 'gzip',
'User-Agent': 'google-api-nodejs-client/0.7.2 (gzip)',
Authorization:t.token_type +" "+ t.access_token,
Accept:"application/json"
}
})
.then(res => res.text())
.then(body => console.log(body));
accomplishes the same as using their built in method:
var drive = google.drive("v3");
drive.files.export({
    auth: j,
    fileId: "1oBL8sBylvODW1Md0gjXf2UgwBHtb6aRcgyGOYr0kRvY"
}, (e, r) => {
    if (e) console.log(e);
    else {
        console.log(r, "data:::::", r.data);
    }
});
So the question: When I'm validating the JWT client in nodejs using a custom service key, I do this:
var google = require("googleapis").google,
    creds = require("./mycreds.json"),
    fetch = require("node-fetch");

var j = new google.auth.JWT(
    creds.client_email,
    null,
    creds.private_key,
    [
        "https://www.googleapis.com/auth/drive"
    ]
);
console.log(google.auth.JWT, "END this");
j.authorize((r, t) => {
    doOther(t);
});
I suspect that this is, like every other part of the Google API I've encountered so far, just an easier way to send a GET request to a certain URL with certain headers (containing the authorization key).
The question: what exactly are that URL and the headers necessary to do the same thing as j.authorize? I couldn't find where to look in the reference.
HINT: from line 159 in jwtclient.js file:
createGToken() {
    if (!this.gtoken) {
        this.gtoken = new gtoken_1.GoogleToken({
            iss: this.email,
            sub: this.subject,
            scope: this.scopes,
            keyFile: this.keyFile,
            key: this.key,
            additionalClaims: this.additionalClaims
        });
    }
    return this.gtoken;
}
Still looking, though, at gtoken_1.GoogleToken etc.
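For what it's worth, Google's documented service-account flow suggests j.authorize boils down not to a GET but to a POST of a signed JWT assertion to the token endpoint. A minimal sketch of that exchange (assuming Node 16+ for base64url, and the same mycreds.json):
const crypto = require('crypto');
const fetch = require('node-fetch');
const creds = require('./mycreds.json');

const base64url = (obj) => Buffer.from(JSON.stringify(obj)).toString('base64url');

const now = Math.floor(Date.now() / 1000);
const unsigned = base64url({ alg: 'RS256', typ: 'JWT' }) + '.' + base64url({
    iss: creds.client_email,
    scope: 'https://www.googleapis.com/auth/drive',
    aud: 'https://oauth2.googleapis.com/token',
    iat: now,
    exp: now + 3600,
});
// Sign header.claims with the service account's private key (RS256)
const signature = crypto.createSign('RSA-SHA256')
    .update(unsigned)
    .sign(creds.private_key)
    .toString('base64url');

fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
        grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer',
        assertion: `${unsigned}.${signature}`,
    }),
})
    .then(res => res.json())
    .then(t => console.log(t.token_type, t.access_token)); // the same t that j.authorize yields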

400 Bad request while fetching data in post call

I am running my React.js web app on port 3000.
For the Node server I am using port 4000.
When calling the fetch method it returns a 400 Bad Request.
Error
POST http://localhost:4006/auth/admin 400 (Bad Request)
React code, started with npm on port 3000:
fetch('http://localhost:4000/auth/admin', {
    mode: 'no-cors',
    method: "POST",
    body: JSON.stringify({
        username: "admin",
        password: "1234"
    }),
    headers: {
        "Content-Type": "application/json",
        'Accept': 'application/json, text/plain, */*',
        credentials: "omit",
        // "Content-Type": "application/x-www-form-urlencoded",
    },
})
    .then((response) => console.log(response));
Node code running on port 4000:
const passport = require("passport");
const route = require("../constants/routeStrings");
const keys = require("../config/keys");
const processStatus = require("../constants/processStatus");

const success = {
    status: processStatus.SUCCESS
};
const failure = {
    status: processStatus.FAILURE
};

module.exports = app => {
    app.post('/auth/admin', passport.authenticate("local"), (req, res) => {
        res.send(success);
    });
};
Do not stringify the body. Change from
body: JSON.stringify({
    username: "admin",
    password: "1234"
}),

to

body: {
    username: "admin",
    password: "1234"
},
The 400 response is raised by Passport since it is unable to read your params. You need to tell your Node app to parse them before your actual routes.
// Import body-parser; you should read about this on their Git repo to understand it fully
const parser = require('body-parser');
const urlencodedParser = parser.urlencoded({ extended: false });

// Before your routes
app.use(parser.json());
app.use(urlencodedParser); // This will parse your body and make it available for your routes to use
Then do your other calls.
Also, make sure that you are sending username and password keys; otherwise, read the documentation on how to change these key names to something else.
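As a side note (assuming Express 4.16 or newer), the same parsers are built into Express itself, so body-parser is not strictly required:
// Built-in equivalents of body-parser in Express 4.16+
app.use(express.json());
app.use(express.urlencoded({ extended: false }));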
I suffered for long hours, but I overcame it by writing the following blocks of code. I successfully send the request to the server's controller; hopefully it works for yours too. Give it a try.
First, define an async function to make the POST request:
async function _postData(url = '', data = {}) {
    const response = await fetch(url, {
        method: 'POST',
        mode: 'cors',
        cache: 'no-cache',
        credentials: 'same-origin',
        redirect: 'follow',
        referrerPolicy: 'no-referrer',
        headers: {
            "Content-type": "application/json; charset=UTF-8"
        },
        body: JSON.stringify(data)
    });
    return response.json();
}
Now create a request JSON payload:
let requestPayload = {
    propertyName1: 'property value1',
    propertyName2: 'property value2',
    propertyName3: 'property value3',
    // ...and so on
};
Note: the request model should match whatever payload shape your endpoint actually expects.
Now make a request using this payload, including your endpoint URL:
_postData('http://servername/example', requestPayload)
    .then(json => {
        console.log(json); // Handle success
    })
    .catch(err => {
        console.log(err); // Handle errors
    });
This worked 100% on my project.

Node JS upload file streams over HTTP

I'm switching one of my projects from request over to something a bit more lightweight (such as got, axios, or fetch). Everything is going smoothly; however, I'm having an issue when attempting to upload a file stream (PUT and POST). It works fine with the request package, but any of the other three returns a 500 from the server.
I know that a 500 generally means an issue on the server's end, but it is consistent only across the HTTP packages that I'm testing out. When I revert my code to use request, it works fine.
Here is my current Request code:
Request.put(`http://endpoint.com`, {
    headers: {
        Authorization: `Bearer ${account.token.access_token}`
    },
    formData: {
        content: fs.createReadStream(localPath)
    }
}, (err, response, body) => {
    if (err) {
        return callback(err);
    }
    return callback(null, body);
});
And here is one of the attempts using another package (in this case, got):
got.put(`http://endpoint.com`, {
    headers: {
        'Content-Type': 'multipart/form-data',
        Authorization: `Bearer ${account.token.access_token}`,
    },
    body: {
        content: fs.createReadStream(localPath)
    }
})
    .then(response => {
        return callback(null, response.body);
    })
    .catch(err => {
        return callback(err);
    });
Per the got documentation, I've also tried using the form-data package in conjunction with it according to its example and I still get the same issue.
The only difference between these two I can gather is that with got I have to manually specify the Content-Type header, otherwise the endpoint gives me a proper error about it. Beyond that, I'm not sure how the two packages construct the body with the stream, but as I said, fetch and axios produce the exact same error as got.
If you want any of the snippets using fetch or axios I'd be happy to post them as well.
I know this question was asked a while ago, but I too am missing the simple pipe support from the request package:
const request = require('request');

request
    .get("https://res.cloudinary.com/demo/image/upload/sample.jpg")
    .pipe(request.post("http://127.0.0.1:8000/api/upload/stream"));

// Or any readable stream
fs.createReadStream('/Users/file/path/localFile.jpeg')
    .pipe(request.post("http://127.0.0.1:8000/api/upload/stream"));
and had to do some experimenting to find similar features in current libraries.
Unfortunately, I haven't worked with got, but I hope the following two examples help anyone else who is interested in working with the native http/https libraries or the popular axios library.
HTTP/HTTPS
Supports piping!
const http = require('http');
const https = require('https');

console.log("[i] Test pass-through: http/https");

// Note: http/https must match URL protocol
https.get(
    "https://res.cloudinary.com/demo/image/upload/sample.jpg",
    (imageStream) => {
        console.log(" [i] Received stream");
        imageStream.pipe(
            http.request("http://localhost:8000/api/upload/stream/", {
                method: "POST",
                headers: {
                    "Content-Type": imageStream.headers["content-type"],
                },
            })
        );
    }
);
// Or any readable stream
fs.createReadStream('/Users/file/path/localFile.jpeg')
    .pipe(
        http.request("http://localhost:8000/api/upload/stream/", {
            method: "POST",
            headers: {
                "Content-Type": "image/jpeg", // no response headers here; set the type manually
            },
        })
    );
Axios
Note the usage of imageStream.data and that it's being attached to data in the Axios config.
const axios = require('axios');

(async function selfInvokingFunction() {
    console.log("[i] Test pass-through: axios");
    const imageStream = await axios.get(
        "https://res.cloudinary.com/demo/image/upload/sample.jpg",
        {
            responseType: "stream", // Important to ensure axios provides a stream
        }
    );
    console.log(" [i] Received stream");
    const upload = await axios({
        method: "post",
        url: "http://127.0.0.1:8000/api/upload/stream/",
        data: imageStream.data,
        headers: {
            "Content-Type": imageStream.headers["content-type"],
        },
    });
    console.log("Upload response", upload.data);
})();
Looks like this was a headers issue. If I use the headers directly from FormData (i.e., headers: form.getHeaders()) and just add my additional headers afterwards (Authorization), then this works just fine.
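A sketch of that fix with axios and the form-data package (endpoint.com, localPath, and account are the placeholders from the question):
const axios = require('axios');
const FormData = require('form-data');
const fs = require('fs');

const form = new FormData();
form.append('content', fs.createReadStream(localPath));

axios.put(`http://endpoint.com`, form, {
    headers: {
        ...form.getHeaders(), // the multipart boundary comes from FormData itself
        Authorization: `Bearer ${account.token.access_token}`,
    },
});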
For me, it only worked when I added the other parameters to FormData.
before
const form = new FormData();
form.append('file', fileStream);
after
const form = new FormData();
form.append('file', fileStream, 'my-whatever-file-name.mp4');
That way I can send a stream from my backend to another Node backend that expects a multipart/form-data file called 'file'.