How do you handle CORS in an Electron app? (node.js)

I'm building an Electron app and need to call APIs where the API provider has not enabled CORS. The typically proposed solution is to use a reverse proxy, which is trivial to do when running locally using Node and cors-anywhere, like this:
let port = (process.argv.length > 2) ? parseInt(process.argv[2]) : 8080;
require('cors-anywhere').createServer().listen(port, 'localhost');
The app can then be configured to proxy all requests through the reverse proxy on localhost:8080.
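With cors-anywhere, routing a request through the proxy just means prefixing the target URL with the proxy's address. A minimal sketch (the API URL here is a placeholder):

// Call the API via the local cors-anywhere proxy started above
fetch('http://localhost:8080/https://api.example.com/v1/data')
  .then((res) => res.json())
  .then((data) => console.log(data));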
So, my questions are:
Is it possible to use Node and cors-anywhere in an Electron app to create a reverse proxy? I don't want to force the app to make calls to a remote server.
Is there a better or standard way of doing this in an Electron app? I'm assuming I'm not the first to run into CORS issues. :)

Just override the Origin header before the request is sent, using webRequest.onBeforeSendHeaders:
const { remote } = require('electron');

// Only intercept requests to these URLs.
const filter = {
  urls: ['*://*.google.com/*']
};

const session = remote.session;
session.defaultSession.webRequest.onBeforeSendHeaders(filter, (details, callback) => {
  // Blank out the Origin header before the request goes out.
  details.requestHeaders['Origin'] = null;
  callback({ requestHeaders: details.requestHeaders });
});
Put this code in the renderer process.

In my application, it wasn't sufficient to remove the Origin header (by setting it to null) in the request. The server I was passing the request to always provided the Access-Control-Allow-Origin header in the response, regardless of whether the Origin header was present in the request. So the embedded instance of Chrome did not like that the ACAO header did not match its understanding of the origin.
Instead, I had to change the Origin header on the request and then restore it in the Access-Control-Allow-Origin header on the response.
const { app, session } = require('electron');

app.on('ready', () => {
  // Modify the origin for all requests to the following urls.
  const filter = {
    urls: ['http://example.com/*']
  };
  session.defaultSession.webRequest.onBeforeSendHeaders(
    filter,
    (details, callback) => {
      console.log(details);
      details.requestHeaders['Origin'] = 'http://example.com';
      callback({ requestHeaders: details.requestHeaders });
    }
  );
  session.defaultSession.webRequest.onHeadersReceived(
    filter,
    (details, callback) => {
      console.log(details);
      // Restore the ACAO header to the app's own origin (a Capacitor Electron app here).
      details.responseHeaders['Access-Control-Allow-Origin'] = [
        'capacitor-electron://-'
      ];
      callback({ responseHeaders: details.responseHeaders });
    }
  );
  myCapacitorApp.init();
});

Try this if you are running the web app on localhost:
const filter = {
  urls: ['http://example.com/*'] // Remote API URLs for which you are getting the CORS error
};

browserWindow.webContents.session.webRequest.onBeforeSendHeaders(
  filter,
  (details, callback) => {
    // Pretend the request comes from the API's own origin (no trailing wildcard here).
    details.requestHeaders.Origin = 'http://example.com';
    callback({ requestHeaders: details.requestHeaders });
  }
);

browserWindow.webContents.session.webRequest.onHeadersReceived(
  filter,
  (details, callback) => {
    details.responseHeaders['access-control-allow-origin'] = [
      'capacitor-electron://-',
      'http://localhost:3000' // URL where your local Electron app is hosted
    ];
    callback({ responseHeaders: details.responseHeaders });
  }
);

Just had this issue today: API calls with axios inside a React app bundled in Electron were returning 400.
From what I can see, Electron requests act as normal server-style calls to the API URLs, meaning they are not affected by CORS.
Now when you wrap your calls with a CORS proxy and make a regular (non-CORS) call to the proxy, it errors with a 400 because it's not a CORS request.
This thread explains why cors-anywhere responds like that => https://github.com/Rob--W/cors-anywhere/issues/39
I actually removed my CORS proxies from the app before the Electron build. I still need the CORS proxy for development since I'm testing in the browser.
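One way to handle that split is to apply the proxy prefix only during development; a small sketch (the proxy address assumes the local cors-anywhere setup from the question):

// Use the CORS proxy only while testing in the browser;
// the packaged Electron app calls the API directly.
import axios from 'axios';

const CORS_PROXY = process.env.NODE_ENV === 'development'
  ? 'http://localhost:8080/'
  : '';

axios.get(`${CORS_PROXY}https://api.example.com/data`)
  .then(({ data }) => console.log(data));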
Hope this helps.

You can have the main process, the Node.js server running Electron, send the request. This avoids CORS because this is a server-to-server request. You can send an event from the frontend (the renderer process) to the main process using IPC. In the main process you can listen to this event, send the HTTP request, and return a promise to the frontend.
In main.js (the script where the Electron window is created):
import { app, protocol, BrowserWindow, ipcMain } from 'electron'
import axios from 'axios'

ipcMain.handle('auth', async (event, ...args) => {
  console.log('main: auth', event, args)
  const result = await axios.post(
    'https://api.com/auth',
    {
      username: args[0].username,
      password: args[0].password,
      auth_type: args[1],
    },
  )
  console.log('main: auth result', result)
  console.log('main: auth result.data', result.data)
  return result.data
})
In your frontend JS:
import { ipcRenderer } from 'electron'
sendAuthRequestUsingIpc() {
  return ipcRenderer.invoke('auth',
    {
      username: AuthService.username,
      password: AuthService.password,
    },
    'password',
  ).then((data) => {
    AuthService.AUTH_TOKEN = data['access_token']
    return true
  }).catch((resp) => console.warn(resp))
}
I wrote an article that goes into more depth here.
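One caveat worth adding: with contextIsolation enabled (the default in recent Electron versions), the renderer cannot import ipcRenderer directly; the usual pattern is to expose the call from a preload script. A sketch (the 'api' name is an arbitrary choice, not from the original answer):

// preload.js
const { contextBridge, ipcRenderer } = require('electron');

contextBridge.exposeInMainWorld('api', {
  // Expose only the one IPC call the renderer needs.
  auth: (credentials, authType) => ipcRenderer.invoke('auth', credentials, authType)
});

The frontend would then call window.api.auth({ username, password }, 'password') instead of using ipcRenderer directly.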

While I struggled for a while with the existing answers, here is the solution that finally worked for me, assuming that you are in the main process.
Here are the steps involved:
You need access to the session object, which can be obtained in one of two ways:
A) Via the global session.defaultSession, which is available after the app is ready.
const { session } = require('electron');
const curSession = session.defaultSession;
B) The other method is via the session on the BrowserWindow; this assumes that the window has already been created.
const win = new BrowserWindow({});
const curSession = win.webContents.session;
Once you have the session object, you set the response header to the origin the request is sent from.
For example, let's say your Electron BrowserWindow is loaded from http://localhost:3000 and you are making a request to example.com; here is some sample code:
const { app, BrowserWindow, session } = require('electron');

app.whenReady().then(_ => {
  // If using method B for the session you should first construct the BrowserWindow
  const filter = { urls: ['*://*.example.com/*'] };
  session.defaultSession.webRequest.onHeadersReceived(filter, (details, callback) => {
    details.responseHeaders['Access-Control-Allow-Origin'] = ['http://localhost:3000'];
    callback({ responseHeaders: details.responseHeaders });
  });
  // Construct the BrowserWindow if you haven't done so yet...
});

Have you tried using fetch()?
Check how to use fetch to make a no-cors request here:
https://developers.google.com/web/updates/2015/03/introduction-to-fetch?hl=en
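For reference, a no-cors request looks like the sketch below, but note that the response comes back opaque, so its status and body cannot be read from JavaScript; this helps only for fire-and-forget requests, not for consuming API data:

fetch('https://api.example.com/data', { mode: 'no-cors' })
  .then((res) => {
    // res.type === 'opaque': status is 0 and the body is not accessible
    console.log(res.type);
  });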

Related

Why can't svelte-kit's load function resolve a valid route when the code runs on the server?

In my svelte-kit application I was struggling with this NODE error ERR_INVALID_URL but was able to fix it with a solution provided in this thread. Unfortunately a deeper explanation as to why NODE can't parse the URL, which is obviously only a valid route when the code runs on the client, was omitted.
In svelte-kit's load function I'm implicitly fetching a URL that is, from Node.js' perspective, invalid (ERR_INVALID_URL).
So what I'd love to understand is: WHY does NODE fail to resolve/parse the given URL?
Prerequisites:
// in $lib/utils/http.js
export function post(endpoint, data = {}) {
  return fetch(endpoint, {
    method: "POST",
    credentials: "include",
    body: JSON.stringify(data),
    headers: {
      "Content-Type": "application/json",
    },
  }).then((r) => r.json());
}
// in routes/auth/login.js -> this endpoint can't be found by NODE
export async function post({ locals, request }) {
  // ...code here
  return {
    body: request.json()
  }
}
Here the distinction has to be made between whether the code runs on the client or on the server:
// in routes/login.svelte
import { browser } from '$app/env';
import { post } from '$lib/utils/http.js';

export async function load() {
  const { data } = someDataObject;
  if (browser) { // NODE wouldn't be able to find the endpoint in question ('/auth/login'), whereas the client does
    return await post(`/auth/login`, { data }).then((response) => {
      // ...do something with the response
    });
  }
  return {};
}
Thanks for any explanation that sheds some light into this.
You should refactor your load function to use the fetch provided by SvelteKit. This will allow you to use relative requests on the server, which normally requires an origin. From the docs (emphasis mine):
fetch is equivalent to the native fetch web API, with a few additional
features:
it can be used to make credentialed requests on the server, as it inherits the cookie and authorization headers for the page request
it can make relative requests on the server (ordinarily, fetch requires a URL with an origin when used in a server context)
requests for endpoints go direct to the handler function during server-side rendering, without the overhead of an HTTP call
during server-side rendering, the response will be captured and inlined into the rendered HTML
during hydration, the response will be read from the HTML, guaranteeing consistency and preventing an additional network request
So, get the fetch from the parameter passed to load...
export async function load({ fetch }) {
  const { data } = someDataObject;
  return await post(`/auth/login`, fetch, { data }).then((response) => {
    // ...do something with the response
  });
}
... and use it in your post function
// in $lib/utils/http.js
export function post(endpoint, fetch, data = {}) { /* rest as before */ }
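Spelled out, the modified post function would look like this (the body is unchanged apart from the new parameter):

// in $lib/utils/http.js
export function post(endpoint, fetch, data = {}) {
  return fetch(endpoint, {
    method: "POST",
    credentials: "include",
    body: JSON.stringify(data),
    headers: {
      "Content-Type": "application/json",
    },
  }).then((r) => r.json());
}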
A future enhancement to SvelteKit may make it so you don't have to pass fetch to your utility function, but this is what you have to do for now.

PayFast integration in NodeJS / ReactJS

I am trying to integrate PayFast into my React / NodeJS app. Using Express, my NodeJS backend successfully retrieves a payment uuid from the PayFast endpoint (I see this uuid in my console log) -
app.get("/api", async (req, res) => {
paymentData["signature"] = generateSignature(paymentData, phrase);
console.log(paymentData["signature"])
const str = dataToString(paymentData)
const id = await getPaymentId(str)
res.json({uuid: id})
})
However, in my front end (ReactJS) I am getting an undefined response and a possible CORS issue from my backend API endpoint when trying to retrieve this uuid -
My custom fetch hook:
export default function useFetch(baseUrl) {
  const [loading, setLoading] = useState(true);

  function get() {
    return new Promise((resolve, reject) => {
      fetch(baseUrl)
        .then(res => {
          console.log(res)
          res.json()
        })
        .then(data => {
          console.log(data);
          if (!data) {
            setLoading(false);
            return reject(data);
          }
          setLoading(false);
          resolve(data);
        })
        .catch(error => {
          setLoading(false);
          reject(error);
        });
    });
  }
  return { get, loading };
}
The error:
Response {type: 'cors', url: 'http://localhost:3001/api', redirected: false, status: 200, ok: true, …}
undefined
If I test my NodeJS endpoint from my browser, it successfully comes back with my payment uuid. Does anyone have any ideas why my React app is acting up?
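One thing that stands out in the hook itself, independent of any CORS configuration: the first .then never returns res.json(), so the next .then always receives undefined, which matches the logged output. A minimal fix:

fetch(baseUrl)
  .then(res => {
    console.log(res)
    return res.json() // without this return, `data` below is always undefined
  })
  .then(data => {
    // ...rest of the chain as before
  })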
Update your CORS config to accept connections from the React app host.
app.use(cors({
  origin: 'http://localhost:3000',
}));
Open package.json of your React app and add a line at the bottom of the JSON file:
"proxy": "http://localhost:3001"
3001 is the PORT your Node HTTP server is running on locally; if it's another PORT, just change it accordingly.
This will redirect all HTTP traffic from your webpack dev server running on PORT 3000 to your Node server running on 3001.
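With the dev-server proxy in place, the React app can use a relative URL and webpack forwards the request to port 3001, for example:

// Proxied by the dev server to http://localhost:3001/api
fetch('/api')
  .then((res) => res.json())
  .then(({ uuid }) => console.log(uuid));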
For those who might encounter a similar issue, I have attached a blog post with the method that I used to solve the CORS issue, as well as integrate with the PayFast API.
https://codersconcepts.blogspot.com/2022/04/nodejs-payfast-integration.html

504 timeout in aws EC2 when calling some external api url

I have the following Next.js API route for testing purposes.
All 4 axios calls work perfectly on localhost, but in production (hosted on AWS EC2) the last 2 calls fail with a 504 gateway time-out.
I had thought of nginx and the AWS in-bound/out-bound setup, but if that were the case, the first two mock APIs shouldn't work either.
I don't know why it happens. Or does it have something to do with API protection on those websites? But then why does it work on localhost?
import axios from "axios";
import { NextApiHandler } from "next";

const MockApi: NextApiHandler = async (req, res) => {
  try {
    // mock set
    // works on localhost and in production
    const { data: testData } = await axios.get("https://jsonplaceholder.typicode.com/todos/1");
    console.log(testData);
    const { data: mockData } = await axios.get("https://reqres.in/api/users?page=2");
    console.log(mockData);

    // some real-life APIs
    // work on localhost but fail in production with a 504 gateway time-out
    const { data: mockData2 } = await axios.get("https://www.target.com.au/ws-api/v1/target/products/search?category=W95362");
    console.log(mockData2);
    const { data } = await axios.get("https://api.nasdaq.com/api/ipo/calendar");
    console.log(data);

    res.status(200).send({});
  } catch (err) {
    res.status(403).json(err);
  }
};

export default MockApi;
After investigation, it seems the problem has something to do with policies around AWS traffic: some affiliated companies and public APIs appear to block requests coming from AWS, so that you are not able to use AWS services to abuse them.
This answer could be incorrect, so feel free to correct it if someone is able to sort the issue out.
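For anyone digging further, one low-effort experiment (an assumption, not a confirmed fix) is to add a request timeout so the call fails fast instead of hanging into the gateway's 504, and to send a browser-like User-Agent, since some public APIs reject non-browser clients:

// Hypothetical experiment: fail fast and mimic a browser request
const { data } = await axios.get("https://api.nasdaq.com/api/ipo/calendar", {
  timeout: 10000, // give up after 10s instead of waiting for the 504
  headers: { "User-Agent": "Mozilla/5.0" }, // some endpoints reject unknown clients
});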

How to dynamically add CORS sites to Google Cloud App Engine Node API

I am new to API deployment.
I have a Node Express API that has CORS enabled in the root app.js, for the API and a socket.io implementation:
var app = express();
app.use(cors({
  origin: ["http://localhost:8080", "http://localhost:8081"],
  credentials: true
}));
and
const httpServer = createServer(app);
const io = new Server(httpServer, {
  cors: {
    origin: ["http://localhost:8080", "http://localhost:8081"],
    credentials: true,
    methods: ["GET"]
  }
});
I will set up a sales website that allows a customer to pay for a license to use the API with their site, i.e. https://www.customersite.com
My question is: how can I dynamically add a customer's website (say, after they submit a form from another site) to the CORS list? Ideally it would be via an API call. The only option I can think of (that is not automated) is to manually maintain a global js file (i.e. config.js) with the CORS list from within the Google platform using the file explorer / editor, and to iterate over it as an array similar to process.env.customerList. This will not work for me as I need this step to happen automatically.
Any and all suggestions are appreciated.
Solution: Use a process manager like pm2 to 'reload' the API gracefully with close to no downtime.
PM2 reloads can be triggered programmatically. I made a PUT endpoint for modifying the CORS list (/cors/modify) that sends a programmatic pm2 message when a successful modification is done.
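A sketch of what that endpoint could look like (saveOriginToConfig and triggerPm2Reload are hypothetical helpers standing in for the persistence and messaging code, the latter shown below):

// Hypothetical sketch of the /cors/modify endpoint
app.put('/cors/modify', async (req, res) => {
  const { origin } = req.body;
  await saveOriginToConfig(origin); // persist the new origin (assumed helper)
  triggerPm2Reload();               // wraps the pm2 messaging shown below
  res.json({ added: origin });
});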
Note: on Windows you must use programmatic messaging:
const pm2 = require('pm2');

// The programmatic API requires connecting to the pm2 daemon first.
pm2.connect(function(err) {
  if (err) {
    console.log(err);
    process.exit(2);
  }
  pm2.list(function(err, list) {
    if (err) {
      console.log(err);
      pm2.disconnect(); // Disconnects from PM2
      return;
    }
    pm2.sendDataToProcessId(list[0].pm2_env.pm_id,
      {
        type: 'process:msg',
        data: {
          msg: 'shutdown'
        },
        topic: true
      },
      function(err, res) {
        console.log(err);
        pm2.disconnect(); // Disconnects from PM2
      }
    );
  });
});
which can then be caught with
process.on('message', async function(msg) {
  if (msg == "shutdown" || msg.data.msg == 'shutdown') {
    console.log("Disconnecting from DB...");
    mongoose.disconnect((e) => {
      if (e) {
        process.exit(1);
      } else {
        console.log("Mongoose connection removed");
        httpServer.close((err) => {
          if (err) {
            console.error(err);
            process.exit(1);
          }
          process.exit(0);
        });
      }
    });
  }
});

This is a general question about Express.js running on Node.js inside a docker container, and in the cloud

I have built two docker images. One with nginx that serves my Angular web app and another with Node.js that serves a basic Express app. I have tried to access the Express app from my browser in two different tabs at the same time.
In one tab the Angular dev server (ng serve) serves up the web page. In the other tab the docker nginx container serves up the web page.
While accessing the Node.js Express app at the same time from both tabs, the data starts to mix and mingle and the results returned to both tabs are a mishmash of the two requests (one from each browser tab)...
I'll try to make this simpler by showing my Express app code here... but to answer this question you may not even need to know what the code is at all... so maybe check the question as stated below the code first.
'use strict';

/***********************************
GOOGLE GMAIL AND OAUTH SETUP
***********************************/
const fs = require('fs');
const { google } = require('googleapis');
const gmail = google.gmail('v1');
const clientSecretJson = JSON.parse(fs.readFileSync('./client_secret.json'));
const oauth2Client = new google.auth.OAuth2(
  clientSecretJson.web.client_id,
  clientSecretJson.web.client_secret,
  'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
);

/***********************************
EXPRESS WITH CORS SETUP
***********************************/
const PORT = 8000;
const HOST = '0.0.0.0';
const express = require('express');
const cors = require('cors');
const cookieParser = require('cookie-parser');
const bodyParser = require('body-parser');

const whiteList = [
  'http://localhost:4200',
  'http://localhost:80',
  'http://localhost',
];

const googleApi = express();
googleApi.use(
  cors({
    origin: whiteList
  }),
  cookieParser(),
  bodyParser()
);

function getPageOfThreads(pageToken, userId, labelIds) {
  return new Promise((resolve, reject) => {
    gmail.users.threads.list(
      {
        'auth': oauth2Client,
        'userId': userId,
        'labelIds': labelIds,
        'pageToken': pageToken
      },
      (error, response) => {
        if (error) {
          console.error(error);
          reject(error);
        }
        resolve(response.data);
      }
    );
  });
}

async function getPages(nextPageToken, userId, labelIds, result) {
  while (nextPageToken) {
    let pageOfThreads = await getPageOfThreads(nextPageToken, userId, labelIds);
    console.log(pageOfThreads.nextPageToken);
    pageOfThreads.threads.forEach((thread) => {
      result = result.concat(thread.id);
    });
    nextPageToken = pageOfThreads.nextPageToken;
  }
  return result;
}

googleApi.post('/threads', (req, res) => {
  console.log(req.body);
  let threadIds = [];
  oauth2Client.credentials = req.body.token;
  let getAllThreadIds = new Promise((resolve, reject) => {
    gmail.users.threads.list(
      { 'auth': oauth2Client, 'userId': 'me', 'maxResults': 500 },
      (err, response) => {
        if (err) {
          console.error(err);
          reject(err);
        }
        if (response.data.threads) {
          response.data.threads.forEach((thread) => {
            threadIds = threadIds.concat(thread.id);
          });
        }
        if (response.data.nextPageToken) {
          getPages(response.data.nextPageToken, 'me', ['INBOX'], threadIds).then(result => {
            resolve(result);
          }).catch((err) => {
            console.error(err);
            reject(err);
          });
        } else {
          resolve(threadIds);
        }
      }
    );
  });
  getAllThreadIds
    .then((result) => {
      res.send({ threadIds: result });
    })
    .catch((error) => {
      res.status(500).send({ error: 'Request failed with error: ' + error });
    });
});

googleApi.get('/', (req, res) => res.send('Hello World!'));

googleApi.listen(PORT, HOST);
console.log(`Running on http://${HOST}:${PORT}`);
The Angular app makes a simple request to the Express app and waits for the reply... which it properly receives... but when I try to make two requests at the exact same time, data starts to get mixed together and results are given back to each browser tab from different accounts...
...and the question is... When running containers in the cloud, is this kind of thing an issue? Does one need to spin up a new container for each client that wants to actively connect to the Express service so that their data doesn't get mixed?
...or is this an issue I am seeing because the Express app is being accessed from locally inside my machine? If two machines with two different IP addresses tried to access this Express server at the same time, would this sort of data mixing still be an issue or would each get back its own set of results?
Is this why people use CaaS instead of IaaS solutions?
FYI: this is demo code and the data will not actually be going back to the consumer directly... plans are to have it placed into a database and then re-extracted from the database to download all of the metadata headers for each email.
-Thank you for your time
I can only clear up a small part of this question:
When running containers in the cloud is this kind of thing an issue?
No. Docker is not causing any of the quirky behaviour that you are describing.
Does one need to spin up a new container for each client?
A docker container can generally serve as many users as the application inside of it can. So as long as your application can handle a lot of users (and it should), you don't have to start the same application in multiple containers. That said, when you expect a very large number of customers, there are docker tools like Docker Compose, Docker Swarm and a lot of alternatives that will enable you to scale up later. For now, you don't need to worry about this at all.
I think I may have found the issue with my code... and this is actually very important if you are using the Node.js googleapis client library:
It is entirely necessary to create a new oauth2Client for each request that comes in.
const oauth2Client = new google.auth.OAuth2(
  clientSecretJson.web.client_id,
  clientSecretJson.web.client_secret,
  'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
);
Problem:
When this oauth2Client is shared, it is shared by each and every person that connects at the same time... So it is necessary to create a new one each and every time a user connects to my /threads endpoint, so that requests do not share the same memory space (i.e. access_token etc.) while the processing is done.
Setting the client secret etc. and creating the oauth2Client just once at the top, and then simply resetting the token for each request, leads to the conflicts mentioned above.
Solution:
For now simply moving the creation of this oauth2Client into each and every request that comes in makes this work properly.
Each client that connects to the service NEEDS to have their own newly created oauth2Client instance or these types of conflicts will occur...
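In other words, the client construction moves inside the handler; a condensed sketch of the fix:

// Construct a fresh OAuth2 client per request so credentials aren't shared
googleApi.post('/threads', (req, res) => {
  const oauth2Client = new google.auth.OAuth2(
    clientSecretJson.web.client_id,
    clientSecretJson.web.client_secret,
    'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
  );
  oauth2Client.credentials = req.body.token;
  // ...rest of the handler as before, using this per-request client
});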
...it's kind of a no-brainer, but I still find it odd that there is nothing about this in the docs, and their own examples (https://github.com/googleapis/google-api-nodejs-client) seem to show only one instance being created for the whole of the app... but those examples are snippets, so...
