When a user double-clicks a file with the extension .cmf, I would like to automatically launch the Electron application I've built. I've been searching around and have seen several mentions of electron-builder, but no examples of how it can be used to create this association.
You want to look at the protocol functionality. I don't have enough experience with it to understand the finer points, like which app takes precedence if multiple apps register the same protocol; some of that might be user-defined.
const { app, protocol } = require('electron')
const path = require('path')

app.on('ready', () => {
  protocol.registerFileProtocol('atom', (request, callback) => {
    // Strip the 'atom://' prefix (7 characters) to get the requested path
    const url = request.url.substr(7)
    callback({ path: path.normalize(`${__dirname}/${url}`) })
  }, (error) => {
    if (error) console.error('Failed to register protocol')
  })
})
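For the file-association part of the question, electron-builder does support this directly through the fileAssociations option in its build configuration. A minimal sketch in package.json (the name and role values here are illustrative, not taken from a real project):

"build": {
  "fileAssociations": [
    {
      "ext": "cmf",
      "name": "CMF Document",
      "role": "Editor"
    }
  ]
}

Once the installer has registered the association, macOS delivers the double-clicked file via app.on('open-file', ...), while on Windows and Linux the path arrives in process.argv.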
I am new to API deployment.
I have a Node Express API with CORS enabled in the root app.js, both for the API itself and for a socket.io implementation:
var app = express();

app.use(cors({
  origin: ["http://localhost:8080", "http://localhost:8081"],
  credentials: true
}))
and
const httpServer = createServer(app);
const io = new Server(httpServer, {
  cors: {
    origin: ["http://localhost:8080", "http://localhost:8081"],
    credentials: true,
    methods: ["GET"]
  }
});
I will set up a sales website that allows a customer to pay for a license to use the API with their site, e.g. https://www.customersite.com.
My question is: how can I dynamically add the customer's website to the CORS list (say, after they submit a form from another site)? Ideally it would be via an API call. The only option I can think of (which is not automated) is to manually maintain a global JS file (e.g. config.js) containing the CORS list, editing it from within the Google platform using the file explorer/editor, and iterating over it as an array, similar to process.env.customerList. This will not work for me, as I need this step to happen automatically.
Any and all suggestions are appreciated.
Solution: use a process manager like pm2 to reload the API gracefully, with close to no downtime.
PM2 reloads can be triggered programmatically. I made a PUT endpoint, /cors/modify, for modifying the CORS list; it sends a programmatic pm2 message when a modification succeeds.
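For illustration, here is a minimal sketch of such an endpoint, assuming a JSON whitelist file and a pm2 process named 'api' (both names are placeholders, and authentication is omitted):

const fs = require('fs');
const pm2 = require('pm2');

// Hypothetical endpoint: append an origin to a whitelist file on disk,
// then ask pm2 to gracefully restart the process so the list is re-read.
app.put('/cors/modify', (req, res) => {
  const whitelist = JSON.parse(fs.readFileSync('./cors-whitelist.json'));
  if (!whitelist.includes(req.body.origin)) {
    whitelist.push(req.body.origin);
    fs.writeFileSync('./cors-whitelist.json', JSON.stringify(whitelist));
  }
  res.json({ whitelist });
  pm2.connect((err) => {
    if (err) return console.error(err);
    pm2.restart('api', (err) => {
      if (err) console.error(err);
      pm2.disconnect();
    });
  });
});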
Note: on Windows you must use programmatic messaging:
const pm2 = require('pm2');

// Assumes pm2.connect() has been called earlier in the script.
pm2.list(function (err, list) {
  if (err) {
    console.log(err);
    pm2.disconnect(); // Disconnects from PM2
    return;
  }
  pm2.sendDataToProcessId(
    list[0].pm2_env.pm_id,
    {
      type: 'process:msg',
      data: {
        msg: 'shutdown'
      },
      topic: true
    },
    function (err, res) {
      if (err) console.log(err);
      pm2.disconnect(); // Disconnects from PM2
    }
  );
});
which can then be caught with
process.on('message', async function (msg) {
  // Handle both plain-string and object-shaped messages
  if (msg === 'shutdown' || (msg.data && msg.data.msg === 'shutdown')) {
    console.log('Disconnecting from DB...');
    mongoose.disconnect((e) => {
      if (e) {
        process.exit(1);
      } else {
        console.log('Mongoose connection removed');
        httpServer.close((err) => {
          if (err) {
            console.error(err);
            process.exit(1);
          }
          process.exit(0);
        });
      }
    });
  }
});
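To close the loop, the Express app reads the whitelist file when it boots, so the reloaded process picks up any newly added origin. A sketch using the same hypothetical file name as above:

const fs = require('fs');
const cors = require('cors');

// Read the whitelist once at startup; a pm2 reload re-runs this code.
const whitelist = JSON.parse(fs.readFileSync('./cors-whitelist.json'));

app.use(cors({
  origin: whitelist,
  credentials: true
}));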
I have a Nuxt application to which I need to append data from a generated configuration file when the application first starts. The reason I cannot do this during the actual build is that the configuration file does not exist at that point; it is generated just before calling npm start by a bootstrap script.
Why not generate the configuration file at build time, you may ask? Because the application runs in a Docker container, and the built image cannot include environment-specific configuration files, since it should be usable in different environments such as testing, staging, and production.
Currently I am trying to use a hook to solve this, but I am not really sure how to actually set the configuration data in the application so that it can be used everywhere:
// part of nuxt.config.js
hooks: {
  listen(server, listener) {
    // load the custom configuration file.
    fs.readFile('./config.json', (err, data) => {
      let configData = JSON.parse(data);
    });
  }
},
The above hook is fired when the application first starts listening for connecting clients. I'm not sure this is the best, or even a possible, way to go.
I also made an attempt at using a plugin to solve this:
import axios from 'axios';

export default function (ctx, inject) {
  // server-side logic
  if (ctx.isServer) {
    // here I would like to simply use fs.readFile to load the configuration, but this is not working?
  } else {
    // client-side logic
    axios.get('/config.json')
      .then((res) => {
        inject('storeViews', res.data);
      });
  }
};
In the above code I have problems both with using the fs module and with axios.
I was also thinking about using a middleware for this, but I'm not sure how to proceed.
If someone else has this kind of problem, here is the solution I came up with in the end:
// plugins/config.js
class Settings {
  constructor (app, req) {
    if (process.server) {
      // Server side we load the file simply by using fs;
      // parse it so both sides end up with a plain object
      const fs = require('fs');
      this.json = JSON.parse(fs.readFileSync('config.json', 'utf8'));
    } else {
      // Client side we make a request to the server
      fetch('/config')
        .then((response) => {
          if (response.ok) {
            return response.json();
          }
        })
        .then((json) => {
          this.json = json;
        });
    }
  }
}

export default function ({ req, app }, inject) {
  inject('config', new Settings(app, req));
};
For this to work we need to use a server middleware:
// api/config.js
const fs = require('fs');
const express = require('express');
const app = express();

// Here we pick up requests to /config and read and return the
// contents of the configuration file
app.get('/', (req, res) => {
  fs.readFile('config.json', (err, contents) => {
    if (err) {
      throw err;
    }
    res.set('Content-Type', 'application/json');
    res.end(contents);
  });
});

module.exports = {
  path: '/config',
  handler: app
};
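Finally, both pieces have to be registered in nuxt.config.js; the paths below assume the file locations used above:

// nuxt.config.js
export default {
  plugins: ['~/plugins/config.js'],
  serverMiddleware: ['~/api/config.js']
};

With that in place, the settings are available as this.$config in components (and as app.$config in the context), since inject('config', ...) exposes the value under the $config name.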
I have built two Docker images: one with nginx that serves my Angular web app, and another with Node.js that serves a basic Express app. I have tried to access the Express app from my browser in two different tabs at the same time.
In one tab the Angular dev server (ng serve) serves the web page. In the other tab the Docker nginx container serves the web page.
While accessing the Node.js Express app at the same time from both tabs, the data starts to mix and mingle, and the results returned to both tabs are a mishmash of the two requests (one from each browser tab)...
I'll try to make this simpler by showing my Express app code here, but to answer this question you may not even need to know what the code is at all, so maybe check the question as stated below the code first.
'use strict';
/***********************************
GOOGLE GMAIL AND OAUTH SETUP
***********************************/
const fs = require('fs');
const {google} = require('googleapis');
const gmail = google.gmail('v1');
const clientSecretJson = JSON.parse(fs.readFileSync('./client_secret.json'));
const oauth2Client = new google.auth.OAuth2(
clientSecretJson.web.client_id,
clientSecretJson.web.client_secret,
'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
);
/***********************************
EXPRESS WITH CORS SETUP
***********************************/
const PORT = 8000;
const HOST = '0.0.0.0';
const express = require('express');
const cors = require('cors');
const cookieParser = require('cookie-parser');
const bodyParser = require('body-parser');
const whiteList = [
'http://localhost:4200',
'http://localhost:80',
'http://localhost',
];
const googleApi = express();
googleApi.use(
cors({
origin: whiteList
}),
cookieParser(),
bodyParser()
);
function getPageOfThreads(pageToken, userId, labelIds) {
return new Promise((resolve, reject) => {
gmail.users.threads.list(
{
'auth': oauth2Client,
'userId': userId,
'labelIds': labelIds,
'pageToken': pageToken
},
      (error, response) => {
        if (error) {
          console.error(error);
          reject(error);
          return; // don't fall through to resolve after rejecting
        }
        resolve(response.data);
      }
)
});
}
async function getPages(nextPageToken, userId, labelIds, result) {
while (nextPageToken) {
let pageOfThreads = await getPageOfThreads(nextPageToken, userId, labelIds);
console.log(pageOfThreads.nextPageToken);
pageOfThreads.threads.forEach((thread) => {
result = result.concat(thread.id);
})
nextPageToken = pageOfThreads.nextPageToken;
}
return result;
}
googleApi.post('/threads', (req, res) => {
console.log(req.body);
let threadIds = [];
oauth2Client.credentials = req.body.token;
let getAllThreadIds = new Promise((resolve, reject) => {
gmail.users.threads.list(
{ 'auth': oauth2Client, 'userId': 'me', 'maxResults': 500 },
    (err, response) => {
      if (err) {
        console.error(err)
        reject(err);
        return; // response is undefined when err is set
      }
if (response.data.threads) {
response.data.threads.forEach((thread) => {
threadIds = threadIds.concat(thread.id);
});
}
if (response.data.nextPageToken) {
getPages(response.data.nextPageToken, 'me', ['INBOX'], threadIds).then(result => {
resolve(result);
}).catch((err) => {
console.error(err);
reject(err);
});
} else {
resolve(threadIds);
}
}
);
});
getAllThreadIds
.then((result) => {
res.send({ threadIds: result });
})
.catch((error) => {
res.status(500).send({ error: 'Request failed with error: ' + error })
});
});
googleApi.get('/', (req, res) => res.send('Hello World!'))
googleApi.listen(PORT, HOST);
console.log(`Running on http://${HOST}:${PORT}`);
The Angular app makes a simple request to the Express app and waits for the reply, which it properly receives... but when I try to make two requests at exactly the same time, data starts to get mixed together, and results are given back to each browser tab from different accounts...
...and the question is: when running containers in the cloud, is this kind of thing an issue? Does one need to spin up a new container for each client that wants to actively connect to the Express service so that their data doesn't get mixed?
...or is this an issue I am seeing because the Express app is being accessed locally from inside my machine? If two machines with two different IP addresses tried to access this Express server at the same time, would this sort of data mixing still be an issue, or would each get back its own set of results?
Is this why people use CaaS instead of IaaS solutions?
FYI: this is demo code, and the data will not actually go back to the consumer directly; the plan is to place it into a database and then re-extract it from the database to download all of the metadata headers for each email.
I can only clear up a small part of this question:
When running containers in the cloud is this kind of thing an issue?
No. Docker is not causing any of the quirky behaviour that you are describing.
Does one need to spin up a new container for each client?
A Docker container can generally serve as many users as the application inside it can. So as long as your application can handle a lot of users (and it should), you don't have to start the same application in multiple containers. That said, when you expect a very large number of customers, there are Docker tools like Docker Compose, Docker Swarm, and many alternatives that will enable you to scale up later. For now, you don't need to worry about this at all.
I think I may have found the issue with my code, and this is actually very important if you are using the Node.js googleapis client library:
It is entirely necessary to create a new oauth2Client for each request that comes in:
const oauth2Client = new google.auth.OAuth2(
clientSecretJson.web.client_id,
clientSecretJson.web.client_secret,
'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
);
Problem:
When this oauth2Client is shared, it is shared by each and every person that connects at the same time. So it is necessary to create a new one each and every time a user connects to my /threads endpoint, so that they do not share the same memory space (i.e. access_token etc.) while the processing is done.
Setting the client secret etc. and creating the oauth2Client just once at the top, then simply resetting the token for each request, leads to the conflicts mentioned above.
Solution:
For now, simply moving the creation of this oauth2Client into each and every request that comes in makes this work properly.
Each client that connects to the service NEEDS its own newly created oauth2Client instance, or these types of conflicts will occur...
...it's kind of a no-brainer, but I still find it odd that there is nothing about this in the docs, and their own examples (https://github.com/googleapis/google-api-nodejs-client) seem to show only one instance being created for the whole app... but those examples are snippets, so...
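For reference, here is a minimal sketch of the fix inside the route handler, with the rest of the /threads logic from the question elided:

googleApi.post('/threads', (req, res) => {
  // Create a fresh OAuth2 client per request so concurrent users
  // never share credentials (access_token etc.) in memory.
  const oauth2Client = new google.auth.OAuth2(
    clientSecretJson.web.client_id,
    clientSecretJson.web.client_secret,
    'https://us-central1-labelorganizer.cloudfunctions.net/oauth2callback'
  );
  oauth2Client.credentials = req.body.token;
  // ... the rest of the handler uses this request-scoped client ...
});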
I'm trying to set up HTTP/2 for an Express app I've built. As I understand it, Express does not support the npm http2 module, so I'm using SPDY. Here's how I'm thinking of going about it; I'd appreciate advice from people who've implemented something similar.
1) Server setup: I want to wrap my existing app with SPDY to keep my existing routes. Options are just an object with a key and a cert for SSL.
const app = express();
...all existing Express stuff, followed by:
spdy
  .createServer(options, app)
  .listen(CONFIG.port, (error) => {
    if (error) {
      console.error(error);
      return process.exit(1);
    } else {
      console.log('Listening on port: ' + CONFIG.port + '.');
    }
  });
2) At this point, I want to enhance some of my existing routes with a conditional PUSH response. I want to check whether there are any updates for the client making a request to the route (the client is called an endpoint, and the updates are an array of JSON objects called endpoint changes), and if so, push them to the client.
My idea is to write a function which takes res as one of its parameters, save the endpoint changes as a file (I haven't found a way to push non-file data), add the file to a push stream, and then delete the file. Is this the right approach? I also notice that the stream takes a second parameter, which is a req/res object; am I formatting it properly here?
const checkUpdates = async (obj, res) => {
  if (res.push) {
    // Parentheses matter here: await the call, then read the property
    const endpointChanges = (await updateEndpoint(obj)).endpointChanges;
    if (endpointChanges) {
      const changePath = `../../cache/endpoint-updates${new Date().toISOString()}.json`;
      const savedChanges = await jsonfile(changePath, endpointChanges);
      if (savedChanges) {
        let stream = res.push(changePath, { req: { 'accept': '*/*' }, res: { 'content-type': 'application/json' } });
        stream.on('error', function (err) {
          console.log(err);
        });
        stream.end();
        res.end();
        fs.unlinkSync(changePath);
      }
    }
  }
};
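As a side note on the 'non-file data' point: node-spdy's res.push returns a writable stream, so as far as I can tell you can push JSON straight from memory and skip the temporary file entirely. A sketch (the URL and payload names are illustrative; note the option keys are request/response):

// Push an in-memory JSON payload instead of writing a file to disk
const stream = res.push('/endpoint-updates.json', {
  status: 200,
  method: 'GET',
  request: { accept: '*/*' },
  response: { 'content-type': 'application/json' }
});
stream.on('error', (err) => console.log(err));
stream.end(JSON.stringify(endpointChanges));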
3) Then, within my routes, I want to call the checkUpdates method with the relevant parameters, like this:
router.get('/somePath', async (req, res) => {
  await checkUpdates({ someInfo }, res);
  ReS(res, {
    message: 'keepalive succeeded'
  }, 200);
});
Is this the right way to implement HTTP/2?
I am new to Electron and am converting a web app to a desktop application. I am loading pages from the file system. Cookies work when pages are served from a web server, but when I load pages from a local folder I am not able to save them. On the web I save cookies using document.cookie. I want to know how I can enable file:// cookies in Electron.
Well, I want to answer my own question in case somebody is having the same problem. I fixed the cookie problem with registerStandardSchemes. The sample code is as follows, and it works for saving cookies from web pages as well:
protocol.registerStandardSchemes(["app"], {
  secure: true
});
and in the ready event:
// Assumes: const url = require('url'); const path = require('path');
protocol.registerFileProtocol('app', (request, callback) => {
  const urls = request.url.substr(6)
  const parsedUrl = url.parse(urls);
  // extract URL path
  const pathname = `.${parsedUrl.pathname}`;
  // based on the URL path, extract the file extension, e.g. .js, .doc, ...
  const ext = path.parse(pathname).ext;
  callback({
    path: path.normalize(`${__dirname}/${parsedUrl.pathname}`)
  })
}, (error) => {
  if (error) {
    console.error('Failed to register protocol');
  }
});
Follow the documentation to get it done: https://electronjs.org/docs/api/cookies
const {session} = require('electron')

// Query all cookies.
session.defaultSession.cookies.get({}, (error, cookies) => {
  console.log(error, cookies)
})

// Query all cookies associated with a specific url.
session.defaultSession.cookies.get({url: 'http://www.github.com'}, (error, cookies) => {
  console.log(error, cookies)
})

// Set a cookie with the given cookie data;
// may overwrite equivalent cookies if they exist.
const cookie = {url: 'http://www.github.com', name: 'dummy_name', value: 'dummy'}
session.defaultSession.cookies.set(cookie, (error) => {
  if (error) console.error(error)
})
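Note that in recent Electron versions the cookies API is promise-based instead of callback-based, so the equivalent calls would look roughly like this:

const { session } = require('electron')

// Query all cookies (promise-based API in modern Electron).
session.defaultSession.cookies.get({})
  .then((cookies) => console.log(cookies))
  .catch((error) => console.error(error))

// Set a cookie.
const cookie = { url: 'http://www.github.com', name: 'dummy_name', value: 'dummy' }
session.defaultSession.cookies.set(cookie)
  .then(() => console.log('cookie set'))
  .catch((error) => console.error(error))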
OK, I got it working with Electron 5. Below are the relevant bits based on @zahid-nisar's solution, and below that a full sample Electron main.js to show how it all fits together. Obviously, change the location of your app in mainWindow.loadURL('app://www/index.html');.
Relevant code to insert in main.js:
const { protocol } = require('electron');

protocol.registerSchemesAsPrivileged([{
  scheme: 'app',
  privileges: {
    standard: true,
    secure: true
  }
}]);
Inside the app.on('ready') handler:
protocol.registerFileProtocol('app', (request, callback) => {
  const url = request.url.substr(6);
  callback({
    path: path.normalize(`${__dirname}/${url}`)
  });
}, (error) => {
  if (error) console.error('Failed to register protocol');
});
Then, inside your createWindow function, load your app like this:
mainWindow.loadURL('app://www/index.html');
And finally, here is a complete sample main.js with the above code (plus extras that I need, like Service Worker):
// Modules to control application life and create native browser window
const {
  app,
  protocol,
  BrowserWindow
} = require('electron');
const path = require('path');

// This is used to set capabilities of the app: protocol in onready event below
protocol.registerSchemesAsPrivileged([{
  scheme: 'app',
  privileges: {
    standard: true,
    secure: true,
    allowServiceWorkers: true,
    supportFetchAPI: true
  }
}]);

// Keep a global reference of the window object, if you don't, the window will
// be closed automatically when the JavaScript object is garbage collected.
let mainWindow;

function createWindow() {
  // Create the browser window.
  mainWindow = new BrowserWindow({
    width: 800,
    height: 600
    //, webPreferences: {
    //     preload: path.join(__dirname, 'preload.js')
    //   }
  });

  // and load the index.html of the app.
  mainWindow.loadURL('app://www/index.html');

  // DEV: Enable code below to check cookies saved by app in console log
  // mainWindow.webContents.on('did-finish-load', function() {
  //   mainWindow.webContents.session.cookies.get({}, (error, cookies) => {
  //     console.log(cookies);
  //   });
  // });

  // Open the DevTools.
  // mainWindow.webContents.openDevTools()

  // Emitted when the window is closed.
  mainWindow.on('closed', function () {
    // Dereference the window object, usually you would store windows
    // in an array if your app supports multi windows, this is the time
    // when you should delete the corresponding element.
    mainWindow = null;
  });
}

// This method will be called when Electron has finished
// initialization and is ready to create browser windows.
// Some APIs can only be used after this event occurs.
app.on('ready', () => {
  protocol.registerFileProtocol('app', (request, callback) => {
    const url = request.url.substr(6);
    callback({
      path: path.normalize(`${__dirname}/${url}`)
    });
  }, (error) => {
    if (error) console.error('Failed to register protocol');
  });

  // Create the new window
  createWindow();
});

// Quit when all windows are closed.
app.on('window-all-closed', function () {
  // On macOS it is common for applications and their menu bar
  // to stay active until the user quits explicitly with Cmd + Q
  if (process.platform !== 'darwin') app.quit();
});

app.on('activate', function () {
  // On macOS it's common to re-create a window in the app when the
  // dock icon is clicked and there are no other windows open.
  if (mainWindow === null) createWindow();
});

// In this file you can include the rest of your app's specific main process
// code. You can also put them in separate files and require them here.