I want to capture logs when something goes wrong in my Electron app in production mode, i.e. after handing the .exe file to a user on Windows.
How do I go about this? How can I write my errors to a file that is cyclic (rotating) in nature?
Take a look at electron-log:
// Setup file logging
const log = require('electron-log');
log.transports.file.level = 'info';
log.transports.file.file = __dirname + '/log.log';
// Log a message
log.info('log message');
EDIT:
As mentioned in the comments, "log.transports.file.file" is deprecated.
Instead, I would suggest using the following method:
log.transports.file.resolvePath = () => __dirname + "/log.log";
Create a file next to your electron.js called logger.js:
const path = require("path");
const log = require('electron-log');
log.transports.file.resolvePath = () => path.join(__dirname, '/logsmain.log');
log.transports.file.level = "info";
exports.log = (entry) => log.info(entry)
Then, in your app:
const logger = require('electron').remote.require('./logger');
logger.log("some text")
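Since the question also asks for cyclic logging and for capturing production errors, here is a minimal sketch of how that could look in the main process. It assumes an electron-log version whose file transport exposes a maxSize rotation option; the error hooks themselves are plain Node.js process events:
// main process: rotate the log file and capture unexpected errors
const log = require('electron-log');

log.transports.file.level = 'info';
// assumed option: archive the file once it grows past ~1 MB
log.transports.file.maxSize = 1048576;

// write crash information to the same file before the app dies
process.on('uncaughtException', (err) => {
  log.error('Uncaught exception:', err);
});

process.on('unhandledRejection', (reason) => {
  log.error('Unhandled promise rejection:', reason);
});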
Please have a look here:
https://www.electronjs.org/docs/api/net-log
const { netLog } = require('electron')
app.whenReady().then(async () => {
await netLog.startLogging('/path/to/net-log')
// After some network events
const path = await netLog.stopLogging()
console.log('Net-logs written to', path)
})
Hi,
I have an app on node.js which consists of a single file app.js that looks like this:
//variables
app = require("express")();
//many more variables here
//functions
function dosomething() {}
//many more functions here
but since it's getting a little too long, I would like to break it into several files, one for variables only (variables.js) and another for functions only (functions.js), and load them from app.js the way you would in PHP:
//variables
include(variables.js);
//functions
include(functions.js);
is it even possible to do that? Or I have to include everything in one single file like I do now?
Thank you.
You can use module.exports to export from a separate file, and import it into another file using require. Please check here for details:
https://www.geeksforgeeks.org/import-and-export-in-node-js/
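For example, a minimal sketch (the file names here are just placeholders):
// variables.js
const appName = "my-app";
module.exports = { appName };

// app.js
const { appName } = require("./variables");
console.log(appName); // "my-app"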
Happy Learning :-)
Importing API Endpoints
You can do this by using app.use(...) and point each endpoint to a specific file like so:
const express = require("express");
const app = express();
// User Functions
app.use("/api/user", require("./routes/api/user"));
//Orders functions
app.use("/api/orders/", require("./routes/api/orders"));
/**
* Express Server Init
*/
const PORT = process.env.PORT || 5000;
app.listen(PORT, () => console.log(`Server started on ${PORT}`));
Then in /routes/api/user/user.js you would have something like:
const express = require("express");
const router = express.Router();
router.post("/create", (req, res) => {
try {
// Create user
} catch (error) {
console.log(error);
res.sendStatus(500);
}
});
module.exports = router;
Add an index.js inside /routes/api/user that points at the user file, to keep things pretty when importing (otherwise you have to import it as /routes/api/user/user):
const user = require("./user");
module.exports = user;
Importing Single File
Not sure of your use case, but variables could be a poor naming choice since these values are more like constants than variables. Either way, you can create it like this:
const variables = {
variableOne: "valueOne",
variableTwo: "valueTwo",
variableThree: "valueThree",
};
module.exports = variables;
Then wherever you want to import it you can do:
const variables = require("./variables");
and access it like variables.variableOne, and so on.
Importing functions
You can also export functions. Say you need a file called helper.js containing common functions used all over your app; you could do something like this:
const twoDecimals = (number, decimal = ",") => {
let val = (Math.round(number * 100) / 100).toFixed(2);
return decimal === "." ? val : val.replace(".", decimal);
};
const getRandomInt = (max) => {
return Math.floor(Math.random() * Math.floor(max));
};
module.exports = { twoDecimals, getRandomInt };
Then wherever you need it, you can import it with:
const { twoDecimals } = require("./helper.js");
Now you have access to your helper functions anywhere.
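For instance, a quick usage check (assuming helper.js sits next to the calling file):
const { twoDecimals, getRandomInt } = require("./helper.js");

console.log(twoDecimals(3.14159));      // "3,14" (comma is the default separator)
console.log(twoDecimals(3.14159, ".")); // "3.14"
console.log(getRandomInt(10));          // an integer between 0 and 9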
You should use the JavaScript module system (in this case CommonJS).
For example, suppose we have two files: module.js and main.js.
So now let's create these files.
module.js
let name = "hello world";
function printSomething(message) {
console.log(message)
}
// here we export the function and the variable
module.exports.name = name;
module.exports.printSomething = printSomething
OK, so now it is enough to require this file in the main file:
main.js
// import the exported values from module.js
const {name, printSomething} = require("./module.js");
printSomething(name);
To export all variables, you need to create an object and specify your variables as its properties:
let host = "localhost"
let dbname = "laravel_8"
let username = "root"
let password = "root"
function doSomething() {
console.log("hello");
}
module.exports = {host, dbname, username, password, doSomething}
Then in the main file:
const variables = require("./module.js")
//host
let host = variables.host
//dbname
let dbname = variables.dbname
//function doSomething
let doSomething = variables.doSomething;
doSomething()
// or directly
variables.doSomething()
In fact, in PHP we use the "->" operator to access properties, while in JavaScript we use ".".
The function below creates mylogfile.log, but it doesn't log my API responses with a timestamp, or errors in the app.
const SimpleNodeLogger = require('simple-node-logger'),
opts = {
logFilePath:'mylogfile.log',
timestampFormat:'YYYY-MM-DD HH:mm:ss.SSS'
},
log = SimpleNodeLogger.createSimpleLogger( opts );
Seems you're missing something... here's an example:
// utilities/logger.js
const SimpleNodeLogger = require('simple-node-logger');
const opts = {
logFilePath:'mylogfile.log',
timestampFormat:'YYYY-MM-DD HH:mm:ss.SSS'
};
const log = SimpleNodeLogger.createSimpleLogger(opts);
module.exports = log;
and then just use it:
// index.js
const logger = require('./utilities/logger');
logger.info(`I'm an information line`);
logger.debug(`I'm a debug line`);
logger.error(`I'm an error line`);
That will output to a newly created file called mylogfile.log:
2020-12-25 13:37:17.139 INFO I'm an information line
2020-12-25 13:37:17.140 ERROR I'm an error line
Set the log level if you want to output more info, such as debug. All the options are on the package page under "How to use".
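For the API-response part of the question, here is a sketch of how the shared logger could wrap a request handler. fetchOrders is just a hypothetical async API call, and setLevel is assumed to be available for raising verbosity at runtime; every line written gets the timestamp configured in timestampFormat:
// somewhere in your API layer
const logger = require('./utilities/logger');

// assumption: raise verbosity at runtime; drop this if the default 'info' is enough
logger.setLevel('debug');

async function handleOrders(req, res) {
  try {
    const response = await fetchOrders(req.params.id); // hypothetical API call
    logger.info('orders response: ', JSON.stringify(response));
    res.json(response);
  } catch (err) {
    logger.error('orders request failed: ', err.message);
    res.sendStatus(500);
  }
}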
I'm using NodeJS on a VM. One part of it serves up pages, and another part is an API. I've run into a problem where fs.createReadStream attempts to access a different path than what is being passed into the function. I made a small test server to check whether something else in the server was affecting path usage, but it's happening on my test server as well. First, here's the code:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');
const readCSV = (filename) => {
console.log('READ CSV GOT ' + filename); // show me what you got
return new Promise((resolve, reject) => {
const arr = [];
fs.createReadStream(filename)
.pipe(csv())
.on('data', row => {
arr.push(row);
})
.on('error', err => {
console.log(err);
})
.on('end', () => {
resolve(arr);
});
});
};
// tried this:
// const dir = path.relative(
// path.join('path', 'to', 'this', 'file'),
// path.join('path', 'to', 'CONTENT.csv')
// );
// tried a literal relative path:
// const dir = '../data/CONTENT.csv';
// tried a literal absolute path:
// const dir = '/repo/directory/server/data/CONTENT.csv';
// tried an absolute path:
const dir = path.join(__dirname, 'data', 'CONTENT.csv');
const content = readCSV(dir)
.then(result => {console.log(result[0]);})
.catch(err => {console.log(err);});
...but any way I slice it, I get the following output:
READCSV GOT /repo/directory/server/data/CONTENT.csv
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '/repo/directory/data/CONTENT.csv'
i.e., is fs.createReadStream somehow stripping out the directory of the server, for some reason? I suppose I could hard code the directory into the call to createReadStream, maybe? I just want to know why this is happening.
Some extra info: I'm stuck on Node v8.11 and can't go any higher. On the server itself, I believe I'm using the older function(param) {...} syntax instead of arrow functions -- but the behavior is exactly the same.
Please help!!
The code is working perfectly.
I think your file CONTENT.csv should be in the data folder, i.e. "/repo/directory/data/CONTENT.csv".
I'm answering my own question because I found an answer; I'm not entirely sure why it works, but at least it's interesting. To the best of my estimation, it has something to do with the call stack and what Node.js identifies as the origin of the function call. I've got my server set up in an MVC pattern, so my main app.js is in the root dir and the function being called is in the /controllers folder, and I've been trying to use relative paths from that folder -- I'm still not sure why absolute paths didn't work.
The call stack goes:
app.js:
app.use('/somepath', endpointRouter);
...then in endpointRouter.js:
router.get('/request/file', endpointController.getFile);
...then finally in endpointController.js:
const readCSV = filename => {
//the code I shared
}
exports.getFile = (req, res, next) => {
// code that calls readCSV(filename)
}
...and I believe that because Node views the chain as originating from app.js, it then treats all relative paths as relative to app.js, in my root folder. Basically when I switched to the super unintuitive single-dot-relative path: './data/CONTENT.csv', it worked with no issue.
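Worth noting: fs resolves relative paths against process.cwd() (the directory the node process was started from), not against the file that calls it, which is why './data/CONTENT.csv' works when the app is launched from the project root. A sketch that stays independent of the working directory, assuming the controller lives in /controllers and the CSV in /data:
// endpointController.js
const path = require('path');

// __dirname is always the directory of this file, regardless of where
// the node process was started, so this resolves to <root>/data/CONTENT.csv
const csvPath = path.join(__dirname, '..', 'data', 'CONTENT.csv');

readCSV(csvPath)
  .then(rows => console.log(rows[0]))
  .catch(err => console.log(err));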
I have this async function in a file called index.main.js, which has a variable, 'targetFiles', that I would like to export to another file, index.js. The problem is I can't find a way to export the value of this particular variable without getting "undefined" as a result.
I have tried implementing a promise, a callback, and export default, and doing countless hours of research, to no avail.
//this code is in index.main.js
var targetFiles = "";
async function listFilesInDepth()
{
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucketName = 'probizmy';
const [files] = await storage.bucket(bucketName).getFiles();
console.log('List Of Files Available:');
files.forEach(file =>
{
targetFiles = file.name; //this is the variable to export
console.log(`-----`+file.name);
});
return targetFiles;
}
module.exports = {
fn : targetFiles
}
Trying to export the value to index.js gives either an empty value or "undefined":
//this is the code in index.js
const a = require('./index.main');
console.log(a.fn); //undefined or empty
The expected output is the value of targetFiles. Say targetFiles is abc12345.JSON in the async function; the console.log in index.js should print that value.
I'm hoping someone could give me some insight on how I could overcome this issue. Thank you in advance :)
The following solution might help you, though I'm not sure it fits your use case (it does not use module.exports):
You can use request-context package to achieve same functionality.
What this package does is let you set a value (data) against a key and then access it later in code executing within the same context.
Run npm install request-context
In your main app.js (server file), register request-context as middleware.
const contextService = require("request-context");
const app = express();
...
// Multiple contexts are supported; below we set one per request.
app.use(contextService.middleware("request"));
...
Then in your index.main.js, once targetFiles is ready, set it in the request context.
const contextService = require("request-context");
...
files.forEach(file =>
{
targetFiles = file.name; //this is the variable to export
console.log(`-----`+file.name);
});
// Here "request" is the namespace (context), targetFileKey is the key, and targetFiles is the value.
contextService.set("request:targetFileKey", targetFiles);
return targetFiles;
}
...
Then, within the same request, wherever you want to use targetFiles, you can do the following:
index.js (can be any file where you need targetFiles after it has been set):
const contextService = require("request-context");
...
// Reading from the same "request" namespace we set earlier
const targetFiles = contextService.get("request:targetFileKey");
...
Please note:
You will only be able to access targetFiles within the same request in which you set it. The request-context we configured in app.js is scoped per API request, so in every API request you have to set the value before reading it.
Please let me know if the above solution doesn't fit for you.
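For completeness, here is a sketch of the plain module.exports route the question was attempting: export the async function (or the promise it returns) instead of the variable, so the caller can await the finished value. This is an alternative to request-context, not part of it:
// index.main.js
async function listFilesInDepth() {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const [files] = await storage.bucket('probizmy').getFiles();
  // keep the last file name, as the original loop effectively did
  let targetFiles = '';
  files.forEach(file => { targetFiles = file.name; });
  return targetFiles;
}
module.exports = { listFilesInDepth };

// index.js
const { listFilesInDepth } = require('./index.main');
listFilesInDepth().then(targetFiles => console.log(targetFiles));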
I have set up an angular-cli project
(@angular/cli: 1.0.0-rc.2, node: 6.10.0, os: linux x64)
with Electron (v1.6.2),
and I need to use the filesystem to create/delete .csv files and folders, but I cannot import it in the Angular component.
How could you configure angular-cli to be able to do: import fs from 'fs'?
You wouldn't configure Angular-CLI to use the NodeJS fs module.
In Electron you have two processes: main and renderer. The main process controls items such as the BrowserWindow, which is essentially the 'window' the user sees when they open your app, and which in turn loads the HTML file for the view. Here, in the main process, you import the fs module.
In the renderer process, you handle actions from the view and send them to the main process. This is where you use IPC to communicate via events: once an event is triggered, the renderer process sends it to main, and main does something with it, for example opening a file on the desktop.
I would recommend using the Electron API demo application to see clear examples of this. Here is an example of printing to PDF using fs (from the demo app).
Also, here is an Electron application example on GitHub written by Ray Villalobos using React, which covers some similar concepts and will show you how to integrate components in your app.
Renderer process:
const ipc = require('electron').ipcRenderer
const printPDFBtn = document.getElementById('print-pdf')
printPDFBtn.addEventListener('click', function (event) {
ipc.send('print-to-pdf')
})
ipc.on('wrote-pdf', function (event, path) {
const message = `Wrote PDF to: ${path}`
document.getElementById('pdf-path').innerHTML = message
})
Main Process:
const fs = require('fs')
const os = require('os')
const path = require('path')
const electron = require('electron')
const BrowserWindow = electron.BrowserWindow
const ipc = electron.ipcMain
const shell = electron.shell
ipc.on('print-to-pdf', function (event) {
const pdfPath = path.join(os.tmpdir(), 'print.pdf')
const win = BrowserWindow.fromWebContents(event.sender)
// Use default printing options
win.webContents.printToPDF({}, function (error, data) {
if (error) throw error
fs.writeFile(pdfPath, data, function (error) {
if (error) {
throw error
}
shell.openExternal('file://' + pdfPath)
event.sender.send('wrote-pdf', pdfPath)
})
})
})
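For the Angular component itself (which runs in the renderer), a minimal sketch of triggering a flow like this, assuming nodeIntegration is enabled so window.require is available and the Angular build does not try to bundle the Electron module; the 'write-csv' / 'csv-written' channel names are just placeholders for whatever your main process handles:
// renderer side, e.g. called from an Angular component method
const { ipcRenderer } = window.require('electron');

function requestCsvExport(rows) {
  // ask the main process to do the actual fs work on a custom channel
  ipcRenderer.send('write-csv', rows);
}

ipcRenderer.on('csv-written', (event, filePath) => {
  console.log('CSV written to', filePath);
});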
You can try using const fs = (<any>window).require("fs"); within the component, or better still, create a service.ts provider to handle I/O operations.