I have a large collection of asynchronous functions in my nodejs code that I would like to expose to lua. The basic idea is that I would like to execute lua scripts and allow those scripts to call back into some of my nodejs code, as well as asynchronously return a value from an executed lua script.
In this example myCustomNodejsAddon would be a custom addon that I write that knows how to bind lua and run lua scripts. One outstanding question is how do I asynchronously return a value from a lua script?
Has anyone done something like this before? I would be very interested in any pointers, thoughts, examples.
EDIT with better example:
-- user written lua script
getUser(1, function(err, user)
    if err then
        print('Error', err)
    else
        print('Found user with id', user.id)
        return ''
    end
end)
/*Create object with mapping of async functions*/
var callbacks = {
    "getUser": function(userId, cb) {
        db.Users.fetchById(userId).then(function(user) {
            cb(null, user);
        }, function(err) {
            cb(err, null);
        });
    }
};
myCustomNodejsAddon.provideCallbacks(callbacks);
/* user written lua script has been stored into `scriptSrc` variable */
myCustomNodejsAddon.execute(scriptSrc, function(returnValueOfScript) {
    console.log('done running user script: ', returnValueOfScript);
});
More than one approach to this problem comes to mind.
The first would be to create a nodejs script that, once executed, reads the program's command-line arguments or input stream, executes the code indicated by that channel, and streams the response back, in JSON format for example. This is the least invasive way of doing it. The script would be something like:
if(require.main === module){
    // assume the first argument names the source module for the function of interest
    var mod = require(process.argv[2]);
    var fnc = mod[process.argv[3]];
    var args = process.argv.slice(4);
    // by convention the last argument is a callback function
    args.push(function(){
        console.log(JSON.stringify(arguments));
        process.exit();
    });
    fnc.apply(null, args);
}
An example usage will be:
$ node my-script.js fs readdir /some/path
This will respond with something like [null, ['file1', 'file2']], according to the files in /some/path. Then you can create a lua module that invokes node with this script and passes the parameters according to the functions you want to call.
I am trying to write a gjs app that needs to send emails.
The way I have found to do this is using spawn_async_with_pipes() to call mail.
The app seems to spawn mail, and I don't get an error, but I don't get any useful output nor do I get the test emails...
I have been at this for a while now and have found little to no useful, up-to-date documentation. I am working with gtk3 and gjs (and glib). I have also tried spawning a shell script that in turn calls mail. That resulted in "could not resolve host" errors and a dead letter queue, so I know that I am spawning my command. I am not concerned about the "could not resolve host" error, but by the fact that I don't get it when spawning mail directly.
I am spawning mail like this:
const [res, pid, in_fd, out_fd, err_fd] =
    await GLib.spawn_async_with_pipes(null,
        ['mail',
            '-V',
            `-s "${msgObj.subBlock}"`,
            `-r ${to}`,
            `-S smtp=${HOST}`,
            '-S smtp-use-starttls',
            '-S smtp-auth=login',
            `-S smtp-auth-user=${USER}`,
            `-S smtp-auth-password=${PASS}`,
            FROM
        ], null, GLib.SpawnFlags.SEARCH_PATH, null);
const in_reader = new Gio.DataOutputStream({
    base_stream: new Gio.UnixOutputStream({fd: in_fd})
});
var feedRes = in_reader.put_string(msgObj.msgBlock, null);
const out_reader = new Gio.DataInputStream({
    base_stream: new Gio.UnixInputStream({fd: out_fd})
});
const err_reader = new Gio.DataInputStream({
    base_stream: new Gio.UnixInputStream({fd: err_fd})
});
var out = out_reader.read_until("", null);
var err = err_reader.read_until("", null);
print(` > out : "${out}"`);
print(` > res : "${res}"`);
print(` > feedRes : "${feedRes}"`);
print(` > err : "${err}"`);
err is 0, and res is just true
I don't know what the output should be, but I'm not getting a recognizable error, and no email is being delivered...
How can I get my app to send emails? Is spawning mail not the way to go?
Thanks in advance for any pointers you can give me.
There are a couple of things here that I think are confusing you and that I can clear up.
await GLib.spawn_async_with_pipes(
GLib has its own concept of async functions, which, when applicable, need to be wrapped in a Promise to work effectively with the await keyword. In this case, GLib.spawn_async_with_pipes() is not asynchronous in the way you're thinking, but that's okay because we're going to use the higher-level class Gio.Subprocess.
async function mail(msgObj, to, host, user, pass, cancellable = null) {
    try {
        let proc = new Gio.Subprocess({
            argv: ['mail',
                '-V',
                // Option switches and values are separate args
                '-s', `"${msgObj.subBlock}"`,
                '-r', `${to}`,
                '-S', `smtp=${host}`,
                '-S', 'smtp-use-starttls',
                '-S', 'smtp-auth=login',
                '-S', `smtp-auth-user=${user}`,
                '-S', `smtp-auth-password=${pass}`,
                FROM
            ],
            flags: Gio.SubprocessFlags.STDIN_PIPE |
                   Gio.SubprocessFlags.STDOUT_PIPE |
                   Gio.SubprocessFlags.STDERR_MERGE
        });

        // Classes that implement GInitable must be initialized before use, but
        // you could use Gio.Subprocess.new(argv, flags) which will call this for you
        proc.init(cancellable);

        // We're going to wrap a GLib async function in a Promise so we can
        // use it like a native JavaScript async function.
        //
        // You could alternatively return this Promise instead of awaiting it
        // here, but that's up to you.
        let stdout = await new Promise((resolve, reject) => {
            // communicate_utf8() returns a string, communicate() returns a
            // GLib.Bytes, and there are "headless" functions available as well
            proc.communicate_utf8_async(
                // This is your stdin, which can just be a JS string
                msgObj.msgBlock,
                // We've been passing this around from the function args; you can
                // create a Gio.Cancellable and call `cancellable.cancel()` to
                // stop the command or any other operation you've passed it to at
                // any time, which will throw an "Operation Cancelled" error.
                cancellable,
                // This is the GAsyncReady callback, which works like any other
                // callback, but we need to ensure we catch errors so we can
                // propagate them with `reject()` to make the Promise work
                // properly
                (proc, res) => {
                    try {
                        let [ok, stdout, stderr] = proc.communicate_utf8_finish(res);
                        // Because we used the STDERR_MERGE flag, stderr will be
                        // included in stdout. Obviously you could also call
                        // `resolve([stdout, stderr])` if you wanted to keep both
                        // and separate them.
                        //
                        // This won't affect whether the proc actually returns
                        // non-zero, causing the Promise to reject()
                        resolve(stdout);
                    } catch (e) {
                        reject(e);
                    }
                }
            );
        });

        return stdout;
    } catch (e) {
        // This could be any number of errors, but probably it will be a GError,
        // in which case it will have a `code` property carrying a GIOErrorEnum
        // you could use to respond to programmatically, if desired.
        logError(e);
    }
}
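For completeness, a hypothetical call site might look something like this (it assumes msgObj, to, HOST, USER, PASS and FROM are defined as in the question, and that a GLib main loop is running so the async operation can actually complete):
mail(msgObj, to, HOST, USER, PASS).then(output => {
    print(`mail output: ${output}`);
});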
Gio.Subprocess is a better choice overall, but especially for language bindings that can't pass "out" arguments into functions. Using GLib.spawn_async_with_pipes you would usually pass in NULL to prevent opening any pipes you didn't want, and always ensure you close the ones that do get opened. Since we can't do that in GJS, you can end up with dangling file descriptors you can't close.
Gio.Subprocess does a lot of leg work for you and ensures file descriptors are closing, prevents zombie processes, sets up child watches for you and other things you really don't want to worry about. It also has convenience functions for getting IO streams so you don't have to wrap the fd's yourself, among other useful things.
I wrote a longer primer on async programming in GJS that you might find helpful here. You should be able to breeze through it pretty quickly, and it tries to clear up some confusion about the relationship between GLib async and JavaScript async, and the GLib main loop vs the JS event loop.
I am trying to work with node.js and node-java and trying to get my head wrapped around some concepts, and in particular how to write async method calls.
I think that, for a function in Java, myclass.x():
[In Java]:
Z = myclass.x(abc);
And:
[In node.js/node-java]:
myclass.x(abc, function(err,data) {
//TODO
Z = data;});
In other words, the myclass.x function gets evaluated using the parameter abc, and if no error, then the result goes into "data" which is then assigned to Z.
Is that correct?
Here's the thing (or one of the things) that I am confused about.
What happens if the function myclass.x() doesn't take any parameters?
In other words, it is normally (in Java) just called like:
Z = myclass.x();
If that is the case, how should the node.js code look?
myclass.x(, function(err,data) {
//TODO
Z = data;});
doesn't seem right, but:
myclass.x( function(err,data) {
//TODO
Z = data;});
also doesn't seem correct.
So what is the correct way to code the node.js code in this case?
Thanks in advance!!
Jim
EDIT 1: Per the comments, I'm adding the specific code I'm working with. It is the last couple of commented-out lines from this other question:
node.js and node-java: What is equivalent node.js code for this java code?
These are the lines (commented out in that other question):
var MyFactoryImplClass = java.import("oracle.security.jps.openaz.pep.PepRequestFactoryImpl.PepRequestFactoryImpl");
var result = myFactoryImplClass.newPepRequest(newSubject, requestACTIONString ,requestRESOURCEString , envBuilt)
I tried to make the last line use an async call:
MyFactoryImplClass.getPepRequestFactory( function(err,data) {
//TODO
pepReqF1=data;})
javaLangSystem.out.printlnSync("Finished doing MyFactoryImplClass.getPepRequestFactory() and stored it in pepReqF1 =[" + pepReqF1 + "]");
But the output was showing the value of that pepReqF1 as "undefined".
If calling the method with one parameter and a callback is:
myclass.x(abc, function(err, data) {
// ...
});
Then calling a method with only a callback would be:
myclass.x(function(err, data) {
// ...
});
The function(err, data) { } part is just a normal parameter just like abc. In fact, you can pass a named function with:
function namedFun(err, data) {
// ...
}
myclass.x(abc, namedFun);
Or even:
var namedFun = function (err, data) {
// ...
}
myclass.x(abc, namedFun);
Functions in JavaScript are first-class objects like strings or arrays. You can pass a named function as a parameter to some other function:
function fun1(f) {
return f(10);
}
function fun2(x) {
return x*x;
}
fun1(fun2);
just like you can pass a named array:
function fun3(a) {
return a[0]
}
var array = [1, 2, 3];
fun3(array);
And you can pass an anonymous function as a parameter:
function fun1(f) {
return f(10);
}
fun1(function (x) {
return x*x;
});
just like you can pass an anonymous array:
function fun3(a) {
return a[0]
}
fun3([1, 2, 3]);
There is also a nice shortcut so that instead of:
fun1(function (x) {
return x*x;
});
You can write:
fun1(x => x*x);
Making my comment into an answer...
If the issue you're experiencing is that Z does not have the value you want when you are examining it, then that is probably because of a timing issue. Asynchronous callbacks happen at some unknown time in the future while the rest of your code continues to run. Because of that, the only place you can reliably use the result passed to the asynchronous callback is inside the callback itself or in some function you would call from that function and pass it the value.
So, if your .x() method calls its callback asynchronously, then:
var Z;
myclass.x(function(err, data) {
    // use the err and data arguments here inside the callback
    Z = data;
});
console.log(Z); // outputs undefined
// You can't access Z here, even when it is assigned
// to a higher-scoped variable, because the callback has not yet
// been called when this code executes.
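For example, one way to make the value usable elsewhere is to hand it on through your own callback instead of assigning it to an outer variable (a sketch, assuming the same myclass.x as above):
function getZ(done) {
    myclass.x(function (err, data) {
        if (err) return done(err);
        done(null, data); // pass the result to whoever asked for it
    });
}

getZ(function (err, Z) {
    if (err) return console.error(err);
    console.log(Z); // Z is reliably available here, inside the callback chain
});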
You can see this a little more clearly by understanding the sequencing:
console.log('A');
someAsyncFunction(function() {
    console.log('B');
});
console.log('C');
This will produce a log of:
A
C
B
Showing you that the async callback happens some time in the future, after the rest of your sequential code has executed.
Java, on the other hand, primarily uses blocking I/O (the function doesn't return until the I/O operation is complete), so you don't usually have this asynchronous behavior that is standard practice in node.js. Note: I believe there are some asynchronous capabilities in Java, but that isn't the typical way things are done there, whereas in node.js it is.
This creates a bit of an architectural mismatch if you're trying to port code that uses I/O from one environment to the other, because the structure has to be redone in order to work properly in a node.js environment.
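If you'd rather not nest everything in callbacks, one common restructuring is to wrap the callback-style call in a Promise. A rough sketch, assuming getPepRequestFactory takes a standard (err, data) callback as in the snippet above:
function getPepRequestFactoryAsync(MyFactoryImplClass) {
    return new Promise(function (resolve, reject) {
        MyFactoryImplClass.getPepRequestFactory(function (err, data) {
            if (err) return reject(err);
            resolve(data);
        });
    });
}

// usage: the value is only available once the promise resolves
getPepRequestFactoryAsync(MyFactoryImplClass).then(function (pepReqF1) {
    console.log('Finished getting the factory:', pepReqF1);
});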
I am having a problem with async shell executes in node.js.
In my case, node.js is installed on a Linux operating system on a Raspberry Pi. I want to fill an array with values that are parsed from a shell script which is called on the Pi. This works fine; however, the exec() function is called asynchronously.
I need the function to be absolutely synchronous to avoid messing up my whole system. Is there any way to achieve this? Currently I am trying a lib called .exe, but the code still seems to behave asynchronously.
Here's my code:
var exec = require('child_process').exec;

function execute(cmd, cb)
{
    var child = exec(cmd, function(error, stdout, stderr)
    {
        cb(stdout, stderr);
    });
}
function chooseGroup()
{
    var groups = [];
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr)
    {
        groups_str = stdout;
        groups = groups_str.split("\n");
    });
    return groups;
}
//Test
console.log(chooseGroup());
If what you're using is child_process.exec, it is asynchronous already.
Your chooseGroup() function will not work properly because it is asynchronous. The groups variable will always be empty.
Your chooseGroup() function can work if you change it like this:
function chooseGroup() {
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr) {
        var groups = stdout.split("\n");
        // put the code here that uses groups
        console.log(groups);
    });
}
// you cannot use groups here because the result is obtained asynchronously
// and thus is not yet available here.
If, for some reason, you're looking for a synchronous version of .exec(), there is child_process.execSync() though it is rarely recommended in server-based code because it is blocking and thus blocks execution of other things in node.js while it is running.
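If you really do need the synchronous behaviour described in the question, a minimal sketch using child_process.execSync might look like this (it blocks the event loop while the script runs, so use it sparingly):
const { execSync } = require('child_process');

function chooseGroupSync() {
    // blocks until group_show.sh finishes, then returns the parsed lines
    const stdout = execSync('bash /home/pi/scripts/group_show.sh', { encoding: 'utf8' });
    return stdout.split('\n');
}

console.log(chooseGroupSync());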
I am creating a (nodejs) chat bot that reads messages and should act upon them. Due to its nature, the easy way to select which command should be executed is a series of if/else statements, each of which executes a directive (a command) to which I pass a callback function.
Is there a better way? Is there any way to use a YAML or any other XML-like config file to assign the route/controller?
Here is a code example. Full project here
var main = require('./Directives/default');

if (workingMsgBody.indexOf("help") > -1) {
    main.help(function(data) {
        outputChannel.sendOutput(msg.medium, data);
    });
}
// multiple else if statements follow

// in Directives/default.js:
module.exports.help = function(callback) {
    callback('blah blah');
}
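One possible direction (just a sketch, not code from the linked project) is to replace the if/else chain with a lookup table keyed by keyword; the table itself could then be built from a YAML or JSON config file instead of being hard-coded:
var main = require('./Directives/default');

// keyword -> directive; each handler has the same (callback) signature as main.help
var routes = {
    help: main.help
    // add more directives here, or build this object from a config file
};

var matched = Object.keys(routes).find(function (keyword) {
    return workingMsgBody.indexOf(keyword) > -1;
});

if (matched) {
    routes[matched](function (data) {
        outputChannel.sendOutput(msg.medium, data);
    });
}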
We have a buffer we'd like to write to a file. If the file already exists, we need to increment an index on it, and try again. Is there a way to create a file only if it doesn't exist, or should I just stat files until I get an error to find one that doesn't exist already?
For example, I have files a_1.jpg and a_2.jpg. I'd like my method to try creating a_1.jpg and a_2.jpg, and fail, and finally successfully create a_3.jpg.
The ideal method would look something like this:
fs.writeFile(path, data, { overwrite: false }, function (err) {
    if (err) throw err;
    console.log('It\'s saved!');
});
or like this:
fs.createWriteStream(path, { overwrite: false });
Does anything like this exist in node's fs library?
EDIT: My question isn't if there's a separate function that checks for existence. It's this: is there a way to create a file if it doesn't exist, in a single file system call?
As your intuition correctly guessed, the naive solution with a pair of exists / writeFile calls is wrong. Asynchronous code runs in unpredictable ways, and in the given case the race looks like this:
Is there a file a.txt? — No.
(File a.txt gets created by another program)
Write to a.txt if it's possible. — Okay.
But yes, we can do that in a single call. We're working with the file system, so it's a good idea to read the developer manual on fs. And hey, here's an interesting part.
'w' - Open file for writing. The file is created (if it does not exist) or truncated (if it exists).
'wx' - Like 'w' but fails if path exists.
So all we have to do is add wx to the fs.open call. But hey, we don't like fopen-like IO. Let's read up on fs.writeFile a bit more.
fs.writeFile(filename, data[, options], callback)
filename String
data String | Buffer
options Object
    encoding String | Null default = 'utf8'
    flag String default = 'w'
callback Function
That options.flag looks promising. So we try
fs.writeFile(path, data, { flag: 'wx' }, function (err) {
    if (err) throw err;
    console.log("It's saved!");
});
And it works perfectly for a single write. I guess this code will fail in some more bizarre ways if you try to solve your full task with it. You have an atomic "check for a_#.jpg existence, and write there if it's empty" operation, but all the other fs state is not locked, and the a_1.jpg file may spontaneously disappear while you're already checking a_5.jpg. Most file systems are not ACID databases, and the fact that you're able to do at least some atomic operations is miraculous. It's very likely that the wx code won't work on some platform. So for the sake of your sanity, use a database, finally.
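That said, for the a_#.jpg naming scheme from the question, a rough sketch of the retry loop might look like this (assuming data is the buffer you want to write; each individual attempt is atomic thanks to 'wx', but the overall sequence is still subject to the caveats above):
function writeNext(i, done) {
    var name = 'a_' + i + '.jpg';
    fs.writeFile(name, data, { flag: 'wx' }, function (err) {
        if (err && err.code === 'EEXIST') {
            writeNext(i + 1, done);   // that index is taken, try the next one
        } else if (err) {
            done(err);                // some other failure
        } else {
            done(null, name);         // created successfully
        }
    });
}

writeNext(1, function (err, name) {
    if (err) throw err;
    console.log('saved as', name);
});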
Some more info for the suffering
Imagine we're writing something like memoize-fs that caches results of function calls to the file system to save us some network/cpu time. Could we open the file for reading if it exists, and for writing if it doesn't, all in a single call? Let's take a closer look at those flags. After a while of mental exercise we can see that a+ does what we want: if the file doesn't exist, it creates one and opens it both for reading and writing, and if the file exists it does so without clearing the file (as w+ would). But now we can use it neither in the (smth)File nor in the create(Smth)Stream functions. And that seems like a missing feature.
So feel free to file it as a feature request (or even a bug) to Node.js github, as lack of atomic asynchronous file system API is a drawback of Node. Though don't expect changes any time soon.
Edit. I would like to link to the articles by Linus and by Dan Luu on why exactly you don't want to do anything smart with your fs calls, because the claim above was otherwise left mostly unsupported.
What about using the a option?
According to the docs:
'a+' - Open file for reading and appending. The file is created if it does not exist.
It seems to work perfectly with createWriteStream
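For instance, a minimal sketch (note that createWriteStream spells the option flags, plural):
const fs = require('fs');

// opens the file for reading and appending, creating it if it doesn't exist;
// existing content is kept rather than truncated
const stream = fs.createWriteStream('a_1.jpg', { flags: 'a+' });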
This method is no longer recommended. fs.exists is deprecated. See comments.
Here are some options:
1) Have 2 "fs" calls. The first one is the "fs.exists" call, and the second is "fs.write / read, etc"
// Checks if the file exists.
// If it does, it just calls back.
// If it doesn't, then the file is created first.
function checkForFile(fileName, callback)
{
    fs.exists(fileName, function (exists) {
        if (exists)
        {
            callback();
        }
        else
        {
            fs.writeFile(fileName, '', { flag: 'wx' }, function (err)
            {
                callback();
            });
        }
    });
}

function writeToFile()
{
    checkForFile("file.dat", function()
    {
        // It is now safe to write/read to file.dat
        fs.readFile("file.dat", function (err, data)
        {
            // do stuff
        });
    });
}
2) Or Create an empty file first:
--- Sync:
//If you want to force the file to be empty then you want to use the 'w' flag:
var fd = fs.openSync(filepath, 'w');
//That will truncate the file if it exists and create it if it doesn't.
//Wrap it in an fs.closeSync call if you don't need the file descriptor it returns.
fs.closeSync(fs.openSync(filepath, 'w'));
--- Async:
var fs = require("fs");
fs.open(path, "wx", function (err, fd) {
    // handle error
    fs.close(fd, function (err) {
        // handle error
    });
});
3) Or use "touch": https://github.com/isaacs/node-touch
To do this in a single system call you can use the fs-extra npm module.
After this the file will have been created as well as the directory it is to be placed in.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt'
fs.ensureFile(file, err => {
    console.log(err); // => null
});
Another way is to use ensureFileSync which will do the same thing but synchronous.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt'
fs.ensureFileSync(file)
With async/await and TypeScript I would do:
import * as fs from 'fs'
async function upsertFile(name: string) {
    try {
        // try to read file
        await fs.promises.readFile(name)
    } catch (error) {
        // create empty file, because it wasn't found
        await fs.promises.writeFile(name, '')
    }
}
Here's a synchronous way of doing it:
try {
    fs.truncateSync(filepath, 0);
} catch (err) {
    fs.writeFileSync(filepath, "", { flag: "wx" });
}
If the file exists it gets truncated; otherwise the call throws and the file is created inside the catch block.
This works for me.
// Use the file system fs promises
const { access } = require('fs/promises');

// Returns true if the file exists.
// Don't use fs.exists, which is deprecated!
const fexists = async (path) => {
    try {
        await access(path);
        return true;
    } catch {
        return false;
    }
}
// Wrapper for your main program
async function mainapp() {
    if (await fexists("./users.json")) {
        console.log("File is here");
    } else {
        console.log("File not here -so make one");
    }
}

// run your program
mainapp();
Just keep an eye on your async/awaits so everything plays nicely.
Hope this helps.
You can do something like this:
function writeFile(i) {
    var i = i || 0;
    var fileName = 'a_' + i + '.jpg';
    fs.exists(fileName, function (exists) {
        if (exists) {
            writeFile(++i);
        } else {
            fs.writeFile(fileName, data, function (err) {
                // data is the buffer you want to save
            });
        }
    });
}