curl more than 300 times slower than node-fetch? - node.js

I'm writing a script to scan the /config route on some ports on a host.
I first wrote the script in node.js and am now porting it to bash to reduce dependencies.
Why is the bash script more than 300 times slower when scanning localhost? What am I missing?
I guess there are some optimizations built into node-fetch. How can I achieve the same in bash?
node.js: 10 Ports -> 79ms
bash(0.0.0.0): 10 Ports -> 2149ms
bash(localhost): 10 Ports -> 25156ms
I found out that in bash, using 0.0.0.0 instead of localhost makes it only 27 times slower, but still... (In node.js, using 0.0.0.0 makes no significant difference.)
node.js (IIFE omitted for readability)
import fetch from 'node-fetch';
for (let port = portFrom; port <= portTo; port++) {
  try {
    const res = await fetch('http://localhost:' + port + '/config');
    const json = await res.json();
    console.log(json);
  } catch { /* no-op */ }
}
bash
for ((port=$port_from; port<=$port_to; port++))
do
  json="$(curl -s "http://localhost:$port/config")"
  echo "$json"
done
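One likely culprit (an assumption, not something verified here): the bash loop forks a brand-new curl process, with fresh name resolution and connection setup, for every single port, while the node script does everything in one process. curl can scan the whole range from a single process using its [a-b] URL "globbing"; a minimal sketch, where build_scan_url is a made-up helper:

```shell
#!/usr/bin/env sh
# build_scan_url is a hypothetical helper: it emits curl's [a-b]
# range-globbing syntax, so ONE curl process requests every port.
build_scan_url() {
  printf 'http://127.0.0.1:[%s-%s]/config' "$1" "$2"
}

port_from=${port_from:-3000}
port_to=${port_to:-3009}
# -s silences progress output; closed ports simply fail, and || true
# keeps the script's exit status clean.
curl -s --connect-timeout 1 "$(build_scan_url "$port_from" "$port_to")" || true
```

Using 127.0.0.1 instead of localhost also sidesteps any IPv6-first resolution of localhost, which may explain the 0.0.0.0-vs-localhost gap you measured. Newer curl builds can additionally pass -Z/--parallel to fire the requests concurrently.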

Related

Pushing process to background causes high kswapd0

I have a CPU-intensive process running on a Raspberry Pi that's executed by running a Node.js file. Running the first command (below) and then running the file in another tab works just fine. However, when I run the process via a bash shell script, the process stalls.
Looking at the processes using top, I see that kswapd0 and kworker/2:1+ take over most of the CPU. What could be causing this?
FYI, the first command begins the Ethereum discovery protocol via HTTP and IPC:
geth --datadir $NODE --syncmode 'full' --port 8080 --rpc --rpcaddr 'localhost' --rpcport 30310 --rpcapi 'personal,eth,net,web3,miner,txpool,admin,debug' --networkid 777 --allow-insecure-unlock --unlock "$HOME_ADDRESS" --password ./password.txt --mine --maxpeers 100 2> results/log.txt &
sleep 10
# create storage contract and output result
node performanceContract.js
UPDATE:
performanceContract.js
const ethers = require('ethers');
const fs = require('fs')
const provider = new ethers.providers.IpcProvider('./node2/geth.ipc')
const walletJson = fs.readFileSync('./node2/keystore/keys', 'utf8')
const pwd = fs.readFileSync('./password.txt', 'utf8').trim();
const PerformanceContract = require('./contracts/PerformanceContract.json');
(async function () {
  try {
    const wallet = await ethers.Wallet.fromEncryptedJson(walletJson, pwd)
    const connectedWallet = wallet.connect(provider)
    const factory = new ethers.ContractFactory(PerformanceContract.abi, PerformanceContract.bytecode, connectedWallet)
    const contract = await factory.deploy()
    const deployedInstance = new ethers.Contract(contract.address, PerformanceContract.abi, connectedWallet);
    let tx = await deployedInstance.loop(6000)
    fs.writeFile(`./results/contract_result_xsmall_${new Date()}.txt`, JSON.stringify(tx, null, 4), () => {
      console.log('file written')
    })
    ...
Where loop is a method that runs the keccak256 hash in a loop. Its purpose is to test different gas costs by varying the loop count.
Solved by increasing the sleep time to 1 min. I assume it was just a memory issue that needed more time before executing the contract.
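Rather than guessing at a sleep duration, the script could poll until geth has actually created its IPC socket. A sketch (wait_for_ipc is a made-up helper; ./node2/geth.ipc is the IPC path used by performanceContract.js above):

```shell
#!/usr/bin/env sh
# Return 0 as soon as the IPC socket exists, 1 if it never appears
# within max_tries seconds. -S tests specifically for a socket file.
wait_for_ipc() {
  ipc_path=$1
  max_tries=${2:-60}
  tries=0
  while [ "$tries" -lt "$max_tries" ]; do
    [ -S "$ipc_path" ] && return 0
    sleep 1
    tries=$((tries + 1))
  done
  return 1
}

# replaces the fixed `sleep 10`:
#   wait_for_ipc ./node2/geth.ipc 60 && node performanceContract.js
```

This way the contract script starts as soon as geth is ready, and fails loudly (non-zero exit) if geth never comes up, instead of racing a timer.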

Query a remote server's operating system

I'm writing a microservice in Node.js that runs a particular command line operation to get a specific piece of information. The service runs on multiple servers, some of them on Linux, some on Windows. I'm using ssh2-exec to connect to the servers and execute a command; however, I need a way of determining the server's OS to run the correct command.
let ssh2Connect = require('ssh2-connect');
let ssh2Exec = require('ssh2-exec');
ssh2Connect(config, function(error, connection) {
  let process = ssh2Exec({
    cmd: '<CHANGE THE COMMAND BASED ON OS>',
    ssh: connection
  });
  // using the results of process...
});
I have an idea for the solution: following this question, run some other command beforehand, and determine the OS from the output of said command; however, I want to learn if there's a more "formal" way of achieving this, specifically using SSH2 library.
Below is how I would think it would be done...
// Import the os module; it lets you read the OS type the app is running on
const os = require('os');
// Windows identifiers; there is only one, but for consistency's sake we keep
// an array *if it changes in the future, only the array needs to grow
const winRMOS = ['win32'];
// OS' that need to use the ssh protocol *see note above
const sshOS = ['darwin', 'linux', 'freebsd'];
// ssh function
const sshConnect = (config) => {
  ssh2Connect(config, function(error, connection) {
    // pick the command first; an object literal cannot contain if statements
    let cmd;
    if (os.platform() === 'darwin') {
      cmd = 'Some macOS command';
    } else if (os.platform() === 'linux') {
      cmd = 'Some linux command';
    }
    let process = ssh2Exec({
      cmd: cmd,
      ssh: connection
    });
    // using the results of process...
  });
};
// winrm function; there may be some other way to do this, but winrm is the way I know
const winRMConnect = (config) => {
  // a winrm client would have its own API; this just mirrors the ssh2Exec shape
  let process = ssh2Exec({
    cmd: 'Some Windows command',
    winRM: config
  });
  // using the results of process...
};
// decide which one to use based on what os.platform() returns
if (sshOS.includes(os.platform())) {
  sshConnect(config);
} else if (winRMOS.includes(os.platform())) {
  winRMConnect(config);
}

Open up terminal/shell on remote server via tcp request

I have this:
const http = require('http');
const cp = require('child_process');
const server = http.createServer((req, res) => {
  const bash = cp.spawn('bash');
  req.pipe(bash.stdin, {end: false});
  bash.stdout.pipe(res);
  bash.stderr.pipe(res);
});
server.listen('4004');
when I hit the server with:
curl localhost:4004
and I type bash commands, nothing gets output to my console. Anybody know why?
Note: To address security I plan to run this in a docker container, use https/ssl, and implement authentication (any recommendations on auth schemes lmk).
More importantly, I am looking for shell prompts to appear ... apparently bash by itself doesn't open up a shell/prompt?
It is possible to do this "over the web", so to speak. However, your approach will not work, because you are mixing paradigms (batch vs. interactive) and you are missing large chunks of the setup that is needed to run terminal applications.
Normally I would show you how to program this, however, that's really involved. Have a look at:
https://github.com/chjj/tty.js
and,
https://github.com/xtermjs/xterm.js
as starting points to create your solution.
Both are usable directly from node.js to serve up terminal applications over HTTP.
This is a partial answer, but I started a bounty because I am looking for something better. I was able to create something rudimentary with TCP like so:
const net = require('net'); // !use net package not http
const cp = require('child_process');
const server = net.createServer(s => {
  const bash = cp.spawn('bash');
  s.pipe(bash.stdin, {end: false});
  bash.stdout.pipe(s);
  bash.stderr.pipe(s);
});
server.listen('4004');
Not sure why it won't work with HTTP, though. I connect to it using netcat:
nc localhost 4004
but this isn't opening a terminal, just a bash process. The experience is not ideal, as described here:
https://unix.stackexchange.com/questions/519364/bash-shell-modes-how-to-pipe-request-to-shell-on-remote-server
however I am looking to replicate the shell experience you have when you do something like:
docker exec -ti <container> /bin/bash
when I run my script it "works", but I don't get any shell prompts or anything like that. (One way to solve this might be with ssh, but I am trying to figure out a different way).
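The missing piece is a pseudo-terminal (pty): bash only prints a prompt and enables line editing when its stdin is a terminal, and a plain pipe or TCP socket is not one. A quick way to see the very check bash relies on:

```shell
#!/usr/bin/env sh
# bash decides between interactive and batch mode by testing whether
# stdin is a terminal; [ -t 0 ] is that same test.
if [ -t 0 ]; then
  echo "stdin is a tty: a shell here would show prompts and line editing"
else
  echo "stdin is not a tty: a shell here stays in batch mode, no prompt"
fi
```

For the docker exec -ti feel without ssh, socat can allocate the pty for you (option names taken from the socat man page, untested here): serve with `socat TCP-LISTEN:4004,reuseaddr,fork EXEC:'bash -li',pty,stderr,setsid,sane` and connect with `socat -,raw,echo=0 TCP:localhost:4004`.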
You can connect to an http server with telnet. It depends on how you're starting the http server. Here's an example
Start an http server with the npm package http-server
npm install -g http-server
cd ~/ <Any directory>
http-server
Now separately start a telnet session
telnet localhost 8080
OR
nc localhost 8080
And then type something like GET /
Use the telnet client instead of nc
Check this: https://www.the-art-of-web.com/system/telnet-http11/
Update: you can run an SSH server from Node.js using the ssh2 package.
I found this at https://github.com/mscdex/ssh2
var fs = require('fs');
var crypto = require('crypto');
var inspect = require('util').inspect;
var ssh2 = require('ssh2');
var utils = ssh2.utils;
var allowedUser = Buffer.from('foo');
var allowedPassword = Buffer.from('bar');
var allowedPubKey = utils.parseKey(fs.readFileSync('foo.pub'));
new ssh2.Server({
  hostKeys: [fs.readFileSync('host.key')]
}, function(client) {
  console.log('Client connected!');
  client.on('authentication', function(ctx) {
    var user = Buffer.from(ctx.username);
    if (user.length !== allowedUser.length
        || !crypto.timingSafeEqual(user, allowedUser)) {
      return ctx.reject();
    }
    switch (ctx.method) {
      case 'password':
        var password = Buffer.from(ctx.password);
        if (password.length !== allowedPassword.length
            || !crypto.timingSafeEqual(password, allowedPassword)) {
          return ctx.reject();
        }
        break;
      case 'publickey':
        var allowedPubSSHKey = allowedPubKey.getPublicSSH();
        if (ctx.key.algo !== allowedPubKey.type
            || ctx.key.data.length !== allowedPubSSHKey.length
            || !crypto.timingSafeEqual(ctx.key.data, allowedPubSSHKey)
            || (ctx.signature && !allowedPubKey.verify(ctx.blob, ctx.signature))) {
          return ctx.reject();
        }
        break;
      default:
        return ctx.reject();
    }
    ctx.accept();
  }).on('ready', function() {
    console.log('Client authenticated!');
    client.on('session', function(accept, reject) {
      var session = accept();
      session.once('exec', function(accept, reject, info) {
        console.log('Client wants to execute: ' + inspect(info.command));
        var stream = accept();
        stream.stderr.write('Oh no, the dreaded errors!\n');
        stream.write('Just kidding about the errors!\n');
        stream.exit(0);
        stream.end();
      });
    });
  }).on('end', function() {
    console.log('Client disconnected');
  });
}).listen(0, '127.0.0.1', function() {
  console.log('Listening on port ' + this.address().port);
});
Your approaches are quite mixed. Nonetheless, whenever you finally connect to the remote server, do not use 'bash' as the way to start the session; bash (the "Bourne-again shell") is just a shell with commands and built-ins in it, not a terminal. Rather, use one of the following terminal programs, e.g.:
~ $ gnome-terminal
~ $ xterm
There you will be referencing a true terminal program on the system.

Is it possible to check if a system is on LAN vs WiFi via nodejs?

Are there any packages for node that can determine if a PC is on a LAN vs Wifi connection?
I have gone through the node docs and it doesn't appear there is a native node module. (https://nodejs.org/api/os.html)
Could not find anything in NPM that could determine this either.
Remember you can run any bash command you want using exec.
So you can do something along the lines of
const util = require('util');
const exec = util.promisify(require('child_process').exec);
async function main() {
  // grep -q would suppress output (stdout would always be empty), and a
  // non-matching grep exits non-zero, which makes exec() reject, so drop
  // -q and add || true
  const { stdout } = await exec('tail -n +3 /proc/net/wireless | grep . || true');
  if (stdout) { /* wireless */ }
}
main()
Adapted from: determine if connection is wired or wireless?

Curl command is not processing changes I make to file

I am trying to test parsing of a zip file in node.js using curl from the command line. Originally, I had a route that looks like this:
app.post('/processZip', (req, res) => {
  const zip = req.file
  console.log(req)
  extractCSVFilesFromZip(zip, '/tmp/connections', '/tmp/messages')
  const connectionsOutputPath = '/tmp/connections'
  const messagesOutputPath = '/tmp/messages'
  console.log(`Size of Parsed Connections File: ${connectionsOutputPath.size}`)
  console.log(`Size of Parsed Messages File: ${messagesOutputPath.size}`)
  res.send('success!')
})
which calls a function that looks like this:
const extractCSVFilesFromZip = (zipFilePath, connectionsCSVOutputPath, messagesCSVOutputPath) => {
  console.log(zipFilePath)
  fs.createReadStream(zipFilePath)
    .pipe(unzip.Parse())
    .on('entry', entry => {
      const [fileName, size] = [entry.path, entry.size]
      if (fileName === 'Connections.csv') {
        console.log(`Size of Connections File to Parse: ${size}`)
        entry.pipe(fs.createWriteStream(connectionsCSVOutputPath))
      } else if (fileName === 'Messages.csv') {
        console.log(`Size of Messages File to Parse: ${size}`)
        entry.pipe(fs.createWriteStream(messagesCSVOutputPath))
      } else {
        entry.autodrain()
      }
    })
}
I am using this curl command to test the request:
curl -F file=@../../../Downloads/Basic_LinkedInDataExport_09-14-2018.zip http://localhost:5000/processZip/
Originally, it gave me an error pointing to the first instance of createReadStream in the function, so I commented out all the code and just tried to console.log(zipFilePath) to see what was being sent. But I still get the same error. In fact, I can comment out, remove, or change any of the code in either the route or the file, and it makes no difference: I still get the same error. It's as if curl is sending the request to a cached version of the files rather than picking up the changes I am making. But if I examine the files from the command line with sudo nano, I can see the updated versions.
What could be causing this issue? I have saved the files and restarted the server each time. Could it be that I need to wait longer than usual for the changes to be processed because this is a larger codebase than I am used to working in, or is something else to blame? For what it's worth, the servers are being run by forever. Thanks in advance for any help!
Okay, I figured it out: there was a ghost process running on port 5000. killall -9 node did the trick!
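killall -9 node takes down every node process on the machine. A gentler way (assuming lsof is installed) is to ask which PID is actually still bound to the port and kill just that one:

```shell
#!/usr/bin/env sh
# -t prints bare PIDs, -iTCP:<port> filters to that TCP port, and
# -sTCP:LISTEN keeps only the listening socket, not its clients.
port=5000
lsof -t -iTCP:"$port" -sTCP:LISTEN || echo "nothing listening on port $port"
# then: kill <pid>   (escalate to kill -9 only if it ignores SIGTERM)
```

This also tells you whether the "ghost" really is node or some other leftover server before anything gets killed.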
