How to access the container ID in a Node.js application in host network mode - node.js

I am trying to get a Docker container's ID when the network=host setting is enabled, but instead of getting the container ID I am getting the host instance name. When network=host is not passed in the command, it gives me the container ID as expected.
In short:
Case 1- I run my container with command – docker run --network="host" -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()) /// prints **docker-desktop**
Case 2- I run my container with command – docker run -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()) /// prints **67db4w32k112** as expected
Is there a way I can get the same output, i.e. 67db4w32k112, in case 1 as well?

From looking at this thread, you can probably do something like the below, which reads the /proc/1/cpuset file inside the container. This file contains the current container ID; its contents look like:
/docker/7be92808767a667f35c8505cbf40d14e931ef6db5b0210329cf193b15ba9d605
This will be more reliable in your case than using os.hostname(), since it works both with and without the --network="host" flag on the docker run command.
const fs = require('fs');

fs.readFile('/proc/1/cpuset', 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  // strip the "/docker/" prefix and the trailing newline
  let containerID = data.replace('/docker/', '').trim();
  console.log(containerID);
});

Try to use a helper package such as docker-container-id
Add the dependency in your package.json
npm install --save docker-container-id
Here's an example:
const getId = require('docker-container-id');
async function printId() {
  console.log("I'm in container:", await getId());
}
npmjs reference

Related

npm run build returns Module not found: Error: Can't resolve in my docker container in my NuxtJS app

I've been on this issue for hours now. I kept receiving the error below when I run npm run build
ERROR in ./store/chatroom.js
Module not found: Error: Can't resolve '#/services/ChatRoomService.js' in '/usr/src/app/store'
# ./store/chatroom.js 1:0-60 9:11-26
# ./.nuxt/store.js
# ./.nuxt/index.js
# ./.nuxt/client.js
# multi ./.nuxt/client.js
What's weird is that it works perfectly on my local machine. The errors above occur in my Docker build, and when I run my container with my codebase in it. Weirder still, when I run my container with a bind mount to my local files and try npm run build, it works properly.
At first I thought that maybe some files from my local machine were missing, but I tried copying every file from my local machine to my container via docker cp . and it still does not work.
Dockerfile
FROM node:8.12.0
WORKDIR /usr/src/app
EXPOSE 3000
COPY package.json package.json
RUN npm install
# To include everything
COPY . .
RUN npm run build
ENTRYPOINT ["/usr/src/app/entrypoint.sh"]
chatroom.js
import ChatRoomService from "#/services/ChatRoomService.js";
export const state = () => ({});
export const mutations = {};
export const actions = {
getText({ commit }, data) {
return ChatRoomService.queryText(data).then(response => {
if (response.code === 1) {
commit("bbs/SET_TOP_ARR", JSON.parse(response.data.content), {
root: true
});
}
});
}
};
chatRoomService.js
import { mainApiClient, requestSetup } from "#/assets/js/axios.js";
const apiModule = "chatroom";
const resources = {
chatroomGetChatRoomText: "text/queryText"
};
export default {
queryText(body) {
const resource = resources.chatroomGetChatRoomText;
const [api, req] = requestSetup(resource, body, apiModule);
return mainApiClient.post(api, req);
}
};
UPDATE
I have solved it. It was how I imported: I wrote chatRoomService.js where it should have been ChatRoomService.js. It works fine on my local machine because the macOS filesystem is case-insensitive, but the Linux filesystem inside the container is case-sensitive.

Query a remote server's operating system

I'm writing a microservice in Node.js that runs a particular command-line operation to get a specific piece of information. The service runs on multiple servers, some of them on Linux, some on Windows. I'm using ssh2-exec to connect to the servers and execute a command; however, I need a way of determining the server's OS to run the correct command.
let ssh2Connect = require('ssh2-connect');
let ssh2Exec = require('ssh2-exec');
ssh2Connect(config, function(error, connection) {
let process = ssh2Exec({
cmd: '<CHANGE THE COMMAND BASED ON OS>',
ssh: connection
});
//using the results of process...
});
I have an idea for a solution: following this question, run some other command beforehand and determine the OS from the output of said command; however, I want to learn if there's a more "formal" way of achieving this, specifically using the SSH2 library.
Below is how I would think it would be done...
// Import the os module; this allows you to read the OS type the app is running on
const os = require('os');
const ssh2Connect = require('ssh2-connect');
const ssh2Exec = require('ssh2-exec');
// Define the Windows platforms in an array; there is only one, but for consistency's
// sake we will leave it in an array (if it changes in the future, it is easier to add
// to an array and the rest of the code doesn't need to change)
const winRMOS = ['win32'];
// Define the platforms that need to use the SSH protocol (see note above)
const sshOS = ['darwin', 'linux', 'freebsd'];
// ssh function: choose the command based on the platform
const connectOverSSH = function (config) {
  ssh2Connect(config, function (error, connection) {
    let cmd;
    if (os.platform() === 'darwin') {
      cmd = 'Some macOS command';
    } else {
      cmd = 'Some linux command';
    }
    let process = ssh2Exec({
      cmd: cmd,
      ssh: connection
    });
    // using the results of process...
  });
};
// winrm function; there may be some other way to do this, but WinRM is the way I know
const connectOverWinRM = function (config) {
  ssh2Connect(config, function (error, connection) {
    // placeholder: a real WinRM client library would replace ssh2Exec here
    let process = ssh2Exec({
      cmd: 'Some Windows command',
      winRM: connection
    });
    // using the results of process...
  });
};
// if statements to determine which one to use, based on what os.platform() returns
if (sshOS.includes(os.platform())) {
  connectOverSSH(config);
} else if (winRMOS.includes(os.platform())) {
  connectOverWinRM(config);
}
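One caveat: os.platform() reports the OS of the machine the Node process itself runs on, not the remote server's. In the spirit of the probe-a-command idea from the question, you could run uname -s over the connection first and map its output to the right command. A small, testable sketch of that mapping (the command strings are placeholders, not real operations):

```javascript
// Map the output of a remote `uname -s` probe to a platform-specific command.
// The command strings here are placeholders (assumptions), not real operations.
function commandForUname(unameOutput) {
  const platform = (unameOutput || '').trim().toLowerCase();
  if (platform === 'linux') return 'Some linux command';
  if (platform === 'darwin') return 'Some macOS command';
  // `uname` is typically absent (or errors) on a stock Windows sshd,
  // so an empty/unknown probe result falls through to the Windows branch
  return 'Some Windows command';
}

console.log(commandForUname('Linux\n'));  // Some linux command
console.log(commandForUname(''));         // Some Windows command
```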

Mongodb Shell command execution to drop collection via nodejs

I have a test setup in which the mongoimport and mongoexport commands are used to populate an existing MongoDB database, say testDB, from the folder testDump.
The problem occurs for the files which are empty in the folder from which testDB is initially populated and then restored.
Eg. a collection file called abcInstance.json is empty in the testDump.
$ cat abcInstance.json
[]
Now when I run some tests, this collection gets populated in testDB, but at the end, when I restore all collections from the testDump folder using the mongoimport command, it fails for the empty files.
So I am trying to drop those collections using mongo and the spawn command.
if (statSync(collectionFile).size === 4) {
  const options = [
    'testDB',
    '--eval',
    '"db.abcInstance.drop()"'
  ];
  const dropDB = spawn('mongo', options, { stdio: 'inherit' });
  if (dropDB.status !== 0) {
    throw new Error('failed to drop collection ');
  }
}
But this is also failing and I cannot figure out the error.
I have tested that the same command runs successfully on command line:
$ mongo testDB --eval "db.abcInstance.drop()"
MongoDB shell version v3.6.4
connecting to: mongodb://127.0.0.1:27017/alyneKickStartDB
MongoDB server version: 3.6.4
true
Any idea where I am going wrong?
So, I was able to solve the problem of executing the mongo command by a slightly different approach as stated here.
Basically, the problem I figured out was that my parent process was exiting without waiting for the child process to finish execution as both spawn and exec are asynchronous functions. So I modified my code as follows:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);

async function test () {
  const res = await exec('mongo testDB --eval "db.abcInstance.drop()" --quiet');
  return { res };
}
Now, when I call this test() function, my collection is successfully dropped.
Does anybody know of any problem with this approach or a better way?

Git diff gives no output although SHA for image is different

I'm trying to create a deployment script which will let me know whether or not I have the latest image of my project deployed on either my master or development branch.
I'm attempting to use git.diff to compare the SHA1 hash of the deployed image against my local repository, and although the hashes are clearly different, git.diff gives me no output. I don't understand what's going on here, since if the SHA1 is different, there must surely be changes to show from git.diff?
This is the code I have written so far:
#!/usr/bin/node
// get an exec function from node we can use to run shell commands
const exec = require('util').promisify(require('child_process').exec);

// check the user supplied the folder name of the repo as an arg
if (!process.argv[2]) {
  console.error('argument missing');
  process.exit(1);
}

// initialize our git client using the repo path arg
const git = require('simple-git/promise')("../../" + process.argv[2]);
var projectNameArray = process.argv[2].split(".");
const projectName = projectNameArray[0] + "-" + projectNameArray[1] + "-" + projectNameArray[2];
console.log('\x1b[36m%s\x1b[0m', 'Your project name is:', projectName);

// use an IIAFE for async/await
(async () => {
  // run git rev-parse development
  var devSha1 = await git.revparse(['development']);
  console.log('\x1b[36m%s\x1b[0m', 'devSha1: ', devSha1);
  devSha1 = devSha1.replace(/(\n)/gm, "");

  // run git rev-parse master
  var masterSha1 = await git.revparse(['master']);
  console.log('\x1b[36m%s\x1b[0m', 'masterSha1: ', masterSha1);
  masterSha1 = masterSha1.replace(/(\n)/gm, "");

  // use kubectl to export the deployment to JSON and then parse it
  const { stdout, stderr } = await exec(`kubectl get deployment ${projectName} -o json`);
  const pods = JSON.parse(stdout);
  const imageName = pods.spec.template.spec.containers[0].image;

  // get the deployed image hash
  const commitHashArray = imageName.split('development-' || 'master-');
  console.log('\x1b[36m%s\x1b[0m', 'Deployed image: ', commitHashArray[1]);
  var diffArray = new Array(devSha1, commitHashArray[1]);

  // logic to tell if the latest is deployed or if we're behind
  if (commitHashArray[1] == devSha1) {
    console.log('\x1b[32m%s\x1b[0m', 'You have the latest image deployed');
  } else {
    console.log('\x1b[31m%s\x1b[0m', 'You don\'t have the latest image deployed');
    await git.diff(diffArray);
  }
})().then(() => console.log('\x1b[32m%s\x1b[0m', 'Ok')).catch((e) => console.error(e));
This gives me the following console output:
Your project name is: xxx-xxx-xxx
devSha1: 6a7ee89dbefc4508b03d863e5c1f5dd9dce579b4
masterSha1: 4529244ba95e1b043b691c5ef1dc484c7d67dbe2
Deployed image: 446c4ba124f7a12c8a4c91ca8eedde4c3c8652fd
You don't have the latest image deployed
Ok
I'm not sure if I'm fundamentally misunderstanding how git.diff works, or if something else is at play here. The images clearly don't match, so I would love if anyone could explain why there is no output from this function?
Thanks! :)
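As an aside on the parsing step in the script above: it can be exercised against a stubbed deployment object (the image name below is made up), which also shows that the expression 'development-' || 'master-' evaluates to just 'development-', since || returns its first truthy operand:

```javascript
// Stub of `kubectl get deployment ... -o json` output; the image name is invented
const stdout = JSON.stringify({
  spec: { template: { spec: { containers: [
    { image: 'registry.example/xxx:development-446c4ba124f7' }
  ] } } }
});

const pods = JSON.parse(stdout);
const imageName = pods.spec.template.spec.containers[0].image;

// ('development-' || 'master-') is just 'development-': || picks the first
// truthy operand, so master-tagged images would never be split by this line
const commitHashArray = imageName.split('development-' || 'master-');
console.log(commitHashArray[1]); // 446c4ba124f7
```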

Accessing google play store data using GSUTIL on nodejs

I am trying to access data from google play store using GSUTIL in nodejs.
When I run the following command on my command line, I am able to get the files:
gsutil cp -r gs://mybucket /mylocalmachine
How do I run this command on my node server to get the same result? The code below does not yield any result:
var exec = require('child_process').exec;

function copyUsingGSUTIL() {
  return new Promise(function (resolve, reject) {
    var child = exec("gsutil cp -r", ["gs://mybucket", "mylocalmachine"], { cwd: "/Applications/gsutil" });
    console.log(child);
    child.on('close', function (code) {
      console.log('Exit code' + code);
    });
  });
}
gsutil is a Python program, so you have to invoke python and use the full paths in the exec command, like:
var command = "python C:/gsutil/gsutil.py cp -r gs://mybucket /mylocalmachine";
// ...
var child = exec(command, { cwd: "C:/your absolute path here/Applications/gsutil" });
console.log(child);
// ...
There is also a Java alternative to gsutil; see https://developers.google.com/api-client-library/java/apis/storage/v1
See also this thread: Google cloud storage gsutil tool with Java
