How to save logs of Node.js running in CyberPanel? - node.js

I am currently running CyberPanel v2.1 + Ubuntu + Node v14.4 on an EC2 t2.small instance. Using the context menu in the OpenLiteSpeed interface, I run a Node.js website. However, there are issues with its functionality, particularly when I try to require the Mongoose or MongoDB package.
As of now, I don't have a way to see the error logs or console logs being generated, only the access logs. Is there a way I can configure something in either CyberPanel or Node.js so that I can get the console logs as well as the access logs of the project and fix the issue?

You should be able to enable the console log by setting the environment variable
LSNODE_CONSOLE_LOG=/path/to/your/console/log/file
when you configure the application.

If you have root access, you should also be able to see the node server's logs and your application-level logs under /usr/local/lsws/logs/.

Related

Remote Server Terminal for Specific Applications

I have a server and I want my colleague to access only the application terminal of a specific application, or to console-log the errors or anything he wants to log. I have used console.re, a remote console client, but I could not access it from my own machine; it somehow works in the client's browser but not elsewhere. I want something that could solve this problem. By the way, I use a Node.js application, so if there is any package that could resolve this I would be really thankful.
Thank you,

Google Stackdriver Debug not working in Kubernetes

We have a server application based on Python 3.6 running on Google Kubernetes Engine. I added Google StackDriver Debug to aid in debugging some production issues but I cannot get our app to show up in the Stackdriver debug console. The 'application to debug' dropdown menu stays empty.
The Kubernetes cluster is provisioned with the cloud-debug scope and the app starts up correctly. Also, the Stackdriver Debugging API is enabled on our project. When running the app locally on my machine, cloud debugging works as expected, but I cannot find a reason why it won't work in our production environment.
In my case the problem was not with the scopes of the platform, but rather with the fact that you cannot simply pip install google-python-cloud-debugger on the official python-alpine Docker images. Alpine Linux support is not tested regularly, and my problem was related to missing symbols in the C library.
Alpine Linux uses the musl C library, and it needs a Google Cloud Debugger specifically built for that library. After preparing a specific Docker image for this, I got it to work with the provided credentials.
As an alternative method, you can debug Python pods with Visual Studio Code and good old debugpy.
I wrote an open source tool that will inject debugpy into any running Python pod without prior setup.
To use it, you'll need to:
1. Install the tool in your cluster (see the GitHub page).
2. Run a command locally from a machine with access to the cluster:
   robusta playbooks trigger python_debugger name=myapp namespace=default
3. Port-forward to the cluster (the tool prints instructions).
4. Attach VSCode to localhost and the port that you're forwarding.
This works by creating a new pod on the same node and then injecting debugpy using debug-toolkit.

Deploy a Node.js REST API with MongoDB on Windows

I'm new to Node.js. In the last months, I developed a REST API using Node.js, Express and MongoDB for my customer.
I would like to deploy the web API on Linux, but now my customer asks me to deploy the solution into a production environment with Windows Server 2008 / 2012 R2.
What is the best way to deploy this solution on Windows Server?
Is there a good guide to follow?
Thanks
PM2 on Windows:
Personally, I am using pm2 on my Linux machines. pm2 is a process manager for Node apps which makes deployment and maintenance of your Node applications a lot easier. You can also use pm2 on Windows, see: https://github.com/Unitech/pm2
You need to make sure pm2 starts again after reboots / server crashes, and therefore you want to set up a service. You could take a look at https://github.com/jon-hall/pm2-windows-service, which may help for that purpose.
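For reference, here is a minimal ecosystem.config.js sketch for pm2; the app name, script path, and port are placeholders for your own project:

```javascript
// ecosystem.config.js - minimal pm2 process file (names are placeholders).
const config = {
  apps: [
    {
      name: 'my-api',          // placeholder app name
      script: './server.js',   // placeholder entry point
      instances: 1,
      autorestart: true,       // restart the process if it crashes
      env: {
        NODE_ENV: 'production',
        PORT: 3000,
      },
    },
  ],
};

module.exports = config;
```

You would start it with pm2 start ecosystem.config.js, and on Windows combine that with the service wrapper mentioned above so it survives reboots.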
Docker:
If you are just struggling because it's Windows and you are still looking for your "common deploy methods", you could still set up a Docker container with your favourite environment. However, if you don't have experience with Docker, it is probably a bad idea to try it for the first time in a customer project.

Developing locally using express, mongodb and mongoose

I'm currently making an app using Express, MongoDB and Mongoose, and I'm running it locally on my machine. My problem is that if I'm not connected to the internet the app won't run at all, because it cannot connect to the MongoDB server.
I thought that if I ran the mongodb server locally on my computer along with the app then I wouldn't need an internet connection, or is my understanding wrong?
Any help would be much appreciated.
The answer is: yes.
If you install MongoDB locally then you won't need an internet connection to access it.
Make sure that your connection string contains "localhost".
Also, make sure that you don't need anything else from the internet, and that you run npm install while you are connected to the internet, otherwise your dependencies (like mongoose) won't get installed. Once they are installed they work just fine without an internet connection - as long as your database is on localhost.
Also, make sure that your local MongoDB server is running. You can run:
mongo test
in the command line to see if you can connect to a local database.
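To make the "localhost" advice concrete, here is a tiny sketch of building a local connection string; the database name is a placeholder:

```javascript
// Build a connection string that targets the local MongoDB server.
function localMongoUri(dbName, port = 27017) {
  return `mongodb://localhost:${port}/${dbName}`;
}

// With mongoose you would then call: mongoose.connect(localMongoUri('myapp'))
console.log(localMongoUri('myapp')); // mongodb://localhost:27017/myapp
```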
You're on the right path!
Here's the thing: you need to get yourself a copy of MongoDB. You can download and install the suitable version for your system from the official MongoDB download page.
Now you need to add MongoDB to your PATH so you can launch it whenever you want, or simply register it as a service that launches when your system starts.
To configure it, please follow the installation guide for your system:
Windows.
Linux.
macOS.
Then, before running your application, make sure MongoDB is running in the background as a service or daemon, and then simply launch your application.

How to access mongodb variables from nodejs cartridge on OpenShift?

I am trying to deploy an application prototype to openshift. It works locally with mongodb at 127.0.0.1. I am trying to get it to respect process.env.OPENSHIFT_MONGODB_DB_URL when in the openshift environment but that variable is not accessible to my nodejs cartridge at runtime.
I can see that it is being set in my application's shell environment. When I do rhc ssh and then export I see OPENSHIFT_MONGODB_DB_URL=[full_url_with_password] and it all looks good.
But when, in my node.js application, I call process.env.OPENSHIFT_MONGODB_DB_URL it returns undefined.
To double check, I did a console.log(util.inspect(process.env)) from within my Node.js app, and what I saw was different from what I see within my application's secure shell. No OPENSHIFT_MONGODB_* variables were in the environment that is exposed to my Node.js app.
How can I access variables across different cartridges? Or is this a configuration error?
It sounds like a configuration error. I have a similar application and
console.log(util.inspect(process.env))
gives me a clear picture of the mongodb environment variables.
The developers page indicates that:
Database environment variables pertain to a database, if one exists, and are used to connect an application to a database. Note that these connections are only available to an application internally; you cannot connect from an external source.
This suggests, to me, that the Node.js cartridge is external to the MongoDB installation. I believe this can be verified with the command:
rhc app show OPENSHIFT_APP_NAME
It might lead to the source of the problem. A correctly configured app would have nodejs and mongodb in this list.
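Whatever the root cause turns out to be, a common defensive pattern is to fall back to a local URL when the cartridge variable is missing, so the same code runs both locally and on OpenShift. A sketch (the fallback URI is illustrative):

```javascript
// Prefer the URL injected by the MongoDB cartridge; fall back to a local
// server for development. The fallback URI is just an example.
const mongoUrl =
  process.env.OPENSHIFT_MONGODB_DB_URL || 'mongodb://127.0.0.1:27017/test';

console.log('connecting to', mongoUrl);
```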
