terminate matlab running through nohup and access the workspace variables - linux

I am using nohup to run a MATLAB script in the background on a Linux server. The last part of the script has commands that save the workspace variables to disk. In my case, I do not want to wait for the script to complete. Is it possible to terminate the running script midway and access/save the variables that exist at that stage to disk?

With nohup, you do not have a nice way to communicate with your program after the terminal is closed. What can help is the terminal multiplexer screen (take a look at man screen). Even after the session has been detached from the terminal, you are still able to reattach and get control over your program back.
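A minimal sketch of that workflow, assuming your script is called myscript.m and that matlab is on your PATH (both names are placeholders here):

```shell
screen -S matlabjob               # start a new named screen session
matlab -nodisplay -r myscript     # run the script inside it
# detach with Ctrl-a d and log out; the script keeps running.
# Later, log back in and reattach:
screen -r matlabjob
# press Ctrl-C to interrupt the script, then at the MATLAB prompt:
#   save('partial_workspace.mat')  % saves whatever variables exist right now
```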

nodejs, launch an executable file and exit node process

I am writing an application in node that does some verification and configuration and should then launch another process and exit.
I want this new process to run in the same console window and to use the same output and accept keyboard input from the console.
Is this possible with node? I know that I can create child processes but as far as I am aware they will die when node exits.
Thanks
You may find some help in the answer at the Stack Overflow link below. It suggests running two separate scripts (one where node.js configures options for the 'new process', and a second where your 'new process' runs), separated with the '&' character in the terminal.
How do you run multiple programs in parallel from a bash script?
To use this approach, you might need to change your node.js script so that the 'new process' can be run directly from the terminal instead of being launched through the node script: configure the options through the node.js script, then launch the process from the terminal.

Attaching to the output of a running process

A process has been started remotely through an SSH session. The output stream (text) is displayed fine through SSH. I would like to display the results locally without interrupting the running process.
Is there a way to attach to a running process and 'piggyback' a stream?
A Linux-only solution is acceptable.
Thanks!
Use reptyr:
reptyr is a utility for taking an existing running program and
attaching it to a new terminal. Started a long-running process over
ssh, but have to leave and don't want to interrupt it? Just start a
screen, use reptyr to grab it, and then kill the ssh session and head
on home.
Or retty:
retty is a tiny tool that lets you attach processes running on other
terminals.
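A typical reptyr session looks roughly like this (the process name and PID are placeholders):

```shell
pgrep -f my_long_job   # find the PID of the process you started over ssh
screen                 # start a screen session to host it from now on
reptyr 12345           # attach the running process to this terminal
# now you can end the original ssh session; detach screen with Ctrl-a d
```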

What happens to a command running on a server if you lose your Internet connection?

Sorry for the noob question. I'm running a shell script on my webserver to download 900 or so files. However, I'm doing this through a terminal shell on my laptop, and I'm about to lose my current Internet connection.
Does the script on the server continue running until finished? Or does it stop because it's no longer providing feedback to my terminal on the disconnected laptop?
By default, your interactive shell on the web server will send your script a HUP signal as soon as it terminates. You can prevent this by calling your script like this:
<scriptname> & disown
This way it will be run in the background (&) and removed from the shell's job list (disown), so the shell will not send it a HUP signal.
Note that the disown may not be necessary in some cases.
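You can watch the effect with a stand-in command (sleep takes the place of the real script here):

```shell
sleep 300 & disown   # background the job, then drop it from the shell's job table
jobs                 # prints nothing: the shell no longer tracks the job
kill "$!"            # $! still holds the PID if you want to stop it yourself
```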
It will die when the parent process (your shell) dies. However, using utilities such as screen or nohup (among others), it is possible to have the process continue even if the terminal dies, and you can even background a process and disown it after it's already running; see this Q for more on that.
There's a ton of further reading out there on this subject, should you want to peruse it.

Run node app forever with standard i/o?

I'm really new to node.js. My friend helped me set up a node app to run a Java process I need running on a server at all times. It works perfectly, except the only way I can see the standard I/O is if I use node app.js. I've looked into both forever and pm2, but neither of these exposes standard I/O, which I really need in order to run commands on this server. Could somebody help me out please? Thanks!
Assuming you have a *nix-based server:
You could use GNU Screen.
Screen is a full-screen window manager that multiplexes a physical terminal between several processes, typically interactive shells.
In plain words, you would have access to always-running processes on the server and their input/output from your local command line.
After logging in to your server, all you need to do is this:
Start a new screen: screen -S <name>
Run your Java process
Detach from the screen: screen -d <name>
That's it! Your Java process keeps running, and you can interact with it by reattaching to the screen session like this: screen -r <name>
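As a sketch (app.jar and the session name javaserver are assumed names), screen's -dmS flag even lets you start the session already detached, which is handy on servers:

```shell
screen -dmS javaserver java -jar app.jar   # start the process in a detached session
screen -ls                                 # list running screen sessions
screen -r javaserver                       # reattach to type commands at the process
# press Ctrl-a d to detach again without stopping anything
```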
Useful Link: GNU Screen Quick Reference
Even cooler would be creating your own service using an Upstart script, which you could then start and stop directly as a system service on the machine.

How to run Derby as a background process in Linux

I am using Derby on a remote Ubuntu 12.04 server. The standard Derby commands are all working correctly, and I am able to open my databases and access them via ij. I need to be able to start and stop the server from the terminal while logging in and out between commands. The problem is that I cannot find a way to run the server as a background process. The closest I have come is: nohup java -jar $DERBY_HOME/lib/derbyrun.jar server start & > ~/dblog.txt which works except that it requires I hit [enter] before returning to the command line. I am aware of the daemon package, but I am uncertain whether it will allow me to then stop the server. What would be helpful is an explanation of how Tomcat manages this, since that is my app server.
Derby is just a Java application. Any technique you wish to use to run Java applications in the background (/etc/init.d, job control in your shell, etc.) will work fine for Derby.
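One detail about the command in the question: the & comes before > ~/dblog.txt, so the redirection is parsed as a separate, empty command and never applies to the server (nohup sends the output to nohup.out instead). Redirect first and background last. A sketch using the question's paths:

```shell
# send stdout and stderr to the log, then put the whole thing in the background
nohup java -jar "$DERBY_HOME"/lib/derbyrun.jar server start > ~/dblog.txt 2>&1 &
# derbyrun.jar also accepts a shutdown command, so stopping the server later is just:
java -jar "$DERBY_HOME"/lib/derbyrun.jar server shutdown
```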
You can use commands like "kill" or "killall" to kill your background process. Use the "jobs" command to see the list of running processes you've sent to the background. You can also put one back in the foreground with "fg %n" (where n is the job number) and kill it using Ctrl-C.
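For example, with a stand-in job:

```shell
sleep 300 &   # some long-running background job
jobs          # shows something like: [1]+  Running  sleep 300 &
kill %1       # kill it by job number...
# ...or bring it to the foreground with "fg %1" and press Ctrl-C
```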
