Apache, Linux and Remote Commands Like 'Ping'

I need to contact a server (perhaps Apache) running on a Linux box and have it return diagnostic results. I am new to networking, but can write code in Java and PHP, and maybe a little C.
I need to remotely run diagnostic tests from the Linux box on the local area network. Ultimately the results of these tests need to be delivered to a web page on the client side.
I am not sure where to begin on this project and would appreciate any suggestions or general strategies. I am familiar with running a command from Java (e.g., Runtime.getRuntime().exec("ping -c 1 " + ip);), but I don't know whether Java needs to be involved at all, and I don't know how to start a Java program on a server and return its output to a PHP file. Is it possible to do this without involving Java? Again, any specific or high-level suggestions are appreciated.

Perhaps you are looking for the exec function in PHP?

You can use the exec function. For ping:

//ping.php
<?php
// Run ping and collect its output lines into $response.
exec("/bin/ping -c 5 www.google.com", $response);
echo "<pre>" . join("\n", $response) . "</pre>";
?>

To try it, use the PHP command line in a terminal:

$ php ping.php
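
To deliver the result to a web page, the same script can be served by Apache and take the target host from the request. This is only a sketch (the file name diag.php and the host parameter are my inventions); the one thing that is not optional is escaping user input before it reaches the shell:

//diag.php - hypothetical example: ping a client-supplied host and show the output
<?php
$host = isset($_GET['host']) ? $_GET['host'] : 'www.google.com';

// Never interpolate raw user input into a shell command;
// escapeshellarg() quotes it safely.
exec('/bin/ping -c 3 ' . escapeshellarg($host), $output);

echo '<pre>' . htmlspecialchars(join("\n", $output)) . '</pre>';
?>

Requesting http://yourserver/diag.php?host=192.168.1.10 would then run the test on the server and return the output to the browser, with no Java involved.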

How to execute a system command on a remote Linux server using CGI and Perl's Net::OpenSSH module?

EDIT: Read this first: what I was trying to do here is the result of extreme tunnel vision. The post might be amusing, but it is not informative. You don't need to SSH into your own server to execute a command, what was I even thinking...
The title pretty much says it all. I want to host a CGI website on a Linux server (Debian, if it matters) and, when clicking a button, perform a system command on the server itself. I'm doing this through Perl and its Net::OpenSSH module.
Here is the problem. I can run the script through the terminal on the server itself, but only if I use sudo; it doesn't matter if the command is simply ls. Unsurprisingly, clicking the button on the website that calls the script doesn't work either.
Here is my code:
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
print("Content-type: text/html\n\n");
print("TEST");
my $ssh = Net::OpenSSH->new('localhost', user => 'myusername', password => 'mypassword');
$ssh->system("ls") or die "ERROR: " . $ssh->error;
print("TEST2");
When running it in the terminal using sudo, the script prints out TEST, then lists the folders in my home directory (Desktop, Documents, etc.) and finally TEST2.
When I'm not using sudo, it prints only TEST and after that this error message:
ERROR: unable to establish master SSH connection: the authenticity of
the target host can't be established; the remote host public key is
probably not present on the '~/.ssh/known_hosts' file at
opensshtest.pl line 13.
I'm not using SSH keys at all; I'm supplying the username and password by hardcoding them into the script.
Also, when opened in a browser, it only prints out the first TEST.
Any help would be appreciated.
It's me again, the guy who posted the question. It's funny how I spent hours trying to make this work, then stumbled upon a solution maybe an hour after posting the question here. But here it is:
I added master_opts => [-o => "StrictHostKeyChecking=no"] as an additional argument to the creation of the Net::OpenSSH object (the line with user, password, etc.).
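For reference, the resulting constructor call would look something like this (a sketch based on the code above; note that hardcoding credentials in a CGI script and disabling strict host key checking both trade security for convenience):

my $ssh = Net::OpenSSH->new(
    'localhost',
    user        => 'myusername',
    password    => 'mypassword',
    # Accept the host key on first connect instead of failing when it
    # is missing from ~/.ssh/known_hosts.
    master_opts => [-o => 'StrictHostKeyChecking=no'],
);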

Minimal http server for testing cgi-bin

I have developed and maintain a web application which acts as a front end for some scripts in cgi-bin, which in turn call C programs on the server. The web server is Apache2, hosted both on my office Linux box for testing and on Amazon EC2 for the real deployment.
My problem is that I'm off travelling, mostly without any internet connection and with only a small portable Linux machine, yet I want to develop the next release of the web pages, scripts, data sets and programs. Testing static web pages is no issue, but testing pages which call server-side cgi-bin scripts is always problematic. My idea is to put a minimal HTTP server on the portable Linux box (Ubuntu 14.04), so that server and client can be on the same machine with no internet (and maybe just a socket) in between.
Of course I can and do test scripts and programs directly, but this does not exercise features such as handling top-bit-set characters in $POST_DATA or setting and retrieving cookies, so it would inevitably result in some divergence of code base.
So:
Is this approach sensible, or is there a better or simpler means to do what I want?
If it is sensible, what HTTP server would you recommend? I thought of miniWeb but have no experience of it.
PS: I'm expert in the (maths of the) server-side programs but have much less experience as an Apache sysadmin.
For many things this is sufficient:
python3 -m http.server --cgi
Unfortunately, it's so minimal that it doesn't support things like setting the HTTP status: https://bugs.python.org/issue10487
I'm not using lighttpd because I don't want to have to write a configuration file. Another minimal server that can be used is mini-httpd:
sudo apt install mini-httpd
/usr/sbin/mini_httpd -D -p 8000 -c 'cgi-bin/*'
The -D option keeps the server in the foreground instead of daemonizing it. The -p option is the port and -c is a pattern for my cgi scripts.
I also found that the built-in webserver of busybox can handle cgi scripts just fine:
busybox httpd -p 8000 -f
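
To exercise exactly the features the question mentions (raw POST data and cookies), a throwaway CGI script is enough. Here is a sketch for the python3 -m http.server --cgi setup; the file name and cookie name are arbitrary, and the script must be executable and live under cgi-bin/:

#!/usr/bin/env python3
# cgi-bin/echo.py - echo the request method, cookies, and raw POST body.
import os
import sys

# Read exactly CONTENT_LENGTH bytes, as bytes, so 8-bit characters survive.
n = int(os.environ.get("CONTENT_LENGTH") or 0)
raw = sys.stdin.buffer.read(n)

sys.stdout.write("Content-Type: text/plain; charset=utf-8\r\n")
sys.stdout.write("Set-Cookie: probe=1\r\n")
sys.stdout.write("\r\n")
sys.stdout.write("METHOD:  %s\n" % os.environ.get("REQUEST_METHOD", ""))
sys.stdout.write("COOKIES: %s\n" % os.environ.get("HTTP_COOKIE", ""))
sys.stdout.write("BODY:    %r\n" % raw)

Something like curl -d 'a=b' -b 'x=1' http://localhost:8000/cgi-bin/echo.py then shows whether the POST body and cookie headers made the round trip.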

How to terminate an application based on a file existence check in Tcl on Linux

I run T-Plan Robot, which connects to my Windows machine and executes a script.
On successful execution of the script, I export the generated XML file via pscp to my Linux machine.
T-Plan Robot acts as third-party freeware that passes commands via cmd on the Windows machine.
This takes place by running a simple batch file on T-Plan Robot. However, the script which sends commands out to Windows disconnects itself after an explicitly declared timeout in seconds.
I want to write Tcl code which launches this application on the Linux machine and, once the command has produced a successful outcome as an XML file received on the Linux machine, checks whether the XML file exists in the specified directory and terminates the application right at that moment. I want this because the next code section will parse the received XML report and perform other actions.
I think there should be some command in Tcl which kills a process/service in any environment; here I need to do that on Linux.
Sincerely waiting for a reply. Thanks in advance.
To kill a process on the same Linux machine, provided you've got permission to do so (i.e., you're running as the same user), you do either:
package require Tclx
kill $processId
Or:
exec kill $processId
(The former doesn't require an external command, since it does the syscall directly, but the latter doesn't need the Tclx package.)
Both of these require the process ID.
To test if a file exists, use file exists like this:
if {[file exists $theFilename]} {
puts "Woohoo! $theFilename is there!"
}
To kill something on a remote machine, you need to send a command to run to that machine. Perhaps like:
exec ssh $remoteMachine kill $remotePID
Getting the $remotePID can be “interesting” and may require some considerable thought in your whole system design. If calling from Windows to Linux, replace ssh with plink. If going the other way, you're talking about doing:
exec ssh $remoteMachine taskkill /PID $remotePID
This can get very complicated, and I'm not sure if the approach you're taking right now is the right one.
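Putting the local pieces together for your scenario, a sketch of the wait-then-kill logic might look like this (the report path, the one-second poll interval, and the pgrep pattern used to find the process ID are all assumptions you would need to adapt):

package require Tclx

# Hypothetical values; adjust to the real report path and process name.
set reportFile /home/user/reports/result.xml
set appPid     [exec pgrep -f tplan-robot]

# Poll once a second until the XML report appears, then kill the app.
while {![file exists $reportFile]} {
    after 1000
}
kill $appPid

# ...the next section of code can now parse $reportFile.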

how to transfer a file to my server directly from another server

Hey guys, what is the easiest way to transfer a file to my server directly from another server? That way I won't have to download the file to my PC and then upload it to my server. The requested file would look like http://www.examplesite.com/file.zip
My server is running Linux, but I don't have SSH access.
So how can I do this?
Thanks, guys :D
Without SSH it will be very difficult. Possibly rsync might work, if it's on both servers with daemons set up. RCP (remote copy) exists; it's similar to SCP without the SSH part, but I doubt it's installed, due to security concerns.
You have to start a shell on your server. Then try :
man wget
And use :
wget http://www.examplesite.com/file.zip
If you cannot get access to a shell, then tell us exactly what control you have over your server.
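If there is no shell but PHP runs on the server, a throwaway script can make the server fetch the file itself. A sketch (it assumes allow_url_fopen is enabled and that the web server user can write to the target directory; delete the script when done):

<?php
// fetch.php - hypothetical one-off: copy a remote file straight onto this server.
$src = 'http://www.examplesite.com/file.zip';
$dst = __DIR__ . '/file.zip';

if (copy($src, $dst)) {
    echo 'Downloaded ' . filesize($dst) . " bytes to $dst";
} else {
    echo 'Copy failed';
}
?>

Upload fetch.php however you normally deploy files (e.g. FTP), open it once in a browser, and the transfer happens server-to-server.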
my2c

In-memory GUI session for UI automation

I'm automating web-UI testing using Selenium. All our existing non-UI related tests are executed through CLI by SSHing into the machine, and it would be great if there's a way to execute these UI tests through CLI by having an X-session run in memory. Is there such a thing in Linux?
There is; it's called Xvfb (an X virtual framebuffer, which renders an X session in memory with no physical display).
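For example, something like this should work (a sketch; the display number :99 and the screen geometry are arbitrary choices):

Xvfb :99 -screen 0 1280x1024x24 &
DISPLAY=:99 firefox &

Many distributions also ship the xvfb-run wrapper, which picks a free display, starts Xvfb, runs the given command, and cleans up afterwards (the test script name here is hypothetical):

xvfb-run -a ./run_selenium_tests.sh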
Sure. You can run a VNC server and have your browser display on that, like so:
noufal@sanitarium% vncserver
Warning: sanitarium:1 is taken because of /tmp/.X1-lock
Remove this file if there is no X server sanitarium:1
New 'X' desktop is sanitarium:2
Starting applications specified in /home/noufal/.vnc/xstartup
Log file is /home/noufal/.vnc/sanitarium:2.log
noufal@sanitarium% /usr/bin/env DISPLAY=sanitarium:2 /usr/bin/firefox --ProfileManager --no-remote
Xlib: extension "RANDR" missing on display "sanitarium:2.0".
will run a browser on the VNC
If you want to see it, you can do something like
noufal@sanitarium% vncviewer sanitarium:2
