How to run jQuery AJAX in a cron job - cron

I want to run an AJAX function from a cron job.
My code is:
<?php
function sendSMS($username, $password, $phones, $text){ ?>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>
<script type="text/javascript">
$(document).ready(function (){
    var smsUrl = 'http://bulksms.mysmsmantra.com:8080/WebSMS/SMSAPI.jsp?username=<?php echo $username ?>&password=<?php echo $password ?>&sendername=NETSMS&mobileno=<?php echo $phones ?>&message=<?php echo $text ?>';
    $.ajax({
        data    : '',
        type    : "get",
        url     : smsUrl,
        error   : function(resp){
        },
        success : function(resp){
        }
    });
});
</script>
<?php } ?>
The code works fine if I run the PHP page manually in the browser, but it does not work in the cron job. Is it possible to run an AJAX function in a cron job?

Not possible to my knowledge (or just very difficult and not worth the trouble).
You can achieve the same thing using cURL.
Make your cron job run a script that contacts bulksms.mysmsmantra.com directly through cURL.
I don't know if your server runs ASP or PHP or something else, but assuming PHP:
Example:
$ch = curl_init();
// Build the same URL the AJAX call used (urlencode the parameters so
// spaces and special characters survive the query string)
$remote_url = 'http://bulksms.mysmsmantra.com:8080/WebSMS/SMSAPI.jsp?username=' . urlencode($username) . '&password=' . urlencode($password) . '&sendername=NETSMS&mobileno=' . urlencode($phones) . '&message=' . urlencode($text);
curl_setopt($ch, CURLOPT_URL, $remote_url);
// Include the response headers in the result? (1 = yes, 0 = no)
curl_setopt($ch, CURLOPT_HEADER, 0);
// Should cURL return or print out the data? (true = return, false = print)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Timeout in seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
// Execute the request and capture the output
$output = curl_exec($ch);
curl_close($ch);
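To schedule it, point cron at the PHP CLI binary. A minimal sketch of the crontab entry; the binary path, script path, and five-minute interval are placeholders for your own setup:
# Hypothetical crontab entry: run the SMS script every 5 minutes
*/5 * * * * /usr/bin/php /path/to/send_sms.php >/dev/null 2>&1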
Are you sure you installed the cURL library for PHP on your server?

Related

Why would the selfsame code instantiating PHPMailer work in one file/directory and not work in another?

The attached code works perfectly on our server.
FYI, the commented-out require() statement loads a file containing every line of what follows it exactly.
The problem occurs when I uncomment the require() statement and then comment out its identical contents below it.
When I do so, suddenly the PHPMailer object refuses to instantiate.
I have double-checked permissions for all files/folders and they are all fine.
This is a real head-scratcher.
HELP!
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

require( CLASSES . "src/PHPMailer.php" );
require( CLASSES . "src/SMTP.php" );
require( CLASSES . "src/Exception.php" );
// require( 'receipt_email.php' );

$email = new PHPMailer();
try {
    $email->IsSMTP();
    $email->SMTPDebug = 0;
    $email->SMTPAuth = TRUE;
    $email->Host = SERVER_NAME;
    $email->SMTPSecure = "tls";
    $email->Port = "587";
    $email->Username = MAIL_ACCTNAME;
    $email->Password = MAIL_PASSWORD;
    $email->FromName = "do_not_reply#lcus.edu";
    $email->From = "LCUonline";
    $email->Subject = "TEST MESSAGE";
    $email->Body = "TEST MESSAGE BODY";
    $email->AddAddress( "developer#lcus.edu", "Dr. Steve Willis" );
    echo $email->Send() ? "SUCCESS" : "FAILED";
}
catch( Exception $e ) {
    echo "Message could not be sent. Mailer Error: {$email->ErrorInfo}";
}
unset( $email );
It's because you don't understand how use statements work. They are local aliases that apply only to the file in which they appear, as the docs say. So if you put them in one file, and have a namespaced class instantiation in another, it won't work, as you're finding. The separation in your case should be done like this:
In the first file:
require( CLASSES . "src/PHPMailer.php" );
require( CLASSES . "src/SMTP.php" );
require( CLASSES . "src/Exception.php" );
require( 'receipt_email.php' );
in receipt_email.php:
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;
$email = new PHPMailer();
This, amongst other things, is why it's really worth learning how to use Composer, as it simplifies these things enormously.
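For reference, a minimal sketch of the Composer route (assuming composer require phpmailer/phpmailer has been run in the project root): the single autoloader require replaces the three manual require() calls, and the use statements still go in whichever file instantiates the class.
<?php
// vendor/autoload.php is generated by Composer and loads PHPMailer on demand
require __DIR__ . '/vendor/autoload.php';

use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

// true = throw PHPMailer's Exception on failure instead of returning false
$email = new PHPMailer(true);

try {
    $email->isSMTP();
    // ...configure Host, credentials, and message as in the snippet above...
    $email->send();
    echo "SUCCESS";
} catch (Exception $e) {
    echo "Message could not be sent. Mailer Error: {$email->ErrorInfo}";
}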

Command execution times out when using Process in Laravel

I am building a website-screenshots SaaS with Laravel. I am trying to execute a command from it, more precisely a Node.js script; however, I keep getting a ProcessTimedOutException error. Here is where the magic happens; look at the take method, whose $process->run(); line throws the exception:
<?php

namespace App\Logic;

use App\Logic\TimeHelper;
use App\UrlRequest;
use Illuminate\Support\Facades\Storage;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;

class Screenshot {

    static function take(UrlRequest $urlRequest)
    {
        $name = self::generateName($urlRequest);
        $command = self::buildScreenshotCommand($name, $urlRequest);
        $startTime = TimeHelper::milliseconds();

        $process = new Process($command);
        $process->run();

        $endTime = TimeHelper::milliseconds();

        if (!$process->isSuccessful())
        {
            throw new ProcessFailedException($process);
        }

        $output = $process->getOutput();

        if (trim($output) === '')
        {
            $urlRequest->successful = 1;
            $file = self::uploadToS3($name);
            $urlRequest->image_url = $file['url'];
            $urlRequest->file_name = $name;
            $urlRequest->file_size = $file['size'];
            $urlRequest->time_it_took_to_take_screenshot_ms = $endTime - $startTime;
        }
        else
        {
            $urlRequest->error = $output;
        }

        if ($urlRequest->save())
        {
            return $urlRequest;
        }

        return false;
    }

    static function uploadToS3($name)
    {
        $name = 'screenshots/' . $name;

        Storage::disk('s3')->put($name, Storage::disk('local')->get($name), ['visibility' => 'public']); // upload to S3

        $fileSize = Storage::disk('local')->size($name);
        Storage::disk('local')->delete($name);

        return [
            'url' => Storage::disk('s3')->url($name),
            'size' => $fileSize
        ];
    }

    static function generateName($urlRequest)
    {
        $name = time() . rand(10000, 99999);
        $extension = '.png';

        if (isset($urlRequest->pdf) AND $urlRequest->pdf == 1)
        {
            $extension = '.pdf';
        }

        while (UrlRequest::where('file_name', '=', $name . $extension)->first())
        {
            $name = time() . rand(10000, 99999);
        }

        return $name . $extension;
    }

    static function buildScreenshotCommand($name, $urlRequest)
    {
        $command = 'cd ' . base_path() . ' && node puppeteer-screenshots-init.js ';
        $command .= "--url={$urlRequest->url} ";

        $fullPath = storage_path('app') . '/screenshots/' . $name;
        $command .= "--path={$fullPath} ";

        if (isset($urlRequest->pdf))
        {
            $command .= "--pdf=true ";
        }

        if (isset($urlRequest->viewport_width))
        {
            $command .= "--viewportWidth={$urlRequest->viewport_width} ";
        }

        if (isset($urlRequest->mobile))
        {
            $command .= '--mobile=true ';
        }

        if (isset($urlRequest->media_type_print))
        {
            $command .= '--mediaTypePrint=true ';
        }

        if (isset($urlRequest->user_agent))
        {
            $command .= "--userAgent={$urlRequest->user_agent}";
        }

        return $command;
    }
}
Here is the Node.js script:
require('puppeteer-screenshots').init();
puppeteer-screenshots (the Node.js script I am trying to call) is a package I've written for taking screenshots in an easy manner using Google's Puppeteer package. It's open source and available on BitBucket; here is a link to the one and only file. The line in this package that seems to hang the process is line 138, or, to be specific, this:
await page.screenshot({path: config.path, fullPage: true});
When this line is removed, the process finishes but obviously doesn't save the screenshot.
Also, I've tried rewriting the code with exec instead of Process, but in that case the script just hangs and doesn't finish either.
Also, when executing the exact same command from the command line, everything works as expected.
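One thing worth checking, though the question doesn't show it: Symfony Process throws ProcessTimedOutException after 60 seconds by default, and a Puppeteer screenshot of a heavy page can take longer. A minimal sketch of raising the limit before run() (the 120/60 values are placeholders):
$process = new Process($command);
$process->setTimeout(120);     // hard limit in seconds; setTimeout(null) disables it
$process->setIdleTimeout(60);  // optional: abort if the script produces no output for 60s
$process->run();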

Display /var/log/messages in HTML/PHP output?

I am trying to display the output of /var/log/messages or similar (../secure for instance) in a webpage that I access through a webserver on the same host.
Should I use bash to tail -f the messages file into a new output file and display that text file in the HTML page, or is there a better way to do this?
Thanks!
If you're looking for a way to display actual file contents online without the need to reload the page, then you should setup a WebSocket server.
You can build a WebSocket server using a framework such as phpDaemon, ReactPHP, Ratchet, icicle, or implement your own server with the help of PHP extensions wrapping asynchronous libraries: event, ev, or similar.
I've chosen a random framework from the list above: Ratchet. Ratchet is based on ReactPHP. ReactPHP chooses a backend for the event loop from the following list:
- libevent extension,
- libev extension,
- event extension,
- or an internal class based on the built-in stream_select() function.
As a maintainer of the event extension, I've chosen event.
I've written a "quick" example just to give you an idea of how it might be implemented. You'll most likely have to work out your own version, maybe using different tools, but the code should give you a starting point.
src/MyApp/Server.php
<?php
namespace MyApp;

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;

class Server implements MessageComponentInterface {
    protected $clients;

    public function __construct() {
        $this->clients = new \SplObjectStorage;
    }

    public function onOpen(ConnectionInterface $conn) {
        $this->clients->attach($conn);
        echo "New connection! ({$conn->resourceId})\n";
    }

    public function onMessage(ConnectionInterface $from, $msg) {
        $numRecv = count($this->clients) - 1;
        printf("Connection %d sending '%s' to %d other connection%s\n",
            $from->resourceId, $msg, $numRecv, $numRecv == 1 ? '' : 's');

        foreach ($this->clients as $client) {
            if ($from !== $client) {
                $client->send($msg);
            }
        }
    }

    public function onClose(ConnectionInterface $conn) {
        $this->clients->detach($conn);
        echo "Connection {$conn->resourceId} has disconnected\n";
    }

    public function onError(ConnectionInterface $conn, \Exception $e) {
        echo "An error has occurred: {$e->getMessage()}\n";
        $conn->close();
    }

    public function broadcast($msg) {
        foreach ($this->clients as $client) {
            $client->send($msg);
        }
    }
}
server.php
<?php
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;
use MyApp\Server;

require __DIR__ . '/vendor/autoload.php';

$server = IoServer::factory(
    new HttpServer(
        new WsServer(
            $my_app_server = new Server()
        )
    ),
    9989
);

$loop = $server->loop;
$filename = '/var/log/messages';

$loop->addPeriodicTimer(5, function () use ($filename, $my_app_server) {
    static $stat_info;

    if ($stat_info == null) {
        clearstatcache(true, $filename);
        $stat_info = stat($filename);
    }

    clearstatcache(true, $filename);
    $st = stat($filename);

    $size_diff = $st['size'] - $stat_info['size'];
    echo "Diff = $size_diff bytes\n";

    if ($size_diff > 0) {
        $offset = $stat_info['size'];
        $bytes = $size_diff;
    } elseif ($size_diff < 0) {
        // The file was likely truncated by `logrotate` or a similar utility
        $offset = 0;
        $bytes = $st['size'];
    } else {
        $bytes = 0;
    }

    $stat_info = $st;

    if ($bytes) {
        if (! $fp = fopen($filename, 'r')) {
            fprintf(STDERR, "Failed to open file $filename\n");
            return;
        }

        if ($offset > 0) {
            fseek($fp, $offset);
        }

        if ($msg = fread($fp, $bytes)) {
            $my_app_server->broadcast($msg);
        }

        fclose($fp);
    }
});

$server->run();
test.html
<html>
<head>
    <meta http-equiv="content-type" content="text/html; charset=utf-8">
    <title>Test</title>
</head>
<body>
    <script>
        var conn = new WebSocket('ws://localhost:9989');

        conn.onopen = function(e) {
            console.log("Connection established!");
        };

        conn.onmessage = function(e) {
            console.log("Msg from server", e.data);
        };
    </script>
</body>
</html>
I'll skip the steps required to setup a basic test environment using Composer. Assuming you have successfully configured the test environment for the files above, you'll be able to run the server with the following command:
php server.php
Check whether the user has permission to read /var/log/messages. On my system only root can read the file, so you might need to run the above-mentioned command with sudo (root permissions).
Now you can open test.html in a browser and look at the console output. Then trigger some event which is normally logged to the messages file; for instance, invoke sudo with a wrong password. The server should detect the change within the 5-second polling interval and send it to the WebSocket clients.
If you're using tail -f, you'll be continuously getting data from the file as it grows while the command runs.
You can use cat or tail -n instead. Also, of course, you can access the files directly by creating a symbolic or hard link to them (ln source-file link-file, or ln -s source-file link-file), but make sure your web server has enough rights to read them.
In the HTML, put the output between <pre> and </pre> tags, as in the sketch below.
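A minimal sketch of that approach, assuming the web server user can read the file and that shell_exec() is not disabled in php.ini:
<pre><?php
// Show the last 50 lines; htmlspecialchars() stops log content
// from being interpreted as HTML
echo htmlspecialchars(shell_exec('tail -n 50 /var/log/messages'));
?></pre>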
Method 1
In one of your base directories, create a symbolic link
ln -s /var/log/messages messages
If the directory belongs to, say, test.web, access the log with
http://test.web/messages
Method 2
If you're looking for a PHP script, then first create the link as mentioned in Method 1. Then create a new file, say readlog.php, in the base directory of test.web with the content below:
<?php
readfile($_SERVER['DOCUMENT_ROOT'] . '/messages');
?>
Access the readlog.php like :
http://test.web/readlog.php
Requirement:
Read access should be enabled for all users for /var/log/messages.
Note:
Setting world-readable permissions on /var/log/messages is NOT a good idea.
<!DOCTYPE html>
<html>
<head>
    <title>toyLogs</title>
</head>
<body>
    <!-- file_get_contents() + htmlspecialchars() instead of include(), so any
         PHP tags in the log are never executed and markup is shown escaped -->
    <div><pre><?php echo htmlspecialchars(file_get_contents('/var/www/html/accesslog.txt')); ?></pre></div>
</body>
</html>

Multi cURL function not working in XAMPP localhost

XAMPP Version: 1.8.1
PHP Version: 5.4.7
I'm using the following cURL class: http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading
Class source code: http://paste2.org/XGeMUMme
In XAMPP, when I attempt to do a multi curl session, the page never loads and my CPU usage goes up and doesn't stop until I end the Apache process; however, when I run the code on my webhost, it works perfectly.
The following works in XAMPP because it's a single-threaded cURL execution:
$curl = new CURL();
$opts = array( CURLOPT_RETURNTRANSFER => true, CURLOPT_FOLLOWLOCATION => true );
$curl->addSession( 'http://yahoo.com/', $opts );
$result = $curl->exec();
$curl->clear();
However, when this code is run (only adding one more session, which makes it multi-threaded), the page never loads, as I stated previously.
$curl = new CURL();
$opts = array( CURLOPT_RETURNTRANSFER => true, CURLOPT_FOLLOWLOCATION => true );
$curl->addSession( 'http://yahoo.com/', $opts );
$curl->addSession( 'http://google.com/', $opts );
$result = $curl->exec();
$curl->clear();
The above code does work on my live website, though.
Any ideas and/or solutions regarding this problem? Thank you!
With multi cURL here, curl_multi_select() always returns -1, which causes the loop to spin until the maximum execution time is exhausted.
You should add your own sleep instead, e.g. usleep(100);
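A minimal sketch of the workaround as a generic curl_multi loop (not the linked class itself, whose internals are abbreviated here), guarding the -1 return so the loop doesn't busy-spin:
$mh = curl_multi_init();
// ... curl_multi_add_handle($mh, $ch) for each session here ...

$running = null;
do {
    curl_multi_exec($mh, $running);
    // On affected PHP/Windows builds curl_multi_select() returns -1
    // immediately; sleep briefly so the loop doesn't peg the CPU at 100%.
    if (curl_multi_select($mh, 1.0) === -1) {
        usleep(100);
    }
} while ($running > 0);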

mod_rewrite to download - makes suspicious-looking file

I decided to try and use mod_rewrite to hide the location of a file that a user can download.
So they click on a link that's directed to "/download/some_file/" and they instead get "/downloads/some_file.zip"
Implemented like this:
RewriteRule ^download/([^/\.]+)/?$ downloads/$1.zip [L]
This works, except that when the download prompt appears I get a file named "download" with no extension, which looks suspicious, and the user might not realize they are supposed to unzip it. Is there a way of doing this so it looks like an actual file? Or is there a better way I should be doing this?
To provide some context/reason for hiding the location of the file. This is for a band where the music can be downloaded for free provided the user signs up for the mailing list.
Also note: I need to do this within .htaccess.
You can set the filename by sending the Content-disposition header:
https://serverfault.com/questions/101948/how-to-send-content-disposition-headers-in-apache-for-files
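For instance, a sketch of that suggestion in .htaccess (this assumes mod_headers is enabled, which shared hosts don't always allow, and the filename is illustrative):
<FilesMatch "\.zip$">
    Header set Content-Disposition "attachment; filename=some_file.zip"
</FilesMatch>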
OK, so I believe I'm restricted in which headers I can set using .htaccess.
So I have instead solved this using php.
I initially copied a download php script found here:
How to rewrite and set headers at the same time in Apache
However, my file size was too big, so this was not working properly.
After a bit of googling I came across this: http://teddy.fr/blog/how-serve-big-files-through-php
So my complete solution is as follows...
First send requests to download script:
RewriteRule ^download/([^/\.]+)/?$ downloads/download.php?download=$1 [L]
Then get full filename, set headers, and serve it chunk by chunk:
<?php
if (empty($_GET['download'])) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// basename() strips any path components, so the request can't escape the downloads directory
$file = $_SERVER['DOCUMENT_ROOT'] . '/media/downloads/' . basename($_GET['download']) . '.zip';

define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}

$save_as_name = basename($file);
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: no-cache');
header("Content-Type: application/zip");
header("Content-Disposition: attachment; filename=\"$save_as_name\"");
readfile_chunked($file);
?>