Command execution times out when using Process in Laravel - node.js

I am building a website-screenshot SaaS with Laravel. From it I am trying to execute a command, more precisely a Node.js script; however, I keep getting a ProcessTimedOutException error. Here is where the magic happens; in the take method, the $process->run(); line throws the exception:
<?php

namespace App\Logic;

use App\Logic\TimeHelper;
use App\UrlRequest;
use Illuminate\Support\Facades\Storage;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;

class Screenshot {

    static function take(UrlRequest $urlRequest)
    {
        $name = self::generateName($urlRequest);
        $command = self::buildScreenshotCommand($name, $urlRequest);
        $startTime = TimeHelper::milliseconds();

        $process = new Process($command);
        $process->run();

        $endTime = TimeHelper::milliseconds();

        if (!$process->isSuccessful()) {
            throw new ProcessFailedException($process);
        }

        $output = $process->getOutput();

        if (trim($output) === '') {
            $urlRequest->successful = 1;
            $file = self::uploadToS3($name);
            $urlRequest->image_url = $file['url'];
            $urlRequest->file_name = $name;
            $urlRequest->file_size = $file['size'];
            $urlRequest->time_it_took_to_take_screenshot_ms = $endTime - $startTime;
        } else {
            $urlRequest->error = $output;
        }

        if ($urlRequest->save()) {
            return $urlRequest;
        }

        return false;
    }

    static function uploadToS3($name)
    {
        $name = 'screenshots/' . $name;

        Storage::disk('s3')->put($name, Storage::disk('local')->get($name), ['visibility' => 'public']); // upload to S3

        $fileSize = Storage::disk('local')->size($name);
        Storage::disk('local')->delete($name);

        return [
            'url' => Storage::disk('s3')->url($name),
            'size' => $fileSize
        ];
    }

    static function generateName($urlRequest)
    {
        $name = time() . rand(10000, 99999);
        $extension = '.png';

        if (isset($urlRequest->pdf) AND $urlRequest->pdf == 1) {
            $extension = '.pdf';
        }

        while (UrlRequest::where('file_name', '=', $name . $extension)->first()) {
            $name = time() . rand(10000, 99999);
        }

        return $name . $extension;
    }

    static function buildScreenshotCommand($name, $urlRequest)
    {
        $command = 'cd ' . base_path() . ' && node puppeteer-screenshots-init.js ';
        $command .= "--url={$urlRequest->url} ";

        $fullPath = storage_path('app') . '/screenshots/' . $name;
        $command .= "--path={$fullPath} ";

        if (isset($urlRequest->pdf)) {
            $command .= "--pdf=true ";
        }

        if (isset($urlRequest->viewport_width)) {
            $command .= "--viewportWidth={$urlRequest->viewport_width} ";
        }

        if (isset($urlRequest->mobile)) {
            $command .= '--mobile=true ';
        }

        if (isset($urlRequest->media_type_print)) {
            $command .= '--mediaTypePrint=true ';
        }

        if (isset($urlRequest->user_agent)) {
            $command .= "--userAgent={$urlRequest->user_agent}";
        }

        return $command;
    }
}
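The flags assembled by buildScreenshotCommand are plain --key=value arguments; on the Node side they can be consumed with a few lines. A minimal sketch (parseFlags is a hypothetical helper; puppeteer-screenshots has its own argument handling):

```javascript
// Parse --key=value command-line flags into a plain object.
// Hypothetical helper, not part of puppeteer-screenshots.
function parseFlags(argv) {
  const flags = {};
  for (const arg of argv) {
    const m = arg.match(/^--([^=]+)=(.*)$/); // split at the first '='
    if (m) flags[m[1]] = m[2];
  }
  return flags;
}
```

In a real script it would be called as `parseFlags(process.argv.slice(2))`.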
Here is the nodejs script:
require('puppeteer-screenshots').init();
puppeteer-screenshots (this is the Node.js script I am trying to call) is a package I've written for taking screenshots in an easy manner using Google's Puppeteer package. It's open source and available on BitBucket; here is a link to the one and only file. The line in this package that seems to hang the process is line 138 or, to be specific, this:
await page.screenshot({path: config.path, fullPage: true});
When this line is removed, the process finishes, but obviously doesn't save the screenshot.
I've also tried rewriting the code with exec instead of Process, but in that case the script just hangs and doesn't finish either.
Also, when executing the exact same command from the command line, everything works as expected.
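For context, Symfony's Process component kills the child after a default timeout of 60 seconds ($process->setTimeout() raises or disables it), which is what ProcessTimedOutException reports. On the Node side, an explicit deadline makes the script fail fast with an error instead of hanging the caller; a minimal sketch (withTimeout is a hypothetical helper, not part of puppeteer-screenshots):

```javascript
// Race a promise against a hard deadline, so a stuck screenshot call
// rejects instead of keeping the Node process alive forever.
// Hypothetical helper, not part of puppeteer-screenshots.
function withTimeout(promise, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms} ms`)), ms);
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}
```

It could wrap the hanging call, e.g. `await withTimeout(page.screenshot({path: config.path, fullPage: true}), 30000)`.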

Related

PHP ZipArchive | creating unsupported file & not downloading on Cpanel | working on local

Hi, my goal is to create a zip file and download it. It works okay on my local server but gives a problem on cPanel when I hit the download button:
$fileEntry = FileEntry::with('childEntriesRecursive')->where('shared_id', $shared_id)->notExpired()->firstOrFail();
abort_if(!static::accessCheck($fileEntry), 404);

$zip_file = str_replace(' ', '_', $fileEntry->name) . '.zip';
$zip = new ZipArchive();
$zip->open(public_path($zip_file), ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($fileEntry->childEntriesRecursive as $key => $file) {
    $getFile = storage_path('app/public/' . $file->path);
    if (file_exists($getFile) && is_file($getFile)) {
        echo $file->name;
        $zip->addFile(storage_path('app/public/' . $file->path), $file->name);
    }
}

$zip->close();

return response()->download($zip_file, basename($zip_file))->deleteFileAfterSend(true);
output is

Japanese Transliteration in Node.js and Kakasi

I have written a little wrapper for Kakasi
that is like the following:
Kakasi.prototype.transliterate = function (data) {
    var self = this;
    return new Promise(function (resolve, reject) {
        var args;
        args = [
            '-i',
            'euc',
            '-Ha',
            '-Ka',
            '-Ja',
            '-Ea',
            '-ka',
            '-s',
            '-iutf8',
            '-outf8'
        ];
        var kakasi = spawn(self._options.bin, args, {});
        console.log("echo \"" + data + "\" | " + kakasi.spawnargs.join(' '));

        args = [
            data
        ];
        var echo = spawn('echo', args, {});
        echo.stdout.pipe(kakasi.stdin);

        var res = '';
        kakasi.stdout.on('data', function (_data) {
            var data = new Buffer(_data, 'utf-8').toString();
            res += data;
        });
        kakasi.stdout.on('end', function (_) {
            return resolve(res);
        });
        kakasi.on('error', function (error) {
            return reject(error);
        });
        if (self._options.debug) kakasi.stdout.pipe(process.stdout);
    });
}; // transliterate
This code basically runs the following command:
echo "退屈であくびばっかしていた毎日" | kakasi -i euc -Ha -Ka -Ja -Ea -ka -s -iutf8 -outf8
that outputs taikutsu deakubibakkashiteita mainichi
The problem is that the JavaScript is missing some output; in fact:
$ node transliterate.js
echo "退屈であくびばっかしていた毎日" | kakasi -i euc -Ha -Ka -Ja -Ea -ka -s -iutf8 -outf8
----------
deakubibakkashiteita
The input parameters are the same, but for some reason (encoding?) the child's output to stdout is different.
The kakasi.js code is available here.
I think your dictionary loader causes this problem. You should flip the dictionaries as follows.
process.env.KANWADICTPATH = resolve('./data/kanwadict');
process.env.ITAIJIDICTPATH = resolve('./data/itaijidict');
instead of
process.env.KANWADICTPATH = resolve('./data/itaijidict');
process.env.ITAIJIDICTPATH = resolve('./data/kanwadict');

How to call a function in Perl script after performing SSH?

I am creating a Perl script in which I have to SSH into multiple servers from the same script and run the same commands on all of these remote servers.
Right now I am using an "if" block to reach all the other servers from this script and run commands on them.
I want to create a function with the set of commands that I need to run on these different servers.
if ($random_number == 1) {
    use Net::SSH::Perl;
    use lib qw(user/share/perl5/);

    my $hostname = "10.*.*.*";
    my $username = "root";
    my $password = "root\#123";
    my $cmd1 = "ls /home/ashish/";
    my $cmd2 = "netstat -na | grep *.*.*.*";

    my $ssh = Net::SSH::Perl->new($hostname);
    $ssh->login($username, $password);
    my ($stdout, $stderr, $exit) = $ssh->cmd("$cmd1 && $cmd2");
    print $stdout;
}
The commands inside the if block need to be repeated for different servers; I want to use a function call.
Start with general programming tutorials, then do it like:
use Net::SSH::Perl;
use strict;
use warnings;

my @servers = (
    {
        hostname => 'somehost1',
        username => 'someuser1',
        password => 'somepass1',
        commands => ['somecmd11', 'somecmd12'],
    },
    {
        hostname => 'somehost2',
        username => 'someuser2',
        password => 'somepass2',
        commands => ['somecmd21', 'somecmd22'],
    },
    # ...
);

do_something_on_remote_servers_one_by_one(@servers);
exit(0);

sub do_something_on_remote_servers_one_by_one {
    my (@servers) = @_;
    foreach my $server (@servers) {
        my $ssh = Net::SSH::Perl->new($server->{hostname});
        $ssh->login($server->{username}, $server->{password});
        my $cmd_string = join(' && ', @{ $server->{commands} });
        my ($stdout, $stderr, $exit) = $ssh->cmd($cmd_string);
        print $stdout;
    }
}

After that, you can think about executing commands in parallel.

Download files from RackSpace Files

I have installed the PHP SDK using Composer. The samples folder seems to have old code, because it does not work for me. I want to download some files from a Rackspace folder. The following is my code, and it brings back nothing.
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'XXXX',
    'apiKey' => 'XXXX'
);

$connection = new \OpenCloud\Rackspace($authURL, $credentials);
// var_dump($connection);exit;

$objstore = $connection->objectStoreService('cloudFiles', 'DFW');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
//var_dump($conlist);
while ($container = $conlist->Next()) {
    printf("* %s\n", $container->name);
}
First and foremost, update to the latest release of php-opencloud, currently 1.7.
Next, included sample code for the Object Store is located here, but doesn't include what you were looking to do.
The following code, given a path, will iterate through your containers and save the objects from your container to the destination path ($savePath). If the object already exists in that path, it will be skipped. This version includes output indicating success or failure for each object. Give this a try and let me know if you have any issues.
NOTE: Keep in mind that Rackspace's Cloud Files, Object Store, is handled on a per Datacenter basis so files stored in a container in ORD would be accessible only if you connect to the objectStoreService in ORD.
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'YOUR_USERNAME',
    'apiKey' => 'YOUR_API_KEY',
);

$savePath = '/path/to/files/';

$connection = new \OpenCloud\Rackspace($authURL, $credentials);

$objstore = $connection->objectStoreService('cloudFiles', 'ORD');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
//var_dump($conlist);
while ($container = $conlist->Next()) {
    printf("*** %s\n", $container->name);
    if ($container->name == 'test2') {
        $files = $container->ObjectList();
        while ($o = $files->Next()) {
            $file_name = $o->getName();

            // Get our object
            $file = $container->getObject($file_name);
            printf("** %s\n", $file->getName());

            // Let's save this file
            echo "* Saving object\n";
            if (file_exists($savePath . $file_name)) {
                echo "* File already exists! SKIPPING\n\n";
            } else {
                if (!$fp = @fopen($savePath . $file_name, "wb")) {
                    throw new \OpenCloud\Common\Exceptions\IOError(sprintf(
                        'Could not open file [%s] for writing',
                        $savePath . $file_name
                    ));
                }
                //$retval = fwrite($fp, $o->getContent());
                if (fwrite($fp, $file->getContent()) === FALSE) {
                    echo "* ERROR - Cannot write to file ({$savePath}{$file_name})\n\n";
                } else {
                    echo "* File successfully written\n\n";
                }
            }
        }
    }
}
Output:
Containers:
*** gallery
*** test2
** 61OUUC44G-L._SL1471_.jpg
* Saving object
* File written
** Computer-Code.jpg
* Saving object
* File written
** accessibility2.jpg
* Saving object
* File written
Directory Listing on my server:
root@app01:/path/to/files# ll
total 960
drwxrwxrwx 2 root root 4096 Nov 8 18:53 ./
drwxr-xr-x 15 www-data www-data 4096 Nov 8 18:20 ../
-rw-r--r-- 1 www-data www-data 68650 Nov 8 18:45 61OUUC44G-L._SL1471_.jpg
-rw-r--r-- 1 www-data www-data 374177 Nov 8 18:45 accessibility2.jpg
-rw-r--r-- 1 www-data www-data 515919 Nov 8 18:45 Computer-Code.jpg
Download Rackspace Cloud Files recursively
<?php

/**
 * "require": {
 *     "rackspace/php-opencloud": "dev-master"
 * }
 */

ini_set('memory_limit', '2048M'); // size must be bigger than the biggest file
ini_set('max_execution_time', 0);

require 'vendor/autoload.php';

use OpenCloud\Rackspace;

// Instantiate a Rackspace client.
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => '<USERNAME>',
    'apiKey'   => '<APIKEY>'
));

$objStore = $client->objectStoreService('cloudFiles', 'LON');

$savePath = __DIR__ . '/backup/';

// get our containers
print("Containers:\n");
$containersList = $objStore->listContainers();
while ($container = $containersList->Next()) {
    if (!in_array($container->name, array('.CDN_ACCESS_LOGS', '<CONTAINER_TO_EXCLUDE>'))) {
        printf("*** %s\n", $container->name);

        $containerDir = $savePath . $container->name . '/';
        if (!is_dir($containerDir)) {
            mkdir($containerDir, 0777, true);
        }

        $files = $container->ObjectList();
        while ($o = $files->Next()) {
            $file_name = $o->getName();
            if (file_exists($containerDir . $file_name)) {
                echo '## ' . $containerDir . $file_name . ' already exists' . "\n";
                continue;
            }

            // Get our object
            $file = $container->getObject($file_name);
            if (strpos($file->getName(), '<FILES_TO_EXCLUDE>') !== false) {
                continue;
            }

            $tempDir = $containerDir . dirname($file->getName()) . '/';
            if (!is_dir($tempDir)) {
                mkdir($tempDir, 0777, true);
            }

            if (file_put_contents($containerDir . $file_name, $file->getContent())) {
                printf("** %s - OK\n", $file->getName());
            } else {
                printf("** %s - KO\n", $file->getName());
            }
            unset($file);
        }
    }
}

File transferred via SFTP

I have a Linux (openSUSE 10.x) box with an SFTP service on it.
When someone puts a file, I have to write a script to move the files to another directory. I do not want to write a cron job. Is there an event or something I can check to see if they have sent the file?
You can write a C application and hook into inotify events.
Check also Net::SFTP::Server, an SFTP server written in Perl that can be extended to do things like the one you need.
Some code:
#!/usr/bin/perl

use strict;
use warnings;
use File::Basename ();

my $server = Server->new(timeout => 15);
$server->run;
exit(0);

package Server;

use Net::SFTP::Server::Constants qw(SSH_FXF_WRITE);
use parent 'Net::SFTP::Server::FS';

sub handle_command_open_v3 {
    my ($self, $id, $path, $flags, $attrs) = @_;
    my $writable = $flags & SSH_FXF_WRITE;
    my $pflags = $self->sftp_open_flags_to_sysopen($flags);
    my $perms = $attrs->{mode};
    my $old_umask;
    if (defined $perms) {
        $old_umask = umask $perms;
    }
    else {
        $perms = 0666;
    }
    my $fh;
    unless (sysopen $fh, $path, $pflags, $perms) {
        $self->push_status_errno_response($id);
        umask $old_umask if defined $old_umask;
        return;
    }
    umask $old_umask if defined $old_umask;
    if ($writable) {
        Net::SFTP::Server::FS::_set_attrs($path, $attrs)
            or $self->send_status_errno_response($id);
    }
    my $hid = $self->save_file_handler($fh, $flags, $perms, $path);
    $self->push_handle_response($id, $hid);
}

sub handle_command_close_v3 {
    my $self = shift;
    my ($id, $hid) = @_;
    my ($type, $fh, $flags, $perms, $path) = $self->get_handler($hid);
    $self->SUPER::handle_command_close_v3(@_);
    if ($type eq 'file' and $flags & SSH_FXF_WRITE) {
        my $name = File::Basename::basename($path);
        rename $path, "/tmp/$name";
    }
}
Save the script somewhere on your server, chmod 755 it, and configure OpenSSH to use it as the SFTP server instead of the default one.