PHP ZipArchive | creating an unsupported file & not downloading on cPanel | working on local - zip

Hi, my goal is to create a zip file and download it. It works fine on my local server, but gives a problem on cPanel when I hit the download button:
$fileEntry = FileEntry::with('childEntriesRecursive')->where('shared_id', $shared_id)->notExpired()->firstOrFail();
abort_if(!static::accessCheck($fileEntry), 404);

$zip_file = str_replace(' ', '_', $fileEntry->name).'.zip';
$zip = new ZipArchive();
$zip->open(public_path($zip_file), ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($fileEntry->childEntriesRecursive as $key => $file) {
    $getFile = storage_path('app/public/'.$file->path);
    if (file_exists($getFile) && is_file($getFile)) {
        echo $file->name;
        $zip->addFile(storage_path('app/public/'.$file->path), $file->name);
    }
}
$zip->close();

return response()->download($zip_file, basename($zip_file))->deleteFileAfterSend(true);
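A likely culprit here is the echo $file->name; inside the loop: any output emitted before response()->download() is sent ends up prepended to the zip body, which corrupts the archive and can easily behave differently between a local server and cPanel. The download path is also relative, so it only resolves if the process's working directory happens to be public/. A minimal sketch of the same flow with those two issues removed (names and paths taken from the question, not a confirmed fix):

$zip_file = public_path(str_replace(' ', '_', $fileEntry->name) . '.zip');
$zip = new ZipArchive();
if ($zip->open($zip_file, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    abort(500, 'Could not create zip archive');
}
foreach ($fileEntry->childEntriesRecursive as $file) {
    $path = storage_path('app/public/' . $file->path);
    if (is_file($path)) {
        $zip->addFile($path, $file->name); // no echo here: stray output corrupts the response
    }
}
$zip->close();

return response()->download($zip_file, basename($zip_file))->deleteFileAfterSend(true);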

Related

Cannot place text inside a file using Linux

I have installed Moodle on an AWS EC2 instance, but the config.php file is missing. I created a file using the
touch config.php
command, but I couldn't place my content inside that file.
My content is:
<?php // Moodle configuration file
unset($CFG); global $CFG; $CFG = new stdClass();
$CFG->dbtype = 'mariadb'; $CFG->dblibrary = 'native'; $CFG->dbhost = 'localhost';
$CFG->dbname = 'moodledb'; $CFG->dbuser = 'root'; $CFG->dbpass = 'pass'; $CFG->prefix = 'mdl_';
$CFG->dboptions = array ( 'dbpersist' => 0, 'dbport' => '', 'dbsocket' => '', 'dbcollation' => 'utf8mb4_general_ci', );
$CFG->wwwroot = 'http://localhost/moodle'; $CFG->dataroot = 'L:\\xampp\\moodledata'; $CFG->admin = 'admin';
$CFG->directorypermissions = 0777;
require_once(__DIR__ . '/lib/setup.php');
I have connected to AWS using PuTTY from Windows.
Use cat and heredoc:
cat > config.php <<'EOF'
your content
goes here
EOF
And you can skip the touch step this way.
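Note the quotes in <<'EOF': quoting the delimiter stops the shell from expanding anything inside the heredoc, which matters here because the config is full of literal $CFG references that must be written out verbatim rather than interpolated.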

Command execution times out when using Process in Laravel

I am building a website-screenshot SaaS with Laravel. I am trying to execute a command from it, more precisely a Node.js script; however, I keep getting a ProcessTimedOutException error. Here is where the magic happens; in the take method below, the $process->run(); line throws the exception:
<?php

namespace App\Logic;

use App\Logic\TimeHelper;
use App\UrlRequest;
use Illuminate\Support\Facades\Storage;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;

class Screenshot {

    static function take(UrlRequest $urlRequest)
    {
        $name = self::generateName($urlRequest);
        $command = self::buildScreenshotCommand($name, $urlRequest);
        $startTime = TimeHelper::milliseconds();

        $process = new Process($command);
        $process->run();

        $endTime = TimeHelper::milliseconds();

        if (!$process->isSuccessful())
        {
            throw new ProcessFailedException($process);
        }

        $output = $process->getOutput();

        if (trim($output) === '')
        {
            $urlRequest->successful = 1;
            $file = self::uploadToS3($name);
            $urlRequest->image_url = $file['url'];
            $urlRequest->file_name = $name;
            $urlRequest->file_size = $file['size'];
            $urlRequest->time_it_took_to_take_screenshot_ms = $endTime - $startTime;
        }
        else
        {
            $urlRequest->error = $output;
        }

        if ($urlRequest->save())
        {
            return $urlRequest;
        }

        return false;
    }

    static function uploadToS3($name)
    {
        $name = 'screenshots/' . $name;
        Storage::disk('s3')->put($name, Storage::disk('local')->get($name), ['visibility' => 'public']); // upload to S3
        $fileSize = Storage::disk('local')->size($name);
        Storage::disk('local')->delete($name);

        return [
            'url' => Storage::disk('s3')->url($name),
            'size' => $fileSize
        ];
    }

    static function generateName($urlRequest)
    {
        $name = time() . rand(10000, 99999);
        $extension = '.png';

        if (isset($urlRequest->pdf) AND $urlRequest->pdf == 1)
        {
            $extension = '.pdf';
        }

        while (UrlRequest::where('file_name', '=', $name . $extension)->first())
        {
            $name = time() . rand(10000, 99999);
        }

        return $name . $extension;
    }

    static function buildScreenshotCommand($name, $urlRequest)
    {
        $command = 'cd ' . base_path() . ' && node puppeteer-screenshots-init.js ';
        $command .= "--url={$urlRequest->url} ";

        $fullPath = storage_path('app') . '/screenshots/' . $name;
        $command .= "--path={$fullPath} ";

        if (isset($urlRequest->pdf))
        {
            $command .= "--pdf=true ";
        }

        if (isset($urlRequest->viewport_width))
        {
            $command .= "--viewportWidth={$urlRequest->viewport_width} ";
        }

        if (isset($urlRequest->mobile))
        {
            $command .= '--mobile=true ';
        }

        if (isset($urlRequest->media_type_print))
        {
            $command .= '--mediaTypePrint=true ';
        }

        if (isset($urlRequest->user_agent))
        {
            $command .= "--userAgent={$urlRequest->user_agent}";
        }

        return $command;
    }
}
Here is the nodejs script:
require('puppeteer-screenshots').init();
puppeteer-screenshots (the Node.js script I am trying to call) is a package I've written for taking screenshots in an easy manner using Google's Puppeteer package. It's open source and available on BitBucket; here is a link to the one and only file. The line in this package that seems to hang the process is line 138 or, to be specific, this:
await page.screenshot({path: config.path, fullPage: true});
When this line is removed, the process finishes, but obviously it doesn't save the screenshot.
I've also tried rewriting the code with exec instead of Process, but in that case the script just hangs and never finishes either.
Also, when executing the exact same command from the command line, everything works as expected.
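For context, Symfony's Process component throws ProcessTimedOutException whenever a process outlives its timeout, and that timeout defaults to 60 seconds, which a Puppeteer launch plus a full-page screenshot can plausibly exceed on a loaded server. A minimal sketch of raising the limit (the 300-second value is an arbitrary assumption):

$process = new Process($command);
$process->setTimeout(300);      // hard wall-clock limit in seconds; null disables it
$process->setIdleTimeout(120);  // optional: maximum seconds without any new output
$process->run();

This doesn't explain why the same command is fast from an interactive shell, but ruling the timeout out (or temporarily disabling it with setTimeout(null)) is the usual first step before digging into environment differences for the node process.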

Import multiple CSVs to MongoDB

I know how to import a CSV file as a collection using mongoimport in a shell/cmd prompt, or using a GUI such as Studio 3T.
Each line becomes a document, with the header line supplying the field names.
For example, a CSV file named Data.csv, whose header line and content are as follows:
(example data was shown as an image in the original post)
It can be done very easily using the above method if it is just one file.
May I know if there is a way to import multiple files (a few hundred) where it can:
-Separate each file into one collection
-Use the file name as the collection name
-Import all contents as documents under that collection
Node.js would be best, but other methods are more than welcome.
Thank you.
Update:
In bash (mongoin.sh):
for i in `ls ~/te/*.csv`; do
    ./mongoimport -d test $i --type=csv --headerline ;
done
Called from Node.js:
const exec = require('child_process').exec;
var yourscript = exec('bash mongoin.sh /te',
    (error, stdout, stderr) => {
        console.log(`${stdout}`);
        console.log(`${stderr}`);
        if (error !== null) {
            console.log(`exec error: ${error}`);
        }
    });
In bash:
for i in `ls ~/Desktop/*.csv`; do
    ./mongoimport -d local $i --type=csv --headerline ;
done
In batch:
forfiles /p c:\te /m *.csv /c "cmd /c mongoimport -d local #file --type csv --headerline"
Where:
~/Desktop/*.csv is the pattern matching the CSV files to import,
and local is the MongoDB database to import into.
The name of the collection is picked up from the base filename of each CSV file.
Put the content of the bash code in a script, say mymongo.sh:
for i in `ls $1`; do
    ./mongoimport -d local $i --type=csv --headerline ;
done
And then call it from node, with the following code:
const exec = require('child_process').exec;
var yourscript = exec('bash mymongo.sh /Users/niko/Desktop',
    (error, stdout, stderr) => {
        console.log(`${stdout}`);
        console.log(`${stderr}`);
        if (error !== null) {
            console.log(`exec error: ${error}`);
        }
    });
Or on Windows, with the batch code in mymongo.bat:
const exec = require('child_process').exec;
var yourscript = exec('cmd /c c:/te/mymongo.bat',
    (error, stdout, stderr) => {
        console.log(`${stdout}`);
        console.log(`${stderr}`);
        if (error !== null) {
            console.log(`exec error: ${error}`);
        }
    });
This is the Python version of it.
import os
import subprocess

# directory of files
dir_files = r'C:\data'

# create list of all files
_, _, fns = next(os.walk(dir_files))
files = [os.path.join(dir_files, fn) for fn in fns]

# mongoimport tool address
mongotool = r'C:\Program Files\MongoDB\Server\4.4\bin\mongoimport.exe'

# name of mongodb database
mydatabase = 'mydatabase'

# name of mongodb collection
mycollection = 'mycollection'

# import all files to mongodb
for fl in files:
    commands = [mongotool, '--db', mydatabase,
                '--collection', mycollection,
                '--file', fl,
                '--type', 'tsv',   # use 'csv' for comma-separated files
                '--headerline']
    subprocess.Popen(commands)  # the list form handles the space in the tool path; no shell needed
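For anyone doing this from PHP rather than Node.js or Python, here is a minimal sketch of the same loop, using each file's base name as the collection name as the question asks (the directory path is an assumption):

<?php
$dir = '/Users/niko/Desktop'; // assumed folder containing the CSV files
foreach (glob($dir . '/*.csv') as $csv) {
    // base filename (without extension) becomes the collection name
    $collection = pathinfo($csv, PATHINFO_FILENAME);
    $cmd = sprintf(
        'mongoimport -d local -c %s --type=csv --headerline --file %s',
        escapeshellarg($collection),
        escapeshellarg($csv)
    );
    passthru($cmd, $exitCode);
    echo $csv . ' => exit code ' . $exitCode . PHP_EOL;
}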

Using SFTP to transfer images from an HTML form to a remote Linux server using Perl/CGI.pm

This is a school project, and the instructor has no knowledge of how to write the code.
I am using CGI and attempting to transfer a file without using Net::FTP or Net::SFTP, since the server I am transferring to will not allow connections from those modules. I have written the HTML form, and I am able to grab the name of the uploaded file through CGI.
Is it possible to invoke the sftp command, via the shell, from a Perl script that resides on a Linux server, to transfer a file uploaded through an HTML form?
If anyone knows a way to do it, please post the code so I can modify it and insert it into my script.
use CGI qw(:standard);
use File::Basename;

my ( $name, $path, $extension ) = fileparse( $productimage, '\..*' );
$productimage = $name . $extension;
$productimage =~ tr/ /_/;
$productimage =~ s/[^$safechars]//g;

if ( $productimage =~ /^([$safechars]+)$/ ) {
    $productimage = $1;
} else {
    die "Filename contains invalid characters";
}

$fh = upload('image');
$uploaddir = "../../.hidden/images";

open( UPLOADFILE, ">$uploaddir/$productimage" ) or die "$!";
binmode UPLOADFILE;

while (<$fh>) {
    print UPLOADFILE;
}

close UPLOADFILE;
This is the code I used to upload the file to the server.
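For what it's worth, the command-line sftp client is designed for interactive use, so shelling out to it from a script normally requires key-based authentication plus batch mode (sftp -b batchfile user@host), or scp with the same key setup; neither program will read a password from standard input.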

Download files from Rackspace Cloud Files

I have installed the PHP SDK using Composer. It seems the samples folder has old code, because it does not work for me. I want to download some files from a Rackspace container. Following is my code, and it returns nothing.
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'XXXX',
    'apiKey'   => 'XXXX'
);

$connection = new \OpenCloud\Rackspace($authURL, $credentials);
// var_dump($connection);exit;

$objstore = $connection->objectStoreService('cloudFiles', 'DFW');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
//var_dump($conlist);

while ($container = $conlist->Next()) {
    printf("* %s\n", $container->name);
}
First and foremost, update to the latest release of php-opencloud, currently 1.7.
Next, the included sample code for the Object Store is located here, but it doesn't include what you are looking to do.
The following code, given a path, will iterate through your containers and save the objects from your container to the destination path ($savePath). If an object already exists in that path, it will be skipped. This version includes output indicating success or failure for each object. Give this a try and let me know if you have any issues.
NOTE: Keep in mind that Rackspace's Cloud Files, Object Store, is handled on a per Datacenter basis so files stored in a container in ORD would be accessible only if you connect to the objectStoreService in ORD.
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'YOUR_USERNAME',
    'apiKey'   => 'YOUR_API_KEY',
);
$savePath = '/path/to/files/';

$connection = new \OpenCloud\Rackspace($authURL, $credentials);
$objstore = $connection->objectStoreService('cloudFiles', 'ORD');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
//var_dump($conlist);

while ($container = $conlist->Next()) {
    printf("*** %s\n", $container->name);

    if ($container->name == 'test2') {
        $files = $container->ObjectList();

        while ($o = $files->Next()) {
            $file_name = $o->getName();

            // Get our object
            $file = $container->getObject($file_name);
            printf("** %s\n", $file->getName());

            // Let's save this file
            echo "* Saving object\n";

            if (file_exists($savePath.$file_name)) {
                echo "* File already exists! SKIPPING\n\n";
            } else {
                if (!$fp = @fopen($savePath.$file_name, "wb")) {
                    throw new OpenCloud\Common\Exceptions\IOError(sprintf(
                        'Could not open file [%s] for writing',
                        $savePath.$file_name
                    ));
                }

                //$retval = fwrite($fp, $o->getContent());
                if (fwrite($fp, $file->getContent()) === FALSE) {
                    echo "* ERROR - Cannot write to file ($savePath.$file_name)\n\n";
                } else {
                    echo "* File successfully written\n\n";
                }

                fclose($fp); // close the handle before moving to the next object
            }
        }
    }
}
Output:
Containers:
*** gallery
*** test2
** 61OUUC44G-L._SL1471_.jpg
* Saving object
* File written
** Computer-Code.jpg
* Saving object
* File written
** accessibility2.jpg
* Saving object
* File written
Directory Listing on my server:
root@app01:/path/to/files# ll
total 960
drwxrwxrwx 2 root root 4096 Nov 8 18:53 ./
drwxr-xr-x 15 www-data www-data 4096 Nov 8 18:20 ../
-rw-r--r-- 1 www-data www-data 68650 Nov 8 18:45 61OUUC44G-L._SL1471_.jpg
-rw-r--r-- 1 www-data www-data 374177 Nov 8 18:45 accessibility2.jpg
-rw-r--r-- 1 www-data www-data 515919 Nov 8 18:45 Computer-Code.jpg
Download Rackspace Cloud Files recursively
<?php
/**
 * "require": {
 *     "rackspace/php-opencloud": "dev-master"
 * }
 */
ini_set('memory_limit', '2048M'); // size must be bigger than the biggest file
ini_set('max_execution_time', 0);

require 'vendor/autoload.php';

use OpenCloud\Rackspace;

// Instantiate a Rackspace client.
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => '<USERNAME>',
    'apiKey'   => '<APIKEY>'
));

$objStore = $client->objectStoreService('cloudFiles', 'LON');
$savePath = __DIR__.'/backup/';

// get our containers
print("Containers:\n");
$containersList = $objStore->listContainers();

while ($container = $containersList->Next()) {
    if ( ! in_array($container->name, array('.CDN_ACCESS_LOGS', '<CONTAINER_TO_EXCLUDE>'))) {
        printf("*** %s\n", $container->name);

        $containerDir = $savePath.$container->name.'/';
        if (!is_dir($containerDir)) {
            mkdir($containerDir, 0777, true);
        }

        $files = $container->ObjectList();

        while ($o = $files->Next()) {
            $file_name = $o->getName();

            if (file_exists($containerDir . $file_name)) {
                echo '## '.$containerDir.$file_name.' already exists'."\n";
                continue;
            }

            // Get our object
            $file = $container->getObject($file_name);

            if (strpos($file->getName(), '<FILES_TO_EXCLUDE>') !== false) {
                continue;
            }

            $tempDir = $containerDir . dirname($file->getName()) . '/';
            if (!is_dir($tempDir)) {
                mkdir($tempDir, 0777, true);
            }

            if (file_put_contents($containerDir . $file_name, $file->getContent())) {
                printf("** %s - OK\n", $file->getName());
            } else {
                printf("** %s - KO\n", $file->getName());
            }

            unset($file); // free the object body before fetching the next one
        }
    }
}
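A usage note on the script above: $file->getContent() buffers each object entirely in memory before file_put_contents() writes it to disk, which is why the memory_limit at the top must exceed the largest object in any container; running the script from the command line rather than through a web server keeps those ini overrides and the unlimited max_execution_time in effect for long backups.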
