Download files from Rackspace Cloud Files - rackspace-cloud

I have installed the PHP SDK using Composer. The samples folder seems to contain outdated code, because it does not work for me. I want to download some files from a Rackspace container. The following is my code, and it returns nothing.
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'XXXX',
    'apiKey'   => 'XXXX'
);

$connection = new \OpenCloud\Rackspace($authURL, $credentials);
// var_dump($connection);exit;

$objstore = $connection->objectStoreService('cloudFiles', 'DFW');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
// var_dump($conlist);
while ($container = $conlist->Next()) {
    printf("* %s\n", $container->name);
}

First and foremost, update to the latest release of php-opencloud, currently 1.7.
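For example, via Composer (the package name is taken from the composer require block in the recursive example further down; adjust the constraint to whatever 1.7.x release is current):
composer require rackspace/php-opencloud:~1.7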
Next, the sample code included for the Object Store is located here, but it doesn't cover what you are looking to do.
The following code, given a destination path ($savePath), will iterate through your containers and save the objects from your container to that path. If an object already exists at that path, it will be skipped. This version includes output indicating success or failure for each object. Give it a try and let me know if you have any issues.
NOTE: Keep in mind that Rackspace's Cloud Files (Object Store) is handled on a per-datacenter basis, so files stored in a container in ORD are accessible only if you connect to the objectStoreService in ORD.
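If you are unsure which region a container lives in, a quick sketch (assuming a $connection built as in your question, and using only the listContainers()/Next() calls shown elsewhere in this post) is to open one service per region and list each:

// Hypothetical region check: one object store service per datacenter
$services = array(
    'DFW' => $connection->objectStoreService('cloudFiles', 'DFW'),
    'ORD' => $connection->objectStoreService('cloudFiles', 'ORD'),
);
foreach ($services as $region => $service) {
    printf("Containers in %s:\n", $region);
    $list = $service->listContainers();
    while ($container = $list->Next()) {
        printf("* %s\n", $container->name);
    }
}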
<?php

require 'vendor/autoload.php';

$authURL = 'https://identity.api.rackspacecloud.com/v2.0/';
$credentials = array(
    'username' => 'YOUR_USERNAME',
    'apiKey'   => 'YOUR_API_KEY',
);
$savePath = '/path/to/files/';

$connection = new \OpenCloud\Rackspace($authURL, $credentials);
$objstore = $connection->objectStoreService('cloudFiles', 'ORD');

// get our containers
print("Containers:\n");
$conlist = $objstore->listContainers();
// var_dump($conlist);
while ($container = $conlist->Next()) {
    printf("*** %s\n", $container->name);
    if ($container->name == 'test2') {
        $files = $container->ObjectList();
        while ($o = $files->Next()) {
            $file_name = $o->getName();
            // Get our object
            $file = $container->getObject($file_name);
            printf("** %s\n", $file->getName());
            // Let's save this file
            echo "* Saving object\n";
            if (file_exists($savePath . $file_name)) {
                echo "* File already exists! SKIPPING\n\n";
            } else {
                if (!$fp = @fopen($savePath . $file_name, "wb")) {
                    throw new OpenCloud\Common\Exceptions\IOError(sprintf(
                        'Could not open file [%s] for writing',
                        $savePath . $file_name
                    ));
                }
                if (fwrite($fp, $file->getContent()) === FALSE) {
                    echo "* ERROR - Cannot write to file ({$savePath}{$file_name})\n\n";
                } else {
                    echo "* File successfully written\n\n";
                }
                fclose($fp);
            }
        }
    }
}
Output:
Containers:
*** gallery
*** test2
** 61OUUC44G-L._SL1471_.jpg
* Saving object
* File successfully written
** Computer-Code.jpg
* Saving object
* File successfully written
** accessibility2.jpg
* Saving object
* File successfully written
Directory listing on my server:
root@app01:/path/to/files# ll
total 960
drwxrwxrwx 2 root root 4096 Nov 8 18:53 ./
drwxr-xr-x 15 www-data www-data 4096 Nov 8 18:20 ../
-rw-r--r-- 1 www-data www-data 68650 Nov 8 18:45 61OUUC44G-L._SL1471_.jpg
-rw-r--r-- 1 www-data www-data 374177 Nov 8 18:45 accessibility2.jpg
-rw-r--r-- 1 www-data www-data 515919 Nov 8 18:45 Computer-Code.jpg

Download Rackspace Cloud Files recursively
<?php
/**
 * "require": {
 *     "rackspace/php-opencloud": "dev-master"
 * }
 */

// getContent() buffers each object in memory,
// so the limit must be bigger than the biggest file
ini_set('memory_limit', '2048M');
ini_set('max_execution_time', 0);

require 'vendor/autoload.php';

use OpenCloud\Rackspace;

// Instantiate a Rackspace client.
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => '<USERNAME>',
    'apiKey'   => '<APIKEY>'
));

$objStore = $client->objectStoreService('cloudFiles', 'LON');
$savePath = __DIR__ . '/backup/';

// get our containers
print("Containers:\n");
$containersList = $objStore->listContainers();
while ($container = $containersList->Next()) {
    if (!in_array($container->name, array('.CDN_ACCESS_LOGS', '<CONTAINER_TO_EXCLUDE>'))) {
        printf("*** %s\n", $container->name);
        $containerDir = $savePath . $container->name . '/';
        if (!is_dir($containerDir)) {
            mkdir($containerDir, 0777, true);
        }
        $files = $container->ObjectList();
        while ($o = $files->Next()) {
            $file_name = $o->getName();
            if (file_exists($containerDir . $file_name)) {
                echo '## ' . $containerDir . $file_name . ' already exists' . "\n";
                continue;
            }
            // Get our object
            $file = $container->getObject($file_name);
            if (strpos($file->getName(), '<FILES_TO_EXCLUDE>') !== false) {
                continue;
            }
            // Object names can contain slashes, so recreate the directory tree locally
            $tempDir = $containerDir . dirname($file->getName()) . '/';
            if (!is_dir($tempDir)) {
                mkdir($tempDir, 0777, true);
            }
            // Compare against false explicitly: a zero-byte object is still a valid write
            if (file_put_contents($containerDir . $file_name, $file->getContent()) !== false) {
                printf("** %s - OK\n", $file->getName());
            } else {
                printf("** %s - KO\n", $file->getName());
            }
            // Free the buffered object content before fetching the next one
            unset($file);
        }
    }
}
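A backup like this can run for a long time, which is also why max_execution_time is lifted above, so it is best started from the CLI rather than through a web request. A hypothetical nightly cron entry, assuming the script is saved as backup.php (both paths here are placeholders):
0 2 * * * php /path/to/backup.php >> /var/log/cf-backup.log 2>&1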

Related

PHP ZipArchive | creating unsupported file & not downloading on cPanel | working on local

Hi, my goal is to create a zip file and download it. It works okay on my local server, but it gives a problem on cPanel when I hit the download button.
$fileEntry = FileEntry::with('childEntriesRecursive')->where('shared_id', $shared_id)->notExpired()->firstOrFail();
abort_if(!static::accessCheck($fileEntry), 404);
$zip_file = str_replace(' ', '_', $fileEntry->name) . '.zip';
$zip = new ZipArchive();
$zip->open(public_path($zip_file), ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($fileEntry->childEntriesRecursive as $key => $file) {
    $getFile = storage_path('app/public/' . $file->path);
    if (file_exists($getFile) && is_file($getFile)) {
        echo $file->name;
        $zip->addFile(storage_path('app/public/' . $file->path), $file->name);
    }
}
$zip->close();
return response()->download($zip_file, basename($zip_file))->deleteFileAfterSend(true);
The output is:
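Not an authoritative fix, but two details in the snippet above are worth checking: the zip is created at public_path($zip_file) while the download response is given only the relative $zip_file, and the echo inside the loop sends output before the file response, which can corrupt the downloaded archive. A minimal sketch with both changed (same names as the question, otherwise hypothetical):

$zipPath = public_path($zip_file);
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($fileEntry->childEntriesRecursive as $file) {
    $getFile = storage_path('app/public/' . $file->path);
    if (file_exists($getFile) && is_file($getFile)) {
        // No echo here: any output emitted before the response can corrupt the zip
        $zip->addFile($getFile, $file->name);
    }
}
$zip->close();

// Hand the absolute path to the response so it does not depend on the working directory
return response()->download($zipPath, basename($zipPath))->deleteFileAfterSend(true);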

How to read external variables

I am trying to read a variable passed as an external parameter via --configFile="", but I always get a "not found" error, even if I pass the absolute path.
Vars:
var (
    c          Config
    conf       = c.getConf()
    app        = kingpin.New("exporter", "Exporter for Prometheus.")
    configFile = app.Flag("configFile", "Configuration file destination (/etc/exporter/config.yaml)").Default("/etc/exporter/config.yaml").String()
)
Config:
func (c *Config) getConf() *Config {
    yamlFile, err := ioutil.ReadFile(string(*configFile))
    if err != nil {
        log.Errorf("yamlFile.Get err #%v ", err)
    }
    err = yaml.Unmarshal(yamlFile, c)
    if err != nil {
        log.Errorf("Unmarshal: %v", err)
    }
    return c
}
Command/Output:
server:/etc/exporter # ./exporter --configFile="/etc/exporter/config.yaml"
ERRO[0000] yamlFile.Get err #open : no such file or directory
ls -ltr:
total 14152
-rw------- 1 root root 1334 Sep 25 20:47 config.yaml
-rwxrwxr-x 1 root root 14466568 Sep 25 22:03 exporter
The root cause is initialization order: conf = c.getConf() sits in a package-level var block, so it runs before kingpin has parsed os.Args, and *configFile is still empty at that point (note the blank filename in "open : no such file or directory"). Declare the variable there, and only call getConf() after parsing.
Replace the code here https://github.com/rafal-szypulka/itm_exporter/blob/master/main.go#L51 with:
conf *Config
and replace the code here https://github.com/rafal-szypulka/itm_exporter/blob/master/main.go#L354 with:
p := kingpin.MustParse(app.Parse(os.Args[1:]))
conf = c.getConf()
switch p {

Command execution times out when using Process in Laravel

I am building a website screenshots SaaS with Laravel. I am trying to execute a command from it, more precisely a Node.js script; however, I keep getting a ProcessTimedOutException error. Here is where the magic happens; look at the take method, where the $process->run(); line throws the exception:
<?php

namespace App\Logic;

use App\Logic\TimeHelper;
use App\UrlRequest;
use Illuminate\Support\Facades\Storage;
use Symfony\Component\Process\Exception\ProcessFailedException;
use Symfony\Component\Process\Process;

class Screenshot {

    static function take(UrlRequest $urlRequest)
    {
        $name = self::generateName($urlRequest);
        $command = self::buildScreenshotCommand($name, $urlRequest);
        $startTime = TimeHelper::milliseconds();

        $process = new Process($command);
        $process->run();

        $endTime = TimeHelper::milliseconds();

        if (!$process->isSuccessful()) {
            throw new ProcessFailedException($process);
        }

        $output = $process->getOutput();

        if (trim($output) === '') {
            $urlRequest->successful = 1;
            $file = self::uploadToS3($name);
            $urlRequest->image_url = $file['url'];
            $urlRequest->file_name = $name;
            $urlRequest->file_size = $file['size'];
            $urlRequest->time_it_took_to_take_screenshot_ms = $endTime - $startTime;
        } else {
            $urlRequest->error = $output;
        }

        if ($urlRequest->save()) {
            return $urlRequest;
        }

        return false;
    }

    static function uploadToS3($name)
    {
        $name = 'screenshots/' . $name;
        Storage::disk('s3')->put($name, Storage::disk('local')->get($name), ['visibility' => 'public']); // upload to S3
        $fileSize = Storage::disk('local')->size($name);
        Storage::disk('local')->delete($name);

        return [
            'url' => Storage::disk('s3')->url($name),
            'size' => $fileSize
        ];
    }

    static function generateName($urlRequest)
    {
        $name = time() . rand(10000, 99999);
        $extension = '.png';

        if (isset($urlRequest->pdf) AND $urlRequest->pdf == 1) {
            $extension = '.pdf';
        }

        while (UrlRequest::where('file_name', '=', $name . $extension)->first()) {
            $name = time() . rand(10000, 99999);
        }

        return $name . $extension;
    }

    static function buildScreenshotCommand($name, $urlRequest)
    {
        $command = 'cd ' . base_path() . ' && node puppeteer-screenshots-init.js ';
        $command .= "--url={$urlRequest->url} ";

        $fullPath = storage_path('app') . '/screenshots/' . $name;
        $command .= "--path={$fullPath} ";

        if (isset($urlRequest->pdf)) {
            $command .= "--pdf=true ";
        }

        if (isset($urlRequest->viewport_width)) {
            $command .= "--viewportWidth={$urlRequest->viewport_width} ";
        }

        if (isset($urlRequest->mobile)) {
            $command .= '--mobile=true ';
        }

        if (isset($urlRequest->media_type_print)) {
            $command .= '--mediaTypePrint=true ';
        }

        if (isset($urlRequest->user_agent)) {
            $command .= "--userAgent={$urlRequest->user_agent}";
        }

        return $command;
    }
}
Here is the nodejs script:
require('puppeteer-screenshots').init();
puppeteer-screenshots (the Node.js script I am trying to call) is a package I've written for taking screenshots easily using Google's Puppeteer package. It's open source and available on Bitbucket; here is a link to the one and only file. The line in this package that seems to hang the process is line 138, or, to be specific, this:
await page.screenshot({path: config.path, fullPage: true});
When this line is removed, the process finishes, but obviously it doesn't save the screenshot.
I've also tried rewriting the code with exec instead of Process, but in that case the script just hangs and doesn't finish either.
Also, when executing the exact same command from the command line, everything works as expected.
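One thing worth ruling out (not part of the question above, just a common cause of this exception): Symfony's Process component enforces a 60-second timeout by default and throws ProcessTimedOutException when it is exceeded. A minimal sketch of raising or disabling it, assuming the screenshot command itself does eventually complete:

$process = new Process($command);
// Allow up to 5 minutes; pass null instead to disable the timeout entirely
$process->setTimeout(300);
// Optionally also fail if the process produces no output for 60 seconds
$process->setIdleTimeout(60);
$process->run();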

Download files from connected FTP with Node.js

I'm rushing to find a solution to my problem!
At the moment I'm connected to an FTP server and have selected 5 files to download, but I can't get the correct directory for both the local and the remote path.
On the remote FTP server, I am inside a folder named OUT, and this folder contains the 5 files I want to download.
Locally, I want to download them to my project folder > files.
Here is my code:
ftps.cd('./OUT')
    .raw('ls')
    .exec(function(err, res){
        var arr = [];
        var items = res.data.split('\n')//.map(String);
        var item;
        //console.log(items);
        for (i in items) {
            item = items[i];
            if (item.length > 0) {
                var linha = item.toString().slice(46);
                var date = linha.toString().substr(0, 12);
                var date_timestamp = moment([2018, months[date.substr(0,3)], date.substr(7,2), date.substr(5,2), date.substr(10,2)]).format('x');
                var nome_ficheiro = linha.toString().slice(13);
                // console.log(nome_ficheiro)
                // split files
                //-rwxrwxrwx 1 owner group 79563 Jul 17 10:44 STKHYN_20180713170017.csv
                var fl = [];
                // fl['datea'] = moment([2018, months[date.substr(0,3)], date.substr(7,2), date.substr(5,2), date.substr(10,2)]);
                fl['date'] = date_timestamp;
                fl['filename'] = nome_ficheiro;
                arr.push(fl);
            }
        }
        // filter array
        arr = _.sortBy(arr, "date").reverse();
        // first 5 files
        arr = arr.splice(0, 5);
        // get files
        for (i in arr) {
            ftps.put("files/text.txt", "/");
            ftps.get(arr[i]['filename']);
        }
        console.log(arr);
    });
Could you guys please help me?

How to get directory size in node.js without recursively going through directory?

How do I get the size of a directory in node.js without recursively going through all the children in a directory?
E.g.
var fs = require('fs');
fs.statSync('path/to/dir');
This returns an object like the following:
{ dev: 16777220,
  mode: 16877,
  nlink: 6,
  uid: 501,
  gid: 20,
  rdev: 0,
  blksize: 4096,
  ino: 62403939,
  size: 204,
  blocks: 0,
  atime: Mon May 25 2015 20:54:53 GMT-0400 (EDT),
  mtime: Mon May 25 2015 20:09:41 GMT-0400 (EDT),
  ctime: Mon May 25 2015 20:09:41 GMT-0400 (EDT) }
But the size property is not the size of the directory and its children (i.e. the sum of the files inside it).
Is there no way to get the size of a dir (with the sizes of the files inside it included) without recursively finding the sizes of the children (and then summing those up)?
I'm basically trying to do the equivalent of du -ksh my-directory, but if the given directory is really large (e.g. /) then it takes forever to recursively get the true dir size.
I use this simple async/await + fs Promises API (Node.js v14+) solution... It doesn't depend on external libraries or on spawning new processes, which is nice:
const path = require('path');
const { readdir, stat } = require('fs/promises');

const dirSize = async directory => {
    const files = await readdir( directory );
    const stats = files.map( file => stat( path.join( directory, file ) ) );
    return ( await Promise.all( stats ) ).reduce( ( accumulator, { size } ) => accumulator + size, 0 );
}
Usage:
( async () => {
    const size = await dirSize( '/path/to/directory' );
    console.log( size );
} )();
This doesn't use any loop constructs to recurse through the directory, although it is mapping/reducing arrays. The other solutions just abstract recursion behind NPM packages/C code, so it should be all good...
Update: I had used the above solution to get the size of a directory without recursively going through child directories, based on my previous use case... Also, on reading the question again, it is clear that the original poster wanted the sizes of child directories as well.
The following should do the trick if anyone is looking for that; however, it technically doesn't fit the bill of avoiding recursion. Thanks for the comment, @Inigo!
const { readdir, stat } = require('fs/promises');
const { join } = require('path');

const dirSize = async dir => {
    const files = await readdir( dir, { withFileTypes: true } );
    const paths = files.map( async file => {
        const path = join( dir, file.name );
        if ( file.isDirectory() ) return await dirSize( path );
        if ( file.isFile() ) {
            const { size } = await stat( path );
            return size;
        }
        return 0;
    } );
    return ( await Promise.all( paths ) ).flat( Infinity ).reduce( ( i, size ) => i + size, 0 );
}
Usage:
( async () => {
    const size = await dirSize( '/path/to/directory' );
    console.log( size );
} )();
You can spawn a du command on your target directory, but as you said it can be rather slow the first time. What you might not know is that du results seem to be cached somehow:
$ time du -sh /var
13G /var
du -sh /var 0.21s user 0.66s system 9% cpu 8.930 total
$ time du -sh /var
13G /var
du -sh /var 0.11s user 0.34s system 98% cpu 0.464 total
It initially took 8.9s and then only 0.46s.
Hence, if your directories do not change too often, just going with du might be the easiest solution.
Another solution is to add a cache layer: watch your root directory for changes, compute the size of the folder, store it in a cache, and serve it from there when needed. To do this you could use the watch functionality of Node.js, but you'll run into some cross-platform issues, so a library like chokidar might be helpful.
fast-folder-size uses Sysinternals DU on Windows and the built-in du program on other platforms to quickly compute a folder size.
Installation
npm i fast-folder-size
Usage
const fastFolderSize = require('fast-folder-size')

fastFolderSize('.', (err, bytes) => {
    if (err) {
        throw err
    }
    console.log(bytes)
})
You should try the "get-folder-size" Node module:
https://www.npmjs.com/package/get-folder-size
Usage
getFolderSize(folder, [regexIgnorePattern], callback)
Example:
var getSize = require('get-folder-size');

getSize(myFolder, function(err, size) {
    if (err) { throw err; }
    console.log(size + ' bytes');
    console.log((size / 1024 / 1024).toFixed(2) + ' Mb');
});
