Multi cURL function not working in XAMPP localhost - multithreading

XAMPP Version: 1.8.1
PHP Version: 5.4.7
I'm using the following cURL class: http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading
Class source code: http://paste2.org/XGeMUMme
In XAMPP, when I attempt to do a multi curl session, the page never loads and my CPU usage goes up and doesn't stop until I end the Apache process; however, when I run the code on my webhost, it works perfectly.
The following works in XAMPP because it's a single-threaded cURL execution:
$curl = new CURL();
$opts = array( CURLOPT_RETURNTRANSFER => true, CURLOPT_FOLLOWLOCATION => true );
$curl->addSession( 'http://yahoo.com/', $opts );
$result = $curl->exec();
$curl->clear();
However, when the following code is run (adding just one more session, which makes it multi-threaded), the page never loads, as described above.
$curl = new CURL();
$opts = array( CURLOPT_RETURNTRANSFER => true, CURLOPT_FOLLOWLOCATION => true );
$curl->addSession( 'http://yahoo.com/', $opts );
$curl->addSession( 'http://google.com/', $opts );
$result = $curl->exec();
$curl->clear();
The above code does work on my webhost, though.
Any ideas and/or solutions regarding this problem? Thank you!

For multi cURL, curl_multi_select() always returns -1 on some systems (a known PHP bug), which leaves the exec loop spinning at full CPU until the maximum execution time is exceeded. You should add your own sleep instead, e.g. usleep(100).
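A minimal sketch of that work-around using the underlying curl_multi_* calls directly, rather than the class linked above (the URLs are just the ones from the question):

$mh = curl_multi_init();
foreach ( array( 'http://yahoo.com/', 'http://google.com/' ) as $url ) {
    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true );
    curl_multi_add_handle( $mh, $ch );
}
$running = null;
do {
    curl_multi_exec( $mh, $running );
    // On affected systems curl_multi_select() keeps returning -1,
    // so sleep briefly ourselves instead of busy-waiting at 100% CPU.
    if ( curl_multi_select( $mh, 0.1 ) === -1 ) {
        usleep( 100 );
    }
} while ( $running > 0 );
// (collect the responses with curl_multi_getcontent() once the loop finishes)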

Related

Curl command is not processing changes I make to file

I am trying to test parsing of a zip file in node.js using curl from the command line. Originally, I had a route that looks like this:
app.post('/processZip', (req, res) => {
  const zip = req.file
  console.log(req)
  extractCSVFilesFromZip(zip, '/tmp/connections', '/tmp/messages')
  const connectionsOutputPath = '/tmp/connections'
  const messagesOutputPath = '/tmp/messages'
  console.log(`Size of Parsed Connections File: ${connectionsOutputPath.size}`)
  console.log(`Size of Parsed Messages File: ${messagesOutputPath.size}`)
  res.send('success!')
})
which calls a function that looks like this:
const extractCSVFilesFromZip = (zipFilePath, connectionsCSVOutputPath, messagesCSVOutputPath) => {
  console.log(zipFilePath)
  fs.createReadStream(zipFilePath)
    .pipe(unzip.Parse())
    .on('entry', entry => {
      const [fileName, size] = [entry.path, entry.size]
      if (fileName === 'Connections.csv') {
        console.log(`Size of Connections File to Parse: ${size}`)
        entry.pipe(fs.createWriteStream(connectionsCSVOutputPath))
      } else if (fileName === 'Messages.csv') {
        console.log(`Size of Messages File to Parse: ${size}`)
        entry.pipe(fs.createWriteStream(messagesCSVOutputPath))
      } else {
        entry.autodrain()
      }
    })
}
I am using this curl command to test the request:
curl -F file=@../../../Downloads/Basic_LinkedInDataExport_09-14-2018.zip http://localhost:5000/processZip/
Originally, it gave me an error pointing to the first instance of createReadStream in the function, so I commented out all the code and just tried to console.log(zipFilePath) to see what was being sent. But I still get the same error. In fact, I can comment out, remove, or change any of the code in either the route or the file, and it makes no difference: I still get the same error. It's as if curl is sending the request to a cached version of the files and not picking up the changes I'm making. But if I examine the files from the command line with sudo nano, I can see the updated versions.
What could be causing this issue? I have saved the files and restarted the server each time. Could it be that I need to wait longer than usual for the changes to be processed because this is a larger codebase than I'm used to working in, or is something else to blame? For what it's worth, the servers are being run by forever. Thanks in advance for any help!
Okay, I figured it out: there was a ghost Node process still listening on port 5000. killall -9 node did the trick!
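For anyone hitting the same thing, the standard way to find such a ghost process (generic tooling, not from the original answer) is to check what is still bound to the port:

# see which process is still listening on port 5000
lsof -i :5000
# kill it by PID, or more bluntly kill every node process
kill -9 <PID>
killall -9 node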

Module cwd assistance

I created a Perl module that is to be used in many Perl scripts to use Net::SSH::Expect
to do a login.
package myRoutines;
#
use v5.22;
use strict;
use warnings;
use Net::SSH::Expect;
use Exporter qw(import);
our @EXPORT_OK = qw(my_login);

sub my_login {
    my $user         = 'xxxx';
    my $port         = '10000';
    my $passwd       = 'XYZ';
    my $adminServer  = 'myServer';
    my $rootpassword = 'ABCDEF';
    my ( $pName, $vName ) = @_;
    our $ssh = Net::SSH::Expect->new(
        host       => "$adminServer",
        ssh_option => "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null",
        user       => "$user",
        password   => "$passwd",
        port       => "$port",
        raw_pty    => 1,
        restart_timeout_upon_receive => 1,
        log_file   => "/var/tmp/clilog_$pName$vName"
    );
    eval {
        my $login_output = $ssh->login();
        if ( $login_output !~ />/ ) {
            die "Login has failed.\nLogin output was $login_output";
        }
    };
    return $ssh;
}

1;
The scripts will do:
use myRoutines qw(my_login);
our ( $ssh, $pName, $vName );
$pName = 'abc';
$vName = '123';
$ssh = my_login( $pName, $vName );
$ssh->send( "some command\r" );
This all works if I'm in the directory that the script and module are in. If I'm in any other directory, the new call works but the call to $ssh->send does nothing.
I've tried adding to my script:
use lib '/some/dir';
(where the .pm file resides) to force it to find the module, and that seems to work when I'm not in the directory where the .pm file resides.
I've tried to:
use File::chdir;
$CWD = '/some/dir';
and again, the login seems to work but the next send does nothing. So I'm at a loss as to what might be happening and would like some advice.
Update 20170908:
Upon further playing and following the suggestions made, I've done the following and it now works:
- removed the eval, as it was unnecessary
- changed the our declarations to my
- removed the unnecessary double quotes
- set the following in the script:
use File::Basename;
use Cwd qw( abs_path );
chdir "/some/dir";
use lib dirname( abs_path($0) );
my $scriptName = basename($0);
use myRoutines qw(my_login);
my $pName = substr( $scriptName, 0, -3 );   # cut the .pl off the end of the script name to pass it as pName
Using chdir to change to the directory where my .pl script and .pm file live, and then setting the lib path, works as it should.
Borodin, I'm not sure I understand what you mean when you say to object-orient the module and ...., but I would be interested in hearing more to better understand.
If you don't want to hardcode the directory, you can use
use FindBin qw( $RealBin );
use lib $RealBin;
($RealBin is the path to the script. Adjust as needed if myRoutines.pm is in a subdir.)
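Putting that together with the question's code, a minimal sketch of a script that finds the module next to itself (same my_login interface as above):

#!/usr/bin/perl
use strict;
use warnings;
use FindBin qw( $RealBin );
use lib $RealBin;                # look for myRoutines.pm next to this script
use myRoutines qw( my_login );

my $ssh = my_login( 'abc', '123' );
$ssh->send( "some command\r" );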
The simple and easy way would be to place your .pm file in the
/usr/lib64/perl5/
directory; then you shouldn't have any problems.
Still, that's not a perfect solution: you should be able to put the .pm file wherever you want.
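To see where your Perl actually searches for modules (a standard one-liner, not specific to this answer), print @INC:

perl -e 'print "$_\n" for @INC'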

ReactPHP run every loop twice

I'm trying to learn a new technology, ReactPHP, but I'm stuck at the starting script. I've edited it a little, but I have a problem: when I call the React loop, the script runs twice.
I have this code:
<?php
require 'vendor/autoload.php';

$app = function ($request, $response) {
    $date = new DateTime();
    file_put_contents("data.txt", $date->getTimestamp() . ";", FILE_APPEND);
    $response->writeHead(200, array('Content-Type' => 'text/plain'));
    $response->end("Done\n");
};

$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server($loop);
$http = new React\Http\Server($socket, $loop);
$http->on('request', $app);

echo "Server running at http://127.0.0.1:1337\n";
$socket->listen(1337);
$loop->run();
and if I call http://localhost:1337/react/index.php I get this in data.txt:
1439849018;1439849018;
I'm expecting only one value.
I have tested your code, and the problem is that you're testing it in your browser. Your browser sends the request and then asks for the favicon, so the handler runs twice; in the browser's network inspector you can see them as two separate requests. Next time, try calling the script from the command line instead.
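If you do want it to behave in a browser as well, one option (a sketch, not part of the original answer; it assumes the old React\Http\Request API with its getPath() method) is to skip the favicon request in the handler:

$app = function ($request, $response) {
    // The browser fires a second request for /favicon.ico; don't log it.
    if ($request->getPath() === '/favicon.ico') {
        $response->writeHead(404);
        $response->end();
        return;
    }
    $date = new DateTime();
    file_put_contents("data.txt", $date->getTimestamp() . ";", FILE_APPEND);
    $response->writeHead(200, array('Content-Type' => 'text/plain'));
    $response->end("Done\n");
};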

SaltStack: Create a server via salt-cloud and set endpoints automatically in Azure?

I'm a bit stuck with my server deployment recipes...
So far, I can create and provision my servers via commandline:
First I run salt-cloud and afterwards I provision it via salt-ssh...
But since I'm using some non-default ports (HTTPS, ...), I also have to open input endpoints (= ports) on Microsoft Azure.
This is where I'm stuck: is there any way to tell salt-cloud to automatically execute a script that opens the endpoints?
My preferred result would be:
1. run salt-cloud => sets up the new machine and opens all necessary ports
2. run salt-ssh to provision it
I have already looked at Salt orchestration, but it seems aimed more at server fleets than at configuring a single (external) server.
Any hints?
You could write a custom bootstrap script that would open the ports that you want.
As Utah_Dave said, I wrote a ruby script to add the ports...
# ruby
# execute with: ruby startscript.rb
require 'json'

PROVIDER = "yourprovider"
SERVER   = "yourserver"
ENDPOINTS = {
  "SSH2"        => 17532,
  "HTTPS"       => 443,
  "HTTP"        => 80,
  "SlangerHTTP" => 8080,
  "Slanger"     => 4567,
  "CouchDB"     => 5984
}

# Ask salt-cloud which endpoints already exist and return the missing ones.
def get_missing
  service_existing = false
  until service_existing
    begin
      res = `salt-cloud --out=json -f list_input_endpoints #{PROVIDER} deployment=#{SERVER} service=#{SERVER}`
      result = JSON.parse(res)
      service_existing = true
    rescue => e
      puts e
    end
  end
  existing_services = result[PROVIDER]["azure"].keys
  ENDPOINTS.keys - existing_services
end

missing_services = get_missing
while missing_services.any?
  print "#{Time.now} [#{SERVER}] Services missing: #{missing_services.join(', ')}"
  missing_services.each do |m|
    print "."
    `salt-cloud --out=json -f add_input_endpoint #{PROVIDER} name=#{m} port=#{ENDPOINTS[m]} protocol=tcp deployment=#{SERVER} service=#{SERVER} role=#{SERVER}`
  end
  print "\n"
  missing_services = get_missing
end

How does a Perl socket resolve hostnames under Linux?

I have a (from what I can tell) perfectly working Linux setup (Ubuntu 8.04) where all tools (nslookup, curl, wget, Firefox, etc.) are able to resolve addresses. Yet the following code fails:
$s = new IO::Socket::INET(
    PeerAddr => 'stackoverflow.com',
    PeerPort => 80,
    Proto    => 'tcp',
);
die "Error: $!\n" unless $s;
I verified the following things:
Perl is able to resolve addresses with gethostbyname (i.e. the code below works, with inet_ntoa imported from Socket):
use Socket qw( inet_ntoa );
my $ret = gethostbyname('stackoverflow.com');
print inet_ntoa($ret);
The original source code works under Windows
This is how it is supposed to work (i.e. it should resolve hostnames), since LWP relies on this behavior (in fact, I stumbled upon the problem while trying to debug why LWP wasn't working for me)
Running the script doesn't emit DNS requests (so it doesn't even try to resolve the name). Verified with Wireshark
From a quick look, the following code from IO::Socket::INET
sub _get_addr {
    my ( $sock, $addr_str, $multi ) = @_;
    my @addr;
    if ( $multi && $addr_str !~ /^\d+(?:\.\d+){3}$/ ) {
        ( undef, undef, undef, undef, @addr ) = gethostbyname($addr_str);
    } else {
        my $h = inet_aton($addr_str);
        push( @addr, $h ) if defined $h;
    }
    @addr;
}
suggests (if you look at the caller of this code) the work-around of adding MultiHomed => 1, to your code.
Without that work-around, the above code appears to try to call inet_aton("hostname.com") using the inet_aton() from Socket.pm. That works for me in both Win32 and Unix, so I guess that is where the breakage lies for you.
See Socket.xs for the source code of inet_aton:
void
inet_aton(host)
    char * host
    CODE:
    {
        struct in_addr ip_address;
        struct hostent * phe;
        if (phe = gethostbyname(host)) {
            Copy( phe->h_addr, &ip_address, phe->h_length, char );
        } else {
            ip_address.s_addr = inet_addr(host);
        }
        ST(0) = sv_newmortal();
        if (ip_address.s_addr != INADDR_NONE) {
            sv_setpvn( ST(0), (char *)&ip_address, sizeof ip_address );
        }
    }
It appears that the Perl gethostbyname() works better than the C gethostbyname() for you.
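Applied to the question's code, the work-around is a single extra option (MultiHomed is a documented IO::Socket::INET option):

use IO::Socket::INET;

my $s = IO::Socket::INET->new(
    PeerAddr   => 'stackoverflow.com',
    PeerPort   => 80,
    Proto      => 'tcp',
    MultiHomed => 1,   # resolve via gethostbyname() instead of inet_aton()
);
die "Error: $!\n" unless $s;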
Could you perhaps tell us exactly how your code fails? You've got error-checking code in there, but you haven't reported what the error is!
I've just tried the original code (with the addition of "use IO::Socket::INET;") on my Mac OS X machine, and it works fine.
I suspect that the Multihomed option is an unnecessary hack and some other issue is the root cause of your problem.
Make sure that you have the statement
use IO::Socket::INET;
At the beginning of your source code. If you leave this out, you are probably getting the error message:
Can't locate object method "new" via
package "IO::Socket::INET"
Beyond that, you might verify that DNS is working using Net::DNS::Resolver; see more information here.
use Net::DNS;
my $res = Net::DNS::Resolver->new;
# Perform a lookup, using the searchlist if appropriate.
my $answer = $res->search('example.com');
