I am trying to create ejabberd users using a PHP script.
The following script works perfectly on my local system (Ubuntu 14.04):
<?php
error_reporting(E_ALL);
ini_set('display_errors', '-1');

$username = 'userx1';
$password = '123456';
$node = 'localhost';

// Run ejabberdctl on the remote host over SSH; the exit code lands in $status,
// the command output (one array element per line) in $output.
exec('ssh -t -t <hostname> ejabberdctl register '.$username.' '.$node.' '.$password.' 2>&1', $output, $status);

if ($status === 0)
{
    echo "User created successfully.";
}
else
{
    // Failure, $output has the details
    echo '<pre>';
    foreach ($output as $o)
    {
        echo $o."\n";
    }
    echo '</pre>';
}
But when I try to run it on the server (CentOS), it gives me the following error:
Host key verification failed.
I tried some solutions, such as:
https://askubuntu.com/questions/45679/ssh-connection-problem-with-host-key-verification-failed-error
https://www.youtube.com/watch?v=IJj0uD7EgGk
but with no success.
Any reference would be very helpful. Thanks in advance.
Found a better way to create users on the ejabberd server: mod_rest.
https://github.com/processone/ejabberd-contrib/tree/master/mod_rest
It lets you make REST calls to the server.
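As a rough illustration, a register call could then be made from PHP with cURL along these lines. The port (5285), the /rest path and the plain-text command body are assumptions based on the mod_rest README defaults; check your ejabberd.yml (request_handlers and allowed_commands) for the exact values on your install.
<?php
// Minimal sketch, untested: create an ejabberd user through mod_rest instead of SSH.
// The URL and the body format below are assumptions -- adjust to your mod_rest setup.
$username = 'userx1';
$password = '123456';
$node = 'localhost';

$ch = curl_init('http://localhost:5285/rest');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, "register $username $node $password");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
if ($response === false) {
    echo 'mod_rest call failed: ' . curl_error($ch);
} else {
    echo $response;
}
curl_close($ch);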
I have moved my XAMPP server to Raspbian for image uploading. I can access the server and the folder via the browser and can see that the image was registered in the DB, but unfortunately there is no image in that folder.
If you need more specific information, let me know.
My phone app: https://gist.github.com/DoIReallyNeedIt/a068c69958f269ad1d1a373d9ec8bcdb
Connection to server:
<?php
$user_name = "root";
$user_pass = "root";
$host_name = "localhost";
$db_name = "mydb";

$con = mysqli_connect($host_name, $user_name, $user_pass, $db_name);
if ($con)
{
    // The image arrives as a base64-encoded string from the phone app
    $image = $_POST["image"];
    $name = date('Y').date('m').date('d')."_".date('H').date('i')."_".$_POST["name2"]."_".$_POST["name1"];

    $sql = "insert into imageinfo(name) values ('$name')";
    $upload_path = "uploads/$name.jpg";

    if (mysqli_query($con, $sql))
    {
        // Decode the image and write it to the uploads folder
        file_put_contents($upload_path, base64_decode($image));
        echo json_encode(array('response' => 'Nuotrauka buvo sėkmingai įkelta'));  // "The photo was uploaded successfully"
    }
    else
    {
        echo json_encode(array('response' => 'Nuotraukos įkelti nepavyko'));       // "The photo upload failed"
    }

    mysqli_close($con);
}
?>
Found the problem. For some reason the chmod permissions weren't saved for the folders.
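If anyone else runs into this, a small guard in the upload script makes the missing permissions visible instead of failing silently. A minimal sketch, assuming the same uploads folder as in the script above (the 0775 mode is only an example):
<?php
// Hypothetical guard for the upload script: make sure the uploads folder
// exists and is writable before decoding and saving the image.
$upload_dir = "uploads";

if (!is_dir($upload_dir)) {
    mkdir($upload_dir, 0775, true); // create it if it is missing
}

if (!is_writable($upload_dir)) {
    chmod($upload_dir, 0775); // or log the problem so it shows up
}
?>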
We're building out an API service for a Node application that's running on Google App Engine. Currently, I have set up Passport to use the 'passport-http-bearer' strategy to handle browserless HTTP requests to our API. This takes a token from the Authorization header of the request and uses that to authenticate it.
We're also building out an on-prem Python program that will request a token from Google, which we will send to the Node app to make an API call. Based on what I've seen around the web, it seems like the best way to do this is to use a service account that is associated with the GCP project. Unfortunately, all the tutorials I've seen use the service account credentials to make authorized calls to Google APIs. I would like to use the service account credentials to make authorized calls to our application's API. My problem is that I can't seem to find any code to take the bearer token from the request and then check it against the service account to say either "Yes, this was generated from the right account" or "No, this request should be rejected". Any insights into how to bridge this gap would be very helpful. Currently my (initial, very poor) bearer strategy is:
passport.use(new BearerStrategy((token, done) => {
    console.log('Bearer called with token: ', token);
    if (token === '<Fake test token for SO>') {
        console.log(' valid token!');
        return done(null, { name: 'api_service' });
    }
    console.log(' invalid token...');
    return done(null, false);
}));
We ended up using an HTTPS request directly to the Google auth endpoint. Here's the code:
// Bearer token strategy for headless requests. This is used to authenticate API calls
passport.use(new BearerStrategy((token, done) => {
    // forming the request to hit the google auth endpoint
    const options = {
        host: 'www.googleapis.com',
        path: `/oauth2/v1/tokeninfo?access_token=${token}`,
        headers: {
            Accept: 'application/json'
        }
    };
    // ask google endpoint if the token has the service account's email
    https.get(options, (res) => {
        res.setEncoding('utf8');
        res.on('data', (chunk) => {
            if (JSON.parse(chunk).email === config.get('SVCACCT_NAME')) {
                // good request from the service account
                return done(null, { name: 'api_service' });
            }
            // not the service account
            return done(null, false);
        });
    }).on('error', (err) => {
        console.log('Got API auth error: ', err.message);
        // error or bad token. Either way reject it
        return done(err, false);
    });
}));
We used a shell script with the service account JSON file from the project console to generate the token for testing purposes (this won't run on a Mac; I had to use a Docker container with jq installed). Later we'll translate this to Python:
#!/bin/bash
if [ -z "${1}" ]; then
    PROG=$( basename $0 )
    echo "usage: ${PROG} <JSON account file>"
    exit 1
fi

keyfile="${1}"
client_email=$( jq -r '.client_email' $keyfile )
if [ -z "${client_email}" ]; then
    echo "JSON file does not appear to be valid"
    exit 2
fi

private_key=$( jq '.private_key' $keyfile | tr -d '"' )
if [ -z "${private_key}" ]; then
    echo "JSON file does not appear to be valid"
    exit 3
fi
keyfile=$( mktemp -p . privkeyXXXXX )
echo -e $private_key > $keyfile
now=$( date "+%s" )
later=$( date -d '+30 min' "+%s" )
header=$( echo -n "{\"alg\":\"RS256\",\"typ\":\"JWT\"}" | base64 -w 0 )
claim=$( echo -n "{ \"iss\":\"${client_email}\", \"scope\":\"email profile\", \"aud\":\"https://www.googleapis.com/oauth2/v4/token\", \"exp\":${later}, \"iat\":${now} }" | base64 -w 0 )
data="${header}.${claim}"
sig=$( echo -n $data | openssl dgst -sha256 -sign $keyfile -keyform PEM -binary | base64 -w 0 )
rm -f $keyfile
stuff=$( echo "${header}.${claim}.${sig}" | sed 's!\/!%2F!g' | sed 's/=/%3D/g' | sed 's/\+/%2B/g' )
curl -d "grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=${stuff}" https://www.googleapis.com/oauth2/v4/token
Google offers the google.oauth2.id_token module to help verify tokens.
verify_oauth2_token can be used to check a Google token:
verify_oauth2_token(id_token, request, audience=None)
Verifies an ID Token issued by Google’s OAuth 2.0 authorization server.
[ ... ]
Returns: The decoded token.
I am trying to automatically back up my database from cPanel using a cron job. I want to send the database to my email address when the cron job runs. I have written the command below, but it is still not working.
mysqldump -e --user=username --password='password' dbname | gzip | uuencode sql_backup.gz | mail example@example.com
In my email, when the cron job runs, I get this message:
/usr/local/cpanel/bin/jailshell: mail: command not found
mysqldump: Got errno 32 on write
I have been referring to this article: Automatic MySQL Backup.
I hope you understand my question and can help me.
I have also tried with curl, but it is still not working. These are the steps I followed.
First Step: Created the mail_alert.sh file and added the code below.
#!/bin/bash
curl --url "smtps://smtp.gmail.com:465" --ssl-reqd \
--mail-from "example@gmail.com" --mail-rcpt "example@gmail.com" \
--upload-file mail.txt --user "example@gmail.com:mypassword" --insecure
Second Step: Created mail.txt and added the content below.
From: "Name" <example@gmail.com>
To: "Name" <example@gmail.com>
Subject: Backup completed
The backup has been completed.
Third Step: Ran the following on the command line.
mysqldump -e --user=username --password='password' dbname | gzip | uuencode sql_backup.gz | sh public_html/sql/mail_alert.sh
After this, I get the following message in my mail:
curl: option --ssl-reqd: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
mysqldump: Got errno 32 on write
It looks like mail is not available for you to use or is not installed.
Another option to consider is using curl to send emails, as described here: https://stackoverflow.com/a/16069786/280842
Here's how you could implement this, using code from the link above:
mail_alert.sh file contents
#!/bin/bash
curl --url "smtps://smtp.gmail.com:465" --ssl-reqd \
--mail-from "username@gmail.com" --mail-rcpt "john@example.com" \
--upload-file mail.txt --user "username@gmail.com:password" --insecure
mail.txt file contents
From: "User Name" <username@gmail.com>
To: "John Smith" <john@example.com>
Subject: Backup completed
The backup has been completed.
It's considered a bad security practice to pass account credentials through command-line arguments. The above example is for demo purposes only.
Then add your newly created script to your existing cron job:
mysqldump -e --user=username --password='password' dbname | gzip | uuencode sql_backup.gz | sh /home/myuser/mail_alert.sh
OK, I will show you how to create a PHP script that backs up a MySQL database WITHOUT phpMyAdmin and then attaches the .sql file to an email.
Today I needed to create a little script that backed up a database and then sent it in an email. I found the best way to do this was to use the mysqldump program. Usually you have permission to run this program even on a reseller hosting package.
On Linux the program is usually located at:
CODE:
/usr/bin/mysqldump
OK, so let's get started.
First of all we need to set up our variables containing the MySQL credentials, the email addresses to send to, the path to store the .sql file, and the absolute path to the mysqldump program.
CODE:
ini_set("memory_limit","250M"); // We don't want any nasty memory error messages
$SendTo[] = 'myemailaddress@thephpanswers.com'; // This is your email address; you can copy this line and add another recipient
$path = '/home/website/public_html/backupSQL/sql/'; // Absolute path where the .sql files are saved - place a .htaccess here to stop users browsing the directory
$tmpFilename = time() .'_mysql.sql'; // Temporary filename for the dump; needs to be different every time
define('mysqlUser','mysqlusername'); // This is the username for the MySQL database
define('mysqlPass','mysqlpassword'); // Password for the username
define('mysqlDatabase','mysqldatabase'); // The database you wish to backup
define('mysqldump','/usr/bin/mysqldump'); // The absolute path to the mysqldump program
Using mysqldump to back up the MySQL database:
mysqldump is very easy to use; for more information visit http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
Now we just add a shell_exec call to tell mysqldump to back up the database.
CODE:
shell_exec(mysqldump . ' -u ' . mysqlUser . ' -p' . mysqlPass . ' ' . mysqlDatabase . ' > ' . $path . $tmpFilename); // The dump is redirected to $path . $tmpFilename, built from the variables set above
You can now run this script and see if it actually creates the .sql file in the folder you specified.
Sending an attachment in PHP
OK, so we know our file is located at $path . $tmpFilename, so let's get on with the complicated part: emailing the attachment.
CODE:
$from = "Backup <backup@domain.co.uk>"; // Who the email is coming from
$subject = 'MySQL Database backup'; // The subject of the email
$absoluteFile = $path . $tmpFilename; // Keep it simple, creates a variable of where the file is
$fileType = 'text/plain'; // Content type of the attachment
$mailBodyText = '<h1>MySQL Database attached</h1>'; // Our HTML body of the email
$mineBoundaryStr = md5(time()); // Needs to be random for the MIME boundary

// Advanced headers from http://xahlee.org/php/send_mail_attachment.html
$headers = <<<EEEEEEEEEEEEEE
From: $from
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="$mineBoundaryStr"
EEEEEEEEEEEEEE;

$mailBodyEncodedText = <<<TTTTTTTTTTTTTTTTT
This is a multi-part message in MIME format.

--{$mineBoundaryStr}
Content-Type: text/html; charset=UTF-8
Content-Transfer-Encoding: quoted-printable

$mailBodyText

TTTTTTTTTTTTTTTTT;

$file = fopen($absoluteFile, 'rb');
$data = fread($file, filesize($absoluteFile));
fclose($file);
$data = chunk_split(base64_encode($data)); // Base64-encode the .sql file for the attachment

$mailBodyEncodedText .= <<<FFFFFFFFFFFFFFFFFFFFF
--$mineBoundaryStr
Content-Type: $fileType; name="$tmpFilename"
Content-Disposition: attachment; filename="$tmpFilename"
Content-Transfer-Encoding: base64

$data
--$mineBoundaryStr--
FFFFFFFFFFFFFFFFFFFFF;

foreach ($SendTo as $k => $v) { // Loop through all our recipients
    mail($v, date("H:i - jS \of F Y") . ' ' . $subject, $mailBodyEncodedText, $headers); // Send the emails
}
So let's put the whole script together; you should have this:
CODE:
<?php
ini_set("memory_limit","250M"); // We don't want any nasty memory error messages
$SendTo[] = 'myemailaddress@thephpanswers.com'; // This is your email address; you can copy this line and add another recipient
$path = '/home/website/public_html/backupSQL/sql/'; // Absolute path where the .sql files are saved - place a .htaccess here to stop users browsing the directory
$tmpFilename = time() .'_mysql.sql'; // Temporary filename for the dump; needs to be different every time
define('mysqlUser','mysqlusername'); // This is the username for the MySQL database
define('mysqlPass','mysqlpassword'); // Password for the username
define('mysqlDatabase','mysqldatabase'); // The database you wish to backup
define('mysqldump','/usr/bin/mysqldump'); // The absolute path to the mysqldump program

shell_exec(mysqldump . ' -u ' . mysqlUser . ' -p' . mysqlPass . ' ' . mysqlDatabase . ' > ' . $path . $tmpFilename); // The dump is redirected to $path . $tmpFilename, built from the variables set above

$from = "Backup <backup@domain.co.uk>"; // Who the email is coming from
$subject = 'MySQL Database backup'; // The subject of the email
$absoluteFile = $path . $tmpFilename; // Keep it simple, creates a variable of where the file is
$fileType = 'text/plain'; // Content type of the attachment
$mailBodyText = '<h1>MySQL Database attached</h1>'; // Our HTML body of the email
$mineBoundaryStr = md5(time()); // Needs to be random for the MIME boundary

// Advanced headers from http://xahlee.org/php/send_mail_attachment.html
$headers = <<<EEEEEEEEEEEEEE
From: $from
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="$mineBoundaryStr"
EEEEEEEEEEEEEE;

$mailBodyEncodedText = <<<TTTTTTTTTTTTTTTTT
This is a multi-part message in MIME format.

--{$mineBoundaryStr}
Content-Type: text/html; charset=UTF-8
Content-Transfer-Encoding: quoted-printable

$mailBodyText

TTTTTTTTTTTTTTTTT;

$file = fopen($absoluteFile, 'rb');
$data = fread($file, filesize($absoluteFile));
fclose($file);
$data = chunk_split(base64_encode($data)); // Base64-encode the .sql file for the attachment

$mailBodyEncodedText .= <<<FFFFFFFFFFFFFFFFFFFFF
--$mineBoundaryStr
Content-Type: $fileType; name="$tmpFilename"
Content-Disposition: attachment; filename="$tmpFilename"
Content-Transfer-Encoding: base64

$data
--$mineBoundaryStr--
FFFFFFFFFFFFFFFFFFFFF;

foreach ($SendTo as $k => $v) { // Loop through all our recipients
    mail($v, date("H:i - jS \of F Y") . ' ' . $subject, $mailBodyEncodedText, $headers); // Send the emails
}
?>
You really should protect the directory you choose for the .sql files. This can be done by creating a file named .htaccess and saving it in the directory. The contents of the .htaccess file are as follows:
CODE:
order allow,deny
deny from all
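Note: order allow,deny / deny from all is Apache 2.2 syntax. If your host runs Apache 2.4, the equivalent .htaccess content (to the best of my knowledge) is:
CODE:
Require all denied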
You can now automate this with a cron job; set the cron job to run the script every day at midnight.
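For example, a crontab entry along the lines below would run it every night at midnight. The path to the PHP binary and the script name (backup.php) are only placeholders; point them at wherever you saved this script on your host.
CODE:
0 0 * * * /usr/bin/php /home/website/public_html/backupSQL/backup.php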
I hope this helps some people out there!
PS: After using this script I realised there is no real security on the .sql dumps. Encrypting the .sql files with openssl or something similar before emailing them adds a much-needed layer of protection.
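As a rough sketch of that idea, the dump could be encrypted with the openssl command-line tool right after it is created and the resulting .enc file attached instead of the plain .sql file. The passphrase handling below is for illustration only (hard-coding a secret in the script has its own problems), and the variable names are simply the ones from the script above.
CODE:
// Hypothetical addition: encrypt the dump before building the email.
$encryptedFile = $absoluteFile . '.enc';
$passphrase = 'change-me'; // placeholder - do not keep real secrets in the script

shell_exec('openssl enc -aes-256-cbc -salt -in ' . escapeshellarg($absoluteFile) .
           ' -out ' . escapeshellarg($encryptedFile) .
           ' -pass pass:' . escapeshellarg($passphrase));

// Attach $encryptedFile instead of $absoluteFile when building the MIME body.
// To decrypt on your own machine:
// openssl enc -d -aes-256-cbc -in backup.sql.enc -out backup.sql -pass pass:change-me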
I'm trying to scale my Azure SQL DB with PHP. All the other SQL statements work fine, but when I'm sending
ALTER DATABASE db1_abcd_efgh MODIFY (EDITION = 'Web', MAXSIZE=5GB);
I get an error like this:
User must be in the master database.
My database URL is:
xaz25jze9d.database.windows.net
and the database is named:
db1_abcd_efgh
function skale_a_m() {
    $host = "tcp:xaz25jze9d.database.windows.net,1433\sqlexpress";
    $user = "db_user";
    $pwd = "xxxxx?!";
    $db = "master"; // I have tried out db1_abcd_efgh at this point

    try {
        $conn = new PDO("sqlsrv:Server= $host ; Database = $db ", $user, $pwd);
        $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    } catch (Exception $e) {
    }

    $string = 'use master; ALTER DATABASE db1_a_m MODIFY (EDITION =\'Web\', MAXSIZE=5GB)';
    $stmt = $conn->query($string);
}
Now I have modified my function like this:
function skale_a_m() {
    $serverName = "tcp:yq6ipq11b4.database.windows.net,1433";
    $userName = 'db_user@yq6ipq11b4';
    $userPassword = 'xxxxx?!';

    $connectionInfo = array("UID" => $userName, "PWD" => $userPassword, "MultipleActiveResultSets" => true);
    $conn = sqlsrv_connect($serverName, $connectionInfo);
    if ($conn === false) {
        echo "Failed to connect...";
    }

    $string = "ALTER DATABASE master MODIFY (EDITION ='Web', MAXSIZE=5GB)";
    $stmt = sqlsrv_query($conn, $string);
}
Now I get no errors, but the DB did not scale.
According to ALTER DATABASE (Windows Azure SQL Database), the ALTER DATABASE statement has to be issued when connected to the master database.
With PDO, this can be achieved by a connection string such as:
"sqlsrv:server=tcp:{$server}.database.windows.net,1433; Database=master"
Sample code:
<?php
function scale_database($server, $username, $password, $database, $maxsize) {
    try {
        $conn = new PDO("sqlsrv:server=tcp:{$server}.database.windows.net,1433; Database=master", $username, $password);
        $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $conn->setAttribute(constant('PDO::SQLSRV_ATTR_DIRECT_QUERY'), true);
        $conn->exec("ALTER DATABASE {$database} MODIFY (MAXSIZE={$maxsize}GB)");
        $conn = null;
    }
    catch (Exception $e) {
        die(print_r($e));
    }
}

scale_database("yourserver", "youruser", "yourpassword", "yourdatabase", "5");
?>
Note: It's not necessary to set the edition; it will be set according to the max size.
To test the sample code, configure it with your details (server name, login, password and database to be scaled) and execute it with PHP configured with the Microsoft Drivers 3.0 for PHP for SQL Server.
After that, refresh (Ctrl+F5) the Windows Azure Management Portal and you should see the new max size reflected on the Scale tab of the database.
You can also verify that it worked by using a tool to connect to the scaled database (not to the master database) and issuing this command:
SELECT CONVERT(BIGINT,DATABASEPROPERTYEX ('yourdatabase', 'MAXSIZEINBYTES'))/1024/1024/1024 AS 'MAXSIZE IN GB'
$string = 'use master; ALTER DATABASE db1_a_m MODIFY (EDITION =\'Web\', MAXSIZE=5GB)'
I'm pretty sure SQL Azure does not support switching databases with the USE command.
Try connecting directly to the master db in your connection string, and remove the USE master statement from the start of your query.
$host = "tcp:xaz25jze9d.database.windows.net,1433\sqlexpress";
That also looks wrong to me. You shouldn't have a named instance called sqlexpress at the end of your server address, as far as I know.
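Putting those two points together, the connection part of the original function would look roughly like this (the server and database names are the ones from the question; this is only a sketch):
// Connect straight to the master database, with no named instance on the host
$host = "tcp:xaz25jze9d.database.windows.net,1433";
$conn = new PDO("sqlsrv:Server=$host;Database=master", $user, $pwd);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Issue ALTER DATABASE on its own, without the leading "use master;"
$conn->exec("ALTER DATABASE db1_abcd_efgh MODIFY (EDITION = 'Web', MAXSIZE=5GB)");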
I am creating a function to protect my admin pages if a user is not logged in. Am I doing it the correct way? The following function is included on all my admin pages.
What else should I check for before giving access to my admin pages, to make them secure?
function is_logged_in_admin()
{
    $CI =& get_instance();

    $is_logged_in = $CI->session->userdata('is_logged_in');
    $username = $CI->session->userdata('username');
    $status = $CI->session->userdata('status');

    if (!isset($is_logged_in) || $is_logged_in != true)
    {
        redirect('auth/login', location);
    }
    if (!$username == 'myeswr')
    {
        redirect('auth/login', location);
    }
    if (!$status == '1')
    {
        redirect('auth/resend_activation', location);
    }
}
With the code you have here, there is a possibility of granting permission without intending to. It's not likely, but if for some reason there is a logic error (not a syntax error) somewhere in your code for if #1, you are not redirected, and the other two checks fail as well.
I suggest using if.. elseif.. elseif.. else, with the final else being a redirect to login as a failsafe; see the sketch below.
You may also want to check the length of the login session (or just use CI's built-in session length).
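A minimal sketch of that structure, using the session values from the question (note the sketch also uses plain !== comparisons, since !$username == 'myeswr' negates $username before the comparison is made):
function is_logged_in_admin()
{
    $CI =& get_instance();

    $is_logged_in = $CI->session->userdata('is_logged_in');
    $username = $CI->session->userdata('username');
    $status = $CI->session->userdata('status');

    if ($is_logged_in === true && $username === 'myeswr' && $status === '1') {
        // Every check passed; let the admin page load
        return true;
    } elseif ($is_logged_in === true && $username === 'myeswr' && $status !== '1') {
        // Valid admin login, but the account has not been activated yet
        redirect('auth/resend_activation', 'location');
    } elseif ($is_logged_in !== true) {
        // Not logged in at all
        redirect('auth/login', 'location');
    } else {
        // Failsafe: anything that slips through the checks above goes back to login
        redirect('auth/login', 'location');
    }
}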