I'd like to deny access to example.com/phpmyadmin, or secure it with a password prompt, using .htaccess and .htpasswd.
Is that possible? If so, how can I achieve that?
I don't need to access phpMyAdmin over the web.
I'm running an OpenLiteSpeed WordPress droplet on DigitalOcean.
Thanks for any help!
Cheers,
Dan
Follow this guide.
If you don't need to visit it from a browser at all:
Just remove phpmyadmin from WebAdmin > Virtual Hosts > Context.
If you want to protect it with a password, follow Method 2 from the guide:
Log in to the SSH console and create a password file:
touch /usr/local/lsws/conf/PASS
chown lsadm:lsadm /usr/local/lsws/conf/PASS
Navigate to WebAdmin > Security
Set Realm Name = example, and User DB Location = /usr/local/lsws/conf/PASS
Click /usr/local/lsws/conf/PASS to create a user and password
Navigate to WebAdmin > Virtual Hosts > Context > phpmyadmin
Set Realm to example
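If you'd rather add users from the SSH console than through the WebAdmin UI, the User DB file is, as far as I know, an htpasswd-style list of user:hash lines, so an entry can be appended by hand. A sketch, where the user name dbadmin, the password, and the relative file path are all placeholders for illustration:

```shell
# Append an htpasswd-style "user:hash" entry to the password file.
# PASS_FILE and the credentials below are placeholders, not real values.
PASS_FILE=PASS                              # in practice: /usr/local/lsws/conf/PASS
HASH=$(openssl passwd -apr1 'S3cretPass')   # Apache MD5 hash, same scheme as htpasswd -m
printf '%s\n' "dbadmin:$HASH" >> "$PASS_FILE"
grep '^dbadmin:' "$PASS_FILE"               # show the entry that was written
```

Remember to keep the ownership as lsadm:lsadm after editing, as in the touch/chown step above.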
Best
I am trying to set up a cron job with the following command:
wget -O - -q -t 1 https://myexample.com/check/test > /dev/null
but it is not working.
When I open the same URL in a browser (https://myexample.com/check/test),
I see the message "Your connection is not private".
You will need an SSL certificate to get rid of that security warning. You could use one generated by Let's Encrypt (which is free); an alternative is to get an SSL certificate through startssl.com (also free). If you just need your cron job to run, you can tell wget to skip certificate verification:
/usr/bin/wget --no-check-certificate -O - -q -t 1 https://myexample.com/check/test > /dev/null
Accessing the same link in a web browser without a valid SSL certificate will still result in a security warning. If you do not want to buy or use a real SSL certificate, you can add an exception for that website's certificate in Firefox.
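For reference, the crontab entry for this would look something like the following (the every-5-minutes schedule is just an assumption; adjust it to your needs):

```
# m h dom mon dow   command  -- run every 5 minutes, silence all output
*/5 * * * * /usr/bin/wget --no-check-certificate -O - -q -t 1 https://myexample.com/check/test > /dev/null 2>&1
```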
I have been trying to use curl and wget to download a file from SharePoint. I plan to turn this into a script that runs automatically every day and downloads the file from the URL.
I tried curl with the following command:
curl -O --user Myusername:Mypassword https://OurDomain.sharepoint.com/_XXX&file=IPS_cleaned.xlsx&action=default
But it gave me an error about the SSL connection. I read that there is a known bug in curl 7.35, so I downgraded to 7.22, but it still gives me the same error.
I also tried wget:
wget --user=Myusername --password=MyPassword --no-check-certificate https://OurDomain.sharepoint.com/_XXX&file=IPS_cleaned.xlsx&action=default
But it still gives me the error "Unable to establish SSL connection".
Can someone please let me know how I can accomplish this task?
UPDATE
I was able to resolve the error in curl. Below is the command I used:
curl -O -L --sslv3 -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.A.B.C Safari/525.13" --user Myusername:Mypassword 'https://OurDomain.sharepoint.com/_%7BB21r-9CA2-345DEF%7D&file=IPS_cleaned.xlsx&action=default'
Now it downloads a file which, when I open it, shows me the SharePoint login page. It does not download the actual Excel file.
Any idea why?
Another potential solution is to take your SharePoint link and replace the text after the '?' with download=1:
This:
https://my.sharepoint.com/:u:/g/XXX/XXXX-bunchofRandomText?e=kRlVi
Becomes this:
https://my.sharepoint.com/:u:/g/XXX/XXXX-bunchofRandomText?download=1
Now, you can just:
wget https://my.sharepoint.com/:u:/g/XXX/XXXX-bunchofRandomText?download=1
*Note: this example used a single file and a link where anyone with the link could access the file (no credentials required).
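The rewrite is easy to script. A small sketch, using the hypothetical link from above (`${LINK%%\?*}` strips everything from the '?' onward):

```shell
# Replace everything after '?' with download=1 to get a direct-download URL.
LINK='https://my.sharepoint.com/:u:/g/XXX/XXXX-bunchofRandomText?e=kRlVi'
DL="${LINK%%\?*}?download=1"   # strip the query string, append download=1
echo "$DL"
# then: wget "$DL"
```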
You can also use rclone.
Download and install the latest release from https://rclone.org/downloads
First option: use OneDrive to access SharePoint sites or your personal folder. This option will help you upload large files.
1. Create an rclone configuration using the rclone config command.
2. Select New remote and give it a name.
3. Select OneDrive as the cloud storage.
4. Leave client ID and secret blank.
5. Edit advanced config: n
6. Remote config: use auto-config: y
7. Open the URL in the browser and give rclone access.
8. Select the personal/shared site URL option. For the shared site URL option you have to give the site URL, e.g. https://sharepoint.com/sites/SiteName
9. Select the personal/Documents drive. The Documents drive is shown if you selected the shared site URL option in step 8.
10. Save the config and quit.
The configuration file contents will then look like the following. If you selected the personal option, the drive type will be personal.
[onedrive]
type = onedrive
token =
drive_id =
drive_type = documentLibrary
Second option: WebDAV. With this option you can upload files up to 2 GB in size.
1. Create an rclone configuration using the rclone config command.
2. Select New remote and give it a name.
3. Select WebDAV as the cloud storage.
4. Give the site URL, username and password.
5. Save and quit.
The configuration file contents will then look like the following; the password is stored in an encrypted format.
vim /root/.config/rclone/rclone.conf
[sharepoint]
type = webdav
url = https://sharepoint.com/sites/SiteName/Documents
vendor = sharepoint
user =
pass =
Download a file from SharePoint.
rclone copy --ignore-times --ignore-size --verbose sharepoint:SourceFolder/file.txt DestFolder
There is a Firefox plugin, cliget, that captures the link with the session ID etc. and provides a command you can paste into the console for curl or wget. It gives you the curl or wget command with headers, cookies and all, with a copy-to-clipboard button, right on the download dialogue.
If anyone has a better suggestion, please let me know.
Download URL: https://addons.mozilla.org/en-US/firefox/addon/cliget
Reference: https://superuser.com/questions/27243/how-to-find-out-the-real-download-url-on-download-sites-that-use-redirects/1239026#1239026
I struggled with the same issue myself, and found a not-so-automatic-but-man-so-convenient way, with a daily log-in:
logged into SharePoint with a browser,
exported the cookies to cookies.txt,
ran the following command:
wget --load-cookies cookies.txt --keep-session-cookies --no-check-certificate -m https://yoursharepoint.com
And files were downloaded just fine.
For anyone using curl to download a file from SharePoint shared with an "Anyone with the link" download option, below are the steps I had to follow. Essentially you have to use the cookie from the share link, and then download the file from a different download link that isn't surfaced for you.
When you send the curl request for the "share link", it returns a 302 response, a forward link, and a cookie. If you save that cookie and use it to hit a "download" link, you can download the file. Essentially, Microsoft uses the initial "share link" to send the cookie to the browser and then redirects to their "View File" page. On that page you need the cookie (authentication), and you select your next action (on-screen view, print, download, etc.). When you click the download button you hit a different link. I was able to find this link by opening the "view page" for the file, turning on developer tools, and watching which link the browser follows when clicking download. You can then replicate that link for each file. Using that download link along with the cookie, you can download the file.
curl -i -c cookies.txt SHARE LINK
curl -o docsdownloaded.pdf -b cookies.txt DOWNLOAD LINK
Share Link Ex: https://tenant.sharepoint.com/:b:/s/Folder/EdNUf4xAVzFJgBoO0MqkfppR5tgobxLrmCnRqU4LFJQ?e=rOGNSD
Download Link Ex: https://tenant.sharepoint.com/sites/Folder/_layouts/15/download.aspx?SourceUrl=%2Fsites%2FFolder%2FShared%20Documents%2FGeneral%2FBig%2Dfile%2Epdf
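If you want to build the download link in a script rather than copy it from developer tools, the pattern in the example link above can be reproduced like this. A sketch: the tenant/site URL and the file path are placeholders, and the sed calls only percent-encode the characters that actually occur in this example:

```shell
# Build the _layouts/15/download.aspx link from a server-relative file path.
SITE='https://tenant.sharepoint.com/sites/Folder'               # placeholder tenant/site
FILE_PATH='/sites/Folder/Shared Documents/General/Big-file.pdf' # placeholder path
# crude percent-encoding, matching the characters seen in the example link
ENC=$(printf '%s' "$FILE_PATH" | sed -e 's,/,%2F,g' -e 's, ,%20,g' -e 's,-,%2D,g' -e 's,\.,%2E,g')
DL="$SITE/_layouts/15/download.aspx?SourceUrl=$ENC"
echo "$DL"
# then: curl -o file.pdf -b cookies.txt "$DL"
```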
Similar to the answer Zyglute gave, using cURL:
You can export your login cookie using the cookies.txt Chrome extension: https://chrome.google.com/webstore/detail/njabckikapfpffapmjgojcnbfjonfjfg
Then use the following command (the URL is quoted so the shell doesn't treat & as a command separator):
curl -b cookie.txt 'https://OurDomain.sharepoint.com/_XXX&file=IPS_cleaned.xlsx&action=default'
At some point your SharePoint session will expire (I'm not sure how long that takes), and you will need a new cookie file.
EDIT: If a malicious user gets a hold of your cookie.txt, they could get into your SharePoint account, so be sure to keep it safe.
Use wget, adding &download=1 at the end of the link:
wget "<yourlink>&download=1"
The file will be downloaded with the <yourlink> string as its name; just mv it to the correct name afterwards.
How can I find the publicly available image IDs on Windows Azure?
I found this related question: Azure: List OS Images
But the answer requires Windows + PowerShell, while I need a way to do it on Linux or via REST.
Use the URL specified here:
http://msdn.microsoft.com/en-us/library/windowsazure/jj157191.aspx
You'll need to provide a client certificate when sending the request.
If you are using curl on Linux, add --cert pointing to a .pem file (you'll need to upload the certificate to the subscription's management certificates as a .cer file first).
Don't forget to add the x-ms-version header for it to work:
-H "x-ms-version: 2013-03-01"
Here is an example of using curl to get the auto-scale information for a cloud service:
curl -H "accept: application/json" -H "x-ms-version: 2013-10-01" \
  --cert azure-cert.pem $AUTOSCALEURL
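Putting the pieces together, the List OS Images call from the linked MSDN page would look roughly like this. The subscription ID below is a placeholder, and I'm assuming the classic Service Management endpoint at management.core.windows.net:

```shell
# Assemble the List OS Images request (classic Service Management API).
SUBSCRIPTION_ID='11111111-2222-3333-4444-555555555555'   # placeholder
URL="https://management.core.windows.net/$SUBSCRIPTION_ID/services/images"
# run it with your management certificate:
#   curl --cert azure-cert.pem -H "x-ms-version: 2013-03-01" "$URL"
echo "$URL"
```

The response is an XML list of OSImage elements whose Name values are the image IDs you can pass when creating a VM.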
I know I can use the command:
curl -X PUT http://admin:password@127.0.0.1:5984/_config/admins/admin -d '"password"'
to add a new admin to the server. How do I go about removing an admin user?
You should be able to DELETE the admin user that you set up with:
curl -X DELETE http://admin:password@127.0.0.1:5984/_config/admins/admin
Note this is a slightly confusing example, because you're deleting the same user you're authenticating as. The last part of the URL, admin, is the name of the user being removed.
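A hedged note: the commands above target the CouchDB 1.x config API. On CouchDB 2.x/3.x, I believe per-node configuration moved under /_node, so the equivalent DELETE would hit a URL like the one assembled here (using _local as an alias for the node you're talking to):

```shell
# CouchDB 2.x/3.x: per-node config lives under /_node/<node-name>/_config
NODE='_local'            # alias for "the local node"
USER_TO_REMOVE='admin'   # the admin entry to delete
URL="http://127.0.0.1:5984/_node/$NODE/_config/admins/$USER_TO_REMOVE"
echo "curl -X DELETE -u admin:password $URL"   # the command to run
```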