How can I copy an image from a server to my local server using NodeJS or the command line - node.js

I'm using Node.js and I need to save images hosted on Facebook to my local server. Currently I am executing a command line call using exec to get this working, something like
wget https://example.com/image.jpg -O ic_launcher.jpg
It works, but when I have a more complex URL like
wget https://scontent.xx.fbcdn.net/v/t34.0-12/20206198_1998942567005064_567078929_n.jpg?_nc_ad=z-m&oh=e92a33eb810eeb12f199a567cdaf035d&oe=5972D7E3 -O ic_launcher.jpg
it doesn't work because of the & characters in the URL. How can I solve this problem? Thank you!

After I wrapped the URL in double quotes it worked. A neat little trick that I didn't know about; well, now I do.
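If you'd rather not shell out at all (which also sidesteps the quoting problem), a minimal Node.js sketch using only the built-in https and fs modules can save the image directly; the URL and output filename below are just the ones from the question, and redirects are not handled:
// download.js - fetch an image over HTTPS and save it locally
const https = require('https');
const fs = require('fs');

const url = 'https://scontent.xx.fbcdn.net/v/t34.0-12/20206198_1998942567005064_567078929_n.jpg?_nc_ad=z-m&oh=e92a33eb810eeb12f199a567cdaf035d&oe=5972D7E3';
const file = fs.createWriteStream('ic_launcher.jpg');

https.get(url, (res) => {
  if (res.statusCode !== 200) {
    console.error('Download failed with status', res.statusCode);
    res.resume(); // drain and discard the response
    return;
  }
  res.pipe(file);
  file.on('finish', () => file.close());
}).on('error', (err) => console.error('Request error:', err.message));
Because the URL never passes through a shell here, the & characters need no escaping or quoting.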

Related

How to Redirect to a URL in a Command Line Interface (CLI)

I was trying to redirect to a URL in a CLI using Node.js.
How can I do this? I tried using request, but I don't know how.
What do you want to do?
I think you can use:
curl -v -L www.example.com
Or if you can be more specific...
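If the redirect should be followed from Node.js itself rather than curl, a minimal sketch with the built-in http module (reusing www.example.com from the answer above) could look like this; it only reports a single redirect hop:
// follow-redirect.js - request a URL and follow one redirect manually
const http = require('http');

http.get('http://www.example.com', (res) => {
  if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
    console.log('Redirected to:', res.headers.location);
    // a second http.get(res.headers.location, ...) would fetch the final page
  } else {
    console.log('Status:', res.statusCode);
  }
  res.resume(); // discard the body
});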

Unable to use wget for downloading GitHub search results

I am trying to use wget to download GitHub code search results into a logfile.
I've been using the following command:
wget -o logfile -r -l 2 https://github.com/search?l=Dockerfile&q=openjdk&type=Code&utf8=%E2%9C%93
I do, however, get a robots.txt file that says the following:
# If you would like to crawl GitHub contact us at support@github.com.
# We also provide an extensive API: https://developer.github.com/
Do I need some sort of permission from github for this?
Can someone help?
I think the message is pretty clear: you're trying to crawl the GitHub site and they don't like that.
They advise you to use the GraphQL API.
The v3 API is still REST, so you could do something like:
wget --output-document search-results.json --user <YOUR_GITHUB_ID> \
"https://api.github.com/search/code?q=openjdk+language:Dockerfile"

wget and htaccess: username only

I googled how to download images from the terminal in Ubuntu with wget. I found what I needed, but the server is protected with .htaccess and there's no password. With
wget admin@http://server.com/filename.jpg
it returns: No route to host. When I set a password and type
wget admin:password@http://server.com/filename.jpg
everything's fine. However, I am not allowed to use a password on the server. How can I fix this?
Much easier:
wget --user="username" --password="password" http://xxxx.yy/filename.xxx
Try
wget --user=admin http://server.com/filename.jpg
instead.
Alternatively,
wget http://admin:@server.com/filename.jpg
may work instead.
Your syntax is actually wrong; it should be:
wget http://user:pass@host/file
Your username is outside the URL, so it was being treated as a hostname.
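On the Node.js side, the same username-only request can be expressed with the built-in http module's auth option; this is only a sketch, with server.com and filename.jpg taken from the question and an empty string standing in for the missing password:
// fetch-protected.js - request an .htaccess-protected file with username and empty password
const http = require('http');
const fs = require('fs');

const req = http.get({
  hostname: 'server.com',
  path: '/filename.jpg',
  auth: 'admin:' // Basic auth "user:password"; password left empty
}, (res) => {
  console.log('Status:', res.statusCode);
  res.pipe(fs.createWriteStream('filename.jpg'));
});

req.on('error', (err) => console.error(err.message));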

how to transfer a file to my server directly from another server

Hey guys, what is the easiest way to transfer a file to my server directly from another server? That way I won't have to download the file to my PC and then upload it to my server. The requested file would look like http://www.examplesite.com/file.zip
My server is running Linux, but I don't have SSH access.
So how can I do this?
Thanks guys :D
Without SSH it will be very difficult. Possibly rsync might work, if it's on both servers with daemons set up. RCP (remote copy) exists; it's similar to SCP without the SSH part, but I doubt it's installed due to security concerns.
You have to start a shell on your server. Then try:
man wget
And use:
wget http://www.examplesite.com/file.zip
If you cannot get access to a shell, then tell us exactly what control you have over your server.
my2c

File path for a Cron Job

Hi, I want to run a cron job to call a PHP script on my server. I am using cPanel from my web host and these are the options:
Minute:
Hour:
Day:
Month:
Weekday:
Command:
I am really struggling to point the command to my file. I am using this line: /home/abbeysof/public_html/adi/cron/daily.php but I am getting this error:
/bin/sh: /home/abbeysof/public_html/adi/cron/daily.php: Permission denied
I asked my web host for help and this is the response:
If you use cpanel to create it, it will fill in the path for you. Typically /home/username/public_html/etc
Can anyone please offer some advice?
Advice 1: use the wget command. wget runs the PHP script exactly as if it were called from the web, so the PHP environment is exactly the same as when calling the file from the web; that makes it easier to debug your script.
wget -O - http://yourdomain.com/adi/cron/daily.php >/dev/null 2>&1
The cron job has to be created by going into the cPanel cron jobs menu. I can't tell from your host's answer whether this is already clear to you.
And advice 2: change web hosting; try this one, they don't leave you on your own.
Sorry, I don't know anything about cpanel, but it sounds like:
if you created the file daily.php, then you need to change the permissions on it
if they created the file, then there's a bug in their creation routine.
Good luck!
Try this one:
/usr/bin/php -q /home/yourCpanelUsername/public_html/filename.php
For some cPanel setups it might be like this:
/usr/local/bin/php -q /home/yourCpanelUsername/public_html/filename.php
Sounds like you need to make /home/abbeysof/public_html/adi/cron/daily.php executable.
This link might help you:
https://www.inmotionhosting.com/support/edu/cpanel/how-to-run-a-cron-job
There is a difference between a VPS and shared hosting when it comes to the command you give.
You may need to use the cPanel-Cron user agent along with your URL:
curl --user-agent cPanel-Cron http://example.com/cron.php
