Send a message from the Linux terminal to a web server

I am trying to figure out how I can post a message to an HTTP server from the Linux shell. What I want is for the shell to post the message, and then I can write a small PHP program to reroute the message to its intended recipient based on the contents and the sender. I can't seem to find a command to do this in Linux. I would really like to stick to a built-in utility.
If there is a better framework that you can think of, please let me know.
Thanks

The man page for wget has some examples, e.g.
wget --save-cookies cookies.txt \
--post-data 'user=foo&password=bar' \
http://server.com/auth.php

curl and wget can both be used to perform HTTP requests from the shell.
You may want to add some sort of authentication and encryption mechanism to avoid abuse of the URL.

If you want to stick with built-in tools, use wget and refer to this SO post about posting data with wget: How to get past the login page with Wget?
You're going to have to send your data in the POST data section and format it in your server-side PHP script.

You can use curl for this purpose. Have a look at the --data* and --form options in the manpage.
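For example, a minimal sketch (the URL, field names, and route.php script are placeholders for whatever your PHP side actually expects):
# URL-encoded POST body, read by PHP as $_POST
curl --data 'sender=foo&message=hello' http://server.com/route.php
# multipart/form-data POST, useful if the message includes a file attachment
curl --form 'message=hello' --form 'attachment=@/path/to/file.txt' http://server.com/route.php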

This is what curl is good at.

--post-data did not work for me because the server reported "405 Method Not Allowed".
You can instead use wget as follows to send data to an HTTP server in the query string:
wget 'http://server.com/auth?name=foo&password=bar'

Download entire website with videos using wget

I have been using wget to download websites, but I have run into a bit of trouble when the website has videos from YouTube, Vimeo or others.
I can't seem to get rid of the ads either.
The website that I am trying to get at the moment is :
https://www.ctrlpaint.com
I only need it temporarily because I have to work at a place where there is no internet, so I don't want to go through the hassle of downloading all the videos from Vimeo one by one.
Thanks for the help, let me know if you need more precision or if you want me to try anything.
I'm using Gentoo.
The command I used was:
$ wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --domains website.org \
    --no-parent \
    website_to_download
It left me with the full website but looks to connect to the internet for the videos.
That's because the videos are on a different host, I think.
This should work:
wget -H -r --level=1 -k -p --no-clobber https://www.ctrlpaint.com/
The -H option includes other hosts. That said, the video host here is Vimeo, and when I tried it, they detected the wget user agent and refused to actually send the video.
As an aside, this kind of thing is generally considered bad form, since the host you are mirroring has to pay for the bandwidth. (It may also refuse to fulfill some requests, sending a "too many requests" error response.)
The reason the videos are not downloading is that each video is not a single file; it is a stream of multiple files, or chunks.
Websites like Vimeo or YouTube will most likely be using DASH or HLS, both of which are HTTP video streaming protocols. They require that you open the video with one of their players. After the initial request for the video, the server sends back a manifest file with a list of the links for all of the video chunks. From there, the player sends the subsequent requests for each chunk.
The server denies you access to the manifest or the chunks when you use wget or curl because there are requirements and authentication needed before you can get access to the files. The player takes care of all of that, which is why you have to use one of their players.
You probably need an app that can download the videos for you. I'm pretty sure you can find some options out there.
Good luck!
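One option, assuming you can find the Vimeo URLs of the embedded videos (the video ID below is a placeholder), is a dedicated downloader such as yt-dlp, which understands HLS/DASH streams. A sketch; whether it works for a particular embed depends on the site:
# download one video, naming the file after its title
yt-dlp -o '%(title)s.%(ext)s' 'https://vimeo.com/VIDEO_ID'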

How to send data to an SFTP server, and how to upload/download data to it?

I am completely new to SFTP (SSH File Transfer Protocol) servers and would like to know how to send data to one.
Imagine I have set up an SFTP server; could someone provide me with the pseudo code (as I am not sure what specifics I'm required to give) for sending a .zip file to it from a Linux box on the command line?
Could you also provide the pseudo code needed to download that same data from the server once it has been uploaded?
Could I please ask that any code supplied be heavily commented (as I really want to understand this!)?
Please be gentle with your comments; I am VERY new to all of this. I imagine I will have missed out something key that someone will need to know. If any additional information is required, please let me know and I will of course supply it.
Thanks in advance. I really will appreciate any help/advice!
For a GUI, you need an SFTP client like FileZilla. Many of them are free.
On the command line, Linux has an sftp client you can use from the shell.
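A rough, heavily commented sketch of an sftp session (the hostname, username, and paths are placeholders, not details from your setup):
# open an interactive SFTP session to the server (authentication happens over SSH)
sftp username@hostname.com
# once connected, upload the local .zip file into a remote directory
put /path/to/local/archive.zip /remote/directory/
# later, download it back into a local directory
get /remote/directory/archive.zip /path/to/local/downloads/
# close the session
bye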
From your client, you can use curl to upload and/or download files to/from your sftp server.
To upload a file:
curl -T /name/of/local/file/to/upload -u username:password sftp://hostname.com/directory/to/upload/file/to
To download a file:
curl -u username:password sftp://hostname.com/name/of/remote/file/to/download -o /name/of/local/directory/to/download/file/to
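If the server uses SSH key authentication rather than a password, curl can handle that too. A sketch, assuming the usual key locations (the paths, username, and hostname are placeholders):
# upload with an SSH key pair; note the empty password after "username:"
curl --key ~/.ssh/id_rsa --pubkey ~/.ssh/id_rsa.pub -u username: -T /name/of/local/file/to/upload sftp://hostname.com/directory/to/upload/file/to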

Download webpage source from a page that requires authentication

I would like to download the source code of a webpage that requires authentication, using a shell script or something similar (like Perl, Python, etc.) on a Linux machine.
I tried to use wget and curl, but when I pass the URL, the source code that gets downloaded is for a page that asks me for credentials. The same page is already open in Firefox or Chrome, but I don't know how I can reuse that session.
Basically, what I need to do is refresh this page on a regular basis and grep for some information in the source code. If I find what I'm looking for, I will trigger another script.
-- Edit --
Thanks @Alexufo. I managed to make it work this way:
1 - Downloaded a Firefox add-on that lets me save the cookies in a TXT file. I used this add-on: https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
2 - Logged in to the site I want, and saved the cookie.
3 - Using wget:
wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
4 - Now the page source code is inside output_file.txt and I can parse it however I want.
curl should work anywhere.
1) Make a first request for authorization and save the cookies.
2) Use the cookies when you make the second request to get your page's source code.
Update:
wget should work with POST authorization just like curl does:
wget with authentication
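A sketch of the wget equivalent (the login URL and form field names are placeholders for whatever the site actually uses):
# step 1: POST the login form and save the session cookies
wget --save-cookies cookies.txt --keep-session-cookies --post-data 'user=foo&password=bar' -O /dev/null 'http://my.url.com/login'
# step 2: reuse the cookies to fetch the protected page
wget --load-cookies cookies.txt 'http://my.url.com/protected_page' -O output_file.txt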
Update 2: http://www.httrack.com/
Mechanize (http://mechanize.rubyforge.org/) can do that. I am using it together with Ruby 2.0.0 for exactly that purpose.

curl doesn't send the command immediately

I'm trying to get a server response by simply calling curl http://my.url.com. If I access the URL via a browser, I get a response immediately. If I call it via curl, it takes up to 3-4 seconds before I receive the response from the server.
Is there any kind of special "end" command that has to be passed to curl?
You can run curl -v http://my.url.com with the verbose flag to see details of which step may be slow. Depending on the server, the response can be fast or slow. For more details, check out the curl man page.
Also, your browser may be caching pages, which makes loading seem faster.
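To see where the time goes, you can also use curl's --write-out timing variables. A sketch, using the placeholder URL from the question:
# print DNS lookup, connect, and total times; discard the response body
curl -s -o /dev/null -w 'dns: %{time_namelookup}s connect: %{time_connect}s total: %{time_total}s\n' http://my.url.com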

Re-send POST request easily - what tools?

I am looking for an easy way to re-send a POST request to the server from within the browser, mainly for debugging purposes. Say you have an XHR request containing POST parameters that is to be sent to the server. After changing the script on the server side, you would like to resend the very same request to analyze the output.
What tool could help? I guess it would be a browser extension.
I have already tried the Tamper Data extension for Firefox, which does the job, as you can "Replay in browser". But for my taste it is not straightforward enough, as it takes 3-4 clicks to get the result of a request.
Unfortunately, curl would not be suitable for my needs, as my application uses a session cookie.
You can use cookies with curl. If the cookie is created on login, you can log in using curl, saving the cookie to a file:
curl -c cookies.txt -d "user=username&pass=password" http://example.com/login
And then to load them for the subsequent requests:
curl -b cookies.txt ...
If for some reason the cookie can't be created via a curl request, you can use the Firefox extension Cookie Exporter, which produces files that can be read by curl using the -b flag.
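With the cookie file in hand, replaying the POST is then just another -d request. A sketch (the URL and parameter names are placeholders for your own request):
# resend the same POST body, sending the saved session cookie along
curl -b cookies.txt -d "param1=value1&param2=value2" http://example.com/script.php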
