I am looking for an easy way to re-send a POST request to the server from within the browser, mainly for debugging purposes. Say you have an XHR request which contains POST parameters that is to be sent to the server. After changing the script on the server side, you would like to re-send the very same request to analyze the output.
What tool could help? I guess it would be a browser extension.
I already tried the Tamper Data extension for Firefox, which does the job with its "Replay in browser" feature. But for my taste it is not straightforward enough, as it takes 3-4 clicks to get the result of the request.
Unfortunately, curl would not be suitable for my needs as my application uses a session cookie.
You can use cookies with cURL. If the cookie is created on login, you can log in using cURL, saving the cookie to a file:
curl -c cookies.txt -d "user=username&pass=password" http://example.com/login
And then to load them for the subsequent requests:
curl -b cookies.txt ...
If for some reason the cookie can't be created using a cURL request, you can use the Firefox extension Cookie Exporter, which produces files that can be read by curl using the -b flag.
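For the original use case (re-sending the same POST after changing the server-side script), you can combine -b with -d; a rough sketch, where the URL and the parameter string are placeholders for the actual request:
# re-send the POST you want to debug, reusing the saved session cookie
curl -b cookies.txt -d "param1=value1&param2=value2" http://example.com/script.php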
I would like to download the source code of a webpage that requires authentication, using a shell script or something similar (like Perl, Python, etc.) on a Linux machine.
I tried to use wget and curl, but when I pass the URL, the source code that gets downloaded is for a page that asks me for credentials. The same page is already open in Firefox or Chrome, but I don't know how I can reuse this session.
Basically what I need to do is refresh this page on a regular basis and grep for some information inside the source code. If I find what I'm looking for, I will trigger another script.
-- Edit --
Thanks @Alexufo. I managed to make it work this way:
1 - Downloaded a Firefox add-on that lets me save the cookies to a TXT file. I used this add-on: https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
2 - Logged in to the site I want and saved the cookie.
3 - Using wget:
wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
4 - Now the page source code is inside output_file.txt and I can parse it the way I want.
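For the refresh-and-grep part, a minimal sketch of a polling loop (the pattern, the interval and the follow-up script name are placeholders):
#!/bin/sh
# fetch the page every 60 seconds with the saved cookies and search it
while true; do
    wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt -q
    if grep -q 'pattern-i-am-looking-for' output_file.txt; then
        ./other_script.sh    # trigger the other script
    fi
    sleep 60
done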
cURL should work anywhere.
1) Do a first request for authorization. Save the cookies.
2) Use the cookies in the second request to get the page source code.
update:
wget should work with POST authorization just like curl:
wget with authentication
update2: http://www.httrack.com/
Mechanize (http://mechanize.rubyforge.org/) can do that. I am using it together with Ruby 2.0.0 for exactly that.
I'm trying to get a server response by simply calling curl http://my.url.com. If I access the URL via a browser, I get a response immediately. If I call it via curl, it takes up to 3-4 seconds before I receive the response from the server.
Is there any kind of special "end" command which has to be passed to curl?
You can run 'curl -v http://my.url.com' with the verbose flag in order to see details of which step may be slow. Depending on the server, the response can be fast or slow. For more details, check out the man page of curl.
Also, your browser may be caching pages, which makes loading seem faster.
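Besides -v, curl's --write-out timing variables can show which phase is slow (DNS lookup, connecting, or waiting for the first byte), for example:
# print timing for each phase of the request, discarding the body
curl -s -o /dev/null \
     -w "lookup: %{time_namelookup}s connect: %{time_connect}s first byte: %{time_starttransfer}s total: %{time_total}s\n" \
     http://my.url.com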
When the browser sends a request to the server for a web page, what information is sent to the server in the HTTP request? Can we inspect that information? If yes, how?
Install Firebug and you can inspect the HTTP headers of both the request and response. That will work for either GET or POST requests. You can do the same with Fiddler for IE.
Fiddler is the best for this (IMO)
You can save all requests in a particular session. Have it running in the background while you click away, then see what got sent over the wire. Replay requests, etc. The features go on and on.
You can check that in Chrome using right-click > Inspect > Resources, or in Firefox using Firebug.
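If you just want to see the same kind of headers from the command line (not the exact request your browser sends, but a plain HTTP request and response), curl can print both:
# -v prints the request headers curl sends (> lines) and the response headers (< lines)
curl -v http://example.com/ -o /dev/null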
I need to log in to a website with a username and password, and then download a file. The URL of the file is static. How do I automate this process with Linux/Unix scripts? Thanks a lot.
Jiangzhe
Well, it's not that simple.
What you need to do is the following:
Send an HTTP POST request containing your username and password to the login form's URL.
You will get back a cookie (probably containing a session ID).
Send an HTTP GET request for the file, passing your cookie details in the HTTP headers.
You should probably use a scripting language with an HTTP library (Python's httplib and urllib2 are great options).
Just use cURL to send a POST or GET request with the login data to the site, and then make a second request to download the file.
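A rough sketch of that with curl (the login URL, the form field names and the file URL are placeholders for your site):
# 1) log in and save the session cookie
curl -c cookies.txt -d "username=foo&password=bar" http://example.com/login
# 2) download the static file, sending the cookie back
curl -b cookies.txt -o file.out http://example.com/path/to/file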
I am trying to figure out how I can post a message to an HTTP server from the Linux shell. What I want is for the shell to post the message, and then I can write a small PHP program to reroute the message to its intended recipient based on the contents and the sender. I can't seem to find a command to do this in Linux. I would really like to stick to a built-in utility.
If there is a better framework that you can think of, please let me know.
Thanks
The man page for wget has some examples, e.g.
wget --save-cookies cookies.txt \
--post-data 'user=foo&password=bar' \
http://server.com/auth.php
curl and wget can be used for performing http requests from the shell.
You may want to use some sort of authentication and encryption mechanism to avoid abuse of the URL.
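For example, one simple option, assuming the receiving PHP script checks the credentials and HTTPS is set up on the server, is HTTP basic auth over TLS (user name, password and URL are placeholders):
# -u adds an Authorization header; HTTPS keeps the credentials and the message encrypted
curl -u apiuser:secret -d "sender=foo&msg=hello" https://server.com/route.php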
If you want to stick with built-in tools, use wget and refer to this SO post about posting data with wget: How to get past the login page with Wget?
You're going to have to send your data in the POST data section and handle the formatting in your server-side PHP script.
You can use curl for this purpose. Have a look at the --data* and --form options in the manpage.
This is what curl is good at.
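A couple of short examples (URL and field names are placeholders): --data sends an ordinary form-encoded POST, while --form sends multipart/form-data, which is what you need for file uploads:
# form-encoded POST body
curl --data "sender=alice&msg=hello" http://server.com/route.php
# multipart POST, e.g. attaching a file
curl --form "sender=alice" --form "attachment=@message.txt" http://server.com/route.php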
--post-data does not work for me because it reports "405 Method Not Allowed".
You can actually use wget as follows to send some data to an HTTP server:
wget 'http://server.com/auth?name=foo&password=bar'