curl doesn't respond immediately - Linux

I'm trying to get a server response by simply calling curl http://my.url.com. If I access the URL via browser I get a response immediately. If I call it via curl, it takes up to 3-4 seconds before I receive the response from the server.
Is there any kind of special "end" command which has to be passed to curl?

You can run curl -v http://my.url.com with the verbose flag in order to see which step is slow. Depending on the server, the response can be fast or slow. For more details, check out the curl man page.
Also, your browser may be caching pages, which makes loading seem faster.
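If you want to see exactly where the time goes, curl's --write-out timing variables break a request down by phase; a minimal sketch, using the placeholder URL from the question:
curl -o /dev/null -s -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' http://my.url.com
A large gap before the connect time usually points at slow DNS resolution, which the browser may be hiding because it caches lookups.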

Related

Executing php scripts without opening browser

I want to execute a PHP file located on my Apache server (localhost/remote) from Processing, but I want to do it without opening the browser. How can I do this? The PHP script just adds data to a MySQL database using values obtained from a GET request. Is this possible? I actually tried using link("/receiver.php?a=1&b=2") but it opens a web page containing the PHP output.
Ideally such scripts should be generic so that they can be used as a utility from both the web and bash scripts. If you cannot change or modify the script, then I would suggest calling curl from the terminal or a bash script to make an HTTP request to the given link.
Refer to this solution for how to make a request with curl.
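For example, the receiver.php script from the question could be called from a shell (or launched from Processing) like this, assuming it is served from localhost:
curl -s "http://localhost/receiver.php?a=1&b=2"
The -s flag just suppresses the progress output; the script runs on the server and its output comes back to the terminal instead of a browser window.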

Download webpage source from a page that requires authentication

I would like to download the source code of a webpage that requires authentication, using a shell script or something similar (like Perl, Python, etc.) on a Linux machine.
I tried to use wget and curl, but when I pass the URL, the source code that gets downloaded is for a page that asks me for credentials. The same page is already open in Firefox or Chrome, but I don't know how I can re-use that session.
Basically what I need to do is refresh this page on a regular basis and grep for some information inside the source code. If I find what I'm looking for, I will trigger another script.
-- Edit --
Thanks @Alexufo. I managed to make it work this way:
1 - Download a Firefox add-on that lets me save the cookies in a TXT file. I used this add-on: https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
2 - Log in to the site I want and save the cookie.
3 - Use wget:
wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
4 - Now the page source code is inside output_file.txt and I can parse it the way I want.
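To automate the regular refresh-and-grep described in the question, the wget call above can be wrapped in a simple loop. A rough sketch, in which the search pattern and trigger_script.sh are hypothetical placeholders:
while true; do
    wget -q --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
    if grep -q 'pattern-I-am-looking-for' output_file.txt; then
        ./trigger_script.sh   # hypothetical follow-up script
    fi
    sleep 300                 # refresh every 5 minutes
done
Keep in mind that the exported cookie will eventually expire, at which point the login-and-export step has to be repeated.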
curl should work anywhere.
1) Make a first request for authorization and save the cookies.
2) Use those cookies on the second request to get the page source code (see the sketch below).
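A minimal sketch of that two-step flow with curl, where the login URL, form field names, and credentials are placeholders:
curl -c cookies.txt -d "user=username&pass=password" http://my.url.com/login
curl -b cookies.txt http://my.url.com -o page_source.html
-c saves the cookies received during login to a file, and -b sends them back on the second request.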
update:
wget should work with POST authorization like curl:
wget with authentication
update2: http://www.httrack.com/
Mechanize (http://mechanize.rubyforge.org/) can do that. I am using it together with Ruby 2.0.0 for exactly that purpose.

How to auto-upload and check in files to SharePoint using curl?

I am trying to upload a file from Linux to SharePoint with my SharePoint login credentials.
I use the cURL utility to achieve this. The upload is successful.
The command used is: curl --ntlm --user username:password --upload-file myfile.txt -k https://sharepointserver.com/sites/mysite/myfile.txt
The -k option is used to overcome the certificate errors for the non-secure SharePoint site.
However, this uploaded file shows up in the "checked out" view (green arrow) in SharePoint from my login.
As a result, this file is non-existent for users from other logins.
My login has write-access privileges to SharePoint.
Any ideas on how to "check in" this file to SharePoint with cURL so that the file can be viewed from anyone's login?
I don't have curl available to test right now but you might be able to fashion something out of the following information.
Check-in and check-out are handled by /_layouts/CheckIn.aspx
The page has the following querystring variables:
List - A GUID that identifies the current list.
FileName - The name of the file with extension.
Source - The full URL to the AllItems.aspx page in the library.
I was able to get the CheckIn.aspx page to load correctly just using the FileName and Source parameters and omitting the List parameter. This is good because you don't have to figure out a way to look up the List GUID.
The CheckIn.aspx page posts back to itself with the following form parameters that control check-in:
PostBack - boolean set to true.
CheckInAction - string set to ActionCheckin
KeepCheckout - set to 1 to keep the file checked out, or 0 to check it in
CheckinDescription - string of text
Call this in curl like so (quote the URL so the shell does not treat the & as a background operator):
curl --data "PostBack=true&CheckinAction=ActionCheckin&KeepCheckout=0&CheckinDescription=SomeTextForCheckIn" "http://{Your Server And Site}/_layouts/checkin.aspx?Source={Full Url To Library}/Forms/AllItems.aspx&FileName={Doc And Ext}"
As I said, I don't have curl to test with, but I got this to work using the Composer tab in Fiddler 2.
I'm trying this with curl now and there is an issue getting it to work. Fiddler was executing the request as a POST. If you try to do this as a GET request you will get a 500 error saying that the AllowUnsafeUpdates property of the SPWeb will not allow this request over GET. Sending the request as a POST should correct this.
Edit: I am currently going through the checkin.aspx source in the dotPeek decompiler and seeing some additional values for the CheckInAction parameter that may be relevant, such as ActionCheckinPublish and ActionCheckinFromClientPublish. I will update this with any additional findings. The page class is located at Microsoft.SharePoint.ApplicationPages.Checkin for anyone else interested.
The above answer by Junx is correct. However, the Filename variable is not just the document filename and extension; it should also include the library name. I was able to get this to work using the following.
Example: http://domain/_layouts/Checkin.aspx?Filename=Shared Documents/filename.txt
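Putting the pieces together, a complete upload-and-check-in sequence might look roughly like the sketch below. The server, site, credentials, and library name ("Shared Documents") are placeholders, spaces are encoded as %20 because curl does not encode them for you, and the check-in step is the CheckIn.aspx trick described above rather than a documented API, so it may need adjusting for your SharePoint version:
curl --ntlm --user username:password -k --upload-file myfile.txt "https://sharepointserver.com/sites/mysite/Shared%20Documents/myfile.txt"
curl --ntlm --user username:password -k --data "PostBack=true&CheckinAction=ActionCheckin&KeepCheckout=0&CheckinDescription=Uploaded+via+curl" "https://sharepointserver.com/sites/mysite/_layouts/CheckIn.aspx?Filename=Shared%20Documents/myfile.txt"
The first command is the upload from the question; the second posts the check-in form with a Filename that includes the library name, as noted above.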
My question about Performing multiple requests using cURL has a pretty comprehensive example using bash and cURL, although it suffers from having to re-enter the password for each request.

Send a message from the Linux terminal to a web server

I am trying to figure out how I can post a message to an HTTP server from the Linux shell. What I want is for the shell to post the message, and then I can write a small PHP program to reroute the message to its intended recipient based on the contents and the sender. I can't seem to find a command to do this in Linux. I would really like to stick to a built-in utility.
If there is a better framework that you can think of, please let me know.
Thanks
The man page for wget has some examples, e.g.
wget --save-cookies cookies.txt \
--post-data 'user=foo&password=bar' \
http://server.com/auth.php
curl and wget can be used for performing http requests from the shell.
You may want to use some sort of authentication and encryption mechanism to avoid abuse of the URL.
If you want to stick with built-in tools, use wget and refer to this SO post about posting data with wget: How to get past the login page with Wget?
You're going to have to send your data in the POST data section and process it in your server-side PHP script.
You can use curl for this purpose. Have a look at the --data* and --form options in the manpage.
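For example, to post form fields or upload a file to a server-side script (the URL and field names here are placeholders):
curl --data "sender=shell&message=hello" http://server.com/route.php
curl --form "attachment=@message.txt" http://server.com/route.php
--data sends an application/x-www-form-urlencoded body, while --form sends a multipart/form-data request, which is the usual choice for file uploads.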
This is what curl is good at.
--post-data did not work for me because the server reported "405 Method Not Allowed".
You can actually use wget as follows to send data to an HTTP server with a GET request:
wget 'http://server.com/auth?name=foo&password=bar'

Re-send POST request easily - what tools?

I am looking for an easy way to re-send a POST request to the server from within the browser, mainly for debugging purposes. Say you have an XHR request containing POST parameters that is to be sent to the server. After changing the script on the server side, you would like to resend the very same request to analyze the output.
What tool could help? I guess it would be a browser extension.
I already tried the Tamper Data extension for Firefox, which does the job, as you can "Replay in browser". But for my taste it is not straightforward enough, as it takes 3-4 clicks to get the result of the request.
Unfortunately, curl would not be suitable for my needs, as my application uses a session cookie.
You can use cookies with curl. If the cookie is created on login, you can log in using curl, saving the cookie to a file:
curl -c cookies.txt -d "user=username&pass=password" http://example.com/login
And then to load them for the subsequent requests:
curl -b cookies.txt ...
If for some reason the cookie can't be created using a curl request, you can use the Firefox extension Cookie Exporter, which produces files that can be read by curl using the -b flag.
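For example, replaying the XHR's POST body with the exported session cookie might look like this, where the URL and parameters stand in for whatever the original request contained:
curl -b cookies.txt --data "param1=value1&param2=value2" http://example.com/ajax_endpoint.php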
