Executing PHP scripts without opening a browser

I want to execute a PHP file located on my Apache server (localhost/remote) from Processing, but without opening the browser. How can I do this? The PHP script just adds data to a MySQL database using values obtained from a GET request. Is this possible? I tried using link("/receiver.php?a=1&b=2"), but it opens a web page containing the PHP output.

Ideally such scripts should be generic so they can be used both from the web and from shell scripts. If you cannot change or modify the script, I would suggest using curl from a terminal or bash script to make the HTTP request to the given link.
Refer to this solution for how to make the request with curl.
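For example, a minimal sketch from a terminal (assuming the script is reachable at /receiver.php on your local Apache server):
# Fetch the script directly; the PHP output goes to stdout instead of a browser window
curl "http://localhost/receiver.php?a=1&b=2"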

Related

Forward log to HTTP server

I have a pipeline that I run with nextflow, which is a workflow framework.
It has an option for viewing real-time logs on an HTTP server.
The command to do this looks like:
nextflow run script.nf --with-weblog http://localhost:8891
But I don't see anything when I open my web browser. I have port-forwarded while logging into the Ubuntu instance, and the Python HTTP server seems to work fine.
I will need help in understanding how I can set this up so I can view logs generated by my script on the url provided.
Thanks in advance!
In nextflow you need to be careful with the leading dashes of command-line parameters. Everything starting with two dashes, like --input, will be forwarded to your workflow/processes (e.g. as params.input), while parameters with a single leading dash, like -entry, are interpreted as parameters by nextflow itself.
It might be a typo in your question, but to make this work you have to use -with-weblog <url> (note that I used only a single dash here)
See the corresponding docs for further information on this:
Nextflow is able to send detailed workflow execution metadata and
runtime statistics to a HTTP endpoint. To enable this feature use the
-with-weblog as shown below:
nextflow run <pipeline name> -with-weblog [url]
However, this might not be the only problem you are encountering with your setup. You will have to store or process the webhooks that nextflow sends somewhere on the server.
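As a rough sketch of that last point: python3 -m http.server only answers GET requests, so it will not accept or display the POST requests nextflow sends. A throwaway listener that just logs them could look like this (netcat flag syntax varies between BSD and GNU implementations, so treat this as an illustration):
# Accept requests on port 8891, reply with an empty 200, and append each request to a file
while true; do
  printf 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n' | nc -l -p 8891 -q 1 >> weblog-requests.log
done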
P.S.: Since this is already an old question, how did you proceed? Have you solved the issue yourself in the meantime, or given up on it?

Upload file on easyupload.io with curl on Linux

I tried to upload a file to file.io, which has an API, using curl -F "file=@filename.txt" "https://file.io/".
How can I achieve the same with easyupload.io, validate that the file was uploaded properly, and also display the URL for the uploaded file?
Easyupload.io uses a Google captcha in the background to prevent automated uploading,
so I guess if there is no official API for this site, it's not meant to be used from a script or program.
You can reverse engineer the upload process by opening the network tab in your browser's developer tools.
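If you go that route, the replayed request would look something like this; the endpoint and form field name below are placeholders for whatever the network tab actually shows, not a documented API:
# Hypothetical replay of the site's own upload request (endpoint and field names are guesses)
curl -F "file=@filename.txt" "https://easyupload.io/UPLOAD_ENDPOINT"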

Download webpage source from a page that requires authentication

I would like to download a webpage's source code from a page that requires authentication, using a shell script or something similar (like Perl, Python, etc.) on a Linux machine.
I tried to use wget and curl, but when I pass the URL, the source code that gets downloaded is for a page that asks me for credentials. The same page is already open in Firefox or Chrome, but I don't know how to re-use that session.
Basically what I need to do is refresh this page on a regular basis and grep for some information in the source code. If I find what I'm looking for, I will trigger another script.
-- Edit --
Thanks @Alexufo. I managed to make it work this way:
1 - Downloaded a Firefox addon that lets me save cookies to a TXT file. I used this addon: https://addons.mozilla.org/en-US/firefox/addon/export-cookies/
2 - Logged in to the site I want and saved the cookies.
3 - Using wget:
wget --load-cookies=cookie.txt 'http://my.url.com' -O output_file.txt
4 - Now the page source code is inside output_file.txt and I can parse it the way I want.
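For the regular refresh part, a small sketch of a polling loop around that wget call (the pattern, interval, and triggered script are placeholders):
# Re-fetch the page every 60 seconds; trigger another script when the pattern appears
while true; do
  wget --load-cookies=cookie.txt -q -O output_file.txt 'http://my.url.com'
  if grep -q 'PATTERN' output_file.txt; then
    ./trigger_script.sh
  fi
  sleep 60
done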
curl should work anywhere:
1) Make a first request for authorization and save the cookies.
2) Reuse the cookies on the second request to get your page's source code.
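A rough sketch of those two steps with curl (the login URL and form field names are assumptions):
# Step 1: POST the login form and store the session cookies
curl -c cookies.txt -d 'user=me&pass=secret' 'http://my.url.com/login'
# Step 2: reuse the cookies to fetch the protected page
curl -b cookies.txt 'http://my.url.com/protected_page' -o output_file.txt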
Update: wget should work with POST authorization just like curl:
wget with authentication
Update 2: http://www.httrack.com/
Mechanize (http://mechanize.rubyforge.org/) can do that. I am using it together with Ruby 2.0.0 for exactly that.

Script to Open URL as HTTP POST request (not just GET) in web browser

How can I open a URL from a command-line script as a POST request? I'd like to do this on Linux, Windows, and macOS, so a cross-platform way of doing it would be ideal.
For each of these I can open a GET request to a URL using:
xdg-open [url] - Linux
start [url] - Windows
open [url] - macOS
... but I don't see a way of doing a POST request. Is it possible, and if so how?
Also, showing how to add a POST body to the request would be appreciated too. FYI, the script I'm writing is in Ruby, so anything built into the OS or something using Ruby is fine too.
UPDATED:
To clarify I'd like this open up in the default browser, not just issue the POST request and get the result.
Sorry, but there's no reliable way to do this across browsers and across operating systems. This was a topic the IE team spent a bunch of time looking into when we built Accelerators in IE8.
The closest you can come is writing an HTML page to disk that has an auto-submitting form (or XmlHttpRequest), then invoking the browser to load that page, which will submit the form. This constrains your code to only submitting forms that are legal to submit from the browser (e.g. only three possible content types, etc.).
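A sketch of that temporary-file approach from a shell; the target URL and form field are placeholders, and on Windows/macOS you would use start/open instead of xdg-open:
# Write an auto-submitting POST form to a temporary file
cat > /tmp/post_form.html <<'EOF'
<!DOCTYPE html>
<body onload="document.forms[0].submit()">
  <form action="http://example.com/endpoint" method="post">
    <input type="hidden" name="q" value="ruby">
  </form>
</body>
EOF
# Open it in the default browser, which loads the page and submits the form
xdg-open /tmp/post_form.html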
If you're using Ruby you can use the Net::HTTP library: http://ruby-doc.org/stdlib-2.0/libdoc/net/http/rdoc/Net/HTTP.html
require 'net/http'
uri = URI('http://www.example.com/search.cgi')
res = Net::HTTP.post_form(uri, 'q' => 'ruby', 'max' => '50')
puts res.body
I don't know how compatible my solution is across browsers and platforms,
but instead of using a temporary file as described by @EricLaw,
you can encode the HTML with the auto-submit form in a data URL, as I did here:
https://superuser.com/questions/1281950/open-chrome-with-method-post-body-data-on-specific-url/1631277#1631277
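For illustration, the same auto-submitting form inlined as a data URL, so no temporary file is needed (the URL and field are placeholders; note that some browsers block top-level navigation to data: URLs, which limits portability further):
xdg-open "data:text/html,<body onload='document.forms[0].submit()'><form action='http://example.com/endpoint' method='post'><input type='hidden' name='q' value='ruby'></form></body>"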

Images stopped showing using Smart Lencioni Image Resizer

I don't know what I did, but for some reason the images stopped working. Some browsers show the image, but the majority don't. I'm using v1.4.1 of Lencioni Image Resizer.
This image will not show:
http://www.norwegianfashion.no/wp-content/themes/norwegianfashion/image.php?width=280&height=&cropratio=2:1&image=http://www.norwegianfashion.no/wp-content/uploads/magazine/issue5/siri1_72dpi.jpg
But you can access the image here:
http://www.norwegianfashion.no/wp-content/uploads/magazine/issue5/siri1_72dpi.jpg
If I change & to &amp;, I get the message Error: no image was specified:
http://www.norwegianfashion.no/wp-content/themes/norwegianfashion/image.php?width=280&amp;height=&amp;cropratio=2:1&amp;image=http://www.norwegianfashion.no/wp-content/uploads/magazine/issue5/siri1_72dpi.jpg
Another place where I'm using it is here, and that works fine:
http://www.advicis.no/wp-content/themes/business1/image.php?width=150&height=&cropratio=1:1&image=http://www.advicis.no/wp-content/uploads/OD-puzzle-large.jpg
What could cause this?
Apparently your PHP script cannot access the URL http://www.norwegianfashion.no/wp-content/uploads/magazine/issue5/siri1_72dpi.jpg.
I can, and you can, but the server on which the PHP script is running can't.
Perhaps the PHP server doesn't have the right plugin installed to do HTTP requests, or the HTTP server blocks requests coming from within.
Can you insert some debugging into image.php showing the results of each step? Or post the part of the image.php code where it retrieves the image?
Can you log in to the PHP server with SSH and see if you can execute:
wget http://www.norwegianfashion.no/wp-content/uploads/magazine/issue5/siri1_72dpi.jpg
Another solution is to let image.php grab the file from the local disk instead of through an HTTP request, but that requires some redesign of that script.
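If you do have that SSH access, here is a quick check of whether PHP on the server may make HTTP requests at all (assumes the php command-line binary is installed):
# Prints whether remote fopen is enabled and whether the curl extension is loaded
php -r 'var_dump(ini_get("allow_url_fopen"), extension_loaded("curl"));'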
