cntlm proxy with PhantomJS - IIS

I'm trying to use the cntlm proxy on my Windows machine to talk, from PhantomJS, to a local web application on IIS that uses Windows Authentication. To create the proxy, I'm running:
cntlm -v -u username@domain -p password -l 1456 localhost:80
My app lives at localhost/myapp
To test whether or not this works, I try to browse to localhost:1456/myapp but I always get an auth challenge and no sensible username/password combination seems to work. Any thoughts on why this setup might not be working as expected?
When I hit the proxied endpoint in a browser, this is the output from cntlm:
http://pastebin.com/xvvmfsGV

After wrestling with the concept for a while I finally figured out how to get this set up.
After installing cntlm, I ran the following from a command prompt:
"c:\Program Files (x86)\Cntlm\cntlm.exe" -u <user_name> -d <domain_name> -H
This asks for your password and spits out three hashes to use in the configuration file.
I whittled down the required configuration in cntlm.ini to:
Username <user_name>
Domain <domain_name>
PassLM <LM_hash>
PassNT <NT_hash>
PassNTLMv2 <NTLMv2_hash>
Proxy 192.168.7.1:80 #random proxy
NoProxy *
Listen 3133 # unused port
cntlm forces you to specify a top-level proxy even if you don't have or need one, so any syntactically valid address for that option will do. Setting NoProxy to * ensures that no request is ever passed on to the bogus proxy.
Run "c:\Program Files (x86)\Cntlm\cntlm.exe" -f in a console to verify that everything is working. Otherwise, start and stop it as a service.
To test with phantomjs I used the following script:
var page = require('webpage').create();
page.open('http://<machine_name>/myapp', function(status) {
    console.log("Status: " + status);
    if (status === "success") {
        page.render('example.png');
    }
    phantom.exit();
});
<machine_name> cannot be localhost because phantomjs bypasses proxies when the host is localhost, so use your machine name or ip address instead.
To run it: phantomjs --proxy=localhost:3133 test.js
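If PhantomJS still seems to skip the proxy, the proxy type can be stated explicitly; --proxy-type=http is the default, so this is usually redundant, but it makes the intent clear:
phantomjs --proxy=localhost:3133 --proxy-type=http test.js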

Related

Node.js can't upload files to FTP when deployed and running on production server

I'm using Node.js (12.13.0) and npm (6.13.19) with basic-ftp. Everything works fine when I run the code on my development machine from localhost: I can upload files to the remote FTP server (without SSL; my remote FTP doesn't allow it).
The production server is hosted on Digital Ocean (Ubuntu 18.04.3). I tried disabling the firewall because I thought it might be the cause of the problem: I used sudo ufw disable, and just to make sure it's disabled I checked the current status with sudo ufw status, which returns Status: inactive.
This is my code:
const ftp = require("basic-ftp")

async function uploadImageToFtp(fileName, path) {
    const client = new ftp.Client()
    client.ftp.verbose = true
    try {
        await client.access({
            host: process.env.FTP_HOST,
            user: process.env.FTP_USER,
            password: process.env.FTP_PASSWORD,
            secure: false
        })
        await client.uploadFrom(path, "images/bd/" + fileName)
    } catch (err) {
        console.log(err)
    }
    client.close()
}
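For context, a call to the function above might look like this (the file name and local path are invented for illustration):
uploadImageToFtp("photo.jpg", "/tmp/photo.jpg")
    .then(() => console.log("upload attempt finished"))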
Response on production:
Connected to EXTERNAL_IP_ADDRESS
< 220 server ready - login please
Login security: No encryption
> USER username
< 331 password required
> PASS ###
Again, on localhost everything works: we get past this step and the file(s) start uploading to the same server with the same credentials.
After this I never get any response, except for a timeout with 502 Bad Gateway from my request.
I don't know the library, but the problem sounds like the FTP session is running in active mode. That can often be a problem behind firewalls and NAT, so if the session is in active mode, I'd recommend setting your client to request passive mode instead.
There may be an issue with AWS dynamic routing: your instance is inside a VPC, and NAT is not able to resolve the address from the FTP server back to your instance.
You can try adding a route entry in the IP table. Check here
This tells the NAT to resolve a particular FTP server to a specific address. I hope this helps.

Splash does not connect to proxy using any of the 3 ways described in documentation

The Splash browser does not send anything through the HTTP proxy; the pages are fetched even when the proxy is not running.
I am using Scrapy with Splash in Python 3 to fetch pages after authentication for an Angular.js website. The script is able to fetch pages, authenticate, and fetch pages after authentication. However, it does not use the proxy set up at localhost:8090, and Wireshark confirms that traffic coming from port 8050 goes to some port in the 50k range.
The setup is
- splash running locally on a docker image (latest) on port 8050
- python 3 running locally on a mac
- Zap proxy running locally on a mac at port 8090
- Web page accessed through VPN
I have tried specifying the proxy host:port through the server's UI in Chrome with a Lua script; the page is fetched without the proxy.
I have tried specifying the proxy in the Python script both with Lua and with the API (args={'proxy': 'host:port'}), and the page is fetched without using the proxy.
I have tried using the proxy profile file, and I get status 502.
Proxy set through Lua on Chrome (no error, not proxied):
function main(splash, args)
    splash:on_request(function(request)
        request:set_proxy{
            host = "127.0.0.1",
            port = 8090,
            username = "",
            password = "",
            type = "HTTP"
        }
    end)
    assert(splash:go(args.url))
    assert(splash:wait(0.5))
    return {
        html = splash:html(),
        png = splash:png(),
        har = splash:har(),
    }
end
req = SplashRequest("http://mysite/home", self.log_in,
                    endpoint='execute', args={'lua_source': script})
Proxy set through api (status 502):
req = SplashRequest("http://mysite/home",
                    self.log_in, args={'proxy': 'http://127.0.0.1:8090'})
Proxy set through Lua in Python (no error, not proxied):
def start_requests(self):
    script = """
    function main(splash, args)
        assert(splash:go(args.url))
        assert(splash:wait(0.5))
        splash:on_request(function(request)
            request:set_proxy{
                host = "127.0.0.1",
                port = 8090,
                username = "",
                password = "",
                type = "HTTP"
            }
        end)
        return {
            html = splash:html(),
            png = splash:png(),
            har = splash:har(),
        }
    end
    """
    req = SplashRequest("http://mysite/home", self.log_in,
                        endpoint='execute', args={'lua_source': script})
    # req.meta['proxy'] = 'http://127.0.0.1:8090'
    yield req
Proxy set through proxy file in docker image (status 502):
proxy file:
[proxy]
; required
host=127.0.0.1
port=8090
Shell command:
docker run -it -p 8050:8050 -v ~/Documents/proxy-profile:/etc/splash/proxy-profiles scrapinghub/splash --proxy-profiles-path=/etc/splash/proxy-profiles
All of the above should display the page in the ZAP proxy at port 8090.
Some of the above seem to set the proxy, but the proxy can't reach localhost:8090 (status 502). Some don't work at all (no error, not proxied). I think this may be related to the fact that a Docker image is being used.
I am not looking to use Selenium, because that is what this is replacing.
All of the methods returning status 502 were actually working correctly; the reason for this issue is that Docker containers cannot access localhost on the host. To resolve it on a Mac host, use http://docker.for.mac.localhost:8090 as the proxy host:port. On Linux, start Splash with
docker run -it --network host scrapinghub/splash
and keep localhost:port as the proxy; with --network host the -p flag has no effect, since all of the container's services are already on the host's localhost.
Method 2 is best for a single proxy without rules. Method 4 is best for multiple proxies with rules.
I did not try other methods to see what they would return with these changes and why.
Alright I have been struggling with the same problem for a while now, but I found the solution for your first method on GitHub, which is based on what the Docker docs state:
The host has a changing IP address (or none if you have no network access). From 18.03 onwards our recommendation is to connect to the special DNS name host.docker.internal, which resolves to the internal IP address used by the host.
The gateway is also reachable as gateway.docker.internal.
This means you should/could use "host.docker.internal" as the host for your proxy instead, e.g.:
splash:on_request(function(request)
    request:set_proxy{
        host = "host.docker.internal",
        port = 8090
    }
end)
Here is the link to the explanation: https://github.com/scrapy-plugins/scrapy-splash/issues/99#issuecomment-386158523

How can I access my nodejs web server from my local computer using the server domain name?

I installed nodejs and created a sample app. When I run npm start I get a message saying that I can open my web browser to http://localhost:3000 to see the app in action, but this installation is on a web server, not my local computer, so instead of localhost:3000 I want to get there using something like mydomain.com:3000.
I can't find the answer, it's very likely I just don't know how to search for it... any ideas?
I'm following the tutorial here: https://facebook.github.io/react/tutorial/tutorial.html
I think I only needed to get away from this for a while. I got it working using ssh local forwarding.
I already used an ssh config file to log in to my server without having to remember the password, so I just added this line to my config file:
LocalForward localhost:3000 xxx.xxx.xxx.xxx:3000
where xxx.xxx.xxx.xxx is my server IP address.
Then, I connected to my server via ssh:
ssh -f -N mysite
Once connected, I open up the browser and go to localhost:3000 and there it is now.
I used my ssh config file, but it should also work without it:
ssh -f -N -L 3000:localhost:3000 mydomain.com
I found this command that eventually led me to solve my problem in this link: http://stuff-things.net/2016/01/20/tunneling-to-localhost/

Run node.js on cpanel hosting server

Here is some simple Node.js code.
var http = require('http');
http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World!');
}).listen(8080);
I uploaded it to a cPanel hosting server, installed Node.js, and ran it.
On a normal server I could check the script's result by visiting http://{serverip}:8080. But this cPanel server hosts a domain and subdomains, and every domain is mapped to its own site; even http://{serverip} is not a valid URL.
How can I access my node.js result?
Kindly teach me.
Thanks.
bingbing.
Install/Set Up Node.js with cPanel
1. Log in to your account using SSH (if it is not enabled for your account, contact the support team).
2. Download Node.js
wget https://nodejs.org/dist/latest/node-v10.0.0-linux-arm64.tar.xz
3. Extract the Node.js files
tar xvf node-v10.0.0-linux-arm64.tar.xz
4. Now rename the folder to "nodejs". To do this, type the following command:
mv node-v10.0.0-linux-arm64 nodejs
5. Now to install the node and npm binaries, type the following commands:
mkdir ~/bin
cp nodejs/bin/node ~/bin
cd ~/bin
ln -s ../nodejs/lib/node_modules/npm/bin/npm-cli.js npm
6. Node.js and npm are installed on your account. To verify this, type the following commands
node --version
npm --version
The ~/bin directory is in your path by default, which means you can run node and npm from any directory in your account.
7. Start Node.js Application
nohup node my_app.js &
8. Stop the Application
pkill node
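Note that pkill node matches every Node.js process owned by the account. If pkill's -f flag is available on the server, you can target just this app by matching the script name instead:
pkill -f my_app.js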
9. Integrating a Node.js application with the web server (optional)
Depending on the type of Node.js application you are running, you may want to be able to access it using a web browser. To do this, you need to select an unused port for the Node.js application to listen on, and then define server rewrite rules that redirect visitors to the application.
In a text editor, add the following lines to the .htaccess file in the /home/username/public_html directory, where username represents your account username:
RewriteEngine On
RewriteRule ^$ http://127.0.0.1:XXXXX/ [P,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ http://127.0.0.1:XXXXX/$1 [P,L]
In both RewriteRule lines, replace XXXXX with the port on which your Node.js application listens.
To run a Node.js application on a managed server, you must select an unused port, and the port number must be between 49152 and 65535(inclusive).
Save the changes to the .htaccess file, and then exit the text editor. Visitors to your website are redirected to the Node.js application listening on the specified port.
If your application fails to start, the port you chose may already be in use. Check the application log for error codes like EADDRINUSE that indicate the port is in use. If it is, select a different port number, update your application’s configuration and the .htaccess file, and then try again.
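To make the two pieces line up, here is a minimal sketch of the application side, assuming an arbitrary example port of 52345 standing in for the XXXXX placeholder above:
var http = require('http');

// Apache proxies matching requests to 127.0.0.1 on the port from .htaccess,
// so the app only needs to listen on the loopback interface.
var PORT = 52345; // example only; must match XXXXX in the rewrite rules

http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello from Node.js behind Apache\n');
}).listen(PORT, '127.0.0.1');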
cPanel typically runs Apache or another web server that is shared among all the cPanel/unix accounts. The web server listens on port 80. Depending on the domain name in the requested URL, the web server uses "Virtual Hosting" to figure out which cPanel/unix account should process the request, i.e. in which home directory to find the files to serve and scripts to run. If the URL only contains an IP address, cPanel has to default to one of the cPanel accounts.
Ordinarily, without root access, a job run by a cPanel account cannot listen on port 80. Indeed, the available ports might be quite restrictive. If 8080 doesn't work, you might try 60000. To access a running node.js server, you'll need to have the port number it's listening on. Since that is the only job listening on that port on that server, you should be able to point your browser to the domain name of any of the cPanel accounts or even the IP address of the server, adding the port number to the URL. But, it's typical to use the domain name for the cPanel account running the node.js job, e.g. http://cPanelDomainName.com:60000/ .
Of course port 80 is the default for web services, and relatively few users are familiar with optional port numbers in URLs. To make things easier for users, you can use Apache to "reverse proxy" requests on port 80 to the port that the node.js process is listening on. This can be done using Apache's RewriteRule directive in a configuration or .htaccess file. This reverse proxying of requests arguably has other benefits as well, e.g. Apache may be a more secure, reliable and manageable front-end for facing the public Internet.
Unfortunately, this setup for node.js is not endorsed by all web hosting companies. One hosting company that supports it, even on its inexpensive shared hosting offerings, is A2Hosting.com. They also have a clearly written description of the setup process in their Knowledge Base.
Finally, it's worth noting that the developers of cPanel are working on built-in node.js support. "If all of the stars align we might see this land as soon as version 68," i.e. perhaps early 2018.
References
Apache Virtual Hosting - http://httpd.apache.org/docs/2.4/vhosts/
Apache RewriteRule Directive - http://httpd.apache.org/docs/2.4/mod/mod_rewrite.html
A2Hosting.com Knowledge Base Article on Configuring Node.js - https://www.a2hosting.com/kb/installable-applications/manual-installations/installing-node-js-on-managed-hosting-accounts
cPanel Feature Request Thread for node.js Support - https://features.cpanel.net/topic/nodejs-hosting
Related StackOverflow Questions
How to host a Node.Js application in shared hosting
Why node.js can't run on shared hosting?
Yes, it's possible, but it has a few dependencies which may or may not be supported by your cPanel hosting provider or the plan you opted for.
The steps below are just for demo purposes. If you are a student or just want to play with it, you can try them out. I'm not a security expert, so I really can't say how good this is from a security point of view.
With that said, let's see how I configured it. I have a Hostinger cPanel hosting subscription, and the steps are as follows:
Enable SSH access
Connect to the shared machine via SSH
Check your Linux distro, then download & set up Node.js
In my case the commands for that are as follows.
Download node & extract it using curl:
curl https://nodejs.org/dist/v12.18.3/node-v12.18.3-linux-x64.tar.gz |tar xz
This will download & extract node and create a directory; you can confirm it with the ls command.
At this point you can check the versions.
The node command works as-is, but for npm we have to invoke it as follows:
./node-v12.18.3-linux-x64/bin/node ./node-v12.18.3-linux-x64/lib/node_modules/npm/bin/npm-cli.js --version
Further, we can create aliases to make life a little easier; a sketch is below. I tried using bashrc/bash_profile for this, but somehow it didn't work.
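For reference, the aliases might look roughly like this (the paths assume the node-v12.18.3-linux-x64 directory extracted above):
alias node="$HOME/node-v12.18.3-linux-x64/bin/node"
alias npm="$HOME/node-v12.18.3-linux-x64/bin/node $HOME/node-v12.18.3-linux-x64/lib/node_modules/npm/bin/npm-cli.js"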
And that's all: a Node.js server running on a shared cPanel machine.
Next, I wanted Express-based REST API support. The problem is that the API will be hosted locally on whatever port I give it. Check the example below:
var express = require('express')
var app = express()

app.get('/', function (req, res) {
    res.send('hosting node js base express api using php & shared hosting a great way to start yjtools')
})

console.log("listening yjtools node server on port 49876...")
app.listen(49876)
The problem here is that even though it will execute, I will not be able to access it over the network. This is because only fixed, predefined ports (like 80, 21, 3306, etc.) are allowed/open on the shared cPanel machine, so the Express app I hosted is only available locally on port 49876.
Let's see what we have:
An Express-based app hosted locally on the cPanel machine.
A PHP-based Apache server available over HTTP/HTTPS.
So we can make use of PHP, a rewrite rule, and curl to bridge the gap.
Following are the changes I made to get it working:
In the .htaccess file add a rewrite rule; say domain/api is what I want my REST API path to be.
RewriteRule api/(.*)$ api/api.php?request=$1 [QSA,NC,L]
In the api/api.php file (this is the path I chose; you can choose any path):
<?php
echo "Hello " . $_REQUEST['username'];
echo '<hr>';

// Open a connection to the Express app listening locally;
// the port must match the one passed to app.listen() above.
$curl = curl_init('http://127.0.0.1:49876/');
curl_setopt($curl, CURLOPT_HEADER, 1);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);

// Get the full response
$resp = curl_exec($curl);
if ($resp === false) {
    // If it couldn't connect, try increasing usleep
    echo 'Error: ' . curl_error($curl);
} else {
    // Split response headers and body
    list($head, $body) = explode("\r\n\r\n", $resp, 2);
    $headarr = explode("\n", $head);
    // Print headers
    foreach ($headarr as $headval) {
        header($headval);
    }
    // Print body
    echo $body;
}
// Close connection
curl_close($curl);
?>
Then, at the SSH prompt, just run the app.js file:
node api/app.js
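With the app running, the PHP bridge can be exercised end to end; the domain below is a placeholder for whatever domain the cPanel account serves:
curl "http://yourdomain.com/api/?username=test"
This should print the PHP greeting followed by whatever the Express app returns.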
Here is a similar approach I referred to for my program; we can also make this Node call via PHP itself.
Now I have Express-based REST API support, a hosted Angular app, and MySQL for the database, all on cPanel.
You can use any domain pointed at that cPanel server: instead of http://server-ip:8080, try accessing http://domain.tld:8080. By default cPanel does not bind to port 8080. Be sure to check whether there is a firewall on the server; if there is, allow incoming connections on TCP port 8080. Depending on your WHM server configuration, it should also work with http://server-ip:8080.
cPanel version 80 has Node.js 10.x support: https://documentation.cpanel.net/display/80Docs/80+Release+Notes#id-80ReleaseNotes-InstallanduseNode.jsapplications
Install and use Node.js applications
You can now install and use Node.js applications on your server. To use Node.js, install the ea-nodejs10 module in the Additional Packages section of WHM's EasyApache 4 interface (WHM >> Home >> Software >> EasyApache 4).
You can register Node.js applications in cPanel's Application Manager interface (cPanel >> Home >> Software >> Application Manager). For more information, read our Guide to Node.js Installations documentation.
For Application Manager to be enabled: https://documentation.cpanel.net/display/80Docs/Application+Manager
Your hosting provider must enable the Application Manager feature in WHM's Feature Manager interface (WHM >> Home >> Packages >> Feature Manager).
Your hosting provider must install the following Apache modules:
The ea-ruby24-mod_passenger module. Note: This module disables Apache's mod_userdir module.
The ea-apache24-mod_env module. Note: This module allows you to add environment variables when you register your application. For more information about environment variables, read the Environment Variables section below.
The ea-nodejs10 module if you want to register a Node.js™ application.
You can see what Application Manager looks like in this YouTube video:
https://www.youtube.com/watch?v=ATxMYzLbRco
For anyone who wants to know how to deploy a Node.js app to cPanel, this is a good source; it explains the whole process thoroughly, so please check it.

SCP File from local to Heroku Server

I'd like to copy my config.yml file from my local Django app directory to my Heroku server, but I'm not sure how to get the user@host.com format for Heroku.
I've tried running heroku run bash and then
scp /home/user/app/config.yml
but I'm not sure how to get it into the
scp user@myhost.com:/home/user/dir1/file.txt user@myhost.com:/home/user/dir2
format.
As @tamas7 said, it's firewalled, but your local machine is probably also firewalled. So unless you have a private server with SSH accessible from the Internet, you won't be able to scp.
I'm personally using transfer.sh free and open source service.
Upload your config.yml to it:
$ curl --upload-file ./config.yml https://transfer.sh/
https://transfer.sh/66nb8/config.yml
Then download it back from wherever you want:
$ wget https://transfer.sh/66nb8/config.yml
According to http://www.evans.io/posts/heroku-survival-guide/ incoming connections are firewalled off. In this case you need to approach your local machine from the Heroku server.
heroku run bash
scp user@mylocalmachine:/home/user/dir/file.txt .
This is a bit late to answer this question, but I use services like localtunnel - https://localtunnel.github.io/www/ to copy files from local machine to heroku.
First, run a python HTTP server in the directory where the file is located.
cd /path/to/file
python3 -m http.server
This starts a server in port 8000. Configure localtunnel to connect to that port.
lt -s mylocal -p 8000
Now from your heroku machine, you can fetch the file via curl.
curl -XGET http://mylocal.localtunnel.me/myfile.txt > myfile.txt
You could also use a service like https://ngrok.com/ to open up a TCP tunnel into your local machine.
You will need to enable Remote Login as in simlmx answer.
On your local machine open the TCP tunnel just like this:
$ ngrok tcp 22
And then, on the Heroku console, just use SCP with the PORT and HOST that Ngrok provided.
$ scp -P [PORT] username@[HOST]:~/path/to/file.ext .
If you need to download your entire repo, for example to recover an app that you no longer have locally, use heroku git:clone -a myapp. Docs.
Expanding on tamas7's answer:
You can connect to your computer from the heroku server.
If your computer is behind a router, you'll also need to forward the connection to your computer.
1. Your computer must accept SSH connections
On my mac it was as simple as enabling it in the Preferences / Sharing panel.
2. Your router needs to forward the connection to your computer.
Go to your router's settings page in your browser (typically 192.168.0.1 but varies depending on the router). Find the port forwarding section and forward some port to your computer on port 22.
On my TP-Link router, I made sure that port 22000 is forwarded to my computer (192.168.0.110) on port 22.
3. Find your external IP
Simply google "what is my IP".
4. Scp your file from heroku
heroku run bash
scp -P 22000 your_user@your_external_IP:/path/to/your/file .
5. Undo everything!
Once you're done it's probably good practice to disable the port forwarding and remote login.
