Chrome extension chrome.proxy: only proxy some websites - google-chrome-extension

The bypassList in chrome.proxy excludes the listed sites from the proxy, but what I need is the opposite: proxy only a few websites of my choosing (say, 4 sites) and let everything else bypass the proxy. Building that as a bypass list would be hell.
Can I do that with chrome.proxy, so that only some websites get proxied?

The documentation provides an example where you specify a PAC script:
var config = {
  mode: "pac_script",
  pacScript: {
    data: "function FindProxyForURL(url, host) {\n" +
          "  if (host == 'foobar.com')\n" +
          "    return 'PROXY blackhole:80';\n" +
          "  return 'DIRECT';\n" +
          "}"
  }
};
chrome.proxy.settings.set(
  {value: config, scope: 'regular'},
  function() {});
The example does exactly what you require for one domain, and it is straightforward to extend to several domains.
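One way to extend it is to generate the PAC script from a list of hosts. A sketch, where the four hostnames and the proxy address are placeholders for your own:

```javascript
// Hosts to proxy; everything else stays DIRECT. The four names and the
// proxy address below are placeholders -- substitute your own.
var proxiedHosts = ['site1.example', 'site2.example', 'site3.example', 'site4.example'];

// Build a PAC script that proxies only the listed hosts (and their subdomains).
var pacData =
  "function FindProxyForURL(url, host) {\n" +
  "  var hosts = " + JSON.stringify(proxiedHosts) + ";\n" +
  "  for (var i = 0; i < hosts.length; i++) {\n" +
  "    if (host === hosts[i] || dnsDomainIs(host, '.' + hosts[i]))\n" +
  "      return 'PROXY myproxy.example:8080';\n" +
  "  }\n" +
  "  return 'DIRECT';\n" +
  "}";

var config = { mode: "pac_script", pacScript: { data: pacData } };

// Apply only when running inside the extension.
if (typeof chrome !== 'undefined' && chrome.proxy) {
  chrome.proxy.settings.set({ value: config, scope: 'regular' }, function () {});
}
```

Note that dnsDomainIs is a standard PAC helper evaluated by the browser, so it is only available inside the generated script.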

Related

req.url that does not belong to my domain

I got requests from China.
The problem is that the req.url does not belong to my domain.
Usually it is / or /login, etc.
Was I hacked?
I'd like to know the rationale for this.
const logger = function (req, res, next) {
  let { url, ip, method, statusCode } = req
  console.log(
    `${moment().format("L - hh:mm:ss")} `.red +
    `${method} `.green +
    `From: ` + `${ip?.replace("::ffff:", "")?.replace("::1", "localhost")} (${geoip.lookup(ip)?.country || "No IP"})`.cyan +
    ` : ` + `${req.user?.id || null} `.yellow +
    `At: ` + `${url} `.cyan
  )
  next()
}
app.use(logger)
No, somebody merely sent proxy requests to you. They are directed at your server but request a full URL instead of a path; on the wire they look like GET http://google.com/ HTTP/1.1 instead of the usual GET / HTTP/1.1. If your server were (mis)configured to honor such requests as a proxy, it would itself send another request to http://google.com/ and forward the response, but that doesn't happen in your case anyway, so you can just ignore them.
See also this answer.
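If you want to keep such noise out of your logs, one sketch (the function names below are mine, not from the question) is to recognize absolute-form request targets, which ordinary browser requests to your own server never use:

```javascript
// Flags absolute-form request targets (e.g. "GET http://google.com/")
// that stray proxy traffic produces; normal requests start with "/".
function isProxyStyleRequest(url) {
  return /^https?:\/\//i.test(url);
}

// Example Express-style middleware that short-circuits such requests
// before they reach the logger or any routes.
function rejectProxyRequests(req, res, next) {
  if (isProxyStyleRequest(req.url)) {
    res.statusCode = 400;
    return res.end('This server is not an open proxy');
  }
  next();
}
```

Registered with app.use(rejectProxyRequests) ahead of the logger, these requests never make it into the log output.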

Express endpoint works in Postman but not in my app

I am working on an application with both a front-end and a back-end. The front uses Angular, and the back Node.js (Express). They communicate with a Rest API.
When I request this endpoint with Postman or directly in my web browser, it works and I get the right result.
However, when I request it from one component of my app, it just doesn't work.
The call is here
console.log("Call GETSAVE");
this.rest.getSave(type, file).subscribe((data) => {
  console.log(data);
});
Then, in my service
getSave(type: enumType, path: string) {
  console.log(this.url + 'saves/' + type + path);
  return this.http.get(this.url + 'saves/' + type + path);
}
with http being an instance of HttpClient (injected in the constructor).
All the endpoints of this service work, and I call them the same way as above, but for this endpoint the subscribe callback never fires.
I finally found the answer, and it's quite simple.
By default, HttpClient parses the response body as JSON. In my case, the response was not JSON but plain text.
As a consequence, parsing failed and the request errored out instead of succeeding.
To fix this, you need to add a responseType option to the request.
In my case:
getSave(type: enumType, path: string) {
  console.log(this.url + 'saves/' + type + path);
  return this.http.get(this.url + 'saves/' + type + path, {
    responseType: 'text'
  });
}
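The failure mode can be reproduced outside Angular: with the default responseType of 'json', the body is run through JSON.parse, and plain text makes that throw, so the error path is taken and the success callback never runs. A minimal illustration (the sample body is made up):

```javascript
// A plain-text body, such as a saved file's contents, is not valid JSON.
var plainTextBody = 'saved-file-contents';

var parsed = null;
var parseError = null;
try {
  // This mirrors what the default responseType: 'json' does to the body.
  parsed = JSON.parse(plainTextBody);
} catch (e) {
  // The request errors out instead of delivering data.
  parseError = e;
}
```

Subscribing with an error handler as well as a next handler would have surfaced this parse error immediately instead of failing silently.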

Show the available image in Node.js (when traffic limit exceeded)

I have an image hosting server with a traffic limit of 600 MB per day.
I can also link the image through Google Drive; that is slower, but has no traffic limit.
So I want my Node.js server to return the URL of whichever image is currently available.
My image hosting domain is image.com and my Node.js server's domain is node.com.
I insert the image in HTML like this:
<img src="http://node.com/image">
and node.js server's code is like below.
app.all('/image', function (req, res) {
  if (/* should check whether 'http://image.com/' is available now */)
    res.redirect('http://image.com/image.png');
  else // the case where image.com is not available
    res.redirect('http://googledrive/imagesourceURL');
});
So how can I know whether image.com is available right now?
If I check by requesting the real image file, that uses the image host's traffic, so I don't think it's a good idea.
What should I put inside if (/* should check whether 'http://image.com/' is available now */)?
Do you have any ideas?
In order to check whether an HTTP resource is available, make a request for it from your Node.js server.
In your case you should redirect to that URL if the status is 200, and to the other URL otherwise. Using a HEAD request means the image body itself is never transferred, which keeps the check cheap.
It would be something like:
const http = require('http');

app.all('/image', function (req, res) {
  // HEAD request: checks availability without downloading the image body
  const check = http.request(
    { method: 'HEAD', host: 'image.com', path: '/image.png' },
    function (imgRes) {
      if (imgRes.statusCode === 200) {
        res.redirect('http://image.com/image.png');
      } else {
        res.redirect('http://googledrive/imagesourceURL');
      }
    }
  );
  check.on('error', function () {
    res.redirect('http://googledrive/imagesourceURL');
  });
  check.end();
});

How to obtain the system proxy settings from chrome extension?

I need to make customized proxy settings for my Chrome extension so that traffic to a specific domain goes through a proxy server while all other (user) traffic goes out normally with the default system settings. According to the Chrome API documentation, the only way is to use pac_script (correct me if I am wrong). Thus, the code will be something like this:
var config = {
  mode: "pac_script",
  pacScript: {
    data: "function FindProxyForURL(url, host) {\n" +
          "  if (dnsDomainIs(host, 'mydomain.com'))\n" +
          "    return 'SOCKS5 10.0.0.1:1234';\n" +
          "  return 'DIRECT';\n" +
          "}"
  }
};
chrome.proxy.settings.set({value: config, scope: 'regular'});
However, pac_script does not offer an option to route traffic through the system proxy settings (only 'DIRECT', which skips them entirely). I thought of obtaining the system proxy settings using chrome.proxy.settings.get, but that function returns an object where mode = system, with no useful information.
Does anyone know how to obtain the system proxy settings from extension? Or has suggestions to handle the original problem?
You can get it with
chrome.proxy.settings.get(
  {'incognito': false},
  function(config) {
    console.log(JSON.stringify(config));
  });
It should show you something like this:
{
  "levelOfControl": "controlled_by_this_extension",
  "value": {
    "mode": "pac_script",
    "pacScript": {
      "data": "function FindProxyForURL(url, host) {\n  return \"PROXY 10.0.0.1:1234;\";\n}",
      "mandatory": false
    }
  }
}
Read the docs for more details:
https://developer.chrome.com/extensions/proxy

How to set global variables from the command line

Thanks for so many fast responses.
I used Node.js (v4.3.2) and Express (v4.x) to build a website.
I use AJAX a lot, and all AJAX URLs point to one static IP (the AWS server itself).
Because I will deploy to several servers, I don't want to change the AJAX URLs separately.
My idea: when I run the command "node bin/www", can I change it to "node bin/www 50.50.50.50" (my AWS address) and have all AJAX URLs set to the right IP?
Is that possible, or is there an alternative solution?
Thanks
Your issue is related to CORS: basically, you cannot access a domain http://www.example1.com from http://www.example2.com via AJAX unless http://www.example1.com explicitly allows it in the response.
This is a security feature in most modern browsers. You won't encounter this problem using command-line tools such as curl or the Chrome extension Postman.
To fix this, make sure the domain requesting the data (http://www.example2.com) is allowed in the Access-Control-Allow-Origin header of your server's response, as well as the HTTP verb (GET, POST, PUT... or * for all HTTP methods).
It all comes down to adding the two following headers to the http://www.example1.com server's response:
Access-Control-Allow-Origin: http://www.example2.com
Access-Control-Allow-Methods: *
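In an Express app like yours, those two headers can be set with a small middleware. A sketch (allowCors is my name for it; the origin is the example domain from above):

```javascript
// Middleware that adds the two CORS headers described above to every response.
function allowCors(req, res, next) {
  res.setHeader('Access-Control-Allow-Origin', 'http://www.example2.com');
  res.setHeader('Access-Control-Allow-Methods', '*');
  next();
}

// Register it before your routes, e.g.:
// app.use(allowCors);
```

For production use, the cors package on npm covers the same ground with more options (preflight handling, multiple origins).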
Edit
Following #Paulpro's comment, you have to rebase all your URLs so that they reach your server's IP instead of your localhost server.
I fixed this problem.
First, in bin/www, append this code to retrieve the URL from the command line and store it in a JSON file:
function setUpURL() {
  var myURL = process.argv[2];
  console.log('myURL: ', myURL);
  var outputFilename = 'public/myURL.json';
  var myData = {
    port: myURL
  };
  fs.writeFile(outputFilename, JSON.stringify(myData, null, 4), function(err) {
    if (err) {
      console.log(err);
    } else {
      console.log("JSON saved to " + outputFilename);
    }
  });
}
Then, in each JS file containing AJAX calls, add this code at the top of the file (it needs to load before the AJAX calls):
var myURL;
$.getJSON("myURL.json", function(json) {
  myURL = json.port.toString();
  console.log(myURL);
});
The new AJAX format:
$.ajax({
  type: "POST",
  url: myURL + "/forgetPwd",
  data: sendData,
  success: function(data) {
    window.location.replace(myURL);
  },
  error: function(data) {
  }
});
Finally, you can run the server:
node bin/www your_aws_ip
It works for me now. Hope this helps.
By the way, be careful about the path of myURL.json.
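As an alternative to a positional argument, the server address could come from an environment variable, which is a common convention for per-deployment configuration (MY_URL is an assumed variable name, not from the question):

```javascript
// Read the AJAX base URL from the environment, falling back to localhost.
// Run as:  MY_URL=http://50.50.50.50 node bin/www
var myURL = process.env.MY_URL || 'http://localhost:3000';
console.log('AJAX base URL:', myURL);
```

The rest of the setup (writing myURL.json so the browser can read the address) would stay the same, just sourcing myURL from the environment instead of process.argv[2].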
