The issue with embedded images, google images proxy, ...? - gmail

My app sends HTML email with embedded images (data URI scheme) to addresses like *@gmail.com.
All the desktop clients available to me show the mail correctly, but when I use the web interface some images (not all) are broken.
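For context, here is a simplified sketch of how such a message might be built (not the app's exact code; the addresses, SMTP server, and image payload are illustrative, using System.Net.Mail):

    using System;
    using System.IO;
    using System.Net.Mail;

    // Build an HTML mail whose <img> carries the image inline as a data URI
    // rather than as a separate MIME part.
    class DataUriMailSketch
    {
        static void Main()
        {
            string pngBase64 = Convert.ToBase64String(File.ReadAllBytes("chart.png"));
            MailMessage message = new MailMessage("app@example.com", "someone@gmail.com");
            message.Subject = "Report";
            message.IsBodyHtml = true;
            message.Body = "<html><body><img src=\"data:image/png;base64," + pngBase64 + "\" /></body></html>";

            using (SmtpClient smtp = new SmtpClient("smtp.example.com"))
            {
                smtp.Send(message);
            }
        }
    }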
My investigation turned up the following:
The issue is observed not only in Chrome, but in FF and IE too.
The Gmail web interface replaces the src attribute of embedded images, rewriting the data URI into an HTTP URL that loads the image from Google's proxy servers.
In my case Gmail uses two external servers for loading images: gm1.ggpht.com and mail.google.com.
All images served from gm1.ggpht.com load properly with status 200 and no redirection.
All images that try to load from mail.google.com are broken. The response from mail.google.com has status 302 (redirect) with a Location like https://gm1.ggpht.com/...
Opening that new location directly in the browser returns status 403 (Forbidden).
Any ideas?

The address was blocked by the Proxy server on the network you use.

Related

How to block uploads in a specific browser without a proxy

I have found Titanium Web Proxy and used it to block uploads in the browser.
But the requirement is to block all uploads in a specific browser (Chrome, Edge, ...) without a proxy.
Can anyone help me with this topic, or suggest which keywords to search for to find example code?
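For reference, here is a rough sketch of the kind of proxy-based blocking Titanium Web Proxy supports (not my exact code; the event and type names follow the library's published samples and may differ by version, and the blocking rule is illustrative):

    using System;
    using System.Net;
    using System.Threading.Tasks;
    using Titanium.Web.Proxy;
    using Titanium.Web.Proxy.EventArguments;
    using Titanium.Web.Proxy.Models;

    class UploadBlockerSketch
    {
        static void Main()
        {
            ProxyServer proxyServer = new ProxyServer();
            proxyServer.BeforeRequest += OnRequest;
            proxyServer.AddEndPoint(new ExplicitProxyEndPoint(IPAddress.Any, 8000, true));
            proxyServer.Start();

            Console.ReadLine();   // keep the proxy alive until Enter is pressed
            proxyServer.Stop();
        }

        // Reject anything that looks like an upload (requests that carry a body).
        static async Task OnRequest(object sender, SessionEventArgs e)
        {
            string method = e.HttpClient.Request.Method;
            if (method == "POST" || method == "PUT")
            {
                e.Ok("<html><body>Uploads are blocked.</body></html>");
            }
            await Task.CompletedTask;
        }
    }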

Application Request Routing: Why some sites return HTTP 404 some don't?

I want to add a load balancer to an existing ASP.NET project using Application Request Routing, so I familiarized myself with the concepts and created a local test setup:
IIS locally running on Windows 10:
Installed Application Request Routing 3.0 with the Web Platform Installer
Created server farm with following servers:
<test-server-name>.de (Microsoft 2012 R2 Server: contains the asp.net project)
www.google.com (just to see if load balancing and url rewriting works because I don't have two test servers available)
URL-Rewriting rule:
After requesting localhost multiple times in any browser, I can see that load balancing (weighted round robin) is working fine: requests alternate between the first and the second website.
The problem I'm facing is a 404 error on both websites.
I already tried the following:
Installing and enabling Failed Request Tracing rules (on the local IIS): URL rewriting is working properly, I think.
Checking the failed request log for www.google.com (shared via Google Drive; unzip and open the XML in e.g. IE for a better view).
Creating the server farm without automatic creation of URL Rewrite rules
(selecting No and creating my own URL Rewrite rule).
Changing the "Managed Pipeline Mode" setting of the Application Pool from Integrated to Classic.
Running a health check on other websites: I have absolutely no clue why it works on Git websites and why Facebook returns a 400 error code.
Enabling/disabling the proxy (IIS Manager -> Application Request Routing Cache -> Server Proxy Settings...).
I don't know what I could do next, so I appreciate any help. Thanks.
Answer can be found here: https://forums.iis.net/t/1238739.aspx?Why+some+sites+return+HTTP+404+some+don+t+
Some websites simply don't accept localhost as a host name, which is why localhost can't be found (error 404), e.g. on google.com.
Detailed answer, in case the link above stops working in the future:
That is not an effective test.
What you are doing is sending the host name of your request to the third-party servers, like Google.
So if your request is, say, http://example.com, you are forwarding that host name to www.google.com, and the Google servers will likely reject it, as you can see.
Web server admins generally don't accept traffic for domains they do not host.
If you sent a request to my server's IP with the host name mysite.com, I too would likely reject it. (Things get more complex if you have wildcard sites and allow all traffic through.)
But simply getting that 404 page from Google means your request hit their server, which implies ARR is working.
If you really wanted to test it this way, add a hosts-file entry resolving www.google.com to your server's IP and set up a site with www.google.com as the host header; then you should see the correct info hitting Google. But there is no accounting for what third-party admins do on their side.
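If you want to see this effect without ARR in the middle, something like the following (a rough sketch, assuming a recent .NET with top-level statements) sends a request to www.google.com while carrying a foreign Host header, which is essentially what the ARR setup above forwards:

    using System;
    using System.Net.Http;

    // Ask Google's servers for a page while claiming a host name they do not serve;
    // the status code shows how a third party treats a foreign Host header.
    using var client = new HttpClient();
    var request = new HttpRequestMessage(HttpMethod.Get, "https://www.google.com/");
    request.Headers.Host = "localhost";            // the host name the proxy would forward
    var response = await client.SendAsync(request);
    Console.WriteLine((int)response.StatusCode);   // expect a 404 or similar rejection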

Can HTTPS web pages be permitted to load data over HTTP?

If a web page that's only served over HTTPS tries to load data (e.g. JSON) that's only available over "insecure" HTTP, Chrome blocks the request with a message that "This page is trying to load scripts from unauthenticated sources".
Is there a meta tag that can be added to the HTML page to override this, allowing the data to be loaded?
This is up to the browser and user now. It's not something you should try to disable.
Here's what you can do:
Change your external URLs to https if the external servers support it
Copy external scripts and serve them from your local server, if possible
If the above are not possible, you will need to set up a reverse proxy and serve the content from there. E.g. if the external content is at http://external.com/script.js, then change the URL to https://me.com/proxy/external.com/script.js, and have your proxy grab the insecure content and return it as required. A minimal sketch of such an endpoint follows below.
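For illustration only, here is a minimal sketch of such a proxy endpoint in ASP.NET Core (the /proxy/{host}/{*path} route and the me.com mapping are assumptions; there is no caching, header forwarding, or host allow-listing here, which a real deployment would need):

    // Program.cs: /proxy/external.com/script.js fetches http://external.com/script.js
    // server-side and returns it to the page over the site's own HTTPS origin.
    var builder = WebApplication.CreateBuilder(args);
    builder.Services.AddHttpClient();
    var app = builder.Build();

    app.MapGet("/proxy/{host}/{*path}", async (string host, string path, IHttpClientFactory factory) =>
    {
        var client = factory.CreateClient();
        var upstream = await client.GetAsync($"http://{host}/{path}");
        var body = await upstream.Content.ReadAsByteArrayAsync();
        var contentType = upstream.Content.Headers.ContentType?.ToString() ?? "application/octet-stream";
        return Results.Bytes(body, contentType);
    });

    app.Run();

You would also want to restrict which upstream hosts the endpoint is allowed to fetch; otherwise it becomes an open proxy.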

Tracking down X-Frame-Options header

We've partnered with a company whose website will display our content in an IFRAME. I understand what the header is, what it does, and why; what I need help with is tracking down where it's coming from!
Windows Server 2003/IIS6
Container page: https://testDomain.com/test.asp
IFRAME Content: https://ourDomain.com/index.asp?lots_of_parameters,_wheeeee
Testing in Firefox 24 with Firebug installed. (IE and Chrome do the same thing.) Also running Fiddler so I can watch network traffic while I'm at it.
For simplicity's sake, I created a page with nothing on it but the IFRAME in question - same physical server, different domain/site - and it failed with
Load denied by X-Frame-Options: https://www.google.com/ does not permit cross-origin framing.
(That's in the Firebug console.) I'm confused because:
Google is not referenced anywhere in the containing app or in the IFRAMEd app. All JavaScript libraries are kept locally; there are no analytics in the app. No Google anywhere.
The containing page has NOTHING on it except the IFRAME. No html tag, no head tag, no body tag. Just the IFRAME. That's it.
The X-FRAME-OPTIONS header does not exist in IIS on the server: not at the "Websites" node, not in the individual sites.
So where the h-e-double-sticks is that coming from? What am I missing?
Interesting point: if I remove the "S" from https in the IFRAME URL, it works. But given the nature of the data, SSL is required.
You might check Global.asax.cs; the app could be adding the header to every response automatically. If you search the whole app for "x-frame-options" you might also find something.
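For example, something like this in Global.asax.cs (a hypothetical illustration of what to look for, not code known to be in your app) would stamp the header onto every response regardless of the IIS configuration:

    // Global.asax.cs: a handler like this adds X-Frame-Options to every response,
    // which would explain a header that appears nowhere in the IIS settings.
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpContext.Current.Response.AppendHeader("X-Frame-Options", "SAMEORIGIN");
    }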

Gmail RSS doesn't see the description tag. How to see the request headers of my IIS 5 website?

My feed is at 217.76.185.140/18.rss, served by ASP.NET.
If I add the RSS feed http://www.brainyquote.com/link/quotebr.rss as a Gmail web clip, it works fine
(the feed shows up above the inbox).
But with my own RSS feed the description tag is not picked up.
My plan:
1. Capture the request headers that Gmail sends to my IIS server (HTTP version, content-type, etc.).
2. Copy that request and replay it with Fiddler against http://www.brainyquote.com/link/quotebr.rss.
3. Look at the HTTP response from quotebr.rss.
4. Copy that response and swap in my own description, title, etc.
My questions:
1. How can I log or trace the request hitting IIS 5 on Windows XP? Fiddler doesn't see it.
2. How can I see the request from Gmail to my site?
3. Do you have an RSS sample which works in Gmail?
Fiddler won't see Gmail's request for the feed, as the request goes directly from the Gmail server to your server, without ever touching the client computer. You'll have to log the request at your server (a minimal logging sketch follows the diagram below).
[your computer]-------[gmail.com]------[your server w/ RSS feed]
your computer requests a page from gmail
gmail requests a page from your server (on completely different connection)
your server returns a response to gmail
gmail assembles the response into its own response
gmail returns the response to your computer
In Fiddler, you can only see steps 1 and 5, as they generate traffic between your computer and some other computer on the Net.
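A rough sketch of what that server-side logging could look like in Global.asax.cs (assuming classic ASP.NET on that IIS box; the log path is illustrative):

    // Global.asax.cs: append every incoming request line and its headers to a text
    // file, so the exact request Gmail sends can be inspected afterwards.
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpRequest request = HttpContext.Current.Request;
        System.Text.StringBuilder log = new System.Text.StringBuilder();
        log.AppendLine(DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss") + " " + request.HttpMethod + " " + request.RawUrl);
        foreach (string name in request.Headers)
        {
            log.AppendLine(name + ": " + request.Headers[name]);
        }
        log.AppendLine();
        System.IO.File.AppendAllText(@"C:\logs\rss-requests.log", log.ToString());
    }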
