I have a problem with responses from IIS 7. When a user requests a page, the IIS server does not return all of the static content: JS, CSS, JPG, PNG.
But if I refresh a few times (press F5 or Ctrl-F5), the content loads fully and correctly. So what is the problem?
Server: Windows Server 2008 SP1, IIS 7.0.6001.18000
I have tried a lot of things to fix it but I can't sort out the problem.
A similar problem is described here: http://forums.iis.net/t/1164186.aspx , but I did not find a solution there either.
Here is an example Firebug screenshot:
Update: increased bandwidth by replacing the old router with a faster one.
I had created a simple portal site for our internal users: just a CSS menu of our internal web services, each of which is then displayed in an iFrame. It has worked well so far.
Our helpdesk software is GLPI, which was running on Apache on the same server with no issues. We recently upgraded to the latest version, and in the meantime I moved it to our new web apps server and switched to IIS. The 'portal' is still hosted on the old server.
When I updated the JavaScript for the iFrame to point to the new address, it looked like it worked: I could get the login screen for GLPI. After logging in, however, I just get stuck at a white screen. If I try it in IE I get the message:
This content cannot be displayed in a frame
To help protect the security of information you enter into this website, the publisher of this content does not allow it to be
displayed in a frame.
What you can try:
Open this content in a new window
It doesn't seem to be the iFrame itself, as I can get the login page.
My question is: can anyone give me some ideas on where to look for this issue? I've checked the IIS logs on both sides and see no errors; GLPI reports no errors, and neither does PHP.
GLPI is on IIS 8 on Server 2012
The 'portal' is on IIS 6 on server 2003
GLPI running on PHP 5.3.0
EDIT: I've looked into the X-Frame issue and I'm pretty sure this is not it; the servers are on the same domain, and I am able to get to the login screen of the second server through the iFrame, just no content after that. If it were an issue with the frame or permissions, I would expect not to get to the site at all.
The only response header currently set in IIS is 'X-Powered-By --> ASP.NET'. Am I looking in the wrong spot?
The server is returning an X-Frame-Options header, used to prevent clickjacking. That header must be removed (or updated with an Allow-From directive that lists the framing page's origin) in order for the target page to be rendered in a subframe.
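For example, if the header is being injected via IIS configuration on the GLPI server, a minimal web.config sketch could remove or override it (the Allow-From host below is a hypothetical placeholder; if GLPI's PHP code sets the header itself, you would need an IIS URL Rewrite outbound rule instead):

<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Remove an X-Frame-Options header inherited from higher-level config -->
        <remove name="X-Frame-Options" />
        <!-- Or explicitly allow framing from the portal's origin (hypothetical host) -->
        <add name="X-Frame-Options" value="ALLOW-FROM http://portal.internal" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>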
We have been using ABCpdf for years now; in fact we are still on version 6.1. It has just always worked. But we recently upgraded to Windows 2008 x64 / IIS 7.5.
Our code that converts HTML pages (invoices) to PDF no longer works. The basics are that there is a QueryString-based URL that renders the invoice in HTML, which allows us to "preview" it; then, to send it to the client, we use ASP.NET to execute the ABCpdf code (calling that same URL from the server to the server). This time the output is a PDF, and that's what is attached to an email and sent off to the client.
Pretty simple and straight forward stuff right?
This is what we noticed about ABCpdf:
1) PdfObj.AddImageUrl("http://localhost/..."); // Localhost does not work.
2) PdfObj.AddImageUrl("http://127.0.0.1/..."); // Local IP does not work.
3) PdfObj.AddImageUrl("http://41.XX.XX.XX/..."); // Live IP does not work.
Now this:
4) PdfObj.AddImageUrl("http://www.google.com/"); // Works perfectly!
So we know the code and everything about it technically can and does work.
But it seems that any time the AddImageUrl() function calls a location that points back to itself, the page does not render and we get "Unable to render HTML. Page load timed out. Unable to load page."
I know it's not the timeout, because if I use Fiddler (on the server) to execute the exact same code, it works perfectly.
I suspect this has to do with permissions... but what permissions? I read this: "... this is because ABCpdf uses the Microsoft MSHTML component", but how do I set the permissions on this component? I have already turned off "IE ESC".
What am I missing?
So it turned out, after fiddling with just about every setting, that it came down to the fact that IIS did not allow URL calls from w3wp.exe back to the same "site" within the same IIS.
There is more on that here: http://support.microsoft.com/kb/316451
It wasn't the "MSXML2.ServerXMLHTTP.3.0" requests; those seemed to work, which is why it was so confusing. But ABCpdf obviously does something similar internally, and so IIS was blocking it... in fact the entire "site" locked up while it was failing.
In the end all it took was making a clone of the main site ("site2") and changing the URL passed to ABCpdf to use the clone site.
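A minimal sketch of that workaround, assuming ABCpdf 6's Doc API and a hypothetical host name bound to the cloned site:

using WebSupergoo.ABCpdf6;

Doc pdf = new Doc();
// Point at the cloned "site2" binding rather than the site executing this
// code, so the request is not a loopback into the same IIS site.
// Host name and path are hypothetical placeholders.
string invoiceUrl = "http://site2.ourdomain.local/invoice.aspx?id=12345";
pdf.AddImageUrl(invoiceUrl);
pdf.Save(@"C:\temp\invoice-12345.pdf");
pdf.Clear();

The key point is simply that the URL handed to AddImageUrl() resolves to a different IIS site than the one running the code.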
I have a website set up in IIS 7.5. The host headers are working and the pages are returning fine. But when the browser requests an image, I get a 404. And I noticed that in the 404 details, the "Physical Path" specified is under "c:\inetpub\wwwroot\images...".
The reason this is so strange is that the website is on the D: drive: "d:\inetpub\wwwroot\images...". The physical path is configured properly in IIS (I've done this a million times), but when attempting to serve images it's using the wrong drive... why? Where is it getting this? I've been working with IIS for more than a decade and I have never seen this before. WTF??
You probably have two web sites bound to the same IP address, and one of them (the one with the physical path shown) is being honored.
If you add your application/virtual directory under the web site that is not being honored, you will see this error.
You need to set up the application under the web site that is being honored by IIS when using the IP.
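To check for overlapping bindings, one option is IIS's appcmd tool, which lists each site along with its bindings (path below assumes a default IIS install):

%windir%\system32\inetsrv\appcmd list sites

Any two sites showing the same http binding (IP, port, and host header) are candidates for this conflict.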
Hi,
I have the following (apparently simple) problem: I have to install a simple website, made by someone else, on a web hosting account. The site consists of lots and lots of HTML pages with no dynamic content, some created in MS Word and saved as HTML, some in FrontPage, etc. A mixed bag.
I initially uploaded it to a test account on my own server (Windows Server 2003) and it works OK.
Then I uploaded it to the real web host (Fedora/Apache).
When I load the site in a browser I see lots of odd characters (instead of the diacritics used in the HTML pages). Diacritics were saved as escape codes, like &#350; for Ș (using codepage 1252).
The problem is that when I load the pages from my own test server, the browser automatically selects the correct codepage (1252).
But when I load the site from the public host, the same browser loads the pages using UTF-8 encoding, rendering them with odd characters.
The test site on my server can be seen at http://radu-stanian.dnsalias.com and on the public server at http://radustanian.scoli.edu.ro/
This happens no matter which browser I use (IE, Firefox or Chrome).
What should I do to force browsers to load the pages with the correct codepage?
Making changes to every page is not an option, because there are hundreds of pages, created by various people who may edit them further for updates.
Thank you
I did a quick Google search and this is what I came up with:
http://www.w3.org/International/questions/qa-htaccess-charset
I've never messed with .htaccess files in this scenario, but from what I've read it seems you can force a certain character codepage based on file extension, which is what you need.
I'm not sure if it works, but hopefully it does :)
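A minimal sketch of what that page describes, assuming the host allows .htaccess overrides and the pages use the .html/.htm extensions: a single AddCharset line in a .htaccess file at the site root, mapping those extensions to the codepage the pages were saved in.

AddCharset windows-1252 .html .htm

Apache then appends "charset=windows-1252" to the Content-Type header for matching files, which overrides the server's UTF-8 default without touching any of the pages themselves.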
Most web servers allow you to edit the HTTP headers they send. One of them, Content-Type, can specify the exact codepage for the browser to use.
For example:
Content-Type: text/html; charset=ISO-8859-4
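On Apache specifically, one way to send such a header for every text/html page (again assuming .htaccess overrides are permitted) is the AddDefaultCharset directive:

AddDefaultCharset windows-1252

Note that ISO-8859-4 above is just an example value; for pages saved under codepage 1252, the charset to send would be windows-1252.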
I have a web application which works fine when I publish it and host it on my localhost...
When I host the same published folder on a remote server, a few controls go missing when the page loads. If I log into the server over Remote Desktop and open the site on the server itself, it works fine; only when I access it from my local system do some controls go missing. Is there something I am missing? The browser used on the server as well as on my local system is IE 8.
Thanks & Regards,
Francis P.
You probably have some URLs with a hard-coded http://localhost/....
Change all of your absolute URLs to relative URLs.
Fiddler and Firebug will be very helpful to see which URLs are being requested.
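For example (paths hypothetical), a reference like this only resolves when the browser itself is running on the server:

<script src="http://localhost/Scripts/menu.js"></script>

Rewritten as a relative URL, it resolves against whichever host actually served the page:

<script src="/Scripts/menu.js"></script>

In ASP.NET, server-side URLs can also be generated with ResolveUrl("~/Scripts/menu.js") so they stay correct no matter where the application is deployed.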