While browsing a web page X, browsers typically indicate visited links (occurring in that page X) in a different colour.
Are there browsers that offer this for bookmarked pages? That is, are there browsers that, when browsing page X, can indicate which links in that page are bookmarked?
This is similar to the question How to detect if a link already is in the user's bookmarks?, but asks whether some browsers have this feature natively.
I have been reading a lot about iframes and clickjacking, but was not able to find the information I am looking for. Can you help me with the questions below?
How does iframe clickjacking spread? I have seen a lot of articles that mention editing the HTML code on a local machine and thereby hijacking the user's clicks by adding an invisible button. But that is modified logic on a single user's machine. What I am interested in knowing is: is it possible to push the same code to the cloud and impact every user logging in to or using that portal? If yes, how?
If I allow framing on my website, is that a security risk because my page can be loaded as an iframe in someone else's website and misused? And if there is any secured data and an end user accidentally ends up on that website, can the data be stolen? Is that why it is always recommended not to allow framing? Please add any other security risks I have missed.
Clickjacking does not spread.
It is literally what the name says - jacking clicks - nothing more. However, the consequences of those clicks can be severe.
Imagine you visit a site, evil.example.org. In another tab you are also logged into your bank, bank.example.com.
evil.example.org also loads bank.example.com in an IFrame. However, it uses CSS to make this IFrame invisible. And it does not load the home page; it loads the money transfer page, passing some parameters:
<iframe src="https://bank.example.com/loggedIn/transferMoney?toAccount=Bob&amount=100000"></iframe>
Now, this page does not transfer the money immediately. It asks the user to click to confirm the transfer to Bob.
However, evil.example.org draws a button right underneath the Confirm Transfer button saying Free iPad click here.
Because the IFrame is invisible, the user just sees Free iPad click here. But when they click, the browser registers the click against Confirm Transfer.
Because you are logged into the bank site in another tab, Bob has just nicked your money.
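Putting that together, a minimal sketch of what the attacking page might look like (the URLs are from the example above; the essential part is the transparent iframe stacked above the decoy button):

```html
<!-- evil.example.org: invisible bank iframe layered over a decoy button -->
<style>
  .decoy  { position: absolute; top: 40px; left: 60px; z-index: 1; }
  .target {
    position: absolute; top: 0; left: 0;
    width: 400px; height: 200px;
    opacity: 0;   /* invisible to the user... */
    z-index: 2;   /* ...but stacked on top, so it receives the click */
    border: none;
  }
</style>
<button class="decoy">Free iPad click here</button>
<iframe class="target"
        src="https://bank.example.com/loggedIn/transferMoney?toAccount=Bob&amount=100000"></iframe>
```

The user sees only the decoy button, but the click lands on whatever the bank page renders at that spot.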
Note that the X-Frame-Options header fixes this vulnerability on your site, assuming it is set to SAMEORIGIN or DENY. You are vulnerable until you add the header. There's also a newer CSP directive called frame-ancestors; however, only the latest browsers support it, so you're best off sending both headers at the moment. This will give you protection in Internet Explorer 8 and later, plus Chrome, Firefox, Opera and Safari.
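For reference, the two response headers would look like this (send both while browser support for frame-ancestors catches up):

```http
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: frame-ancestors 'self'
```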
Preventing framing can also help thwart other attacks such as Cross Site History Manipulation.
I have a site which has support for custom themes (the same content, but different HTML, CSS, graphics), e.g. default theme, mobile theme etc.
Themes are switched via request params (?theme=mobile) and saved in the user session (database).
How should I serve those themes to search engine bots? Should I allow themes other than the default to be crawled? Should I use robots noindex/nofollow, or canonical tags in the head?
ok, themes and mobile versions are two different beasts, let's start with themes.
let's say you have a ?theme=black theme, a ?theme=white theme and a default theme.
google does not care about your themes, as different themes just offer the same content in black (or white, or whatever). this is a typical case of duplicate content.
so if you want to offer the users this option you should save it in the session.
www.example.com/?theme=black -> sets theme in session -> redirect HTTP 301 to -> www.example.com/
the "link" the user clicks should ideally call an external javascript file (so that google can't easily discover the redirect link)
<span onclick="changeTheme('black')">Black</span>
so basically you offer google just the default version of your site, hide the themes (as they do not offer different content for google).
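a hypothetical implementation of the changeTheme() helper referenced above (the host name and function names are assumptions):

```javascript
// Build the parameterised URL; the server answers it with a 301 back to
// the clean URL after storing the chosen theme in the session.
function themeUrl(base, theme) {
  return base + '?theme=' + encodeURIComponent(theme);
}

// Called from the onclick handler; lives in an external .js file so the
// redirect URL never appears in the page as a plain crawlable link.
function changeTheme(theme) {
  window.location.href = themeUrl('https://www.example.com/', theme);
}
```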
ok, about the mobile case.
if the HTML of the mobile site is optimized for mobile devices you might want to offer it to googlebot mobile.
so you could do user-agent detection (is this a mobile device?) and redirect your mobile users to a mobile site, e.g. m.example.com (if it's a site optimized as described in http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=72462&from=40348&rd=1 and not only a smartphone-optimized page). in that case you should also redirect googlebot mobile (not googlebot, just googlebot mobile) to m.example.com (there is a whole chapter in google's SEO starter guide on how to do this: http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf )
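a rough sketch of that routing logic (the function name, host name and user-agent patterns are assumptions, not a definitive implementation):

```javascript
// Decide whether a request should be redirected to the mobile host.
// Returns the redirect URL, or null to serve the desktop site.
function mobileRedirectTarget(userAgent, path) {
  var ua = (userAgent || '').toLowerCase();
  var isGooglebotMobile = ua.indexOf('googlebot-mobile') !== -1;
  var isDesktopGooglebot = ua.indexOf('googlebot') !== -1 && !isGooglebotMobile;
  var isMobileBrowser = /iphone|ipod|android|blackberry|mobile/.test(ua);
  // Redirect real mobile browsers and googlebot mobile, but never the
  // regular desktop googlebot.
  if (isGooglebotMobile || (isMobileBrowser && !isDesktopGooglebot)) {
    return 'http://m.example.com' + path;
  }
  return null;
}
```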
I am in charge of creating plugins for all major browsers which will hijack links on specified websites the user visits. I wonder if this can be done by a plugin. I mean, do plugins have that level of control over the websites visited in the browser they run in?
When I say hijack, I mean I should be able to read an anchor tag's href attribute value and modify it accordingly. I know how to do that in JavaScript, and I know Google Chrome extensions are written in HTML, CSS and JavaScript, so that seems workable. Will it work for IE, Firefox and Safari?
After a lot of research I have found this is possible in Chrome, Firefox and IE. I am still researching Safari.
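to illustrate the core idea, a minimal sketch of the href rewriting described in the question (the rewrite rule and the redirect host are made-up examples):

```javascript
// Hypothetical rewrite rule: route every link through a redirect endpoint.
function rewriteHref(href) {
  return 'https://redirect.example.com/?to=' + encodeURIComponent(href);
}

// In a Chrome/Firefox content script you would apply the rule to every
// anchor on the page, e.g.:
//
//   var anchors = document.querySelectorAll('a[href]');
//   for (var i = 0; i < anchors.length; i++) {
//     anchors[i].setAttribute('href', rewriteHref(anchors[i].getAttribute('href')));
//   }
```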
I have used Firefox, Chrome and now Opera. Every time I switch browsers, I find that some websites don't display, or display incorrectly, e.g. in terms of box alignment.
I have never understood why this happens. Which properties differ between browsers and lead to this issue?
Each of those browsers uses a different rendering engine (Firefox -> Gecko; Chrome, Safari -> WebKit; Opera -> Presto), and each rendering engine has different rules about how the markup is displayed and which default attributes are used.
However, if your site is designed properly, these three browsers should have little trouble displaying the correct layout. It's Internet Explorer's Trident engine that many people have trouble with.
See the Wikipedia articles on web browser engines and the comparisons among them for HTML and CSS support.
If I have an iPhone version of my site, what are the things I need to make sure of so it doesn't interfere with SEO?
I've read quite a bit now about cloaking and sneaky javascript redirects, and am wondering how this fits into iPhone and Desktop websites playing together.
If my iPhone site has a totally different layout - say the desktop site has a page with 3 posts and 10 images all on one page, while my iPhone site splits that into 2 pages, one with the posts and one with the images (trying to think of an example where the structure is decently different) - that's probably not best practice for SEO. So should I just tell Google not to look at the mobile site? If so, and assuming my client would like to automatically redirect mobile users to the iPhone site (I'm familiar with the idea of taking them to the regular page with a link to the mobile version instead), how do I avoid making this look like cloaking?
Google actually has a separate index and crawler for mobile content. So all you need to do is design your URLs in such a way that you can exclude googlebot from the mobile pages and googlebot-mobile from the regular pages in robots.txt.
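A sketch of such robots.txt files, assuming the mobile pages live on m.example.com and the desktop pages on www.example.com:

```
# robots.txt on www.example.com — keep the mobile crawler off the desktop pages
User-agent: Googlebot-Mobile
Disallow: /

# robots.txt on m.example.com — keep the desktop crawler off the mobile pages
User-agent: Googlebot
Disallow: /
```

Each host serves its own robots.txt, so the two blocks above go in two separate files.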
Certainly you have the option of telling the search engines not to look at the mobile page. I would leave it, though, because you never know who is looking for something specific, and Google may prefer certain pages over others for mobile users.
If the 2 mobile pages make sense to the visitor, then I would not worry about it for SEO. If you are redirecting based on the mobile user agent, I don't see how the search engines could think you are cloaking, but if you want to be totally sure, I suggest using CSS to show different information based on media type.
The only problem I can think of would be duplicate content. The search engines may see both pages and rank one less highly because they prefer what they see on the other. There is no penalty other than the fact that one page is more interesting than the other and may get better rankings while the other drops in rank. If you are making two separate pages, it is an opportunity to tune your content to specific details and maybe get hits for both; if you are using CSS, it will rank as one page.
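As an illustration of the CSS media approach mentioned above, a single page can hide or show sections per medium (the class names and breakpoint are made-up examples):

```css
/* One HTML page, ranked as one page; only the presentation varies. */
.mobile-only { display: none; }

@media only screen and (max-width: 480px) {
  .desktop-only { display: none; }
  .mobile-only  { display: block; }
}
```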