I would think that specifying a Subresource Integrity hash on a resource should allow web browsers to cache much more aggressively, and basically always reuse the local copy of the resource.
Is stronger caching for SRI resources implemented (or at least planned) in browsers, so that, for example, hitting the [Refresh Page] button would still make the browser reuse the cached resource? If not, is it because of some important reasons, or just "not yet there, but yeah, maybe some day"?
Inspired by @sideshowbarker's comment (thanks!), I browsed the W3C's issue tracker for SRI, and lo and behold, this idea is already tracked as:
#22 — Consider shared caching
Implementing this caching idea is apparently non-trivial with regard to security and privacy (e.g. it could be used to track whether a user has visited a particular page). So I assume it is not yet (if ever) coming to browsers, though I still can't be 100% sure based on that issue alone, and it has not even been determined whether it can be implemented in a "safe" way.
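For reference, the integrity value itself is nothing more than a base64-encoded cryptographic digest of the file, which is what would make such shared caching possible in principle: any cached copy whose digest matches could be reused. A minimal sketch of computing one (the file name is just a placeholder):

```python
import base64
import hashlib

def sri_hash(path: str, algo: str = "sha384") -> str:
    """Return a Subresource Integrity value such as "sha384-..." for a local file."""
    with open(path, "rb") as f:
        digest = hashlib.new(algo, f.read()).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

# The resulting value goes into the integrity attribute of a <script> or <link> tag.
print(sri_hash("library.js"))  # hypothetical file
```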
Related
What is a situation where nothing more than clearing your browser history will solve an issue with a web application?
"Clearing your history" does not include cookies or any sort of cache. Just the history. Is there any legitimate situation where asking someone to clear their browser history will solve a problem?
Short answer: There should never be such a situation, so, "none".
Longer answer: Well, there are some possible interactions with history (window.history.go comes to mind), and it's possible that some websites could use some very convoluted systems to examine and ingest that history.
(See https://developer.mozilla.org/en-US/docs/Web/API/History_API for a brief overview of how Mozilla-based browsers expose it.)
But those are all nonsensical, unreliable, and remarkably bad design decisions for a web application to make. If you can clear your browser history and solve a problem with a web application, that web application is categorically awful and you should not use it.
EDIT: This presumes you're talking about location history. If you're asking how clearing the other historical data in your browser (such as cookies, cached data, local storage, or perhaps download history) might affect the experience, there are plenty of ways that could affect the function of a site!
We currently get web analytics for a WordPress site using WebTrends.
If we use a caching mechanism like Varnish, I would assume WebTrends would suddenly report a dramatic reduction in traffic.
Is this correct and, if so, can you avoid this problem and get the correct statistics reported by WebTrends?
In my experience, acceleration caches shouldn't interfere with your Analytics data capture, because the cached content should include all of the on-page data points (such as meta tags) as well as the WT base tag file, which the user's browser will then execute and which will then make the call to the WT data collection server.
By way of a disclaimer, I should add that I haven't got any specific experience with Varnish, but a cache that acts as a barrier to on-page JavaScript executing is basically broken, and I've personally never had a problem with one preventing analytics software from running.
The only conceivable problem I could foresee is if a cache was going to the extent of scanning pages for linked resources (such as the "no JavaScript" image in the noscript tag), acquiring those resources in advance, and then reconfiguring the page being served to pull those resources from the cache rather than the third-party servers, in which case you might end up with spurious "no JavaScript" records in your data.
Just make sure that your Varnish config is not removing any WebTrends cookies and it should be perfectly OK. By default it does not, but if you use a ready-made WordPress VCL, you may need to exclude these cookies along with the WordPress-specific ones in the configuration.
How difficult or easy is it to break into the Joomla backend and access pages that are set to be accessible only by selected Joomla users of the website? Is it safe enough to rely on Joomla's access management system?
Yes, Joomla is quite a secure system by itself, although you have to be careful with third-party extensions, always track update news for all components (including the core) you have installed, and use your judgement about updating them. Usually security issues are spotted quite quickly, so you have time to patch before a successful attack.
Another thing to keep in mind is proactive defence with all the means you have at hand: this includes .htaccess and .htpasswd. It is also a good idea to restrict FTP access to local IPs only, and to use SFTP instead.
Also check out the security extensions on the JED; the ones that prevent high-level DDoS attacks and extend admin page access protection might also be helpful. Usually they are simple modules or plugins.
And yes, do not forget to change the default username for the superuser, and change all passwords (FTP/superuser/MySQL/htpasswd) on a regular basis.
Follow these simple rules and you will be fine, at least most of the time.
While Joomla security is fairly good, you need to keep up with the patches and, as dmi3y mentioned, you need to watch the third party extensions.
When it comes to information security, nothing is ever perfect. This solution may or may not be appropriate depending on the type of information that you are looking to secure, the number of users accessing it and how you manage the user rights.
Is showing the TCM ID in the SiteEdit instructions on the public website a security issue? My thought is that it should not be an issue since Tridion is behind the firewall, but I want to know the experts' opinion.
I think you're asking the wrong question here. It is not important whether those SiteEdit instructions are a security risk; they should only be present on the publishing target(s) where you use SiteEdit. On any other target they just needlessly increase the page size and expose implementation details that are not relevant to the visitors of that target.
So unless you enable SiteEdit on your public web site (highly unlikely), the SiteEdit instructions should not be in the HTML.
It depends on the level of security you require. In principle, your security should be so good that you don't rely on "security by obscurity". You should have modelled every threat, and understood it, and designed impregnable defences.
In real life, this is a little harder to achieve, and the focus is more on what is usually described as "security in depth". In other words, you do your best to have impregnable defences, but if some straightforward disciplines will make it more difficult for your attacker, you make sure that you go to that effort as well. There is plenty of evidence that the first step in any attack is to try to enumerate what technology you are using. Then if there are any known exploits for that technology, the attacker will attempt to use them. In addition, if an exploit becomes known, attackers will search for potential victims by searching for signatures of the compromised technology.
Exposing TCM URIs in your public-facing output is as good as telling an attacker that you are using Tridion. So, for that matter, is exposing SiteEdit code. If you use Tridion, it is utterly unnecessary to do either of these things. You can simply display a web site that gives no clues about its implementation. (The ability to avoid giving these clues will be a hard requirement for many large organisations choosing a WCMS, and Tridion's strength in this regard may be one of the reasons why the organisation you work for chose to use it.)
So while there is nothing in a TCM URI which of itself causes a security problem, it unnecessarily gives information to potential attackers, so yes, it is a security issue. Financial institutions, government organisations, and large corporations in general, will expect you to do a clean implementation that gives no succour to the bad guys.
I would argue that it does not really present an issue. If there are holes in the firewall that can be breached, an attacker may find a way to get through regardless. The fact that there is a Tridion CMS installation behind the firewall is somewhat irrelevant.
Whether you have the URIs in your source code or not, your implementation should be secured well enough that the knowledge gained by knowing that you have a Tridion CMS is of no value to a hacker.
There are a few questions (C#, Java) that cover how one might implement automatic updates. It appears initially easy to provide automatic updates, and there are seemingly no good reasons not to provide automatic updates for most software.
However, none appear to cover the security aspects of automatic updates.
How safe are automatic updates now?
How safe should they be?
How safe can they be?
My main issue is that the internet is, for all intents and purposes, a wild west where one cannot assume anything about any data they receive. Automatic updates over the internet appear inherently risky.
A company computer gets infected, spoofs the DNS (only a small percentage of which win), and makes the other company computers believe that the update server for a common application is elsewhere; they download the 'new' application and become infected.
As a developer, what possible attacks are there, and what steps should I take to protect my customers from abuse?
-Adam
With proper use of cryptography, your updates can be very safe. Protect the site you distribute your updates from with SSL. Sign all your updates with GPG/PGP or something else, and make your clients verify the signature before applying the update. Take steps to make sure your server and keys are kept extremely secure.
"Adequate" is very subjective. What is adequate for an internet game may be completely inadequate for the security system of our nuclear missiles. You have to decide how much potential damage could occur if someone managed to break your security.
The most obvious attack would be an attacker supplying changed binaries through his "evil" update server. So you should ensure that the downloaded data can be verified to originate from you, using a digital signature.
To ensure security, you should obviously avoid distributing the private key used for the signature. Therefore, you could implement some variation of RSA message signing, where clients only hold the public key needed to verify.
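As a rough illustration of the verification side, here is a minimal sketch using the third-party Python cryptography package; the file names and key handling are placeholders, and a real updater would also need a secure way to ship and rotate the public key:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def update_is_authentic(update_path: str, sig_path: str, public_key_pem: bytes) -> bool:
    """Check a downloaded update against a detached RSA-PSS signature."""
    public_key = serialization.load_pem_public_key(public_key_pem)
    with open(update_path, "rb") as f:
        data = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(
            signature,
            data,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False

# Apply the update only if update_is_authentic(...) returns True; the matching
# private key stays on the build/release machine and is never distributed.
```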
Connecting to your update server via SSL can be sufficient, provided your client refuses to connect if it gets an invalid certificate and your server requires negotiating a reasonable level of connection security (and the client also supports that).
However, realistically, almost anything you do is going to be at least as secure as the route via which your users got the first install of your software anyway. If your users initially download your installer via plain HTTP, it is too late to start securing things in the updates.
This is also true to some extent even if they get your initial software via HTTPS or digitally signed, as most users can easily be persuaded to click OK on almost any security warning they see.
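For what it's worth, a minimal sketch of the client side of that, using Python's requests package (the URL is a placeholder): certificate verification is on by default, and the important part is to fail hard rather than silently fall back.

```python
import requests

UPDATE_METADATA_URL = "https://updates.example.com/latest.json"  # placeholder

def fetch_update_metadata() -> dict:
    # verify=True is the default; never disable it or swallow SSL errors just to
    # "make it work", as that reopens the door to a spoofed update server.
    response = requests.get(UPDATE_METADATA_URL, timeout=10, verify=True)
    response.raise_for_status()
    return response.json()
```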
there are seemingly no good reasons not to provide automatic updates for most software.
There are good reasons not to force an update.
bug fixes may break code
users may not want to risk breaking production systems that rely on older features