JS files not updating in Site Assets - SharePoint

OK, I have never seen anything like this before and I'm hoping someone else has. I just finished patching our dev and test servers to the November 2017 CU (SharePoint 2013). Since then, any solutions that use JS injection from Site Assets are not updating. I'll make a change to the file, and the library reflects that I made the change, but when I load the page that uses the JS file, the changes are not reflected. Hard refreshes and full cache cleans have no effect. If I close and reopen my editor (VS Code), my changes are gone. When I look at the version history, the current version doesn't have my changes, but the previous version does. If I try to revert to that version, it doesn't take (it still shows the previous version of the file).
Here's where it becomes extra weird. I have deleted the entire file from the library. I've reset IIS (heck, I even rebooted the server at one point). It somehow still loads the file. The file is no longer in the library, but the server is still serving it up to the browser. I have confirmed it is not coming from another location, as the dev tools show the file being served from the asset library it was deleted from. Even users who have never accessed the site before are still getting that file in their browser.
This isn't limited to a single site, either. I have other developers in different subsites (same site collection) who are having the same issues.
Anyone seen this before?

Looks like your web application has the BLOB cache enabled, which is causing files to be served from the cache.
There are two ways to fix this:
1) The heavy-handed way is to flush the BLOB cache using the PowerShell commands below:
Add-PSSnapin Microsoft.SharePoint.PowerShell  # needed if not running in the SharePoint Management Shell
$webApp = Get-SPWebApplication "<WebApplicationURL>"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
This will flush all the files in the BLOB cache. Files are normally cached according to the max-age attribute value, which is why your file is still being served even though you deleted it from the source.
2) The surgical-knife approach is to append a query string, like https://sitecollurl/siteassets/app.js?v=1.1, to the file references (usually in master pages, page layouts, web part references, script links, etc., wherever the file is referenced). Appending a query string forces the browser to download the newer version of the file. I would prefer this approach, as it will not unnecessarily clear other files from the BLOB cache.
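For example, a minimal sketch of injecting the script with a version parameter (the URL and version value are placeholders; bump the value whenever the file changes):
var scriptVersion = "1.2"; // increment on every change to app.js
var s = document.createElement("script");
s.src = "https://sitecollurl/siteassets/app.js?v=" + scriptVersion;
document.head.appendChild(s);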

Related

Can SVG-Edit be made to work in a standalone/offline context?

Because SVG-Edit is such a unique and appealing program, I've been searching for an answer to this question for years, but have come up dry.
After a major struggle, I was able to get it to work by installing Windows IIS, then setting up a web server, etc. However, this is far from ideal.
Is there some reason why it won't (or shouldn't) run in a fully standalone/offline mode? Specifically, what I'd like to do is extract the GitHub zip file to a local folder and open "svg-editor.html" in a browser. In general, this produces either a blank window or (in some previous versions) a window with various missing items.
There had been a race condition causing svgedit to err (evident in Chrome when loading with file:// URLs), which is now fixed in the master branch on GitHub.
You won't be able to load svg-editor-es.html locally from a file:// URL (svg-editor-es.html is the original source, which relies on ES6 Modules to load its files; these are not permitted to load locally, causing origin errors to show in the console). However, the svg-editor.html file (the backward-compatible way to use svgedit) appears to be working now after the fix, at least for some basic functionality like making drawings.
Some functionality may still not work, however, due to the limited permissions of file:// URLs, e.g., loading some images. (I seem to recall browsers previously preventing files from loading anything outside of their own directory or child directories, but this restriction does not seem to apply now, though I do see some warnings about Ajax not being able to load some images which svgedit attempts to load.)
As such, even with the above-mentioned recent fix, svgedit might not work fully offline, unless perhaps you opt to disable your browser's security restrictions, something one should not do lightly. But it does appear to work for some basic drawings at least.
While I figure this may address your direct question about why it doesn't work without a server, there is another approach to working "offline": though it needs a server to serve the files initially, it may allow svgedit to store the application files so it works completely offline the next time you visit that URL in the browser, without running into browser security restrictions. Browsers nowadays can work offline even when the app is served from a server, via something called "service workers" (see https://caniuse.com/#feat=serviceworkers for the browsers that support them).
Service workers are, however, not all that easy to cobble together, and though you can track any future progress by subscribing to the issue at https://github.com/SVG-Edit/svgedit/issues/243 (it is already a requested feature), no one is currently undertaking the implementation. Hopefully someone will be inspired to do so.
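For the curious, here is a minimal sketch of what such a service worker might look like (the cache name and file paths are hypothetical, not svgedit's actual layout):
// sw.js: cache the app shell at install time, serve from cache when offline
var CACHE = "svgedit-offline-v1";
var FILES = ["/editor/svg-editor.html", "/editor/svgedit.css"]; // hypothetical paths
self.addEventListener("install", function (e) {
  e.waitUntil(caches.open(CACHE).then(function (c) { return c.addAll(FILES); }));
});
self.addEventListener("fetch", function (e) {
  e.respondWith(caches.match(e.request).then(function (hit) { return hit || fetch(e.request); }));
});
// and in the page: navigator.serviceWorker.register("/sw.js");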
By the way, if you install svgedit using npm (a tool which becomes available when you install Node), svgedit has a start script which you can invoke from the command line with npm start from within the svgedit folder. That will run a local (Node) server for you, specifically a simple static file server, which lets you load svgedit from http URLs (i.e., http://localhost:8000/editor/svg-editor.html or http://127.0.0.1:8000/editor/svg-editor.html; you can also use the ES6 Modules file if you are on a modern browser: http://localhost:8000/editor/svg-editor-es.html) without your needing to install any other server.
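In other words, something like the following (a sketch; it assumes Node and Git are already installed):
git clone https://github.com/SVG-Edit/svgedit.git
cd svgedit
npm install
npm start
# then open http://localhost:8000/editor/svg-editor.html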

Prevent Azure App Service from viewing backend configuration

I am working on a project that has us deploying to an Azure Web Site.
The code is working overall, and now we are focusing more on security.
Right now the issue is that back-end configuration files are visible via a direct URL.
Examples (links won't work):
https://myapplication.azurewebsite.net/foldername/FileName.xml (this file is in a folder contained within the root application)
https://myapplication.azurewebsite.net/vApp/FileName.css (this file is part of a virtual application subfolder)
I have found this to be true with multiple extensions and locations.
Extensions like:
.css
.htm
.xml
.html
the list likely goes on
I understand that certain files are downloaded to the client side and that those can't be blocked. However, back-end XML files are something we don't pass to the client (especially if they contain connection strings).
I did read a similar question, Azure App Service Instrumentation Profiling?, but it didn't directly relate to my issue.
Any insight would be extremely helpful.
Do not store sensitive information in flat files, especially under your site root. Even if you get your web.config just right, you're still one botched commit away from disaster.
Use Application Settings instead; that's what they're for.
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-configure
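For example, in a Node app on App Service, settings added in the portal surface as environment variables at runtime, and connection strings get a type prefix such as SQLAZURECONNSTR_ (the setting names below are hypothetical):
// Read configuration from Application Settings instead of flat files.
const apiKey = process.env.MyApiKey; // plain app setting, hypothetical name
const dbConn = process.env.SQLAZURECONNSTR_MyDb; // SQL Azure connection string, hypothetical name
if (!dbConn) {
  throw new Error("MyDb connection string is not configured in App Settings");
}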

Kentico GetResource 404 error for CSS and JS

Using Kentico 9. While I was away, it appears that something changed! Our production site appears to be working fine, but our staging site is not. When I try to access the staging site, it is unable to find any of the related CSS or JS files and therefore displays the site without them. Another part of the problem is that the same thing is happening in the admin portal, and since the admin portal relies on those CSS and JS files to work, I can't troubleshoot there.
When I try to get to the resource directly in a browser, e.g. iddba-staging.azurewebsites.net/CMSPages/GetResource.ashx?stylesheetfile=/App_Themes/Default/bootstrap.css, it fails.
When I try the same path on the prod site, www.iddba.org/CMSPages/GetResource.ashx?stylesheetfile=/App_Themes/Default/bootstrap.css, it provides me the opportunity to save the file, as I would expect.
Our site(s) are hosted in Azure.
Any idea of where to turn next? Thanks.
On a possibly unrelated note, I have also noticed an error in the logs on staging that is not on prod:
.NET Runtime version 4.0.30319.0 - Loading profiler failed. Failed trying to receive from out of process a request to attach a profiler.
Not sure if this might be related. Thanks.
Once I realized that GetResource.ashx is a dynamic handler, and that the CSS and JS files were either already in the database or in their respective directories, I looked at the DLLs that power Kentico. I decided to see if there were any differences between the bin directories of prod and staging. I don't know how they became out of sync, but they had. Since my problems appeared to involve serving CSS files, I noted that one file, CMS.LessCss.dll, was in prod but not in staging, so I copied it in. As soon as that file was copied, Kentico threw an error with the word Jurassic in it (odd!). I then found a file, Jurassic.dll, that was in prod but not in staging, so I moved that over too.
Voila! The site was back, serving CSS and JS files on the front end and in administration. We are still clicking through all our pages to see if anything else is broken, but for now we are good. If I have any further issues I will look in bin first, since there are still files in prod that are not in staging and I don't want to repopulate unnecessary files. I am just not sure how the directories got out of sync in the first place... or at least no one here has admitted to messing around! Yet.
Thanks for all your help.

Node Webkit Desktop App - Browser default caching of PDF files

I have built a desktop app using node-webkit and need to cache PDF files that are viewed via the app while online, so that they are also available offline. I haven't found a solution yet, but during testing I noticed that files I had previously viewed online were available offline, even though I haven't written any code for this. Therefore these must already be cached automatically. I searched for where exactly the files are being saved but couldn't find anything.
Can anyone explain this, or point me in the direction of information on it, so that I understand how it works and can ensure my app utilises the browser's default caching?
UPDATE:
I have found a solution for storing the PDFs locally; however, that isn't my query. I am looking for an explanation as to HOW the PDFs are available offline without this code I have written. The files must automatically be stored somewhere, otherwise how would they display?
The default caching behavior of node-webkit is controlled by the page-cache property in package.json:
"webkit": {
"page-cache": true
},
Only typical web resources can be cached this way (scripts, style sheets, etc.). To be able to view PDF files offline, you can store them manually.
There are several ways to do that:
Save a file directly to disk (the simple solution: just store the files in App.dataPath; see the sketch after this list)
Use a database
Use Web Storage
Use the application cache
All of these are documented here: Save persistent data in app
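For instance, a minimal sketch of the first option, saving a PDF under App.dataPath so it is available offline (the URL, file name, and callback shape are all hypothetical):
var fs = require("fs");
var path = require("path");
var https = require("https");
var gui = require("nw.gui");

// Download a PDF once and keep it under App.dataPath for offline use.
function cachePdf(url, fileName, done) {
  var target = path.join(gui.App.dataPath, fileName);
  if (fs.existsSync(target)) return done(null, target); // already cached
  https.get(url, function (res) {
    var out = fs.createWriteStream(target);
    res.pipe(out);
    out.on("finish", function () { done(null, target); });
  }).on("error", done);
}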
The default location for your app's cached files is set in your package.json manifest file. When the app is initialized, the settings in your manifest are loaded by default. Since cached files cannot be accessed programmatically, you can overwrite the default files manually.
On Windows, the application's data path in the user's directory defaults to:
%LOCALAPPDATA%/
You can read about other cache methods in node-webkit's documentation:
http://docs.nwjs.io/en/latest/References/App/#appclearcache

ColdFusion security issue... how to hide a directory of files?

So, I decided to try to break my website... I googled my site by typing site:mysite.com/whatever and, behold, all of the users' uploaded files were viewable under a specific directory.
What kind of script or countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status, but this doesn't seem to be working. I've looked all over for solutions but can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that the server displays that document rather than the list of files. If you use index.cfm, it'll fire the Application.cfm/cfc in your file path and apply whatever other security you've built (see the sketch after this list).
(or, better, do both)
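For example, a one-line index.cfm placeholder (the redirect target is just an illustration) keeps the directory from ever being listed while still running your Application.cfm/cfc security:
<!--- index.cfm: dropped into the uploads directory so its contents are never listed --->
<cflocation url="/" addtoken="no">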
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
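A minimal sketch of that approach (the paths, session flag, and URL parameter are hypothetical; getFileFromPath() strips any directory-traversal segments from the requested name):
<!--- serveFile.cfm: streams a file stored outside the web root to authorized users --->
<cfif NOT structKeyExists(session, "isLoggedIn") OR NOT session.isLoggedIn>
    <cfabort showerror="Access denied">
</cfif>
<cfset safeName = getFileFromPath(url.file)>
<cfcontent type="application/pdf" file="C:\secured_uploads\#safeName#" deleteFile="no">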
