Kentico v7 - Disable 'GetAzureFile' Permanent URL

I'm working on a site running Kentico v7 and I have a problem with images stored in the media folder. I was trying to get the direct URL of an image in the folder from the CMS, but the link the CMS displays goes through the "GetAzureFile.aspx" page to serve the image. I verified in Site Manager -> Content -> Media -> General that the "Use Permanent URL" option is disabled, but the problem still appears.
Any insights would be greatly appreciated!

Azure projects always use blob storage for newly uploaded files. Technically, the only files physically available in the file system are the ones that were deployed with the project; when an Azure instance restarts, it loses its local file system and only the deployment package is restored on the new instances.
Because media library content may change on the fly, Kentico uses GetAzureFile links for all files so that it can serve them regardless of where they are stored.
You can, however, use hard-coded links directly to the file system for files that were part of the deployment package, e.g. the ones you use for the site design.

Related

Kentico Update Media Library Direct Path to Azure

We have moved the media storage to Azure. Any new files get uploaded to Azure successfully and show the correct URL,
but how do I change the direct path of the old files? I have already uploaded all files to Azure; I just need to know how to update the direct path of the old files.
You either need to do it manually and change the old direct-path URLs, or you can create a script that checks the DB tables and rewrites the URLs. Regrettably, there is no out-of-the-box tool or feature to do this.
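A rough sketch of such a script is below. It is only an illustration, not Kentico tooling: it uses Node.js with the mssql package, the table/column (CMS_Document.DocumentContent) is just an example of a text column that may hold old links, and the old/new URL prefixes are placeholders you would replace with your real paths. Take a database backup before running anything like this.

// Hypothetical one-off script: bulk-rewrite old direct-path URLs to the new Azure URLs.
const sql = require('mssql');

const OLD_PREFIX = '/MySite/media/';                                 // placeholder old direct path
const NEW_PREFIX = 'https://mystorage.blob.core.windows.net/media/'; // placeholder Azure URL

async function rewriteUrls() {
  await sql.connect('Server=...;Database=Kentico;User Id=...;Password=...;');

  // CAST handles older ntext columns; REPLACE swaps the prefix wherever it occurs.
  const result = await sql.query`
    UPDATE CMS_Document
    SET DocumentContent = REPLACE(CAST(DocumentContent AS nvarchar(max)), ${OLD_PREFIX}, ${NEW_PREFIX})
    WHERE DocumentContent LIKE '%' + ${OLD_PREFIX} + '%'`;

  console.log('Rows updated: ' + result.rowsAffected[0]);
  await sql.close();
}

rewriteUrls().catch(err => { console.error(err); process.exit(1); });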

Prevent Azure App Service from viewing backend configuration

I am working on a project that has us deploying to an Azure Web Site.
The code is working overall, and now we are focusing more on security.
Right now we are having an issue where back-end configuration files are visible via a direct URL.
Examples (links won't work):
https://myapplication.azurewebsite.net/foldername/FileName.xml (this file is in a folder contained within the root application)
https://myapplication.azurewebsite.net/vApp/FileName.css (this file is part of a virtual application subfolder)
I have found this to be true with multiple extensions and locations.
Extensions like:
.css
.htm
.xml
.html
the list likely goes on
I understand that certain files are downloaded to the client side and that those can't be stopped. However, back-end XML files are something we don't pass to the client (especially if they contain connection strings).
I did read a similar article, Azure App Service Instrumentation Profiling?
However this didn't directly relate to my issue.
Any insight would be extremely helpful.
Do not store sensitive information in flat files, especially under your site root. Even if you get your web.config just right, you're still one botched commit away from disaster.
Use Application Settings instead; that's what they're for.
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-configure
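For illustration, values you enter as Application Settings are injected into the site's process as environment variables, so the app reads them at runtime rather than from a file under the site root. A minimal sketch in Node.js (the setting name is made up; an ASP.NET app would read the same value through its standard configuration APIs):

// Hypothetical setting name defined in the App Service portal; locally you can
// set it as an ordinary environment variable for testing.
const connectionString = process.env.ORDERS_DB_CONNECTION;

if (!connectionString) {
  throw new Error('ORDERS_DB_CONNECTION is not configured');
}

// Use connectionString to open the database connection; nothing sensitive is
// stored in a flat file under the site root.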

Node Webkit Desktop App - Browser default caching of PDF files

I have built a desktop app using node-webkit and need to cache PDF files that are viewed via the App when online, so that they are also available offline. I haven't found a solution yet, but during testing I noticed that files I had previously viewed online were available offline, even though I haven't written any code for this yet. Therefore these must already be cached automatically. I did a search to find exactly where the files are being saved but couldn't find anything.
Can anyone explain this, or point me in the direction of information on it, so that I understand how it works and can make sure my App utilises the browser's default caching behaviour?
UPDATE: I have found a solution to store the PDFs locally; however, this isn't my query. I am looking for an explanation as to HOW the PDFs are available when offline without the code I have written. The files must automatically be stored somewhere, otherwise how would they display?
The default caching behavior of node-webkit is controlled by the page-cache property in package.json:
"webkit": {
  "page-cache": true
},
Only typical web resources can be cached this way (scripts, style sheets, etc.). To be able to view PDF files offline, you can store them manually.
There are several ways to do that:
Save a file directly to disk (the simple solution: just store the files in App.dataPath; see the sketch after this list)
Use a database
Use Web Storage
Use the application cache
All of these are documented here: Save persistent data in app
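A minimal sketch of the first option (saving directly to disk) in node-webkit might look like this; the URL and file name are made up and error handling is kept to a minimum:

// Download a PDF and keep a copy under the app's data path so it can be
// opened again while offline. URL and file name are placeholders.
var gui = require('nw.gui');
var fs = require('fs');
var path = require('path');
var https = require('https');

function cachePdf(url, fileName, done) {
  var target = path.join(gui.App.dataPath, fileName);
  var file = fs.createWriteStream(target);

  https.get(url, function (response) {
    response.pipe(file);
    file.on('finish', function () {
      done(null, target);           // file is fully written; safe to open offline
    });
  }).on('error', function (err) {
    fs.unlink(target, function () { done(err); });
  });
}

// Usage: cache while online, then load the local copy when offline.
cachePdf('https://example.com/docs/report.pdf', 'report.pdf', function (err, localPath) {
  if (err) { console.error(err); return; }
  console.log('PDF cached at ' + localPath);
});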
The default location for caching your app files is mentioned in your package.json manifest file. When the app is initialized, the settings in your manifest file are loaded by default. Since cached files cannot be accessed programmatically, you can overwrite the default files manually.
To get the application's data path in the user's directory for Windows, you can write it in JSON format in your package:
Windows: %LOCALAPPDATA%/
You can read about other cache methods in node-webkit's documentation:
http://docs.nwjs.io/en/latest/References/App/#appclearcache
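For reference, the call documented on that page looks roughly like this inside a node-webkit app (treat it as a sketch; the API surface differs slightly between node-webkit and newer nw.js versions):

// Clear node-webkit's automatic page cache from inside the app.
var gui = require('nw.gui');
gui.App.clearCache();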

Publish website to Azure, remove additional files at destination, but ignore specific folders

I currently manually delete obsolete folders from a published Azure website. I know there is an option in Visual Studio to Remove additional files at destination. My problem is that I have an Images folder (quite large) that users upload to, and it will be deleted when I publish with this option checked. My question is: is there a way to use this option with exclusions? Meaning, to delete all files that are not in the local project except for the "\Images" folder?
You can most likely customize the Web Deploy usage from VS to do what you want, but I don't think I would recommend it, since things like that tend to get fragile.
I would suggest changing your architecture to store the images in a blob container, then possibly mapping your blobs to a custom domain (https://azure.microsoft.com/en-us/documentation/articles/storage-custom-domain-name/).
Having your images in blob storage will also prevent accidental deletion of the Images folder by someone else who doesn't know it shouldn't be touched (or you simply forgetting about it one day).
Using blob storage will also allow you to configure CDN usage if you ever find that you need it.
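If you go the blob-storage route, uploading a user image is only a few lines. The sketch below uses the @azure/storage-blob Node.js SDK with made-up connection, container and blob names; a .NET web app would do the equivalent with the .NET storage SDK:

// Store a user-uploaded image in a blob container instead of the web app's
// Images folder. Connection string, container and blob names are placeholders.
const { BlobServiceClient } = require('@azure/storage-blob');

async function uploadImage(localFilePath, blobName) {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  const containerClient = serviceClient.getContainerClient('images');
  await containerClient.createIfNotExists();          // safe to call repeatedly

  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.uploadFile(localFilePath);     // Node.js-only helper

  // The image is now served from storage (optionally behind a custom domain/CDN),
  // so a publish with "Remove additional files" can no longer delete it.
  return blockBlobClient.url;
}

uploadImage('./uploads/avatar.png', 'avatars/avatar.png')
  .then(url => console.log('Image available at ' + url))
  .catch(console.error);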
Another option would be to create a virtual directory in your Web App configuration and put the Images there - that way your VS deploy/publish wouldn't be modifying that subdirectory. This link may help with that: https://blogs.msdn.microsoft.com/tomholl/2014/09/21/deploying-multiple-virtual-directories-to-a-single-azure-website/

Setting Up Continuous Deployment of a WPF Desktop Application

For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You mention ClickOnce, which you investigated but can't use. Have you already tried an alternative such as Squirrel? With Squirrel you can specify which files should be part of the installation, so you can explicitly include files even if you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating auto-updating applications and can work with Azure Blob Storage hosting (and a CDN if you need to scale up).
