Kentico: Update Media Library Direct Path to Azure

We have moved the media storage to Azure. Any new files get uploaded to Azure successfully and show the correct URL,
but how do I change the direct path of the old files? I have already uploaded all the files to Azure; I just need to know how to update the direct path of the old files.

You either need to do it manually and change the links that use the old direct-path URLs, or you can create a script that goes through the database tables and updates the URLs; a rough sketch is below. Regrettably, there is no out-of-the-box tool or feature for this.
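A minimal sketch of such a script, assuming the old direct-path URLs live in page content stored in the CMS_Document table. The connection string, URL prefixes, and the exact tables/columns to touch are assumptions; verify them against your own schema and back up the database before running anything like this.

```csharp
using System;
using System.Data.SqlClient;

class FixMediaUrls
{
    static void Main()
    {
        // Assumptions: adjust connection string and URL prefixes to your site.
        const string connectionString = "Data Source=.;Initial Catalog=Kentico;Integrated Security=True";
        const string oldPrefix = "/MySite/media/";                                   // old direct file-system path
        const string newPrefix = "https://myaccount.blob.core.windows.net/media/";   // new Azure URL

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            @"UPDATE CMS_Document
              SET DocumentContent = REPLACE(DocumentContent, @oldPrefix, @newPrefix)
              WHERE DocumentContent LIKE '%' + @oldPrefix + '%'", connection))
        {
            command.Parameters.AddWithValue("@oldPrefix", oldPrefix);
            command.Parameters.AddWithValue("@newPrefix", newPrefix);

            connection.Open();
            int rows = command.ExecuteNonQuery();
            Console.WriteLine($"Updated {rows} document(s).");
        }
    }
}
```

Repeat the same REPLACE for any other tables that store the old URLs (custom page type fields, web part settings, etc.).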

Related

GitLab copy/paste upload folder

I'm creating a wiki in GitLab. I copy/pasted an image into a page and it automatically uploaded the image and referenced it like this:
![image](uploads/84329e7811b5d2efb31b764c4767770d/image.png)
How do I access these uploads via the web browser to update or manage them? I've tried the documentation, but it just goes on about default physical locations via a shell, which I don't have access to (this is a private GitLab installation).
Also, does anyone know if this is a permanent location or something that gets wiped (e.g. after a server restart)?
I've tried all variations of 'uploads' in my URL.
Thanks.

How to take a backup and restore of an SQLite database file using the Xamarin Azure SDK for my Xamarin app

I am using the Xamarin Azure SDK to download and manage the local database for my Xamarin.Forms app.
We are facing download-time issues because we have a lot of data,
so I am thinking of taking a backup of the SQLite file from one device once and using it to restore the same SQLite file on other devices.
I plan to use Azure Blob storage to store the backup of the SQLite file, and on the other devices to download that blob and restore from it.
Any help will be appreciated.
Thanks :)
An approach I have used in the past is to create a controller method on the Azure end which the client app can call that generates a pre-filled SQLite database or 'snapshot' on the server (making sure you include all the extra Azure tables and columns) and then returns a download URL for the file to the client. We also zip up the snapshot database to reduce the download time. You could store this 'snapshot' in Azure Blob storage if you wanted; a rough sketch follows.
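A rough sketch of that approach, assuming an ASP.NET Web API backend and the classic WindowsAzure.Storage package. BuildSnapshotDatabase is a hypothetical helper, and the container and connection-string names are assumptions.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class SnapshotController : ApiController
{
    [HttpGet]
    public async Task<string> GetSnapshotUrl()
    {
        // 1. Build the pre-filled SQLite file on the server (hypothetical helper).
        string dbPath = BuildSnapshotDatabase();

        // 2. Zip it up to cut the download time.
        string zipPath = Path.ChangeExtension(dbPath, ".zip");
        using (var zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
        {
            zip.CreateEntryFromFile(dbPath, "snapshot.db3");
        }

        // 3. Upload the zip to blob storage and return a time-limited download URL.
        var account = CloudStorageAccount.Parse(Environment.GetEnvironmentVariable("STORAGE_CONNECTION"));
        var container = account.CreateCloudBlobClient().GetContainerReference("snapshots");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlockBlobReference($"snapshot-{DateTime.UtcNow:yyyyMMddHHmmss}.zip");
        await blob.UploadFromFileAsync(zipPath);

        string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        });
        return blob.Uri + sas;
    }

    private string BuildSnapshotDatabase()
    {
        // Placeholder: create/copy the server-side SQLite file, including the
        // extra Azure sync tables and columns, and return its path.
        throw new NotImplementedException();
    }
}
```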
Please refer to the link below; note that SQLite does not support relationships beyond foreign keys.
Memory Stream as DB
You can upload the backup file to Blob storage together with the respective user details, and when a call comes in with the same user details you can download it from the blob; a minimal sketch follows.
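A minimal sketch of that idea, assuming the classic WindowsAzure.Storage package; the container name and blob naming scheme are assumptions.

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BackupStore
{
    // Upload the local SQLite file under the user's id so a later call with
    // the same user details can locate and download the same backup.
    public static async Task UploadBackupAsync(string connectionString, string userId, string localDbPath)
    {
        CloudBlobContainer container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("sqlite-backups"); // container name is an assumption
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference($"{userId}/backup.db3");
        await blob.UploadFromFileAsync(localDbPath);
    }
}
```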
These are links that provide the code and knowledge required to use Azure Blob storage with Xamarin:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-xamarin-blob-storage
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
As this question is very general, I can only provide those general links. There are many details in how to implement this in your case; if you face a specific problem, I recommend asking another question with an exact description of that specific problem.
EDIT: According to your comment, you are having problems replacing the local file. The only catch is that you must replace it before you initialize SQLite; otherwise it is a simple file operation, as in the sketch below.
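A minimal sketch of that point, assuming the classic WindowsAzure.Storage package and sqlite-net; paths and names are illustrative.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using SQLite;

public static class BackupRestore
{
    public static async Task<SQLiteAsyncConnection> RestoreAndOpenAsync(
        string connectionString, string userId, string localDbPath)
    {
        var blob = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("sqlite-backups")
            .GetBlockBlobReference($"{userId}/backup.db3");

        // Plain file replacement -- this must happen before SQLite opens the file.
        await blob.DownloadToFileAsync(localDbPath, FileMode.Create);

        // Only now initialize the local database connection.
        return new SQLiteAsyncConnection(localDbPath);
    }
}
```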

Publish website to Azure, remove additional files at destination, but ignore specific folders

I currently manually delete obsolete folders from a published Azure website. I know there is an option in Visual Studio to "Remove additional files at destination". My problem is that I have an Images folder (quite large) that users upload to, which will be deleted when I publish with this option checked. My question is: is there a way to use this option with exclusions, i.e. delete all files that are not in the local project except the "\Images" folder?
You can most likely customize the Web Deploy usage from VS to do what you want, but I wouldn't recommend it, since things like that tend to get fragile.
I would suggest changing your architecture to store the images in a blob container, then possibly mapping your blobs to a custom domain (https://azure.microsoft.com/en-us/documentation/articles/storage-custom-domain-name/).
Having your images in blob storage will also prevent accidental deletion of the Images folder by someone else who doesn't know it shouldn't be touched (or by you simply forgetting about it one day).
Using blob storage will also allow you to configure CDN usage if you ever find that you need it. A minimal sketch of the upload side is below.
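A minimal sketch of the upload side of that architecture, assuming the classic WindowsAzure.Storage package; the container name and custom domain are assumptions. User uploads go to a blob container instead of the site's Images folder, so web deploy never touches them.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ImageStore
{
    public static async Task<string> SaveUserImageAsync(string connectionString, string fileName, Stream content)
    {
        CloudBlobContainer container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("images"); // container name is an assumption
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
        blob.Properties.ContentType = "image/png"; // set per file as appropriate
        await blob.UploadFromStreamAsync(content);

        // If the container is mapped to a custom domain, serve it from there.
        return $"https://images.example.com/images/{fileName}";
    }
}
```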
Another option would be to create a virtual directory in your Web App configuration and put the images there; that way your VS deploy/publish wouldn't modify that subdirectory. This link may help with that: https://blogs.msdn.microsoft.com/tomholl/2014/09/21/deploying-multiple-virtual-directories-to-a-single-azure-website/

Kentico v7 - Disable 'GetAzureFile' permanent URL

I'm working on a site in Kentico v7, but I have a problem with the images stored in the media folder. I was trying to get the direct URL of an image in the folder from the CMS, but the link the CMS displays uses the page "GetAzureFile.aspx" to serve the image. I verified in Site Manager -> Content -> Media -> General that the option "Use permanent URLs" is disabled, but the problem still occurs.
Any insights would be greatly appreciated!
Azure projects always use blob storage for newly uploaded files. This is because technically the only files physically available in the file system are the ones deployed with the project; when an Azure instance restarts, it loses its local file system and only the deployment package is restored on new instances.
As media library content may change on the fly, Kentico uses GetAzureFile links for all files so it can serve them regardless of where they are stored.
You can, however, use hard-coded links directly to the file system for files that were part of the deployment package, e.g. the ones you use for the site design.

Force SharePoint to save files in a directory

I'd like to force SharePoint to save files in a directory. Is there a way to do that?
I'm thinking about this scenario: users upload files to some list/library in SharePoint, and automatically or by pushing "Publish" the files are copied to a local directory on the server.
Edit:
In other words, I would like to connect a SharePoint library with a physical directory on the server that runs IIS, so that files uploaded to the library show up in that folder.
I'm new to SharePoint.
Are you talking about Remote BLOB Storage (RBS)? I have not tried this, and I believe RBS can only be enabled at the site level, not for individual document libraries. If you want this for a particular document library, you can write an event handler that saves the uploaded documents to the file system and then removes the uploaded file; a sketch is below.
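A minimal sketch of such an event receiver, using the SharePoint server object model (full-trust farm solution). The target path is an assumption, and this version copies the file rather than removing it from the library.

```csharp
using System.IO;
using Microsoft.SharePoint;

public class CopyToFolderReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        SPFile file = properties.ListItem.File;
        if (file == null)
        {
            return;
        }

        // Write the uploaded document to a physical folder on the server.
        byte[] content = file.OpenBinary();
        string target = Path.Combine(@"D:\LibraryCopies", file.Name);
        File.WriteAllBytes(target, content);
    }
}
```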
Most likely you don't want to do this. If you're doing it in order to access the files from other applications, or to have them show up in a user's home directory or something, you can just map the document library as a network drive or web folder.
