Media Library Scheduled Task - Kentico

We are using Kentico 11. I have configured our media library to use Azure CDN. Media files are placed out there via CloudBerry Storage Explorer. It seems to take about 10 minutes for the Kentico media library UI to notice there are new files out there that need to be imported into the database. Two questions: is there something I can do to make Kentico recognize new files more quickly, and has anyone created a scheduled task that checks for files in the CDN that are not yet in the Media_File table and imports them automatically?
Thanks!
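For the second half of the question, the core check such a scheduled task would perform, finding files in storage that have no matching Media_File row, can be sketched as below. The real import would go through Kentico's media library API in C#; this Python sketch (using the third-party `azure-storage-blob` package, with placeholder connection string, container, and path names) only illustrates the detection step:

```python
def find_unimported(storage_paths, db_paths):
    """Return storage paths that have no Media_File row yet."""
    return sorted(set(storage_paths) - set(db_paths))

def list_library_blobs(conn_str, container, prefix):
    """List blob names under the media library folder in the storage account."""
    # Third-party dependency: azure-storage-blob
    from azure.storage.blob import ContainerClient
    client = ContainerClient.from_connection_string(conn_str, container)
    return [b.name for b in client.list_blobs(name_starts_with=prefix)]

# Usage (all values hypothetical):
# blobs = list_library_blobs(conn_str, "media", "MySite/media/MyLibrary/")
# missing = find_unimported(blobs, paths_already_in_media_file_table)
```

The task itself would then create the missing records with Kentico's API and could be registered with any interval shorter than the default refresh.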

Related

Move data from SharePoint through a Logic App

We are using a Logic App to move data from a SharePoint folder to Azure Blob Storage.
We were using the SharePoint trigger "When a file is created or modified in a folder". Unfortunately, this trigger has been deprecated and no longer works: when a file is created or modified, no further action runs.
No file is moved around anymore; the trigger does not execute the Logic App even though a file is created or modified in the SharePoint origin folder. I have been through the various other SharePoint triggers, but they do not seem to fit our use case. We are not using SharePoint lists, just classic folders. We could use several triggers pointing directly at each existing file, but since we have many files to move in the same folder, we would have to create many Logic Apps, and that is not how we want to do it. Moreover, new files may be created in the future, so we cannot create a Logic App for each file.
What could we do to keep the same architecture of moving data from SharePoint to Blob Storage, using only non-deprecated Logic App triggers?
Thank you in advance,
Alexis
You can use the When a file is created or modified (properties only) trigger and get the properties of the file that is created or updated. Then you can use Get file content with the properties from the previous step. Finally, you can create a blob from that content.
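If it helps to see what that final Create blob action amounts to outside the Logic App designer, here is a minimal sketch using the Python SDK; the connection string, container name, and the rule for deriving the blob name are assumptions for illustration:

```python
def blob_name_for(sharepoint_path):
    """Keep only the file name when copying into the container (assumed naming rule)."""
    return sharepoint_path.rsplit("/", 1)[-1]

def copy_to_blob(conn_str, container, sharepoint_path, content):
    """Upload the file content fetched in the previous step as a blob."""
    # Third-party dependency: azure-storage-blob
    from azure.storage.blob import BlobClient
    blob = BlobClient.from_connection_string(conn_str, container,
                                             blob_name_for(sharepoint_path))
    blob.upload_blob(content, overwrite=True)
```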

Kentico v7 - Disable 'GetAzureFile' permanent URL

I'm working on a Kentico v7 site, but I have a problem with images stored in the media folder. When I try to get the direct URL of an image in the CMS, the link it displays goes through the "GetAzureFile.aspx" page instead. I verified in Site Manager -> Content -> Media -> General that the "Use permanent URL" option is disabled, but the problem persists.
Any insights would be greatly appreciated!
Azure projects always use blob storage to store newly uploaded files. This is because, technically, the only files physically available in the file system are the ones that were deployed with the project; when any Azure instance restarts, it loses its local file system, and only the deployment package is restored on the new instance.
As media library content may change on-the-fly, Kentico uses GetAzureFile links for all files to be able to serve them regardless of their storage.
You can, however, hardcode file-system links directly to files that were part of the deployment package, e.g. the ones you use for the site design.

Publishing a web application with an executable to MS Azure

Here is my situation, I have a web app that contains:
An .exe (a .NET project along with assembly files and so on)
ZIPped XML files
Folders containing JS and CSS files
When executed, the .exe parses the XML inside the ZIPs to create HTML files (the end result is a complete HTML page that imports some of the JS libraries and CSS files).
Considering that I have only basic experience with MS Azure, I am looking for a way to have my application run on Azure. My guess is that the ZIPped XML files could be stored in blob storage along with the JS and CSS files. What I am not sure of is how to get the executable running there (possibly deploying the .exe with its corresponding resources, assemblies, DLLs, etc.) and have it execute from there.
If you really want to use a home-grown build process (your exe), then you need to use Cloud Services (your own VM), where you can run it and expose your website over whatever ports you want. However, it sounds like you are new to .NET; I'd suggest reading up on ASP.NET MVC web projects. That way you can leverage Visual Studio for building the website and deploy to an Azure Website, which is designed to host websites.
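As a side note on the storage half (keeping the ZIPped XML, JS, and CSS in blob storage), the download-and-extract step the question guesses at is straightforward. A minimal sketch, assuming the third-party `azure-storage-blob` package and placeholder names:

```python
import io
import zipfile

def extract_xml_members(zip_bytes):
    """Return {name: bytes} for the XML files inside a zipped payload."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {n: zf.read(n) for n in zf.namelist() if n.lower().endswith(".xml")}

def download_zip(conn_str, container, blob_name):
    """Fetch one ZIP blob as bytes (names are illustrative placeholders)."""
    # Third-party dependency: azure-storage-blob
    from azure.storage.blob import BlobClient
    blob = BlobClient.from_connection_string(conn_str, container, blob_name)
    return blob.download_blob().readall()
```

The exe (or its .NET equivalent in a worker role) would then run this per ZIP and write the generated HTML wherever the site serves it from.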

Setting Up Continuous Deployment of a WPF Desktop Application

For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
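The launcher's version check described above, compare local file versions against a version XML on the website and download if newer, can be sketched like this; the XML layout, URLs, and file names are assumptions, not the project's actual files:

```python
import urllib.request
import xml.etree.ElementTree as ET

def is_newer(remote, local):
    """Compare dotted version strings numerically, so 1.2.10 beats 1.2.9."""
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(remote) > to_tuple(local)

def remote_version(version_xml):
    """Assumed layout: <files><file name="Test.exe" version="1.2.3"/></files>."""
    root = ET.fromstring(version_xml)
    return root.find("file").get("version")

def update_if_newer(xml_url, local_version, download_url, dest):
    """Download the newer binary, returning True when an update happened."""
    with urllib.request.urlopen(xml_url) as r:
        remote = remote_version(r.read().decode())
    if is_newer(remote, local_version):
        urllib.request.urlretrieve(download_url, dest)
        return True
    return False
```

The build step then only has to upload the new binary plus a regenerated version XML to the container.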
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You are mentioning ClickOnce, which you investigated but can't use. Have you already tried an alternative: Squirrel? With Squirrel you can specify which files should be part of the installation, allowing you to explicitly specify which files to include even if you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating an auto-updating application and can work with Azure Blob Storage hosting (and also a CDN if you need to scale up).

Will an auto-updating startup task work in an Azure application?

I have built a startup task for an Azure application containing an exe file (running periodically at some interval), and now I would like to make it auto-update every week, as I asked before here.
However, even if I have the exe (startup task) replace the file itself, the new file takes no effect. I have concluded that a new startup task takes effect only if the Azure project is upgraded/created with the new file. (Correct me if I have misunderstood.)
So is there any way to make this work by rebooting the instance (from the exe/startup task)?
I suspect it will still use the original file (the one added to the startup task when the application was upgraded/created) instead of the new file!
Is it possible at all?
This is a very unreliable solution. If an Azure instance crashes or is taken down for updates you will have a new instance started from the original service package. All the state of the modified instance will be lost.
A much more reliable way would be to have the volatile executable stored somewhere like Azure Blob storage. You upload a new version to the blob storage and the role somehow sees that (either by polling the storage or by some user-invoked operation - doesn't matter), downloads the new version and replaces the existing version with the new one.
This way if your role crashes it will reliably fetch the newest version from the persistent storage on startup.
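A minimal sketch of that polling idea, using the blob's ETag as the change marker; the third-party `azure-storage-blob` package and all names here are assumptions:

```python
def has_changed(current_etag, last_seen_etag):
    """The role only downloads when the marker differs from what it last saw."""
    return current_etag != last_seen_etag

def poll_once(conn_str, container, blob_name, last_seen_etag, dest):
    """Check the blob; if it changed, download it and return the new ETag."""
    # Third-party dependency: azure-storage-blob
    from azure.storage.blob import BlobClient
    blob = BlobClient.from_connection_string(conn_str, container, blob_name)
    props = blob.get_blob_properties()
    if has_changed(props.etag, last_seen_etag):
        with open(dest, "wb") as f:
            blob.download_blob().readinto(f)
        return props.etag
    return last_seen_etag
```

Persist the last-seen ETag somewhere durable (a local file is enough, since a fresh instance should download anyway) and call `poll_once` on a timer.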
After studying your problem, I can propose a very simple solution, which I have done before for a Tomcat/Java sample:
Prepare your EXE to reboot the VM along with your original code:
In your EXE, create a method that looks for a specific XML file in Azure Storage at a certain interval; also add retry logic for accessing the XML
Parse the XML for a specific value and, if that value is set, reboot the machine
Package your EXE in ZIP format and place it in your Azure Storage
Be sure to place the XML in the cloud with the reboot flag initially set to false
What to do in Startup Task:
Create a startup task that downloads the ZIP containing your EXE from Azure Storage
After the download, unzip the file and place the EXE in a specific folder
Launch the EXE
What to do when you want to update the EXE:
Update your EXE, package it into a ZIP, and place it in the same location in Azure Storage with the same name
Update your XML to enable the reboot flag
How update will occur:
The EXE will look for the XML after the designed interval
Once it sees that the reboot flag is set, it will reboot the VM
After the reboot, the startup task will launch, and your new EXE will be downloaded to the Azure VM and updated. Be sure the download and update happen in the same folder.
Take a look at the startup task in the sample below, which uses a similar method:
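The XML check that drives the reboot can be sketched like this; the XML layout and the Windows `shutdown` command are assumptions:

```python
import xml.etree.ElementTree as ET

def should_reboot(xml_text):
    """Assumed layout: <config><reboot>true</reboot></config>."""
    root = ET.fromstring(xml_text)
    node = root.find("reboot")
    return node is not None and node.text.strip().lower() == "true"

def check_and_reboot(xml_url):
    """Fetch the flag file from storage and reboot the VM if the flag is set."""
    import urllib.request
    with urllib.request.urlopen(xml_url) as r:
        if should_reboot(r.read().decode()):
            import subprocess
            # Windows guest OS reboot; the startup task re-runs after restart
            subprocess.run(["shutdown", "/r", "/t", "0"])
```

Remember to flip the flag back to false after the update, or the instance will reboot on every poll.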
http://tomcatazure.codeplex.com/
