How to handle synonym files in Elasticsearch Elastic Cloud on Azure

We have a requirement to maintain multiple synonym files as token filters for an index.
But how do I do that in the Azure-managed Elastic Cloud service?
I saw in the documentation (https://www.elastic.co/guide/en/elasticsearch/reference/current/analysis-synonym-tokenfilter.html) that it is generally possible to add synonym file(s) to an index.
But to do that you have to upload a file, and this is not possible, or at least I don't know how to do it, in the Azure-managed Elastic Cloud service.

On Elastic Cloud, you can create a bundle containing your synonym file and upload it as an extension.
Your synonym file needs to be placed in a /dictionaries folder at the root of the ZIP bundle:
/dictionaries
+-- synonyms.txt
Once zipped, you can go to Features > Create extension in your Elastic Cloud console, upload the ZIP file, and attach the extension to your deployment.
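
Once the extension is attached, the bundled files can be referenced by file name through the synonyms_path setting. A minimal sketch with the official Node.js Elasticsearch client, assuming a hypothetical endpoint and index, and a second brands.txt file in the same bundle:

    import { Client } from '@elastic/elasticsearch';

    const client = new Client({
      node: 'https://YOUR-DEPLOYMENT.es.eastus2.azure.elastic-cloud.com:9243', // hypothetical endpoint
      auth: { apiKey: 'YOUR_API_KEY' },
    });

    async function createIndexWithSynonyms() {
      await client.indices.create({
        index: 'products', // hypothetical index name
        settings: {
          analysis: {
            filter: {
              // one token filter per synonym file shipped in the bundle
              product_synonyms: { type: 'synonym', synonyms_path: 'synonyms.txt' },
              brand_synonyms: { type: 'synonym', synonyms_path: 'brands.txt' },
            },
            analyzer: {
              synonym_analyzer: {
                tokenizer: 'standard',
                filter: ['lowercase', 'product_synonyms', 'brand_synonyms'],
              },
            },
          },
        },
      });
    }

    createIndexWithSynonyms().catch(console.error);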

Related

How can I save images uploaded to the file system for a deploy on Google App Engine

In my app deployed on Google App Engine, I save images uploaded by users in the public folder of the file system (using Multer).
Now, if I deploy a new version of my app, the uploaded images are lost. How do I solve this issue?
Is there any method with which I can deploy a new version of my app while keeping the uploaded images intact?
Basically, how can I back up my file system?
App Engine is a serverless and stateless product. You lose the images when you deploy a new version, but also when the service scales up and down, or when your instance is stopped for maintenance and restarted on another server (this is the case for App Engine Flex, which restarts at least once a week).
In this context, your design is not correct for a serverless product. You need to save the files elsewhere, typically in Cloud Storage, and load them from there. If you need to index, search, or list your files, it's also a common pattern to save the metadata in a database, Cloud Firestore for example, so you can easily search the files and then download them from Cloud Storage.
Note: there is no persistent file system; in the serverless environment it's an in-memory file system. You can also get out-of-memory errors if you store too many files.
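
A minimal sketch of that pattern with Multer and the @google-cloud/storage client (the bucket name is hypothetical; on App Engine the client picks up credentials automatically):

    import express from 'express';
    import multer from 'multer';
    import { Storage } from '@google-cloud/storage';

    const app = express();
    const bucket = new Storage().bucket('my-app-uploads'); // hypothetical bucket

    // Keep uploads in memory instead of writing them to the local disk
    const upload = multer({ storage: multer.memoryStorage() });

    app.post('/upload', upload.single('image'), async (req, res) => {
      if (!req.file) return res.status(400).send('No file received');
      // Persist the uploaded bytes to Cloud Storage instead of the instance's file system
      const blob = bucket.file(req.file.originalname);
      await blob.save(req.file.buffer, { contentType: req.file.mimetype });
      res.send(`Stored as gs://${bucket.name}/${blob.name}`);
    });

    app.listen(Number(process.env.PORT) || 8080);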

Where do I create the "upload" folder on my server on Google Cloud to store uploaded files?

I am working on a Node.js app. I have created a folder named "upload" in my app's public directory; this folder stores files that are uploaded using Node.js. This works well locally, but on Google Cloud, where do I create that "upload" folder to be able to store my uploaded images?
I found documentation [1] on using Node.js with Cloud Storage that may help you.
You can find more details about Cloud Storage here.
You can create a "folder" on GCP using buckets (Storage). First, you'll need to create one. You can make the whole bucket publicly readable or keep it private (or do the same per object; you'll see this option on the bucket).
Once you are done with creation, head to the docs for Uploading Objects. You can also see all the other techniques in the second link if you need to. Just click on "Code Samples" and pick NodeJS.
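
Note that the "upload" folder is then just a prefix on the object name. A minimal sketch with the @google-cloud/storage Node.js client, assuming hypothetical bucket and file names:

    import { Storage } from '@google-cloud/storage';

    const storage = new Storage();

    async function uploadToFolder(localPath: string, fileName: string) {
      // "upload/" is only a name prefix; buckets have no real directories
      await storage.bucket('my-bucket').upload(localPath, {
        destination: `upload/${fileName}`,
      });
    }

    uploadToFolder('./photo.jpg', 'photo.jpg').catch(console.error);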

Upload and use web folder on Windows Azure

We already allow our users to upload files through the app to Azure Blob Storage and then view them inside the app.
What we need now is to allow the upload of an entire folder containing web files (HTML, JS, CSS, images...), maintaining its folder structure, and then be able to run these files in the browser. The link references between the files must also be maintained so everything works.
What will be the correct way to do this?
Is it possible through Blob Storage or do we need to upload the folder and its contents directly to the file system?
Thanks!
Note that Azure Blob Storage doesn't have a concept of a "folder". The closest you can get is to name a blob "foldername/filename.ext". How you populate blob storage in this fashion depends on how you let users upload their files - perhaps as a ZIP file, or through some form of AJAX-based file upload UI on a web page... not sure. Ultimately you can't build folders, but you should be able to replicate the behaviour of one.
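
A minimal sketch of that naming scheme with the @azure/storage-blob Node.js package (container name, file paths, and the helper are hypothetical):

    import { BlobServiceClient } from '@azure/storage-blob';

    const container = BlobServiceClient
      .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING!)
      .getContainerClient('sites'); // hypothetical container

    // Preserve the folder structure by encoding it in the blob name,
    // e.g. "site42/css/main.css"
    async function uploadSiteFile(relativePath: string, content: Buffer, contentType: string) {
      const blob = container.getBlockBlobClient(relativePath);
      await blob.upload(content, content.length, {
        // Set the content type so browsers render the file instead of downloading it
        blobHTTPHeaders: { blobContentType: contentType },
      });
    }

    uploadSiteFile('site42/index.html', Buffer.from('<!doctype html>...'), 'text/html')
      .catch(console.error);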

Shared Umbraco Media Folder on Azure Cloud Instances

I have just implemented Umbraco in an Azure Cloud Instance. I was able to migrate my existing SQL Database to run on SQL Azure and everything runs fine, except for the images and documents inside the media folder.
By default the media folder resides in [siteroot]/Media.
Is there a way to map this folder to Azure storage? If not, I don't think I'm going to be able to scale up my cloud instances, since the images depend on the virtual server's local storage.
Edit: Bounty Started
What I have so far is this:
- Define a standalone web role which would hold the media directory and all the files.
- Map this folder to the Azure Blob Storage service with Cloud Drive, in order to minimize the risk of losing data and relying on a single point of storage.
- Somehow (and this is the part I don't know how to accomplish) keep the [siteRoot]/media folder on all running instances synced with this shared drive.
I've seen a similar approach taken with the Azure Accelerator project from Umbraco here: http://azureaccelerators.codeplex.com/releases
But they haven't updated the release since 2011, and I'm not sure it would work with the current version of Azure.
Edit 2:
Umbraco has their own accelerator, but they've deprecated it in favor of using Websites instead of Web Roles:
https://github.com/Microsoft-DPE/wa-accelerator-umbraco
This release works with the 1.6 SDK. The current version is 1.8, I believe...
I'm not sure about a way of mapping the path to storage, but depending on the version of Umbraco you are using (I think from 4.9, possibly 4.10), they introduced FileSystemProviders configuration, which may help solve your problem.
My understanding is that it allows you to replace the default Umbraco FileSystemProvider, Umbraco.Core.IO.PhysicalFileSystem, with your own custom implementation. I'm pretty sure you could implement an Azure-based provider that writes to and reads from blob storage. In the source it looks fairly straightforward: a matter of implementing their IFileSystem.
Ended up using Matt Brailsford's Universal Media Picker solution:
http://our.umbraco.org/projects/backoffice-extensions/universal-media-picker
The final solution actually circumvents the Umbraco media folder and reads directly from Blob Storage, so I had to rewrite all the macros and templates that previously rendered images and point them directly at the Blob Storage account.
Unfortunately there's no way to map an NTFS directory to blob storage directly.
Have a look at the CloudDrive class of the Windows Azure SDK. This feature allows you to upload a virtual hard disk file (.vhd file) into your blob storage and mount it as a local drive inside Windows Azure instances.
You should know that (if you're using multiple instances) only one cloud instance can mount the VHD in read/write mode. The rest of them have only read access to the drive. If the "Media" folder stores static content that you update manually only a few times, this is okay. But if user content is placed there too, you might want only one instance to mount the VHD and grant the other instances access to it via a network share.
This package provided by Ali Sheikh Taheri solves the problem of the media folder:
http://our.umbraco.org/projects/backoffice-extensions/ast-azure-media-sync

Azure Blob - Multiple files into one zip file before downloading

I'm currently using Azure Blob to store files, and upload/download them from an ASP.NET application hosted outside of Azure. (I do not have a Web Role or Worker Role.)
Is it possible to zip multiple files into one zip file within Azure Blob before downloading?
Thanks in advance!
The only way to achieve this would be to use a Windows Azure compute role in the cloud. You obviously wouldn't want to do it on your on-prem servers, as you'd round-trip the files twice.
One approach you might consider would be to build a download 'client' in Silverlight. This could handle the communication with blob storage and pull down the blobs (maybe in parallel), then create the zip client-side for saving.
But the short answer is that this is not possible using Windows Azure Storage alone.
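
For reference, a minimal sketch of the compute-side approach in Node.js, streaming several blobs into one ZIP on the fly; it assumes the @azure/storage-blob and archiver packages, and the container and blob names are hypothetical:

    import express from 'express';
    import archiver from 'archiver';
    import { BlobServiceClient } from '@azure/storage-blob';

    const app = express();
    const container = BlobServiceClient
      .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING!)
      .getContainerClient('files'); // hypothetical container

    app.get('/download-zip', async (req, res) => {
      const names = ['report.pdf', 'invoice.pdf']; // hypothetical blob names
      res.attachment('files.zip');
      const zip = archiver('zip');
      zip.pipe(res);
      for (const name of names) {
        // Stream each blob into the archive without buffering it fully in memory
        const download = await container.getBlobClient(name).download();
        zip.append(download.readableStreamBody as NodeJS.ReadableStream, { name });
      }
      await zip.finalize();
    });

    app.listen(Number(process.env.PORT) || 3000);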
