Is there a way to use Azure Search against Azure File Shares? I only see Blob Storage as an option. We have on-prem servers that sync files to Azure File Shares, and we would like to search inside those files from a web application.
At this moment, there's no built-in way unless you manually query the share and push file content to your Azure Cognitive Search index. In the future, the hope is that you'll be able to trigger an Azure Function with an Azure Files binding, which will make your life easier. You can follow / vote for this feature at the following link:
https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14
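For the manual push approach, a rough sketch along these lines may help. It assumes the azure-storage-file-share and azure-search-documents Python SDKs, a file share and an index with `id` and `content` fields; the share name, index name, field names, and keys are all placeholders, not anything your setup necessarily has:

```python
# Hypothetical sketch: read a file from an Azure File Share and push its text
# into an Azure Cognitive Search index. Share name, index name, and field
# names are placeholders for your own setup.
import base64

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.storage.fileshare import ShareFileClient

CONN_STR = "<storage-connection-string>"
SEARCH_ENDPOINT = "https://<search-service>.search.windows.net"
SEARCH_KEY = "<admin-key>"

def index_file(share: str, path: str) -> None:
    # Download the file content from the share
    file_client = ShareFileClient.from_connection_string(
        CONN_STR, share_name=share, file_path=path)
    content = file_client.download_file().readall().decode("utf-8")

    # Push a document into the search index (the key must be URL-safe)
    search_client = SearchClient(SEARCH_ENDPOINT, "files-index",
                                 AzureKeyCredential(SEARCH_KEY))
    doc_key = base64.urlsafe_b64encode(path.encode()).decode()
    search_client.upload_documents(
        documents=[{"id": doc_key, "content": content}])

index_file("myshare", "reports/summary.txt")
```

You would still need something (a timer-triggered Function, a scheduled job, etc.) to decide which files to re-index and when.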
Per the UserVoice page for Azure Search (https://feedback.azure.com/forums/263029-azure-search/suggestions/14274261-indexer-for-azure-file-shares), an Azure Files indexer is available in private preview (in fact, it has been in this stage for almost 2 years now :)).
The Search team would like you to reach out to them in case you're interested.
I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted website.
Every document has an equivalent record in a database.
I am using terraform to create the azure infrastructure.
What is my best way of migrating the documents from the local file path on the server to azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable in Lift-and-Shift kind of scenarios where you're moving an on-prem application to the cloud and the application writes data to either local or network attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a storage explorer tool like Microsoft's Azure Storage Explorer.
Use AzCopy command-line tool.
Use Azure PowerShell Cmdlets.
Use Azure CLI.
Write your own code using any available Storage Client libraries or directly consuming REST API.
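To illustrate the last option, here is a minimal sketch using the Python azure-storage-blob client library; the connection string, container name, and file paths are placeholders for your own values:

```python
# Minimal sketch: upload a local file to Blob Storage with the
# azure-storage-blob Python package. All names below are placeholders.
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-connection-string>"

service = BlobServiceClient.from_connection_string(CONN_STR)
blob_client = service.get_blob_client(container="documents",
                                      blob="uploads/contract.pdf")

with open("C:/uploads/contract.pdf", "rb") as data:
    # overwrite=True replaces the blob if it already exists
    blob_client.upload_blob(data, overwrite=True)
```

For a one-off bulk migration of ~many files, AzCopy is usually the simpler choice since it handles recursion, retries and parallelism for you.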
I am looking to do the following in Azure; however, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is whether this is possible to do using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be greatly received!
Thanks
Dave
Based on your description, one approach would be to create a blob-triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file and save the desired data into Azure SQL. Moreover, considering that there is only one new file per day, prefer the Consumption plan to optimize cost.
Find more details about Azure Function Blob Binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
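As a rough illustration of that approach, the sketch below uses the Python programming model for Azure Functions (C# works just as well from the portal). The container name, app setting names, table, and columns are all placeholders, and it assumes the daily zip contains a CSV file:

```python
# Sketch of a blob-triggered Azure Function (Python v2 programming model).
# Assumptions: a container named "daily-drop", storage connection in the
# "AzureWebJobsStorage" setting, and an Azure SQL connection string in a
# "SqlConnection" app setting. Table and column names are placeholders.
import csv
import io
import os
import zipfile

import azure.functions as func
import pyodbc  # ODBC driver must be available in the Functions runtime

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="daily-drop/{name}",
                  connection="AzureWebJobsStorage")
def process_daily_file(blob: func.InputStream):
    # Unzip the incoming blob entirely in memory
    with zipfile.ZipFile(io.BytesIO(blob.read())) as zf:
        for member in zf.namelist():
            rows = csv.reader(
                io.TextIOWrapper(zf.open(member), encoding="utf-8"))
            with pyodbc.connect(os.environ["SqlConnection"]) as conn:
                cursor = conn.cursor()
                for row in rows:
                    # Replace table/columns with your own schema
                    cursor.execute(
                        "INSERT INTO dbo.DailyData (Col1, Col2) VALUES (?, ?)",
                        row[0], row[1])
                conn.commit()
```

For large files you may want to stage the unzipped data and bulk-load it instead of inserting row by row, but the shape of the solution is the same.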
Alternatively, spin up an Azure Data Factory; unzip (decompression) functionality is available in ADF.
Currently, I am working on a task to sync files in Azure File storage with file storage in a custom data center. I need a way to get a notification if something changes inside Azure File storage.
For example, in AWS I can configure a notification through a Lambda function. Is there any similar way to do this in Azure?
As of today, this feature is not available, as an Azure Files binding is not supported. There is an open ticket on GitHub regarding this: https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14. It is available for Blob Storage though (that's why I asked in my comment).
For a list of available bindings, please see this: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings.
I want to write the output of a pipeline to an FTP folder. ADF seems to support an on-premises file system but not an FTP folder.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF as of right now, so there is no OOTB way to interact with an FTP server for either reading or writing.
However, you can use a custom activity to make it possible, but it will require some custom development to make this happen. A fellow Cloud Solution Architect within MS put together a blog post that talks about how he did it for one of his customers. Please take a look at the following:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
Upon thinking about it, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob storage account and then either
1) manually: downloading and pushing the file to the "FTP" site from the Blob storage account or
2) automatically: using Azure CLI to pull the file locally and then push it to the "FTP" site with a batch or shell script as appropriate
As a lighter-weight alternative to custom activities (which are certainly the better option for heavy work), you may wish to consider using Azure Functions to write to FTP (note there is a timeout when using a Consumption plan, but not in other plans, so it will depend on how big the files are).
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to an intermediary Blob storage account,
and use a Blob storage trigger in Azure Functions to upload the files to FTP as soon as they appear in Blob storage.
Or alternatively, write to Blob storage and then use a timer in Logic Apps to upload from Blob storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
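A rough sketch of the Functions route, assuming a Python function app, a staging container, and FTP credentials held in app settings (all names below are placeholders, not anything ADF gives you out of the box):

```python
# Hypothetical sketch: when ADF drops a file into a staging container,
# push it to an FTP server. Container name, app setting names, and paths
# are placeholders for your own environment.
import io
import os
from ftplib import FTP

import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="adf-output/{name}",
                  connection="AzureWebJobsStorage")
def push_to_ftp(blob: func.InputStream):
    # blob.name is "<container>/<blob path>"; keep only the file name
    file_name = blob.name.split("/")[-1]

    with FTP(os.environ["FTP_HOST"]) as ftp:
        ftp.login(os.environ["FTP_USER"], os.environ["FTP_PASSWORD"])
        # STOR uploads the stream under the given remote file name
        ftp.storbinary(f"STOR {file_name}", io.BytesIO(blob.read()))
```

Keep the Consumption-plan timeout mentioned above in mind if the files are large.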
You can write a Logic app that will pick your file up from Azure storage and send it to an FTP site. Then call the Logic App using a Data Factory Web Activity.
Make sure you do some error handling in your Logic app to return 400 if the ftp fails.
I have an Azure website that allows customers to upload documents. There are a lot of documents (~200 GB so far).
I need a way to backup the documents to another location (Azure or other location), or have live replication to a server. Is there anything I can use that will do this?
Have you looked at Azure Storage redundancy options? The geo-redundant option might solve your replication need.
http://blogs.msdn.com/b/windowsazurestorage/archive/2013/12/04/introducing-read-access-geo-replicated-storage-ra-grs-for-windows-azure-storage.aspx
You can use CloudBerry Backup. The tool will not give you cloud storage, but it will assist in the backup process. It can back up either to the cloud or to local storage.