There is a tutorial in the Microsoft docs that you can see here:
Tutorial: Build your first pipeline to transform data using Hadoop cluster
in "Prerequisites" section, in step 6, they wrote "Use tools such as Microsoft Azure Storage Explorer".
the question is, can I use some other tools? especially, is it possible to use scripting languages like Python directly?
I need to do all these 7 steps dynamically, using something like Azure Function Apps. do you know is it possible and if it is, from where I should start?
The short answer is YES. But again, you haven't shared details on what functionality you are looking for specifically.
What you can do is call the REST API endpoints for the corresponding service.
Depending on whether you are using Blobs, Tables, or Queues, there are specific APIs that you can call.
Azure Storage Services REST API Reference
Blob Service REST API
Queue Service REST API
Table Service REST API
File Service REST API
Taking Blobs as an example, there is an API to upload content using the PUT method. See this for more details: Put Blob
Similarly, there are APIs for reading containers, listing containers, etc.
There are also some samples on working with Azure Storage in Python on GitHub. See this: Azure-Samples/storage-python-getting-started
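If you'd rather use the Python SDK than raw REST calls, a minimal sketch looks something like this (the connection string, container, and blob names below are just placeholders):

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string; copy yours from the storage account's Access keys blade.
    conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

    service = BlobServiceClient.from_connection_string(conn_str)

    # Upload a local file as a block blob (the SDK issues the Put Blob call for you).
    blob = service.get_blob_client(container="input", blob="sample.log")
    with open("sample.log", "rb") as data:
        blob.upload_blob(data, overwrite=True)

    # List the containers in the account.
    for container in service.list_containers():
        print(container.name)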
HTH
I am working on something where I need to replicate only a few tables, instead of the entire database, from the leader cluster. How should I do it in the Azure Portal using Azure Data Share? I can see from the Azure documentation that they are using C# or some other language for it; can we do it directly via the Azure Portal?
As of this writing, table-level sharing isn't yet available through Azure Data Share, but should become available in the next few weeks (follow this doc for updates: https://learn.microsoft.com/en-us/azure/data-explorer/data-share)
As you mentioned correctly, it is already available programmatically using the management API (documented here: https://learn.microsoft.com/en-us/azure/data-explorer/follower#table-level-sharing)
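If it helps, here's a rough Python sketch of what that management API call looks like with the azure-mgmt-kusto package. All the resource names below are placeholders, and exact model/method names may differ slightly between SDK versions:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.kusto import KustoManagementClient
    from azure.mgmt.kusto.models import (
        AttachedDatabaseConfiguration,
        TableLevelSharingProperties,
    )

    client = KustoManagementClient(DefaultAzureCredential(), "<subscription-id>")

    config = AttachedDatabaseConfiguration(
        location="West Europe",
        database_name="MyDatabase",
        cluster_resource_id=(
            "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
            "Microsoft.Kusto/clusters/<leader-cluster>"
        ),
        default_principals_modification_kind="Union",
        # Only follow the tables you actually need from the leader database.
        table_level_sharing_properties=TableLevelSharingProperties(
            tables_to_include=["Table1", "Table2"],
        ),
    )

    # Older SDK versions expose create_or_update instead of begin_create_or_update.
    poller = client.attached_database_configurations.begin_create_or_update(
        resource_group_name="<follower-rg>",
        cluster_name="<follower-cluster>",
        attached_database_configuration_name="myConfiguration",
        parameters=config,
    )
    poller.result()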
In my app I would like my users to upload local videos, which I want to store in Blob Storage. I would like to achieve this using Azure Functions. Is this possible? If so, I couldn't find any resource that points me in the right direction; if not, what would be the ideal way to achieve this? I am building the app using Flutter, for which we do not yet have SDKs. Any help is appreciated.
There's no SDK available for Dart. You'll have to build a back end running on .NET, Java, JavaScript, or Python in order to use Azure Functions.
But you can use the Azure Storage REST API to store the videos as blobs using a Storage Account.
Take a look at the official reference here. Using this, you'll be able to store the videos over HTTP.
Also, this tutorial might be useful.
The fella here is using the File Service instead of the Blob Service.
One point to keep in mind is that there are some limitations; it's not that what you want to do is impossible, but it's good to be aware of them:
There are limitations to the storage service.
[...] You can only upload 4 MB “chunks” per upload. So if your files exceed 4 MB you have to split them into parts. If you are a good programmer you can make use of tasks and await to start multiple threads. Please consult the Azure limits documentation to see if any other restrictions apply.
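To make that concrete, here is a minimal Python sketch of calling the Put Blob REST operation directly over HTTP. The account, container, and SAS token are placeholders (you would typically have your back end, e.g. an Azure Function, hand out the SAS token), and a Flutter client would make the equivalent HTTP request:

    import requests

    # Placeholder values; a SAS token scoped to the container would normally be
    # generated server-side and handed to the client.
    account = "<storage-account>"
    container = "videos"
    blob_name = "clip.mp4"
    sas_token = "sv=...&sig=..."

    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"

    with open("clip.mp4", "rb") as f:
        body = f.read()

    resp = requests.put(
        url,
        data=body,
        headers={
            "x-ms-blob-type": "BlockBlob",   # required by the Put Blob operation
            "Content-Type": "video/mp4",
        },
    )
    resp.raise_for_status()   # 201 Created on success

    # For large videos you would split the file and use Put Block / Put Block List
    # instead of a single Put Blob request (see the size limits quoted above).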
I want to use a Logic App with my RSS connections.
I created the RSS feed by following this tutorial.
I am looking for a suitable Azure service to upload it to, and for how to set the RSS URL in the Logic App, according to this.
The first thing you should know is that the Logic App RSS connector cannot update a feed. Check the connector reference: apart from the trigger, it only has a List all RSS feed items action.
If you still want to do this with a Logic App and an Azure service, you could use an Azure Function to implement it, and Azure Blob Storage to host the uploaded file. For the function type, you could use an HTTP trigger function. Azure Functions also supports Java; for more details you could refer to this tutorial: Azure Functions Java developer guide. A sketch of such a function is shown below.
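Here is a rough sketch of such an HTTP-triggered function in Python (the connection string, container, and blob names are placeholders; the same idea works in Java if you prefer):

    import azure.functions as func
    from azure.storage.blob import BlobClient, ContentSettings

    def main(req: func.HttpRequest) -> func.HttpResponse:
        # The generated RSS XML is expected in the request body.
        rss_xml = req.get_body()
        if not rss_xml:
            return func.HttpResponse("Empty body", status_code=400)

        blob = BlobClient.from_connection_string(
            conn_str="<storage connection string>",   # placeholder
            container_name="feeds",
            blob_name="feed.xml",
        )
        blob.upload_blob(
            rss_xml,
            overwrite=True,
            content_settings=ContentSettings(content_type="application/rss+xml"),
        )
        # Point the Logic App RSS trigger at this blob's (public or SAS) URL.
        return func.HttpResponse(blob.url, status_code=200)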
I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted website.
Every document has an equivalent record in a database.
I am using Terraform to create the Azure infrastructure.
What is the best way to migrate the documents from the local file path on the server to Azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in Terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable for lift-and-shift scenarios where you're moving an on-prem application to the cloud and the application writes data to either a local or a network-attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a Storage Explorer like Microsoft's Storage Explorer.
Use AzCopy command-line tool.
Use Azure PowerShell Cmdlets.
Use Azure CLI.
Write your own code using any of the available storage client libraries, or by directly consuming the REST API (see the sketch below).
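As an illustration of the last option, here is a minimal Python sketch (using the azure-storage-blob package; the connection string, folder, and container names are placeholders) that copies a local document folder into a blob container while preserving the folder structure:

    import os
    from azure.storage.blob import BlobServiceClient

    # Placeholder values; point these at your own account and local folder.
    conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    local_root = "/var/www/uploads"          # current document folder on the server
    container_name = "documents"

    service = BlobServiceClient.from_connection_string(conn_str)
    container = service.get_container_client(container_name)

    for dirpath, _, filenames in os.walk(local_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Keep the relative folder structure as part of the blob name.
            blob_name = os.path.relpath(path, local_root).replace(os.sep, "/")
            with open(path, "rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)
            print(f"Uploaded {blob_name}")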
I am using ASE to access my Azure Storage accounts and it works great, but for some reason when I try to access my HDInsight cluster container (the one that has the same name as the HDInsight cluster), I get nothing; it seems to time out without a message. This is quite frustrating. A search turned up nothing, so I suspect this is not usual behavior?
Any ideas about what I could do to fix this?
Here is a Fiddler screenshot. It looks like it transferred 15+ MB of data, but it never displayed it... odd.
(Note: I just noticed that it actually does work if I use ASE from a VM in the same data center as my storage account.)
I haven't looked at the source code of Azure Storage Explorer (ASE), but from the Fiddler trace it seems that ASE is trying to fetch all blobs in your blob container first and then display them in the UI: the trace shows multiple requests to your storage account, each containing a continuation token as a query string parameter (the marker parameter), and it looks like you have lots of blobs in the container. Given that ASE makes use of the .NET Storage Client Library, my guess is that it is using the ListBlobs method, which fetches all blobs in the container, instead of the ListBlobsSegmented method.
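To illustrate the marker/continuation-token pattern (not what ASE itself does internally, since it uses the .NET library), here is a small Python sketch with the azure-storage-blob package that fetches the blob list one page at a time; the connection string and container name are placeholders:

    from azure.storage.blob import ContainerClient

    container = ContainerClient.from_connection_string(
        conn_str="<storage connection string>",        # placeholder
        container_name="<hdinsight-cluster-container>",
    )

    # Ask for 500 blobs per page instead of enumerating the whole container.
    pages = container.list_blobs(results_per_page=500).by_page()
    first_page = next(pages)
    for blob in first_page:
        print(blob.name)

    # pages.continuation_token can be stored and passed back to by_page()
    # later to resume listing where you left off.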
Your options would be:
Get the source code of ASE from CodePlex, mess around with it, and implement some kind of pagination.
File a bug/feature request on ASE to support pagination.
Use another storage explorer. If you're looking for a desktop-based blob explorer, I would recommend looking into Cerebrata tools (www.cerebrata.com). If you're looking for a browser-based tool, I would recommend looking at Cloud Portam (www.cloudportam.com). [Disclosure: I am developing Cloud Portam]