I want to export a machine learning model I created in Azure Machine Learning studio. One of the required inputs is "Path to blob beginning with container".
How do I find this path? I have already created a blob storage account, but I have no idea how to find the path to the blob.
You should be able to find this in the Azure portal. Open the storage account, drill down into Blobs, then into your container. Select Properties from the context menu; the URL shown there should be the path.
You can also get this URL using the Azure Storage Explorer desktop application.
It is also available through the online version of Azure Storage Explorer.
You can also simply construct the URL yourself if you know the storage account name and the container name:
https://[storageaccount].blob.core.windows.net/[container]/[blob]
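If you prefer to confirm the path programmatically, here is a minimal sketch using the azure-storage-blob Python package; the connection string, container name, and blob names are placeholders, not values from your account:

    # minimal sketch, assuming the azure-storage-blob package;
    # the connection string and container name are placeholders
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    container = service.get_container_client("<container>")

    # list every blob in the container and print its full URL
    for blob in container.list_blobs():
        print(container.get_blob_client(blob.name).url)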
I have developed an app for a client with an Azure database. I have never worked with Azure before, so I have no idea what I'm doing. I am using the .NETAutoUpdater package as the updater, and it requires the update file to be a zip available on a public link. Is there any way to do this using Azure Storage Accounts --> Blobs, and just make the blob public with a link? Again, I have no idea what I am doing in Azure, so any assistance will be appreciated.
Yes, this is definitely possible. From the list of your blob containers, click the grey dots at the end of the line and choose "Change access level".
Then change the access level to "Blob". Now open the blob container and select its Properties view.
As you can see, it shows a URL like https://yourapp.blob.core.windows.net/yourblobcontainer. Files placed in this container will be downloadable via that URL. For example, a file named foo.bar will be available at https://yourapp.blob.core.windows.net/yourblobcontainer/foo.bar.
It is actually fairly simple and straightforward to do. Simply set the ACL of the blob container containing your zip file to either Blob (recommended) or Container, and the blobs inside that container will be publicly accessible.
You can set the ACL of the blob container in the portal, using any of the available storage explorer tools, or programmatically, for example as sketched below.
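A minimal sketch of the programmatic route, assuming the azure-storage-blob Python package; the connection string and container name are placeholders:

    # minimal sketch, assuming the azure-storage-blob package;
    # connection string and container name are placeholders
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    container = service.get_container_client("<container-with-zip>")

    # "blob" = anonymous read access to blobs only (recommended);
    # "container" would also allow anonymous listing of the container
    container.set_container_access_policy(signed_identifiers={}, public_access="blob")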
With the ability now to export a managed disk and download it locally via the portal, is there a way to do this via PowerShell?
Or do you still need to first copy it to a storage account and then pull it down?
As per this post:
Download Azure VHD to local use powershell
The short answer is yes: a managed disk does not expose the URL you need to download it from Azure through PowerShell, even though it is still just a VHD file. So you should first get a VHD file URL by copying the disk to a storage account blob. Then you can download it however you like; I suggest AzCopy, as described in your link.
Alternatively, you can generate an export URL and download the disk directly in the Azure portal without copying it to a storage account first. Hope this helps.
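If you do go the copy-then-download route, the copy and download steps can be scripted. A minimal sketch, assuming the azure-storage-blob Python package and a SAS URL already generated for the managed disk (for example from the portal's Disk Export blade); the connection string, container, and blob names are placeholders:

    # minimal sketch, assuming the azure-storage-blob package; the SAS URL,
    # connection string, container, and blob names are placeholders
    import time
    from azure.storage.blob import BlobServiceClient

    disk_sas_url = "<sas-url-generated-for-the-managed-disk>"

    service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
    target = service.get_blob_client(container="vhds", blob="exported-disk.vhd")

    # server-side (async) copy from the disk's SAS URL into the storage account
    target.start_copy_from_url(disk_sas_url)
    while target.get_blob_properties().copy.status == "pending":
        time.sleep(10)

    # once the copy has completed, download the VHD to a local file
    with open("exported-disk.vhd", "wb") as f:
        target.download_blob().readinto(f)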
Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata list if any metadata exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, please see Get Blob Metadata.
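To confirm the metadata was actually written to the cloud, you can also round-trip it from a small script. A minimal sketch, assuming the azure-storage-blob Python package; the connection string, container, blob name, and metadata values are placeholders:

    # minimal sketch, assuming the azure-storage-blob package;
    # connection string, container, and blob names are placeholders
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    blob = service.get_blob_client(container="<container>", blob="<blob-name>")

    # write metadata (this replaces any metadata already on the blob)
    blob.set_blob_metadata(metadata={"category": "images", "owner": "me"})

    # read it back to confirm it was persisted in the cloud
    print(blob.get_blob_properties().metadata)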
Would like to know whether it is feasible to move a folder (with files) from Azure Blob/File storage to the web app root.
Scenario: I would like to use PowerShell to replace, on a weekly basis, the gallery images folder used by a static HTML site for its gallery section.
Requesting suggestions or alternatives, as I am not sure how to handle this in Azure or how to schedule the swapping of folders between blob storage and FTP.
You can use a BlobTrigger with a WebJob deployed on the same web app and copy the files from blob storage to the local file system.
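The copy step itself is straightforward. A minimal sketch (independent of the WebJobs SDK), assuming the azure-storage-blob Python package; the connection string, container name, and target folder are placeholders:

    # minimal sketch, assuming the azure-storage-blob package;
    # connection string, container name, and target folder are placeholders
    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<your-connection-string>")
    container = service.get_container_client("gallery-images")

    target_dir = r"D:\home\site\wwwroot\gallery"  # web app folder (placeholder)
    os.makedirs(target_dir, exist_ok=True)

    # download every blob in the container into the web app's gallery folder
    for blob in container.list_blobs():
        local_path = os.path.join(target_dir, os.path.basename(blob.name))
        with open(local_path, "wb") as f:
            container.download_blob(blob.name).readinto(f)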
I would like to use PowerShell to replace, on a weekly basis, the gallery images folder used by a static HTML site for its gallery section.
Please try serving the images from Azure blob storage directly. Images in blob storage can be accessed with the 'Full public read access' mode or the 'Public read access for blobs only' mode; refer to this article for more details. Then we can use a scheduled WebJob to replace the images directly.
It wasn't clear to me exactly what you are trying to do. If you have a legacy app or an existing reliance on FTP, you can mount an FTP server on Azure File Storage. Alternatively, Blob Storage can be used for public data as described above. If you want a simple tool for interacting with Blob Storage, you can try Storage Explorer.
I am trying to download an Azure VM blob using Azure Storage Explorer so that I can upload the .vhd to another subscription. However, I get this error when downloading the blob: "Unable to read data from the transport connection: The connection was closed."
Is there any way to solve it?
In order to transfer a VHD from one storage account to another, you don't need to do that. In fact, I will go out on a limb and say "Please stop using Azure Storage Explorer". This tool has not been updated in ages and does not have the latest functionality offered by Azure Storage.
Azure Storage supports asynchronous server-side blob copy, which copies blobs from one storage account to another on the server side, without having to first download the blob from the source storage account and re-upload it to the target storage account.
I would recommend using AzCopy, which is now part of the Azure SDK. If you have the latest version of the SDK installed on your computer, you can find it in the C:\Program Files (x86)\Microsoft SDKs\Windows Azure\AzCopy folder. Here's sample usage to copy a file from one storage account to another:
AzCopy "https://<oldaccountname>.blob.core.windows.net/<oldaccountcontainername .. usually vhds>/" "https://<newaccountname>.blob.core.windows.net/<newaccountcontainername .. again vhds>/" "<filenametocopy.extension" /SourceKey:<oldaccountkey> /DestKey:<newaccountkey> /BlobType:page /S