Azure Created Container Not Displayed in Portal

I'm following the steps in this tutorial: How to use Blob storage from iOS, to generate Shared Access Signatures (SAS). I ran the commands successfully, including this one:
azure storage container sas create --container sascontainer --permissions rw --expiry 2016-09-05T00:00:00
My terminal said:
info: Executing command storage container sas create
+ Creating shared access signature for container sascontainer
I looked at the Azure portal and I don't see that container, sascontainer, created anywhere. My understanding from this article is that it will create a container:
--container : The name of the storage container to create.
So, where is it!? Shouldn't that command be enough to create that container and make it visible in my Azure portal!? I have also looked in the Azure Classic Portal.

azure storage container sas create --container sascontainer --permissions rw --expiry 2016-09-05T00:00:00
This command will not create a blob container. It creates a Shared Access Signature for a container named sascontainer with Read and Write permissions, expiring on 2016-09-05T00:00:00.
To create a blob container, the command you want to use is:
azure storage container create "sascontainer"
Once this command completes successfully, you should be able to see the blob container in the portal.
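For reference, a minimal sketch of the full sequence with the classic azure CLI, assuming credentials are supplied via the AZURE_STORAGE_CONNECTION_STRING environment variable (or --connection-string), as in the tutorial:
# Create the container first, then generate the SAS token for it
azure storage container create sascontainer
azure storage container sas create --container sascontainer --permissions rw --expiry 2016-09-05T00:00:00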

Related

Not sure whether the container is listed as [] in azure storage

I am following this tutorial (and the ones before it): https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/9-initialize-the-storage-account-model?pivots=javascript to connect an application to the Azure Storage account.
At step 8, when I verify the creation of the container by running the given Azure CLI command and replacing with my storage account:
az storage container list \
--account-name <name>
I get the following output:
There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.
You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://learn.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.
In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.
[]
I am not sure whether the [] at the end of this output means my container is listed.
Comments and suggestions are welcome. Thanks!
The message you are getting is because of an auth issue.
There are three solutions. The first is to run the following command before running az storage container list:
az login
The second is to use the --auth-mode option with az storage container list, as the warning itself suggests:
az storage container list --account-name <name> --auth-mode login
This will prompt you for login credentials; once they are provided, the command lists your containers.
Lastly, you can authenticate with the account key instead of logging in:
az storage container list --account-name <name> --account-key <key>
You can get your key from the portal under Access keys.
In my environment, the output lists two containers named photos and test.
I tried to reproduce this in my environment and I got the same message:
There are no credentials provided in your command and environment, we will query for account key for your storage account. It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.
You also can add --auth-mode login in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles. For more information about RBAC roles in storage, visit https://learn.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.
In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.
[]
That output shows that you haven't created any containers in your storage account; the empty brackets [] mean no containers were found.
I created one container and added files to it.
When I ran the same command again, the container was listed in the output.
If you want to suppress the warnings, add the --only-show-errors option.
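As a minimal sketch of that sequence, with a hypothetical account name mystorageacct and assuming your login account holds the required RBAC role:
# Create a container, then list containers using Azure AD auth, warnings suppressed
az storage container create --account-name mystorageacct --name photos --auth-mode login
az storage container list --account-name mystorageacct --auth-mode login --only-show-errors --output table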
Reference:
az storage container | Microsoft Learn

Delete content of Azure pipeline blob storage via Azure release

I'm trying to build a CI/CD pipeline. I want to copy my new static React build bundle to Azure Blob storage. As part of this, the old content needs to be deleted. The only action I seem able to take, however, is copy. Is it possible to do this?
It is not possible to delete or clean up blob content using the Azure File Copy task.
You can use an Azure CLI task to invoke az commands that clean up your container before running the Azure File Copy task:
az storage blob delete-batch --account-name <storage_account_name> --source <container_name>
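A minimal sketch of the clean-then-deploy flow for the inline script of that Azure CLI task, with hypothetical account and container names:
# Delete the old static content, then upload the new React build output
az storage blob delete-batch --account-name mystorageacct --source webcontent
az storage blob upload-batch --account-name mystorageacct --destination webcontent --source ./build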

Bash integration with Terraform

I am writing an azcopy script that captures a Linux (Ubuntu 18.04) system log and stores it in a storage account container, and I am automating the whole process with Terraform: I created the machine with Terraform and integrated the shell script via a VM extension.
The issue is that when azcopy copies the file from the system to the storage account, it needs azcopy login to authenticate, and that interactive step can't be performed through automation.
I am using the following azcopy script (version v10); please help me with this:
AzCopy /Source:/var/log/syslog /Dest:https://testingwt.blob.core.windows.net/insights-operational-logs/ /SourceKey:y/bUACOu/wogikUT1EG0XeaPC4Y6spHcZly2d26QeENKwMiRpjFu5PwmXrThRbNGS3PiPfqEX8WsYC3dg== /S
To upload files to Blob storage with a shell script automatically, you can use a SAS token for the storage account, or run azcopy login with a service principal or the VM managed identity.
For the SAS token:
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/?sv=2018-03-28&ss=bjqt&srt=sco&sp=rwddgcup&se=2019-05-01T05:01:17Z&st=2019-04-30T21:01:17Z&spr=https&sig=MGCXiyEzbtttkr3ewJIh2AR8KrghSy1DGM9ovN734bQF4%3D" --recursive=true
For the service principal, you need to set the environment variable AZCOPY_SPA_CLIENT_SECRET to the service principal's secret, and assign the service principal the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account:
azcopy login --service-principal --application-id <application-id> --tenant-id=<tenant-id>
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/" --recursive=true
For the VM managed identity, you likewise need to assign the identity the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account:
azcopy login --identity
azcopy copy "/path/to/file" "https://account.blob.core.windows.net/mycontainer1/" --recursive=true
But when you use the VM managed identity, you need to execute the shell script on the Azure VM itself, which means running your Terraform-driven script there. So the best way is to use a service principal: you can then execute the shell script on any other Linux machine, for example your local one. The SAS token is also a good option that requires no role assignment. For more details, see Use AzCopy with Azure Storage Blob.
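Putting the service principal option together for the syslog upload in the question, a minimal non-interactive sketch (the application ID, tenant ID, and secret are placeholders):
# Supply the secret, log in as the service principal, then copy the log file
export AZCOPY_SPA_CLIENT_SECRET='<client-secret>'
azcopy login --service-principal --application-id <application-id> --tenant-id <tenant-id>
azcopy copy "/var/log/syslog" "https://testingwt.blob.core.windows.net/insights-operational-logs/"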

Getting error while copying files between Azure BLOB container using AzCopy

I am trying to copy all files from one container to another. I am using AzCopy to accomplish this task.
AzCopy command as below:
azcopy copy "https://xxxxxxx.blob.core.windows.net/customers" "https://xxxxxxx.blob.core.windows.net/archive" --recursive
Error: (screenshot of an authorization error)
Alternatively, is it possible to move files between containers?
Please follow this doc to grant your user account the Storage Blob Data Contributor RBAC role on your account or your containers.
Besides, there isn't a "move" operation in Azure Blob Storage; you need to delete the source blobs after copying them.
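A minimal sketch of emulating a move with the question's container names (placeholder account; assumes you are already authorized, e.g. via azcopy login with the role above):
# No native "move": copy the blobs, then delete them from the source
azcopy copy "https://xxxxxxx.blob.core.windows.net/customers" "https://xxxxxxx.blob.core.windows.net/archive" --recursive
azcopy remove "https://xxxxxxx.blob.core.windows.net/customers" --recursive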

Azure SAS for one container

I am able to generate a SAS token for the storage account from the Azure portal, but the problem I am facing is explained below.
The storage account consists of two containers. One container has to be accessible to the users to whom I provide the SAS token, and the other container should be completely private, meaning users cannot see it.
The problem is that if I generate an account-level SAS token and log into Azure Storage Explorer using that token, I see both containers, but my requirement is to see only one. Is there any way to grant permission for only one container by generating a SAS token in the Azure portal, without creating a custom application to generate these tokens?
The easiest way to do that would be to use PowerShell:
Set-AzureRmCurrentStorageAccount -ResourceGroupName 'resource group name' -Name 'storage account name'
$sasToken = New-AzureStorageContainerSASToken -Permission r -ExpiryTime (Get-Date).AddHours(2.0) -Name 'container name'
You could issue this command with the -Debug switch, capture the REST call, and mimic it using ARMClient, a custom app, or whatever you prefer.
The Azure CLI alternative:
az storage container generate-sas --account-name ACCOUNT_NAME --account-key ACCOUNT_KEY --https-only --expiry 'YYYY-MM-DD' --name CONTAINER_NAME --permissions r
Valid permissions: (a)dd (c)reate (d)elete (l)ist (r)ead (w)rite
For more information, check out: az storage container generate-sas -h
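For example, a sketch that issues a read-only SAS for a single hypothetical container and shows how the token is used:
# Generate the token (prints a query string like se=...&sig=...)
az storage container generate-sas --account-name mystorageacct --account-key "$ACCOUNT_KEY" --name publiccontainer --permissions r --expiry 2019-12-31 --https-only --output tsv
# Append it to the container URL:
# https://mystorageacct.blob.core.windows.net/publiccontainer?<sas-token>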
