AzCopy Sync command is failing - azure

I'm issuing this command:
azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED"
...and I'm getting this error:
error parsing the input given by the user. Failed with error Unable to infer the source 'D:\Releases\Test\MyApp' / destination 'http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=-REDACTED-
I would have thought my source was pretty clear.
Can anyone see anything wrong with my syntax?

I believe you have run into an issue with azcopy: it does not support the local storage emulator (at least for the sync command). There's an open issue on GitHub for this: https://github.com/Azure/azure-storage-azcopy/issues/554.
The problem comes from the following code, which returns the location as Unknown for storage emulator URLs:
func inferArgumentLocation(arg string) common.Location {
    if arg == pipeLocation {
        return common.ELocation.Pipe()
    }
    if startsWith(arg, "http") {
        // Let's try to parse the argument as a URL
        u, err := url.Parse(arg)
        // NOTE: sometimes, a local path can also be parsed as a url. To avoid thinking it's a URL, check Scheme, Host, and Path
        if err == nil && u.Scheme != "" && u.Host != "" {
            // Is the argument a URL to blob storage?
            switch host := strings.ToLower(u.Host); true {
            // Azure Stack does not have the core.windows.net
            case strings.Contains(host, ".blob"):
                return common.ELocation.Blob()
            case strings.Contains(host, ".file"):
                return common.ELocation.File()
            case strings.Contains(host, ".dfs"):
                return common.ELocation.BlobFS()
            case strings.Contains(host, benchmarkSourceHost):
                return common.ELocation.Benchmark()
            // enable targeting an emulator/stack
            case IPv4Regex.MatchString(host):
                return common.ELocation.Unknown() // This is what gets returned for a storage emulator URL.
            }
            if common.IsS3URL(*u) {
                return common.ELocation.S3()
            }
        }
    }
    return common.ELocation.Local()
}

I had this same issue when trying to do a sync from my local machine to Azure Blob storage.
This was the command I was running:
azcopy sync "C:\AzureStorageTest\my-app\*" "https://myapptest.z16.web.core.windows.net/$web"
But I got the error below:
INFO: The parameters you supplied were Source: 'c:\AzureStorageTest\my-app' of type Local, and Destination: 'https://myapptest.z16.web.core.windows.net/' of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phrases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
error parsing the input given by the user. Failed with error Unable to infer the source 'C:\AzureStorageTest\my-app' / destination 'https://myapptest.z16.web.core.windows.net/'.
PS C:\Users\promise> azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web" -from-to localblob
Error: unknown shorthand flag: 'f' in -from-to
Here's how I solved it:
I was missing the ?[SAS] argument at the end of the Blob storage location. Also, as of this writing, the azcopy sync command does not seem to support the --from-to switch. So instead of this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web"
I had this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.blob.core.windows.net/%24web?[SAS]" --recursive
Note:
The format is azcopy sync "/path/to/dir" "https://[account].blob.core.windows.net/[container]/[path/to/directory]?[SAS]" --recursive. You only need to change "/path/to/dir", [account], and [container]/[path/to/directory]; everything else stays as it is (see the SAS sketch after these notes).
My actual Blob storage location is https://myapptest.blob.core.windows.net/$web, but I used https://myapptest.blob.core.windows.net/%24web, because a literal $ causes an error, so it is URL-encoded as %24.
Don't use the static-website address like I did (https://myapptest.z16.web.core.windows.net/); that got me frustrated with lots of errors. Use the blob storage address, https://myapptest.blob.core.windows.net, instead.
The --recursive flag is already inferred for this directory sync operation, so you could leave it out. I included it for clarity's sake.
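If you don't already have a SAS, one way to mint a short-lived container SAS is with the Azure CLI. This is only a sketch: the account name, key, expiry, and permission set below are placeholders to adapt, and the returned string is what goes after the ? in the container URL.
az storage container generate-sas --account-name myapptest --name '$web' --permissions acdlrw --expiry 2020-06-25T00:00Z --account-key "<storage-account-key>" -o tsv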
That's all.
I hope this helps

OK, after much futzing I was finally able to get this to work, using Azurite and PowerShell. It's clear that neither the Azure CLI nor AzCopy is well tested under emulation.
Here's a rough-and-tumble script that can be called from a pipeline:
[CmdletBinding()]
param(
    [Parameter(Mandatory)][string] $Container,
    [Parameter(Mandatory)][string] $Source
)

# Connect to the local emulator (Azurite) storage account.
$Context = New-AzureStorageContext -Local

# Names of blobs already in the container, so only new packages get uploaded.
$BlobNames = Get-AzureStorageBlob -Context $Context -Container $Container | ForEach-Object { $_.Name }

# Always re-upload the Squirrel metadata files.
$FilesToSync = Get-ChildItem $Source\* -Include RELEASES, Setup.exe

# Add only the .nupkg packages that aren't in the container yet
# (-notcontains also handles an empty container safely).
$Packages = Get-ChildItem $Source -Filter *.nupkg
$Packages | ForEach-Object {
    if ($BlobNames -notcontains $_.Name) {
        $FilesToSync += $_
    }
}

$FilesToSync | Set-AzureStorageBlobContent -Context $Context -Container $Container -Force
Note that this is highly customized for my Squirrel deployments (*.nupkg, RELEASES, Setup.exe), so you'll want to adjust it for your own environment.
Azurite can be set to always-on using a Scheduled Task to run this command every hour:
powershell -command "Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'"
The argument sets Azurite to listen on any IP so that it can be reached from other computers on the network. I punched a hole in the firewall for ports 10000-10002.
Be careful to set the Task to run under the same account that was used to install Azurite, otherwise the Task won't be able to see azurite-blob.cmd (it's in %AppData%\npm, which is added to PATH during installation).
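For reference, a rough PowerShell sketch of registering such a task; the task name, the hourly repetition, and the user it runs under are assumptions to adapt:
# Sketch only: registers a task that (re)launches Azurite listening on all interfaces every hour.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument "-command `"Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'`""
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
# Register under the same account that installed Azurite so %AppData%\npm is on PATH.
Register-ScheduledTask -TaskName 'Azurite' -Action $action -Trigger $trigger -User "$env:USERDOMAIN\$env:USERNAME"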

The command line needs the --recursive option. See https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs?toc=/azure/storage/blobs/toc.json


Microsoft Azure Storage AzCopy : error parsing the argument "copy"

I'm trying to download some data from a blob for which I have a SAS key, using AzCopy.
My code:
azcopy copy "my/path/testMindee/fileData.txt" "https...blob.core.windows.net/90138?sv=mykey&sp=rl"
The error:
The syntax of the command is incorrect. Error parsing the argument "copy" : parameter is required.
A screenshot of my code
After that I tried:
My code:
azcopy cp "https...blob.core.windows.net/90138/DBxxx/file.csv?sv=mykey&sp=rl" "my/path/testMindee/fileData.txt" --recursive
The error:
The syntax of the command is incorrect. Error parsing the argument "copy" : parameter is required.
My second try
Both commands still give me the error: The syntax of the command is incorrect. Error parsing the argument "cp": parameter name is required.
Thank you in advance
Are you using the latest version of AzCopy? The documentation you are following is for AzCopy 10.
I had the same error as you, and then found out I was using AzCopy version 8.1, which has a completely different way of passing parameters.
Example usage of AzCopy 8.1:
AzCopy /Source:<source> /Dest:<destination> [options]
In your image you have code like this:
azcopy copy "path/to/local/file" "path/to/storageacc/blob"
The error message says: "Error parsing the argument copy".
Basically you need to do azcopy cp instead of azcopy copy
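If you're not sure which generation of AzCopy is on your PATH, a quick check (assuming v10) is:
azcopy --version
An 8.x install instead uses the /Source:/Dest: style shown above, so the v10 documentation won't apply to it.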
You want to download a file from a storage account with a SAS token:
I'm trying to get some data from a blob for which I have a SAS key with AzCopy
From the Microsoft azcopy copy documentation:
Download a single file by using a SAS token:
azcopy cp "https://[account].blob.core.windows.net/[container]/[path/to/blob]?[SAS]" "/path/to/file.txt"
In case you want to upload a file:
azcopy cp "/path/to/file.txt" "https://[account].blob.core.windows.net/[container]/[path/to/blob]?[SAS]"
So when you want to download from a storage account, put the remote location first and the local path where the file should be stored second. Also note that the azcopy command is followed by cp, not copy.
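Putting the two answers together, a working v10 download for the original question would look something like this (a sketch; it assumes the SAS grants at least read and list permissions, and --recursive is only needed when copying whole directories):
azcopy cp "https://<account>.blob.core.windows.net/90138/DBxxx/file.csv?<SAS>" "my/path/testMindee/fileData.txt"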
Kind regards

Alternative to New-SqlAzureKeyVaultColumnMasterKeySettings

I currently have a script similar to the one described in https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-keys-using-powershell?view=sql-server-ver15
However, the script uses PowerShell 5, which isn't available to me on the Linux agents in our Azure DevOps environment. Because of this, the majority of the Azure SQL commands aren't available to us. Is there an alternative to these SqlServer module cmdlets?
The recommended alternative is to use the cloud console, but you may be able to get away with Linux PowerShell like so:
I can't guarantee that this all works, but that specific command only creates a SqlColumnMasterKeySettings object that contains information about the location of your column master key. You can probably just create one manually, but you'll need to know the exact values. I would recommend running it from a Windows machine first to see what the values should be for your environment.
# On Windows [Optional]
$cmkSettings = New-SqlAzureKeyVaultColumnMasterKeySettings -KeyURL $akvKey.Key.Kid
$cmkSettings | Format-List KeystoreProviderName,KeyPath
KeystoreProviderName : # Take these strings
KeyPath : # And use them in your script on linux
# Now on Linux, using the values above:
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new("YourKeystoreProviderName","YourKeyPath")
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $database -ColumnMasterKeySettings $cmkSettings
# Success!
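For what it's worth, when the column master key lives in Azure Key Vault, the two values should come out roughly like this. This is a hedged sketch: "AZURE_KEY_VAULT" is the standard Key Vault keystore provider name, and the key path shown is a made-up Key Vault key URL, so confirm both against the Windows output above.
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new(
    "AZURE_KEY_VAULT",                                   # KeystoreProviderName
    "https://myvault.vault.azure.net/keys/MyCMK/abc123") # KeyPath (hypothetical key URL)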
The key settings properties are just strings that get saved to your SQL Instance, so this should work fine. The harder part is authenticating to Azure to create keys from your master key, but you can try importing the desktop version of the commands like so:
# Start a NEW powershell session without the sqlserver module:
pwsh
# Get the module directory:
$d = (Get-Item (Get-Module SqlServer).path).DirectoryName
# Import the desktop version of these assemblies:
Import-Module "$d/Microsoft.SqlServer.Diagnostics.Strace.dll"
Import-Module "$d/Microsoft.SqlServer.Management.PSSnapins.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AzureAuthenticationManagement.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AlwaysEncrypted.Types.dll"
# Then import the module normally (there will be errors - you can ignore these)
Import-Module SqlServer
# Now you may be able to authenticate to Azure to generate new keys:
# Required to generate new keys
# NOTE: -Interactive fails on linux
Add-SqlAzureAuthenticationContext -ClientID "YourID" -Secret "YourSecret" -Tenant "YourTenant"
# Create a key using your master key:
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $database -ColumnMasterKey 'CMK1'
This worked on my installation of CentOS 7 / pwsh 7.1.3. Make sure you have SqlServer module version 21.1.18245 (only 10 days old at the time of writing), as many new SQL commands were ported over for pwsh 7.1.

Azcopy: Copying files to an Azure Fileshare using Azcopy 10

I'm trying to copy files to and from an Azure file share using AzCopy v10. I had this working successfully with v8.1, but I keep getting errors with v10.
From the command line I'm using this to copy a file from the local drive to the file share:
c:\Temp\azcopy.exe copy "c:\temp\sample.txt" "https://myfiles.file.core.windows.net/dbfiles/sample.txt?SASKeyText"
This generates the error message:
failed to perform copy command due to error: cannot transfer individual files/folders to the root of a service. Add a container or directory to the destination URL
I have tried adding a directory to the file share and adding that to the command string, but I get the same error.
If I reverse the copy, from the file share to the local drive, I get the error:
failed to perform copy command due to error: account copies are an inherently recursive operation, and thus --recursive is required
I have followed the guide at https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-files but haven't been able to see what's wrong.
Thanks in advance for any help.
The error here was with the SAS token and not the form of the command.
I suppose this should be marked up amongst examples of unhelpful error messages.
Thanks to everyone who took the time to have a look.
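Since the culprit was the SAS token, one way to mint a fresh share-level SAS is with the Azure CLI. This is only a sketch; the account name, share name, key, expiry, and permissions are placeholders, and the output gets appended to the file share URL after a ?:
az storage share generate-sas --account-name myfiles --name dbfiles --permissions rcwdl --expiry 2021-12-31T00:00Z --account-key "<storage-account-key>" -o tsv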
I had this same issue when trying to do a copy from my local machine to Azure Blob storage.
This was the command I was running:
azcopy copy --from-to=LocalBlob "C:\AzureStorageTest\my-app\*" "https://myapptest.blob.core.windows.net/%24web" --recursive
But I got the error below:
failed to perform copy command due to error: cannot transfer individual files/folders to the root of a service. Add a container or directory to the destination URL
Here's how I solved it:
I was missing the ?[SAS] argument at the end of the Blob storage location. So instead of this:
azcopy copy --from-to=LocalBlob "C:\AzureStorageTest\my-app\*" "https://myapptest.blob.core.windows.net/%24web" --recursive
I had this:
azcopy copy --from-to=LocalBlob "C:\AzureStorageTest\my-app\*" "https://myapptest.blob.core.windows.net/%24web?[SAS]" --recursive
Note:
The format is azcopy copy "/path/to/dir" "https://[account].blob.core.windows.net/[container]/[path/to/directory]?[SAS]" --recursive. You only need to change "/path/to/dir", [account], and [container]/[path/to/directory]; everything else stays as it is.
Specify the source-destination routing with the --from-to=LocalBlob argument (if you're copying from local to Blob storage) to be explicit about the copy operation.
My actual Blob storage location is https://myapptest.blob.core.windows.net/$web, but I used https://myapptest.blob.core.windows.net/%24web, because a literal $ causes an error, so it is URL-encoded as %24.
That's all.
I hope this helps
Here is a sample azcopy script which worked for me:
az storage azcopy blob upload \
  -c 'https://$AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/\$web' \
  --account-name $AZURE_STORAGE_ACCOUNT_NAME \
  -s "build/*" \
  --account-key $AZURE_STORAGE_ACCOUNT_ACCESS_KEY \
  --recursive
If you get this error while trying to copy to $web container:
"failed to perform copy command due to error: cannot transfer individual files/folders to the root of a service. Add a container or directory to the destination URL"
Per a solution listed here, we need to add an escape character (\) before $web. The following command (to copy all files and subfolders to the $web container) worked for me:
azcopy copy "<local_folder>/*" "https://******.blob.core.windows.net/\$web/?<SAS token>" --recursive
Without the escape character, the following command fails with the above error.
azcopy copy "<local_folder>/*" "https://******.blob.core.windows.net/$web/?<SAS token>" --recursive

Azure: Deprovision a linux instance using Custom Script Extension

I am attempting to deprovision an Azure Linux instance using the Custom Script Extension.
My script, stored in an anonymous-access blob:
sudo waagent -deprovision+user -verbose -force
exit
My Command to apply the extension:
az vm extension set --resource-group rg01 --vm-name vm01 --name CustomScript --publisher Microsoft.Azure.Extensions --version 2.0 --settings "{'fileUris': [ 'https://mystorageaccount.blob.core.windows.net/scripts/Deprovision.sh' ], 'commandToExecute': 'sh Deprovision.sh'}"
When I run the az command, all the /var/log/azure subdirectories and logs disappear! I can tell from the bastion window that something tried to delete my user account, so I am confident the extension is getting provisioned and run.
Unfortunately, the extension shows all status information as unavailable, and my az command just sits there. The "Create or Update Virtual Machine Extension" item in the VM's activity log also shows no activity once the deprovision starts. The Azure activity log suggests a restart occurred and my account is no longer valid.
Hopefully Linux/Azure folks have a recipe for this...
I saw similar behavior on Windows and ended up using the script below to Sysprep. The -Wait was critical: it forces the PowerShell process to wait for completion, preventing the agent from returning success/fail until the process has finished, which in turn prevents a restart. I can then script deallocation to occur when the extension completes. I suspect something similar is going on here.
Start-Process -FilePath C:\Windows\System32\Sysprep\Sysprep.exe -ArgumentList '/generalize /oobe /quiet /quit' -Wait
While the command:
sudo waagent -deprovision+user -verbose -force
works via SSH, when run via CustomScript extension this command basically kills everything on the machine. The CustomScript extension is not able to acknowledge completion.
Using this script:
sudo shutdown +3
sudo waagent -deprovision+user -verbose -force -start
Line 1 shuts the VM down in 3 minutes (the deprovision command seems very fast).
Line 2, by adding '-start', runs waagent as a background process, which allows the CustomScript extension to acknowledge completion.
Now this command completes (instead of hanging):
var cmd = "sh Deprovision.sh";
var result = vm.Update()
    .DefineNewExtension("deprovision")
        .WithPublisher("Microsoft.Azure.Extensions")
        .WithType("CustomScript")
        .WithVersion("2.1")
        .WithMinorVersionAutoUpgrade()
        .WithPublicSetting("fileUris", new string[] { blobUri })
        .WithPublicSetting("commandToExecute", cmd)
        .Attach()
    .Apply();
After completion, we must poll Azure for when the VM is Stopped.
WaitForVMToStop(vm);
private void WaitForVMToStop(IVirtualMachine newVM)
{
    Context.Logger.LogInformation($"WaitForVMToStop...");
    bool stopped = false;
    int cnt = 0;
    do
    {
        var stoppedVM = Context.AzureInstance.VirtualMachines.GetById(newVM.Id);
        stopped = (stoppedVM.PowerState == PowerState.Stopped);
        if (!stopped)
        {
            cnt++;
            Context.Logger.LogInformation($"\tPolling 60 seconds for 'PowerState = Stopped' on [{newVM.Name}]...");
            System.Threading.Thread.Sleep(60000);
            if (cnt > 20)
            {
                Context.Logger.LogInformation($"\tSysPrep Extension exceeded 20 minutes. Aborting...");
                throw new Exception($"SysPrep Extension exceeded 20 minutes on [{newVM.Name}]");
            }
        }
    } while (!stopped);
    Context.Logger.LogInformation($"\tWaited {cnt} minutes for 'PowerState = Stopped'...");
}
This seems way too complicated to me, but it works. I especially do not like assuming deprovision will occur in 3 minutes or less. If anybody has a better way, please share.

The 'Run PowerShell' artifact failed to install with CommandNotFoundException

I'm trying to download and run a PowerShell script (from blob storage) using the Run PowerShell artifact on an existing VM in Azure DevTest Labs.
I get the following error and I assume I am doing something stupid.
& : The term './script.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:3
+ & ./script.ps1
+ ~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (./script.ps1:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
Here is my setup...
I have also tried the JSON array syntax, which gave the same result, and an invalid URL which gave a 404 error so it seems as if it is downloading my script but then failing to find it.
Below is info I wrote a while back.
A few items to note:
Folder structure is not supported as of this writing, so the script needs to be at the root of the container
Ensure your blob is public
First you will need your file in Azure storage. Once uploaded in your container, click the file to get to its properties and copy the URL field.
As an example, I have created the following Run.ps1 script file and uploaded it to storage as a blob:
# A script can declare only one param block, so both parameters go in it.
param (
    [string]$drive = "c:\",
    [string]$folderName = "DefaultFolderName"
)
New-Item -Path $drive -Name $folderName -ItemType "directory"
Now, while adding the 'Run PowerShell' artifact to the VM, we provide the following information:
File URI(s): URL field copied from earlier step. (eg. https://myblob.blob.core.windows.net/mycontainer/Run.ps1)
Script to Run: Name of the PS1 script, (eg. Run.ps1)
Script Arguments: Arguments as you would write them at the end of your command (eg. -drive "d:\" -folderName "MyFolder")
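Under the hood, the artifact downloads the file and then invokes it roughly like this (a sketch inferred from the error output in the question, using the example values above), which is why the script name and arguments have to match what you uploaded:
& ./Run.ps1 -drive "d:\" -folderName "MyFolder"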
