az storage blob upload command fails with a strange exception when invoked inside a Windows executable (.exe)

We have the following az storage blob upload command in a PowerShell script named upload_file.ps1 that uploads a file to Azure Storage as a blob.
$ErrorActionPreference = "Stop"
# Blob connection string parsed from a secure string
az storage blob upload --container-name "ftp" --connection-string "$blobConnStr" --name "testfile.txt" --file testfile.txt
There is no problem when executing this script directly. But after converting it into a Windows executable, upload_file.exe, using the PS2EXE tool, execution fails with the exception below.
ERROR: System.Management.Automation.PSInvocationStateInfo
ERROR: System.Management.Automation.ActionPreferenceStopException: The running command stopped because the preference variable "ErrorActionPreference" or common parameter is set to Stop: System.Management.Automation.RemoteException
at System.Management.Automation.ExceptionHandlingOps.CheckActionPreference(FunctionContext funcContext, Exception exception)
at System.Management.Automation.Interpreter.ActionCallInstruction`2.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.EnterTryCatchFinallyInstruction.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.Interpreter.Run(InterpretedFrame frame)
at System.Management.Automation.Interpreter.LightLambda.RunVoid1[T0](T0 arg0)
at System.Management.Automation.DlrScriptCommandProcessor.RunClause(Action`1 clause, Object dollarUnderbar, Object inputToProcess)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Management.Automation.Internal.PipelineProcessor.SynchronousExecuteEnumerate(Object input)
at System.Management.Automation.Runspaces.LocalPipeline.InvokeHelper()
at System.Management.Automation.Runspaces.LocalPipeline.InvokeThreadProc()
ERROR: Failed
ERROR: The running command stopped because the preference variable "ErrorActionPreference" or common parameter is set to Stop: System.Management.Automation.RemoteException
To print the verbose-level messages above, I modified the ps2exe.ps1 script slightly, as shown below.
if (powershell.InvocationStateInfo.State == PSInvocationState.Failed) {
    ui.WriteErrorLine(powershell.InvocationStateInfo.ToString());
    ui.WriteErrorLine(powershell.InvocationStateInfo.Reason.ToString());
    ui.WriteErrorLine(powershell.InvocationStateInfo.State.ToString());
    ui.WriteErrorLine(powershell.InvocationStateInfo.Reason.Message);
}
We are not sure if there is a compatibility issue between the Azure CLI and PS2EXE-generated Windows executables. We would appreciate it a lot if someone with Windows experience could enlighten us on this.
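One thing worth checking (an assumption on my part, not confirmed for this exact case): az.cmd writes progress and warning text to stderr, and a PS2EXE-hosted script can surface those stderr lines as terminating error records when $ErrorActionPreference is Stop. A minimal sketch that redirects stderr and relies on the exit code instead:
# Sketch only: assumes the failure comes from az writing to stderr while
# $ErrorActionPreference is "Stop" inside the PS2EXE host.
$ErrorActionPreference = "Continue"
$output = az storage blob upload --container-name "ftp" --connection-string "$blobConnStr" --name "testfile.txt" --file testfile.txt 2>&1
if ($LASTEXITCODE -ne 0) {
    throw "az storage blob upload failed: $output"
}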

Related

AzCopy remove files is failing

The azcopy remove command failed when I tried to remove a folder with subfolders and files from a file share with AzCopy v10.
My azcopy command was as follows:
azcopy rm https://<storage-account-name>.file.core.windows.net/<file-share-name>/SystemScheduledJobs-22-06-01?<sas-token> --recursive=true
The error I was getting is:
panic: inconsistent path separators. Some are forward, some are back. This is not supported.
and the stack trace:
goroutine 565 [running]:
github.com/Azure/azure-storage-azcopy/v10/common.DeterminePathSeparator({0xc0071c6d00, 0x3e})
/home/vsts/work/1/s/common/extensions.go:140 +0x97
github.com/Azure/azure-storage-azcopy/v10/common.GenerateFullPath({0x0, 0x0}, {0xc0071c6d00, 0x3e})
/home/vsts/work/1/s/common/extensions.go:155 +0xe5
github.com/Azure/azure-storage-azcopy/v10/common.GenerateFullPathWithQuery({0x0, 0x248f909e0be}, {0xc0071c6d00, 0xb85c3d}, {0x0, 0x0})
/home/vsts/work/1/s/common/extensions.go:172 +0x34
github.com/Azure/azure-storage-azcopy/v10/ste.(*JobPartPlanHeader).TransferSrcDstStrings(0x248f8ec0000, 0x1853)
/home/vsts/work/1/s/ste/JobPartPlan.go:181 +0x28f
github.com/Azure/azure-storage-azcopy/v10/ste.(*jobPartMgr).ScheduleTransfers(0xc000029500, {0x17c3190, 0xc0005b2000})
/home/vsts/work/1/s/ste/mgr-JobPartMgr.go:418 +0x692
github.com/Azure/azure-storage-azcopy/v10/ste.(*jobMgr).scheduleJobParts(0xc0007a3880)
/home/vsts/work/1/s/ste/mgr-JobMgr.go:851 +0x3e
created by github.com/Azure/azure-storage-azcopy/v10/ste.NewJobMgr
/home/vsts/work/1/s/ste/mgr-JobMgr.go:180 +0x9a6
I would be really grateful if anyone could provide more insight into this issue.
I tried to reproduce this in my environment using the PowerShell command below, and it completed successfully.
PowerShell command:
azcopy rm https://<storage-account-name>.file.core.windows.net/<file-share-name>/<path>?<sas-token> --recursive=true
Output:
FileShare URL: https://<storage-account-name>.file.core.windows.net/<file-share-name>/<path>
SAS token at the storage-account level.
Kindly check the path in your URL and try to reproduce again.
Reference:
azcopy remove | Microsoft Docs
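The panic message suggests the target path mixes backslash and forward-slash separators. As a speculative sketch (not confirmed as the cause here), normalizing any backslashes to forward slashes before building the URL in PowerShell would look like this; the variable names are illustrative:
# Speculative: normalize any '\' in the folder path to '/' before calling azcopy rm.
$folder = "SystemScheduledJobs-22-06-01" -replace '\\', '/'
azcopy rm "https://<storage-account-name>.file.core.windows.net/<file-share-name>/${folder}?<sas-token>" --recursive=true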

Publishing Azure Function Using CloudShell

I am using Cloud Shell to publish an Azure Function. I was able to publish it previously, but today I am getting this error while executing the publish command func azure functionapp publish <APP_NAME>:
Getting site publishing info...
Creating archive for current directory...
Performing remote build for functions project.
Deleting the old .python_packages directory
Uploading 15.91 KB [##############################################################################]
Remote build in progress, please wait...
Unexpected character encountered while parsing value: {. Path '[0].build_summary', line 1, position 630.

Azure WebApp Linux and KissLog fail to access /tmp path

I have an ASP.NET Core 3.1 API application running as an Azure Web App on Linux. I use KissLog for logging, and with a certain frequency I get the following errors:
C:\Catalin\KissLog-net\KissLog.Sdk\src\KissLog\LoggerFiles\LoggerFiles.cs LogFile :58
Exception:
System.UnauthorizedAccessException: Access to the path '/tmp/KissLog/2d76c974d7d1.tmp' is denied.
---> System.IO.IOException: Bad file descriptor
--- End of inner exception stack trace ---
at System.IO.FileStream.Dispose(Boolean disposing)
at System.IO.FileSystem.CopyFile(String sourceFullPath, String destFullPath, Boolean overwrite)
at KissLog.LoggerFiles.LogFile(String sourceFilePath, String fileName)
Inner Exception:
System.IO.IOException: Bad file descriptor
This error only occurs on the Linux Web App; the same code runs well on a Windows Web App.
This error is triggered when KissLog is trying to log the HTTP response body. If this step fails, the exception is logged and the rest of the execution is not affected.
I have released KissLog 5.0.0, which contains improvements on this functionality.
If possible, please update the SDK to 5.0.0.
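For reference, the update is a one-liner from the project directory (assuming the NuGet package ID is simply KissLog; adjust if your project references a different KissLog package):
dotnet add package KissLog --version 5.0.0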

Trying to create a Docker container function app; when I do docker run I get the error "Storage not defined"

I am following the Microsoft document (https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-function-linux-custom-image?tabs=bash%2Cportal&pivots=programming-language-python) but using the Azure Blob Storage trigger template.
When I run docker run -p 8080:80 -it example/azurefunctionsimage:v1.0, I get the error below:
fail: Host.Startup[402]
The 'voice-text' function is in error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.voice-text'. Microsoft.Azure.WebJobs.Extensions.Storage: Storage account connection string 'AzureWebJobsStorage' does not exist. Make sure that it is a defined App Setting.
Hosting environment: Production
Content root path: /
Now listening on: http://[::]:80
Application started. Press Ctrl+C to shut down.
info: Host.General[316]
Host lock lease acquired by instance ID '0000000000000000000000000'.
warn: Microsoft.Azure.WebJobs.Script.ChangeAnalysis.ChangeAnalysisService[0]
Breaking change analysis operation failed
System.InvalidOperationException: The BlobChangeAnalysisStateProvider requires the default storage account 'Storage', which is not defined.
at Microsoft.Azure.WebJobs.Script.ChangeAnalysis.BlobChangeAnalysisStateProvider.GetCurrentAsync(CancellationToken cancellationToken) in /src/azure-functions-host/src/WebJobs.Script.WebHost/BreakingChangeAnalysis/BlobChangeAnalysisStateProvider.cs:line 40
at Microsoft.Azure.WebJobs.Script.ChangeAnalysis.ChangeAnalysisService.TryLogBreakingChangeReportAsync(CancellationToken cancellationToken) in /src/azure-functions-host/src/WebJobs.Script.WebHost/BreakingChangeAnalysis/ChangeAnalysisService.cs:line 92
Please help in solving this error.
Azure Functions needs a storage account in order to run locally.
The best way to add this is to create a Storage Account in Azure and pass the connection string as an environment variable called AzureWebJobsStorage to the docker run command.
docker run -p 8080:80 -it -e AzureWebJobsStorage="{connection-string}" <docker-id>/mydockerimage:v1.0.0
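If you don't have the connection string at hand, here is a sketch of fetching it with the Azure CLI and passing it through in PowerShell (the account, resource group, and image names are placeholders):
# Placeholders throughout; they are quoted so PowerShell does not trip on '<' and '>'.
$conn = az storage account show-connection-string --name "<storage-account-name>" --resource-group "<resource-group>" --query connectionString --output tsv
docker run -p 8080:80 -it -e AzureWebJobsStorage="$conn" "<docker-id>/mydockerimage:v1.0.0"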

AzCopy Sync command is failing

I'm issuing this command:
azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED"
...and I'm getting this error:
error parsing the input given by the user. Failed with error Unable to infer the source 'D:\Releases\Test\MyApp' / destination 'http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=-REDACTED-
I would have thought my source was pretty clear.
Can anyone see anything wrong with my syntax?
I believe you have run into an issue where azcopy does not support the local emulator (at least for the sync command). There's an open issue on GitHub for the same: https://github.com/Azure/azure-storage-azcopy/issues/554.
Basically the issue comes from the following lines of code, where the location is returned as Unknown for storage emulator URLs:
func inferArgumentLocation(arg string) common.Location {
    if arg == pipeLocation {
        return common.ELocation.Pipe()
    }
    if startsWith(arg, "http") {
        // Let's try to parse the argument as a URL
        u, err := url.Parse(arg)
        // NOTE: sometimes, a local path can also be parsed as a url. To avoid thinking it's a URL, check Scheme, Host, and Path
        if err == nil && u.Scheme != "" && u.Host != "" {
            // Is the argument a URL to blob storage?
            switch host := strings.ToLower(u.Host); true {
            // Azure Stack does not have the core.windows.net
            case strings.Contains(host, ".blob"):
                return common.ELocation.Blob()
            case strings.Contains(host, ".file"):
                return common.ELocation.File()
            case strings.Contains(host, ".dfs"):
                return common.ELocation.BlobFS()
            case strings.Contains(host, benchmarkSourceHost):
                return common.ELocation.Benchmark()
            // enable targeting an emulator/stack
            case IPv4Regex.MatchString(host):
                return common.ELocation.Unknown() // This is what gets returned in case of a storage emulator URL.
            }
            if common.IsS3URL(*u) {
                return common.ELocation.S3()
            }
        }
    }
    return common.ELocation.Local()
}
I had this same issue when trying to do a sync from my local machine to Azure Blob storage.
This was the command I was running:
azcopy sync "C:\AzureStorageTest\my-app\*" "https://myapptest.z16.web.core.windows.net/$web"
But I got the error below:
INFO: The parameters you supplied were Source: 'c:\AzureStorageTest\my-app' of type Local, and Destination: 'https://myapptest.z16.web.core.windows.net/' of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
error parsing the input given by the user. Failed with error Unable to infer the source 'C:\AzureStorageTest\my-app' / destination 'https://myapptest.z16.web.core.windows.net.z16.web.core.windows.net/'.
PS C:\Users\promise> azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net.z16.web.core.windows.net/$web" -from-to localblob
Error: unknown shorthand flag: 'f' in -from-to
Here's how I solved it:
I was missing the ?[SAS] argument at the end of the Blob storage location. Also, as of this writing, the azcopy sync command does not seem to support the --from-to switch (note that the "unknown shorthand flag: 'f'" error above came from writing -from-to with a single dash; the flag syntax is --from-to). So instead of this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web"
I had this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.blob.core.windows.net/%24web?[SAS]" --recursive
Note:
The format is azcopy sync "/path/to/dir" "https://[account].blob.core.windows.net/[container]/[path/to/directory]?[SAS]" --recursive. You only need to modify "/path/to/dir", [account], and [container]/[path/to/directory]. Everything else stays as it is.
My actual Blob storage location is https://myapptest.blob.core.windows.net/$web, but I used https://myapptest.blob.core.windows.net/%24web, since $ throws an error when used; %24 is the URL-encoded form of $.
Don't use the static-website address (https://myapptest.z16.web.core.windows.net/) like I did; that got me frustrated with lots of errors. Use the blob storage address https://myapptest.blob.core.windows.net instead.
The --recursive flag is already inferred in this directory sync operation, so you may consider leaving it out; I included it for clarity's sake.
That's all. I hope this helps.
OK, after much futzing I was finally able to get this to work, using Azurite and PowerShell. It's clear that neither the Azure CLI nor AzCopy is well tested under emulation.
Here's a rough-and-tumble script that can be called from a pipeline:
[CmdletBinding()]
param(
    [Parameter(Mandatory)][string] $Container,
    [Parameter(Mandatory)][string] $Source
)

$Context = New-AzureStorageContext -Local
$BlobNames = Get-AzureStorageBlob -Context $Context -Container $Container | % { $_.Name }

$FilesToSync = gci $Source\* -Include RELEASES, Setup.exe
$Packages = gci $Source -Filter *.nupkg
$Packages | % {
    If (!($BlobNames.Contains($_.Name))) {
        $FilesToSync += $_
    }
}

$FilesToSync | Set-AzureStorageBlobContent -Context $Context -Container $Container -Force
Note that this is highly customized for my Squirrel deployments (*.nupkg, RELEASES, Setup.exe), so you will want to adjust it for your own environment.
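Saved as, say, Sync-Blobs.ps1 (the file name is hypothetical), it can be called from a pipeline step like this:
.\Sync-Blobs.ps1 -Container myapp -Source "D:\Releases\Test\MyApp"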
Azurite can be set to always-on using a Scheduled Task to run this command every hour:
powershell -command "Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'"
The argument sets Azurite to listen on any IP so that it can be reached from other computers on the network. I punched a hole in the firewall for ports 10000-10002.
Be careful to set the Task to run under the same account that was used to install Azurite, otherwise the Task won't be able to see azurite-blob.cmd (it's in %AppData%\npm, which is added to PATH during installation).
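Here is a sketch of registering that hourly task with PowerShell (the task name is hypothetical, and you may need the -User/-Password parameters of Register-ScheduledTask to pin the account as noted above):
# Mirrors the command line above; repeats every hour starting now.
$action  = New-ScheduledTaskAction -Execute "powershell" -Argument "-command `"Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'`""
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName "Azurite" -Action $action -Trigger $trigger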
The command line needs the --recursive option. See https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs?toc=/azure/storage/blobs/toc.json
