Azure environment variable not readable in NLog config

I deployed a WebJob onto Azure (under the home\site\wwwroot\App_Data\jobs\triggered directory). The application uses NLog logging, configured in appsettings, with an environment variable in the logfile path:
"NLog": {
"throwConfigExceptions": true,
"targets": {
"logfile": {
"type": "File",
"fileName": "${environment:variable=DEPLOYMENT_SOURCE}\\LogFiles\\timer-${shortdate}.log",
"layout": "${message} "
},
The DEPLOYMENT_SOURCE environment variable contains a valid path when displayed in the Kudu console:
echo %DEPLOYMENT_SOURCE%
C:\home
But NLog does not seem to be able to resolve that environment variable. When I enable the internal trace log, I receive the following error message:
Debug Creating file appender: C:\LogFiles\timer-2020-11-13.log
Trace Opening C:\LogFiles\timer-2020-11-13.log with allowFileSharedWriting=False
Error FileTarget(Name=logfile): Failed write to file 'C:\LogFiles\timer-2020-11-13.log'. Exception: System.UnauthorizedAccessException: Access to the path 'C:\LogFiles\timer-2020-11-13.log' is denied.
So it seems like DEPLOYMENT_SOURCE is simply an empty string.
When testing locally with a standard Windows environment variable like %TEMP%, though, everything works fine.
What has to be done to access Azure environment variables in the NLog config of a .NET Core app?

I solved this issue.
The problem is that triggered WebJobs on Azure do NOT have the same environment variables available as the Kudu console.
So while Kudu displays variables like DEPLOYMENT_SOURCE, that variable is not available to WebJobs.
But there are other environment variables (in this case HOME, as Rolf already mentioned in the comments) that also point to C:\home on Azure (D:\home in the past).
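A minimal sketch of the adjusted target, assuming HOME is set for the WebJob; the LogFiles subfolder is illustrative and must be writable:
"logfile": {
  "type": "File",
  "fileName": "${environment:variable=HOME}\\LogFiles\\timer-${shortdate}.log",
  "layout": "${message} "
}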

Related

How to configure Serilog settings in Azure Function?

I’m in the process of creating my first production Azure Function and I’m trying to write log information to a local file and then eventually to Blob Storage. The local file is more for development troubleshooting and then ultimately I would like to have production information stored in Blob Storage. I’m not only new with Azure Functions but I’m also new with Serilog. I’ve used NLog in all my other applications but couldn’t get it to work with Azure Functions.
Currently I’m trying to get the local log working. I actually seem to have it working but I’m not understanding how I can tweak a couple things.
The first thing I’m trying to change is the amount of information that is getting logged. It seems to be logging a whole bunch of system-type information, like request info, to the blob storage. There is so much getting logged that the entries I add in code get lost. All the system entries are marked as Information, which is probably why they show up in my log. However, I would like it to only log data from when I specifically call logger.Information("some text") in my code. Is there a way to suppress all of the Microsoft system information?
The second thing is how I can make the Serilog configuration come from my local.settings.json file. Below is a sample of my file, and I’m not sure whether I should add the configuration information inside the Values property or put it outside in its own property. I’m assuming it would be its own property, but so far all my custom settings have been coming from the Values property.
Do I need to add the Serilog.Settings.Configuration NuGet package? If so, then I’m not understanding how I configure my Startup.cs file to get the information from the local settings file instead of configuring the settings directly in code. Eventually, I would like to add it to Dependency Injection so I can use the logger in other classes as well.
Startup.cs
public override void Configure(IFunctionsHostBuilder builder)
{
    builder.Services.AddTransient<IDataManager, DataManager>();
    ConfigureServices(builder.Services).BuildServiceProvider(true);
}
private IServiceCollection ConfigureServices(IServiceCollection services)
{
    services
        .AddLogging(loggingBuilder =>
            loggingBuilder.AddSerilog(
                new LoggerConfiguration()
                    .MinimumLevel.Information()
                    .WriteTo.Console()
                    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
                    .CreateLogger())
        );
    return services;
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "ProcessLookBackDays": "90",
    "SqlConnection": "connection info",
    "StorageConnection": "connection info",
    "AzureWebJobsStorage": "connection info",
    "InputContainer": "test-files",
    "InputFolder": "input",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  },
  "Serilog": {
    "Using": [ "Serilog.Sinks.File" ],
    "MinimumLevel": {
      "Default": "Information"
    },
    "WriteTo": [
      {
        "Name": "File",
        "Args": {
          "path": "D:\\logs\\AzureFunction\\log_.log",
          "rollingInterval": "Day",
          "outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {CorrelationId} {Level:u3}] {Username} {Message:lj}{NewLine}{Exception}"
        }
      }
    ]
  }
}
Configuration placed in local.settings.json will not be reflected in the deployed Azure Function app. As the name suggests, local.settings.json is for local use only; on Azure you need to use the app settings and read them from there.
Just add a new setting in the app settings; you can then use the code below to read it.
var appsettings = Environment.GetEnvironmentVariable("Name of the setting");
When you want to drive Serilog from external configuration, you can use the Serilog.Settings.Configuration package.
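A minimal sketch of wiring that up, assuming the "Serilog" section lives in a JSON file shipped with the app (the file name here is hypothetical; local.settings.json itself does not flow to the deployed app):
using System;
using Microsoft.Extensions.Configuration;
using Serilog;

// Build an IConfiguration and let Serilog read its "Serilog" section.
// Requires the Serilog.Settings.Configuration NuGet package.
// "serilog.json" is an illustrative file name, not a Functions convention.
var configuration = new ConfigurationBuilder()
    .SetBasePath(Environment.CurrentDirectory)
    .AddJsonFile("serilog.json", optional: true)
    .AddEnvironmentVariables()
    .Build();

Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration)
    .CreateLogger();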
You can also configure the minimum level of a log event, so that all log events less important than the specified minimum level are not logged:
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.Console(restrictedToMinimumLevel: LogEventLevel.Information)
    .CreateLogger();
Here we specify two things: .MinimumLevel.Debug() sets the global minimum, and restrictedToMinimumLevel dictates the minimum level for that particular sink. A sink is just a place you can log to; sinks are configured using WriteTo. For example, in the code above the sink is the console. There are other sinks too.
The levels, from least to most important, are Verbose, Debug, Information, Warning, Error and Fatal.
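That also points at a way to suppress the Microsoft system noise asked about above: Serilog supports per-namespace level overrides. A minimal sketch (note this only filters events flowing through Serilog itself):
using Serilog;
using Serilog.Events;

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    // Drop Microsoft/System events below Warning so your own
    // logger.Information(...) calls are not drowned out.
    .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
    .MinimumLevel.Override("System", LogEventLevel.Warning)
    .WriteTo.Console()
    .CreateLogger();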
References:
Read configuration from app setting by Ashish Patel
Use serilog to filter the logs
Serilog setting configuration

ARM Deployment Error - The request content was invalid and could not be deserialized: 'Cannot deserialize the current JSON array'

I have gone through similar previous posts and was not able to find a solution for my situation, so I am asking again. Please consider the following.
I am trying to deploy Azure Policy using ARM templates. So, I have created
1- Policy Definition File
2- Policy Parameter File
3- PowerShell script, run with both the policy and parameter files as input.
But when I try to deploy, I get the error attached. The "policyParameters" are being passed as an object type; the problem seems to reside there. It would be great if you could look at the attached screenshot and advise.
The PowerShell output also shows the expected values, I think, but reports "ProvisioningState : Failed".
Thanks,
You have to create a variable for policyParameters:
"variables": {
"policyParameters": {
"policyDefinitionId": {
"defaultValue": "[parameters('policyDefinitionId')]",
"type": "String"
},
...
This variable has to be passed to your parameters:
"parameters": "[variables('policyParameters')]",
You can find a sample here:
Configure Azure Diagnostic Settings with Azure Policies

Where is the log file created when debugging Azure Function in Visual Studio

I have a timer-triggered Azure Function which I run in Visual Studio: right-click the Azure Function project and choose Debug. The function has an ILogger log parameter.
Inspecting the log object I can see that it has two loggers:
Azure.Functions.Cli.Diagnostics.ColoredConsoleLogger
Microsoft.Azure.WebJobs.Script.Diagnostics.FileLogger
I also can see that the RootLogPath is %temp%\LogFiles\Application\Functions.
However at that location there is only a "Host" folder. I expected to find a "Function" folder as well with the log file.
Do I need to enable the file logger somehow? Am I missing anything?
To get file logs in local development, you have to set fileLoggingMode to always in host.json. The default debugOnly setting doesn't make the Functions host write file logs locally.
For v2 Functions
{
  "version": "2.0",
  "logging": {
    "fileLoggingMode": "always"
  }
}
For v1 Functions
{
  "tracing": {
    "fileLoggingMode": "always"
  }
}

Issues deploying dscExtension to Azure VMSS

I've been having some issues deploying a dscExtension to an Azure virtual machine scale set (VMSS) using a deployment template.
Here's how I've added it to my template:
{
  "name": "dscExtension",
  "properties": {
    "publisher": "Microsoft.Powershell",
    "type": "DSC",
    "typeHandlerVersion": "2.9",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "ModulesUrl": "[concat(parameters('_artifactsLocation'), '/', 'MyDscPackage.zip', parameters('_artifactsLocationSasToken'))]",
      "ConfigurationFunction": "CmvmProcessor.ps1\\CmvmProcessor",
      "Properties": [
        {
          "Name": "ServiceCredentials",
          "Value": {
            "UserName": "parameters('administratorLogin')",
            "Password": "parameters('administratorLoginPassword')"
          },
          "TypeName": "System.Management.Automation.PSCredential"
        }
      ]
    }
  }
}
The VMSS itself is successfully deploying, but when I browse the InstanceView of the individual VMs, the dscExtension shows the failed status with an error message.
The problems I'm having are as follows:
The ARM deployment does not try to update the dscExtension upon redeploy. I am used to MSDeploy web app extensions where the artifacts are updated and the code is redeployed on each new deployment. I do not know how to force it to update the dscExtension with new binaries. In fact it only seems to give an error on the first deploy of the VMSS, then it won't even try again.
The error I'm getting is for old code that doesn't exist anymore.
I had a bug previously in a custom DSC Powershell script where I tried to use the -replace operator which is supposed to create a $Matches variable but it was saying $Matches didn't exist.
In any case, I've since refactored the code and deleted the entire resource group then redeployed. The dscExtension is still giving the same error. I've verified the blob storage account where my DSC .zip is located no longer has the code which is capable of producing this error message. Azure must be caching the dscExtension somewhere. I can't get it to use my new blob .zip that I upload before each deployment.
Any insight into the DSC Extension and how to force it to update on deploy?
It sounds like you may be running into multiple things here, so let's try the simple one first. In order to get a VM extension to run on a subsequent deployment you have to "seed" it (and you're right, this is different from the rest of AzureRM). Take a look at this template:
https://github.com/bmoore-msft/AzureRM-Samples/blob/master/VMDSCInstallFile/azuredeploy.json
There is a property on the DSC extension called:
"forceUpdateTag" : "changeThisToEnsureScriptRuns-maxlength=50",
The property value must be different if you ever want the extension to run again. So for example, if you wanted it to run every time you'd seed it with a random number or a guid. You could also use version numbers if you wanted to version it somehow. The point is, if the value in the template is the same as the one you're passing in, the extension won't run again.
That sample uses a VM, but the VMSS syntax should be the same. That property also applies to other extensions (e.g. custom script).
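A hedged sketch of wiring the property so each deployment produces a fresh value; the parameter name is illustrative, and newGuid() requires a template runtime recent enough to support it (it is only valid as a parameter default, so a value generated by your pipeline works the same way):
"parameters": {
  "forceUpdateTag": {
    "type": "string",
    "defaultValue": "[newGuid()]"
  }
},
...
"properties": {
  "forceUpdateTag": "[parameters('forceUpdateTag')]",
  "publisher": "Microsoft.Powershell",
  "type": "DSC",
  ...
}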
The part that seems odd is that you said you deleted the entire RG and couldn't get it to accept the new package... That sounds bad (i.e. like a bug). If the above doesn't fix it, we may need to dig deeper into the template and script. LMK...

Error w/ Azure CLI on importing publish settings file

This is similar to the issue this SO user was having except I'm getting a different error for the same behavior.
I downloaded the publishsettings file from Azure and issued this command in the Azure CLI:
azure account import <MySite>.azurewebsites.net.PublishSettings
I got the following error:
{ name: 'AssertionError',
message: undefined,
actual: 'UNIVERSAL-primative-0',
expected: 'UNIVERSAL-primative-6',
operator: '==' }
AssertionError: "UNIVERSAL-primative-0" == "UNIVERSAL-primative-6"
...Shortened for brevity. Let me know if you'd like the full stack trace...
I wasn't particularly eager to wrap this Node project in a Visual Studio project, but in a pinch I could, and just generate the publish settings from within VS. If there is a way to do this correctly, though, I'd prefer that.
Where did you get the file? Were you using the following command to get it?
azure site download
It seems like you are using the publishsettings file of an Azure Web Site while xplat-cli expects the publishsettings file of the subscription.
There are 2 kinds of publishsettings files. And yeah, it's confusing.
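If that's the case, a sketch of the subscription-level flow in the old xplat-cli (command names as I recall them; verify with azure account --help on your version):
azure account download    # opens a browser to fetch <subscription>.publishsettings
azure account import <subscription>.publishsettings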
