I have a simple console app that runs both locally and as a WebJob in App Service.
var configBuilder = new ConfigurationBuilder();
configBuilder.AddEnvironmentVariables();
configBuilder.AddUserSecrets<Program>();
var config = configBuilder.Build();
// Configure the LoggerFactory
var loggerFactory = LoggerFactory.Create(builder =>
{
    builder
        .AddConsole()
        .SetMinimumLevel(LogLevel.Information)
        .AddAzureWebAppDiagnostics();

    // If the key exists in settings, use it to enable Application Insights.
    string? connStr = config["APPLICATIONINSIGHTS_CONNECTION_STRING"];
    Console.WriteLine(connStr);
    if (!string.IsNullOrEmpty(connStr))
    {
        builder.AddApplicationInsightsWebJobs(o => o.ConnectionString = connStr);
        Console.WriteLine("Application Insights enabled");
    }
});
// Create a logger
var logger = loggerFactory.CreateLogger<Program>();
logger.LogInformation("Test sending to App insights");
I want the message in the last statement to be logged to Application Insights. What's missing in my code?
I've run the code both locally and in Azure; neither succeeds in sending the log to Application Insights. I've also verified that the connection string is correct.
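For comparison, here is a minimal sketch of one commonly suggested setup for a plain console app (an assumption, not a confirmed fix): it uses the generic AddApplicationInsights provider from the Microsoft.Extensions.Logging.ApplicationInsights package instead of the WebJobs-specific one, and gives the telemetry channel time to flush before the process exits.

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;

var config = new ConfigurationBuilder()
    .AddEnvironmentVariables()
    .AddUserSecrets<Program>()
    .Build();

string? connStr = config["APPLICATIONINSIGHTS_CONNECTION_STRING"];

using (var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.AddConsole().SetMinimumLevel(LogLevel.Information);

    if (!string.IsNullOrEmpty(connStr))
    {
        // Generic Application Insights logging provider (the overload taking a
        // TelemetryConfiguration callback, available in recent package versions).
        builder.AddApplicationInsights(
            telemetryConfig => telemetryConfig.ConnectionString = connStr,
            loggerOptions => { });
    }
}))
{
    var logger = loggerFactory.CreateLogger<Program>();
    logger.LogInformation("Test sending to App Insights");
}

// Telemetry is batched and sent in the background; a short delay before exiting
// gives the channel time to flush.
await Task.Delay(TimeSpan.FromSeconds(5));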
Related
Azure Functions 3.0
I am attempting to log activities and errors from an internal class used by my functions, but the logs are not written correctly, since I cannot instantiate/find the service bound to the correct ILogger (the one naturally injected into constructors and functions).
I do not want to pass the logger instance down through classes from the entry point; I want it injected correctly.
I tried new LoggerFactory() with no success.
var loggerFactory = new LoggerFactory();
var logger = loggerFactory.CreateLogger("InfoLogger");
logger.LogInformation("Please, log this information!");
I also tried ActivatorUtilities.CreateInstance(), with Ninject registering the service provider associated with the logger at startup, with no success.
// Startup.cs
NinjectKernel._kernel
    .Bind<IServiceProvider>()
    .ToMethod(context => builder.Services.BuildServiceProvider())
    .InSingletonScope();

// MyInternalClass.cs
var serviceProvider = NinjectKernel.Get<IServiceProvider>();
var logger = ActivatorUtilities.CreateInstance<ILogger>(serviceProvider);
logger.LogInformation($"Information Ticket: {ticket} | Data: ...");
Any help is welcome!
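For reference, a common pattern in this situation (not from the original post) is to register the internal class with the Functions host's DI container in a FunctionsStartup, so that ILogger<T> is injected through its constructor. A minimal sketch, where MyApp and MyInternalClass are hypothetical names:

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Registering the class lets the host supply ILogger<MyInternalClass>.
            builder.Services.AddSingleton<MyInternalClass>();
        }
    }

    public class MyInternalClass
    {
        private readonly ILogger<MyInternalClass> _logger;

        public MyInternalClass(ILogger<MyInternalClass> logger) => _logger = logger;

        public void Process(string ticket) =>
            _logger.LogInformation("Information Ticket: {Ticket}", ticket);
    }
}

The function (or any other registered service) then takes MyInternalClass as a constructor parameter instead of resolving a logger through Ninject.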
When I inject IConfiguration into a function, it does not find any keys that only live in my Azure App Configuration.
I have a function app (v3) that accesses App Configuration using the DefaultAzureCredential. I am running this locally in debug, hence the need for a default credential. I also have multiple tenants, so I had to set the VisualStudioTenantId and SharedTokenCacheTenantId on DefaultAzureCredentialOptions. My Visual Studio user was also given the "App Configuration Data Reader" role to be able to debug.
When connecting to App Configuration I get no errors.
Edited to add: I have set up App Configuration to authenticate with Azure AD.
See code below:
public override async void ConfigureAppConfiguration(IFunctionsConfigurationBuilder builder)
{
    var credOptions = new DefaultAzureCredentialOptions();
    var tenantId = Environment.GetEnvironmentVariable("Tenant_Id");
    credOptions.VisualStudioTenantId = tenantId;
    credOptions.SharedTokenCacheTenantId = tenantId;
    var cred = new DefaultAzureCredential(credOptions);

    /* Works but requires SharedTokenCacheTenantId */
    var secretClient = new SecretClient(new Uri(vaultURI), cred);
    var secret = await secretClient.GetSecretAsync("<secret name>");

    /* Works, but where are my keys when I try to access them? */
    builder.ConfigurationBuilder.AddAzureAppConfiguration(options =>
    {
        options.Connect(new Uri(appConfigURI), cred);
    }).Build(); // Should I be building this??
}
In my function
public FunctName(IConfiguration configuration)
{
    _configuration = configuration;
}
And when I access the property
var prop = _configuration["PropertyName"];
There is an example function app that uses IFunctionsConfigurationBuilder here: https://github.com/Azure/AppConfiguration/blob/main/examples/DotNetCore/AzureFunction/FunctionApp/Startup.cs. I would recommend taking a look and seeing if there are any missing pieces.
The title mentions "using DefaultAzureCredential on local". Does that mean that this works as expected if you use a connection string?
Notice the async void ConfigureAppConfiguration. This caused ConfigureAppConfiguration to return before it had finished running, so the host carried on before my App Configuration source was actually added and populated.
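For illustration, a synchronous version of that method might look like the sketch below (vaultURI, appConfigURI and the secret name are the same placeholders as in the question; blocking on the secret lookup with GetAwaiter().GetResult() is one pragmatic workaround, not necessarily the poster's final code):

public override void ConfigureAppConfiguration(IFunctionsConfigurationBuilder builder)
{
    var credOptions = new DefaultAzureCredentialOptions();
    var tenantId = Environment.GetEnvironmentVariable("Tenant_Id");
    credOptions.VisualStudioTenantId = tenantId;
    credOptions.SharedTokenCacheTenantId = tenantId;
    var cred = new DefaultAzureCredential(credOptions);

    // If the Key Vault secret is still needed, block instead of awaiting so the
    // method completes before the host continues.
    var secretClient = new SecretClient(new Uri(vaultURI), cred);
    var secret = secretClient.GetSecretAsync("<secret name>").GetAwaiter().GetResult();

    // No .Build() here; the Functions host builds the configuration itself,
    // as in the linked example Startup.cs.
    builder.ConfigurationBuilder.AddAzureAppConfiguration(options =>
    {
        options.Connect(new Uri(appConfigURI), cred);
    });
}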
From where is the InstrumentationKey read?
context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"];
I have put that key in the ApplicationInsights.config file, in the section
<InstrumentationKey>94efb022-e651-46a0-b103-5735daa213f1</InstrumentationKey>
but it's not read from there...
var builder = new HostBuilder()
    .UseEnvironment("Development")
    .ConfigureWebJobs(b =>
    {
        // Add extensions and other WebJobs services
    })
    .ConfigureAppConfiguration(b =>
    {
        // Add configuration sources
    })
    .ConfigureLogging((context, b) =>
    {
        // Add Logging Providers
        b.AddConsole();

        // If this key exists in any config, use it to enable App Insights
        string appInsightsKey = context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"];
        if (!string.IsNullOrEmpty(appInsightsKey))
        {
            // This uses the options callback to explicitly set the instrumentation key.
            b.AddApplicationInsights(o => o.InstrumentationKey = appInsightsKey);
        }
    })
    .UseConsoleLifetime();
If you want to read it on Azure, just set it in the Application Settings in the portal.
If you are running it locally, add an APPINSIGHTS_INSTRUMENTATIONKEY field to the appsettings.json file:
{
    "AzureWebJobsStorage": "{storage connection string}",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "{instrumentation key}"
}
For more information, refer to this doc: Add Application Insights logging. Hope this helps.
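For the local case, the "// Add configuration sources" placeholder in the HostBuilder above still has to load that file. A minimal sketch of what that might look like, assuming appsettings.json is copied to the output directory (the AddApplicationInsights call is the same one from the Microsoft.Azure.WebJobs.Logging.ApplicationInsights package used above):

using System.IO;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

var builder = new HostBuilder()
    .ConfigureAppConfiguration(b =>
    {
        // Load appsettings.json (plus environment variables) so that
        // context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"] resolves locally;
        // on Azure the same key arrives as an App Setting / environment variable.
        b.SetBasePath(Directory.GetCurrentDirectory())
         .AddJsonFile("appsettings.json", optional: true, reloadOnChange: false)
         .AddEnvironmentVariables();
    })
    .ConfigureLogging((context, b) =>
    {
        b.AddConsole();
        string appInsightsKey = context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"];
        if (!string.IsNullOrEmpty(appInsightsKey))
        {
            b.AddApplicationInsights(o => o.InstrumentationKey = appInsightsKey);
        }
    })
    .UseConsoleLifetime();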
You need to install the packages below:
Microsoft.Azure.WebJobs.Logging.ApplicationInsights (currently in beta)
Microsoft.Extensions.Logging
Microsoft.Extensions.Logging.Console
and configure the JobHostConfiguration as below:
string instrumentationKey = Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");
if (!string.IsNullOrEmpty(instrumentationKey))
{
    // Build up a LoggerFactory with Application Insights and a console logger.
    config.LoggerFactory = new LoggerFactory().AddApplicationInsights(instrumentationKey, null).AddConsole();
    config.Tracing.ConsoleLevel = TraceLevel.Off;
}
You can read about configuring Application Insights with Azure WebJobs here. Hope it helps.
I created a simple ASP.NET Core web application using OAuth authentication from Google. It runs fine on my local machine.
Yet after deploying it to Azure as an App Service, the OAuth redirects seem to get messed up.
The app itself can be found here:
https://gcalworkshiftui20180322114905.azurewebsites.net/
Here's an url that actually returns a result and shows that the app is running:
https://gcalworkshiftui20180322114905.azurewebsites.net/Account/Login?ReturnUrl=%2F
Sometimes the app responds fine, but once I try to log in using Google it keeps loading forever and eventually comes back with the following message:
The specified CGI application encountered an error and the server terminated the process.
Behind the scenes, the authentication callback seems to be failing with a 502.3 error:
502.3 Bad Gateway “The operation timed out”
The error trace can be found here:
https://gcalworkshiftui20180322114905.azurewebsites.net/errorlog.xml
The documentation from Microsoft hasn't really helped yet.
https://learn.microsoft.com/en-us/azure/app-service/app-service-authentication-overview
Further investigation leads me to believe that this has to do with the following code:
public GCalService(string clientId, string secret)
{
    string credPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    credPath = Path.Combine(credPath, ".credentials/calendar-dotnet-quickstart.json");

    var credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
        new ClientSecrets
        {
            ClientId = clientId,
            ClientSecret = secret
        },
        new[] { CalendarService.Scope.Calendar },
        "user",
        CancellationToken.None,
        new FileDataStore(credPath, true)).Result;

    // Create Google Calendar API service.
    _service = new CalendarService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = "gcalworkshift"
    });
}
I imagine Azure might not support personal folders? Googling this doesn't tell me much.
I followed Facebook, Google, and external provider authentication in ASP.NET Core and Google external login setup in ASP.NET Core to create an ASP.NET Core web application with Google authentication to check this issue.
I also followed .NET console application to access the Google Calendar API and Calendar.ASP.NET.MVC5 to build my sample project. Here is the core code; you can refer to it:
Startup.cs
public class Startup
{
    public readonly IDataStore dataStore = new FileDataStore(GoogleWebAuthorizationBroker.Folder); // C:\Users\{username}\AppData\Roaming\Google.Apis.Auth

    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public IConfiguration Configuration { get; }

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDbContext<ApplicationDbContext>(options =>
            options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
        services.AddIdentity<ApplicationUser, IdentityRole>()
            .AddEntityFrameworkStores<ApplicationDbContext>()
            .AddDefaultTokenProviders();
        services.AddAuthentication().AddGoogle(googleOptions =>
        {
            googleOptions.ClientId = "{ClientId}";
            googleOptions.ClientSecret = "{ClientSecret}";
            googleOptions.Scope.Add(CalendarService.Scope.CalendarReadonly); // "https://www.googleapis.com/auth/calendar.readonly"
            googleOptions.AccessType = "offline"; // request a refresh_token
            googleOptions.Events = new OAuthEvents()
            {
                OnCreatingTicket = async (context) =>
                {
                    var userEmail = context.Identity.FindFirst(ClaimTypes.Email).Value;
                    var tokenResponse = new TokenResponse()
                    {
                        AccessToken = context.AccessToken,
                        RefreshToken = context.RefreshToken,
                        ExpiresInSeconds = (long)context.ExpiresIn.Value.TotalSeconds,
                        IssuedUtc = DateTime.UtcNow
                    };
                    await dataStore.StoreAsync(userEmail, tokenResponse);
                }
            };
        });
        services.AddMvc();
    }
}
CalendarController.cs
[Authorize]
public class CalendarController : Controller
{
    private readonly IDataStore dataStore = new FileDataStore(GoogleWebAuthorizationBroker.Folder);

    private async Task<UserCredential> GetCredentialForApiAsync()
    {
        var initializer = new GoogleAuthorizationCodeFlow.Initializer
        {
            ClientSecrets = new ClientSecrets
            {
                ClientId = "{ClientId}",
                ClientSecret = "{ClientSecret}",
            },
            Scopes = new[] {
                "openid",
                "email",
                CalendarService.Scope.CalendarReadonly
            }
        };
        var flow = new GoogleAuthorizationCodeFlow(initializer);

        string userEmail = ((ClaimsIdentity)HttpContext.User.Identity).FindFirst(ClaimTypes.Name).Value;
        var token = await dataStore.GetAsync<TokenResponse>(userEmail);
        return new UserCredential(flow, userEmail, token);
    }

    // GET: /Calendar/ListCalendars
    public async Task<ActionResult> ListCalendars()
    {
        const int MaxEventsPerCalendar = 20;
        const int MaxEventsOverall = 50;

        var credential = await GetCredentialForApiAsync();
        var initializer = new BaseClientService.Initializer()
        {
            HttpClientInitializer = credential,
            ApplicationName = "ASP.NET Core Google Calendar Sample",
        };
        var service = new CalendarService(initializer);

        // Fetch the list of calendars.
        var calendars = await service.CalendarList.List().ExecuteAsync();
        return Json(calendars.Items);
    }
}
Before deploying to the Azure web app, I changed the folder parameter for constructing the FileDataStore to D:\home, but got the following error:
UnauthorizedAccessException: Access to the path 'D:\home\Google.Apis.Auth.OAuth2.Responses.TokenResponse-{user-identifier}' is denied.
Then I set the folder parameter to D:\home\site, redeployed my web application, and found it worked as expected; the logged-in user's credentials are saved under D:\home\site on your Azure web app server.
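For illustration, constructing the store against that writable location might look like the sketch below (the subfolder name is just a placeholder; on a Windows App Service, D:\home corresponds to the HOME environment variable):

using System;
using System.IO;
using Google.Apis.Util.Store;

// Point FileDataStore at a writable location inside the App Service sandbox
// instead of the user-profile folder used by default.
var home = Environment.GetEnvironmentVariable("HOME") ?? @"D:\home";
var dataStore = new FileDataStore(Path.Combine(home, "site", "Google.Apis.Auth"), fullPath: true);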
Azure Web Apps run in a secure environment called the sandbox, which has some limitations; for details, see Azure Web App sandbox.
Additionally, you mentioned App Service Authentication, which provides built-in authentication without adding any code. Since you have written the authentication code in your web application yourself, you do not need to set up App Service Authentication.
If you do use App Service Authentication, you can follow here for configuration; your .NET Core backend can then obtain additional user details (access_token, refresh_token, etc.) through an HTTP GET on the /.auth/me endpoint, as described in this similar issue. After retrieving the token response for the logged-in user, you can manually construct the UserCredential and then build the CalendarService.
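As a rough sketch of that last step (the /.auth/me response fields follow the App Service token store documentation; the helper method, the pre-authenticated HttpClient, and the JSON handling below are assumptions, not code from the question):

using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Auth.OAuth2.Flows;
using Google.Apis.Auth.OAuth2.Responses;
using Google.Apis.Calendar.v3;
using Google.Apis.Services;

// Read the provider tokens from App Service Authentication's /.auth/me endpoint
// and build a UserCredential from them.
async Task<CalendarService> BuildCalendarServiceAsync(HttpClient authenticatedClient, string userEmail)
{
    // authenticatedClient must target the app's own host and carry the caller's
    // auth cookie or X-ZUMO-AUTH token.
    var json = await authenticatedClient.GetStringAsync("/.auth/me");
    using var doc = JsonDocument.Parse(json);
    var entry = doc.RootElement[0];

    var token = new TokenResponse
    {
        AccessToken = entry.GetProperty("access_token").GetString(),
        RefreshToken = entry.TryGetProperty("refresh_token", out var rt) ? rt.GetString() : null
    };

    var flow = new GoogleAuthorizationCodeFlow(new GoogleAuthorizationCodeFlow.Initializer
    {
        ClientSecrets = new ClientSecrets { ClientId = "{ClientId}", ClientSecret = "{ClientSecret}" },
        Scopes = new[] { CalendarService.Scope.CalendarReadonly }
    });

    var credential = new UserCredential(flow, userEmail, token);

    return new CalendarService(new BaseClientService.Initializer
    {
        HttpClientInitializer = credential,
        ApplicationName = "gcalworkshift"
    });
}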
I am using socket.io in Node.js to implement chat functionality in my Azure cloud project. In it I add the user chat history to tables using Node.js. It works fine when I run it on my local emulator, but strangely, when I deploy to my Azure cloud it doesn't work, and it doesn't throw any error either, so it's really mind-boggling. Below is my code.
var app = require('express')()
  , server = require('http').createServer(app)
  , sio = require('socket.io')
  , redis = require('redis');

var client = redis.createClient();
var io = sio.listen(server, { origins: '*:*' });
io.set("store", new sio.RedisStore);

process.env.AZURE_STORAGE_ACCOUNT = "account";
process.env.AZURE_STORAGE_ACCESS_KEY = "key";

var azure = require('azure');
var chatTableService = azure.createTableService();
createTable("ChatUser");

server.listen(4002);

// (Registered inside the socket.io connection handler, where `socket` is available.)
socket.on('privateChat', function (data) {
    var receiver = data.Receiver;
    console.log(data.Username);

    var chatGUID1 = 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
        var r = Math.random() * 16 | 0, v = c == 'x' ? r : (r & 0x3 | 0x8);
        return v.toString(16);
    });

    var chatRecord1 = {
        PartitionKey: data.Receiver,
        RowKey: data.Username,
        ChatID: chatGUID1,
        Username: data.Receiver,
        ChattedWithUsername: data.Username,
        Timestamp: new Date(new Date().getTime())
    };
    console.log(chatRecord1.Timestamp);
    queryEntity(chatRecord1);
});

function queryEntity(record1) {
    chatTableService.queryEntity('ChatUser'
        , record1.PartitionKey
        , record1.RowKey
        , function (error, entity) {
            if (!error) {
                console.log("Entity already exists")
            }
            else {
                insertEntity(record1);
            }
        })
}

function insertEntity(record) {
    chatTableService.insertEntity('ChatUser', record, function (error) {
        if (!error) {
            console.log("Entity inserted");
        }
    });
}
It's working on my local emulator but not in the cloud. I came across something saying that the DateTime field of an entity should not be null when creating a record in a cloud table, but I'm pretty sure the way I'm passing the timestamp is fine, right? Any other ideas why it might work locally but not in the cloud?
EDIT:
I have also been getting this error when running the socket.io server, but in spite of it the socket.io functionality works fine, so I didn't worry about it. I have no idea what the error means in the first place.
{ [Error: connect ECONNREFUSED]
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect' }
A couple of things:
You shouldn't need to set Timestamp, the service should be populating that automatically when you insert a record.
When running it locally you can set the environment variables to the Windows Azure storage account settings and see if it will successfully write to the table when running on your developer box. Instead of running in the emulator, just set the environment variables and run the app directly with node.exe.
Are you running in a web role or worker role? I'm assuming it's a cloud service since you mentioned the emulator. If it's a worker role, maybe add some instrumentation to log to file to assist in debugging. If it's a web role you can add an iisnode.yml file in the root of the application, with the following line in the file to enable logging of stdout/stderr:
loggingEnabled: true
This will capture stdout/stderr to an iislog folder under the approot folder on e: or f: of the web role instance. You can remote desktop to the instance and look at the logs to see if the logs you have for successful insertion are occurring.
Otherwise, it's not obvious from the code above what's going on. Similar code worked fine for me. Relevant bits for my test code can be found at https://gist.github.com/Blackmist/5326756.
Hope this helps.