Does log4net have a standard feature to read its settings from a database? I mean, my application stores only a connection string, and log4net gets all other settings from the database.
Initialize log4net settings from database shows how to load the settings from a DB, but the loading has to be done manually by external code.
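For reference, the manual loading that answer describes might look roughly like this: a minimal sketch, assuming the full log4net XML configuration is stored in a hypothetical Log4NetConfig table with a ConfigXml column.

// Minimal sketch: load the log4net XML configuration from a database at startup.
// The Log4NetConfig table and ConfigXml column are hypothetical placeholders.
using System.Data.SqlClient;
using System.IO;
using System.Text;

public static class DbLog4NetLoader
{
    public static void Configure(string connectionString)
    {
        string xml;
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT TOP 1 ConfigXml FROM Log4NetConfig", connection))
        {
            connection.Open();
            xml = (string)command.ExecuteScalar();
        }

        // Hand the XML to log4net; this is the "manual by external code" step.
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(xml)))
        {
            log4net.Config.XmlConfigurator.Configure(stream);
        }
    }
}

You would call DbLog4NetLoader.Configure(...) once at application startup; log4net does not ship a configurator that reads from a database on its own.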
I am trying to get user login information for an Azure SQL Database using Log Analytics. Is this possible, and if so, could you please help me with it?
Below are the options available in Diagnostic settings for Azure SQL Database.
Click 'Add Diagnostic setting' above to configure the collection of the following data:
DmsWorkers
ExecRequests
RequestSteps
SqlRequests
Waits
Basic
InstanceAndAppAdvanced
WorkloadManagement
I want to achieve this without using the sys schema objects of Azure SQL Database.
You need to enable auditing on the Azure SQL server, and then you can check the logs in Azure Log Analytics.
The easiest way to enable auditing is through the Azure portal. However, it can also be set up through ARM templates, Azure PowerShell, or the Azure CLI.
Auditing can be enabled either at the individual database level or at the logical server level. If enabled at the server level, it automatically applies to existing databases and to any new databases that are created.
However, enabling it at both the server and the database level leads to duplicate logs.
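If you want to script this rather than click through the portal, a rough sketch with the Microsoft.Azure.Management.Sql .NET SDK could look like the following; the resource group and server names are placeholders, and the exact model properties should be verified against the SDK version you use.

// Rough sketch: enable server-level auditing with the Azure SQL management SDK.
// Resource names are placeholders; requires Microsoft.Azure.Management.Sql.
using Microsoft.Azure.Management.Sql;
using Microsoft.Azure.Management.Sql.Models;
using Microsoft.Rest;

public static class EnableAuditing
{
    public static void Run(ServiceClientCredentials credentials, string subscriptionId)
    {
        var client = new SqlManagementClient(credentials) { SubscriptionId = subscriptionId };

        var policy = new ServerBlobAuditingPolicy
        {
            State = BlobAuditingPolicyState.Enabled,
            // Send audit events to Azure Monitor instead of a storage account.
            IsAzureMonitorTargetEnabled = true
        };

        client.ServerBlobAuditingPolicies.CreateOrUpdate("my-resource-group", "my-sql-server", policy);
    }
}

Note that sending audit events to Log Analytics also requires a diagnostic setting that routes the SQLSecurityAuditEvents category to the workspace, which the portal flow described below takes care of when you pick a workspace.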
In the portal: on the overview page of the desired Azure SQL server, there is an "Auditing" option in the left pane.
By default, auditing is off. Enable it, choose the Log Analytics workspace where you want to store the logs, and click Save.
Then click on Add diagnostic setting. Enable diagnostics for Errors and InstanceAndAppAdvanced, send the data to the Log Analytics workspace in your subscription, and click Save to apply the configuration.
To view the logs, open the Log Analytics workspace that was configured as the sink, choose Logs, and select the scope.
Summarizing the connection attempts by caller IP address:
AzureDiagnostics
| summarize count() by client_ip_s
Source: https://www.mssqltips.com/sqlservertip/6782/kusto-query-language-query-audit-data-azure-sql-database/
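If you prefer running the query from code instead of the portal, here is a minimal sketch using the Azure.Monitor.Query and Azure.Identity packages; the workspace ID is a placeholder.

// Minimal sketch: run the same Kusto query against the Log Analytics workspace.
// The workspace ID is a placeholder.
using System;
using Azure.Identity;
using Azure.Monitor.Query;

public static class QueryAuditLogs
{
    public static void Run()
    {
        var client = new LogsQueryClient(new DefaultAzureCredential());

        var response = client.QueryWorkspace(
            "<workspace-id>",
            "AzureDiagnostics | summarize count() by client_ip_s",
            new QueryTimeRange(TimeSpan.FromDays(1)));

        // count() produces a column named count_ in the result table.
        foreach (var row in response.Value.Table.Rows)
        {
            Console.WriteLine($"{row["client_ip_s"]}: {row["count_"]}");
        }
    }
}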
How do I get CRUD-level access logs for Azure Files? Has anyone used them to check who accessed a file on Azure Files, and what updates or deletions they made? Please let me know if you can help.
As far as I know, Azure Storage has diagnostic settings that can write logs for Blob, Queue, and Table storage, but not for File storage. You can find the diagnostic settings in the storage account like this:
When you enable logging, the logs are written and stored in the storage account itself. For example, once logging is enabled, you can see logs like this:
That is the way I know of to write CRUD-level logs, but File storage alone does not have the feature. I am not entirely sure whether there is another way to achieve what you want, but most likely there is not.
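For the services that do support it, the classic storage analytics logging can also be turned on from code. A small sketch with the legacy Microsoft.Azure.Storage.Blob SDK, shown here for the Blob service (the connection string is a placeholder):

// Small sketch: enable classic storage analytics logging for the Blob service.
// There is no equivalent call for File storage, which is the gap described above.
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.Shared.Protocol;

public static class EnableStorageLogging
{
    public static void Run(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var blobClient = account.CreateCloudBlobClient();

        ServiceProperties properties = blobClient.GetServiceProperties();
        properties.Logging.LoggingOperations = LoggingOperations.All; // read, write and delete
        properties.Logging.RetentionDays = 14;
        properties.Logging.Version = "1.0";

        // The logs land in the hidden $logs container of the same account.
        blobClient.SetServiceProperties(properties);
    }
}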
I am associating an Azure SQL DB table with my Azure Search index using an indexer. I am setting this all up through Azure's website: https://portal.azure.com
When I try to create the indexer in Azure Search, I get the warning "Consider enabling integrated change tracking on your database." However, I have enabled integrated change tracking on my database and table.
I have successfully set up several tables this way, in the same database, and they're working just fine with Azure Search. However, this table has a schema other than [dbo], and the others with change tracking were in [dbo]. The same SQL user is used for all the tables, and it has been granted the change tracking permission on this table, too.
Is there a problem with the Azure website where I cannot do this via the UI? Can this be done otherwise? Is there a permission issue with my DB's schema? Something else?
Because of this warning, I have not actually created this Azure Search Index.
Any help is appreciated!
It's a limitation of the Azure Search portal: it doesn't support enabling integrated change tracking for non-default schemas. The workaround is to create the indexer programmatically, using the REST API or the .NET SDK. For a walkthrough, see https://learn.microsoft.com/azure/search/search-howto-connecting-azure-sql-database-to-azure-search-using-indexers.
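A rough sketch with the classic Microsoft.Azure.Search SDK, pointing the data source at the non-[dbo] table and attaching the change tracking policy by hand (service name, key, and table names are placeholders):

// Rough sketch: create the data source and indexer programmatically so that
// integrated change tracking works with a non-[dbo] schema. All names are placeholders.
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

public static class CreateIndexerWithChangeTracking
{
    public static void Run()
    {
        var client = new SearchServiceClient("my-search-service", new SearchCredentials("<admin-api-key>"));

        var dataSource = DataSource.AzureSql(
            name: "my-datasource",
            sqlConnectionString: "<azure-sql-connection-string>",
            tableOrViewName: "myschema.MyTable");

        // This is the part the portal UI does not let you set for non-default schemas.
        dataSource.DataChangeDetectionPolicy = new SqlIntegratedChangeTrackingPolicy();

        client.DataSources.CreateOrUpdate(dataSource);

        var indexer = new Indexer(
            name: "my-indexer",
            dataSourceName: "my-datasource",
            targetIndexName: "my-index");

        client.Indexers.CreateOrUpdate(indexer);
    }
}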
I'm using log4net in my Web API controllers to log method info, execution duration, and exceptions to log files (I could switch to ELMAH or NLog). I also have some custom log messages written by developers. All the log files reside on the same web server, and if I want to analyze them, I have to move them to another machine.
Is there any way to write them directly to Azure Storage, and which storage type (Blob/Table/File) is the best one?
You should probably look into the Azure appender for log4net: http://stemarie.github.io/log4net.Azure/. It works with both blob storage and table storage. I would choose table storage, since it gives you some basic search capabilities that are not available on blob storage.
Both ELMAH and NLog have similar features for logging to Azure Storage.
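To wire up the table appender without touching XML config, a minimal sketch is below; the AzureTableAppender type and its properties are assumptions based on that project's README, so verify them against the package version you install.

// Minimal sketch: configure the log4net.Azure table appender programmatically.
// AzureTableAppender and its properties are assumptions based on the project's README.
using log4net;
using log4net.Appender;
using log4net.Config;

public static class AzureLoggingSetup
{
    public static void Run(string storageConnectionString)
    {
        var appender = new AzureTableAppender
        {
            ConnectionString = storageConnectionString,
            TableName = "WebApiLogs" // hypothetical table name
        };
        appender.ActivateOptions();

        BasicConfigurator.Configure(appender);

        ILog log = LogManager.GetLogger(typeof(AzureLoggingSetup));
        log.Info("Logging straight to Azure Table storage");
    }
}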
For a completely automated solution, you could use Audit.NET with its Audit.WebApi and Audit.NET.log4net extensions.
Just add an attribute to your controllers/methods:
[AuditApi]
public class UsersController : ApiController
{ ... }
And some static initialization code:
Audit.Core.Configuration.Setup()
.UseLog4net();
And that's it: you have an auditing system working on your Web API calls, logging through your default log4net configuration.
I am reading this article:
https://technet.microsoft.com/en-us/library/jj683108.aspx
In step 6b, it says:
In the Database area, leave the prepopulated values for Database Server, Database Name, and Database authentication, which is Windows authentication (recommended) unless you have specific design needs to change them.
However, my question is this: the prepopulated value for Database Server is the SharePoint server. Why would I want that as the default if I am trying to set up an external database for external content types?
I want to make sure my understanding is correct before I make a change at this high a level in SharePoint.
The document you are referring to is about setting up the BDC service. The database mentioned in step 6 is not the database containing the external data, but the internal database of the BDC service, which holds the service's configuration data.
Setting up a connection to a particular data source is a separate step that follows the BDC service setup. You can definitely connect to remote servers.