Configure and watch log4net using blob store - Azure

We are using log4net with custom appenders to log from our Azure machines to table storage, and that works fine. What we need now is to use blob storage to configure logging for all our instances in one place and to be able to modify it at run time: just change the config file in blob storage, and a few moments later all of the machines should pick up the change.
Simply put, what I need is:
XmlConfigurator.Configure(Uri blobStoreUri, bool watch=true)
or even better:
XmlConfigurator.ConfigureAndWatch(string blobStoreUrl, TimeSpan refreshInterval)
I googled around but was not able to find anything like that. What would be the best way to implement this, or do you know of any similar implementations?
Thanks,
almir

I couldn't find a way to configure log4net to watch a file stored in blob storage, but Windows Azure Diagnostics does provide a way to specify its configuration in a file that's stored in a blob:
http://msdn.microsoft.com/en-us/library/windowsazure/hh411551.aspx

This is the solution we implemented in the end:
https://gist.github.com/kaza/9207832
Cheers.
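In case the gist link goes stale: one way to get ConfigureAndWatch-like behaviour is to poll the blob on a timer and re-run XmlConfigurator whenever its ETag changes. The sketch below is only an illustration (not the gist itself); it assumes the classic Microsoft.WindowsAzure.Storage client library, and the class, container and blob names are made up.

using System;
using System.IO;
using System.Threading;
using log4net.Config;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobConfigWatcher
{
    private static Timer _timer;
    private static string _lastETag;

    // Poll the blob on an interval and re-apply the log4net configuration whenever it changes.
    public static void ConfigureAndWatch(string connectionString, string containerName,
                                         string blobName, TimeSpan refreshInterval)
    {
        CloudBlockBlob blob = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference(containerName)
            .GetBlockBlobReference(blobName);

        _timer = new Timer(_ =>
        {
            blob.FetchAttributes();                        // refresh properties, including the ETag
            if (blob.Properties.ETag == _lastETag) return; // nothing changed since the last poll
            _lastETag = blob.Properties.ETag;

            using (var stream = new MemoryStream())
            {
                blob.DownloadToStream(stream);             // pull the XML config down
                stream.Position = 0;
                XmlConfigurator.Configure(stream);         // reconfigure log4net from it
            }
        }, null, TimeSpan.Zero, refreshInterval);
    }
}

Call it once at role start-up, for example: BlobConfigWatcher.ConfigureAndWatch(connectionString, "logging", "log4net.config", TimeSpan.FromMinutes(1));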

Related

How to uncompress .rar files using Azure Data Factory

We have a new client. While setting up the project we gave them a blob storage container to drop files into so we could later automate processing of the information.
The idea is to use Azure Data Factory, but we have found no way of dealing with .rar files, and even .zip files coming from Windows are giving us trouble. Since it is the client producing the .rar files, we wanted to make absolutely sure there is no way to process them before asking them to change the format, or before deploying Databricks or a similar service just for the purpose of transforming the files.
Is there any way to get a .rar file from blob storage, uncompress it, and then process it?
I have been looking at posts like this and the related official documentation, and the closest we have come is ZipDeflate, but it does not seem to meet our requirement.
Thanks in advance!
The only compression types Data Factory supports are GZip, Deflate, BZip2, and ZipDeflate.
For unsupported file types and compression formats, Data Factory provides some workarounds:
You can use the extensibility features of Azure Data Factory to transform files that aren't supported. Two options are Azure Functions and custom tasks using Azure Batch.
There is a sample that uses an Azure Function to extract the contents of a tar file. For more information, see the Azure Functions activity.
You can also build this functionality using a custom .NET activity. Further information is available here.
From there, you would need to work out how to use an Azure Function to extract the contents of a .rar file; a rough sketch of one possibility follows.
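This is only an illustration of the Azure Function idea, not something Data Factory does for you. It assumes the SharpCompress NuGet package (which can read .rar archives) and the Azure.Storage.Blobs client; the class, container and blob names are placeholders.

using System.IO;
using Azure.Storage.Blobs;
using SharpCompress.Archives.Rar;

public static class RarExtractor
{
    // Download a .rar blob, walk its entries, and write each file back out as its own blob.
    public static void ExtractToBlobs(string connectionString, string sourceContainer,
                                      string rarBlobName, string targetContainer)
    {
        var service = new BlobServiceClient(connectionString);
        var source = service.GetBlobContainerClient(sourceContainer);
        var target = service.GetBlobContainerClient(targetContainer);
        target.CreateIfNotExists();

        using (Stream rarStream = source.GetBlobClient(rarBlobName).OpenRead())
        using (var archive = RarArchive.Open(rarStream))
        {
            foreach (var entry in archive.Entries)
            {
                if (entry.IsDirectory) continue;
                using (Stream entryStream = entry.OpenEntryStream())
                {
                    // The blob name mirrors the path inside the archive.
                    target.GetBlobClient(entry.Key).Upload(entryStream, overwrite: true);
                }
            }
        }
    }
}

Wrap something like this in an Azure Function, call it from the pipeline via the Azure Function activity, and point the rest of the pipeline at the extracted blobs.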
You can also use Logic Apps, or a Webhook activity calling a runbook. Both are easier than using a custom activity.

Output file in Azure Automation script

I'm adapting a PowerShell script I have at work for use in Azure Automation; it outputs 3 different CSV files. I'm trying to avoid having to create a database and send the information there, since that would require changing the script too much, and it's quite complex.
Does anyone know if there's a way to just send the 3 files to some kind of folder in Azure? Or maybe another solution that wouldn't require messing too much with the script?
Sorry if it is a dumb question, I'm not very familiar with Azure yet.
Probably the easiest option is to continue writing the files as you are now, then, after each file is written, have your PowerShell code upload it to blob storage using Set-AzureStorageBlobContent. See https://savilltech.com/2018/03/25/writing-to-files-with-azure-automation/ for an example.
You can read more about using PowerShell to upload to blob storage, including all the steps you need to create the storage account and container, at https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell.
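A minimal sketch of what that looks like inside the runbook, assuming the AzureRM-era storage cmdlets available in Azure Automation at the time; the storage account, key, container and file names are placeholders.

# Build a storage context once, then push each CSV to blob storage after it is written.
$context = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $storageKey

$files = "report1.csv", "report2.csv", "report3.csv"
foreach ($file in $files) {
    $localPath = Join-Path $env:TEMP $file
    # ... the existing script writes the CSV to $localPath here ...
    Set-AzureStorageBlobContent -File $localPath -Container "reports" -Blob $file -Context $context -Force
}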

Anyone know a good rule of thumb for what's in .CSDEF versus .CSCFG?

I was surprised by a few questions on the 532 and 533 exams that more or less wanted me to recall exactly which settings live in which configuration file for Cloud Services. At a basic level this is a pretty tough thing to discern without the documentation in front of me.
For example: the instance count for a given role is defined in the .cscfg file, but the instance size for a role is in the .csdef. It's not obvious to me why one versus the other is appropriate.
Anyone have any useful tips for remembering/recalling what goes where?
The main difference is that you can upload a new service configuration file (.cscfg) without redeploying the cloud service, so configuration values can be changed without any downtime. There aren't many settings that can go in the service configuration file (.cscfg), so just remember those and assume that all other settings go in the service definition file (.csdef).
Here's a great article on the subject: What is the Cloud Service Model and how do I package it?
Any settings changeable on the fly are in the configuration file. The definition file has several items that can only be changed with a redeployment, along with the declaration of the user-defined settings you'll want to change on the fly (the list of settings itself is static, but the values, supplied in the configuration file, are changeable).
You might argue that some settings belong in the configuration file rather than the definition file (e.g. a role's VM size), but those are not changeable on the fly.
Schemas are fully published for both the configuration file and the definition file.
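As a memory aid, here is a trimmed-down sketch of the two files; the service, role and setting names are placeholders and most optional elements are omitted.

<!-- ServiceDefinition.csdef: structure that requires a redeployment to change -->
<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1" vmsize="Small">
    <ConfigurationSettings>
      <Setting name="StorageConnectionString" /> <!-- declared here, value lives in the .cscfg -->
    </ConfigurationSettings>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>

<!-- ServiceConfiguration.cscfg: values you can swap at run time -->
<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="2" />
    <ConfigurationSettings>
      <Setting name="StorageConnectionString" value="DefaultEndpointsProtocol=https;..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>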

log4net - configure using multiple configurations

I use log4net as my logging tool, and everything works really well when the test system has just a single database.
But my real system has more than one database; different users may have different databases. I want to write the log information to a different database according to the currently logged-in user.
As far as I know, log4net doesn't support this; it seems log4net is configured just once in the application's lifetime.
Is it possible to make log4net select the database configuration on the fly, based on my own information?
I found the answer:
log4net - configure using multiple configuration files
and this: http://logging.apache.org/log4net/release/manual/repositories.html
The reason I thought that log4net only supports one configuration is that I did NOT dig deep enough. As the links above explain, we need to create our own repository for each configuration.
Now everything is working well, and the log information goes to the different databases on the fly, based on the given configuration, just as I expected.
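For anyone landing here later, a minimal sketch of the per-repository approach, assuming one log4net config file per database/tenant; the repository, file and logger names here are only illustrative.

using System.IO;
using log4net;
using log4net.Config;
using log4net.Repository;

public static class TenantLogging
{
    // Call once per database/tenant: create a named repository and configure it from its own XML file.
    public static void ConfigureTenant(string tenant, string configPath)
    {
        ILoggerRepository repository = LogManager.CreateRepository(tenant);
        XmlConfigurator.Configure(repository, new FileInfo(configPath));
    }

    // Resolve a logger from the repository that matches the current user's database.
    public static ILog GetLoggerFor(string tenant, string loggerName)
    {
        return LogManager.GetLogger(tenant, loggerName);
    }
}

// Usage:
// TenantLogging.ConfigureTenant("CustomerA", "log4net.customerA.config");
// var log = TenantLogging.GetLoggerFor("CustomerA", "MyApp.Orders");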

In Azure, where does the cscfg file come from?

So in Azure, I created a cloud service, and now I want to upload a deployment. It asks for a package (sure, that's easy, a zip file) and a configuration file (.cscfg file). I understand that the .cscfg file is supposed to define the roles, network configuration, etc.
But I don't have a cscfg file. Where are they supposed to originate? Do I have to write one by hand? The documentation for that is substandard at best. Is there any way to generate one? Or do a deployment somehow that bypasses this step? My approach must be wrong on some level (unless I really do have to write one by hand, but I somehow doubt that is a typical case).
You can either rely on Visual Studio to create it, or create it manually with the command-line tools:
http://www.microsoft.com/en-us/download/details.aspx?id=15658
You can also create it using MSBuild:
http://msdn.microsoft.com/en-us/library/windowsazure/hh535755.aspx
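For reference, the command-line route looks roughly like this; the project and file names below are placeholders, so check the linked documentation for the exact switches:

cspack ServiceDefinition.csdef /generateConfigurationFile:ServiceConfiguration.cscfg /out:MyService.cspkg
msbuild MyCloudService.ccproj /t:Publish /p:Configuration=Release

If you use the Visual Studio cloud service project, it already keeps ServiceConfiguration.Cloud.cscfg and ServiceConfiguration.Local.cscfg next to the .csdef, and the Publish target drops the .cspkg and .cscfg into the app.publish output folder.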
