I am working with Windows Azure Diagnostics. I added the code below in WebRole.cs:
try
{
    string wadConnectionString = "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue(wadConnectionString));
    RoleInstanceDiagnosticManager roleInstanceDiagnosticManager = storageAccount.CreateRoleInstanceDiagnosticManager(
        RoleEnvironment.DeploymentId,
        RoleEnvironment.CurrentRoleInstance.Role.Name,
        RoleEnvironment.CurrentRoleInstance.Id);

    DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();

    // Windows Azure logs
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Undefined;

    // IIS 7.0 logs and Failed Request logs (both are directory-based data sources)
    config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);

    // Windows Event logs
    // config.WindowsEventLog.DataSources.Add("System!*");
    config.WindowsEventLog.DataSources.Add("Application!*");
    config.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);

    // Crash dumps
    CrashDumps.EnableCollection(true);

    // Update the running configuration with the changes
    roleInstanceDiagnosticManager.SetCurrentConfiguration(config);
}
catch (Exception)
{
    System.Diagnostics.Trace.TraceWarning("Diagnostics failed");
}
I also added the remaining necessary settings in Web.config and the connection string in the .cscfg file.
Now I am able to log the diagnostics from the development environment using the deployment storage, but when I host the same application in the cloud the diagnostics are not logged and I get an error like:
"500 - Internal server error.
There is a problem with the resource you are looking for, and it cannot be displayed."
I tried setting Copy Local to true for the referenced assemblies, but that doesn't work.
I want the application to work in the deployed environment.
If anyone has any idea how to resolve this, please reply.
Thanks in advance.
It looks like the problem is that you are not changing the connection string for "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString". You can change this in the settings for the web role project or in your service configuration file. Set it to your account name and key. I normally do this with a build script so I can change it when I push to production. You can check out the post here and the code here.
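For reference, a minimal sketch of what that setting looks like in the ServiceConfiguration (.cscfg) file, with a placeholder account name and key:
<ConfigurationSettings>
  <!-- Placeholder credentials: substitute your real storage account name and key -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
</ConfigurationSettings>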
Related
I'm new to Azure and I have the following issue:
I have this key in my web.config:
<add key="BlobStorageConnectionString" value="xxxxxxxx" />
The thing is, when I add it to the Application settings in the Azure App Service and then search the logs, I get this:
Getting "BlobStorageConnectionString" from ServiceRuntime: FAIL
Getting "BlobStorageConnectionString" from ConfigurationManager: PASS.
I've already tried a few tutorials, but I still can't find the reason.
I'm running out of ideas. Any suggestions?
If you add the storage account connection string in the Application settings, it is stored as an environment variable, so you could read it with Environment.GetEnvironmentVariable("storageconnectionstring").
Then the code to parse it would look like the below.
string storageConnectionString = Environment.GetEnvironmentVariable("storageconnectionstring");
CloudStorageAccount storageAccount;

// Check whether the connection string can be parsed.
if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
{
    try
    {
        // Create the CloudBlobClient that represents the Blob storage endpoint for the storage account.
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
        // ... work with cloudBlobClient here ...
    }
    catch (StorageException ex)
    {
        Console.WriteLine("Error returned from the service: {0}", ex.Message);
    }
}
You could also configure the connection in an app such as a WebJob by using JobHostConfiguration(); the connection name should be AzureWebJobsStorage. The code would be like this:
var config = new JobHostConfiguration();
config.DashboardConnectionString = "";
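As a sketch only (assuming the WebJobs SDK 2.x JobHostConfiguration), the storage and dashboard connection strings can also be set explicitly in code instead of relying on the default AzureWebJobsStorage / AzureWebJobsDashboard app settings:
// Sketch: by default these values are read from the AzureWebJobsStorage
// and AzureWebJobsDashboard app settings / connection strings.
var config = new JobHostConfiguration
{
    StorageConnectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage"),
    DashboardConnectionString = Environment.GetEnvironmentVariable("AzureWebJobsDashboard")
};
var host = new JobHost(config);
host.RunAndBlock();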
Also, you could choose to use configuration classes; for the details you could refer to this article.
Hope this helps; if you still have other questions, please let me know.
I have an Azure App Service that hosts a WordPress site. I want to write a console application that will copy files from the website (file storage) and paste them into a deployment slot. All of the online resources talk about "access keys" for connecting to the file storage, but I do not see anything like this in the App Service portal. Can I use deployment credentials or Web Deploy credentials to access these files?
According to your description, I suggest you use a WebJob's file trigger to achieve your requirement.
Link: webjob-extension
You could use the file trigger to watch for file changes in your site's file path, find the deployment slot's FTP credentials, and then use them to upload the file from the production folder to the deployment slot with the WebJobs extension package.
For more details, you could refer to the following images and code:
1. Find the FTP credentials and set a password.
Set the username and password.
2. Install Microsoft.Azure.WebJobs.Extensions from the NuGet package manager and write the WebJob method.
Code like below:
Note: The default root path is D:\home\data; if your files are inside your website folder, you need to change the path as shown below.
static void Main()
{
    var config = new JobHostConfiguration();
    FilesConfiguration filesConfig = new FilesConfiguration();

    string home = Environment.GetEnvironmentVariable("HOME");
    if (!string.IsNullOrEmpty(home))
    {
        filesConfig.RootPath = Path.Combine(home, "site");
    }
    config.UseFiles(filesConfig);

    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
Function:
public static void ImportFile(
    [FileTrigger(@"wwwroot\doc\{name}", "*.*", WatcherChangeTypes.Created | WatcherChangeTypes.Changed)] Stream file,
    FileSystemEventArgs fileTrigger,
    TextWriter log)
{
    log.WriteLine(string.Format("Processed input file '{0}'!", fileTrigger.Name));

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(string.Format("ftp://yourftpurl.ftp.azurewebsites.windows.net/site/wwwroot/doc/{0}", fileTrigger.Name));
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("username", "password");

    Stream requestStream = request.GetRequestStream();
    file.CopyTo(requestStream);
    requestStream.Close();

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    log.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
    response.Close();
}
Result:
If you add a file to the production slot's doc folder, the WebJob will copy it to the deployment slot's doc folder.
You could use the "Azure Site Replicator" extension. A slot is like another azure app service, so it should replicate between slots just fine.
In your deployment slot that you want everything copied to, download the Publish Settings from the overview tab by clicking "Get Publish Profile"
In your app service production slot go to Extensions and add the Site Replicator extension. Then after it is installed, click it and click 'Browse.' That will open a new window with the configuration options.
In the configuration window, upload the Publish Settings file.
I am new to Azure application development. As per the requirements given to me, I developed an application on Azure and successfully deployed it to the Azure cloud. Blob storage was also being used in that application. Everything was working fine.
When I was deploying it to the cloud I was not very familiar with Azure deployment, so I deployed it as a cloud service. It worked fine, but the only issue was slow loading the very first time. After doing a lot of research I found some solutions, so I deployed it as a web app and the slow loading problem was resolved. But with the web app deployment, I am facing another problem on a single page which uses blob storage. Below is the error I get when opening that specific page:
Below is the code I had written:
public List<string> ListContainer()
{
    List<string> blobs = new List<string>();
    if (!Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable) return null;

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.GetConfigurationSettingValue("FileStorageAccount"));
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    IEnumerable<CloudBlobContainer> containers = blobClient.ListContainers();
    foreach (CloudBlobContainer item in containers)
    {
        foreach (IListBlobItem blob in item.ListBlobs())
        {
            blobs.Add(string.Format("{0}", blob.Uri));
        }
    }
    return blobs;
}
It works fine when running from Visual Studio, and if I go with the cloud service deployment I do not get this specific error. Other pages work fine with the web app deployment; only the page which uses blob storage causes the error.
I have done a lot of research but no luck. Please help!
I suspect you have not defined the "FileStorageAccount" setting in the .cscfg file. Try using the CloudConfigurationManager.GetSetting method; it will read configuration values from the web.config or the service configuration file. See also https://stackoverflow.com/a/19643516/5382426.
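As a minimal sketch of that suggestion (assuming the Microsoft.WindowsAzure.ConfigurationManager NuGet package is referenced), the lookup could be written without the RoleEnvironment dependency like this:
// CloudConfigurationManager checks the cloud service configuration first and
// falls back to web.config/app.config appSettings when not running in a role.
string connectionString = CloudConfigurationManager.GetSetting("FileStorageAccount");
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();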
I am a beginner in Azure and need some help. We are facing a bit of a problem with the Azure Storage services and are unable to proceed.
OK, now the issue is:
http://blogs.msdn.com/b/windowsazurestorage/archive/2014/08/05/microsoft-azure-storage-service-version-removal.aspx
To summarize:
We have to inspect the logged version for all of our blob, table, and queue requests in case any of them are using a version that is set for planned removal. I have enabled logging for the web application on the Azure portal. I am able to see the three services as below:
https://<accountname>.blob.core.windows.net
https://<accountname>.table.core.windows.net
https://<accountname>.queue.core.windows.net
Now, from the articles below I gather that the logs have the format shown here, where a version is included, but they have NOT specified where to locate the logs or how to gather them. I have tried different things, such as using https://<accountname>.blob.core.windows.net/$logs, but it makes no difference.
The required logs should be in this format (sample):
Here is a sample log entry, with the version used highlighted – in this case the request was an anonymous GetBlob request which implicitly used the 2009-09-19 version:
1.0;2011-08-09T18:52:40.9241789Z;GetBlob;AnonymousSuccess;200;18;10;anonymous;;myaccount;blob;"https:// myaccount.blob.core.windows.net/thumbnails/lake.jpg?timeout=30000";"/myaccount/thumbnails/lake.jpg";a84aa705-8a85-48c5-b064-b43bd22979c3;0;123.100.2.10;2009-09-19;252;0;265;100;0;;;"0x8CE1B6EA95033D5";Friday, 09-Aug-11 18:52:40 GMT;;;;"8/9/2011 6:52:40 PM ba98eb12-700b-4d53-9230-33a3330571fc"
Can you please show me a way to view these logs? Is there any tool I can use?
Since these logs are stored in a blob container called $logs, any storage explorer which supports viewing data from this blob container can be used to view its contents. To the best of my knowledge, the following tools support viewing data from this container: Azure Storage Explorer, Cerebrata Azure Management Studio, and Cloud Portam (disclosure: I am the developer working on this tool).
However, before you can view the data you need to enable logging on your storage account; only when logging is enabled will you see this container show up in your storage account. To enable logging, again you can use Azure Management Studio or Cloud Portam, or you could use the code below (which assumes you have the latest version of the Storage Client Library):
static void SetLoggingProperties()
{
    CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials(StorageAccount, StorageAccountKey), true);

    LoggingProperties properties = new LoggingProperties()
    {
        LoggingOperations = LoggingOperations.All,
        RetentionDays = 365,
        Version = "1.0",
    };

    ServiceProperties serviceProperties = new ServiceProperties()
    {
        Cors = null,
        HourMetrics = null,
        MinuteMetrics = null,
        Logging = properties,
    };

    var blobClient = account.CreateCloudBlobClient();
    blobClient.SetServiceProperties(serviceProperties);

    var tableClient = account.CreateCloudTableClient();
    tableClient.SetServiceProperties(serviceProperties);

    var queueClient = account.CreateCloudQueueClient();
    queueClient.SetServiceProperties(serviceProperties);
}
Once logging properties are set, give it some time for logs to show up.
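If you would rather script it than use a GUI tool, a minimal sketch (assuming a CloudStorageAccount named account, constructed the same way as in the code above) that lists the log blobs in the $logs container could look like this:
// Each blob name in $logs encodes the service, date and hour,
// e.g. blob/2014/08/05/1800/000000.log
var logsContainer = account.CreateCloudBlobClient().GetContainerReference("$logs");
foreach (var item in logsContainer.ListBlobs(useFlatBlobListing: true))
{
    Console.WriteLine(item.Uri);
}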
I am new to the Azure platform and simply trying to pull my IIS logs into my storage account.
When running locally and using the storage emulator, I can see the logs without issue. However, when deploying the application, the log files are never created and the only container I see in blob storage is "vsdeploy".
I have followed the steps outlined here: http://www.windowsazure.com/en-us/develop/net/common-tasks/diagnostics/
I have imported the diagnostics module in my ServiceDefinition.csdef
...
<Imports>
  <Import moduleName="Diagnostics" />
</Imports>
...
I have created a WebRole.cs class and configured directories for a scheduled transfer every two minutes.
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();
        diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(2.0); // IIS logs
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);
        return base.OnStart();
    }
}
I have also verified my storage account connection string for cloud deployment is correct.
Still, when I deploy, nothing is ever created in the storage account. What piece am I missing or not configuring correctly that is keeping me from my logs?
Thanks.