MSTest Parallelize setting getting modified by Azure pipeline

I have the following in the runsettings
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <!-- MSTest adapter -->
  <MSTest>
    <Parallelize>
      <Workers>0</Workers>
      <Scope>ClassLevel</Scope>
    </Parallelize>
  </MSTest>
</RunSettings>
I don't have the Parallelize setting in the AssemblyInfo.cs file.
However, before the tests are run, the Azure pipeline updates the Workers value to 4. I want it to remain at 0 so that the worker count is determined from the available CPU cores.
Does anyone have any thoughts on this?
Regards
I cleaned up the local files to make sure no old files are left over.
I tried different Workers settings, but it keeps reverting to 4.
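For reference, MSTest V2 can also express the same setting as an assembly-level attribute. A minimal sketch (note that, to my understanding, a value in the runsettings file normally takes precedence over this attribute, so it may not by itself defeat the pipeline's rewrite):

// AssemblyInfo.cs -- MSTest V2 assembly-level parallelization.
// Workers = 0 means "use the number of available CPU cores".
using Microsoft.VisualStudio.TestTools.UnitTesting;

[assembly: Parallelize(Workers = 0, Scope = ExecutionScope.ClassLevel)]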

Related

VS 2013 Azure Publish hangs with last Pubxml file

If I publish to an Azure Cloud Service using the cloud project and follow the publish wizard, it works fine, but only if I first delete the last Pubxml file (stored in the profile folder).
If I try to publish when a Pubxml file already exists, the wizard hangs when navigating between steps.
I'm using SDK v2.9.6, although I had the same issue on previous versions.
I have multiple subscriptions, and the issue seems to be that while the Pubxml does store the subscription, it is ignored and the default (the first one in the list alphabetically) is used instead.
So if I run a publish where there was a previous Pubxml, it jumps to the Diagnostics page. I click Next for the summary, and the problem areas are highlighted with a red error indicator (the wrong subscription is selected by default, so it cannot find the correct cloud service). However, I cannot click Back; it just hangs.
I'm using SDK v2.9.6 and didn't see this problem. Actually, my pubxml files have the subscription information included:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <AzureCredentials>{"ServiceManagementEndpoint":"https:\/\/management.core.windows.net\/","ResourceManagementEndpoint":"https:\/\/management.azure.com\/","SubscriptionId":"my-subscription-id"}</AzureCredentials>
    ...
  </PropertyGroup>
</Project>
Which version of the Azure SDK were your pubxml files generated from?

Loading Frame Hangs During Automated Coded UI Test

I am using a recorded action via Coded UI tests in Visual Studio 2013 to launch an internal website and log in. All of this processes normally, except that after logging in, a loading animation continues to spin until the test fails. Logging in manually without the automated test works flawlessly, and the page loads normally with a <2 second delay.
I have attempted to add a pause, but no matter how long I wait, the page does not load. Is there a reason the Coded UI test would cause the page to hang?
This post here explains the problem using a much simpler application and points to this answer here on SO:
Jquery AJAX success not getting triggered with Coded UI test project
If you do not have an app.config file, just right-click your project -> Add -> New Item -> Application Configuration File, then replace the existing text with:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <add key="WebWaitForReadyLevel" value="3"/>
  </appSettings>
</configuration>
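If you'd rather not rely on app.config, the same behavior can be set programmatically through the Coded UI playback settings. A minimal sketch (the test class and method names are illustrative; value 3 in the config corresponds to WaitForReadyLevel.Disabled):

using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class LoginTests
{
    [TestInitialize]
    public void Setup()
    {
        // Stop the playback engine from waiting on the page's ready state,
        // which is what keeps spinning forever in the scenario above.
        Playback.PlaybackSettings.WaitForReadyLevel = WaitForReadyLevel.Disabled;
    }
}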

Azure configuration with multiple tasks possible?

We have a third-party .NET application that comes with its own deployment tool. This deployment tool generates a custom startup.cmd file. We have the option to modify the CFG file but not the cmd file.
I would like to have my own cmd file. Could someone please confirm whether multiple tasks are allowed in a cloud service definition file?
For example...
<Startup>
  <Task commandLine="thirdpartyPropietryStartup1.cmd" executionContext="elevated" />
  <Task commandLine="startup\startup2.cmd" executionContext="elevated" />
</Startup>
Multiple startup tasks should work according to the documentation.
It is not that explicit, but it states that
simple tasks are executed synchronously, one at a time, in the order specified in the ServiceDefinition.csdef file.
which would make no sense if multiple tasks were not allowed.
If there are issues, you could always create a combined.cmd that runs thirdpartyPropietryStartup1.cmd and then startup\startup2.cmd (unless the tasks need different executionContext values); a sketch follows below.
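A minimal sketch of what that combined.cmd could look like (the error-handling lines are an assumption, not part of the original suggestion):

rem combined.cmd -- run the third-party startup script first, then our own.
call thirdpartyPropietryStartup1.cmd
if errorlevel 1 exit /b %errorlevel%
call startup\startup2.cmd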

What is the best way to centralize logging with NLog?

I have been assigned a project with a lot of poorly written code that is based around SharePoint.
It consists of about 15 subprojects, some of them being windows services, some web services, some web applications running inside of SharePoint, some being webparts and even console applications. They all run on the same server and call each other.
There are already many issues in production but they are hard to trace down.
The original developer must have been a fan of either Salinger or the Pokémon series, judging by his tireless effort to catch all exceptions. Unfortunately, none of them ever get reported or logged.
My current task is to introduce logging into the whole project so I can find now-invisible exceptions, follow tangled recurring calls, and at least have some stack traces. I decided to go with NLog, seeing that it's active and cool, as opposed to log4net, which is perfectly fine but somewhat less fancy for my taste.
Because the components are tightly coupled, I want to centralize logging in one file so related errors don't get scattered across the hard drive. Therefore, I am looking to have two or three different log files with five or more projects writing to each of them more or less simultaneously.
What is the best way to configure NLog to centralize logging? Should I have a config file for each project, or should related projects share them? Where should I put config file to log from SharePoint webparts? Am I going to face any permission issues?
I'm using SharePoint 2007.
The easiest way to centralize is probably to simply log to a database; one benefit is that multiple applications can write to a database more easily than to the same log file. For each application, configure NLog to log to the Database target, using the same Database target configuration parameters in each. Your NLog.config file might look something like this:
<?xml version="1.0" encoding="utf-8" ?>
<!--
  This file needs to be put in the application directory. Make sure to set
  the 'Copy to Output Directory' option in Visual Studio.
-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Debug"
      internalLogFile="nlog_log.log">
  <targets async="true">
    <target name="sqlexpress" xsi:type="Database">
      <connectionString>
        Data Source=.\SQLEXPRESS;Initial Catalog=LoggingDB;Integrated Security=True;
      </connectionString>
      <commandText>
        insert into LogTable(DateTime,Logger,LogLevel,Message,ProcessId,ManagedThreadId) values (@DateTime,@Logger,@LogLevel,@Message,@ProcessId,@ManagedThreadId);
      </commandText>
      <parameter name="@DateTime" layout="${date:format=yyyy\-MM\-dd HH\:mm\:ss.fff}"/>
      <parameter name="@Logger" layout="${logger}"/>
      <parameter name="@LogLevel" layout="${level}"/>
      <parameter name="@Message" layout="${message}"/>
      <parameter name="@ProcessId" layout="${processid}"/>
      <parameter name="@ManagedThreadId" layout="${threadid}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="sqlexpress" />
  </rules>
</nlog>
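The LogTable itself isn't shown above; here is a sketch of a matching table definition, inferred from the insert statement (the column names come from the config, the types are assumptions):

create table LogTable (
    Id int identity(1,1) primary key,
    DateTime datetime not null,
    Logger nvarchar(512),
    LogLevel nvarchar(16),
    Message nvarchar(max),
    ProcessId int,
    ManagedThreadId int
);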
You could certainly log to files in addition to (or instead of) logging to a database.
I am not familiar with doing this from SharePoint, so I can't comment on any configuration or permission issues that you might run into there.
Here is a link I found where there is a discussion of getting NLog to work in a SharePoint environment:
http://nlog-forum.1685105.n2.nabble.com/Is-anyone-using-NLog-with-SharePoint-td2171451.html
That link appears to put you at the top of the NLog forum instead of the specific post. Search for this text in the forum "Is anyone using NLog with SharePoint" and you should find the right post.
Good luck!
You could also just leverage the existing logging infrastructure in SharePoint and write to the ULS logs. This way your log information can be viewed in full context using a ULS log viewer. For SharePoint 2007, see this blog post on how to write to the ULS log:
SharePoint Trace Logs and the Unified Logging Service (ULS)
With SharePoint 2010 it has become even easier, thanks to improvements to the SPDiagnosticsService class, where you can use the new WriteTrace method.
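A sketch of what such a WriteTrace call looks like on SharePoint 2010 (the helper class, category name, and message are illustrative, not a prescribed pattern):

using Microsoft.SharePoint.Administration;

static class UlsLog
{
    // Writes one entry to the ULS trace log under a custom category,
    // so it shows up alongside SharePoint's own entries in a ULS viewer.
    public static void Trace(string message)
    {
        SPDiagnosticsService.Local.WriteTrace(
            0,
            new SPDiagnosticsCategory("MyApp", TraceSeverity.Medium, EventSeverity.Information),
            TraceSeverity.Medium,
            message);
    }
}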
Personally, I log exceptions to the Event Log, and I use NLog for logging details, debug information, and tracing.
Since NLog can easily be switched on and off, I only activate it when I'm debugging or when I need to inspect an exception in production. I never was a big fan of the default tracing functionality in .NET.
I prefer simple plain-text log files, although logging to a database works great as long as you don't have too many "log lines" implemented in your code.
I feel like we are working on the same project! Multiple projects consisting of web projects, core dll projects, console apps, services, etc. Unfortunately I'm not working in sharepoint like you are, but I can describe how I am trying to centralize our logging.
We have one core .NET Framework project. This is where I placed our wrapper class for the log. This project also holds the NLog DLLs and the NLog config file. In this core project file, you can add the following, which automatically copies the config when you build projects that depend on this core project:
<None Include="Logging\NLog.config">
  <Link>NLog.config</Link>
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
We found that some web projects that don't compile to a DLL don't automatically pull in the config file, so we leave that up to the build process. This centralizes our logging so you only have to manage a single config across them all.
In addition, remember that when you create a logger per class, the logger name should include the namespace, so you can create specific targets that filter on namespace if you want different settings for particular projects; see the sketch below.
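A minimal sketch of that convention (the namespace and class names are made up for illustration):

using NLog;

namespace MyCompany.Billing
{
    public class InvoiceService
    {
        // GetCurrentClassLogger names the logger after the declaring class,
        // including its namespace: "MyCompany.Billing.InvoiceService".
        private static readonly Logger Log = LogManager.GetCurrentClassLogger();

        public void Process()
        {
            // A rule such as <logger name="MyCompany.Billing.*" ... />
            // can then route this project's messages to its own target.
            Log.Info("Processing invoice");
        }
    }
}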
As for centralizing where the logs end up, we chose to use a file target and specify the full path. This is because on our servers the applications run off the C:\ drive, but we have a larger D:\ drive that can store the logs. In production we also have multiple servers, so we use Splunk to aggregate all of our logs.
If Splunk is out of the question and you are on a distributed system, a database sounds like a good idea, as suggested above. If you don't want to stand up a SQL instance, there are target wrappers for MongoDB as well.
Hopefully this is helpful; I'm curious whether anyone has suggestions or opinions on how I'm doing it as well!

Unable to commit WebAdministration changes in Azure Web Role

I have an Azure Web Role running on the new 1.3 SDK, and I am having permission issues when trying to make changes to IIS using Microsoft.Web.Administration.ServerManager. Whenever I execute CommitChanges(), it throws an UnauthorizedAccessException: "Cannot write configuration file due to insufficient permissions".
My ServerManager code is executing in the OnStart method of the RoleEntryPoint.
My understanding was that the purpose of moving to full IIS support in 1.3 was so that we could have greater control over the configuration of our application, including creating new IIS sites on the fly if desired.
Make sure your role is running with elevated privileges.
I think there are two questions here. First, the use of IIS in Azure: yes, using the 1.3 SDK means that we now have access to more features than we did previously. This means that we can set up more than one site, and virtual directories for our sites, in the configs, as shown in the training kit.
Second, there is the privileges issue you're hitting while trying to make changes programmatically. I'm going to presume that you're not trying to do one of the things that you can simply do through the config above. The most likely reason your code is erroring is that web roles do not run with admin privileges. Fortunately, in the 1.3 SDK we also have a way to run code with elevated privileges. As shown elsewhere in the training kit, you can create a separate .exe that you specify to be run at startup with elevated privileges in the config; a sketch of such a task definition follows.
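A hypothetical csdef fragment for that approach (ConfigureIis.exe is an illustrative name; the Startup/Task schema is the same one shown in the startup-tasks question above):

<Startup>
  <Task commandLine="ConfigureIis.exe" executionContext="elevated" taskType="simple" />
</Startup>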
Providing a clear example for reference to @smarx's answer.
Here is the configuration to run RoleEntryPoint.OnStart (WebRole.OnStart) with admin-level privileges:
<ServiceDefinition name="MyProject.Azure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition" schemaVersion="2015-04.2.6">
  <WebRole name="MyProject.WebRole" vmsize="Small">
    <Runtime executionContext="elevated"/> <!-- Required for certain WebRole.OnStart tasks (avoids insufficient-permission errors) -->
    <Sites>
      <!-- ... -->
    </Sites>
  </WebRole>
</ServiceDefinition>
