Understanding CruiseControl.NET continuous integration

I want to run a task/build after a file changes in my repository.
When I configure the interval trigger with buildCondition="IfModificationExists",
the task doesn't execute even though I changed a file and committed it.
The log says: No modifications detected.
How can I run the task right after a file change?
Thanks,
yehi

There is a bug in the Mercurial source control provider, as you can see here:
CruiseControl.NET's Filtered Source Control Provider Not Detecting Modifications When Using Mercurial
http://groups.google.com.ag/group/ccnet-user/browse_thread/thread/23cc02e1258a63ec
Otherwise, your configuration seems correct, except that you don't have login information in your source control block.

Personally, I'd do the following:
<triggers>
  <intervalTrigger initialSeconds="0" />
</triggers>
<sourcecontrol type="<type here>">
  ...
</sourcecontrol>
That polls the source control system and then only builds if there are changes.
Please try that and see.
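For concreteness, here is a minimal sketch of a complete project block built on that idea. It assumes a local Mercurial repository and an MSBuild solution at hypothetical paths, so treat it as a starting point rather than a drop-in config:
<project name="MyProject">
  <triggers>
    <!-- Poll every 60 seconds; build only when modifications are found. -->
    <intervalTrigger initialSeconds="0" seconds="60" buildCondition="IfModificationExists" />
  </triggers>
  <sourcecontrol type="hg">
    <!-- Hypothetical repository and working directory. -->
    <repo>C:\repos\myproject</repo>
    <workingDirectory>C:\build\myproject</workingDirectory>
  </sourcecontrol>
  <tasks>
    <msbuild>
      <projectFile>C:\build\myproject\MyProject.sln</projectFile>
    </msbuild>
  </tasks>
</project>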


How to resolve an upgrade exception in Guidewire

When I started the server in Guidewire ClaimCenter, I got this error:
com.guidewire.pl.system.exception.UpgradeException: Encryption has been removed or changed, but the OldEncryption plugin is not defined. Please see your documentation for details on encryption upgrade.
Please help me to resolve this error.
This exception happens when ClaimCenter detects that you have changed encryption schemes without properly leaving the old plugin registered.
You can have multiple plugins which implement the IEncryption interface.
Let's say you're using SHA1 encryption registered through a SHA1Encryption.gwp Plugin Registry:
<plugin
    interface="IEncryption"
    name="SHA1Encryption">
  <plugin-gosu
      gosuclass="com.mycompany.plugins.encryption.SHA1EncryptionPluginImpl"/>
</plugin>
And configured in config.xml:
<!-- The name of the current encryption plugin. -->
<param name="CurrentEncryptionPlugin" value="SHA1Encryption"/>
Then you decide to switch to AES encryption.
You first have to create a new Plugin Registry file AESEncryption.gwp:
<plugin
    interface="IEncryption"
    name="AESEncryption">
  <plugin-gosu
      gosuclass="com.mycompany.plugins.encryption.AESEncryptionPluginImpl"/>
</plugin>
Then you have to modify the config.xml to tell ClaimCenter to use the new Plugin:
<!-- The name of the current encryption plugin. -->
<param name="CurrentEncryptionPlugin" value="AESEncryption"/>
Do not remove the SHA1Encryption.gwp Plugin Registry.
ClaimCenter keeps track of the encryption plugin used to encrypt each record by the NAME of the Plugin Registry file. If it can't find that file, you will get this error.
ClaimCenter is also capable of detecting that the implementation class has changed even if the plugin name hasn't (a metadata change). In that case, it looks for a Plugin Registry named OldEncryption.gwp.
If it can't find the plugin by its specific name, AND can't find OldEncryption.gwp, then you will get this error.
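For that fallback case, here is a sketch of what such an OldEncryption.gwp Plugin Registry might look like, assuming the old implementation class is the SHA1EncryptionPluginImpl from above:
<plugin
    interface="IEncryption"
    name="OldEncryption">
  <plugin-gosu
      gosuclass="com.mycompany.plugins.encryption.SHA1EncryptionPluginImpl"/>
</plugin>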
More information can be found in the Integration Guide from Guidewire.
Sounds like the database you are connecting to is a newer or differently encrypted version than what the Guidewire application codebase you are running expects.
What is the value of CurrentEncryptionPlugin in your config.xml file?
<!-- The name of the current encryption plugin. -->
<param name="CurrentEncryptionPlugin" value="AESEncrypter"/>
I found this was due to my not having the extensions.properties file updated to the latest number. To fix:
Hit Ctrl+Shift+N
Search for "extensions.properties"
Go into the file and change the number to the appropriate number
Save
Restart Guidewire Studio
I was all set after I did this.
An easy way to prevent this type of exception:
Change the DB path in database-config.xml
Open the extensions.properties file; its content will look like
version=34
Increment the version value by one if you have changed any existing table structure
Restart the server
Whenever you get this UpgradeException, the console shows the newer upgrade version number along with the old version number. You have to update the new number in the extensions.properties file.

Visual Studio Test Explorer Error Logs

I've been attempting to fix a problem I have with Test Explorer not showing tests when I'm in a different configuration environment. I make local environment changes by updating my app.config.
I have already tried a few steps found here and in other sites including cleaning my project, updating to the latest xUnit test runner, etc.
I am sure the reason is code/project based and not the Visual Studio environment itself. What I would like to know is whether or not there is a log file I can look at that would help me to determine why my tests are not showing. Or, is there another good method to troubleshoot errors with tests not showing up in Test Explorer?
There is an MSDN blog post about this.
Essentially, you need to:
Go to folder %VSInstallDir%\Common7\IDE\CommonExtensions\Microsoft\TestWindow
Find the config file for the process you want to debug (e.g. vstest.console.exe.config, vstest.discoveryengine.exe.config, etc.)
Alter the config file's system.diagnostics node to include:
<system.diagnostics>
<switches>
<add name="TpTraceLevel" value="4" />
</switches>
</system.diagnostics>
Close Visual Studio, open it again, and discover/run tests from the Test Explorer window
Look for logs under %temp%\[processname].TpTrace.log (e.g. vstest.console.TpTrace.log)
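For orientation, here is a sketch of where that node sits in, say, vstest.console.exe.config; the rest of the file is left untouched:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- Existing sections of the file stay as they are; only this node is added or edited. -->
  <system.diagnostics>
    <switches>
      <!-- 4 corresponds to Verbose; lower values reduce trace detail. -->
      <add name="TpTraceLevel" value="4" />
    </switches>
  </system.diagnostics>
</configuration>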

Azure configuration with multiple tasks possible?

We have a third-party .NET application which comes with its own deployment tool. This deployment tool generates a custom startup.cmd file. We have the option to modify the CFG file but not the cmd file.
I would like to have my own cmd file. Could someone please confirm whether multiple tasks are allowed in a cloud service definition file?
For example...
<Startup>
  <Task commandLine="thirdpartyPropietryStartup1.cmd" executionContext="elevated" />
  <Task commandLine="startup\startup2.cmd" executionContext="elevated" />
</Startup>
Multiple startup tasks should work according to the documentation.
It is not that explicit, but it states that
simple tasks are executed synchronously, one at a time, in the order specified in the ServiceDefinition.csdef file.
which would make no sense if multiple tasks were not supported.
If there are issues, you could always create a combined.cmd which runs thirdpartyPropietryStartup1.cmd and then startup\startup2.cmd (unless your executionContext should be different), as sketched below.
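A minimal sketch of such a combined.cmd, assuming both scripts sit relative to the role root as in the ServiceDefinition above:
REM combined.cmd - run the third-party startup script, then our own.
call thirdpartyPropietryStartup1.cmd
REM Abort if the first script failed so the role surfaces the error.
if errorlevel 1 exit /b 1
call startup\startup2.cmd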

What is the best way to centralize logging with NLog?

I have been assigned a project with a lot of poorly written code that is based around SharePoint.
It consists of about 15 subprojects: some are Windows services, some web services, some web applications running inside SharePoint, some are web parts, and there are even console applications. They all run on the same server and call each other.
There are already many issues in production but they are hard to trace down.
The original developer must have been a fan of either Salinger or the Pokémon series, judging by his tireless effort to catch all exceptions. Unfortunately, none of them get reported or logged, ever.
My current task is to introduce logging into the whole project so I could find now-invisible exceptions, follow tangled recurring calls and have some stack traces at least. I decided to go with NLog, seeing it's active and cool, as opposed to log4net which is perfectly fine but somewhat not as fancy to my taste.
Because the components are tightly coupled, I want to centralize logging in one file so related errors don't get scattered across the hard drive. Therefore, I am looking to have two or three different log files with five or more projects writing to each of them more or less simultaneously.
What is the best way to configure NLog to centralize logging? Should I have a config file for each project, or should related projects share them? Where should I put config file to log from SharePoint webparts? Am I going to face any permission issues?
I'm using SharePoint 2007.
The easiest way to centralize is probably to simply log to a database; one benefit is that multiple applications can write to a database more easily than to the same log file. For each application, configure NLog to log to the Database target, using the same Database target configuration parameters for each. Your NLog.config file might look something like this:
<?xml version="1.0" encoding="utf-8" ?>
<!--
  This file needs to be put in the application directory. Make sure to set
  the 'Copy to Output Directory' option in Visual Studio.
-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogLevel="Debug"
      internalLogFile="nlog_log.log">
  <targets async="true">
    <target name="sqlexpress" xsi:type="Database">
      <connectionString>
        Data Source=.\SQLEXPRESS;Initial Catalog=LoggingDB;Integrated Security=True;
      </connectionString>
      <commandText>
        insert into LogTable(DateTime,Logger,LogLevel,Message,ProcessId,ManagedThreadId) values (@DateTime,@Logger,@LogLevel,@Message,@ProcessId,@ManagedThreadId);
      </commandText>
      <parameter name="@DateTime" layout="${date:format=yyyy\-MM\-dd HH\:mm\:ss.fff}"/>
      <parameter name="@Logger" layout="${logger}"/>
      <parameter name="@LogLevel" layout="${level}"/>
      <parameter name="@Message" layout="${message}"/>
      <parameter name="@ProcessId" layout="${processid}"/>
      <parameter name="@ManagedThreadId" layout="${threadid}"/>
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="sqlexpress" />
  </rules>
</nlog>
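The LogTable schema isn't shown above; here is a sketch of one that matches the insert statement (column names per the commandText, types illustrative):
CREATE TABLE LogTable (
    Id int IDENTITY(1,1) PRIMARY KEY,
    DateTime datetime NOT NULL,
    Logger nvarchar(512),
    LogLevel nvarchar(16),
    Message nvarchar(max),
    ProcessId int,
    ManagedThreadId int
);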
You could certainly log to files in addition to (or instead of) logging to a database.
I am not familiar with doing this from SharePoint, so I can't comment on any configuration or permission issues that you might run into there.
Here is a link I found where there is a discussion of getting NLog to work in a SharePoint environment:
http://nlog-forum.1685105.n2.nabble.com/Is-anyone-using-NLog-with-SharePoint-td2171451.html
That link appears to put you at the top of the NLog forum instead of the specific post. Search for this text in the forum "Is anyone using NLog with SharePoint" and you should find the right post.
Good luck!
You could also just leverage the existing logging infrastructure in SharePoint and write to the ULS logs. This way your log information can be viewed in complete context using a ULS log viewer. For SharePoint 2007, see this blog post on how to write to the ULS log:
SharePoint Trace Logs and the Unified Logging Service (ULS)
With SharePoint 2010 it has become even easier with improvements to the SPDiagnosticsService class, where you can use the new WriteTrace method, as sketched below.
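A minimal sketch of that 2010-era approach; the category name "MyApp" and the severities are illustrative:
using Microsoft.SharePoint.Administration;

public static class UlsLogger
{
    public static void LogError(string message)
    {
        // Writes to the ULS trace log under a custom category.
        SPDiagnosticsService.Local.WriteTrace(
            0,
            new SPDiagnosticsCategory("MyApp",
                TraceSeverity.Unexpected, EventSeverity.Error),
            TraceSeverity.Unexpected,
            message);
    }
}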
Personally, I log exceptions to the Event Log, and I use NLog for logging details, debug information, or tracing.
Since NLog can easily be switched on and off, I only activate it when I'm debugging or when I need to inspect an exception in production. I never was a big fan of the default tracing functionality in .NET.
I prefer simple plain-text log files. Logging to a database works great too, as long as you don't have too many "log lines" implemented in your code.
I feel like we are working on the same project! Multiple projects consisting of web projects, core DLL projects, console apps, services, etc. Unfortunately I'm not working in SharePoint like you are, but I can describe how I am trying to centralize our logging.
We have one core .NET Framework project. This is where I placed our wrapper class around the logger. This project also holds the NLog DLLs and the NLog config file. In this core project file you can add the following, which automatically copies the config when you build projects that depend on this core project:
<None Include="Logging\NLog.config">
  <Link>NLog.config</Link>
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
We found that some web projects that don't compile to a DLL don't automatically pull in the config file, so we leave that up to the build process. This helps centralize our logging, since you only have to manage a single config across them all.
In addition, remember that when you create a logger per class, the logger name should have the namespace in it, so you can create specific targets that filter based on namespace if you want different settings for particular projects; see the sketch below.
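A minimal sketch of such a per-class logger (the namespace and class names are made up):
using NLog;

namespace MyCompany.Web
{
    public class OrderController
    {
        // GetCurrentClassLogger() names the logger "MyCompany.Web.OrderController",
        // so a rule like name="MyCompany.Web.*" can target this project alone.
        private static readonly Logger Log = LogManager.GetCurrentClassLogger();

        public void PlaceOrder(int orderId)
        {
            Log.Info("Placing order {0}", orderId);
        }
    }
}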
As for centralizing where the logs end up, we chose to use a file target and specify the full path. This is because on our servers the applications run off C:\ but we have a larger D:\ which can store the logs. In production we also have multiple servers, so we use Splunk to aggregate all of our logs.
If Splunk is out of the question and you are on a distributed system, a database sounds like a good idea, as suggested above. If you don't want to stand up a SQL instance, there are target wrappers for MongoDB as well.
Hopefully this is helpful. I'm curious whether anyone has suggestions or opinions on how I'm doing it as well!

SharePoint and Log4Net

I'm looking for best practices to integrate log4net into SharePoint for web requests, feature activation, and all the timer stuff.
I have several subprojects in my farm, and I would like to have only one Log4Net.config file.
[Edit]
Not only do I need to configure log4net for the web application, which is easy to do (I use global.asax and a log4net.config file, so I can modify log settings without reloading the web app), but I also need to log asynchronous events:
Event Handler (like ItemAdded)
Timer Jobs
...
I implemented this recently and came up with a solution that worked for me.
Deploy your log4net config file to the 12 hive and the log4net DLL into the GAC using a globally scoped solution. Then, in your application code, explicitly initialize log4net from the location of your global file. This allows you to log from feature receivers, timer jobs, and web application code.
[assembly: log4net.Config.XmlConfigurator(ConfigFile =
    @"C:\Program Files\Common Files\Microsoft Shared\" +
    @"Web Server Extensions\12\CONFIG\log4net.config", Watch = true)]
See here: http://www.codeproject.com/KB/sharepoint/SharepointLog4Net.aspx
Firstly, you will need to modify the web.config where your SharePoint virtual directory resides. This is because you'll need to add SafeControl entries to trust the log4net assembly. You can update the web.config programmatically using the SPWebConfigModification class in a feature receiver (see the sketch below). As you have to modify web.config anyway, you may want to consider including your log4net config inside it instead of setting up an external log4net config.
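A sketch of such a feature receiver, assuming a web-application-scoped feature; the log4net version and public key token shown are from an older log4net build, and the owner string is illustrative:
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class Log4NetFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        var webApp = (SPWebApplication)properties.Feature.Parent;

        // Register log4net as a SafeControl across the web application.
        var mod = new SPWebConfigModification
        {
            Path = "configuration/SharePoint/SafeControls",
            Name = "SafeControl[@Assembly='log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=1b44e1d426115821']",
            Owner = "Log4NetFeature",
            Sequence = 0,
            Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode,
            Value = "<SafeControl Assembly=\"log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=1b44e1d426115821\" Namespace=\"log4net\" TypeName=\"*\" Safe=\"True\" />"
        };
        webApp.WebConfigModifications.Add(mod);
        webApp.Update();
        webApp.WebService.ApplyWebConfigModifications();
    }

    // SharePoint 2007 declares these as abstract, so they must be overridden.
    public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { }
    public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
    public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
}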
However, if you'd still like to do this, it may work if you add the following to the web.config file:
<configuration ...>
  ...
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler,log4net" />
  </configSections>
  <log4net configSource="log4net.config" />
  ...
</configuration>
The log4net.config file should then be able to live alongside your web.config. As Nat says, you could deploy this file as a solution package.
Assuming you are attempting to run at minimal trust, you will need to update your Code Access Security file to include the log4net assemblies as well. All of your custom SharePoint code should then automatically use your log4net configuration.
You could release the config file as part of the solution package(s) to the 12 hive (use STSDev to create any packages). This would give you a set location for the config, and any changes to it can be released in a controlled manner (i.e. no need for manual edits; just roll back and re-install the solution).
I developed a log4net feature and packaged it in a WSP file. The feature receiver adds an HttpModule to the web.config, and that HttpModule loads the log4net.config from the layouts directory when the application start event is raised.
