Shared code libraries and cruise control testing - cruisecontrol.net

I have 2 applications sharing a common library. Both apps and the library are in active development. Both apps include the library's project file in their solution.
The folders are laid out in source control as:
Root
  App1
  App2
  Library
We currently have separate CruiseControl.NET builds set up to run any time a file is committed to the App1, App2, or Library folder hierarchies. A successful build of Library then triggers builds of App1 and App2.
For the most part this works well; the problem is when someone commits changes to both Library and App1 (or App2). This generally happens when implementing a feature that requires modifying or adding something in Library. When this occurs, the triggers for building both Library (change in Library\foo.cs) and App1 (change in App1\bar.cs) activate. Both see that the file Base\Library\foo.cs has changed and attempt to rebuild Library. Only one succeeds, because the first one to start writing an object file for Library gets an exclusive file lock; the second immediately fails. This has happened several times, forcing someone to go in and manually rerun the build that failed due to the lock.
To try to mitigate the risk of this happening again, we've changed the polling intervals of the triggers to different values so that two are less likely to fire at the same time. It's still not perfect, because the polling cycles for Library and AppN will still coincide every N*M seconds (N and M being the respective polling intervals).
Is there a more elegant or less likely to fail solution available?

Yes. I believe you want to put all 3 projects in the same queue. This will prevent the projects from being built simultaneously.
You'll want something like this in your CCNet.Config file:
<project name="Library" queue="Q1" queuePriority="1">
<project name="App1" queue="Q1" queuePriority="2">
<project name="App2" queue="Q1" queuePriority="3">

Related

How to process wro4j at startup?

I have to use the wro4j runtime solution. However, the first request to the server for the processed CSS file is very slow.
For production mode, I would like wro4j to generate its files at application startup, to avoid that first slow request.
Here is my scenario, in case someone would advise me on an alternative approach:
I have a Maven project which is built once (say generic.war) but customized for each hosted client (client1.war, client2.war, etc.).
For each client, the appearance of the app can be overridden at different levels.
So I have a generic Maven project, and then another routine that unpacks the war (generic.war), customizes it by simply overwriting the desired files, and repacks it for a specific client (i.e. client1.war).
This approach of generating specific wars by overwriting files is already in place and used all the time.
But now I want to use wro4j with this system. The first idea is to do the above, overwriting .less files from the generic ones and relying on runtime wro4j to do the final processing in the specific wars (client1.war, client2.war, etc.).
But I don't want the first request to hang, I want the groups already in cache for the first request.
I saw this post, but it's a bit old now and I couldn't work out how to apply the recommended solution (there is no example, and the part on how to trigger the processing from the ServletContextListener is not clear to me).
Thanks in advance :)

Why does VS2013 publish all website files when using a different machine?

I have a home machine and an office machine that I use to publish websites with Visual Studio 2013. If I make a change from the same machine and re-publish, just the changes are published, not all files.
However, when using my clone machine at the office, even if I do a get-latest, make one small change, and re-publish, all files are published, not just the ones that changed and not just the ones that were recompiled. ALL DLL files, even third-party DLLs that have not changed and have not been recompiled with a new date, are republished. The same thing happens if my cohort publishes a small change from his machine after I did the last publish. It's not a problem when publishing twice from the same machine, as then only the changed files are published.
Is there any way to prevent a complete republish just because a different machine is used to publish than the one used for the last publish? Thanks.
This seems to make "Determining Changes" a lot slower, but for .NET 4.5 [and up(?)], use this info from https://msdn.microsoft.com/en-us/library/ee942158:
To configure Web Deploy to use checksums instead of dates to determine which files need to be copied to the server, add the following element to the .pubxml file (Publish Settings):
<MSDeployUseChecksum>true</MSDeployUseChecksum>
Like this:
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <MSDeployUseChecksum>true</MSDeployUseChecksum>
    <!-- other settings omitted to keep the example short -->
    <PublishDatabaseSettings>
      <!-- this section omitted to keep the example short -->
    </PublishDatabaseSettings>
  </PropertyGroup>
</Project>
First of all, I do not completely understand the behavior of MSBuild + VS2013 and the publishing feature, as I have only just started using the publishing feature myself. I'm looking for a way to speed up publishing via FTP web deploy in VS2013. (I'm not using TFS for a get-latest, though.)
I would say this is partially explained, in a different context, in this SO question about the MSBuild process. Timestamps of certain files are compared, which tells MSBuild/VS2013 whether a target (build output) is up to date or not. Files that are not up to date are then recompiled.
As you all work on different machines, the timestamps are likely to differ quite soon.
To find out what is actually going on during builds/publishes, set the build output verbosity to Detailed or Diagnostic for a moment:
VS2013 menu - Tools - Options... - Projects and Solutions - Build and Run - MSBuild project build output verbosity - set to Detailed or Diagnostic. Run the build and look at the Output panel/tab in VS2013. Select "Show output from: Build" to see the results if they are not already visible. Don't forget to restore the original setting after checking the build details, as the verbose output can slow the build down a bit.
But why are even unchanged third-party DLLs being republished? Possibly because these DLLs ARE actually overwritten during a build. You might have the assembly reference property "Copy Local" set to true so that your website runs without any manual uploads, or you may be using a command-line copy with the overwrite parameter explicitly set in the project's post-build event (like 'copy /y ...' or 'xcopy /y ...'). In that case the timestamp of the file to be published is definitely overwritten; see the "obj\Release\Package\PackageTmp" folder (or, for example, "\Debug" instead of "\Release" if a Debug build is set for the selected publish profile).
Furthermore, VS2013 by default does NOT check timestamp differences on the targeted web server if you are using FTP publishing, at least in my experience. As for the other publishing options, like Web Deploy, I don't know yet. But the differences you experience seem like normal behavior to me, as you run builds on different machines and publish files from different machines as well. So the timestamps are likely to differ... which, again, marks files as 'changed' and gets them published.
As I'm curious how this question should be solved, this is what I was looking at in the first place:
Maybe a TFS build server is an option for you, configured with a rolling build. But I read in a specific SO answer (sorry, I can't add more links at the moment as I've just registered) a suggestion to do clean builds to prevent new problems. And that would force full publishing again, as all files are changed by the clean build... so that won't work, I think.
As an answer, you might want to use these FAQ answers on MSDN for web deployments. See the questions:
"Can I exclude specific files or folders from deployment?" .NET 4.0/4.5 and/or
"How to make Web Deploy use file checksums instead of dates to
determine which files were changed?" .NET 4.5 only!
The first option is to exclude files.
The second option is to use a checksum to compare files instead of timestamps, but that could slow the publish process somewhat, as the FAQ says. Note the first few lines on that FAQ page about how you can edit your publishing profile to apply one or both of these elements!
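For the first (exclusion) option, a hedged example of what those properties can look like in the .pubxml; the file and folder names here are made up, and anything you exclude obviously has to reach the server some other way:
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <ExcludeFilesFromDeployment>ThirdParty.dll;ReadMe.txt</ExcludeFilesFromDeployment>
    <ExcludeFoldersFromDeployment>Vendor\Libs</ExcludeFoldersFromDeployment>
  </PropertyGroup>
</Project>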
It is also an option to put the third-party DLLs in a separate project and then include that project in the deployment only for a certain solution configuration (VS2013 menu - Build - Configuration Manager; see the checkboxes in the 'Deploy' column there for every project). Though I'm not sure this is part of the VS2013 'publish' feature as a web deployment, because the Deploy checkboxes are greyed out for my solution's projects for some reason I don't understand yet... so I can't test it to verify this option.
Though it sounds obvious: don't forget to create backups/copies/screenshots before you change any settings or publishing profiles, and then change the same settings/configuration files on the other machines you and your colleagues work on.

source code location for debugging multiple instances of an application

I have an application running separately (one instance per customer) in different folders, one per customer.
Each customer is a separate user on my machine.
At the moment I have the source code in each of these folders, where I rebuild the code for each instance. Would it be better if I did something like the following?
create a shared folder where I build the code
deploy the binary to each user's folder.
allow each user permission to access the source code in READ-ONLY mode.
when it is time to debug, running gdb in each user's folder will be able to read the source code and debugging will work.
Do you think this would be a better approach, or are there better practices?
My only concern is that each user would have the chance to read the source code, but since users do not access their folders directly (they are under my control), this should not trouble me.
I am using CentOS 6.4, SVN and G++/GDB.
in different folders
There are no "folders" on UNIX, they are called directories.
I rebuild the code per each instance
Why would you do that?
Is the code identical (it sounds like it is)? If so, build the application once. There is no reason at all to have multiple copies of the resulting binary, or the sources.
If you make the directory with sources and binaries world-readable, then every user will be able to debug it independently.
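If it helps, here is a minimal sketch of that setup on the shell (all paths and the build command are made-up placeholders):
# Build once in a shared, world-readable location:
cd /srv/shared/app-src && make
chmod -R a+rX /srv/shared/app-src

# Deploy only the binary into each customer's directory:
cp /srv/shared/app-src/app /home/customer1/app
cp /srv/shared/app-src/app /home/customer2/app

# Debugging as a customer user; point gdb at the shared sources explicitly
# if the paths baked into the debug info don't match:
gdb --directory=/srv/shared/app-src /home/customer1/app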

How to add a lot of resource files to a monotouch project?

I have a lot of PNG images in a directory. I've added them to the project as Content / Copy if newer. I can load them from the app without problems.
But the project needs a lot of time to compile. If I make a little change in the code, the whole project is rebuilt again. It takes a lot of time.
I've tried adding another project and putting the files in the new project, but then I cannot access the files from the app.
Is there any solution?
Of course, when I debug the app on the iPad, the upload + install takes a lot of time. These files will never change, so... is there any method to copy all the content just ONE time?
Thanks
I have just discovered a trick. It seems that MonoTouch does not remove directories when you upload and install from the MonoDevelop environment, so:
Add your folders and all the files and mark them as Content
Build your project for iPhone/iPad and Run it from MonoDevelop
Remove your data folders from your project
Clean the solution
Make any changes you need in your code; your data remains on the device!!!
That changes everything!!! Before this, when I needed to make a minor change in my code, I had to wait about 15 minutes for building and uploading. Now it's just 1 minute!!!
1. Place your images in a separate class library
2. Mark all your files as embedded resources
3. Add a logical name to each resource (in your project file)
<EmbeddedResource Include="Images\Folder\Filename.ext">
  <LogicalName>LogicalNameForImage</LogicalName>
</EmbeddedResource>
4. Load the resource as
UIImage.FromResource(yourAssembly, "LogicalNameForImage");
Embedded resources are loaded on demand, not when the assembly loads.
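As a rough usage sketch (SomeTypeInImageLibrary is a hypothetical public type in the class library that holds the images, used only to get hold of its assembly):
// Hypothetical usage; replace SomeTypeInImageLibrary with any type from the image library.
var imageAssembly = typeof(SomeTypeInImageLibrary).Assembly;
UIImage image = UIImage.FromResource(imageAssembly, "LogicalNameForImage");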
In a future version of MonoDevelop (my patch didn't make it in time for the upcoming 4.0 release), this won't be an issue any longer.
What currently happens in MonoDevelop 3.x is that when building a project, it only copies the images that have changed into the app bundle. However, after building, MonoDevelop invokes a script installed along with Xcode called iphone-optimize, which scans the entire app directory and uses pngcrush to crush all of the images (it also converts all plist files into binary plists). This is the step that causes such slow build times if you have a lot of images.
Just after the 4.0 branch closed for QA, I wrote a patch that avoids the need to invoke the iphone-optimize script. Instead, MonoDevelop will directly invoke pngcrush on only the images that have changed, passing the proper app directory location as the output argument to pngcrush so that an additional file copy is avoided.
From my own testing, this makes a massive improvement to build times for projects with a lot of image files.
In the meantime, what you could do is make a backup of the iphone-optimize script (it should be located somewhere under /Applications/Xcode.app) and then modify it so that it does not crush image files. Once you've done that, go and pre-crush all of the PNG files in your project.
(Note: when the MonoDevelop with my patch finally ships, it'll also have an option to disable calling pngcrush for developers who have already pre-crushed their images).

How can I clear out my CruiseControl.Net working directories after a build?

I want to clear out the working directory in a CruiseControl.NET build after the site has been deployed because space is an issue and there's no requirement to keep it.
The way things are set up at the moment, everything is on one machine (that's unlikely to change), which acts as the Mercurial repository server, the testing web server and the CruiseControl.NET build server.
So in C:\Repositories\ and C:\inetpub\wwwroot\ we have a folder per website. Also, in C:\CCNet\Projects we have a folder per website per type of build (Test and Live) - so that means we've got at least 4 copies of each website on the server, and at around 100 MB per site x 100 sites, that adds up to a lot of disk space.
What I thought I would like to do is simply delete the working directory on a successful build - it only takes 5-10 seconds to get a completely fresh copy (one small advantage of the build server being the same machine as the hg server) - and only keep a handful of relatively active projects current. Of the 100 or so sites, we'll probably work on no more than 10 in a week (team of 5).
I have experimented with a task that runs cmd.exe with del /s /q on the working directory folder. Sometimes this completes successfully; other times it fails with the message "The process cannot access the file because it is being used by another process". When it does complete OK, the build kicks off again, presumably because the working directory is not found and needs to be recreated, so I'm finding myself in a never-ending loop there.
Are there any ways I can reduce the amount of space required to run these builds or do I need to put together a business case for increasing hosting costs for our servers?
You need to create your own CCNet task and build the logic into it; a rough sketch follows after these steps.
Create a new project called ccnet.[pluginname].plugin.
Use the artifact cleanup task source as a base to get going quickly.
Change the base directory from result.ArtifactDirectory to whatever you need it to be.
Compile, then copy \bin\Debug\ccnet.[pluginname].plugin.dll to C:\Program Files\CruiseControl.NET\server (or wherever CCNet is installed).
Restart the service and you should be able to use your task in much the same way as the artifact cleanup task.
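As a minimal, untested sketch of what such a plugin task could look like (the class, property and reflector names here are illustrative; the ITask/NetReflector pattern shown is the one the built-in tasks use):
using ThoughtWorks.CruiseControl.Core;
using ThoughtWorks.CruiseControl.Remote;
using Exortech.NetReflector;

// Hypothetical cleanup task; would be configured in ccnet.config as e.g.
// <workingDirectoryCleanup workingDirectory="C:\CCNet\Projects\MySite\WorkingDirectory" />
[ReflectorType("workingDirectoryCleanup")]
public class WorkingDirectoryCleanupTask : ITask
{
    [ReflectorProperty("workingDirectory")]
    public string WorkingDirectory { get; set; }

    public void Run(IIntegrationResult result)
    {
        // Only remove the working copy after a successful build,
        // so a failed build can still be inspected.
        if (result.Status == IntegrationStatus.Success &&
            System.IO.Directory.Exists(WorkingDirectory))
        {
            System.IO.Directory.Delete(WorkingDirectory, true);
        }
    }
}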
