The CruiseControl.NET service needs to be restarted to pick up changes in the project configuration files.
I find this very annoying, and I'm not sure if it's a bug or just the way it works.
Has anyone found a way to overcome this?
If your projects are defined in a separate file from ccnet.config, then you need to restart the service unless you touch ccnet.config itself.
We use an ENTITY with a SYSTEM file reference in ccnet.config for our projects, so we're in the same boat. I'm happy to pay the price for easier project maintenance, as it's easy to script a restart:
net stop CCService
net start CCService
IISRESET
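For reference, the ENTITY approach looks roughly like this at the top of ccnet.config (the file name is illustrative; the referenced file holds the actual <project> definition):

<!DOCTYPE cruisecontrol [
    <!ENTITY project1 SYSTEM "file:project1.xml">
]>
<cruisecontrol>
    &project1;
</cruisecontrol>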
If you wanted to automate this completely and had your project files under source control, you could trigger an update and restart whenever they are touched.
There was a bug in CC.NET prior to 1.4.4: if you were using a pre-processor include, the configuration was not reloaded when an included ccnet.config file was modified.
I reported that bug, and it is fixed in CC.NET 1.4.4 and later.
Also, keep in mind that if a build is running when the configuration changes, the change will not take effect until that build has finished and the project is idle.
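For anyone on 1.4.4 or later, a pre-processor include looks like this (the path is illustrative):

<cruisecontrol xmlns:cb="urn:ccnet.config.builder">
    <cb:include href="projects/project1.xml"/>
</cruisecontrol>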
How are you updating your config files? By hand? Mine always recognizes changes and adjusts. Is your config file in source control, set up so the build pulls it down and replaces the file? For me, that requires a kick. I fixed it by having my project pull the file down to a separate folder. Then I call ccnet.exe -validate on it to make sure it is well formed, and then I copy it over the top of the current config file. CC.NET recognizes the changes and loads the new config.
Exceptions: if CC.NET is currently running a project, it will not recognize the changes until that project has completed.
If your ccnet.config has errors, CC.NET will never recognize the changes and will keep running the old version it has in memory. (However, when CC.NET does restart, it will try to parse the error-filled config and choke.)
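A minimal batch sketch of that pull-validate-copy flow (the staging and install paths are placeholders, and the assumption that -validate sets a non-zero errorlevel on failure is mine):

rem validate the freshly pulled config, then copy it over the live one
ccnet.exe -validate -config:C:\ccnet-staging\ccnet.config
if %errorlevel% equ 0 copy /y C:\ccnet-staging\ccnet.config "C:\Program Files\CruiseControl.NET\server\ccnet.config"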
Hope this helps!!
Do you mean you are using linked files, i.e. the ccnet.config file has links to the independent project files?
If so, then they are not picked up; it's mentioned in the documentation that CC.NET doesn't watch the sub-files.
Internally we have modified our CruiseControl.NET so that ccnet.config can optionally be a directory, and we can drop shortcuts to our project config files into that directory. We put watches on the directory, on the files or shortcuts in the directory, and on all of the targets of the shortcuts. That means we keep our project config files in ClearCase and just drop a shortcut into the ccnet.config directory.
I've just spent half a day or so moving from 1.2 to 1.4.2, porting our changes into the new version for our internal use. We don't own our code, our client does, so it has to stay internal :(
I have never experienced this. Whenever I change the configuration files, the CruiseControl.NET service seems to automatically re-read them.
I'm using Version 1.3 of CC.NET.
Update:
In the service's config file (ccservice.exe.config), there is a setting to enable/disable watching the ccnet.config file for changes:
<add key="WatchConfigFile" value="true"/>
Make sure this is set to true.
I'm working on 2 different machines (home vs. work) and transfer the code via GitHub, which works nicely, but I just ran into a machine dependency when I added this code to the gradle.properties file to fix a vexing OAuth issue for Google Sheets:
org.gradle.java.home=C:\Program Files\Java\jdk1.8.0_131
org.gradle.java.home=C:\Program Files\Java\jdk1.8.0_77
Now I have to toggle between the two lines to get Gradle to compile. I need to check whether I still need it at all (since I got the keystore files etc. sorted out), but I also wonder whether there is an easy way to make this work on both machines (e.g. something like an ifdef).
Obviously, I could just change the directory name on one of the machines, but I'm still curious how to solve this within Studio.
Let's start with a quote from the Gradle docs:
org.gradle.java.home
Specifies the Java home for the Gradle build process. The value can be set to either a jdk or jre location, however, depending on what your build does, jdk is safer. A reasonable default is used if the setting is unspecified.
So, by default, you should not need this project property (that's what these are called in Gradle).
However, there can be reasons that you need to specify the Java directory. For this specific project property, you can follow Ray Tayek's advice and use the JAVA_HOME environment variable (on both systems). But there is also another approach, which works for any project property (and also for so-called system properties):
gradle.properties files can live at different locations in the file system. Your file is in the project directory and is therefore included in your VCS; use it for project-related properties. An additional location is the Gradle user home directory, which by default is the .gradle folder in your personal folder. That folder is not under version control, so simply define the machine-specific property there.
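For example, you would delete the line from the project's gradle.properties and give each machine its own copy in the Gradle user home (forward slashes avoid properties-file escaping; the path is whatever JDK that machine has):

# %USERPROFILE%/.gradle/gradle.properties on the work machine
org.gradle.java.home=C:/Program Files/Java/jdk1.8.0_131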
Try removing the line from the properties file. If that fails, try setting JAVA_HOME on each machine.
There are a lot of related questions.
You might try asking on the Gradle forums.
I have a home machine and an office machine that I use to publish websites with Visual Studio 2013. If I make a change and re-publish from the same machine, just the changes are published, not all files.
However, when I use my clone machine at the office, even if I do a get-latest, make one small change, and re-publish, all files are published, not just the ones that changed or were recompiled. ALL DLL files, even third-party DLLs that have not changed and have not been recompiled with a new date, are republished. The same thing happens if my colleague publishes a small change from his machine after I did the last publish. It is not a problem when publishing twice from the same machine, as then only the changed files are published.
Is there any way to prevent a complete republish just because a different machine is used to publish than the one used for the last publish? Thanks.
This seems to make "Determining Changes" a lot slower, but for .NET 4.5 and up(?), use this info from https://msdn.microsoft.com/en-us/library/ee942158:
To configure Web Deploy to use checksums instead of dates to determine which files need to be copied to the server, add the following element to the .pubxml file (Publish Settings):
<MSDeployUseChecksum>true</MSDeployUseChecksum>
Like this:
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <MSDeployUseChecksum>true</MSDeployUseChecksum>
        <!-- other settings omitted to keep the example short -->
        <PublishDatabaseSettings>
            <!-- this section omitted to keep the example short -->
        </PublishDatabaseSettings>
    </PropertyGroup>
</Project>
First of all, I do not completely understand the behavior of MSBuild + VS2013 and the publishing feature, as I have only just started using the publishing feature myself. I'm looking for a way to speed up publishing via FTP web deploy in VS2013. (I'm not using TFS for a get-latest, though.)
I would say this is partially explained in a different context in this SO question about the MSBuild process: timestamps of certain files are compared, which tells MSBuild/VS2013 whether a target (build output) is up to date or not. Files that are not up to date are then recompiled.
As you all work on different machines, timestamps are likely to differ quite soon.
To find out what is actually going on during builds/publishes, temporarily set the build output verbosity to detailed or diagnostic:
VS2013 menu - Tools - Options... - Projects and Solutions - Build and Run - MSBuild project build output verbosity - set to detailed or diagnostic. Run the build and check the Output panel/tab in VS2013, selecting "Show output from: Build" if the results are not already visible. Don't forget to restore the original setting after checking the build details, as the extra logging can slow the build down a bit.
But why are even unchanged third-party DLLs being republished? Possibly because those DLLs ARE actually overwritten during a build. You might have the assembly reference property "Copy Local" set to true so that your website runs without any manual uploads. Or you are using a command-line copy with the overwrite parameter explicitly set during the project's post-build event (like 'copy /y ...' or 'xcopy /y ...'). In that case the timestamp of the file to be published is overwritten for sure; check the "obj\Release\Package\PackageTmp" folder (or, for example, "\Debug" instead of "\Release" if the selected publish profile uses a Debug build).
Furthermore, by default VS2013 does NOT check timestamp differences on the target web server when you use FTP publishing, at least in my experience. As for the other publishing options, like Web Deploy, I don't know yet. But the differences you experience seem like normal behavior to me: you run builds and publish files from different machines, so timestamps are likely to differ... which, again, marks files as 'changed' and gets them published.
As I'm curious how this question should be solved, here is what I was looking at in the first place:
Maybe a TFS build server is an option for you, configured with a rolling build. But I read on a specific SO question (sorry, I can't add more links at this moment as I've just registered) a suggestion to do clean builds to prevent new problems. That would force full publishing again, as the clean build changes all files... so that won't work, I think.
As an answer, you might want to use these FAQ answers on MSDN for web deployments. See the questions:
"Can I exclude specific files or folders from deployment?" .NET 4.0/4.5 and/or
"How to make Web Deploy use file checksums instead of dates to
determine which files were changed?" .NET 4.5 only!
The first option is to exclude files.
The second option is to use a checksum instead of timestamps to compare files, though as the FAQ says, that could slow the build(?) process somewhat. Note the first few lines on that FAQ page about how to edit your publishing profile to apply one or both of these elements!
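For the first (exclude) option, the FAQ shows pubxml properties along these lines; the file and folder names here are placeholders:

<PropertyGroup>
    <ExcludeFilesFromDeployment>ReadMe.txt;ClientTestResults.txt</ExcludeFilesFromDeployment>
    <ExcludeFoldersFromDeployment>TestScripts</ExcludeFoldersFromDeployment>
</PropertyGroup>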
It is also an option to put the third-party DLLs in a different project, which you could then include in the deploy only for a certain solution configuration (VS2013 menu - Build - Configuration Manager; see the checkboxes in the 'Deploy' column there for every project. I'm not sure whether this plays a part in the VS2013 'publish' feature as a web deployment, because the Deploy column checkboxes are greyed out for my solution's projects for some reason I don't understand yet... so I can't test it to verify this option.)
Though it may sound obvious: don't forget to make backups/copies/screenshots before you change any settings or publishing profiles, and then change the same settings/configuration files on the other machines you and your colleagues work on.
I have a lot of PNG images in a directory. I've added them to the project as Content / Copy if newer. I can load them from the app without problems.
But the project needs a lot of time to compile. If I make a little change in the code, the project recompiles everything, and that takes a long time.
I've tried adding another project and moving the files to the new project, but then I cannot access the files from the app.
Is there any solution?
Of course, when I debug the app on the iPad, the upload+install takes a lot of time. These files will never change, so... is there any way to copy all the content just ONE time?
Thanks
I have just discovered a trick. It seems that MonoTouch does not remove directories when you upload and install from the MonoDevelop environment, so:
1. Add your folders and all the files and mark them as Content.
2. Build your project for iPhone/iPad and run it from MonoDevelop.
3. Remove your data folders from your project.
4. Clean the solution.
5. Make any changes you need in your code; your data remains on the device!
That changes everything! Before this, when I needed to make a minor change in my code, I had to wait about 15 minutes for building and uploading. Now it's just 1 minute!
1. Place your images in a separate class library.
2. Mark all your files as embedded resources.
3. Add a logical name to each resource (in your project file):
<EmbeddedResource Include="Images\Folder\Filename.ext">
    <LogicalName>LogicalNameForImage</LogicalName>
</EmbeddedResource>
4. Load the resource as:
UIImage.FromResource(yourAssembly, "LogicalNameForImage");
Embedded resources are loaded on demand, not when the assembly loads.
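A fuller sketch of step 4, assuming the images live in a class library called ImageLibrary (the type used to locate the assembly is illustrative; any type defined in that library works):

// grab the class library's assembly via one of its types, then load by logical name
var assembly = typeof(ImageLibrary.ImageMarker).Assembly;
UIImage image = UIImage.FromResource(assembly, "LogicalNameForImage");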
In a future version of MonoDevelop (my patch didn't make it in time for the upcoming 4.0 release), this won't be an issue any longer.
What currently happens in MonoDevelop 3.x is that when building a project, only the images that have changed are copied into the app bundle. However, after building, MonoDevelop invokes a script installed along with Xcode called iphone-optimize, which scans the entire app directory and uses pngcrush to crush all of the images (it also converts all plist files into binary plists). This is the step that causes such slow build times if you have a lot of images.
Just after the 4.0 branch closed for QA, I wrote a patch that avoids invoking the iphone-optimize script. Instead, MonoDevelop will directly invoke pngcrush on only the images that have changed, passing the proper app directory location as the output argument to pngcrush so that an additional file copy is avoided.
From my own testing, this makes a massive improvement to build times for projects with a lot of image files.
In the meantime, what you could do is make a backup of the iphone-optimize script (it should be located somewhere under /Applications/Xcode.app) and then modify it to not crush image files. Once you've done that, go and pre-crush all of the PNG files in your project.
(Note: when the MonoDevelop release with my patch finally ships, it will also have an option to disable calling pngcrush for developers who have already pre-crushed their images.)
Where and how are Hudson jobs and slave information stored?
I accidentally canceled a Hudson upgrade today. It wouldn't let me continue the upgrade, only downgrade to the previous version and then upgrade again. After I downgraded, the two jobs I had created recently were gone from the dashboard, along with the slave node I had created for one of those jobs, and a job I had recently deleted showed up in the dashboard again. After the upgrade, the jobs and nodes are in that same state.
What happened? Can I restore my recent jobs and nodes, and how would I do that? Please keep in mind that while I know C/C++ well, web services are out of my area and I don't really know what a jar or a war is... I just followed online directions to install and set up Hudson, and it worked. I'd like to avoid simply re-creating those jobs; setting up one of them was less than trivial.
More info: looking in the configuration, the home directory is incorrect; it thinks HOME is /root/ instead of /home/hudson. How did it change, and how do I change it back?
The previous version of Hudson is 1.379. It's currently running 1.381. I'm running it on RHEL 5.
When I look in the .hudson/jobs directory, both of the recent jobs are there, and the previously deleted job is not there. These job directories are missing their "workspace" directories.
As you've noticed, job configuration is stored in HUDSON_HOME/jobs/[name]/config.xml.
Slave configuration is stored in the main Hudson config file, HUDSON_HOME/config.xml.
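On disk, the layout looks roughly like this (the job name is illustrative):

HUDSON_HOME/
    config.xml              (main config, including slave definitions)
    jobs/
        my-job/
            config.xml      (job configuration)
            builds/         (build history)
            workspace/      (checked-out source)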
I'm not sure why Hudson didn't pick up the jobs when you restarted after the upgrade. Checking the Hudson log might provide a clue, usually /var/log/hudson/hudson.log.
If your jobs' config.xml files are present, Hudson might be able to reread them if you reload your configuration (Manage Hudson -> Reload Configuration from Disk). If Hudson still doesn't recognize them (and the config files are present), your best bet is probably to recreate the jobs manually, grabbing whatever you can from the config files (keeping in mind that XML escapes are applied to text fields like the build commands).
I got a helpful clue when I revisited the "Manage Hudson" page and saw a message that I had data in an old and unreadable format. That suggested Hudson was running a .war different from the one used more recently. So I searched the disk for any hudson.war files and found two: one from a couple of weeks ago, and one from some months ago. The newer one is in the place I expected to find one, and the older one was elsewhere, so I renamed the older one. I also have a start-hudson.sh script; I added 'export HUDSON_HOME=/home/hudson' to that script and used it to restart Hudson. Lo and behold, my new jobs were back and working.
I would have thought that simply setting the HUDSON_HOME variable would have done it, but I did that first, restarted Hudson, and no joy. It was only after I renamed the older .war AND set the environment variable that I found the fix. My guess is that the older .war file had root set as HUDSON_HOME and that somehow that .war was being run, even though the version shown on the page was the current one. I don't understand it, but I'm happy to be back in business.
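For reference, a start script along the lines described would look roughly like this (the java invocation and war location are assumptions, not the poster's actual script):

#!/bin/sh
# start-hudson.sh
export HUDSON_HOME=/home/hudson
java -jar /home/hudson/hudson.war &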
I am looking at the CruiseControl web dashboard. I can see one farm and one server. However, I don't see any way to add a project.
Is this something I can do with the UI or do I need to edit the config file by hand?
You'll need to edit the ccnet.config file by hand (it's located within the CruiseControl directory) to add projects. There are some graphical tools to help you do this; however, you get used to doing it by hand fairly quickly - just keep the documentation nearby!
Update: an example of one such tool is http://www.codeplex.com/ccnetconfig
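To give an idea of what you'll be editing, a skeleton ccnet.config with a single project looks roughly like this (the project name, trigger, and task are illustrative):

<cruisecontrol>
    <project name="MyProject">
        <triggers>
            <intervalTrigger seconds="60" />
        </triggers>
        <tasks>
            <exec>
                <executable>C:\builds\build.bat</executable>
            </exec>
        </tasks>
    </project>
</cruisecontrol>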
You can use CCNetConfig to edit the config file through a UI, although it doesn't support versions later than CruiseControl.NET 1.4.
You basically have to edit the configuration file by hand. However, I have it set up so that the raw config file is split into different include files, each of which is kept in my source control system. Then I created a project for the configuration, and then one for the whole config. So when something changes in the config, CC.NET itself pulls down the changes, recreates its config files, and then refreshes the system configuration.
This means that anyone can edit the config (if they can access the files in source control), and no one has to go into the Program Files directory of the CC.NET machine itself.
Not sure whether this answers the question you asked, but this is how our setup works.