TeamCity path to external reference assemblies - reference

I have been setting up TeamCity, and I have almost everything working with the exception of compiling VS2005 solutions that reference assemblies outside of the solution path. Our SVN repository is structured as follows:
Root
    Libraries
    Project 1
        Trunk
    Project 2
        Trunk
Project 1 and Project 2 reference third-party assemblies located in Libraries. This works just fine from within the VS2005 IDE and when calling MSBuild on the solution files, since the HintPath for each of the references looks like this:
..\..\..\Libraries\ThirdParty.dll
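In the csproj, such a reference looks something like this (assembly name illustrative):

<Reference Include="ThirdParty">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>..\..\..\Libraries\ThirdParty.dll</HintPath>
</Reference>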
The problem I have encountered is that when TeamCity does the checkout from SVN for Project 1 or Project 2, it places everything into internal directories that don't match the structure of the relative path given by the HintPath.
How do I go about fixing this, either through a TeamCity configuration or by restructuring my solutions/directories? Either approach will work for my needs.
Thanks!

If you create a separate VCS root for Libraries, you can use checkout rules to control where the files are placed in the directory structure so that it matches the structure on your local machine.
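For example, with the structure above, the Libraries root would get a checkout rule like

+:. => Libraries

and the Project 1 root would get

+:. => Project 1/Trunk

so the agent's checkout directory mirrors your local Root layout and the relative HintPath resolves again.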

We set up a network directory with all our third-party DLLs, then mapped the directory to a drive.
That way the DLLs weren't part of our solutions, and all projects just reference z:\3rdParty\example.dll to get the assemblies.
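The mapping itself is a one-time step per machine, something like this (server and share names hypothetical):

net use Z: \\fileserver\3rdParty /persistent:yes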
Someone else on my team actually set up our TeamCity, so I could be completely mistaken about how the problem was actually fixed, or whether we even had that problem initially :)

What I have done is set the VCS root of the project to the top-level directory ("Root" as per your project structure) and detach the default project VCS root created by TeamCity. After this you can create a custom build step of type "Visual Studio (sln)" and point "Solution file path" at your solution. Now it properly handles library references.
There is a drawback here: since the VCS root is at the top level, even unrelated check-ins can trigger your project to build, which may not be suitable for time-consuming builds. I don't have a workaround for that yet.

Related

How to make gradle.properties machine independent

I'm working on 2 different machines (home vs. work) and transfer the code via GitHub, which works nicely, but I just ran into a machine dependency when I added this code to the gradle.properties file to fix a vexing OAuth issue for Google Sheets:
org.gradle.java.home=C:\Program Files\Java\jdk1.8.0_131
org.gradle.java.home=C:\Program Files\Java\jdk1.8.0_77
Now I have to toggle between the 2 lines to get Gradle to compile. I need to check whether I still need it at all (since I got the keystore files etc. sorted out), but I also wonder whether there is an easy way to make this work on both machines (e.g. something like an ifdef).
Obviously, I could just change the directory name on one of the machines, but I'm still curious how to solve this within Studio.
Let's start with a quote from the Gradle docs:
org.gradle.java.home
Specifies the Java home for the Gradle build process. The value can be set to either a jdk or jre location, however, depending on what your build does, jdk is safer. A reasonable default is used if the setting is unspecified.
So, by default, you should not need this project property (that's what they are called in Gradle).
However, there can be reasons that you need to specify the Java directory. For this specific project property, you can follow Ray Tayek's advice and use the JAVA_HOME environment variable (on both systems). But there is also another approach, which works for any project property (and also for so-called system properties):
gradle.properties files can be located at different locations in the file system. Yours is located in the project directory and is therefore included in your VCS; use it for project-related properties. An additional location is the Gradle user home directory, which is by default the .gradle folder in your personal folder. That folder is not under version control, so simply define the machine-specific property there.
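For example, on the work machine the entry would go into the user-level file instead of the project one (path and JDK version taken from the question; forward slashes sidestep the backslash-escaping rules of .properties files):

# in %USERPROFILE%\.gradle\gradle.properties on the work machine
org.gradle.java.home=C:/Program Files/Java/jdk1.8.0_131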
Try removing the line from the properties file. If that fails, try setting JAVA_HOME on each machine.
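On Windows that is a one-time command per machine, e.g.:

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_131"

(setx only affects newly started processes, so restart the IDE or terminal afterwards.)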
There are a lot of related questions.
You might try asking on the Gradle forums.

How should I go about using a temporarily changed copy of a DLL locally when it's been checked in to TFS?

We have a Libraries folder where we keep third-party DLLs and our own utility DLLs for all applications to reference. I want to do development against one of our utility DLLs and an application that consumes it at the same time. But if I check out the library DLL to change it for temporary local use, TFS insists on checking it out exclusively, which trips other people up. I understand the reasoning behind it doing that (hard/impossible to merge a DLL, so two people shouldn't be working on one at the same time), but I just want to mess with my local copy while I'm working on the library it represents.
I suppose I could delete my application's reference to the DLL and recreate the reference pointing to some other place, but of course this just begs for me to forget and check it in like that, which would obviously be bad. Not to mention that this is a pain in the neck.
How should I proceed in such a situation?
You are using a server workspace, which does not allow editing outside of TFS. In TFS 2012, local workspaces were introduced; they do not set the read-only flag on files, and you are free to edit at will.
You can change your existing workspace in a few clicks: http://msdn.microsoft.com/en-us/library/bb892960.aspx
You could just go into the file system and mark the file as writable. Once you are happy the binary is good, you could check it out, copy the new version of the file over, and check it back in again. TFS locks binary files like this for good reason: you can't merge them the way you can textual content.
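Clearing the read-only flag that TFS sets is a one-liner from a command prompt (file name hypothetical):

attrib -r OurUtilities.dll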
The best approach would be to use a NuGet repository to manage your binary dependencies, instead of relying on binaries checked into source control.
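For example, each consuming application would restore the library from a feed instead of from source control; with the nuget.exe command line that is something like this (package and feed names hypothetical):

nuget.exe install Our.Utilities -Source https://nuget.example.com/feed -OutputDirectory packages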

cspack behaviour differs from msbuild

Using Visual Studio 2012 and Azure SDK 2.1, I am trying to figure out the best way to create the csx folder for running in the Azure emulator. My understanding is that the csx folder is not created until I package the Azure project. I can create a package manually from Visual Studio, but this is not an option for an automated build. The other option is to create the package using the msbuild command line. This seems a bit heavy-handed, as it will actually do a build, which is more time-consuming than just repackaging.
So, I thought that cspack might be a more lightweight option. However, when I call cspack with the following command line:
cspack.exe ServiceDefinition.csdef /copyOnly
I get the error: Need to specify the physical directory for the virtual path 'Web/' of role MyProjWeb.
But, I don't do anything like that when using msbuild. I have read a bunch of things about specifying the physical directory and some of the confusion that it can cause. So, I would prefer not to use it unless absolutely necessary, especially since I don't need to specify this when building from msbuild.
So, my main question is what is msbuild doing that cspack is not doing and how do I do the same with cspack?
My other question is: what is the easiest way to generate the csx folder for testing in the Azure emulator?
Edit - Resolution
I thought I would put down how I resolved this here in case it helps someone else. The big answer to my question (thanks to Chandermani and some other reading) is that CSPack with /copyOnly is basically a fancy xcopy to a folder structure according to some rules, and without /copyOnly it also does a fancy zip to create a package. Not complaining, it is fine that it is simple, but it is good to know this at the outset. You can use it to package anything for Azure; it is not tied to what can be built in Visual Studio, e.g. a PHP site. Using msbuild has the added benefit of only copying the files that are part of your web site deployment.
What I found when I got CSPack working and pointed it at the MVC project folder is that it copied everything, including source files, which is not what I wanted. The solution I could find is to first package the web site and then point CSPack at the packaged files. If you go down this path then this link is very valuable, as it describes it step by step.
So, it was either an msbuild post-step in the Web project to package the files plus a post-step in my Azure project to cspack it, or a single msbuild post-step in my Azure project to create the package (which runs cspack and keeps the benefit of only including my web deployment files). It seemed simpler and less error-prone to have the one post-step and let msbuild do the heavy lifting. So, the post-step in my Azure project is something like:
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devfabric:shutdown > NUL
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devstore:shutdown > NUL
if $(ConfigurationName) == Debug set CONSTANTSPARAMETER=DEBUG
if $(ConfigurationName) == Release set CONSTANTSPARAMETER=
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe $(ProjectDir)$(ProjectFileName) /t:clean;publish /p:Configuration=$(ConfigurationName) /p:TargetProfile=cloud /p:OutputPath=bin\Cloud$(ConfigurationName) /p:VisualStudioVersion=11.0 /p:overwritereadonlyfiles=true /p:DefineConstants="%CONSTANTSPARAMETER%" /verbosity:minimal /p:PostBuildEvent=
The first two lines shut down the compute and storage emulator.
The next two lines set the preprocessor constants. I found that #if DEBUG no longer took effect when building via the msbuild line; I think this is a safety measure whereby DEBUG is stripped when creating a package. I only ever use the package created by an automated build system, so it is safe for me to keep the DEBUG constant.
The actual msbuild line has a number of switches. I'll describe the unusual ones:
/p:PostBuildEvent=
If we don't set the postBuildEvent to empty then the same post step will keep getting called forever. And ever...
/p:VisualStudioVersion=11.0
Those clever guys at Microsoft made it possible to open projects with both Visual Studio 2010 and 2012. Which is great, but can bring great sadness when you run msbuild from the command line and end up with nasty MSB4019 error messages because it is looking in the wrong Visual Studio folder for the Azure tools.
Also, note that I use the cloud profile. Since I am only after the csx files, it doesn't seem to make a difference whether I use local or cloud at this point. When I run in the emulator I specify ServiceConfiguration.Local.cscfg.
Edit: In the end I took this out of the post-step and put it in my automated build. My original intention was that running the tests from my dev machine would be the same as my automated build, but the post-step took too long, and the views were sourced from the obj folder rather than the proj folder when running under the debugger, which meant I had to copy files across when making changes on the fly.
Unanswered questions
It would still be good to understand how msbuild does things, to reduce knowledge friction when dabbling in this area. Does it create a package for the website and pass it to CSPack? Or does it parse the project files and then pass some crazy arguments to CSPack? Also, when you run an Azure project in the debugger, it runs in the emulator with only the binaries in the csx folder (not the images, etc.). How does it do that? It would be great to see a description, with pictures, of the Azure build pipeline that showed the lifecycle all the way to deployment. That might also explain why there are two copies of the binaries. Also, this would have been a whole lot easier if Visual Studio had a project flag like packageOnBuild for the Azure project, with options to do a copyOnly or to create a package. I see no point in uneaten cake. Edit: There is a DeployOnBuild setting that can be added to csproj.
Finally, as I mentioned the whole purpose of this is to get a csx folder that I can point the emulator at so that I can run my unit tests on my dev machine. I do the formal packaging on a build machine so don't really need it in Visual Studio. So, really I don't want to package anything and was hoping that there was an easier way of achieving all this.
Since msbuild uses the Azure project file to perform the build, it can derive a lot of information from the project file.
For cspack, the assumption is that the role code has already been compiled and is available for packaging. Since cspack does not depend on the project file, it needs explicit information about the code path of the web/worker role project; the csdef file does not contain any such information. If you want to use cspack, I suggest looking at its documentation and trying to create a package for emulator deployment from the command line (the CopyOnly option). Once you find the correct syntax, you can embed it in your build script.
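As a sketch of that syntax, the physical directories cspack is complaining about are supplied with the /role and /sites switches, along these lines (role name from the error message, directory names hypothetical):

cspack ServiceDefinition.csdef /role:MyProjWeb;MyProjWeb\bin /sites:MyProjWeb;Web;MyProjWeb\publish /copyOnly /out:MyProj.csx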

Missing dll when deploying ClickOnce

When I publish a ClickOnce application, one of the references that is included in one of my projects is missing.
If I go to my project's Properties -> Application Files, this missing reference is not even listed here.
My bin/Release folder has an .exe.manifest file, and I noticed that the DLL is also missing from there.
However, when I build the project, the DLL is in fact copied to my bin/Release folder.
How can I ensure it also deploys this required dependency?
I finally found a solution for this problem and I hope it will solve your problem too.
In my case, I'm editing an old application at work which has multiple projects, but the main project and its back-end project are the most important here.
The back end is added in the References section of the main project.
In the back end, a third-party DLL was imported, but this DLL requires 2 other DLLs.
So those 3 DLLs were added in the References section of the back-end project.
At that point, one of the 2 other DLLs was not showing in the Application Files section for ClickOnce.
I came up with a couple of ways of fixing it, but the most elegant one was to add that DLL to the References section of the main project as well.
As stated in How to: Specify Which Files Are Published by ClickOnce, change the Copy Local property value on the reference to True.
References to assemblies (.dll files) are designated as follows when you add the reference: If Copy Local is False, it is marked by default as a prerequisite assembly (Prerequisite (Auto)) that must be present in the GAC before the application is installed. If Copy Local is True, the assembly is marked by default as an application assembly (Include (Auto)) and will be copied into the application folder at installation. A COM reference will appear in the Application Files dialog box (as an .ocx file) only if its Isolated property is set to True. By default, it will be included.
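For reference, Copy Local corresponds to the Private element on the reference in the project file, so the fix looks something like this in the csproj (assembly name hypothetical):

<Reference Include="ThirdParty.Helper">
  <Private>True</Private>
</Reference>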
I know this is an old question, but for anybody having similar issues, I think this is a cleaner way around the problem.
I had a similar issue and everything I did to get ClickOnce to deploy with the offending .dll failed.
Eventually, I had to deploy manually.
See walk-through here.
That worked perfectly for me. But, for the life of me, I still wonder why that process can't be added to VS (I'm using 2017 Community).

Visual Studio: Automatic COM registration with dependant DLLs outside Debug/Release dirs

I've written some unmanaged C++ COM DLLs that rely on native C++ DLLs not in the system path. When I build the associated projects without copying the DLLs into the Debug/Release directories I get the infamous PRJ0050 compiler error.
Clearly I could copy the required DLLs all around the solution, but I'd like to avoid this. I know I could set the project property Linker -> "Register Output" to No and then run regsvr32 directly during a post-build step.
My question is whether there's a better way to do this. Is there a way to use the automatic "Register Output" option with a custom path controlled at the project level? What am I missing here?
Edit: Originally I'd been thinking "Register Output" did some magic like un-registering on a clean, but that isn't the case. The only thing special it seems to do is pick out the proper way to register different types of projects.
I am not sure exactly what you are asking, but there are post-build steps you can use. For example, if these are third-party libraries/DLLs, you can have them located at a known relative path or in a directory named by an environment variable.
This is not an unusual scenario, from what I can tell of your situation.
You can add a DLL as part of the project (wherever it is located) and make the build step for it do the registration, or make the build step a file copy plus registration, as sketched below.
Again, I am not exactly sure what you are asking, or why the approach in your 2nd paragraph is not acceptable to you if it works.
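A minimal post-build sketch of that copy-plus-register idea, using the standard VS macros (the Dependencies folder is hypothetical):

copy /Y "$(SolutionDir)Dependencies\*.dll" "$(OutDir)"
regsvr32 /s "$(TargetPath)"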
