TFS Build DLLs and PDBs don't match - visual-studio-2012

We are using TFS 2012 to build our solution. Once this is done I use the build output to create some NuGet packages which I publish internally. I have just started building these packages with symbols as well so that I can publish these NuGet symbols packages to our internal Symbols Server.
However, I am having trouble publishing the symbols packages to the Symbols Server. The reason is that the DLLs and PDBs don't match. I used ChkMatch and indeed the age property is different on the DLLs and PDBs that sit in the Release directory of the TFS drop folder. If I instead grab the PDB files from the obj folder in the actual build directories, then they match.
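For reference, the comparison I ran was along these lines (the assembly name is a placeholder, and -c is ChkMatch's compare switch, if I recall its options correctly):
ChkMatch.exe -c MyLibrary.dll MyLibrary.pdb
It prints the debug signature (GUID and age) recorded in the DLL next to the one in the PDB, which is where the age difference showed up.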
Now I believe that the age property is getting incremented because my Build Process Template has the property "Source and Symbol Server Settings > Index Sources" set to True.
Is it correct to just set this property to false?
Will there be any unforeseen consequences?
If I am using NuGet to publish my symbols, can I just ignore the Source and Symbol Server Settings in the build process template?

The age discrepancy reported by chkmatch is misleading. As discussed here (bottom comments section), it should not prevent Visual Studio from finding the matching program database (PDB) file and loading the symbols.
I have been struggling with this and thought that the age difference was preventing me from stepping through the source code being indexed. There was another issue at hand and it got me on the wrong track. So, a word of caution regarding the age property difference when using chkmatch to debug such issues.

Related

NuGet allowOverrideExistingPackageOnPush Stops Symbol Packages Being Published

Version 2.8 of NuGet.Server now provides a flag in the web.config called allowOverrideExistingPackageOnPush which allows you to decide whether existing package versions may be overwritten. We decided to set this to false on our internal repository because we didn't want existing packages being accidentally overwritten. However, now when I do a pack and push using the -Symbols flag, the push command successfully publishes the NuGet package, but when it tries to publish the symbols it only uses the first part of the package name, up to the end of the version number. This means that it fails to publish the symbols package because it believes that the package already exists.
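For reference, the setting lives in the server's web.config appSettings and looks something like this (assuming the standard NuGet.Server layout):
<appSettings>
  <add key="allowOverrideExistingPackageOnPush" value="false" />
</appSettings>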
I have yet to find any information or bug reports via googling, and for the package I am writing now I can dispense with symbols, but we need a longer term solution which will allow us to publish symbols and disallow overwriting.
Has anyone encountered this, and if so have you found a way round this (other than not publishing symbols or allowing overwriting)? This has been sapping time which I can't really spare.

Publishing Debug Symbols in TFS Build

Using TFS 2013, it is a simple matter to generate debug symbols as part of the build process by entering a location into the ‘Path to publish symbols’ field of the build definition. Unfortunately I can’t use any of the TFS build environment variables to specify the drop location for the symbols in the ‘Path to publish symbols’ field, because symbol publishing takes place after the build is done and those variables are apparently no longer in scope. So I specified a Debug folder in a fixed location and was going to move it to the desired location with the postbuild script. Even that does not work, because the symbols are not yet present when the postbuild script runs. The order of events is (roughly):
1. Run prebuild script
2. Build
3. Run postbuild script
4. Tests
5. Generate symbols
It looks like this is typically accomplished with yet another server… a Symbols Server. Is that what everyone does?
I notice that the proper location to save the files (for me anyway) can easily be determined using the information in ..\000Admin\server.txt. Using that info I could have the postbuild script wait (say… up to an hour) for the symbols to appear (they should be there in a minute), then move the Debug folder from the fixed location to the proper location. Is there a better way?
Thanks.
The symbol server / symbol share is a separate thing from the drop location. It's structured in a specific way the debugger understands, and it allows one to debug an application without having to ship the .pdb files with the application.
Since you may want to provide other parties access to your symbol server (similar to how Microsoft allows access to their symbol servers for most of the .NET Framework), you can simply tell them the location and optionally the credentials needed to access it.
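For example, a consumer could point their debugger at it with a symbol path that chains a local cache, your share and Microsoft's public server, along the lines of (the share name below is just an example):
srv*C:\SymbolCache*\\tfs\symbols*https://msdl.microsoft.com/download/symbols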
The symbol share is not really meant for human consumption, it's all built up with GUIDs and hashes so that the debugger can find its way around easily and quickly. It's also structured so that multiple versions of the same symbol are stored side-by-side.
Especially that last part, storing different assemblies and different versions side by side in the same location, is why you should not try to inject project names or versions into the symbol share location. That's for the debugger to figure out.
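To make that concrete, an indexed store ends up looking roughly like this (the file names and GUID-plus-age folder names below are made up for illustration; the real ones are generated when symbols are published):
\\tfs\symbols\MyApp.pdb\0F8FAD5BD9CB469FA16570867728950E1\MyApp.pdb
\\tfs\symbols\MyApp.pdb\A1B2C3D4E5F60718293A4B5C6D7E8F902\MyApp.pdb
\\tfs\symbols\000Admin\server.txt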
Just to be clear, it doesn't have to live on a different server; the only thing required is that you enter a path to a share, and it can even be a sub-folder of that share. So sometimes you see configurations like:
\\tfs\symbols\
\\tfs\builds\
Or
\\tfs\artifacts\symbols
\\tfs\artifacts\drops
But indeed, you could drop your symbols to a completely different server altogether:
\\tfs\builds
\\corporate\symbols
Or you could configure multiple distinct computer names for one system (or use multiple DNS records) and actually have the same server listen to:
\\tfs-symbols\share
\\tfs-builds\share
Or even register the shares at the Active Directory level, allowing you to just use
\\tfs-symbols\
\\tfs-builds\
What you choose is all up to you, but make sure that the symbols path and the builds path end up being distinct.

Control creation of metagen files

A C++/CLI project I maintain using VS2012 mysteriously stopped creating a .metagen file for one of its DLLs for one build configuration. The metagen file is still created for other build configs. We use the metagen files, so I need to build them for all configs.
I've tried searching project properties for differences between the broken build config and others, but saw nothing that seemed to have anything to do with metagen files. I've also searched online and found nothing useful about creating or suppressing creation of these files.
How do I turn metagen file creation back on for this dll in this build config?
I just solved that same problem when I understood that Visual Studio was actually driving me to search in the wrong place. Indeed, my release build aborted, and the first error message I saw repeated (filling the whole error window) was unresolved symbols, followed by indications of a missing metagen file...
All this was actually a consequence of a DLL link step failure much earlier in the build, of that same DLL out of which the 'metagen' is possibly (automatically) extracted. So I advise you to go through your project dependencies in your solution and perform 'project only' builds step by step until you find the first failure. It's also best to look at the 'build output' panel to see the first error that arises, instead of trusting the 'error' panel to present you with the root cause at the top.
In my case it was a silly mistake in a library directory path that caused a DLL link failure, which in turn caused unresolved XAML members in that DLL's namespace, which then produced plenty of missing metagen file errors...

NuPeek Symbols doesn't download source in Visual Studio 2012

We recently installed NuPeek as our NuGet repository and as our symbols server.
NuGet itself works fine; it was set up within an hour.
The Symbols Server on the other hand is a different story. Packages are pushed to NuPeek (normal packages and symbol packages). I see on the server that both are picked up and placed in the correct folder (source files too, .cs in this case).
I have set up Visual Studio so it can find the correct symbols server. When I create a new project, install the package (that also has a symbols package), use the code from that package and try to debug it, the following happens:
In the cache folder the "package" is downloaded
The cache folder also has a src folder, which contains a folder named after the package, and inside that a folder for the version
The version folder is empty
The folder cache/packagename.pdb/guid/packagename.pdb is present
Still, Visual Studio cannot find the correct CS file to show. After some digging in the NuPeek server folders I noticed that the folder symbolsPath -> temp -> PackageName -> lib -> net45 is empty, while the symbols.nupkg clearly has sources (one cs-file, to be exact).
I had this working before, but we switched servers (Azure Website to Azure Cloud Service); however, I'm 99% sure this is not the problem.
Am I missing something? Does anyone have any clue?
Thanks in advance!
Does anyone have any clue?
If anyone does, then the author of the project, Jérémie Chassaing, would be the most likely candidate. Don't hesitate to add an issue to the issue tracker; there's not much there right now and he looks pretty responsive, so it's worth your time.
Do run through the setup checklist first:
Tools + Options, Debugging, Symbols, add http://myserver/NuPeek/symbols to the Symbol file locations list. Ensure that you have a valid Cache symbols directory selected
Tools + Options, Debugging, General, tick the "Enable source server support" option
Untick the "Enable Just My Code" option.
Tick the "Print source server diagnostics" option. Update your question with what you see in the Output window so we'll have a better shot at figuring out the real problem
OK, this is an old question, but as I found the solution today I'll post it here.
This is probably because you installed NuPeek too deep in your website structure.
A bug in NuPeek requires that it is installed at the root level of your site (for example www.domain.com, and not www.domain.com/Nupeek/).
Otherwise you can fix the bug in SymbolTools.cs by replacing the SourceBaseUri getter with this:
private static string SourceBaseUri
{
    get
    {
        // Build the absolute URL of the "source" endpoint from the current
        // request, taking the application path into account so it still works
        // when NuPeek is hosted below the root of the site.
        var httpRequest = HttpContext.Current.Request;
        var applicationUri = new Uri(
            new Uri(httpRequest.Url.GetLeftPart(UriPartial.Scheme | UriPartial.Authority)),
            Path.Combine(httpRequest.ApplicationPath, "source"));
        return applicationUri.ToString();
    }
}
Hope this helps.

cspack behaviour differs from msbuild

Using Visual Studio 2012 and Azure SDK 2.1, I am trying to figure out the best way to create the csx folder for running in the Azure emulator. My understanding is that the csx folder is not created until I package the Azure project. I can create a package manually from Visual Studio, but this is not an option for an automated build. The other option is to create the package using the msbuild command line. This seems a bit heavy-handed, as it will actually do a build, which is more time consuming than just repackaging.
So, I thought that cspack might be a more lightweight option. However, when I call cspack with the following command line:
cspack.exe ServiceDefinition.csdef /copyOnly
I get the error: Need to specify the physical directory for the virtual path 'Web/' of role MyProjWeb.
But, I don't do anything like that when using msbuild. I have read a bunch of things about specifying the physical directory and some of the confusion that it can cause. So, I would prefer not to use it unless absolutely necessary, especially since I don't need to specify this when building from msbuild.
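From what I have read, spelling things out explicitly would make the command look something like the line below (the role name comes from the error message; the bin and project paths are just placeholders for my layout):
cspack.exe ServiceDefinition.csdef /role:MyProjWeb;MyProjWeb\bin /sites:MyProjWeb;Web;MyProjWeb /copyOnly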
So, my main question is what is msbuild doing that cspack is not doing and how do I do the same with cspack?
My other question is: what is the easiest way to generate the csx folder for testing in the Azure emulator?
Edit - Resolution
I thought that I would put down how I resolved this here in case it helps someone else. The big answer to my question (thanks to Chandermani and some other reading) is that CSPack with /copyOnly is basically a fancy xcopy to a folder structure according to some rules. If not using /copyOnly, it also does a fancy zip to create a package. Not complaining, it is fine that it is simple, but it is good to know this at the outset. You can use it to package anything for Azure; it is not tied to what can be built in Visual Studio, e.g. a PHP site. Using msbuild has the added benefit of only copying the files that are part of your web site deployment.
So, what I found when I got CSPack working and pointed it at the MVC project folder is that it copied everything, including source files, which is not what I wanted. The solution that I could find is to first package the web site and then point CSPack at the packaged files. If you go down this path, this link is very valuable as it describes the process step by step.
So, it was either an msbuild post-step in the Web project to package the files and then a post-step in my Azure project to cspack it, or a single msbuild post-step in my Azure project to create the package (do cspack with the benefit of only including my web deployment files). Well, it seemed simpler and less error prone to just have the one post-step and let msbuild do the heavy lifting. So, the post-step in my Azure project is something like:
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devfabric:shutdown > NUL
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devstore:shutdown > NUL
if $(ConfigurationName) == Debug set CONSTANTSPARAMETER=DEBUG
if $(ConfigurationName) == Release set CONSTANTSPARAMETER=
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe $(ProjectDir)$(ProjectFileName) /t:clean;publish /p:Configuration=$(ConfigurationName) /p:TargetProfile=cloud /p:OutputPath=bin\Cloud$(ConfigurationName) /p:VisualStudioVersion=11.0 /p:overwritereadonlyfiles=true /p:DefineConstants="%CONSTANTSPARAMETER%" /verbosity:minimal /p:PostBuildEvent=
The first two lines shut down the compute and storage emulator.
The next two lines set the preprocessor constants. I found that #if DEBUG no longer took effect when built using the msbuild line. I think this is a safety measure so that DEBUG is stripped when creating a package. I only ever use the package that is created by an automated build system, so it is safe for me to keep the DEBUG constant.
The actual msbuild line has a number of switches. I'll describe the unusual ones:
/p:PostBuildEvent=
If we don't set the postBuildEvent to empty then the same post step will keep getting called forever. And ever...
/p:VisualStudioVersion=11.0
Those clever guys at Microsoft made it possible to open projects with both Visual Studio 2010 and 2012. Which is great, but can bring great sadness when you run msbuild from the command line and end up with nasty MSB4019 error messages because it is looking in the wrong Visual Studio folder for the Azure tools.
Also, note that I use the cloud profile. Since I am only after the csx files, it doesn't seem to make a difference whether I use local or cloud at this point. When I run in the Azure emulator I specify ServiceConfiguration.Local.cscfg.
Edit: In the end I took this out of the post-step and put it in my automated build. My original intention was that running the tests from my dev machine would be the same as my automated build, but the post-step took too long, and when running under the debugger the views were sourced from the obj folder rather than the proj folder, which meant I had to copy them across when making changes on the fly.
Unanswered questions
It would still be good to understand how msbuild does things, to reduce knowledge friction when dabbling in this area. Does it create a package for the website and pass it to CSPack? Or does it parse the project files and then pass some crazy arguments to CSPack? Also, when you run an Azure project in the debugger, it runs in the emulator with only the binaries in the csx folder (not the images, etc.). How does it do that? It would be great to see some description, with pictures, of the Azure build pipeline that showed the lifecycle all the way to deployment. That might also explain why there are two copies of the binaries. Also, this would have been a whole lot easier if Visual Studio had a project flag like packageOnBuild for the Azure project, with options to do a copyOnly or to create a package. I see no point in uneaten cake. Edit: There is a DeployOnBuild setting that can be added to csproj.
Finally, as I mentioned, the whole purpose of this is to get a csx folder that I can point the emulator at so that I can run my unit tests on my dev machine. I do the formal packaging on a build machine, so I don't really need it in Visual Studio. So, really, I don't want to package anything and was hoping that there was an easier way of achieving all this.
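For completeness, once a csx folder does exist, pointing the emulator at it from the command line is just csrun with the /run switch, something like this (the project paths are purely illustrative):
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /run:MyAzureProj\csx\Debug;MyAzureProj\ServiceConfiguration.Local.cscfg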
Since msbuild uses the Azure project file to perform the build, it can derive a lot of information from the project file.
For cspack, the assumption is that the role code has been compiled and is available for packaging. Since cspack does not depend upon the project file, it needs explicit information about the code path of the web/worker role project. The csdef file does not contain any such information. If you want to use cspack, I suggest you look at its documentation and try to create a package for emulator deployment from the command line (CopyOnly option). Once you find the correct syntax, you can embed it in your build script.
