I have a complicated grammar with multiple pre-processors. Each pre-processor lives in its own package, and each one generates a tokens file. When I use a makefile to cd into the package directory and invoke ANTLR on the grammar, it creates the tokens file in the current working directory and then carries on and processes the grammar; all is good.
When I use the Gradle antlr plugin, the tokens files get put into generated-src/...packagedir... All good, but ANTLR then cannot find the generated tokens file. I tried to use the -lib parameter, but that only points to a single directory.
This is my project structure
src/..package../preproc1
    preproc1.g4
    preproc1Tokens.g4
src/..package../preproc2
    preproc2.g4
    preproc2Tokens.g4
When I run ANTLR on the .g4 files I get
build/generated-src/antlr/main/..package../preproc1/preproc1.tokens
but the working directory of the ANTLR process is still
src/..package../preproc2, so ANTLR cannot find this generated tokens file.
I tried to tell the antlr plugin to generate the output files in the src directory, as the makefile does. Not the way I would choose to do it, but I just wanted to keep moving forward. Unfortunately, the antlr plugin deletes the output directory, using GFileUtils.cleanDirectory(outputDirectory); in its execute task. I could alter this to skip the delete when the outputDirectory is the same as the source directory, which would be OK I guess, though clean would never work properly.
For now I have resorted to doing a build with a makefile to get the tokens files, then copying those tokens files to a single directory and pointing -lib at that directory.
Does anyone have a better solution?
I use Maven instead of Gradle. Whether this is a "better" solution I can't say,
but the ANTLR 4 API page currently favors Maven over Gradle, as there is no Gradle plugin link that I can see. Having said that, I would hope that the two are compatible enough that the ANTLR build directories should match.
ANTLR Instructions
When I followed the ANTLR source layout suggested by the ANTLR 4 Maven plugin instructions, it worked for me pretty much out of the box.
The ANTLR 4 Maven plugin instructions by default will build .g4 files contained in the project folder:
<project>/src/main/antlr4/
whereas you copy any shared .g4 files that are to be used as imports into the specific import sub-directory:
<project>/src/main/antlr4/imports/
My pom.xml file
The following ANTLR plugin configuration <plugin> text is set in the <project><build><plugins> section of my pom.xml file.
<plugin>
<groupId>org.antlr</groupId>
<artifactId>antlr4-maven-plugin</artifactId>
<version>4.5.3</version>
<executions>
<execution>
<configuration>
<outputDirectory>target/generated-sources/java</outputDirectory>
<arguments>
<argument>-visitor</argument>
</arguments>
<libDirectory>src/main/antlr4/imports</libDirectory>
</configuration>
<id>antlr</id>
<goals>
<goal>antlr4</goal>
</goals>
</execution>
</executions>
</plugin>
I found that I needed to be explicit with the <libDirectory> entry, even though I thought that this would be the default setting.
My Project Structure
I'm working with three grammars in my ComojoProj project: Countables, Dater, and Sched. Sched imports the other two.
Each grammar file is in a separate sub-folder under src/main/antlr/, more or less aligning with the packages of the associated Java implementations:
/ComojoProj/
    pom.xml
    .project
    src/main/antlr/
        com/wapitia/common/Countables.g4
        com/wapitia/dates/Dater.g4
        com/wapitia/sched/Sched.g4
        imports/
            Countables.g4
            Dater.g4
    src/main/java/
        com/wapitia/
            common/parse/antlr/CountablesProducer.java
            dates/parse/antlr/DaterProducer.java
            dates/parse/nodes/
                DayOfWeekNode.java ...
            sched/parse/antlr/SchedProducer.java
            sched/parse/nodes/
                MonthlyNode.java, EndDateNode.java, ...
The Sched.g4 grammar file starts like this:
grammar Sched;
import Countables, Dater;
schedule : dailySched | weeklySched ...
The two grammar files Countables.g4 and Dater.g4 are found during the Sched.g4 build because we copied them into the /src/main/antlr/imports/ directory.
When the ANTLR plugin runs, it generates the following code:
/ComojoProj/target/generated-sources/java/
    com.wapitia.common/
        CountablesBaseListener.java
        CountablesBaseVisitor.java
        ...
    com.wapitia.dates/ ...
    com.wapitia.sched/ ...
    Countables.tokens
    CountablesLexer.tokens
    ...
    SchedLexer.tokens
So the .tokens files are generated into the "top" package of the generated Java
source directory, and thus wind up at the top of the generated jar file.
I hope this helps!
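If you would rather stay on Gradle, its antlr plugin's generateGrammarSource task does accept extra tool arguments, so pointing -lib at a single shared imports directory (mirroring the Maven layout described here) may get you the same effect. A sketch I have not tested; the imports path is an assumption:

```groovy
// build.gradle -- untested sketch; assumes shared grammars were copied
// into a single src/main/antlr/imports directory, as in the Maven setup
generateGrammarSource {
    arguments += ["-visitor", "-lib", "src/main/antlr/imports"]
}
```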
I've packaged an application with Maven's JavaPackager plugin, targeting Linux.
Everything is working fine except that I can't find how to package and install a "xxxx.desktop" file for my application.
Without this file: (1) the icon on the launcher is ugly, and (2) the application cannot be found with a search.
Here is my plugin's config:
<plugin>
<groupId>io.github.fvarrui</groupId>
<artifactId>javapackager</artifactId>
<version>1.6.7</version>
<configuration>
<mainClass>com.zparkingb.zploger.GUI.Zploger</mainClass>
<generateInstaller>false</generateInstaller>
<administratorRequired>false</administratorRequired>
</configuration>
<executions>
<execution>
<!-- With JRE -->
<id>bundling-for-platform-complete</id>
<phase>package</phase>
<goals>
<goal>package</goal>
</goals>
<configuration>
<platform>linux</platform>
<name>${project.bundle_finalname}${package.buildnamesuffix}</name>
<outputDirectory>${project.build.directory}/FULL</outputDirectory>
<createTarball>true</createTarball>
<createZipball>false</createZipball>
<bundleJre>true</bundleJre>
<customizedJre>false</customizedJre>
<!--From settings.xml-->
<jrePath>${package.jrePath}</jrePath>
<jdkPath>${package.jdkPath}</jdkPath>
<!--Special for Linux-->
<linuxConfig>
<pngFile>assets/linux/Zploger.png</pngFile>
<generateAppImage>true</generateAppImage>
<generateDeb>false</generateDeb>
<generateRpm>false</generateRpm>
<wrapJar>true</wrapJar>
<categories>
<category>Utility</category>
</categories>
</linuxConfig>
</configuration>
</execution>
</executions>
</plugin>
So I'd need to end up with file:
~/.local/share/applications/com-zparkingb-zploger-GUI-Zploger.desktop
with a similar content:
[Desktop Entry]
Encoding=UTF-8
Version=1.0
Type=Application
Terminal=false
Name=Zploger for Scores
Icon=/home/vboxuser/Desktop/ZplogerScores/Zploger.png
Or even have the icon placed somewhere in ~/.local/share/icons/xxx/xxx and have the ".desktop" file refer to it as Icon=Zploger.
How could I achieve this?
My solution is probably far from ideal, but I am limited in that I am building a Linux solution from a Windows environment.
The solution is based on the following principle:
Have the application startup script check for the presence of the icons and the .desktop file in ~/.local/share/applications/.
If they don't exist, copy them from the package.
So the solution must:
Adapt the default startup script to do that extra check.
Add the desktop file and the icons to the packaging.
In the end, the solution comprises the following steps:
Add the icons and the desktop file to the project folder. I've put these in a bundledata/linux/assets folder. The folder name is free, but it cannot be assets/ because that one is reserved for JavaPackager, and it cannot be one of the Maven resources folders because I want these files to be packaged separately.
Provide an adapted startup.sh.vtl Velocity template whose goal is to add instructions to the startup script to copy the icons and a valid desktop file.
The icons are copied to ~/.local/share/icons/hicolor/.
I provide "./16x16", "./24x24", ... "./1024x1024", and also "./scalable" with an SVG version of the application icon.
The desktop file is named my-main-class.desktop (in my case com-zparkingb-zploger-GUI-Zploger.desktop) and is copied into ~/.local/share/applications/.
Have that script modify the xxx.desktop file to fill in the current script path.
Adapt the pom.xml.
In the io.github.fvarrui.javapackager plugin's config, add the following instruction:
<additionalResources>
<additionalResource>bundledata/linux/assets</additionalResource>
</additionalResources>
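As an illustration of the startup-script check described above, here is a minimal sketch. The function name, the assets layout, and the @APP_HOME@ placeholder are all my own assumptions, not anything JavaPackager provides:

```shell
#!/bin/sh
# Sketch of the fragment added via startup.sh.vtl: install the .desktop
# file and icons on first run, pointing them at the current install path.
install_desktop_entry() {
    app_home="$1"     # directory the application was unpacked into
    desktop_id="$2"   # basename of the .desktop file

    desktop_dir="$HOME/.local/share/applications"
    icon_dir="$HOME/.local/share/icons/hicolor"

    # Already installed: nothing to do
    if [ -f "$desktop_dir/$desktop_id.desktop" ]; then
        return 0
    fi

    mkdir -p "$desktop_dir" "$icon_dir"
    # Icons shipped as assets/icons/16x16, 24x24, ..., scalable
    cp -r "$app_home/assets/icons/." "$icon_dir/"
    # Replace the @APP_HOME@ placeholder so Exec= points at this install
    sed "s|@APP_HOME@|$app_home|g" \
        "$app_home/assets/$desktop_id.desktop" \
        > "$desktop_dir/$desktop_id.desktop"
}
```

In the Velocity template this would be invoked just before the java launch line, e.g. install_desktop_entry "$(cd "$(dirname "$0")" && pwd)" "com-zparkingb-zploger-GUI-Zploger".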
It appears as though sbt (1.2.1, 1.2.3) is not copying resource files (from src/main/resources) to the target directory.
The build is multi-project, with a root project that aggregates subprj1 (for now).
Shown below: the project structure (main directories and one resource file, application.conf), the resourceDirectory setting as proof that we have not overridden it, and proof of successful compilation; yet the application.conf file has not been copied to the output (target) directory.
Tried sbt versions 1.2.1, 1.2.3.
Why are the resources not being copied to the output, since we are complying with the standard directory structure?
Project structure
/main/project/home/dir/build.sbt
/main/project/home/dir/subprj1/src/main/resources
/main/project/home/dir/subprj1/src/main/resources/application.conf
/main/project/home/dir/subprj1/src/main/scala/com/myco/foo/bar/server/*.scala
IJ][subprj1#master] λ show resourceDirectory
[info] subprj1 / Compile / resourceDirectory
[info] /main/project/home/dir/subprj1/src/main/resources
build/sbt clean compile
...
[success] Total time: 22 s, completed Feb 8, 2019 3:10:04 PM
find . -name application.conf
./subprj1/src/main/resources/application.conf
It works if we run copyResources after compile, but why is that not automatic?
build/sbt copyResources
find . -name application.conf
./subprj1/src/main/resources/application.conf
./subprj1/target/scala-2.12/classes/application.conf
I can inspect the dependencies among tasks and I can see that compile does not depend on copyResources, but was it always like this, or is this a recent change? I have been using sbt for years, and I have this expectation that the build would copy resources to output automatically.
build/sbt -Dsbt.log.noformat=true "inspect tree compile" > t.txt
It turns out someone had added the settings below to build.sbt. Once I commented out these lines, the resources started being copied to the output directory.
, unmanagedResourceDirectories in Compile := Seq()
, unmanagedResourceDirectories in Test := Seq()
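If extra resource directories are genuinely needed, appending to the setting instead of replacing it keeps the default src/main/resources entry alive. A sketch (extra-resources is a made-up path):

```scala
// "+=" appends; ":=" with Seq() (as in the offending lines) wipes out the default
unmanagedResourceDirectories in Compile += baseDirectory.value / "extra-resources"
```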
I don't know if this is possible. This is what I want to do.
I have some Jenkins pipelines that build a Vue.js application using Node.js,
via the "npm run build" command. The result of this build is a directory named "Static" and an index.html file. After that I zip those two into a file.zip, which I upload to Artifactory so it can later be downloaded, unzipped, and put into a Dockerfile to build an image and then a container in Azure (ACI).
I now want to implement some versioning of those zips, which I have already done with other apps, but in Java with Maven + POM + the Maven Metadata Plugin + Jenkins + Artifactory, where I have two jobs: one job to build with Maven and push the file.war to Artifactory, and another job to choose the file.war from Artifactory using the "build with parameters" option.
I read something about using Maven to create a zip file even though this is not a Java app, and doing the same for Node.js.
So: is it possible with Maven to zip a directory and a file, even though they are not a Java app, and to include the job number to version this zip file and push it into Artifactory? If not, what is the best approach to do the same as I did with Java and Maven, but for Vue.js applications, in a Jenkins pipeline that pushes a zip along with the build number into Artifactory and then uses the "build with parameters" option to choose the zip I want?
Thank you!
You mentioned that you are already using npm to build your Vue app. There are npm libraries available for zipping. You could, for example, do something like this:
// Requires the "archiver" package: npm install archiver
const fs = require('fs');
const archiver = require('archiver');

// Create a zip archive with maximum compression
const archive = archiver('zip', {
    zlib: { level: 9 }
});

const filename = "./output.zip";
const fileOutput = fs.createWriteStream(filename);

fileOutput.on('close', function() {
    console.info('ZIP file created. ' + archive.pointer() + ' total bytes.');
});

// Abort on any archiving error
archive.on('error', function(error) {
    throw error;
});

archive.pipe(fileOutput);

// Put the contents of ./dist at the root of the archive
archive.directory('./dist', '/');

archive.finalize();
This assumes that the output of your build process is stored in a dist folder on your file system and that you have access to the file system.
There are more examples of archiver in the official docs: https://github.com/archiverjs/node-archiver
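To cover the versioning part of the question in the same script, the archive's file name can carry the Jenkins build number. A small sketch (BUILD_NUMBER is the env var Jenkins sets; the helper name is made up):

```javascript
// Build a versioned zip name from Jenkins' BUILD_NUMBER env var;
// falls back to "dev" for local runs outside Jenkins.
function versionedZipName(buildNumber) {
  return `./app-${buildNumber || 'dev'}.zip`;
}

// In the archiver script you would then use:
// const filename = versionedZipName(process.env.BUILD_NUMBER);
console.log(versionedZipName(process.env.BUILD_NUMBER));
```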
You can build a zip file using the maven-assembly-plugin with an assembly descriptor.
The assembly descriptor goes in the project's src/assembly directory, named something like my-zip-format.xml. You'll need to customize the content to include the files you need, but this is the idea.
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2 http://maven.apache.org/xsd/assembly-1.1.2.xsd">
<id>my-zip</id>
<formats>
<format>zip</format>
</formats>
<fileSets>
<fileSet>
<directory>path/to/input/dir</directory>
<outputDirectory>name-of-output-dir</outputDirectory>
<directoryMode>0750</directoryMode>
<fileMode>0640</fileMode>
<lineEnding>unix</lineEnding>
</fileSet>
</fileSets>
</assembly>
Then, tell the POM to use the assembly:
....
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version><pluginVersionHere></version>
<executions>
<execution>
<id>make-zip</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptors>
<descriptor>src/assembly/my-zip-format.xml</descriptor>
</descriptors>
</configuration>
...
Documentation goes into a lot more detail, and there are plenty of SO questions/answers about assembly plugin nuances as well.
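On the versioning half of the question: one option is to fold the Jenkins build number into the assembly's file name via the plugin's finalName parameter. A sketch; build.number is just a user property you would pass from Jenkins, e.g. mvn package -Dbuild.number=$BUILD_NUMBER:

```xml
<!-- Inside the maven-assembly-plugin execution's <configuration> -->
<finalName>${project.artifactId}-${build.number}</finalName>
<!-- Drop the "-my-zip" assembly id suffix from the file name -->
<appendAssemblyId>false</appendAssemblyId>
```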
I'm using vstest.console.exe (VS2012) to run tests with /EnableCodeCoverage, and with a .runsettings that defines a "Code Coverage" DataCollector (see CodeCoverage.runsettings in code block below).
I'm running from a powershell build script, that invokes:
vstest.console.exe /inIsolation /Logger:trx /EnableCodeCoverage /Settings:CodeCoverage.runsettings /TestCaseFilter:"TestCategory=Customers" bin\Release\Sdm.Test.IntegTest.dll
Previously this command was working, however a recent new project that integrated some old legacy code has brought in a lot of new dependencies/DLLs.
What I see is that the command just "hangs" and never seems to run any of the tests. When I use the Sysinternals Process Explorer I do see some activity in vstest.executionengine.exe; my best guess is that it is attempting to instrument a whole bunch of DLLs that my .runsettings file says should be excluded. But that's only a guess.
Any help in figuring out how to diagnose the issue would be appreciated.
CodeCoverage.runsettings below:
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
<!-- Configurations that affect the Test Framework -->
<RunConfiguration>
<!-- Path relative to solution directory -->
<ResultsDirectory>.\TestResults</ResultsDirectory>
<!-- [x86] | x64
- You can also change it from menu Test, Test Settings, Default Processor Architecture -->
<TargetPlatform>x64</TargetPlatform>
<!-- Framework35 | [Framework40] | Framework45 -->
<TargetFrameworkVersion>Framework45</TargetFrameworkVersion>
</RunConfiguration>
<!-- Configurations for data collectors -->
<DataCollectionRunSettings>
<DataCollectors>
<DataCollector friendlyName="Code Coverage" uri="datacollector://Microsoft/CodeCoverage/2.0" assemblyQualifiedName="Microsoft.VisualStudio.Coverage.DynamicCoverageDataCollector, Microsoft.VisualStudio.TraceCollector, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
<Configuration>
<CodeCoverage>
<!--
Additional paths to search for .pdb (symbol) files. Symbols must be found for modules to be instrumented.
If .pdb files are in the same folder as the .dll or .exe files, they are automatically found. Otherwise, specify them here.
Note that searching for symbols increases code coverage runtime. So keep this small and local.
-->
<!--
<SymbolSearchPaths>
<Path>C:\Users\User\Documents\Visual Studio 2012\Projects\ProjectX\bin\Debug</Path>
<Path>\\mybuildshare\builds\ProjectX</Path>
</SymbolSearchPaths>
-->
<!--
About include/exclude lists:
Empty "Include" clauses imply all; empty "Exclude" clauses imply none.
Each element in the list is a regular expression (ECMAScript syntax). See http://msdn.microsoft.com/library/2k3te2cs.aspx.
An item must first match at least one entry in the include list to be included.
Included items must then not match any entries in the exclude list to remain included.
-->
<!-- Match assembly file paths: -->
<ModulePaths>
<Include>
<ModulePath>.*\.dll$</ModulePath>
<ModulePath>.*\.exe$</ModulePath>
</Include>
<Exclude>
<ModulePath>.*CPPUnitTestFramework.*</ModulePath>
<ModulePath>.*[uU]nit[tT]est\.dll</ModulePath>
<ModulePath>.*[iI]nteg[tT]est\.dll</ModulePath>
<ModulePath>.*bomicustomautopoco\.dll</ModulePath>
<ModulePath>.*Common\.Logging\.dll</ModulePath>
<ModulePath>.*Common\.Logging\.Log4Net129.dll</ModulePath>
<ModulePath>.*fluentassertions\.dll</ModulePath>
<ModulePath>.*x509publickeyparser\.dll</ModulePath>
<ModulePath>.*EFCachingProvider\.dll</ModulePath>
<ModulePath>.*EFProviderWrapperToolkit\.dll</ModulePath>
<ModulePath>.*log4net\.dll</ModulePath>
<ModulePath>.*DatahelperDTCBridge\.dll</ModulePath>
<ModulePath>.*\.ni\.dll</ModulePath>
<ModulePath>.*mscorlib\.dll</ModulePath>
<ModulePath>.*vjslib\.dll</ModulePath>
<ModulePath>.*Microsoft\..*dll</ModulePath>
<ModulePath>.*System\.EnterpriseServices\..*dll</ModulePath>
<ModulePath>.*System\.ComponentModel\..*dll</ModulePath>
<ModulePath>.*System\.Configuration\..*dll</ModulePath>
<ModulePath>.*System\.Core\..*dll</ModulePath>
<ModulePath>.*System\.Data\..*dll</ModulePath>
<ModulePath>.*System\.Entity\..*dll</ModulePath>
<ModulePath>.*System\.IdentityModel\..*dll</ModulePath>
<ModulePath>.*System\.Numerics\..*dll</ModulePath>
<ModulePath>.*System\.Runtime\..*dll</ModulePath>
<ModulePath>.*System\.ServiceModel\..*dll</ModulePath>
<ModulePath>.*System\.Transactions\..*dll</ModulePath>
<ModulePath>.*System\.Web\..*dll</ModulePath>
<ModulePath>.*System\.Xml\..*dll</ModulePath>
<ModulePath>.*msdia110typelib_clr0200\.dll</ModulePath>
<ModulePath>.*vstest.executionengine.exe</ModulePath>
<ModulePath>.*BOMi2Service\.dll</ModulePath>
<ModulePath>.*NakedObjects\..*dll</ModulePath>
<ModulePath>.*nakedobjects\..*dll</ModulePath>
<ModulePath>.*sdm\.corejava\.dll</ModulePath>
<ModulePath>.*sdm\.datahelper\.dll</ModulePath>
<ModulePath>.*sdm\.events\.dll</ModulePath>
<ModulePath>.*Sdm\.Infrastructure\.dll</ModulePath>
<ModulePath>.*Sdm\.Infrastructure\.Attributes\.dll</ModulePath>
<ModulePath>.*sdm\.systems\.application\.dll</ModulePath>
<ModulePath>.*sdm\.systems\.distribution\.library\.dll</ModulePath>
<ModulePath>.*sdm\.systems\.distribution\.server\.dll</ModulePath>
<ModulePath>.*sdm\.objectstore\.dll</ModulePath>
<ModulePath>.*sdm\.profiler\.dll</ModulePath>
<ModulePath>.*sdm\.resultsprocessor\.dll</ModulePath>
<ModulePath>.*sdm\.systems\.reflector\.dll</ModulePath>
<ModulePath>.*Sdm\.Test\.Fixtures\.dll</ModulePath>
<ModulePath>.*sdm\.utilities\.dll</ModulePath>
<ModulePath>.*Spring\.Core\.dll</ModulePath>
<ModulePath>.*TechTalk\.SpecFlow\.dll</ModulePath>
</Exclude>
</ModulePaths>
<!-- Match fully qualified names of functions: -->
<!-- (Use "\." to delimit namespaces in C# or Visual Basic, "::" in C++.) -->
<Functions>
<Exclude>
<Function>^Fabrikam\.UnitTest\..*</Function>
<Function>^std::.*</Function>
<Function>^ATL::.*</Function>
<Function>.*::__GetTestMethodInfo.*</Function>
<Function>^Microsoft::VisualStudio::CppCodeCoverageFramework::.*</Function>
<Function>^Microsoft::VisualStudio::CppUnitTestFramework::.*</Function>
</Exclude>
</Functions>
<!-- Match attributes on any code element: -->
<Attributes>
<Exclude>
<!-- Don’t forget "Attribute" at the end of the name -->
<Attribute>^System.Diagnostics.DebuggerHiddenAttribute$</Attribute>
<Attribute>^System.Diagnostics.DebuggerNonUserCodeAttribute$</Attribute>
<Attribute>^System.Runtime.CompilerServices.CompilerGeneratedAttribute$</Attribute>
<Attribute>^System.CodeDom.Compiler.GeneratedCodeAttribute$</Attribute>
<Attribute>^System.Diagnostics.CodeAnalysis.ExcludeFromCodeCoverageAttribute$</Attribute>
</Exclude>
</Attributes>
<!-- Match the path of the source files in which each method is defined: -->
<Sources>
<Exclude>
<Source>.*\\atlmfc\\.*</Source>
<Source>.*\\vctools\\.*</Source>
<Source>.*\\public\\sdk\\.*</Source>
<Source>.*\\microsoft sdks\\.*</Source>
<Source>.*\\vc\\include\\.*</Source>
</Exclude>
</Sources>
<!-- Match the company name property in the assembly: -->
<CompanyNames>
<Exclude>
<CompanyName>.*microsoft.*</CompanyName>
</Exclude>
</CompanyNames>
<!-- Match the public key token of a signed assembly: -->
<PublicKeyTokens>
<!-- Exclude Visual Studio extensions: -->
<Exclude>
<PublicKeyToken>^B77A5C561934E089$</PublicKeyToken>
<PublicKeyToken>^B03F5F7F11D50A3A$</PublicKeyToken>
<PublicKeyToken>^31BF3856AD364E35$</PublicKeyToken>
<PublicKeyToken>^89845DCD8080CC91$</PublicKeyToken>
<PublicKeyToken>^71E9BCE111E9429C$</PublicKeyToken>
<PublicKeyToken>^8F50407C4E9E73B6$</PublicKeyToken>
<PublicKeyToken>^E361AF139669C375$</PublicKeyToken>
</Exclude>
</PublicKeyTokens>
<!-- We recommend you do not change the following values: -->
<UseVerifiableInstrumentation>True</UseVerifiableInstrumentation>
<AllowLowIntegrityProcesses>True</AllowLowIntegrityProcesses>
<CollectFromChildProcesses>True</CollectFromChildProcesses>
<CollectAspDotNet>False</CollectAspDotNet>
</CodeCoverage>
</Configuration>
</DataCollector>
</DataCollectors>
</DataCollectionRunSettings>
<!-- Adapter Specific sections -->
<!-- MSTest adapter -->
<MSTest>
<MapInconclusiveToFailed>True</MapInconclusiveToFailed>
<CaptureTraceOutput>false</CaptureTraceOutput>
<DeleteDeploymentDirectoryAfterTestRunIsComplete>False</DeleteDeploymentDirectoryAfterTestRunIsComplete>
<DeploymentEnabled>False</DeploymentEnabled>
</MSTest>
</RunSettings>
The only clue I could find that eventually led to a solution was in the Event Viewer, specifically:
.NET Runtime version 2.0.50727.5477 - Failed to CoCreate profiler.
This ultimately led me to a fix, which is to add:
<startup useLegacyV2RuntimeActivationPolicy="true">
<supportedRuntime version="v4.0.30319" />
</startup>
to vstest.executionengine.exe.config (if running 64bit) or vstest.executionengine.x86.exe.config (if running 32bit).
This works both under VS (when spawned by devenv.exe) and from the command line (when spawned by vstest.console.exe).
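For context, the resulting vstest.executionengine.exe.config would look something like this; the file already exists in the TestWindow folder and may contain other elements, so only the <startup> block is the addition:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- Added: let the v2 profiler load under the v4 runtime -->
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
  </startup>
</configuration>
```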
Here are some notes that got me to this solution:
The new DLLs that are referenced include some pretty old code built against .NET 2.0. After a lot of searching, I've pieced together that:
vstest.executionengine.exe uses the .NET 4 profiling stuff to do the dynamic code coverage instrumentation
the event viewer error message is an obscure clue that vstest.executionengine.exe is failing to kick off the .NET 2.0 profiler
as noted above, the fix is to update the .config (that is, vstest.executionengine.exe.config).
It isn't possible to add this stuff to the app.config of the project containing the tests; it must go into vstest.executionengine's config (because that is the exe actually running).
Other things I tried (none of which, ultimately, helped):
Diagnostics in the registry
Computer\HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\EnterpriseTools\QualityTools\Diagnostics
EnableTracing DWORD = 1
TraceLevel DWORD = 4
Diagnostics in the *.config files
for vstest.console.exe, vstest.discoveryengine.*.exe, vstest.executionengine.*.exe
C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow
added:
<system.diagnostics>
<switches>
<add name="TpTraceLevel" value="4" />
</switches>
</system.diagnostics>
... this causes log files to be written to %TEMP%
Red herring: tried setting environment variables to disable the .NET 2 profiler:
- COR_ENABLE_PROFILING=0
- COMPLUS_ProfAPI_ProfilerCompatibilitySetting=DisableV2Profiler
... but this did nothing.
Further investigation (using procexp.exe) showed that vstest.executionengine.exe always sets COR_ENABLE_PROFILING=1, irrespective of the env var.
Moreover, COR_PROFILER is set to a GUID;
for me it was COR_PROFILER={B19F184A-CC62-4137-9A6F-AF0F91730165}.
Via regedit, HKEY_LOCAL_MACHINE\SOFTWARE\Classes\CLSID\{B19F184A-CC62-4137-9A6F-AF0F91730165}\InprocServer32
corresponds to "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Microsoft\IntelliTrace\11.0.0\x64\Microsoft.IntelliTrace.Profiler.11.0.0.dll"
Looking at the event viewer I saw info messages that the .NET 4 profiler was successfully started:
.NET Runtime version 4.0.30319.18063 - The profiler was loaded successfully. Profiler CLSID: '{b19f184a-cc62-4137-9a6f-af0f91730165}'. Process ID (decimal): 7700. Message ID: [0x2507].
The log files in %TEMP% also suggested no issues.
Red herring: tried going the other way and enabling via env vars:
COMPLUS_ProfAPI_ProfilerCompatibilitySetting=EnableV2Profiler
and also explicitly set
COR_PROFILER={B19F184A-CC62-4137-9A6F-AF0F91730165}
This made no difference either.
Other red herrings:
Occasional messages in the Event Viewer of the form:
engine::notify_process_attach failed with exception: Session "MTM_7d145e0c-1c26-44b0-89e5-acc448aaae6d" does not exist.
In vstest.discoveryengine.TpTrace.log, the error "AddProcess : Failed to AddProcess 5" also seems to be irrelevant.
'System.EventHandler`1[Microsoft.VisualStudio.TestTools.Execution.SessionStartEventArgs]' to 'Microsoft.VisualStudio.Coverage.DynamicCoverageDataCollector'
I, 2800, 11, 2014/07/01, 10:59:11.875, PCKMA0419\vstest.discoveryengine.exe, Started Vangaurd process with command line unregister /wildcard /session:MTM_*
I, 2800, 11, 2014/07/01, 10:59:11.880, PCKMA0419\vstest.discoveryengine.exe, Add Vangaurd process to the project object
W, 2800, 11, 2014/07/01, 10:59:11.882, PCKMA0419\vstest.discoveryengine.exe, AddProcess : Failed to AddProcess 5
I, 2800, 11, 2014/07/01, 10:59:11.882, PCKMA0419\vstest.discoveryengine.exe, Started Vangaurd process with command line collect /session:MTM_64f33307-c936-469e-b068-482ec0ea45cf /output:"C:\Users\danhaywood\AppData\Local\Temp\MTM_64f33307-c936-469e-b068-482ec0ea45cf\c44e78af-2475-4747-99f3-e0fc3ca41d51\DanHaywood_PCKMA0419 2014-07-01 10_59_11.coverage" /config:
"C:\Users\danhaywood\AppData\Local\Temp\MTM_64f33307-c936-469e-b068-482ec0ea45cf\CodeCoverage.config"
Blogs consulted along the way:
What 'additional configuration' is necessary to reference a .NET 2.0 mixed mode assembly in a .NET 4.0 project?
http://blogs.msdn.com/b/dougste/archive/2009/12/30/failed-to-cocreate-profiler.aspx
http://social.msdn.microsoft.com/Forums/en-US/cf079584-54b0-44df-a157-620cc613fca6/failed-to-cocreate-profiler?forum=netfxtoolsdev
http://blogs.msdn.com/b/davbr/archive/2007/12/11/debugging-your-profiler-i-activation.aspx
http://bytes.com/topic/c-sharp/answers/813648-net-crashing
http://social.msdn.microsoft.com/Forums/en-US/ab48e868-528d-4d40-b04f-b8a39ba8bf0c/vs2010-on-windows-2008-wrong-setting-of-corprofiler-environment-variable-in-target-process?forum=vstsprofiler
http://msdn.microsoft.com/en-us/library/dd778910(v=vs.110).aspx
http://blogs.msdn.com/b/davbr/archive/2009/05/26/run-your-v2-profiler-binary-on-clr-v4.aspx
http://cvpcs.org/blog/2011-06-23/getting_ncover_and_nunit_to_play_nicely_with_.net_4.0
Force NCover 1.5.8 to use v4 framework like testdriven.net does?
I have multiple projects defined in a ccnet.config file, as shown below. It's very confusing to read. Is there any way to split these project definitions into separate files at some location path, and reference those project files from ccnet.config? Please help me out with this issue.
<cruisecontrol>
<project name="project1">
...
</project>
<project name="project2">
...
</project>
</cruisecontrol>
It's better to use the pre-processor; it will also recycle the system when one of the sub-files changes.
http://www.cruisecontrolnet.org/projects/ccnet/wiki/Configuration_Preprocessor
You can look at an example here:
http://www.cruisecontrolnet.org/projects/ccnet/wiki/Scenarios
See "Step 2: Build on Check-in" and the steps that follow.
They show how that example config grows as new requirements for the build server appear.
<!DOCTYPE cruisecontrol [
<!ENTITY project1 SYSTEM "file:project1.xml">
<!ENTITY project2 SYSTEM "file:project2.xml">
]>
<cruisecontrol>
&project1;
&project2;
</cruisecontrol>
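A sub-file such as project1.xml then simply contains the <project> block that was moved out of ccnet.config. A minimal sketch; the sourcecontrol/tasks content is illustrative only:

```xml
<!-- project1.xml, placed next to ccnet.config -->
<project name="project1">
  <sourcecontrol type="nullSourceControl" />
  <tasks />
</project>
```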
From:
http://www.cruisecontrolnet.org/projects/ccnet/wiki/TheCruiseControlConfigurationBlock
Note, the one "gotcha" is that if you change any of the sub-files, the system will not recycle (as it would if you made a change to ccnet.config).
The workaround is to change the sub-file, then add a space (or remove a space, or make some other whitespace change) in the ccnet.config file.