How do I see items embedded/attached to my Gallio test log in CruiseControl.NET? - cruisecontrol.net

I can use the following code to attach a log file to my Gallio 3.2 acceptance test report:
TestLog.AttachPlainText("Attached log file", File.ReadAllText(path));
When I run my tests locally I can see the log file contents by viewing the report in my web browser, and I want to be able to view the same attachments in CCNET. I have my CCNET project running MSBuild, which produces the test log (Regression.Acceptance.tests.xml) and the attachments (which are arranged in the folder Regression.Acceptance.tests). I can see that both exist after CCNET runs the build.
I have added the following to merge the results and attached files into my build log:
<merge>
  <files>
    <file>Source\Reports\Regression.Acceptance.tests.xml</file>
    <file action="Copy">Source\Reports\Regression.Acceptance.tests</file>
  </files>
</merge>
If I look at the CCNET server log I see the following line where it merges in the test results:
2010-05-26 13:39:36,359 [DashboardAcceptanceTests:INFO] Merging file 'Regression.Acceptance.tests.xml'
But there is no mention of the attachments folder. If I look in the artefacts folder I can see the merged build log, but there is no folder containing the attachments, and I get the following error when I try to browse to an attached file in the report via the web dashboard:
Exception Message
The attachment was not inlined into the XML report.
Exception Full Details
System.InvalidOperationException: The attachment was not inlined into the XML report.
at CCNet.Gallio.WebDashboard.Plugin.GallioAttachmentBuildAction.CreateResponseFromAttachment(XPathNavigator attachmentNavigator)
at CCNet.Gallio.WebDashboard.Plugin.GallioAttachmentBuildAction.Execute(ICruiseRequest cruiseRequest)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.ServerCheckingProxyAction.Execute(ICruiseRequest cruiseRequest)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.BuildCheckingProxyAction.Execute(ICruiseRequest cruiseRequest)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.ProjectCheckingProxyAction.Execute(ICruiseRequest cruiseRequest)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.CruiseActionProxyAction.Execute(IRequest request)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.CachingActionProxy.Execute(IRequest request)
at ThoughtWorks.CruiseControl.WebDashboard.MVC.Cruise.ExceptionCatchingActionProxy.Execute(IRequest request)
I get similar results when I try to embed the file instead with:
TestLog.EmbedPlainText("Embedded log file", File.ReadAllText(path));
Is it possible to get this working? As I understand it, the directory structure within the Regression.Acceptance.tests folder needs to be preserved when it is copied by the merge task, but the documentation is a little vague and I haven't been able to get it to do anything yet!
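One hedged thing to try (untested; the CCNet merge task documentation mentions wildcard support in <file> entries, and the paths below simply mirror the ones above) is pointing the copy at the attachment files rather than at the folder itself:
<merge>
  <files>
    <file>Source\Reports\Regression.Acceptance.tests.xml</file>
    <file action="Copy">Source\Reports\Regression.Acceptance.tests\*.*</file>
  </files>
</merge>
Note that a single-level wildcard like this would flatten any sub-folder structure, which may be the very thing the dashboard needs preserved to resolve the attachment links.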

Related

How to configure a custom code quality check in Gitlab?

I'm trying to configure a code quality check for a .NET project in GitLab Enterprise Edition 15.8.1-ee (Premium tier), but the GitLab UI doesn't show any code quality issues.
Since I'm going to use a custom code inspection tool (the JetBrains InspectCode command-line tool), I've written a converter that reformats the JetBrains report into the GitLab JSON format (https://docs.gitlab.com/ee/ci/testing/code_quality.html#implement-a-custom-tool). For testing purposes, I've prepared a GitLab code quality report, added it to the repository, and added an extra GitLab job to provide the file to the CI pipeline.
Part of the prepared GitLab code quality report (gl-code-quality-report.json):
[
  {
    "description": "Using directive is not required by the code and can be safely removed",
    "fingerprint": "a3d5c2a9-1761-4a18-8e17-35df9e2bc3a6",
    "severity": "critical",
    "location": {
      "path": "src/folder/Class.cs",
      "lines": {
        "begin": 8
      }
    }
  }
  ...
]
Part of .gitlab-ci.yml (since the report is already pre-generated, the PowerShell script does nothing):
check-code-quality:
  stage: check-code-quality
  only: ['branches']
  dependencies:
    - build
  script: ['powershell.exe .\build\check-code-quality.ps1']
  artifacts:
    when: always
    expire_in: 4 days
    reports:
      codequality: gl-code-quality-report.json
Current result: the CI pipeline doesn't fail. The pipeline has the new 'check-code-quality' job and there is a new tab on the pipeline page, Code quality. Unfortunately, the tab shows the text "No code quality issues found." On the merge request page there is a new section with the text "Code Quality hasn't changed."
The check-code-quality job log contains:
gl-code-quality-report.json: found 1 matching files and directories
Uploading artifacts as "codequality" to coordinator... ok id=1684071 responseStatus=201 Created token=64_yasyB
Why can't I see any issues in the GitLab UI? Please tell me what I'm doing wrong.
I have a few things in mind.
First of all, your JSON structure could be invalid. Make sure that the JSON file conforms to the GitLab JSON format as described in the docs.
Another problem could be that the location field is incorrect. It specifies the path to the file that contains the code quality issue; make sure that the path is correct and exists in your repository.
I would also check the artifact path. Please verify that the path to the JSON file is specified in the artifacts section of your .gitlab-ci.yml.
In some cases it might also be related to a cache issue; try clearing the cache.
I've found that my pre-generated file's encoding was UTF-8 with BOM, and it seems GitLab doesn't recognize data in this encoding. When I changed the encoding to UTF-8 (without BOM), GitLab showed the code quality widget and all the issues described in the provided JSON file.
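As a side note, if the converter happens to be a .NET tool, a minimal sketch of the re-encoding fix could look like this (the file name matches the report above; everything else is illustrative):
using System.IO;
using System.Text;

class FixReportEncoding
{
    static void Main()
    {
        const string path = "gl-code-quality-report.json";
        // ReadAllText detects and strips the BOM when reading;
        // UTF8Encoding(false) writes the file back as UTF-8 without a BOM.
        string json = File.ReadAllText(path);
        File.WriteAllText(path, json, new UTF8Encoding(false));
    }
}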

Possible to edit web.config of cloud app deployed on windows Azure without redeploying app?

I would like to add URL rewrite rules to an Azure web app's web.config without redeploying the whole app. For this I am using the App Service Editor and the Kudu debug console to edit web.config; at first I can't save the file and I get an error.
After some searching I found that, under App Settings, the value of the relevant key should be 0 instead of 1.
I edited the value from 1 to 0 and saved the app setting; after that I was able to edit the config file. In order to test the code again I changed the value back from 0 to 1 and saved the setting, but when I refresh the file that is open in the editor or Kudu, the pasted code has disappeared. The site is connected to an automatic Azure deployment pipeline.
How can I edit the web.config file without redeploying the app again?
Yes, it's possible to make changes without redeploying the app.
Some details:
Check the Run From Package documentation and we can find:
1. The zip package won't be extracted to D:\home\site\wwwroot; instead it will be uploaded directly to D:\home\data\SitePackages.
2. A packagename.txt, which contains the name of the ZIP package to load at runtime, will be created in the same directory.
3. App Service mounts the uploaded package as the read-only wwwroot directory and runs the app directly from that mounted directory. (That's why we can't edit the read-only wwwroot directory directly.)
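For context, this run-from-package behaviour is controlled by the WEBSITE_RUN_FROM_PACKAGE app setting, which is presumably the app setting the question toggles between 1 and 0:
WEBSITE_RUN_FROM_PACKAGE = 1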
So my workaround is:
1. Navigate to D:\home\data\SitePackages via the Kudu debug console.
Download the zip (in my case 20200929072235.zip) which represents your deployed app, extract it, and make your changes to the web.config file.
2. Zip those files (select them and right-click...) into a childtest.zip. Please follow my steps carefully here; the folder structure for Run From Package is a bit strange!
3. Then zip the childtest.zip into parenttest.zip (when uploading a zip, Kudu always extracts it automatically, so we have to zip childtest.zip into parenttest.zip first). A scripted version of steps 2 and 3 is sketched at the end of this answer.
4. Drag and drop the local parenttest.zip into the online SitePackages folder in the Kudu debug console; after Kudu extracts it we have a childtest.zip there.
5. Modify packagename.txt, changing its content from 20200929072235.zip to childtest.zip, and save.
Done~
Check and test:
Now let's open the App Service Editor to check the changes.
In addition: though this answers the original question, I recommend using other deployment methods (Web Deploy, ...) instead. It could be much easier~
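If you'd rather script steps 2 and 3 than zip things by hand, a rough C# sketch (not part of the original workaround; all paths are illustrative) could be:
using System.IO.Compression;

class RepackageSite
{
    static void Main()
    {
        // Step 2: zip the edited site files into childtest.zip.
        ZipFile.CreateFromDirectory(@"C:\temp\extracted-site", @"C:\temp\childtest.zip");

        // Step 3: wrap childtest.zip inside parenttest.zip, because Kudu
        // automatically extracts whatever zip is dragged into the console.
        using (var parent = ZipFile.Open(@"C:\temp\parenttest.zip", ZipArchiveMode.Create))
        {
            parent.CreateEntryFromFile(@"C:\temp\childtest.zip", "childtest.zip");
        }
    }
}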

Possible to create file in sources directory on Azure DevOps during build

I have a Node script which needs to create a file in the root directory of my application before the application is built.
The data this file will contain is specific to each build that gets triggered; however, I'm having no luck with this on Azure DevOps.
For the writing of the file I'm using fs.writeFile(...), something similar to this:
fs.writeFile(targetPath, file, function(err) { // removed for brevity });
However, this throws an exception:
[Error: ENOENT: no such file or directory, open '/home/vsts/work/1/s/data-file.json']
Locally this works. I'm assuming it has something to do with permissions; however, I tried adding a blank version of this file to my project and it still throws this exception.
Possible to create file in sources directory on Azure DevOps during build
The answer is yes. This is a fully supported scenario in Azure DevOps Services if you're using a Microsoft-hosted Ubuntu agent.
If you hit this issue when using a Microsoft-hosted agent, I think it is more likely a path issue. Please check:
The function the 'no such file or directory' error comes from. Apart from fs.writeFile, do you also use fs.readFile in the xx.js file? If so, make sure the two paths are the same.
The structure of your source files and your real requirement. According to your question you want to create the file in the sources directory /home/vsts/work/1/s, but the first line indicates that you actually want to create it in the root directory of your application.
1) If you want to create the file in the sources directory /home/vsts/work/1/s:
In your js file, use a targetPath like './data-file.json', and make sure you run the node xx.js command from the sources directory (leave the CMD/PowerShell/Bash task's working directory blank). A minimal pipeline sketch for this case follows at the end of this answer.
2) If you want to do that in the root of the application folder, for example /home/vsts/work/1/s/MyApp:
In your js file, use __dirname, e.g. fs.writeFile(__dirname + '/data-file.json', file, function(err) { // removed for brevity }); and fs.readFile(__dirname + '/data-file.json', ...).
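For case 1), a minimal sketch of a YAML pipeline step could look like the following (this assumes a YAML-based pipeline and that xx.js sits in the repository root; the step name is illustrative):
steps:
  # $(Build.SourcesDirectory) resolves to /home/vsts/work/1/s on the hosted Ubuntu agent.
  - script: node xx.js
    workingDirectory: $(Build.SourcesDirectory)
    displayName: 'Generate data-file.json'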

Can I remove docStrings from the Karate HTML report?

The HTML report generated by Karate displays all the headers, which is a security issue for my organisation.
Is there any way we can remove the doc strings from the report and only show the print statements and the passed status?
Please read this section of the documentation: https://github.com/intuit/karate#report-verbosity
So you can "switch off" logs and steps at any time:
* configure report = false
But please note that logs continue to be available in the target/surefire-reports folder, so if this is a security problem, you need to delete those files as well after a test run.
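If you still need your print output to show up while hiding the HTTP logs and the bare * steps, the same report-verbosity section documents a finer-grained form, roughly:
* configure report = { showLog: false, showAllSteps: false }
Here showLog: false hides the request and response details (including headers), and showAllSteps: false hides steps that start with * while, according to the linked documentation, print output is still kept.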
Look for the logback-test.xml (log configuration) file in your project and change the logger level from DEBUG to INFO:
<logger name="com.intuit" level="INFO" />
This stops request and response details from being written to the log.
Please refer to the Karate logging documentation.

no such repository on migrating to a new cvs server

I am moving from cvsserv1 to cvsserv2. I am running CVS 1.11 on the current server on RHEL; cvsserv2 is running Ubuntu 12. This is my procedure to port CVS:
1. Zip the entire repository on cvsserv1.
2. Move the zip to cvsserv2.
3. Extract the zip to /home/users on cvsserv2.
4. Set up the CVS service on cvsserv2 in pserver mode.
5. Initialize the repository at /home/users/cvsroot using "cvs -d /home/users/cvsroot init".
6. Connect to cvsserv2 from Eclipse using anonymous access to do a test checkout.
I am failing at step 6 with the error message "no such repository". What am I doing wrong?
UPDATE
I tried to change the above method by following http://mazanatti.info/archives/67/, and I was partially successful.
At step 3 (as in that link), after initializing the repo on cvsserv2, I copied my repository to /var/lib/cvsd/project1, overwriting the CVSROOT folder. Now, after finishing all the steps, I was able to connect successfully. However, when I try to check out, I don't see any branches, and when I tried to Refresh Tags I received an error.
What is going wrong?
OK, I figured this one out. For those who might encounter this issue again, here's how I managed to identify and fix it:
Eclipse's CVS client isn't much help here; it doesn't give you much information. (I could be wrong, maybe it writes some debug info to the Eclipse log file; still, I think that error message should have been more descriptive.) Anyway, I obtained TortoiseCVS and attempted a checkout, and it failed with an error message along the lines of "failed to obtain dir lock in repository '/home/cvsroot/foo'". This is not the exact message, but it was something like that.
So all I had to do was go into my CVS dump from cvsserv1 and look for references to that directory (which is a valid path on cvsserv1 but not on cvsserv2). I found a reference in the config file under the CVSROOT folder, assigned to a property called LockDir, which pointed to /home/cvsroot/foo on the old server as the lock directory. All I had to do was comment out this property and restart cvsd. Everything started working just fine after this!
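For reference, the offending entry in CVSROOT/config looked roughly like this (the path being the old server's lock directory mentioned above), and commenting it out was all that was needed:
#LockDir=/home/cvsroot/foo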
