I am trying to generate an HTML report for Postman scripts using Newman.
However, I don't see a report generated in the desired location.
I am using Azure DevOps.
[command]/usr/local/bin/newman run /home/vsts/work/1/s/Postman/postman_collection.json --reporter-html-template /home/vsts/work/1/s/Postman/newman-reporter.html --reporter-html-export /home/vsts/work/1/s/Postman -r cli,html -n 1 -e /home/vsts/work/1/s/Postman/postman_environment.json
I have already installed Newman and the HTML reporter, and they sit at the same directory level.
You will also need to give the export flag a full file path with a .html extension, rather than just the directory path.
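For example, pointing the export at a file instead of the folder (a sketch based on your command; report.html is just a placeholder name):

newman run /home/vsts/work/1/s/Postman/postman_collection.json -e /home/vsts/work/1/s/Postman/postman_environment.json -r cli,html -n 1 --reporter-html-template /home/vsts/work/1/s/Postman/newman-reporter.html --reporter-html-export /home/vsts/work/1/s/Postman/report.html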
As a fallback, Newman may have created the default file in a directory called newman at the location where the command was run.
If you require a different HTML reporter, this one can be used.
I have a ReactJS-Neo4j application deployed on a cloud server. Currently, I create backups of my databases manually.
Now I want to automate this process and execute the above query automatically every day.
Can anyone tell me how to automate the above process?
You need to change your Neo4j configuration file, found in <HOME_neo4j>/conf/neo4j.conf, as below. The location of the file differs if you are not on a Linux server like Debian.
apoc.export.file.enabled=true
apoc.import.file.use_neo4j_config=false
The second line lets you save the JSON file to any folder you want, instead of only the default "import" folder.
Then open a terminal (or SSH session) connected to your cloud server, go to the <HOME_neo4j> directory where cypher-shell is installed, and run the one-liner below.
echo "CALL apoc.export.json.all(\"/home/backups/deploymentName/backup_mydeployment.json\", { useTypes: true } )" | bin/cypher-shell -u neo4j -p <awesome_psw> --format plain
This will save the JSON file in /home/backups/deploymentName, just like running the export from the Neo4j Browser.
I will leave it up to you to 1) add the YYMMDD0000_ timestamp to the filename via a Linux command and 2) schedule the job every midnight via crontab (a sketch follows below). Good luck!
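A minimal sketch of both, assuming a small wrapper script (the script name and its path are hypothetical; <HOME_neo4j> and <awesome_psw> are the placeholders used above):

#!/bin/bash
# backup.sh - export the database with a YYMMDD0000_ prefix on the filename
cd <HOME_neo4j>
STAMP=$(date +%y%m%d)0000
echo "CALL apoc.export.json.all(\"/home/backups/deploymentName/${STAMP}_backup_mydeployment.json\", { useTypes: true } )" | bin/cypher-shell -u neo4j -p <awesome_psw> --format plain

Then schedule it in crontab to run every midnight:

0 0 * * * /bin/bash /path/to/backup.sh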
In Azure DevOps, I have an Azure PowerShell task that creates some resources using a .ps1 script in the repo. This script works fine.
Now I need to split the script and variables into different files.
I created the file SB-Config.ps1 for the variables and ServiceBus.ps1 with the main script, and moved all the variables into SB-Config.ps1.
Both files are in the same folder and in ServiceBus.ps1 I added:
. .\SB-Config.ps1
But Azure DevOps fails with an error:
What am I doing wrong, and how do I get the variables from the SB-Config.ps1 script when running ServiceBus.ps1?
I am able to reproduce your situation on my side and get the same issue as yours.
You can run this command to output the current working location:
Get-Location
I notice that the PowerShell script file on your side is in a subfolder of the default working directory.
So, do you set the working location at the top of the PowerShell script you are running first? For example:
Set-Location $env:System_DefaultWorkingDirectory\subfolders
In your situation, I think the issue is that the current working location is System_DefaultWorkingDirectory; the error output means the script can't find the file you want. This issue only occurs when you choose to run the script via 'file path'.
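Putting it together, the top of ServiceBus.ps1 could look like this (a sketch; the Scripts subfolder name is an assumption, use whatever folder actually contains both files):

# move to the folder that contains both scripts so the relative path resolves
Set-Location $env:System_DefaultWorkingDirectory\Scripts
# dot-source the variables into the current scope
. .\SB-Config.ps1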
I run my pages job and it passes, but with the following message at the end:
Uploading artifacts...
WARNING: public: no matching files
Uploading artifacts to coordinator... ok
Job succeeded
The website appears not to be served. All the build steps succeeded without error. I tried the build locally on my machine and verified it is correct. The website's entry point is index.html (I guess that's correct?).
How can I troubleshoot this problem? It would be nice if I could run the job "manually" so I could check a few things after the files are built on the CI machine. That way I wouldn't have to commit and push a new .gitlab-ci.yml every time just to check or try things.
Any suggestions are highly appreciated! Thanks!
P.S.: I build the website using Sphinx if that is of importance.
Edit - Some details
I build the documentation via Sphinx's Makefile (which is part of my documentation's source). Sphinx confirms that the files are placed in build/html (I verified this on my local machine), and I copy them to the public folder. Here's the corresponding excerpt of my ci.yaml:
- make html
- mkdir ~/.public
- cp -r build/html/* ~/.public/
- cd
- mv .public public
I don't know what information from Sphinx's conf.py could be relevant here; I've scanned through it and it doesn't seem to be corrupted (and the local build works).
As an output I obtain an index.html + several other HTML files which are linked from index.html. This all gets placed in ~/public.
I would really appreciate being able to run those build steps manually on the build server, as I could then look at the built files and maybe figure out what's wrong. I didn't find any documentation that this is possible, though I also don't think that's really the idea behind CI. Right now I'm not sure how to tackle this problem, since it builds fine on my machine and, on the other hand, I can't access the build server directly.
Edit 2
I verified with
ls -al ~/public
in my ci.yaml file that the generated files are all in the correct place. In particular:
$ ls -al ~/public
[...]
-rw-r--r--. 1 root root 5621 Apr 13 23:31 index.html
[...]
So it seems that GitLab Pages is expecting something other than, or in addition to, index.html? I've run the Jekyll example from their examples repository and it worked fine with an index.html, but maybe Jekyll produces some more files during the build process.
According to this documentation and this tutorial, GitLab Pages will only consider a folder named public that resides inside the project's directory. That is, the HTML content should go to ~/projectname/public instead of ~/public.
I think I got bitten by this problem too. In a Docker image, where you are connected as root, ~/public is actually /root/public :) which is not what GitLab Pages expects.
You should try
mv build/html public
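For reference, a minimal sketch of the pages job with that change (the build image and any Sphinx install steps are omitted here):

pages:
  script:
    - make html
    - mv build/html public   # public must live inside the project directory
  artifacts:
    paths:
      - public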
I'm trying to run a simple executable using an Azure Web Role.
The executable is stored in the Web Role's local storage.
The executable produces a log.txt file once it has been run.
This is the method I am using to run the executable:
public void RunExecutable(string path)
{
Process.Start(path);
}
Where path is localStorage.RootPath + "Application.exe"
The problem I am facing is that when I open the local storage folder, the executable is there; however, there is no log.txt file.
I have tested the executable: if I run it manually, it produces the log.txt file.
Can anyone see the problem?
Try setting an explicit WorkingDirectory for the process... I wonder if log.txt is being created, just not where you expect. (Or perhaps the app is trying to create log.txt but failing because of the permissions on the directory it's trying to create it in.)
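For example, a sketch of the method with an explicit working directory (assuming path is the full path to the executable, as in the question):

public void RunExecutable(string path)
{
    var startInfo = new ProcessStartInfo(path)
    {
        // relative files such as log.txt will be created next to the executable
        WorkingDirectory = System.IO.Path.GetDirectoryName(path),
        UseShellExecute = false
    };
    Process.Start(startInfo);
}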
If you remote desktop into the instance, can you find the file in the E:\approot\ folder? As Steve said, setting a WorkingDirectory for the process will fix the issue.
You can use Environment.GetEnvironmentVariable("RoleRoot") to construct the path to your application root.
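For example (a sketch; the approot subfolder is what you would typically see for a role, but verify the actual layout on your instance):

// RoleRoot usually resolves to a bare drive letter such as "E:"
string roleRoot = Environment.GetEnvironmentVariable("RoleRoot");
string appRoot = System.IO.Path.Combine(roleRoot + @"\", "approot");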
We have test and prod environments for a publishing portal.
What I want is to keep both environments in sync.
Currently, we make changes on the test server, publish the content, and check the modified pages; if everything is OK, we then make the same changes on the prod server.
Is there any shorter way, or a command, to update the prod server with the latest changes made on the test server instead of doing the same things again and again?
Thanks.
On SharePoint 2010 it's pretty simple: you can run a PowerShell command to first export the content you need from the Test environment and then import that content on the Prod server:
# on Test Environment
Export-SPWeb webrooturl -Path "fullpathfile.cmp" -IncludeVersions LastMajor -ItemUrl Pages -Force
This command creates a .cmp file that contains the latest major version of all items in the Pages library.
Then copy that .cmp file to the target server (Prod) and run:
# on Prod Environment
Import-SPWeb webrooturl -Path "fullpathfile.cmp"
I have used it only for the Pages library and it works fine, but I think that by changing the -ItemUrl parameter it should be possible to export the contents of the other libraries as well; for example:
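A sketch of what that could look like (the Documents library name and the file name are placeholders; substitute your own library's URL):

# on Test Environment
Export-SPWeb webrooturl -Path "anotherfile.cmp" -IncludeVersions LastMajor -ItemUrl "Documents" -Force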